Some Research Topics of Current Interest

Deep Logic Networks

Towards developing an end-to-end learning framework that integrates high-level logic representations with deep learning architectures. Our representative works so far (with applications to NLP tasks):
Our previous attempts at encoding syntactic information for NLP tasks (fine-grained sentiment analysis):
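One common way to couple logic with deep models is to add a soft-logic penalty that discourages predictions violating a symbolic rule. The sketch below is a hypothetical illustration, not our actual framework: `soft_implication_penalty`, `total_loss`, and the weight `lam` are made-up names, and the rule relaxation shown (a Lukasiewicz-style implication) is just one standard choice.

```python
import numpy as np

def soft_implication_penalty(p_body, p_head):
    # Soft relaxation of the rule "body -> head": the penalty is
    # max(0, p_body - p_head), which is zero whenever the head is
    # at least as probable as the body (rule satisfied).
    return np.maximum(0.0, p_body - p_head)

def total_loss(p_pred, y_true, p_body, p_head, lam=0.5):
    # Combine the usual cross-entropy with the average rule violation,
    # weighted by a hypothetical trade-off parameter lam.
    ce = -np.log(p_pred[np.arange(len(y_true)), y_true] + 1e-12).mean()
    logic = soft_implication_penalty(p_body, p_head).mean()
    return ce + lam * logic
```

Because the penalty is differentiable, it can be minimized end-to-end alongside the network's own loss by ordinary gradient descent.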

Deep Models Compression

Towards learning compressed deep models that preserve satisfactory prediction performance on edge or resource-limited devices. Our representative works so far:
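A minimal example of one standard compression technique, magnitude-based weight pruning: zero out the smallest-magnitude weights so the model can be stored and served sparsely. This is a generic baseline sketch, not a description of our own methods; the function name and default sparsity are assumptions.

```python
import numpy as np

def magnitude_prune(weights, sparsity=0.9):
    # Zero out (at least) the `sparsity` fraction of weights with the
    # smallest absolute values, keeping the array shape unchanged.
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)
    if k == 0:
        return weights.copy()
    # k-th smallest magnitude serves as the pruning threshold.
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask
```

In practice pruning is typically interleaved with fine-tuning so the remaining weights can recover the accuracy lost at each pruning step.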

Distributed (or Federated) Multi-task Learning

Towards developing communication-efficient distributed learning frameworks for multi-task learning (i.e., jointly and efficiently learning multiple tasks in a geo-distributed computing environment). Our representative works so far:
Other related works on distributed learning:
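As a point of reference (not our method), the aggregation step of a well-known communication-efficient baseline, federated averaging (FedAvg), simply averages the clients' parameter vectors weighted by their local dataset sizes:

```python
import numpy as np

def fedavg(client_weights, client_sizes):
    # Weighted average of per-client parameter vectors, with each
    # client's contribution proportional to its local dataset size.
    sizes = np.asarray(client_sizes, dtype=float)
    coeffs = sizes / sizes.sum()
    stacked = np.stack([np.asarray(w, dtype=float) for w in client_weights])
    return (coeffs[:, None] * stacked).sum(axis=0)
```

In a multi-task variant, only a shared parameter component would typically be aggregated this way, while each node keeps its task-specific parameters local, which is one way to reduce communication cost.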

Learning from Time Series Data

Towards developing a general learning framework for time series data, in which the statistical and temporal patterns underlying the data can be effectively captured. Our representative works so far (with an application to sensor-based activity recognition):
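A common preprocessing step in sensor-based activity recognition is to segment the signal into overlapping windows and summarize each window with simple statistics. The sketch below is a generic illustration under assumed window and step sizes, not a description of our framework:

```python
import numpy as np

def window_features(signal, window=50, step=25):
    # Segment a 1-D sensor signal into overlapping windows and extract
    # basic statistical features (mean, std, min, max) per window.
    feats = []
    for start in range(0, len(signal) - window + 1, step):
        w = signal[start:start + window]
        feats.append([w.mean(), w.std(), w.min(), w.max()])
    return np.array(feats)
```

The resulting feature matrix (one row per window) can then be fed to any standard classifier, while temporal patterns across windows can be modeled with sequence models.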