Short Biography: I am a Research Scientist at Google Research. I received my Ph.D. degree in Electrical and Computer Engineering from Purdue University under the supervision of Prof. I received my Master of Science degree in Statistics from the University of California, San Diego, and my Bachelor of Science degree in Mathematics from Shandong University.

My research interests lie at the intersection of data mining, natural language processing, and multimodal content understanding. The primary goal of my research is to develop universal, efficient, reliable, and elastic models. In particular, my research projects are:

Efficient Learning: The ever-growing resource consumption of neural networks generates a large carbon footprint, makes it difficult for academics to engage in research, and prevents emerging economies from enjoying the growing benefits of AI. To address these issues, we devote our efforts to developing efficient algorithms that achieve better resource productivity in terms of data annotation, model training, and deployment costs.
AdaMix: Mixture-of-Adaptations for Parameter-efficient Model Tuning, EMNLP 2022
LiST: Lite Self-training Makes Efficient Few-shot Learners, NAACL 2021
Meta Self-training for Few-shot Neural Sequence Labeling, KDD 2021
Learning from Language Description: Low-shot Named Entity Recognition via Decomposed Framework, EMNLP 2021

Elastic Learning: One of the most important challenges in applying trained models to real-world applications is domain shift, which refers to changes in the data distribution between the training dataset and the data a model encounters when deployed. To address the domain shift challenge, we develop a series of domain adaptation algorithms for applications in various areas.
Multi-modal Emergent Fake News Detection via Meta Neural Process Networks, KDD 2021
Automatic Validation of Textual Attribute Values in E-Commerce Catalog by Learning with Limited Labeled Data, KDD 2020
Weak Supervision for Fake News Detection via Reinforcement Learning, AAAI 2020
EANN: Event Adversarial Neural Networks for Multi-Modal Fake News Detection, KDD 2018

Knowledge Discovery: Lots of human knowledge is encoded in text. To make knowledge resources more findable, accessible, interoperable, and reusable (FAIR), we focus on extracting structured knowledge from massive collections of text.
AutoKnow: Self-Driving Knowledge Collection for Products of Thousands of Types, KDD 2020
MeSHProbeNet: A Self-attentive Probe Net for MeSH Indexing, Bioinformatics 2019
Hypothesis Generation From Text Based On Co-Evolution Of Biomedical Concepts, KDD 2019

Recent Publications:
LightToken: A Task and Model-agnostic Lightweight Token Embedding Framework for Pre-trained Language Models. Haoyu Wang, Ruirui Li, Haoming Jiang, Zhengyang Wang, Xianfeng Tang, Bin Bi, Monica Cheng, Bing Yin, Yaqing Wang, Tuo Zhao, Jing Gao. 29th SIGKDD Conference on Knowledge Discovery and Data Mining (KDD), Long Beach, California, United States, Aug.
Macular: A Multi-Task Adversarial Framework for Cross-Lingual Natural Language Understanding. Haoyu Wang, Yaqing Wang, Feijie Wu, Hongfei Xue, Jing Gao.
You Need Multiple Exiting: Dynamic Early Exiting for Accelerating Unified Vision Language Model. Shengkun Tang, Yaqing Wang, Zhenglun Kong, Tianchi Zhang, Yao Li, Caiwen Ding, Yanzhi Wang, Yi Liang, Dongkuan Xu. The IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Vancouver, Canada, Jun.
SimFair: A Unified Framework for Fairness-Aware Multi-Label Classification. Tianci Liu, Haoyu Wang, Yaqing Wang, Xiaoqian Wang, Lu Su, Jing Gao. Thirty-Seventh AAAI Conference on Artificial Intelligence (AAAI).
AdaMix: Mixture-of-Adaptations for Parameter-efficient Model Tuning.