Introduction

I am currently a Ph.D. student in the Department of Computing (COMP) at The Hong Kong Polytechnic University, funded by the HKPFS. Before joining PolyU, I received my Master's degree in Information Technology (with Distinction) from the University of Melbourne, under the supervision of Dr. Lea Frermann. In 2021, I received my Bachelor's degree in Information Security from Shanghai Jiao Tong University. I am a self-motivated person with a strong passion for scientific research. My research interests lie in Natural Language Processing, Drug Discovery, and Recommender Systems. I always welcome collaboration with solid partners.

Research Interests

  • Natural Language Processing
  • Drug Discovery
  • Recommender Systems

News

  • Our paper, “Empowering Molecule Discovery for Molecule-Caption Translation with Large Language Models: A ChatGPT Perspective”, is accepted by IEEE TKDE. Paper Link
  • Our paper, “Large Language Models are In-Context Molecule Learners”, is released on arXiv. Paper Link We also release the model weights on Hugging Face. Link
  • Our paper, “Recommender Systems in the Era of Large Language Models (LLMs)” is accepted by IEEE TKDE! More
  • Our paper, “Fast graph condensation with structure-based neural tangent kernel” is accepted by WWW 24. More
  • Our tutorial on “Recommender Systems in the Era of Large Language Models (LLMs)” is accepted by ICDM 2023. More

Publications

  • Jiatong Li, Wei Liu, Zhihao Ding, Wenqi Fan, Yuqiang Li, Qing Li. (2024). Large Language Models are In-Context Molecule Learners. arXiv preprint arXiv:2403.04197.
  • Lin Wang, Wenqi Fan, Jiatong Li, Yao Ma, and Qing Li. (2023). Fast graph condensation with structure-based neural tangent kernel. arXiv preprint arXiv:2310.11046. (Accepted by WWW 24)
  • Wenqi Fan, Zihuai Zhao, Jiatong Li, Yunqing Liu, Xiaowei Mei, Yiqi Wang, Jiliang Tang, and Qing Li. (2023). Recommender Systems in the Era of Large Language Models (LLMs). arXiv preprint arXiv:2307.02046. (Accepted by IEEE TKDE)
  • Jiatong Li, Yunqing Liu, Wenqi Fan, Xiao-yong Wei, Hui Liu, Jiliang Tang, Qing Li. (2023). Empowering Molecule Discovery for Molecule-Caption Translation with Large Language Models: A ChatGPT Perspective. arXiv preprint arXiv:2306.06615. (Accepted by IEEE TKDE)
  • Lea Frermann, Jiatong Li, Shima Khanehzar, Gosia Mikolajczak. (2023). Conflicts, Villains, Resolutions: Towards models of Narrative Media Framing. ACL 2023. (Oral Presentation)
  • Wenqi Fan, Chengyi Liu, Yunqing Liu, Jiatong Li, Hang Li, Hui Liu, Jiliang Tang, Qing Li. (2023). Generative Diffusion Models on Graphs: Methods and Applications. IJCAI 2023.
  • Qinghua Mao, Jiatong Li, and Kui Meng. (2022). Improving Chinese Named Entity Recognition by Search Engine Augmentation. arXiv preprint arXiv:2210.12662.
  • Jiatong Li, Bin He, and Fei Mi. (2022). Exploring Effective Information Utilization in Multi-Turn Topic-Driven Conversations. arXiv preprint arXiv:2209.00250.
  • Jiatong Li, Kui Meng. (2021). MFE-NER: Multi-feature Fusion Embedding for Chinese Named Entity Recognition. arXiv preprint arXiv:2109.07877.
  • Chaowang Zhao, Jian Yang*, Jiatong Li. (2021). Generation of Hospital Emergency Department Layouts Based on Generative Adversarial Networks. Journal of Building Engineering, 43, 102539.
  • Chaowang Zhao, Jian Yang*, Wuyue Xiong, Jiatong Li. (2021). Two Generative Design Methods of Hospital Operating Department Layouts Based on Healthcare Systematic Layout Planning and Generative Adversarial Network. Journal of Shanghai Jiaotong University (Science), 26, 103-115.

Scholarships

  • Hong Kong PhD Fellowship Scheme
  • Melbourne Graduate Grant

Awards

  • Second Prize, Aecore Cup Digital Twin Application Competition 2021
  • Finalist Award, Mathematical Contest in Modelling (MCM), 2020

Contact Me

Feel free to contact me via email: jiatong.li AT connect.polyu.hk