
Welcome! 👋

My name is Yihong Chen. I research AI knowledge acquisition, specifically how different AI systems can learn to abstract, represent, and use concepts/symbols efficiently.

I am open to collaborations on topics related to embedding learning, link prediction, and language modeling. If you would like to get in touch, you can reach me by emailing yihong-chen AT outlook DOT com, or by booking a Zoom meeting with me.

Looking for Some Inspiration?

💥 Mar 2024, Quanta Magazine covers our research on periodic embedding forgetting. Check out the article here.

💥 Dec 2023, I will present our forgetting paper at NeurIPS 2023. Check out the poster here.

💥 Sep 2023, our latest work, Improving Language Plasticity via Pretraining with Active Forgetting, was accepted at NeurIPS 2023!

💥 Sep 2023, I presented our latest work on forgetting at the IST-Unbabel seminar.

💥 Jul 2023, I presented our latest work on forgetting in language modelling at the ELLIS Unconference 2023. The slides are available here. Feel free to leave your comments.

💥 Jul 2023, discover the power of forgetting in language modelling! Our latest work, Improving Language Plasticity via Pretraining with Active Forgetting, shows how pretraining a language model with active forgetting can help it quickly learn new languages. You'll be amazed by the plasticity that pretraining with forgetting imbues in the model. Check it out :)
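The gist of active forgetting is to periodically re-initialise the token embedding layer during pretraining while the rest of the network keeps its learned weights. Here is a minimal PyTorch sketch of that loop, not the paper's implementation: `TinyLM`, `reset_embedding`, and the reset interval `RESET_EVERY` are all illustrative stand-ins.

```python
import torch
import torch.nn as nn


class TinyLM(nn.Module):
    """Toy stand-in for a transformer language model."""

    def __init__(self, vocab_size: int = 100, dim: int = 16):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)   # gets reset periodically
        self.body = nn.Linear(dim, vocab_size)       # stand-in for the transformer body

    def forward(self, ids: torch.Tensor) -> torch.Tensor:
        return self.body(self.embed(ids))


def reset_embedding(model: TinyLM) -> None:
    """Active forgetting step: re-initialise only the embeddings."""
    with torch.no_grad():
        nn.init.normal_(model.embed.weight, mean=0.0, std=0.02)


model = TinyLM()
opt = torch.optim.SGD(model.parameters(), lr=0.1)
RESET_EVERY = 50  # assumed interval; the real schedule is a tuned hyperparameter

for step in range(1, 201):
    ids = torch.randint(0, 100, (8, 4))  # random token ids as dummy data
    logits = model(ids)
    loss = nn.functional.cross_entropy(logits.flatten(0, 1), ids.flatten())
    opt.zero_grad()
    loss.backward()
    opt.step()
    if step % RESET_EVERY == 0:
        reset_embedding(model)  # embeddings restart; the body keeps training
```

Because only the embeddings are wiped, the body is repeatedly forced to work with fresh embeddings, which is the mechanism the paper credits for faster adaptation to new languages (where only new embeddings need to be learned).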

💥 Nov 2022, our paper, ReFactor GNNs: Revisiting Factorisation-based Models from a Message-Passing Perspective, will appear at NeurIPS 2022! If you're interested in understanding why factorisation-based models can be viewed as a special class of GNNs, and how to make them usable on new graphs, check it out!

💥 Jun 2022, if you're looking for a hands-on repo to start experimenting with link prediction, check out our repo ssl-relation-prediction. Simple code, easy to hack 🚀
