
talk-is-deep

This repository contains the code and resources for the "Talk Is Deep" project, which generates one million 10-turn scientific conversations and trains large language models on them, to test whether multi-turn conversations with no additional data improve model performance.
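As a rough illustration of the kind of data involved, the sketch below builds a single 10-turn conversation in the standard chat-message format commonly used for LLM fine-tuning. It is a minimal sketch, not this repository's pipeline: `build_conversation` and `generate_reply` are hypothetical names, and the model call is stubbed out so the example runs on its own.

```python
# Minimal sketch (assumed, not from this repo) of one 10-turn conversation
# represented as a list of chat messages suitable for fine-tuning data.
from typing import Callable

Message = dict[str, str]

def build_conversation(
    topic: str,
    generate_reply: Callable[[list[Message]], str],  # hypothetical model call
    num_turns: int = 10,
) -> list[Message]:
    """Alternate two speakers for `num_turns` turns on the given topic."""
    messages: list[Message] = [
        {"role": "system", "content": f"Two scientists discuss: {topic}"}
    ]
    roles = ["user", "assistant"]  # speaker A and speaker B
    for turn in range(num_turns):
        reply = generate_reply(messages)
        messages.append({"role": roles[turn % 2], "content": reply})
    return messages

if __name__ == "__main__":
    # Stub reply generator so the sketch runs without any external dependency.
    demo = build_conversation(
        "protein folding",
        generate_reply=lambda msgs: f"(turn {len(msgs)}) a scientific remark",
    )
    for m in demo:
        print(m["role"], ":", m["content"])
```

In practice the stubbed `generate_reply` would be replaced by whatever model endpoint the actual generation pipeline uses, and the loop would be repeated to produce the full set of one million conversations.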

About

Codebase for the Talk Is Deep paper: generate one million 10-turn scientific conversations and train LLMs on the data.
