midnqp/ai-chats-ai

This project makes two AIs talk to each other, complete with audio speech.

This project is built with:

  • Llama2, served by Ollama in Docker, to generate text from a prompt locally and offline.
  • Coqui TTS, to generate speech from text locally and offline (see the sketch below).
  • The Llama2 API and gTTS, to generate text and speech through Meta's and Google's hosted APIs.
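
To give a sense of how the offline pieces fit together, here is a minimal Python sketch. It is illustrative only, not the project's actual code: it assumes Ollama is serving llama2-uncensored on localhost:11434 and that Coqui TTS is installed (pip install TTS); the Coqui model name, prompts, and turn count are placeholders.

# Illustrative sketch (not the project's actual code): two "speakers" take
# turns generating replies through a locally served Llama2 model, and
# Coqui TTS renders each reply to a WAV file.
import requests
from TTS.api import TTS  # Coqui TTS: pip install TTS

OLLAMA_URL = "http://localhost:11434/api/generate"
tts = TTS("tts_models/en/ljspeech/tacotron2-DDC")  # placeholder Coqui model

def generate(prompt: str) -> str:
    # Ask the local Ollama server for a non-streaming completion.
    resp = requests.post(OLLAMA_URL, json={
        "model": "llama2-uncensored",
        "prompt": prompt,
        "stream": False,
    })
    resp.raise_for_status()
    return resp.json()["response"]

message = "Let's debate: is pineapple acceptable on pizza?"
for turn in range(4):  # a short four-turn exchange
    speaker = "AI-1" if turn % 2 == 0 else "AI-2"
    message = generate(f"You are {speaker}. Reply briefly to: {message}")
    print(f"{speaker}: {message}")
    tts.tts_to_file(text=message, file_path=f"turn_{turn}_{speaker}.wav")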

Usage

For a quick start, install Node.js and Python, then run:

git clone https://github.com/midnqp/ai-chats-ai
cd ai-chats-ai
npm install
pip3 install -r requirements.txt
npm run trial

The trial run uses the Llama2 API and the gTTS API to start a conversation. Since these public APIs are rate-limited, the conversation may not run very long, but it will be enjoyable ✨
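
For reference, the speech half of the trial mode boils down to a few lines of gTTS. The sketch below is illustrative only; the text and file name are placeholders, and the project's actual code may differ:

# Illustrative only: render one line of generated dialogue to speech
# with Google's gTTS (pip install gTTS; requires network access).
from gtts import gTTS

line = "Hello! Shall we talk about space travel today?"
gTTS(text=line, lang="en").save("line.mp3")  # writes an MP3 file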

To run locally, follow these steps:

  • Install Ollama with Docker and run $ ollama pull llama2-uncensored; ollama serve
  • To check that it is running: $ curl localhost:11434 (an example generation request is shown after these steps)
  • Run $ npm run start and that's it 🚀

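Once the model is pulled and the server is up, you can also sanity-check generation directly against Ollama's local API; the prompt below is only an example:

curl http://localhost:11434/api/generate -d '{
  "model": "llama2-uncensored",
  "prompt": "Say hello in one short sentence.",
  "stream": false
}'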

Advanced

For advanced users, it is recommended to speed up the Llama2 model by making sure it uses all of your machine's physical cores. For example, if your device has 10 physical (not logical) cores, create a Modelfile and append the following line:

PARAMETER num_thread 10

Then create a new model with a new name from that Modelfile.
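
A complete Modelfile and the command to build it might look like the following; the name llama2-uncensored-10t is just a placeholder, and num_thread should match your physical core count:

FROM llama2-uncensored
PARAMETER num_thread 10

$ ollama create llama2-uncensored-10t -f Modelfile

The new model can then be served by Ollama in place of the original.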

About

Make two generative AIs have a conversation on any topic. Generates both text and speech!
