Forked from tcsenpai/multi1.


o_o: Ollama-powered o1-like reasoning chains


Build the Docker image

docker build --file Dockerfile . -t o_o

Run the Docker container

docker run --rm -it -e OLLAMA_URL=<YOUR_OLLAMA_URL> -e OLLAMA_MODEL=<YOUR_OLLAMA_MODEL> -p <PORT>:8501 o_o

where <YOUR_OLLAMA_URL> is the URL of your Ollama server (for example, http://1.2.3.4:5678), <YOUR_OLLAMA_MODEL> is the model to use (llama3.1:70b is recommended, though gemma2:27b also works well; the model must already be installed on your Ollama server), and <PORT> is the host port on which you want to serve the app.
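These two variables point the app at a standard Ollama HTTP endpoint. As a minimal sketch of what the configured values are used for, the snippet below builds a request against Ollama's /api/generate endpoint; the helper function is illustrative, not part of o_o's actual code.

```python
import json
import urllib.request

def build_generate_request(ollama_url: str, model: str, prompt: str):
    """Build a request for Ollama's /api/generate endpoint.

    Illustrative helper only; o_o's real client code may differ.
    """
    payload = {"model": model, "prompt": prompt, "stream": False}
    return urllib.request.Request(
        f"{ollama_url.rstrip('/')}/api/generate",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

# The same values you would pass via OLLAMA_URL / OLLAMA_MODEL:
req = build_generate_request("http://1.2.3.4:5678", "llama3.1:70b", "Why is the sky blue?")
print(req.full_url)  # http://1.2.3.4:5678/api/generate
```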

Then visit http://<YOUR_IP_ADDRESS>:<PORT> in your browser to see the app.
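Putting the steps together, a complete session might look like the following; the server address, model, and port are example values, not defaults.

```shell
# Build the image from the repository root
docker build --file Dockerfile . -t o_o

# Run it against an Ollama server on the local network
# (http://192.168.1.10:11434 and llama3.1:70b are example values)
docker run --rm -it \
  -e OLLAMA_URL=http://192.168.1.10:11434 \
  -e OLLAMA_MODEL=llama3.1:70b \
  -p 8501:8501 \
  o_o
```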

Examples

In the context of Lie Group and Lie Algebra, let $R \in E$ be an irreducible root system. Show that then $E$ is an irreducible representation of the Weyl group $W$.

(screenshot: LieGroup example)

Which number is larger? 9.8 or 9.11?

(screenshot: comparison example)

What would it cost to cover the surface of the Earth with a layer of gold 1 meter thick?

(screenshot: gold example)
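For reference, the gold question can be sanity-checked with a quick back-of-the-envelope calculation. The surface area, density, and price figures below are rough assumptions for illustration, not values produced by the app.

```python
# Rough estimate: cover Earth's surface with a 1 m layer of gold.
# All constants are approximate assumptions.
EARTH_SURFACE_M2 = 5.1e14       # ~5.1e8 km^2 total surface area
LAYER_THICKNESS_M = 1.0
GOLD_DENSITY_KG_M3 = 19_300     # density of gold
GOLD_PRICE_USD_PER_KG = 75_000  # assumed price, order of magnitude only

volume_m3 = EARTH_SURFACE_M2 * LAYER_THICKNESS_M
mass_kg = volume_m3 * GOLD_DENSITY_KG_M3
cost_usd = mass_kg * GOLD_PRICE_USD_PER_KG

print(f"mass ~ {mass_kg:.1e} kg, cost ~ {cost_usd:.1e} USD")
```

Under these assumptions the cost lands around 10^23 to 10^24 USD, vastly more than the value of all gold ever mined.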

Credits

This project is forked from tcsenpai/multi1.
