
A demo of how to run and use the SLM Phi-3 locally, using Ollama or LM Studio, with examples of also invoking plugins/functions


bronthulke/Phi3ConsoleDemo


Demo branches

  1. main - the initial "cut" of the demo, where we just use LM Studio (or Ollama) to chat with the Phi-3 model (see the first sketch after this list)
  2. step-2 - the next step, where we start using the AI to gather the intent of a request, and then manually invoke functions to do our bidding (see the second sketch after this list)
  3. step-3 - the final step (for now!), where we switch to using Azure OpenAI with "automatic invoking" of the functions (plugins), together with a planner to orchestrate it all (a rough sketch of that loop appears at the end of the Helpful links section)
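
For context on the main branch: both LM Studio and Ollama can expose an OpenAI-compatible endpoint on localhost, so a bare-bones chat loop only needs an OpenAI-style client pointed at that endpoint. The sketch below uses Python's openai package; the ports (1234 for LM Studio, 11434 for Ollama) and the model name "phi3" are assumptions that depend on your local setup, and the actual demo code in this repo may use a different SDK.

```python
# Minimal sketch: chat with a locally hosted Phi-3 model through the
# OpenAI-compatible endpoint exposed by Ollama or LM Studio.
# The port and model name below are assumptions about your local setup.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama; LM Studio typically uses http://localhost:1234/v1
    api_key="not-needed-for-local",        # local servers ignore the key, but the client requires one
)

history = [{"role": "system", "content": "You are a helpful assistant."}]

while True:
    user_input = input("You: ")
    if not user_input:
        break
    history.append({"role": "user", "content": user_input})
    response = client.chat.completions.create(model="phi3", messages=history)
    reply = response.choices[0].message.content
    history.append({"role": "assistant", "content": reply})
    print(f"Phi-3: {reply}")
```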

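The step-2 pattern, asking the model only for the intent of a request and then manually calling a matching function, can be sketched the same way. The get_current_time function, the JSON intent format, and the prompt wording below are hypothetical illustrations, not the repo's actual plugins.

```python
# Sketch of the step-2 idea: use the local SLM to classify intent,
# then invoke the matching local function ("plugin") by hand.
# Function names, prompt, and JSON shape here are illustrative only.
import json
from datetime import datetime
from openai import OpenAI

client = OpenAI(base_url="http://localhost:11434/v1", api_key="not-needed-for-local")

def get_current_time() -> str:
    return datetime.now().isoformat()

FUNCTIONS = {"get_current_time": get_current_time}

def handle(user_request: str) -> str:
    # Ask the model to extract the intent as JSON. Small models may need
    # firm instructions (or retries); this sketch assumes the reply is valid JSON.
    prompt = (
        "Classify the user's intent. Reply with JSON only, such as "
        '{"function": "get_current_time"} or {"function": "none"}.\n'
        f"Request: {user_request}"
    )
    response = client.chat.completions.create(
        model="phi3", messages=[{"role": "user", "content": prompt}]
    )
    intent = json.loads(response.choices[0].message.content)

    # Manually invoke the matching function, if the model found one.
    fn = FUNCTIONS.get(intent.get("function"))
    return fn() if fn else "No matching function for that request."

print(handle("What time is it right now?"))
```
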
Helpful links

Here are some helpful links for learning more about Phi-3, Semantic Kernel, plugins, etc.

Within that Learn section, the following articles are particularly useful, and include code examples that you can build on to get going quickly:

And the ultimate goal...
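
The end state that step-3 works toward is letting the model decide when a function should be called and having it invoked for you. In this repo that is done via Semantic Kernel's automatic function invocation against Azure OpenAI; as a rough equivalent, the sketch below shows the underlying tool-calling loop with the plain OpenAI SDK. The endpoint, key, API version, deployment name, and the get_current_time tool are all placeholders, not values from this repo.

```python
# Rough sketch of an "automatic invocation" loop with Azure OpenAI tool calling.
# Semantic Kernel wraps this pattern for you; endpoint/key/deployment are placeholders.
import json
from datetime import datetime
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://YOUR-RESOURCE.openai.azure.com",
    api_key="YOUR-KEY",
    api_version="2024-02-01",
)

def get_current_time() -> str:
    return datetime.now().isoformat()

TOOLS = [{
    "type": "function",
    "function": {
        "name": "get_current_time",
        "description": "Returns the current local date and time.",
        "parameters": {"type": "object", "properties": {}},
    },
}]
LOCAL_FUNCTIONS = {"get_current_time": get_current_time}

messages = [{"role": "user", "content": "What time is it?"}]
while True:
    response = client.chat.completions.create(
        model="YOUR-DEPLOYMENT-NAME", messages=messages, tools=TOOLS
    )
    msg = response.choices[0].message
    if not msg.tool_calls:
        print(msg.content)
        break
    # The model asked for a tool: run it and feed the result back.
    # A real plugin with parameters would also parse call.function.arguments (JSON).
    messages.append(msg)
    for call in msg.tool_calls:
        result = LOCAL_FUNCTIONS[call.function.name]()
        messages.append({
            "role": "tool",
            "tool_call_id": call.id,
            "content": result,
        })
```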
