Hi, I recently tried VS Code with the Continue plugin, configured to use my own Ollama server and LLMs (https://ollama.ai), and was amazed at how well this works.
I'm not a big fan of shipping my code to companies so they can train the models they sell, so Copilot and the like don't appeal to me. With the option of running local open-source LLMs, however, this becomes a game changer.
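For reference, a minimal sketch of the kind of setup described: pointing Continue at a local Ollama instance via its `config.json`. The model name and address here are illustrative, and field names may differ between Continue versions, so treat this as an assumption-laden example rather than canonical configuration.

```json
{
  "models": [
    {
      "title": "Local Code Llama (example)",
      "provider": "ollama",
      "model": "codellama:7b",
      "apiBase": "http://localhost:11434"
    }
  ]
}
```

With this in place, completions are served entirely by the local Ollama server, so no code leaves the machine.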
Related to #20632