> @vkash16 and I showed our AI assistant to @bcristei and @altryne at the @SHACK15sf hackathon today. We used @OpenInterpreter, @GroqInc, and @ollama with llama3 to have an AI agent control our computer without typing. We aim to shift the paradigm of voice assistants toward agentic workflows, automating tasks like an actual personal assistant. Here are our demo examples.
>
> — Lorenze Jay (@lorenzejayTech)
Most people already own a smartphone and, rather than buying an additional device, would prefer to connect with their AI agents through the device they already carry.
Our focus is on agentic accessibility through the user's current mobile device, to help reshape the paradigm of how we interact with our mobile command interfaces.
Our first example leverages Open Interpreter (and, in the future, local AI agents) triggered from your phone for a hands-free, typing-free user experience.
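The commands below expect a `server.py` exposing a FastAPI app. Here is a minimal sketch of what that file could look like, assuming the phone POSTs JSON with a `text` field to a `/command` route (both names are our illustration, not the project's exact API) and that Open Interpreter's Python interface (`interpreter.chat`, open-interpreter >= 0.2) drives a local llama3 via ollama:

```python
# server.py -- a minimal sketch, not the exact hackathon code.
from fastapi import FastAPI
from pydantic import BaseModel
from interpreter import interpreter

# Point Open Interpreter at a local llama3 served by ollama.
interpreter.llm.model = "ollama/llama3"
interpreter.auto_run = True  # execute generated code without confirmation prompts

app = FastAPI()

class Command(BaseModel):
    text: str  # the transcribed voice command sent from the phone

@app.post("/command")  # hypothetical route name for illustration
def run_command(cmd: Command):
    # Hand the natural-language request to the agent; it plans, writes,
    # and executes code on this machine, then returns the conversation.
    messages = interpreter.chat(cmd.text, display=False)
    return {"messages": messages}
```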
Start the FastAPI server, then expose it publicly with ngrok so your phone can reach it:

```sh
uvicorn server:app --host 0.0.0.0 --port 8000
ngrok http --domain=<your-domain-set> 8000
```
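With the tunnel up, you can sanity-check the endpoint from any machine before wiring up the phone. A quick test using Python's requests library, against the hypothetical `/command` route from the sketch above:

```python
# Smoke test that mirrors the request the phone would send.
import requests

resp = requests.post(
    "https://<your-domain-set>/command",  # your ngrok domain from the step above
    json={"text": "open the browser and check today's weather"},
    timeout=120,  # agent runs can take a while
)
print(resp.json())
```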
This setup depends on:

- ngrok
- Open Interpreter
- FastAPI
Finally, install our iOS Shortcut to trigger the agent from your phone: https://www.icloud.com/shortcuts/f0618b8008654d6d93a4422cb1564dd6