IntraAI is an intelligent chat platform that adapts to users' knowledge levels and helps them learn over time. It provides accessible AI explanations that are understandable and usable by everyone, regardless of age, background, or ability.
Note: IntraAI's UI is built on top of chatbot-ui, an excellent open-source AI chat interface. We're grateful to Mckay Wrigley and the chatbot-ui community for providing such a solid foundation for our project.
- Learning Companion: Interactive AI partner that adapts to your knowledge level
- Learning Portrait: Track your learning progress and patterns over time
- Deep Dive Analysis: Understand why AI responds the way it does with detailed explanations
- Context Exploration: Explore related topics and connections
- Content Comparison: Compare multiple AI responses side-by-side
- Industry-Specific Analysis: Get tailored evaluations for different industries
- Performance Metrics: Detailed metrics including response time, token count, and quality scores
- Customizable AI Personas: Define AI identity, style, level, values, and task types
- Flexible Communication: Adapt AI responses to your preferences and needs
- Semantic Analysis: Analyze text importance and semantic awareness
- Visualization: Visual representation of word importance and context
- Alignment Signal: Analyze the alignment signal of the text
- BERT Model: The trained BERT model for alignment signal analysis is available for download at Google Drive
Follow these steps to get your own IntraAI instance running locally.
```bash
git clone https://github.com/wad3birch/IntraAI.git
cd intraai/chatbot-ui
```

Open a terminal in the root directory of your local IntraAI repository and run:

```bash
npm install
```

IntraAI uses Supabase for secure data storage, enabling multi-modal use cases and providing a robust backend for learning analytics and user preferences.
You will need to install Docker to run Supabase locally. You can download it here for free.
MacOS/Linux

```bash
brew install supabase/tap/supabase
```

Windows

```bash
scoop bucket add supabase https://github.com/supabase/scoop-bucket.git
scoop install supabase
```

In your terminal at the root of your local IntraAI repository, run:

```bash
supabase start
```

In your terminal at the root of your local IntraAI repository, run:

```bash
cp .env.local.example .env.local
```

Get the required values by running:

```bash
supabase status
```

Note: Use the API URL from `supabase status` for `NEXT_PUBLIC_SUPABASE_URL`.
Now go to your .env.local file and fill in the values.
If an environment variable is set for an API key, the corresponding input in the user settings will be disabled.
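As an illustration, a filled-in `.env.local` might look like the following. All values here are placeholders; use the values from your own `supabase status` output:

```
NEXT_PUBLIC_SUPABASE_URL=http://127.0.0.1:54321
NEXT_PUBLIC_SUPABASE_ANON_KEY=<anon-key-from-supabase-status>
SUPABASE_SERVICE_ROLE_KEY=<service-role-key-from-supabase-status>
```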
In the 1st migration file `supabase/migrations/20240108234540_setup.sql` you will need to replace 2 values with the values you got above:

- `project_url` (line 53): `http://supabase_kong_chatbotui:8000` (default) can remain unchanged if you don't change your `project_id` in the `config.toml` file
- `service_role_key` (line 54): You got this value from running `supabase status`
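If you prefer to script the edit, a `sed` substitution along these lines works. The exact line format inside the real migration file may differ, so this hedged demo runs against a throwaway stand-in file rather than your actual `supabase/migrations/20240108234540_setup.sql`; adjust the pattern to match the real file before using it for real.

```shell
# Stand-in for the migration file (line format is assumed, not copied
# from the real file).
printf "project_url text := 'http://supabase_kong_chatbotui:8000';\nservice_role_key text := 'old-key';\n" > demo_setup.sql

# Substitute your own service role key (placeholder shown here).
sed -i.bak "s|service_role_key text := '.*';|service_role_key text := 'your-service-role-key';|" demo_setup.sql

cat demo_setup.sql
```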
This prevents issues with storage files not being deleted properly.
Follow the instructions here.
In your terminal at the root of your local IntraAI repository, run:

```bash
npm run chat
```

Your local instance of IntraAI should now be running at http://localhost:3000. Be sure to use a compatible Node version (e.g. v18).
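If you want to confirm your Node version meets the requirement, a quick check like this works (this is just a convenience snippet, not part of IntraAI):

```javascript
// Print the running Node version and warn if it's older than v18,
// the version the steps above assume.
const major = Number(process.versions.node.split(".")[0]);
console.log(`Running Node ${process.versions.node}`);
if (major < 18) {
  console.warn("Node v18 or newer is recommended for IntraAI.");
}
```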
You can view your backend GUI at http://localhost:54323/project/default/editor.
Follow these steps to get your own IntraAI instance running in the cloud.
Repeat steps 1-4 in "Local Quickstart" above.
You will want separate repositories for your local and hosted instances.
Create a new repository for your hosted instance of IntraAI on GitHub and push your code to it.
Go to Supabase and create a new project.
Once you are in the project dashboard, click on the "Project Settings" icon tab on the far bottom left.
Here you will get the values for the following environment variables:
- Project Ref: Found in "General settings" as "Reference ID"
- Project ID: Found in the URL of your project dashboard (Ex: https://supabase.com/dashboard/project/<YOUR_PROJECT_ID>/settings/general)
While still in "Settings" click on the "API" text tab on the left.
Here you will get the values for the following environment variables:
- Project URL: Found in "API Settings" as "Project URL"
- Anon key: Found in "Project API keys" as "anon public"
- Service role key: Found in "Project API keys" as "service_role" (Reminder: Treat this like a password!)
Next, click on the "Authentication" icon tab on the far left.
In the text tabs, click on "Providers" and make sure "Email" is enabled.
We recommend turning off "Confirm email" for your own personal instance.
Open up your repository for your hosted instance of IntraAI.
In the 1st migration file supabase/migrations/20240108234540_setup.sql you will need to replace 2 values with the values you got above:
- `project_url` (line 53): Use the Project URL value from above
- `service_role_key` (line 54): Use the Service role key value from above
Now, open a terminal in the root directory of your local IntraAI repository. We will execute a few commands here.
Login to Supabase by running:
```bash
supabase login
```

Next, link your project by running the following command with the "Project ID" you got above:

```bash
supabase link --project-ref <project-id>
```

Your project should now be linked.

Finally, push your database to Supabase by running:

```bash
supabase db push
```

Your hosted database should now be set up!
Go to Vercel and create a new project.
In the setup page, import your GitHub repository for your hosted instance of IntraAI. Within the project Settings, in the "Build & Development Settings" section, switch Framework Preset to "Next.js".
In environment variables, add the following from the values you got above:
- `NEXT_PUBLIC_SUPABASE_URL`
- `NEXT_PUBLIC_SUPABASE_ANON_KEY`
- `SUPABASE_SERVICE_ROLE_KEY`
- `NEXT_PUBLIC_OLLAMA_URL` (only needed when using local Ollama models; default: `http://localhost:11434`)
You can also add API keys as environment variables.
- `OPENAI_API_KEY`
- `AZURE_OPENAI_API_KEY`
- `AZURE_OPENAI_ENDPOINT`
- `AZURE_GPT_45_VISION_NAME`
For the full list of environment variables, refer to the `.env.local.example` file. If an environment variable is set for an API key, the corresponding input in the user settings will be disabled.
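Before deploying, it can help to confirm the required Supabase variables are present. A small script like this (a convenience sketch, not part of the IntraAI codebase; `missingSupabaseVars` is a hypothetical helper name) checks for the variable names listed above:

```javascript
// Return the names of any required Supabase env vars missing from `env`.
// The variable names come from the deployment steps above.
function missingSupabaseVars(env) {
  const required = [
    "NEXT_PUBLIC_SUPABASE_URL",
    "NEXT_PUBLIC_SUPABASE_ANON_KEY",
    "SUPABASE_SERVICE_ROLE_KEY",
  ];
  return required.filter((name) => !env[name]);
}

// Example: check the current process environment.
const missing = missingSupabaseVars(process.env);
if (missing.length > 0) {
  console.warn(`Missing environment variables: ${missing.join(", ")}`);
} else {
  console.log("All required Supabase environment variables are set.");
}
```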
Click "Deploy" and wait for your frontend to deploy.
Once deployed, you should be able to use your hosted instance of IntraAI via the URL Vercel gives you.
In your terminal at the root of your local IntraAI repository, run:
```bash
npm run update
```

If you run a hosted instance you'll also need to run:

```bash
npm run db-push
```

to apply the latest migrations to your live database.
If you encounter a coherence JSON error or similar data consistency issues, open your browser's developer console and run the following code:
```javascript
// Clear all client-side state, then reload the page.
localStorage.clear();
sessionStorage.clear();

// Expire every cookie for this path.
document.cookie.split(";").forEach(function (c) {
  document.cookie = c
    .replace(/^ +/, "")
    .replace(/=.*/, "=;expires=" + new Date().toUTCString() + ";path=/");
});

// Delete all IndexedDB databases (note: indexedDB.databases() is not
// supported in every browser).
if ("indexedDB" in window) {
  indexedDB.databases().then((databases) => {
    databases.forEach((db) => {
      indexedDB.deleteDatabase(db.name);
    });
  });
}

window.location.reload();
```

This will clear all local storage, session storage, cookies, and IndexedDB data, then reload the page. This should resolve most data consistency issues.
We restrict "Issues" to actual issues related to the codebase.
We're getting excessive amounts of issues that amount to things like feature requests, cloud provider issues, etc.
If you are having issues with things like setup, please refer to the "Help" section in the "Discussions" tab above.
Issues unrelated to the codebase will likely be closed immediately.
We highly encourage you to participate in the "Discussions" tab above!
Discussions are a great place to ask questions, share ideas, and get help.
Odds are if you have a question, someone else has the same question.
We are working on a guide for contributing.
See the license file for details.

