AIknowlEDGE

AIknowlEDGE is a desktop application built with Electron.js and a Python FastAPI backend that showcases Azure AI Disconnected Containers.
Explore the docs »

View Demo · Report Bug · Request Feature

Table of Contents
  1. About The Project
  2. Getting Started
  3. Usage
  4. Roadmap
  5. Contributing
  6. License
  7. Contact
  8. Acknowledgments

About The Project

Application screenshot

Azure AI containers give you the flexibility to run Azure AI services locally. Connected containers run in your environment and send usage information to the cloud for billing. Disconnected containers are intended for scenarios where the containers must run without any connectivity to the cloud.

(back to top)

Built With

  • Python
  • FastAPI
  • Docker

(back to top)

Getting Started

Follow these steps to run the project locally.

Prerequisites

  • Python 3.12+
  • Docker
  • VS Code
  • Ollama

Installation

  1. Clone the repository:

    git clone https://github.com/Azure-Samples/AI-knowlEDGE.git
    cd AI-knowlEDGE
    
  2. Install and start Docker. Then open a terminal and pull the summarization container:

    docker pull mcr.microsoft.com/azure-cognitive-services/textanalytics/summarization:cpu
  3. Get the keys and endpoints for the two Azure AI services this project uses, Azure Document Intelligence and Azure AI Language, from the Azure portal. You will need:

    AZURE_DOCUMENT_ANALYSIS_ENDPOINT
    AZURE_DOCUMENT_ANALYSIS_KEY
    LANGUAGE_ENDPOINT
    LANGUAGE_KEY
    
  4. Create a folder named ExtractiveModel at the root of your C:\ drive.

  5. Download the model for the Summarization service. With Docker running, execute the following, replacing LANGUAGE_ENDPOINT and LANGUAGE_KEY with your own values:

    docker run -v C:\ExtractiveModel:/models mcr.microsoft.com/azure-cognitive-services/textanalytics/summarization:cpu downloadModels=ExtractiveSummarization billing=LANGUAGE_ENDPOINT apikey=LANGUAGE_KEY
    
  6. Set up the Python environment and install dependencies:

    cd backend
    python -m venv venv
    venv\Scripts\activate  # On Linux/macOS use `source venv/bin/activate`
    pip install -r requirements.txt
  7. Create a docker-compose.yml file with the following content:

    version: "3.9"
    services:
      azure-form-recognizer-read:
        container_name: azure-form-recognizer-read
        image: mcr.microsoft.com/azure-cognitive-services/form-recognizer/read-3.1
        environment:
          - EULA=accept
          - billing=<document-intelligence-endpoint>
          - apiKey=<document-intelligence-key>
        ports:
          - "5000:5000"
        networks:
          - ocrvnet
    
      textanalytics:
        image: mcr.microsoft.com/azure-cognitive-services/textanalytics/summarization:cpu
        environment:
          - eula=accept
          - rai_terms=accept
          - billing=<language-endpoint>
          - apikey=<language-key>
        volumes:
          - "C:\\ExtractiveModel:/models"
        ports:
          - "5001:5000"
    
    networks:
      ocrvnet:
        driver: bridge
  8. Start the containers by running:

    docker-compose up
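
Once both containers are up, it can help to verify they are ready before wiring the backend to them. The snippet below is a minimal sketch, assuming the containers expose the standard Azure AI container /ready probe on the ports mapped in the docker-compose.yml above; adjust the URLs if you changed the port mappings.

```python
import urllib.error
import urllib.request

# Ports assumed from the docker-compose.yml above:
# 5000 -> Document Intelligence (read), 5001 -> summarization.
CONTAINER_PROBES = {
    "document-intelligence": "http://localhost:5000/ready",
    "summarization": "http://localhost:5001/ready",
}

def probe(url: str, timeout: float = 5.0) -> bool:
    """Return True if the container answers its readiness probe with HTTP 200."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

def check_containers(probes: dict[str, str]) -> dict[str, bool]:
    """Probe each container and report readiness per service."""
    return {name: probe(url) for name, url in probes.items()}

if __name__ == "__main__":
    for name, ready in check_containers(CONTAINER_PROBES).items():
        print(f"{name}: {'ready' if ready else 'not responding'}")
```

A container that is still loading its model will typically fail this probe for a minute or two after `docker-compose up`, so retry before assuming a misconfiguration.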
  9. Create a .env file with the following values:

    AZURE_DOCUMENT_ANALYSIS_ENDPOINT=http://localhost:5000
    AZURE_DOCUMENT_ANALYSIS_KEY=<document-intelligence-key>
    LANGUAGE_ENDPOINT=http://localhost:5001
    LANGUAGE_KEY=<language-key>
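
The backend reads these settings at startup. As a sketch of how such a file can be consumed (the parser below is a hypothetical stand-in for illustration, not the project's actual loader):

```python
def load_env(path: str = ".env") -> dict[str, str]:
    """Parse KEY=VALUE lines from a .env file, skipping blanks and comments."""
    settings: dict[str, str] = {}
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            settings[key.strip()] = value.strip()
    return settings

# With the containers from docker-compose.yml running, the endpoints point
# at localhost rather than the Azure cloud endpoints, e.g.:
#   settings = load_env()
#   settings["AZURE_DOCUMENT_ANALYSIS_ENDPOINT"]  # "http://localhost:5000"
```

Note that the keys here are the same ones listed in step 3; only the endpoint values change, from the cloud endpoints to the local containers.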
    
  10. Download and install Ollama, then pull at least one small language model and one embedding model:

    ollama pull phi3
    ollama pull nomic-embed-text
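
Ollama serves a local HTTP API, by default on port 11434. As a quick sanity check that the model pulled above responds, here is a hedged sketch; the helper only hits the network when invoked, and the prompt is illustrative.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default port

def generate_payload(model: str, prompt: str) -> dict:
    """Build a non-streaming request body for Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": False}

def ollama_generate(model: str, prompt: str) -> str:
    """Send a prompt to the local Ollama server and return the generated text."""
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=json.dumps(generate_payload(model, prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=120) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(ollama_generate("phi3", "In one sentence, what is a disconnected container?"))
```

If the call fails with a connection error, make sure the Ollama service is running and that `ollama pull phi3` completed successfully.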
  11. Start the application from VS Code: F5 or Run > Start Debugging

(back to top)

Debugging

  1. Start the FastAPI backend:

    cd backend
    uvicorn main:app --port 8000
  2. Start the Streamlit app:

    cd frontend
    streamlit run app.py --server.port=8501
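
To confirm the backend came up, you can rely on the OpenAPI schema that FastAPI serves automatically. The check below is a minimal sketch, assuming the default /openapi.json route has not been disabled in main.py:

```python
import json
import urllib.request

def backend_alive(base_url: str = "http://localhost:8000", timeout: float = 3.0) -> bool:
    """Return True if the FastAPI backend serves its auto-generated OpenAPI schema."""
    try:
        with urllib.request.urlopen(f"{base_url}/openapi.json", timeout=timeout) as resp:
            schema = json.loads(resp.read())
        return "openapi" in schema
    except OSError:
        return False

if __name__ == "__main__":
    print("backend up" if backend_alive() else "backend not responding")
```

The interactive docs at http://localhost:8000/docs offer the same check in a browser.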

(back to top)

Roadmap

  • Other Containers Integration
    • Speech service
    • Translation service
  • Other Use Cases Integration
  • Packaging
    • Single-click installation
    • Cross-platform installation

See the open issues for proposed features and known issues.

(back to top)

Contributing

Contributions make open source great. We appreciate all contributions.

  1. Fork this repo
  2. Create a Feature Branch (git checkout -b feature/AmazingFeature)
  3. Commit your changes (git commit -m 'Add some AmazingFeature')
  4. Push to the branch (git push origin feature/AmazingFeature)
  5. Open a Pull Request

(back to top)

License

Distributed under the MIT License. See LICENSE.txt for more info.

(back to top)

Contact

Project Link: AIKnowlEDGE

(back to top)

Acknowledgments

  • Microsoft France

(back to top)
