MongoDB Atlas and Google Cloud Vector Search

This is a demo of vector search using MongoDB Atlas and Google Cloud. The dataset is a catalogue of books. The server is built with Node.js and Express, and the client with Angular.

Prerequisites

  1. Node.js LTS.

Setup

Follow the instructions below to run the demo locally.

Import the dataset into MongoDB Atlas

  1. Clone the project.

    git clone https://github.com/mongodb-developer/Google-Cloud-Semantic-Search
  2. Navigate to the prepare-data directory and install the dependencies.

    npm install
  3. Create a free MongoDB Atlas account.

  4. Deploy a free M0 database cluster in a region of your choice.

  5. Complete the security quickstart.

  6. Add your connection string to prepare-data/.env.
    Make sure to replace the placeholders with the credentials of the database user you created in the security quickstart.

    prepare-data/.env

    ATLAS_URI="<your-connection-string>"
    

    Note that you will have to create the file .env in the prepare-data folder.

  7. Run the script to import the dataset into your database.

    node ./prepare-data/import-data.js
  8. Navigate to your MongoDB Atlas database deployment and verify that the data was loaded successfully (or use the command-line check sketched below).
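
If you prefer to check from the command line instead of the Atlas UI, the sketch below uses the MongoDB Node.js driver to count the imported documents. It reads the ATLAS_URI from prepare-data/.env and assumes the data lands in the bookstore.books collection used later in this guide; the script itself is not part of the repository.

    // verify-import.js: illustrative sketch, not part of the repository.
    // Counts the imported documents in bookstore.books and prints one sample.
    require('dotenv').config({ path: './prepare-data/.env' });
    const { MongoClient } = require('mongodb');

    async function main() {
      const client = new MongoClient(process.env.ATLAS_URI);
      try {
        await client.connect();
        const books = client.db('bookstore').collection('books');
        console.log(`Imported ${await books.countDocuments()} books`);
        console.log(await books.findOne());
      } finally {
        await client.close();
      }
    }

    main().catch(console.error);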

Generate embeddings

  1. Create a new Google Cloud project with billing enabled.

  2. Enable the Vertex AI and Cloud Functions APIs.

  3. Deploy a public 2nd generation Google Cloud Function with the implementation found in google-cloud-functions/generate-embeddings/main.py.

    Replace the PROJECT_ID and LOCATION placeholders in the file before deploying the function. Remember to also set the Entry point to generate_embeddings.

    Note: The LOCATION parameter defines the region where the Cloud Function will run. Make sure this region supports the Vertex AI Model Garden; europe-west1 does not.

    If you have the gcloud CLI installed, run the following deployment command.

    gcloud functions deploy generate-embeddings \
      --region=us-central1 \
      --gen2 \
      --runtime=python311 \
      --source=./google-cloud-functions/generate-embeddings/ \
      --entry-point=generate_embeddings \
      --trigger-http \
      --allow-unauthenticated
  4. Add the deployed function URL to prepare-data/.env.

    prepare-data/.env

    ATLAS_URI="<your-connection-string>"
    EMBEDDING_ENDPOINT="<your-cloud-function-url>"
    
  5. Run the embeddings generation script.

    node ./prepare-data/create-embeddings.js

    Note that Vertex AI has a rate limit of 600 embeddings per minute. If you get 403 errors, wait a minute and rerun the script; repeat until all documents are vectorized.

  6. Go back to your MongoDB Atlas project and open the deployed database cluster. Verify that the bookstore.books collection has a new text_embedding field containing a 768-dimensional vector.

  7. Navigate to the Atlas Search Tab and click on Create Search Index.

  8. Select JSON Editor under Atlas Vector Search and then click on Next.

  9. Select the database and collection, insert the following index definition, and click Save. An example query that uses this index is sketched after this list.

    {
      "fields": [
        {
          "numDimensions": 768,
          "path": "text_embedding",
          "similarity": "euclidean",
          "type": "vector"
        }
      ]
    }
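
Once the index has finished building, it can be queried with the $vectorSearch aggregation stage. The sketch below is illustrative rather than the repository's actual server code; it assumes the index was saved under the name vector_index and that queryVector is a 768-dimensional embedding produced by the same Cloud Function used for the data.

    // search-example.js: illustrative sketch, not the repository's server code.
    // Assumes the index is named "vector_index" and that queryVector is a
    // 768-dimensional embedding returned by the deployed Cloud Function.
    const { MongoClient } = require('mongodb');

    async function searchBooks(queryVector) {
      const client = new MongoClient(process.env.ATLAS_URI);
      try {
        await client.connect();
        const books = client.db('bookstore').collection('books');
        return await books
          .aggregate([
            {
              $vectorSearch: {
                index: 'vector_index',   // use the name you gave the index
                path: 'text_embedding',
                queryVector,             // 768 numbers from the embedding model
                numCandidates: 100,      // candidates considered per query
                limit: 10                // results returned
              }
            },
            // title is assumed to be a field in the dataset
            { $project: { title: 1, score: { $meta: 'vectorSearchScore' } } }
          ])
          .toArray();
      } finally {
        await client.close();
      }
    }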

Running the project

  1. Navigate to the server directory.

  2. Copy prepare-data/.env into the server directory.

    cp ../prepare-data/.env .
    
  3. Install the dependencies and start the server (a sketch of the server-side flow follows this list).

    npm install && npm start
    
  4. Open a new terminal window to run the client application.

  5. In the new window, navigate to the client directory.

  6. Install the dependencies and run the project.

    npm install && npm start
    
  7. Open http://localhost:4200 in your browser and find books using the power of vector search!
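
For orientation, the server-side flow is: take the user's search text, send it to the embedding Cloud Function, then run a $vectorSearch query like the one sketched in the previous section. The Express route below is a rough, hypothetical sketch of that flow, not the code in the server directory; in particular, the request and response shapes of the Cloud Function are assumptions, so check google-cloud-functions/generate-embeddings/main.py for the real payload format.

    // Hypothetical sketch of the server-side flow; the code in server/ differs.
    const express = require('express');
    const { MongoClient } = require('mongodb');

    const app = express();
    const client = new MongoClient(process.env.ATLAS_URI);

    app.get('/search', async (req, res) => {
      // 1. Embed the search text with the deployed Cloud Function.
      //    The { text: ... } body is an assumed shape; see main.py for the real one.
      const response = await fetch(process.env.EMBEDDING_ENDPOINT, {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ text: req.query.q })
      });
      const queryVector = await response.json(); // assumed to be a plain array

      // 2. Find the closest books with the same $vectorSearch pipeline shown above.
      const results = await client
        .db('bookstore')
        .collection('books')
        .aggregate([
          { $vectorSearch: { index: 'vector_index', path: 'text_embedding', queryVector, numCandidates: 100, limit: 10 } },
          { $project: { title: 1, score: { $meta: 'vectorSearchScore' } } }
        ])
        .toArray();

      res.json(results);
    });

    async function start() {
      await client.connect();
      app.listen(3000); // the port is arbitrary for this sketch
    }

    start().catch(console.error);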

Disclaimer

Use at your own risk; not a supported MongoDB product