AwaDB - the AI Native database for embedding vectors.
Store and search embedding vectors for LLM Applications!

```bash
# Install awadb
pip3 install awadb
```

Currently supports Linux (Python >= 3.6) and macOS (x86-64 architecture; M1/M2 arm64 architecture requires Python >= 3.8).

The core API takes just four steps:

```python
import awadb

# 1. Initialize the awadb client
awadb_client = awadb.Client()

# 2. Create a table
awadb_client.Create("testdb")

# 3. Add docs to the table (docs can also be updated and deleted)
awadb_client.Add([{'name': 'jim'}, {'age': 39}, 'hello', [1, 3.5, 3]])
awadb_client.Add([{'name': 'vincent'}, {'age': 28}, 'world', [1, 3.4, 2]])
awadb_client.Add([{'name': 'david'}, {'age': 45}, 'hi', [1, 2.4, 4]])
awadb_client.Add([{'name': 'tom'}, {'age': 25}, 'dolly', [1.3, 2.9, 8.9]])

# 4. Search for the top-3 most similar results to the query vector
results = awadb_client.Search([3.0, 3.1, 4.2], 3)

# Output the results
print(results)
```
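Conceptually, step 4 ranks the stored vectors by their distance to the query and returns the closest ones. AwaDB uses an efficient vector index internally; the sketch below (with hypothetical names, not AwaDB's API) only illustrates the semantics with a brute-force nearest-neighbor search:

```python
import heapq

def brute_force_search(vectors, query, topk):
    """Return the indices of the topk vectors closest to the query (Euclidean distance)."""
    def dist(v):
        return sum((a - b) ** 2 for a, b in zip(v, query)) ** 0.5
    return heapq.nsmallest(topk, range(len(vectors)), key=lambda i: dist(vectors[i]))

# The four vectors added above
docs = [[1, 3.5, 3], [1, 3.4, 2], [1, 2.4, 4], [1.3, 2.9, 8.9]]
print(brute_force_search(docs, [3.0, 3.1, 4.2], 3))  # [2, 0, 1]: david, jim, vincent
```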

You can also use awadb directly for semantic text retrieval.
Here the text is embedded with SentenceTransformer, which is supported by Hugging Face.
```python
import awadb

# 1. Initialize the awadb client
awadb_client = awadb.Client()

# 2. Create a table
awadb_client.Create("test_llm1")

# 3. Add sentences; each sentence is embedded with SentenceTransformer by default.
#    You can also embed the sentences yourself with OpenAI or other LLMs.
awadb_client.Add([{'embedding_text': 'The man is happy'}, {'source': 'pic1'}])
awadb_client.Add([{'embedding_text': 'The man is very happy'}, {'source': 'pic2'}])
awadb_client.Add([{'embedding_text': 'The cat is happy'}, {'source': 'pic3'}])
awadb_client.Add(['The man is eating', 'pic4'])

# 4. Search for the top-3 most similar sentences to the query
query = "The man is happy"
results = awadb_client.Search(query, 3)

# Output the results
print(results)
```

What are embeddings?

Any unstructured data (image/text/audio/video) can be converted through AI (LLMs or other deep neural networks) into vectors that computers can work with.

For example, the sentence "The man is happy" can be converted into a 384-dimensional vector (a list of numbers such as [0.23, 1.98, ...]) by the SentenceTransformer language model. This process is called embedding.

More detailed information about embeddings can be found in OpenAI's documentation.

AwaDB uses Sentence Transformers to embed sentences by default, but you can also use OpenAI or other LLMs to produce the embeddings, according to your needs.
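Semantic search works because embeddings of similar sentences point in similar directions, which is typically measured with cosine similarity. The sketch below uses toy 3-dimensional vectors (real SentenceTransformer embeddings have 384 dimensions) to show the idea:

```python
import math

def cosine(a, b):
    """Cosine similarity: dot product divided by the product of vector norms."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy "embeddings" (hypothetical values, for illustration only)
happy = [0.9, 0.1, 0.2]        # "The man is happy"
very_happy = [0.85, 0.15, 0.25]  # "The man is very happy"
eating = [0.1, 0.9, 0.3]       # "The man is eating"

# Sentences with closer meanings score higher
print(cosine(happy, very_happy) > cosine(happy, eating))  # True
```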

Combining with LLMs (OpenAI, LLaMA, Vicuna, Alpaca, ChatGLM, Dolly)

For examples of combining LLaMA or quantized Alpaca with llama.cpp to build a local knowledge base, please see here.
For examples of combining ChatGLM to build a local knowledge base, please see here.

Get involved

License

Apache 2.0

Docs

docs

Community

Join the AwaDB community to share any problem, suggestion, or discussion with us.
