Llama API Client

LlamaAPI is a Python SDK for interacting with the Llama API. It abstracts away the handling of aiohttp sessions and headers, simplifying interaction with the API.

Installation

You can install the LlamaAPI SDK using pip:

pip install llamaapi

Usage

After installing the SDK, you can use it in your Python projects like so:

import json
from llamaapi import LlamaAPI

# Initialize the llamaapi with your api_token
llama = LlamaAPI("<your_api_token>")

# Define your API request
api_request_json = {
  "messages": [
    {"role": "user", "content": "Extract the desired information from the following passage.:\n\nHi!"},
  ],
  "functions": [
        {'name': 'information_extraction',
         'description': 'Extracts the relevant information from the passage.',
         'parameters': {
             'type': 'object',
             'properties': {
                 'sentiment': {
                    'title': 'sentiment',
                    'type': 'string',
                    'description': 'the sentiment encountered in the passage'
                    },
                 'aggressiveness': {
                    'title': 'aggressiveness',
                    'type': 'integer',
                    'description': 'a 0-10 score of how aggressive the passage is'
                    },
                 'language': {
                    'title': 'language',
                    'type': 'string',
                    'description': 'the language of the passage'
                    }
             },
             'required': ['sentiment', 'aggressiveness', 'language']
         }
      }
    ],
  "stream": False,
  "function_call": {"name": "information_extraction"},
}

# Make your request and handle the response
response = llama.run(api_request_json)
print(json.dumps(response.json(), indent=2))
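
The response schema is not documented here; if it follows an OpenAI-style chat completion layout (an assumption, not guaranteed by this SDK), the arguments produced by the forced function call could be read back roughly like this:

result = response.json()

# Assumed OpenAI-style layout: adjust these keys to match the payload
# actually printed by the example above if the structure differs.
message = result["choices"][0]["message"]
function_call = message.get("function_call", {})

# Function-call arguments typically arrive as a JSON-encoded string.
arguments = json.loads(function_call.get("arguments", "{}"))
print(arguments.get("sentiment"), arguments.get("aggressiveness"), arguments.get("language"))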

Other parameters that you can pass in the request JSON are:

{
  ...
  "max_length": 500,
  "temperature": 0.1,
  "top_p": 1.0,
  "frequency_penalty": 1.0,
  ...
}
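
As a concrete sketch, a plain chat request that sets these optional parameters could look like the following (the prompt and values are illustrative only, and the `llama` client from the example above is reused):

# Illustrative request; the parameter values are examples, not recommendations.
api_request_json = {
  "messages": [
    {"role": "user", "content": "Summarize this repository in one sentence."},
  ],
  "stream": False,
  "max_length": 500,
  "temperature": 0.1,
  "top_p": 1.0,
  "frequency_penalty": 1.0,
}

response = llama.run(api_request_json)
print(json.dumps(response.json(), indent=2))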

Note: Streaming is not yet supported, so it is recommended to submit requests with "stream": False.

Change Log

Version 0.1: Initial release

Contributing

We welcome contributions to this project. Please see the Contributing Guidelines for more details.

License

The llamaapi SDK is licensed under the MIT License. Please see the License File for more details.
