diff --git a/src/oss/python/integrations/chat/fireworks.mdx b/src/oss/python/integrations/chat/fireworks.mdx
index 27dce88ef2..17a8421ce1 100644
--- a/src/oss/python/integrations/chat/fireworks.mdx
+++ b/src/oss/python/integrations/chat/fireworks.mdx
@@ -12,7 +12,7 @@ Fireworks AI is an AI inference platform to run and customize models. For a list
 
 | Class | Package | Local | Serializable | [JS support](https://js.langchain.com/docs/integrations/chat/fireworks) | Downloads | Version |
 | :--- | :--- | :---: | :---: | :---: | :---: | :---: |
-| [ChatFireworks](https://python.langchain.com/api_reference/fireworks/chat_models/langchain_fireworks.chat_models.ChatFireworks.html) | [langchain-fireworks](https://python.langchain.com/api_reference/fireworks/index.html) | ❌ | beta | ✅ | ![PyPI - Downloads](https://img.shields.io/pypi/dm/langchain-fireworks?style=flat-square&label=%20) | ![PyPI - Version](https://img.shields.io/pypi/v/langchain-fireworks?style=flat-square&label=%20) |
+| [`ChatFireworks`](https://docs.langchain.com/oss/python/integrations/chat/fireworks) | [`langchain-fireworks`](https://pypi.org/project/langchain-fireworks/) | ❌ | beta | ✅ | ![PyPI - Downloads](https://img.shields.io/pypi/dm/langchain-fireworks?style=flat-square&label=%20) | ![PyPI - Version](https://img.shields.io/pypi/v/langchain-fireworks?style=flat-square&label=%20) |
 
 ### Model features
@@ -61,7 +61,7 @@ Now we can instantiate our model object and generate chat completions:
 from langchain_fireworks import ChatFireworks
 
 llm = ChatFireworks(
-    model="accounts/fireworks/models/llama-v3-70b-instruct",
+    model="accounts/fireworks/models/kimi-k2-instruct-0905",  # Model library in: https://app.fireworks.ai/models
     temperature=0,
     max_tokens=None,
     timeout=None,
@@ -85,7 +85,7 @@ ai_msg
 ```
 
 ```output
-AIMessage(content="J'adore la programmation.", response_metadata={'token_usage': {'prompt_tokens': 35, 'total_tokens': 44, 'completion_tokens': 9}, 'model_name': 'accounts/fireworks/models/llama-v3-70b-instruct', 'system_fingerprint': '', 'finish_reason': 'stop', 'logprobs': None}, id='run-df28e69a-ff30-457e-a743-06eb14d01cb0-0', usage_metadata={'input_tokens': 35, 'output_tokens': 9, 'total_tokens': 44})
+AIMessage(content="J'adore la programmation.", additional_kwargs={}, response_metadata={'token_usage': {'prompt_tokens': 31, 'total_tokens': 41, 'completion_tokens': 10}, 'system_fingerprint': '', 'finish_reason': 'stop', 'logprobs': None, 'model_provider': 'fireworks', 'model_name': 'accounts/fireworks/models/kimi-k2-instruct-0905'}, id='lc_run--a2bdeca3-6394-4c80-97ad-2fc8db9f54bb-0', usage_metadata={'input_tokens': 31, 'output_tokens': 10, 'total_tokens': 41})
 ```
 
 ```python
@@ -98,7 +98,7 @@ J'adore la programmation.
 
 ## API reference
 
-For detailed documentation of all ChatFireworks features and configurations head to the API reference: [python.langchain.com/api_reference/fireworks/chat_models/langchain_fireworks.chat_models.ChatFireworks.html](https://python.langchain.com/api_reference/fireworks/chat_models/langchain_fireworks.chat_models.ChatFireworks.html)
+For detailed documentation of all ChatFireworks features and configurations head to the [API reference](https://reference.langchain.com/python/integrations/langchain_fireworks/)
@@ -129,14 +129,15 @@ pip install langchain-fireworks
 set FIREWORKS_API_KEY=your_api_key
 ```
 
-2. Set up your model using a model id. If the model is not set, the default model is `fireworks-llama-v2-7b-chat`. See the full, most up-to-date model list on [fireworks.ai](https://fireworks.ai/models).
+2. Set up your model using a model id. If the model is not set, the default model is `fireworks-llama-v2-7b-chat`. See the full, most up-to-date model list on [fireworks.ai](https://app.fireworks.ai/models).
 
 ```python
 import getpass
 import os
+from langchain_fireworks import ChatFireworks
 
 # Initialize a Fireworks model
-llm = Fireworks(
+llm = ChatFireworks(
     model="accounts/fireworks/models/llama-v3p1-8b-instruct",
     base_url="https://api.fireworks.ai/inference/v1/completions",
 )
diff --git a/src/oss/python/integrations/llms/fireworks.mdx b/src/oss/python/integrations/llms/fireworks.mdx
index a93a89cfdd..2170d5d76a 100644
--- a/src/oss/python/integrations/llms/fireworks.mdx
+++ b/src/oss/python/integrations/llms/fireworks.mdx
@@ -18,14 +18,14 @@ This example goes over how to use LangChain to interact with `Fireworks` models.
 
 | Class | Package | Local | Serializable | [JS support](https://js.langchain.com/v0.1/docs/integrations/llms/fireworks/) | Downloads | Version |
 | :--- | :--- | :---: | :---: | :---: | :---: | :---: |
-| [Fireworks](https://python.langchain.com/api_reference/fireworks/llms/langchain_fireworks.llms.Fireworks.html#langchain_fireworks.llms.Fireworks) | [langchain-fireworks](https://python.langchain.com/api_reference/fireworks/index.html) | ❌ | ❌ | ✅ | ![PyPI - Downloads](https://img.shields.io/pypi/dm/langchain_fireworks?style=flat-square&label=%20) | ![PyPI - Version](https://img.shields.io/pypi/v/langchain_fireworks?style=flat-square&label=%20) |
+| [`Fireworks`](https://python.langchain.com/api_reference/fireworks/llms/langchain_fireworks.llms.Fireworks.html#langchain_fireworks.llms.Fireworks) | [`langchain-fireworks`](https://pypi.org/project/langchain-fireworks/) | ❌ | ❌ | ✅ | ![PyPI - Downloads](https://img.shields.io/pypi/dm/langchain_fireworks?style=flat-square&label=%20) | ![PyPI - Version](https://img.shields.io/pypi/v/langchain_fireworks?style=flat-square&label=%20) |
 
 ## Setup
 
 ### Credentials
 
 Sign in to [Fireworks AI](http://fireworks.ai) for an API Key to access our models, and make sure it is set as the `FIREWORKS_API_KEY` environment variable.
 
-3. Set up your model using a model id. If the model is not set, the default model is fireworks-llama-v2-7b-chat. See the full, most up-to-date model list on [fireworks.ai](https://fireworks.ai).
+3. Set up your model using a model id. If the model is not set, the default model is fireworks-llama-v2-7b-chat. See the full, most up-to-date model list on [fireworks.ai](https://app.fireworks.ai/models).
 
 ```python
 import getpass
@@ -54,7 +54,7 @@ from langchain_fireworks import Fireworks
 
 # Initialize a Fireworks model
 llm = Fireworks(
-    model="accounts/fireworks/models/llama-v3p1-8b-instruct",
+    model="accounts/fireworks/models/llama-v3p1-8b-instruct",  # Model library in: https://app.fireworks.ai/models
     base_url="https://api.fireworks.ai/inference/v1/completions",
 )
 ```
@@ -69,7 +69,7 @@ print(output)
 ```
 
 ```output
- If Manningville Station, Lions rookie EJ Manuel's
+ That's an easy one. It's Aaron Rodgers. Rodgers has consistently been one
 ```
 
 ### Invoking with multiple prompts
@@ -86,7 +86,7 @@ print(output.generations)
 ```
 
 ```output
-[[Generation(text=" We're not just asking, we've done some research. We'")], [Generation(text=' The conversation is dominated by Kobe Bryant, Dwyane Wade,')]]
+[[Generation(text=' You could choose one of the top performers in 2016, such as Vir')], [Generation(text=' -- Keith Jackson\nA: LeBron James, Chris Paul and Kobe Bryant are the')]]
 ```
 
 ### Invoking with additional parameters
@@ -145,4 +145,4 @@ for token in chain.stream({"topic": "bears"}):
 
 ## API reference
 
-For detailed documentation of all `Fireworks` LLM features and configurations head to the API reference: [python.langchain.com/api_reference/fireworks/llms/langchain_fireworks.llms.Fireworks.html#langchain_fireworks.llms.Fireworks](https://python.langchain.com/api_reference/fireworks/llms/langchain_fireworks.llms.Fireworks.html#langchain_fireworks.llms.Fireworks)
+For detailed documentation of all `Fireworks` LLM features and configurations head to the [API reference](https://reference.langchain.com/python/integrations/langchain_fireworks/)
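
Both files in this diff standardize on fully qualified Fireworks model ids of the form `accounts/<account>/models/<model>`. As a reviewer-side sanity check, here is a minimal sketch that verifies the ids touched by this PR match that shape; the `is_valid_model_id` helper and its regex are hypothetical illustrations, not part of `langchain-fireworks`:

```python
import re

# Fireworks model ids in these docs look like "accounts/<account>/models/<model>".
MODEL_ID_RE = re.compile(r"^accounts/[\w.-]+/models/[\w.-]+$")

def is_valid_model_id(model_id: str) -> bool:
    """Return True if model_id has the accounts/<account>/models/<model> shape."""
    return bool(MODEL_ID_RE.match(model_id))

# Ids introduced or kept by this diff all pass:
for mid in [
    "accounts/fireworks/models/kimi-k2-instruct-0905",
    "accounts/fireworks/models/llama-v3p1-8b-instruct",
]:
    assert is_valid_model_id(mid)

# The bare default name mentioned in the docs is NOT a full model id:
assert not is_valid_model_id("fireworks-llama-v2-7b-chat")
```

This kind of check only catches malformed ids, not nonexistent models; the authoritative list remains the model library at https://app.fireworks.ai/models.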