Standardize LLM Docs #24803

Open
1 of 2 tasks
efriis opened this issue Jul 30, 2024 · 0 comments
Labels
🤖:docs Changes to documentation and examples, like .md, .rst, .ipynb files. Changes to the docs/ folder

Comments

@efriis
Member

efriis commented Jul 30, 2024

Privileged issue

  • I am a LangChain maintainer, or was asked directly by a LangChain maintainer to create an issue here.

Issue Content

Issue

To make our LLM integrations as easy to use as possible, we need to make sure their docs are thorough and standardized. There are two parts to this: updating the LLM docstrings and updating the actual integration docs.

This needs to be done for each LLM integration, ideally with one PR per LLM.

Related to broader issues #21983 and #22005.

Docstrings

Each LLM class docstring should have the sections shown in the Appendix below. The sections should have input and output code blocks when relevant.
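
For illustration, a filled-in "Invoke" section with paired input and output code blocks might look like the following (a hypothetical FooBar integration; the output string is a placeholder, not real model output):

    Invoke:
        .. code-block:: python

            input_text = "The meaning of life is "
            llm.invoke(input_text)

        .. code-block:: python

            'a question that philosophers have debated for centuries.'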

To build a preview of the API docs for the package you're working on, run (from the root of the repo):

make api_docs_clean; make api_docs_quick_preview API_PKG=openai

where API_PKG= should be the parent directory that houses the edited package (e.g. community, openai, anthropic, huggingface, together, mistralai, groq, fireworks, etc.). This should be quite fast for all the partner packages.

Doc pages

Each LLM docs page should follow this template.

  • TODO(Erick): populate a complete example

You can use the langchain-cli to quickly get started with a new LLM integration docs page (run from root of repo):

poetry run pip install -e libs/cli
poetry run langchain-cli integration create-doc --name "foo-bar" --name-class FooBar --component-type LLM --destination-dir ./docs/docs/integrations/llms/

where --name is the integration package name without the "langchain-" prefix and --name-class is the class name without the "LLM" suffix. This will create a template doc with some autopopulated fields at docs/docs/integrations/llms/foo_bar.ipynb.

To build a preview of the docs, you can run (from the root of the repo):

make docs_clean
make docs_build
cd docs/build/output-new
yarn
yarn start

Appendix

Expected sections for the LLM class docstring.

    """__ModuleName__ completion model integration.

    # TODO: Replace with relevant packages, env vars.
    Setup:
        Install ``__package_name__`` and set environment variable ``__MODULE_NAME___API_KEY``.

        .. code-block:: bash

            pip install -U __package_name__
            export __MODULE_NAME___API_KEY="your-api-key"

    # TODO: Populate with relevant params.
    Key init args — completion params:
        model: str
            Name of __ModuleName__ model to use.
        temperature: float
            Sampling temperature.
        max_tokens: Optional[int]
            Max number of tokens to generate.

    # TODO: Populate with relevant params.
    Key init args — client params:
        timeout: Optional[float]
            Timeout for requests.
        max_retries: int
            Max number of retries.
        api_key: Optional[str]
            __ModuleName__ API key. If not passed in, will be read from the env var __MODULE_NAME___API_KEY.

    See full list of supported init args and their descriptions in the params section.

    # TODO: Replace with relevant init params.
    Instantiate:
        .. code-block:: python

            from __module_name__ import __ModuleName__LLM

            llm = __ModuleName__LLM(
                model="...",
                temperature=0,
                max_tokens=None,
                timeout=None,
                max_retries=2,
                # api_key="...",
                # other params...
            )

    Invoke:
        .. code-block:: python

            input_text = "The meaning of life is "
            llm.invoke(input_text)

        .. code-block:: python

            # TODO: Example output.

    # TODO: Delete if token-level streaming isn't supported.
    Stream:
        .. code-block:: python

            for chunk in llm.stream(input_text):
                print(chunk)

        .. code-block:: python

            # TODO: Example output.

        .. code-block:: python

            ''.join(llm.stream(input_text))

        .. code-block:: python

            # TODO: Example output.

    # TODO: Delete if native async isn't supported.
    Async:
        .. code-block:: python

            await llm.ainvoke(input_text)

            # stream:
            # async for chunk in llm.astream(input_text)

            # batch:
            # await llm.abatch([input_text])

        .. code-block:: python

            # TODO: Example output.
    """  # noqa: E501