Making text consistent where it is not #14

Open · wants to merge 4 commits into main
2 changes: 1 addition & 1 deletion cookbooks/python/langchain/README.md
@@ -4,4 +4,4 @@ This folder contains examples of how to achieve specific tasks using the LangCha

## Examples

-- [LangChain](lc_openai_getting_started.ipynb): Examples of how LangChain can be used with models provided by the GitHub Models service
+- [LangChain](lc_openai_getting_started.ipynb): Examples of how LangChain can be used with models provided by the GitHub Models service.
8 changes: 4 additions & 4 deletions cookbooks/python/mistralai/README.md
@@ -8,7 +8,7 @@ The samples were modified slightly to better run with the GitHub Models service.

The following cookbook examples are available:

-- [Evaluation](evaluation.ipynb): Provides a number of examples for evaluating the performance of a task performed by an LLM, concretely information extraction, code generation, summarization
-- [Function Calling](function_calling.ipynb): Simple example to demonstrate how function calling works with Mistral models
-- [Prefix: Use Cases](prefix_use_cases.ipynb): Add a prefix to the model's response via the API
-- [Prompting Capabilities](prompting_capabilities.ipynb): Example prompts showing classification, summarization, personalization, and evaluation
+- [Evaluation](evaluation.ipynb): Provides a number of examples for evaluating the performance of a task performed by an LLM, concretely information extraction, code generation, summarization.
+- [Function Calling](function_calling.ipynb): Simple example to demonstrate how function calling works with Mistral models.
+- [Prefix: Use Cases](prefix_use_cases.ipynb): Add a prefix to the model's response via the API.
+- [Prompting Capabilities](prompting_capabilities.ipynb): Example prompts showing classification, summarization, personalization, and evaluation.
2 changes: 1 addition & 1 deletion cookbooks/python/openai/README.md
@@ -9,4 +9,4 @@ The samples were modified slightly to better run with the GitHub Models service.
- [How to call functions with chat models](How_to_call_functions_with_chat_models.ipynb): This notebook shows how to get GPT-4o to determine which of a set of functions to call to answer a user's question.
- [Data extraction and transformation](Data_extraction_transformation.ipynb): This notebook shows how to extract data from documents using gpt-4o-mini.
- [How to stream completions](How_to_stream_completions.ipynb): This notebook shows detailed instructions on how to stream chat completions.
-- [Developing Hallucination Guardrails](Developing_hallucination_guardrails.ipynb): Develop an output guardrail that specifically checks model outputs for hallucinations
+- [Developing Hallucination Guardrails](Developing_hallucination_guardrails.ipynb): Develop an output guardrail that specifically checks model outputs for hallucinations.
2 changes: 1 addition & 1 deletion samples/python/openai/embeddings_getting_started.ipynb
@@ -126,7 +126,7 @@
"\n",
"See the cookbook [rag_getting_started](../../../cookbooks/python/llamaindex/rag_getting_started.ipynb) for an example of how to do this using the LLamaIndex framework.\n",
"\n",
"To learn more about what you can do with the GitHub models using the OpenAI Python API, [check out theses cookbooks](../../../cookbooks/python/openai/README.md)\n"
"To learn more about what you can do with the GitHub models using the OpenAI Python API, [check out theses cookbooks](../../../cookbooks/python/openai/README.md).\n"
]
}
],