
build(deps): bump github/codeql-action from 2 to 3 #2041

Merged


@dependabot dependabot bot commented on behalf of github Apr 15, 2024

Bumps github/codeql-action from 2 to 3.
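In a workflow file, this major-version bump amounts to moving the action references from `@v2` to `@v3`. A minimal sketch, assuming a typical code-scanning workflow (the file path, job name, and language are illustrative, not taken from this repository):

```yaml
# .github/workflows/codeql.yml (illustrative path)
jobs:
  analyze:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: github/codeql-action/init@v3      # was: @v2
        with:
          languages: go                         # illustrative
      - uses: github/codeql-action/analyze@v3   # was: @v2
```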

Release notes

Sourced from github/codeql-action's releases.

CodeQL Bundle v2.17.0

Bundles CodeQL CLI v2.17.0

Includes the following CodeQL language packs from github/codeql@codeql-cli/v2.17.0:

CodeQL Bundle v2.16.6

Bundles CodeQL CLI v2.16.6

Includes the following CodeQL language packs from github/codeql@codeql-cli/v2.16.6:

CodeQL Bundle v2.16.5

Bundles CodeQL CLI v2.16.5

Includes the following CodeQL language packs from github/codeql@codeql-cli/v2.16.5:

... (truncated)

Changelog

Sourced from github/codeql-action's changelog.

3.25.0 - 15 Apr 2024

  • The deprecated feature for extracting dependencies for a Python analysis has been removed. #2224

    As a result, the following inputs and environment variables are now ignored:

    • The setup-python-dependencies input to the init Action
    • The CODEQL_ACTION_DISABLE_PYTHON_DEPENDENCY_INSTALLATION environment variable

    We recommend removing any references to these from your workflows. For more information, see the release notes for CodeQL Action v3.23.0 and v2.23.0.

  • Automatically overwrite an existing database if found on the filesystem. #2229

  • Bump the minimum CodeQL bundle version to 2.12.6. #2232

  • A more relevant log message and a diagnostic are now emitted when the file program is not installed on a Linux runner, but is required for Go tracing to succeed. #2234
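For workflows still carrying the removed Python option, the cleanup recommended in the first item above amounts to deleting the input from the `init` step. A sketch (the surrounding step layout and value are illustrative; the input name comes from the changelog):

```yaml
- uses: github/codeql-action/init@v3
  with:
    languages: python
    # This input is now ignored and can be deleted:
    setup-python-dependencies: false
```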

3.24.10 - 05 Apr 2024

  • Update default CodeQL bundle version to 2.17.0. #2219
  • Add a deprecation warning for customers using CodeQL version 2.12.5 and earlier. These versions of CodeQL were discontinued on 26 March 2024 alongside GitHub Enterprise Server 3.8, and will be unsupported by CodeQL Action versions 3.25.0 and later and versions 2.25.0 and later. #2220
    • If you are using one of these versions, please update to CodeQL CLI version 2.12.6 or later. For instance, if you have specified a custom version of the CLI using the 'tools' input to the 'init' Action, you can remove this input to use the default version.
    • Alternatively, if you want to continue using a version of the CodeQL CLI between 2.11.6 and 2.12.5, you can replace github/codeql-action/*@v3 by github/codeql-action/*@v3.24.10 and github/codeql-action/*@v2 by github/codeql-action/*@v2.24.10 in your code scanning workflow to ensure you continue using this version of the CodeQL Action.
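The pinning described above can be sketched as a one-line change per step (job context illustrative):

```yaml
steps:
  # Moving major tag; always tracks the latest v3 release:
  # - uses: github/codeql-action/analyze@v3
  # Pinned release; keeps working with CodeQL CLI 2.11.6–2.12.5:
  - uses: github/codeql-action/analyze@v3.24.10
```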

3.24.9 - 22 Mar 2024

  • Update default CodeQL bundle version to 2.16.5. #2203

3.24.8 - 18 Mar 2024

  • Improve the ease of debugging extraction issues by increasing the verbosity of the extractor logs when running in debug mode. #2195

3.24.7 - 12 Mar 2024

  • Update default CodeQL bundle version to 2.16.4. #2185

3.24.6 - 29 Feb 2024

No user facing changes.

3.24.5 - 23 Feb 2024

  • Update default CodeQL bundle version to 2.16.3. #2156

3.24.4 - 21 Feb 2024

  • Fix an issue where an existing, but empty, /sys/fs/cgroup/cpuset.cpus file always resulted in a single-threaded run. #2151

3.24.3 - 15 Feb 2024

  • Fix an issue where the CodeQL Action would fail to load a configuration specified by the config input to the init Action. #2147

3.24.2 - 15 Feb 2024

... (truncated)

Commits
  • 88fafeb Update diagnostics export PR check to use 2.12.6
  • 1a60a91 Remove support for CodeQL v2.12.5 and earlier
  • 2f0d0ea Update PR checks
  • 8f1e244 Bump minimum CodeQL Bundle version to 2.12.6
  • 33e3a7c Merge branch 'main' into RasmusWL/remove-python-dep-inst
  • 84efe24 Merge pull request #2231 from github/redsun82/setup-swift-fix
  • See full diff in compare view

Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

Bumps [github/codeql-action](https://github.com/github/codeql-action) from 2 to 3.
- [Release notes](https://github.com/github/codeql-action/releases)
- [Changelog](https://github.com/github/codeql-action/blob/main/CHANGELOG.md)
- [Commits](github/codeql-action@v2...v3)

---
updated-dependencies:
- dependency-name: github/codeql-action
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
@dependabot dependabot bot added dependencies github_actions Pull requests that update GitHub Actions code labels Apr 15, 2024

netlify bot commented Apr 15, 2024

Deploy Preview for localai canceled.

| Name | Link |
|---|---|
| 🔨 Latest commit | 8cf8498 |
| 🔍 Latest deploy log | https://app.netlify.com/sites/localai/deploys/661d7466057ff900089d441e |

@github-actions github-actions bot enabled auto-merge (squash) April 15, 2024 18:50
@github-actions github-actions bot merged commit 320d8a4 into master Apr 15, 2024
50 checks passed
@github-actions github-actions bot deleted the dependabot/github_actions/github/codeql-action-3 branch April 15, 2024 22:02
truecharts-admin added a commit to truecharts/charts that referenced this pull request Apr 27, 2024
…3.0 by renovate (#21421)

This PR contains the following updates:

| Package | Update | Change |
|---|---|---|
| [docker.io/localai/localai](https://togithub.com/mudler/LocalAI) | minor | `v2.12.4-cublas-cuda11-ffmpeg-core` -> `v2.13.0-cublas-cuda11-ffmpeg-core` |
| [docker.io/localai/localai](https://togithub.com/mudler/LocalAI) | minor | `v2.12.4-cublas-cuda11-core` -> `v2.13.0-cublas-cuda11-core` |
| [docker.io/localai/localai](https://togithub.com/mudler/LocalAI) | minor | `v2.12.4-cublas-cuda12-ffmpeg-core` -> `v2.13.0-cublas-cuda12-ffmpeg-core` |
| [docker.io/localai/localai](https://togithub.com/mudler/LocalAI) | minor | `v2.12.4-cublas-cuda12-core` -> `v2.13.0-cublas-cuda12-core` |
| [docker.io/localai/localai](https://togithub.com/mudler/LocalAI) | minor | `v2.12.4-ffmpeg-core` -> `v2.13.0-ffmpeg-core` |
| [docker.io/localai/localai](https://togithub.com/mudler/LocalAI) | minor | `v2.12.4` -> `v2.13.0` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency Dashboard for more information.

---

### Release Notes

<details>
<summary>mudler/LocalAI (docker.io/localai/localai)</summary>

###
[`v2.13.0`](https://togithub.com/mudler/LocalAI/releases/tag/v2.13.0):
🖼️ v2.13.0 - Model gallery edition

[Compare
Source](https://togithub.com/mudler/LocalAI/compare/v2.12.4...v2.13.0)

Hello folks, Ettore here - I'm happy to announce that the v2.13.0 LocalAI release is out, with many features!

Below is a small breakdown of the hottest features introduced in this release - however, there are many other improvements (especially from the community) as well, so don't miss the changelog!

Check out the full changelog below for an overview of all the changes that went into this release (this one is quite packed).

##### 🖼️ Model gallery

This is the first release with a model gallery in the WebUI: you can now see a "Model" button in the WebUI which opens a selection of models:


![output](https://togithub.com/mudler/LocalAI/assets/2420543/7b16676e-d5b1-4c97-89bd-9fa5065c21ad)

You can now choose models among stablediffusion, llama3, tts, embeddings and more! The gallery is growing steadily and is kept up-to-date.

The models are simple YAML files hosted in this repository:
https://github.com/mudler/LocalAI/tree/master/gallery - you can host
your own repository with your own model index, or, if you want, you can
contribute to LocalAI.

If you want to contribute models, you can do so by opening a PR against
the `gallery` directory:
https://github.com/mudler/LocalAI/tree/master/gallery.

##### Rerankers

I'm excited to introduce a new backend for `rerankers`. LocalAI now
implements the Jina API (https://jina.ai/reranker/#apiform) as a
compatibility layer, so you can point existing Jina clients at the
LocalAI address. Under the hood, it uses
https://github.com/AnswerDotAI/rerankers.


![output](https://togithub.com/mudler/LocalAI/assets/2420543/ede67b25-fac4-4833-ae4f-78290e401e60)

You can test this by using container images that include Python (this
does **NOT** work with `core` images) and a model config file like the
following, or by installing `cross-encoder` from the gallery in the UI:

```yaml
name: jina-reranker-v1-base-en
backend: rerankers
parameters:
  model: cross-encoder
```

and test it with:

```bash
curl http://localhost:8080/v1/rerank \
  -H "Content-Type: application/json" \
  -d '{
    "model": "jina-reranker-v1-base-en",
    "query": "Organic skincare products for sensitive skin",
    "documents": [
      "Eco-friendly kitchenware for modern homes",
      "Biodegradable cleaning supplies for eco-conscious consumers",
      "Organic cotton baby clothes for sensitive skin",
      "Natural organic skincare range for sensitive skin",
      "Tech gadgets for smart homes: 2024 edition",
      "Sustainable gardening tools and compost solutions",
      "Sensitive skin-friendly facial cleansers and toners",
      "Organic food wraps and storage solutions",
      "All-natural pet food for dogs with allergies",
      "Yoga mats made from recycled materials"
    ],
    "top_n": 3
  }'
```

##### Parler-tts

A new backend for TTS is now available: `parler-tts`. It is possible to
install and configure the model directly from the gallery:
https://github.com/huggingface/parler-tts
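By analogy with the rerankers example above, a model config for this backend might look like the following sketch. Only the backend name comes from the release notes; the `name` and `model` values are hypothetical, and in practice you would install the model from the gallery instead:

```yaml
name: parler-tts-demo     # hypothetical name
backend: parler-tts       # backend name from the release notes
parameters:
  model: parler-tts-model # hypothetical; prefer installing via the gallery
```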

##### 🎈 Lot of small improvements behind the scenes!

Thanks to our outstanding community, we have enhanced the performance
and stability of LocalAI across various modules. From backend
optimizations to front-end adjustments, every tweak helps make LocalAI
smoother and more robust.

##### 📣 Spread the word!

First off, a massive thank you (again!) to each and every one of you
who've chipped in to squash bugs and suggest cool new features for
LocalAI. Your help, kind words, and brilliant ideas are truly
appreciated - more than words can say!

And to those of you who've been heroes, giving up your own time to help
out fellow users on Discord and in our repo, you're absolutely amazing.
We couldn't have asked for a better community.

Just so you know, LocalAI doesn't have the luxury of big corporate
sponsors behind it. It's all us, folks. So, if you've found value in
what we're building together and want to keep the momentum going,
consider showing your support. A little shoutout on your favorite social
platforms using @&#8203;LocalAI_OSS and @&#8203;mudler_it or joining our
sponsors can make a big difference.

Also, if you haven't yet joined our Discord, come on over! Here's the
link: https://discord.gg/uJAeKSAGDy

Every bit of support, every mention, and every star adds up and helps us
keep this ship sailing. Let's keep making LocalAI awesome together!

Thanks a ton, and here's to more exciting times ahead with LocalAI!

##### What's Changed

##### Bug fixes 🐛

- fix(autogptq): do not use_triton with qwen-vl by
[@&#8203;thiner](https://togithub.com/thiner) in
[mudler/LocalAI#1985
- fix: respect concurrency from parent build parameters when building
GRPC by [@&#8203;cryptk](https://togithub.com/cryptk) in
[mudler/LocalAI#2023
- ci: fix release pipeline missing dependencies by
[@&#8203;mudler](https://togithub.com/mudler) in
[mudler/LocalAI#2025
- fix: remove build path from help text documentation by
[@&#8203;cryptk](https://togithub.com/cryptk) in
[mudler/LocalAI#2037
- fix: previous CLI rework broke debug logging by
[@&#8203;cryptk](https://togithub.com/cryptk) in
[mudler/LocalAI#2036
- fix(fncall): fix regression introduced in
[#&#8203;1963](https://togithub.com/mudler/LocalAI/issues/1963) by
[@&#8203;mudler](https://togithub.com/mudler) in
[mudler/LocalAI#2048
- fix: adjust some sources names to match the naming of their
repositories by [@&#8203;cryptk](https://togithub.com/cryptk) in
[mudler/LocalAI#2061
- fix: move the GRPC cache generation workflow into it's own concurrency
group by [@&#8203;cryptk](https://togithub.com/cryptk) in
[mudler/LocalAI#2071
- fix(llama.cpp): set -1 as default for max tokens by
[@&#8203;mudler](https://togithub.com/mudler) in
[mudler/LocalAI#2087
- fix(llama.cpp-ggml): fixup `max_tokens` for old backend by
[@&#8203;mudler](https://togithub.com/mudler) in
[mudler/LocalAI#2094
- fix missing TrustRemoteCode in OpenVINO model load by
[@&#8203;fakezeta](https://togithub.com/fakezeta) in
[mudler/LocalAI#2114
- Incl ocv pkg for diffsusers utils by
[@&#8203;jtwolfe](https://togithub.com/jtwolfe) in
[mudler/LocalAI#2115

##### Exciting New Features 🎉

- feat: kong cli refactor fixes
[#&#8203;1955](https://togithub.com/mudler/LocalAI/issues/1955) by
[@&#8203;cryptk](https://togithub.com/cryptk) in
[mudler/LocalAI#1974
- feat: add flash-attn in nvidia and rocm envs by
[@&#8203;golgeek](https://togithub.com/golgeek) in
[mudler/LocalAI#1995
- feat: use tokenizer.apply_chat_template() in vLLM by
[@&#8203;golgeek](https://togithub.com/golgeek) in
[mudler/LocalAI#1990
- feat(gallery): support ConfigURLs by
[@&#8203;mudler](https://togithub.com/mudler) in
[mudler/LocalAI#2012
- fix: dont commit generated files to git by
[@&#8203;cryptk](https://togithub.com/cryptk) in
[mudler/LocalAI#1993
- feat(parler-tts): Add new backend by
[@&#8203;mudler](https://togithub.com/mudler) in
[mudler/LocalAI#2027
- feat(grpc): return consumed token count and update response
accordingly by [@&#8203;mudler](https://togithub.com/mudler) in
[mudler/LocalAI#2035
- feat(store): add Golang client by
[@&#8203;mudler](https://togithub.com/mudler) in
[mudler/LocalAI#1977
- feat(functions): support models with no grammar, add tests by
[@&#8203;mudler](https://togithub.com/mudler) in
[mudler/LocalAI#2068
- refactor(template): isolate and add tests by
[@&#8203;mudler](https://togithub.com/mudler) in
[mudler/LocalAI#2069
- feat: fiber logs with zerlog and add trace level by
[@&#8203;cryptk](https://togithub.com/cryptk) in
[mudler/LocalAI#2082
- models(gallery): add gallery by
[@&#8203;mudler](https://togithub.com/mudler) in
[mudler/LocalAI#2078
- Add tensor_parallel_size setting to vllm setting items by
[@&#8203;Taikono-Himazin](https://togithub.com/Taikono-Himazin) in
[mudler/LocalAI#2085
- Transformer Backend: Implementing use_tokenizer_template and
stop_prompts options by
[@&#8203;fakezeta](https://togithub.com/fakezeta) in
[mudler/LocalAI#2090
- feat: Galleries UI by [@&#8203;mudler](https://togithub.com/mudler) in
[mudler/LocalAI#2104
- Transformers Backend: max_tokens adherence to OpenAI API by
[@&#8203;fakezeta](https://togithub.com/fakezeta) in
[mudler/LocalAI#2108
- Fix cleanup sonarqube findings by
[@&#8203;cryptk](https://togithub.com/cryptk) in
[mudler/LocalAI#2106
- feat(models-ui): minor visual enhancements by
[@&#8203;mudler](https://togithub.com/mudler) in
[mudler/LocalAI#2109
- fix(gallery): show a fake image if no there is no icon by
[@&#8203;mudler](https://togithub.com/mudler) in
[mudler/LocalAI#2111
- feat(rerankers): Add new backend, support jina rerankers API by
[@&#8203;mudler](https://togithub.com/mudler) in
[mudler/LocalAI#2121

##### 🧠 Models

- models(llama3): add llama3 to embedded models by
[@&#8203;mudler](https://togithub.com/mudler) in
[mudler/LocalAI#2074
- feat(gallery): add llama3, hermes, phi-3, and others by
[@&#8203;mudler](https://togithub.com/mudler) in
[mudler/LocalAI#2110
- models(gallery): add new models to the gallery by
[@&#8203;mudler](https://togithub.com/mudler) in
[mudler/LocalAI#2124
- models(gallery): add more models by
[@&#8203;mudler](https://togithub.com/mudler) in
[mudler/LocalAI#2129

##### 📖 Documentation and examples

- ⬆️ Update docs version mudler/LocalAI by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[mudler/LocalAI#1988
- docs: fix stores link by
[@&#8203;adrienbrault](https://togithub.com/adrienbrault) in
[mudler/LocalAI#2044
- AMD/ROCm Documentation update + formatting fix by
[@&#8203;jtwolfe](https://togithub.com/jtwolfe) in
[mudler/LocalAI#2100

##### 👒 Dependencies

- deps: Update version of vLLM to add support of Cohere Command_R model
in vLLM inference by
[@&#8203;holyCowMp3](https://togithub.com/holyCowMp3) in
[mudler/LocalAI#1975
- ⬆️ Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[mudler/LocalAI#1991
- build(deps): bump google.golang.org/protobuf from 1.31.0 to 1.33.0 by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[mudler/LocalAI#1998
- build(deps): bump github.com/docker/docker from 20.10.7+incompatible
to 24.0.9+incompatible by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[mudler/LocalAI#1999
- build(deps): bump github.com/gofiber/fiber/v2 from 2.52.0 to 2.52.1 by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[mudler/LocalAI#2001
- build(deps): bump actions/checkout from 3 to 4 by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[mudler/LocalAI#2002
- build(deps): bump actions/setup-go from 4 to 5 by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[mudler/LocalAI#2003
- build(deps): bump peter-evans/create-pull-request from 5 to 6 by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[mudler/LocalAI#2005
- build(deps): bump actions/cache from 3 to 4 by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[mudler/LocalAI#2006
- build(deps): bump actions/upload-artifact from 3 to 4 by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[mudler/LocalAI#2007
- build(deps): bump github.com/charmbracelet/glamour from 0.6.0 to 0.7.0
by [@&#8203;dependabot](https://togithub.com/dependabot) in
[mudler/LocalAI#2004
- build(deps): bump github.com/gofiber/fiber/v2 from 2.52.0 to 2.52.4 by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[mudler/LocalAI#2008
- build(deps): bump github.com/opencontainers/runc from 1.1.5 to 1.1.12
by [@&#8203;dependabot](https://togithub.com/dependabot) in
[mudler/LocalAI#2000
- ⬆️ Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[mudler/LocalAI#2014
- build(deps): bump the pip group across 4 directories with 8 updates by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[mudler/LocalAI#2017
- build(deps): bump follow-redirects from 1.15.2 to 1.15.6 in
/examples/langchain/langchainjs-localai-example by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[mudler/LocalAI#2020
- ⬆️ Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[mudler/LocalAI#2024
- build(deps): bump softprops/action-gh-release from 1 to 2 by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[mudler/LocalAI#2039
- build(deps): bump dependabot/fetch-metadata from 1.3.4 to 2.0.0 by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[mudler/LocalAI#2040
- build(deps): bump github/codeql-action from 2 to 3 by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[mudler/LocalAI#2041
- ⬆️ Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[mudler/LocalAI#2043
- ⬆️ Update ggerganov/whisper.cpp by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[mudler/LocalAI#2042
- build(deps): bump the pip group across 4 directories with 8 updates by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[mudler/LocalAI#2049
- ⬆️ Update ggerganov/whisper.cpp by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[mudler/LocalAI#2050
- ⬆️ Update ggerganov/whisper.cpp by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[mudler/LocalAI#2060
- build(deps): bump aiohttp from 3.9.2 to 3.9.4 in
/examples/langchain/langchainpy-localai-example in the pip group across
1 directory by [@&#8203;dependabot](https://togithub.com/dependabot) in
[mudler/LocalAI#2067
- ⬆️ Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[mudler/LocalAI#2089
- deps(llama.cpp): update, use better model for function call tests by
[@&#8203;mudler](https://togithub.com/mudler) in
[mudler/LocalAI#2119
- ⬆️ Update ggerganov/whisper.cpp by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[mudler/LocalAI#2122
- ⬆️ Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[mudler/LocalAI#2123
- build(deps): bump pydantic from 1.10.7 to 1.10.13 in
/examples/langchain/langchainpy-localai-example in the pip group across
1 directory by [@&#8203;dependabot](https://togithub.com/dependabot) in
[mudler/LocalAI#2125
- feat(swagger): update swagger by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[mudler/LocalAI#2128

##### Other Changes

- ci: try to build on macos14 by
[@&#8203;mudler](https://togithub.com/mudler) in
[mudler/LocalAI#2011
- ⬆️ Update docs version mudler/LocalAI by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[mudler/LocalAI#2013
- refactor: backend/service split, channel-based llm flow by
[@&#8203;dave-gray101](https://togithub.com/dave-gray101) in
[mudler/LocalAI#1963
- ⬆️ Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[mudler/LocalAI#2028
- fix - correct checkout versions by
[@&#8203;dave-gray101](https://togithub.com/dave-gray101) in
[mudler/LocalAI#2029
- Revert "build(deps): bump the pip group across 4 directories with 8
updates" by [@&#8203;mudler](https://togithub.com/mudler) in
[mudler/LocalAI#2030
- ⬆️ Update docs version mudler/LocalAI by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[mudler/LocalAI#2032
- ⬆️ Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[mudler/LocalAI#2033
- fix: action-tmate back to upstream, dead code removal by
[@&#8203;dave-gray101](https://togithub.com/dave-gray101) in
[mudler/LocalAI#2038
- Revert [#&#8203;1963](https://togithub.com/mudler/LocalAI/issues/1963)
by [@&#8203;mudler](https://togithub.com/mudler) in
[mudler/LocalAI#2056
- feat: refactor the dynamic json configs for api_keys and
external_backends by [@&#8203;cryptk](https://togithub.com/cryptk) in
[mudler/LocalAI#2055
- tests: add template tests by
[@&#8203;mudler](https://togithub.com/mudler) in
[mudler/LocalAI#2063
- feat: better control of GRPC docker cache by
[@&#8203;cryptk](https://togithub.com/cryptk) in
[mudler/LocalAI#2070
- ⬆️ Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[mudler/LocalAI#2051
- ⬆️ Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[mudler/LocalAI#2080
- feat: enable polling configs for systems with broken fsnotify (docker
volumes on windows) by [@&#8203;cryptk](https://togithub.com/cryptk) in
[mudler/LocalAI#2081
- fix: action-tmate: use connect-timeout-sections and
limit-access-to-actor by
[@&#8203;dave-gray101](https://togithub.com/dave-gray101) in
[mudler/LocalAI#2083
- refactor(routes): split routes registration by
[@&#8203;mudler](https://togithub.com/mudler) in
[mudler/LocalAI#2077
- fix: action-tmate detached by
[@&#8203;dave-gray101](https://togithub.com/dave-gray101) in
[mudler/LocalAI#2092
- fix: rename fiber entrypoint from http/api to http/app by
[@&#8203;mudler](https://togithub.com/mudler) in
[mudler/LocalAI#2096
- fix: typo in models.go by
[@&#8203;eltociear](https://togithub.com/eltociear) in
[mudler/LocalAI#2099
- Update text-generation.md by
[@&#8203;Taikono-Himazin](https://togithub.com/Taikono-Himazin) in
[mudler/LocalAI#2095
- ⬆️ Update docs version mudler/LocalAI by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[mudler/LocalAI#2105
- ⬆️ Update docs version mudler/LocalAI by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[mudler/LocalAI#2113

##### New Contributors

- [@&#8203;holyCowMp3](https://togithub.com/holyCowMp3) made their first
contribution in
[mudler/LocalAI#1975
- [@&#8203;dependabot](https://togithub.com/dependabot) made their first
contribution in
[mudler/LocalAI#1998
- [@&#8203;adrienbrault](https://togithub.com/adrienbrault) made their
first contribution in
[mudler/LocalAI#2044
- [@&#8203;Taikono-Himazin](https://togithub.com/Taikono-Himazin) made
their first contribution in
[mudler/LocalAI#2085
- [@&#8203;eltociear](https://togithub.com/eltociear) made their first
contribution in
[mudler/LocalAI#2099
- [@&#8203;jtwolfe](https://togithub.com/jtwolfe) made their first
contribution in
[mudler/LocalAI#2100

**Full Changelog**:
mudler/LocalAI@v2.12.4...v2.13.0

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Enabled.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about these
updates again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR has been generated by [Renovate
Bot](https://togithub.com/renovatebot/renovate).
