
ci(arm64): fix gRPC build by adding googletest to CMakefile #2754

Merged
merged 1 commit into from
Jul 9, 2024

Conversation

mudler
Owner

@mudler mudler commented Jul 9, 2024

Description

This PR fixes:

Run git clone --recurse-submodules -b v1.64.0 --depth 1 --shallow-submodules https://github.com/grpc/grpc && \
Cloning into 'grpc'...
Note: switching to 'b8a04acbbf18fd1c805e5d53d62ed9fa4721a4d1'.

You are in 'detached HEAD' state. You can look around, make experimental
changes and commit them, and you can discard any commits you make in this
state without impacting any branches by switching back to a branch.

If you want to create a new branch to retain commits you create, you may
do so (now or later) by using -c with the switch command. Example:

  git switch -c <new-branch-name>

Or undo this operation with:

  git switch -

Turn off this advice by setting config variable advice.detachedHead to false

Submodule 'third_party/abseil-cpp' (https://github.com/abseil/abseil-cpp.git) registered for path 'third_party/abseil-cpp'
Submodule 'third_party/benchmark' (https://github.com/google/benchmark) registered for path 'third_party/benchmark'
Submodule 'third_party/bloaty' (https://github.com/google/bloaty.git) registered for path 'third_party/bloaty'
Submodule 'third_party/boringssl-with-bazel' (https://github.com/google/boringssl.git) registered for path 'third_party/boringssl-with-bazel'
Submodule 'third_party/cares/cares' (https://github.com/c-ares/c-ares.git) registered for path 'third_party/cares/cares'
Submodule 'third_party/envoy-api' (https://github.com/envoyproxy/data-plane-api.git) registered for path 'third_party/envoy-api'
Submodule 'third_party/googleapis' (https://github.com/googleapis/googleapis.git) registered for path 'third_party/googleapis'
Submodule 'third_party/googletest' (https://github.com/google/googletest.git) registered for path 'third_party/googletest'
Submodule 'third_party/opencensus-proto' (https://github.com/census-instrumentation/opencensus-proto.git) registered for path 'third_party/opencensus-proto'
Submodule 'third_party/opentelemetry' (https://github.com/open-telemetry/opentelemetry-proto.git) registered for path 'third_party/opentelemetry'
Submodule 'third_party/opentelemetry-cpp' (https://github.com/open-telemetry/opentelemetry-cpp) registered for path 'third_party/opentelemetry-cpp'
Submodule 'third_party/protobuf' (https://github.com/protocolbuffers/protobuf.git) registered for path 'third_party/protobuf'
Submodule 'third_party/protoc-gen-validate' (https://github.com/envoyproxy/protoc-gen-validate.git) registered for path 'third_party/protoc-gen-validate'
Submodule 'third_party/re2' (https://github.com/google/re2.git) registered for path 'third_party/re2'
Submodule 'third_party/xds' (https://github.com/cncf/xds.git) registered for path 'third_party/xds'
Submodule 'third_party/zlib' (https://github.com/madler/zlib) registered for path 'third_party/zlib'
Cloning into '/home/runner/work/LocalAI/LocalAI/grpc/third_party/abseil-cpp'...
Cloning into '/home/runner/work/LocalAI/LocalAI/grpc/third_party/benchmark'...
Cloning into '/home/runner/work/LocalAI/LocalAI/grpc/third_party/bloaty'...
Cloning into '/home/runner/work/LocalAI/LocalAI/grpc/third_party/boringssl-with-bazel'...
Cloning into '/home/runner/work/LocalAI/LocalAI/grpc/third_party/cares/cares'...
Cloning into '/home/runner/work/LocalAI/LocalAI/grpc/third_party/envoy-api'...
Cloning into '/home/runner/work/LocalAI/LocalAI/grpc/third_party/googleapis'...
Cloning into '/home/runner/work/LocalAI/LocalAI/grpc/third_party/googletest'...
Cloning into '/home/runner/work/LocalAI/LocalAI/grpc/third_party/opencensus-proto'...
Cloning into '/home/runner/work/LocalAI/LocalAI/grpc/third_party/opentelemetry'...
Cloning into '/home/runner/work/LocalAI/LocalAI/grpc/third_party/opentelemetry-cpp'...
Cloning into '/home/runner/work/LocalAI/LocalAI/grpc/third_party/protobuf'...
Cloning into '/home/runner/work/LocalAI/LocalAI/grpc/third_party/protoc-gen-validate'...
Cloning into '/home/runner/work/LocalAI/LocalAI/grpc/third_party/re2'...
Cloning into '/home/runner/work/LocalAI/LocalAI/grpc/third_party/xds'...
Cloning into '/home/runner/work/LocalAI/LocalAI/grpc/third_party/zlib'...
From https://github.com/abseil/abseil-cpp
 * branch            4a2c63365eff8823a5221db86ef490e828306f9d -> FETCH_HEAD
Submodule path 'third_party/abseil-cpp': checked out '4a2c63365eff8823a5221db86ef490e828306f9d'
From https://github.com/google/benchmark
 * branch            344117638c8ff7e239044fd0fa7085839fc03021 -> FETCH_HEAD
Submodule path 'third_party/benchmark': checked out '344117638c8ff7e239044fd0fa7085839fc03021'
From https://github.com/google/bloaty
 * branch            60209eb1ccc34d5deefb002d1b7f37545204f7f2 -> FETCH_HEAD
Submodule path 'third_party/bloaty': checked out '60209eb1ccc34d5deefb002d1b7f37545204f7f2'
Submodule 'third_party/abseil-cpp' (https://github.com/abseil/abseil-cpp.git) registered for path 'third_party/bloaty/third_party/abseil-cpp'
Submodule 'third_party/capstone' (https://github.com/aquynh/capstone.git) registered for path 'third_party/bloaty/third_party/capstone'
Submodule 'third_party/demumble' (https://github.com/nico/demumble.git) registered for path 'third_party/bloaty/third_party/demumble'
Submodule 'third_party/googletest' (https://github.com/google/googletest.git) registered for path 'third_party/bloaty/third_party/googletest'
Submodule 'third_party/protobuf' (https://github.com/protocolbuffers/protobuf.git) registered for path 'third_party/bloaty/third_party/protobuf'
Submodule 'third_party/re2' (https://github.com/google/re2) registered for path 'third_party/bloaty/third_party/re2'
Submodule 'third_party/zlib' (https://github.com/madler/zlib) registered for path 'third_party/bloaty/third_party/zlib'
Cloning into '/home/runner/work/LocalAI/LocalAI/grpc/third_party/bloaty/third_party/abseil-cpp'...
Cloning into '/home/runner/work/LocalAI/LocalAI/grpc/third_party/bloaty/third_party/capstone'...
Cloning into '/home/runner/work/LocalAI/LocalAI/grpc/third_party/bloaty/third_party/demumble'...
Cloning into '/home/runner/work/LocalAI/LocalAI/grpc/third_party/bloaty/third_party/googletest'...
Cloning into '/home/runner/work/LocalAI/LocalAI/grpc/third_party/bloaty/third_party/protobuf'...
Cloning into '/home/runner/work/LocalAI/LocalAI/grpc/third_party/bloaty/third_party/re2'...
Cloning into '/home/runner/work/LocalAI/LocalAI/grpc/third_party/bloaty/third_party/zlib'...
From https://github.com/abseil/abseil-cpp
 * branch            5dd240724366295970c613ed23d0092bcf392f18 -> FETCH_HEAD
Submodule path 'third_party/bloaty/third_party/abseil-cpp': checked out '5dd240724366295970c613ed23d0092bcf392f18'
From https://github.com/aquynh/capstone
 * branch            852f46a467cb37559a1f3a18bd45d5ca8c6fc5e7 -> FETCH_HEAD
Submodule path 'third_party/bloaty/third_party/capstone': checked out '852f46a467cb37559a1f3a18bd45d5ca8c6fc5e7'
From https://github.com/nico/demumble
 * branch            01098eab821b33bd31b9778aea38565cd796aa85 -> FETCH_HEAD
Submodule path 'third_party/bloaty/third_party/demumble': checked out '01098eab821b33bd31b9778aea38565cd796aa85'
From https://github.com/google/googletest
 * branch            565f1b848215b77c3732bca345fe76a0431d8b34 -> FETCH_HEAD
Submodule path 'third_party/bloaty/third_party/googletest': checked out '565f1b848215b77c3732bca345fe76a0431d8b34'
From https://github.com/protocolbuffers/protobuf
 * branch            bc1773c42c9c3c522145a3119e989e0dff2a8d54 -> FETCH_HEAD
Submodule path 'third_party/bloaty/third_party/protobuf': checked out 'bc1773c42c9c3c522145a3119e989e0dff2a8d54'
Submodule 'third_party/benchmark' (https://github.com/google/benchmark.git) registered for path 'third_party/bloaty/third_party/protobuf/third_party/benchmark'
Submodule 'third_party/googletest' (https://github.com/google/googletest.git) registered for path 'third_party/bloaty/third_party/protobuf/third_party/googletest'
Cloning into '/home/runner/work/LocalAI/LocalAI/grpc/third_party/bloaty/third_party/protobuf/third_party/benchmark'...
Cloning into '/home/runner/work/LocalAI/LocalAI/grpc/third_party/bloaty/third_party/protobuf/third_party/googletest'...
From https://github.com/google/benchmark
 * branch            5b7683f49e1e9223cf9927b24f6fd3d6bd82e3f8 -> FETCH_HEAD
Submodule path 'third_party/bloaty/third_party/protobuf/third_party/benchmark': checked out '5b7683f49e1e9223cf9927b24f6fd3d6bd82e3f8'
From https://github.com/google/googletest
 * branch            5ec7f0c4a113e2f18ac2c6cc7df51ad6afc24081 -> FETCH_HEAD
Submodule path 'third_party/bloaty/third_party/protobuf/third_party/googletest': checked out '5ec7f0c4a113e2f18ac2c6cc7df51ad6afc24081'
From https://github.com/google/re2
 * branch            5bd613749fd530b576b890283bfb6bc6ea6246cb -> FETCH_HEAD
Submodule path 'third_party/bloaty/third_party/re2': checked out '5bd613749fd530b576b890283bfb6bc6ea6246cb'
From https://github.com/madler/zlib
 * branch            cacf7f1d4e3d44d871b605da3b647f07d718623f -> FETCH_HEAD
Submodule path 'third_party/bloaty/third_party/zlib': checked out 'cacf7f1d4e3d44d871b605da3b647f07d718623f'
From https://github.com/google/boringssl
 * branch            5a2bca2124800f2861263959b72bc35cdf18949b -> FETCH_HEAD
Submodule path 'third_party/boringssl-with-bazel': checked out '5a2bca2124800f2861263959b72bc35cdf18949b'
From https://github.com/c-ares/c-ares
 * branch            6360e96b5cf8e5980c887ce58ef727e53d77243a -> FETCH_HEAD
Submodule path 'third_party/cares/cares': checked out '6360e96b5cf8e5980c887ce58ef727e53d77243a'
From https://github.com/envoyproxy/data-plane-api
 * branch            78f198cf96ecdc7120ef640406770aa01af775c4 -> FETCH_HEAD
Submodule path 'third_party/envoy-api': checked out '78f198cf96ecdc7120ef640406770aa01af775c4'
From https://github.com/googleapis/googleapis
 * branch            2f9af297c84c55c8b871ba4495e01ade42476c92 -> FETCH_HEAD
Submodule path 'third_party/googleapis': checked out '2f9af297c84c55c8b871ba4495e01ade42476c92'
From https://github.com/google/googletest
 * branch            2dd1c131950043a8ad5ab0d2dda0e0970596586a -> FETCH_HEAD
Submodule path 'third_party/googletest': checked out '2dd1c131950043a8ad5ab0d2dda0e0970596586a'
From https://github.com/census-instrumentation/opencensus-proto
 * branch            4aa53e15cbf1a47bc9087e6cfdca214c1eea4e89 -> FETCH_HEAD
Submodule path 'third_party/opencensus-proto': checked out '4aa53e15cbf1a47bc9087e6cfdca214c1eea4e89'
From https://github.com/open-telemetry/opentelemetry-proto
 * branch            60fa8754d890b5c55949a8c68dcfd7ab5c2395df -> FETCH_HEAD
Submodule path 'third_party/opentelemetry': checked out '60fa8754d890b5c55949a8c68dcfd7ab5c2395df'
From https://github.com/open-telemetry/opentelemetry-cpp
 * branch            4bd64c9a336fd438d6c4c9dad2e6b61b0585311f -> FETCH_HEAD
Submodule path 'third_party/opentelemetry-cpp': checked out '4bd64c9a336fd438d6c4c9dad2e6b61b0585311f'
Submodule 'third_party/benchmark' (https://github.com/google/benchmark) registered for path 'third_party/opentelemetry-cpp/third_party/benchmark'
Submodule 'third_party/googletest' (https://github.com/google/googletest) registered for path 'third_party/opentelemetry-cpp/third_party/googletest'
Submodule 'third_party/ms-gsl' (https://github.com/microsoft/GSL) registered for path 'third_party/opentelemetry-cpp/third_party/ms-gsl'
Submodule 'third_party/nlohmann-json' (https://github.com/nlohmann/json) registered for path 'third_party/opentelemetry-cpp/third_party/nlohmann-json'
Submodule 'third_party/opentelemetry-proto' (https://github.com/open-telemetry/opentelemetry-proto) registered for path 'third_party/opentelemetry-cpp/third_party/opentelemetry-proto'
Submodule 'third_party/opentracing-cpp' (https://github.com/opentracing/opentracing-cpp.git) registered for path 'third_party/opentelemetry-cpp/third_party/opentracing-cpp'
Submodule 'third_party/prometheus-cpp' (https://github.com/jupp0r/prometheus-cpp) registered for path 'third_party/opentelemetry-cpp/third_party/prometheus-cpp'
Submodule 'tools/vcpkg' (https://github.com/Microsoft/vcpkg) registered for path 'third_party/opentelemetry-cpp/tools/vcpkg'
Cloning into '/home/runner/work/LocalAI/LocalAI/grpc/third_party/opentelemetry-cpp/third_party/benchmark'...
Cloning into '/home/runner/work/LocalAI/LocalAI/grpc/third_party/opentelemetry-cpp/third_party/googletest'...
Cloning into '/home/runner/work/LocalAI/LocalAI/grpc/third_party/opentelemetry-cpp/third_party/ms-gsl'...
Cloning into '/home/runner/work/LocalAI/LocalAI/grpc/third_party/opentelemetry-cpp/third_party/nlohmann-json'...
Cloning into '/home/runner/work/LocalAI/LocalAI/grpc/third_party/opentelemetry-cpp/third_party/opentelemetry-proto'...
Cloning into '/home/runner/work/LocalAI/LocalAI/grpc/third_party/opentelemetry-cpp/third_party/opentracing-cpp'...
Cloning into '/home/runner/work/LocalAI/LocalAI/grpc/third_party/opentelemetry-cpp/third_party/prometheus-cpp'...
Cloning into '/home/runner/work/LocalAI/LocalAI/grpc/third_party/opentelemetry-cpp/tools/vcpkg'...
From https://github.com/google/benchmark
 * branch            d572f4777349d43653b21d6c2fc63020ab326db2 -> FETCH_HEAD
Submodule path 'third_party/opentelemetry-cpp/third_party/benchmark': checked out 'd572f4777349d43653b21d6c2fc63020ab326db2'
From https://github.com/google/googletest
 * branch            b796f7d44681514f58a683a3a71ff17c94edb0c1 -> FETCH_HEAD
Submodule path 'third_party/opentelemetry-cpp/third_party/googletest': checked out 'b796f7d44681514f58a683a3a71ff17c94edb0c1'
From https://github.com/microsoft/GSL
 * branch            6f4529395c5b7c2d661812257cd6780c67e54afa -> FETCH_HEAD
Submodule path 'third_party/opentelemetry-cpp/third_party/ms-gsl': checked out '6f4529395c5b7c2d661812257cd6780c67e54afa'
From https://github.com/nlohmann/json
 * branch            bc889afb4c5bf1c0d8ee29ef35eaaf4c8bef8a5d -> FETCH_HEAD
Submodule path 'third_party/opentelemetry-cpp/third_party/nlohmann-json': checked out 'bc889afb4c5bf1c0d8ee29ef35eaaf4c8bef8a5d'
From https://github.com/open-telemetry/opentelemetry-proto
 * branch            c4dfbc51f3cd4089778555a2ac5d9bc093ed2956 -> FETCH_HEAD
Submodule path 'third_party/opentelemetry-cpp/third_party/opentelemetry-proto': checked out 'c4dfbc51f3cd4089778555a2ac5d9bc093ed2956'
From https://github.com/opentracing/opentracing-cpp
 * branch            06b57f48ded1fa3bdd3d4346f6ef29e40e08eaf5 -> FETCH_HEAD
Submodule path 'third_party/opentelemetry-cpp/third_party/opentracing-cpp': checked out '06b57f48ded1fa3bdd3d4346f6ef29e40e08eaf5'
From https://github.com/jupp0r/prometheus-cpp
 * branch            c9ffcdda9086ffd9e1283ea7a0276d831f3c8a8d -> FETCH_HEAD
Submodule path 'third_party/opentelemetry-cpp/third_party/prometheus-cpp': checked out 'c9ffcdda9086ffd9e1283ea7a0276d831f3c8a8d'
Submodule 'civetweb' (https://github.com/civetweb/civetweb.git) registered for path 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/civetweb'
Submodule 'googletest' (https://github.com/google/googletest.git) registered for path 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/googletest'
Cloning into '/home/runner/work/LocalAI/LocalAI/grpc/third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/civetweb'...
Cloning into '/home/runner/work/LocalAI/LocalAI/grpc/third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/googletest'...
From https://github.com/civetweb/civetweb
 * branch            eefb26f82b233268fc98577d265352720d477ba4 -> FETCH_HEAD
Submodule path 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/civetweb': checked out 'eefb26f82b233268fc98577d265352720d477ba4'
From https://github.com/google/googletest
 * branch            e2239ee6043f73722e7aa812a459f54a28552929 -> FETCH_HEAD
Submodule path 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/googletest': checked out 'e2239ee6043f73722e7aa812a459f54a28552929'
From https://github.com/Microsoft/vcpkg
 * branch              8eb57355a4ffb410a2e94c07b4dca2dffbee8e50 -> FETCH_HEAD
Submodule path 'third_party/opentelemetry-cpp/tools/vcpkg': checked out '8eb57355a4ffb410a2e94c07b4dca2dffbee8e50'
From https://github.com/protocolbuffers/protobuf
 * branch            2434ef2adf0c74149b9d547ac5fb545a1ff8b6b5 -> FETCH_HEAD
Submodule path 'third_party/protobuf': checked out '2434ef2adf0c74149b9d547ac5fb545a1ff8b6b5'
Submodule 'third_party/abseil-cpp' (https://github.com/abseil/abseil-cpp.git) registered for path 'third_party/protobuf/third_party/abseil-cpp'
Submodule 'third_party/googletest' (https://github.com/google/googletest.git) registered for path 'third_party/protobuf/third_party/googletest'
Submodule 'third_party/jsoncpp' (https://github.com/open-source-parsers/jsoncpp.git) registered for path 'third_party/protobuf/third_party/jsoncpp'
Cloning into '/home/runner/work/LocalAI/LocalAI/grpc/third_party/protobuf/third_party/abseil-cpp'...
Cloning into '/home/runner/work/LocalAI/LocalAI/grpc/third_party/protobuf/third_party/googletest'...
Cloning into '/home/runner/work/LocalAI/LocalAI/grpc/third_party/protobuf/third_party/jsoncpp'...
From https://github.com/abseil/abseil-cpp
 * branch            4a2c63365eff8823a5221db86ef490e828306f9d -> FETCH_HEAD
Submodule path 'third_party/protobuf/third_party/abseil-cpp': checked out '4a2c63365eff8823a5221db86ef490e828306f9d'
From https://github.com/google/googletest
 * branch            4c9a3bb62bf3ba1f1010bf96f9c8ed767b363774 -> FETCH_HEAD
Submodule path 'third_party/protobuf/third_party/googletest': checked out '4c9a3bb62bf3ba1f1010bf96f9c8ed767b363774'
From https://github.com/open-source-parsers/jsoncpp
 * branch            9059f5cad030ba11d37818847443a53918c327b1 -> FETCH_HEAD
Submodule path 'third_party/protobuf/third_party/jsoncpp': checked out '9059f5cad030ba11d37818847443a53918c327b1'
From https://github.com/envoyproxy/protoc-gen-validate
 * branch            fab737efbb4b4d03e7c771393708f75594b121e4 -> FETCH_HEAD
Submodule path 'third_party/protoc-gen-validate': checked out 'fab737efbb4b4d03e7c771393708f75594b121e4'
From https://github.com/google/re2
 * branch            0c5616df9c0aaa44c9440d87422012423d91c7d1 -> FETCH_HEAD
Submodule path 'third_party/re2': checked out '0c5616df9c0aaa44c9440d87422012423d91c7d1'
From https://github.com/cncf/xds
 * branch            3a472e524827f72d1ad621c4983dd5af54c46776 -> FETCH_HEAD
Submodule path 'third_party/xds': checked out '3a472e524827f72d1ad621c4983dd5af54c46776'
From https://github.com/madler/zlib
 * branch            09155eaa2f9270dc4ed1fa13e2b4b2613e6e4851 -> FETCH_HEAD
Submodule path 'third_party/zlib': checked out '09155eaa2f9270dc4ed1fa13e2b4b2613e6e4851'
-- The C compiler identification is GNU 11.4.0
-- The CXX compiler identification is GNU 11.4.0
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: /usr/bin/cc - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: /usr/bin/c++ - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Success
-- Found Threads: TRUE
-- Performing Test HAVE_STDC_FORMAT_MACROS
-- Performing Test HAVE_STDC_FORMAT_MACROS - Success
CMake Warning at third_party/abseil-cpp/CMakeLists.txt:82 (message):
  A future Abseil release will default ABSL_PROPAGATE_CXX_STD to ON for CMake
  3.8 and up.  We recommend enabling this option to ensure your project still
  builds correctly.


-- Performing Test ABSL_INTERNAL_AT_LEAST_CXX17
-- Performing Test ABSL_INTERNAL_AT_LEAST_CXX17 - Success
-- Performing Test ABSL_INTERNAL_AT_LEAST_CXX20
-- Performing Test ABSL_INTERNAL_AT_LEAST_CXX20 - Failed
CMake Deprecation Warning at third_party/cares/cares/CMakeLists.txt:1 (CMAKE_MINIMUM_REQUIRED):
  Compatibility with CMake < 3.5 will be removed from a future version of
  CMake.

  Update the VERSION argument <min> value or use a ...<max> suffix to tell
  CMake that the project does not need compatibility with older versions.


-- Looking for res_servicename
-- Looking for res_servicename - not found
-- Looking for res_servicename in resolv
-- Looking for res_servicename in resolv - not found
-- Looking for gethostbyname
-- Looking for gethostbyname - found
-- Looking for socket
-- Looking for socket - found
-- Looking for clock_gettime
-- Looking for clock_gettime - found
-- Looking for include file sys/types.h
-- Looking for include file sys/types.h - found
-- Looking for include file sys/socket.h
-- Looking for include file sys/socket.h - found
-- Looking for include file sys/sockio.h
-- Looking for include file sys/sockio.h - not found
-- Looking for include file arpa/inet.h
-- Looking for include file arpa/inet.h - found
-- Looking for include file arpa/nameser_compat.h
-- Looking for include file arpa/nameser_compat.h - found
-- Looking for include file arpa/nameser.h
-- Looking for include file arpa/nameser.h - found
-- Looking for include file assert.h
-- Looking for include file assert.h - found
-- Looking for include file errno.h
-- Looking for include file errno.h - found
-- Looking for include file fcntl.h
-- Looking for include file fcntl.h - found
-- Looking for include file inttypes.h
-- Looking for include file inttypes.h - found
-- Looking for include file limits.h
-- Looking for include file limits.h - found
-- Looking for include file malloc.h
-- Looking for include file malloc.h - found
-- Looking for include file memory.h
-- Looking for include file memory.h - found
-- Looking for include file netdb.h
-- Looking for include file netdb.h - found
-- Looking for include file netinet/in.h
-- Looking for include file netinet/in.h - found
-- Looking for include file net/if.h
-- Looking for include file net/if.h - found
-- Looking for include file signal.h
-- Looking for include file signal.h - found
-- Looking for include file socket.h
-- Looking for include file socket.h - not found
-- Looking for include file stdbool.h
-- Looking for include file stdbool.h - found
-- Looking for include file stdint.h
-- Looking for include file stdint.h - found
-- Looking for include file stdlib.h
-- Looking for include file stdlib.h - found
-- Looking for include file strings.h
-- Looking for include file strings.h - found
-- Looking for include file string.h
-- Looking for include file string.h - found
-- Looking for include file stropts.h
-- Looking for include file stropts.h - not found
-- Looking for include file sys/ioctl.h
-- Looking for include file sys/ioctl.h - found
-- Looking for include file sys/param.h
-- Looking for include file sys/param.h - found
-- Looking for include file sys/select.h
-- Looking for include file sys/select.h - found
-- Looking for include file sys/stat.h
-- Looking for include file sys/stat.h - found
-- Looking for include file sys/time.h
-- Looking for include file sys/time.h - found
-- Looking for include file sys/uio.h
-- Looking for include file sys/uio.h - found
-- Looking for include file time.h
-- Looking for include file time.h - found
-- Looking for include file dlfcn.h
-- Looking for include file dlfcn.h - found
-- Looking for include file unistd.h
-- Looking for include file unistd.h - found
-- Looking for include files sys/types.h, netinet/tcp.h
-- Looking for include files sys/types.h, netinet/tcp.h - found
-- Performing Test HAVE_SOCKLEN_T
-- Performing Test HAVE_SOCKLEN_T - Success
-- Performing Test HAVE_TYPE_SOCKET
-- Performing Test HAVE_TYPE_SOCKET - Failed
-- Performing Test HAVE_BOOL_T
-- Performing Test HAVE_BOOL_T - Success
-- Performing Test HAVE_SSIZE_T
-- Performing Test HAVE_SSIZE_T - Success
-- Performing Test HAVE_LONGLONG
-- Performing Test HAVE_LONGLONG - Success
-- Performing Test HAVE_SIG_ATOMIC_T
-- Performing Test HAVE_SIG_ATOMIC_T - Success
-- Performing Test HAVE_STRUCT_ADDRINFO
-- Performing Test HAVE_STRUCT_ADDRINFO - Success
-- Performing Test HAVE_STRUCT_IN6_ADDR
-- Performing Test HAVE_STRUCT_IN6_ADDR - Success
-- Performing Test HAVE_STRUCT_SOCKADDR_IN6
-- Performing Test HAVE_STRUCT_SOCKADDR_IN6 - Success
-- Performing Test HAVE_STRUCT_SOCKADDR_STORAGE
-- Performing Test HAVE_STRUCT_SOCKADDR_STORAGE - Success
-- Performing Test HAVE_STRUCT_TIMEVAL
-- Performing Test HAVE_STRUCT_TIMEVAL - Success
-- Looking for AF_INET6
-- Looking for AF_INET6 - found
-- Looking for O_NONBLOCK
-- Looking for O_NONBLOCK - found
-- Looking for FIONBIO
-- Looking for FIONBIO - found
-- Looking for SIOCGIFADDR
-- Looking for SIOCGIFADDR - found
-- Looking for MSG_NOSIGNAL
-- Looking for MSG_NOSIGNAL - found
-- Looking for PF_INET6
-- Looking for PF_INET6 - found
-- Looking for SO_NONBLOCK
-- Looking for SO_NONBLOCK - not found
-- Looking for CLOCK_MONOTONIC
-- Looking for CLOCK_MONOTONIC - found
-- Performing Test HAVE_SOCKADDR_IN6_SIN6_SCOPE_ID
-- Performing Test HAVE_SOCKADDR_IN6_SIN6_SCOPE_ID - Success
-- Performing Test HAVE_LL
-- Performing Test HAVE_LL - Success
-- Looking for bitncmp
-- Looking for bitncmp - not found
-- Looking for closesocket
-- Looking for closesocket - not found
-- Looking for CloseSocket
-- Looking for CloseSocket - not found
-- Looking for connect
-- Looking for connect - found
-- Looking for fcntl
-- Looking for fcntl - found
-- Looking for freeaddrinfo
-- Looking for freeaddrinfo - found
-- Looking for getaddrinfo
-- Looking for getaddrinfo - found
-- Looking for getenv
-- Looking for getenv - found
-- Looking for gethostbyaddr
-- Looking for gethostbyaddr - found
-- Looking for gethostbyname
-- Looking for gethostbyname - found
-- Looking for gethostname
-- Looking for gethostname - found
-- Looking for getnameinfo
-- Looking for getnameinfo - found
-- Looking for getservbyport_r
-- Looking for getservbyport_r - found
-- Looking for getservbyname_r
-- Looking for getservbyname_r - found
-- Looking for gettimeofday
-- Looking for gettimeofday - found
-- Looking for if_indextoname
-- Looking for if_indextoname - found
-- Looking for inet_net_pton
-- Looking for inet_net_pton - not found
-- Looking for inet_ntop
-- Looking for inet_ntop - found
-- Looking for inet_pton
-- Looking for inet_pton - found
-- Looking for ioctl
-- Looking for ioctl - found
-- Looking for ioctlsocket
-- Looking for ioctlsocket - not found
-- Looking for IoctlSocket
-- Looking for IoctlSocket - not found
-- Looking for recv
-- Looking for recv - found
-- Looking for recvfrom
-- Looking for recvfrom - found
-- Looking for send
-- Looking for send - found
-- Looking for setsockopt
-- Looking for setsockopt - found
-- Looking for socket
-- Looking for socket - found
-- Looking for strcasecmp
-- Looking for strcasecmp - found
-- Looking for strcmpi
-- Looking for strcmpi - not found
-- Looking for strdup
-- Looking for strdup - found
-- Looking for stricmp
-- Looking for stricmp - not found
-- Looking for strncasecmp
-- Looking for strncasecmp - found
-- Looking for strncmpi
-- Looking for strncmpi - not found
-- Looking for strnicmp
-- Looking for strnicmp - not found
-- Looking for writev
-- Looking for writev - found
-- Looking for arc4random_buf
-- Looking for arc4random_buf - not found
-- Looking for __system_property_get
-- Looking for __system_property_get - not found
-- 
-- 26.1.0
-- Performing Test protobuf_HAVE_LD_VERSION_SCRIPT
-- Performing Test protobuf_HAVE_LD_VERSION_SCRIPT - Success
-- Performing Test protobuf_HAVE_BUILTIN_ATOMICS
-- Performing Test protobuf_HAVE_BUILTIN_ATOMICS - Success
-- The ASM compiler identification is GNU
-- Found assembler: /usr/bin/cc
-- Looking for stddef.h
-- Looking for stddef.h - found
-- Check size of off64_t
-- Check size of off64_t - done
-- Looking for fseeko
-- Looking for fseeko - found
-- Looking for unistd.h
-- Looking for unistd.h - found
-- Renaming
--     /home/runner/work/LocalAI/LocalAI/grpc/third_party/zlib/zconf.h
-- to 'zconf.h.included' because this file is included with zlib
-- but CMake generates it automatically in the build directory.
-- Found PkgConfig: /usr/bin/pkg-config (found version "1.8.0")
-- Checking for module 'libsystemd>=233'
--   Package 'libsystemd', required by 'virtual:world', not found
-- Configuring done (11.9s)
CMake Error at third_party/abseil-cpp/CMake/AbseilHelpers.cmake:317 (target_link_libraries):
  The link interface of target "test_allocator" contains:

    GTest::gmock

  but the target was not found.  Possible reasons include:

    * There is a typo in the target name.
    * A find_package call is missing for an IMPORTED target.
    * An ALIAS target is missing.

Call Stack (most recent call first):
  third_party/abseil-cpp/absl/container/CMakeLists.txt:206 (absl_cc_library)


CMake Generate step failed.  Build files cannot be regenerated correctly.
-- Generating done (0.8s)
Error: Process completed with exit code 1.

This seems to happen only on cross-arch ARM builds, and it started happening all of a sudden. It appears that gmock can't be found, so I've slightly adapted the CMake file coming from gRPC to add it via CMake rather than relying on the system to ship it (it was probably available before, and is no longer part of the runners' environment?)

For context: http://google.github.io/googletest/quickstart-cmake.html
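
The approach described in that quickstart can be sketched as follows. This is a hypothetical, minimal CMake fragment — the release archive URL and option shown are illustrative assumptions, not the exact change made in this PR:

```cmake
# Fetch googletest at configure time instead of relying on a system copy.
# The release archive URL below is illustrative, not the one used here.
include(FetchContent)
FetchContent_Declare(
  googletest
  URL https://github.com/google/googletest/archive/refs/tags/v1.14.0.zip
)
# On Windows, keep googletest from overriding the parent project's
# compiler/linker settings (harmless elsewhere).
set(gtest_force_shared_crt ON CACHE BOOL "" FORCE)
FetchContent_MakeAvailable(googletest)
```

After `FetchContent_MakeAvailable(googletest)`, the `GTest::gtest` and `GTest::gmock` targets are defined in the build, which is what the `test_allocator` link interface in the error log above could not resolve.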

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

netlify bot commented Jul 9, 2024

Deploy Preview for localai ready!

| Name | Link |
|---|---|
| 🔨 Latest commit | b12f8af |
| 🔍 Latest deploy log | https://app.netlify.com/sites/localai/deploys/668d74747c9b8200078d45f0 |
| 😎 Deploy Preview | https://deploy-preview-2754--localai.netlify.app |

@mudler mudler mentioned this pull request Jul 9, 2024
@mudler
Owner Author

mudler commented Jul 9, 2024

let's quickly merge this to unblock other PRs

@mudler mudler merged commit 401ee55 into master Jul 9, 2024
22 of 41 checks passed
@mudler mudler deleted the fix_arm_build_gRPC branch July 9, 2024 17:47
mudler added a commit that referenced this pull request Jul 10, 2024
mudler added a commit that referenced this pull request Jul 10, 2024
* Revert "ci(grpc): disable ABSEIL tests (#2759)"

This reverts commit cbb93bd.

* Revert "fix: arm builds via disabling abseil tests (#2758)"

This reverts commit 8d046de.

* Revert "ci(arm64): fix gRPC build by adding googletest to CMakefile (#2754)"

This reverts commit 401ee55.

* ci(gmock): install libgmock-dev

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

---------

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
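
The follow-up commit above takes the alternative route: installing gmock from the distribution's packages rather than vendoring it in the gRPC build. A hypothetical CI configuration fragment for that, assuming a Debian/Ubuntu-based runner image:

```shell
# Install system-wide gmock (gtest headers come along as a dependency) so
# that CMake can resolve the GTest::gmock target on the runner.
sudo apt-get update
sudo apt-get install -y libgmock-dev
```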
truecharts-admin added a commit to truecharts/charts that referenced this pull request Jul 24, 2024
…9.1 by renovate (#24152)

This PR contains the following updates:

| Package | Update | Change |
|---|---|---|
| [docker.io/localai/localai](https://togithub.com/mudler/LocalAI) | minor | `v2.17.1-aio-cpu` -> `v2.19.1-aio-cpu` |
| [docker.io/localai/localai](https://togithub.com/mudler/LocalAI) | minor | `v2.17.1-aio-gpu-nvidia-cuda-11` -> `v2.19.1-aio-gpu-nvidia-cuda-11` |
| [docker.io/localai/localai](https://togithub.com/mudler/LocalAI) | minor | `v2.17.1-aio-gpu-nvidia-cuda-12` -> `v2.19.1-aio-gpu-nvidia-cuda-12` |
| [docker.io/localai/localai](https://togithub.com/mudler/LocalAI) | minor | `v2.17.1-cublas-cuda11-ffmpeg-core` -> `v2.19.1-cublas-cuda11-ffmpeg-core` |
| [docker.io/localai/localai](https://togithub.com/mudler/LocalAI) | minor | `v2.17.1-cublas-cuda11-core` -> `v2.19.1-cublas-cuda11-core` |
| [docker.io/localai/localai](https://togithub.com/mudler/LocalAI) | minor | `v2.17.1-cublas-cuda12-ffmpeg-core` -> `v2.19.1-cublas-cuda12-ffmpeg-core` |
| [docker.io/localai/localai](https://togithub.com/mudler/LocalAI) | minor | `v2.17.1-cublas-cuda12-core` -> `v2.19.1-cublas-cuda12-core` |
| [docker.io/localai/localai](https://togithub.com/mudler/LocalAI) | minor | `v2.17.1-ffmpeg-core` -> `v2.19.1-ffmpeg-core` |
| [docker.io/localai/localai](https://togithub.com/mudler/LocalAI) | minor | `v2.17.1` -> `v2.19.1` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency
Dashboard for more information.

---

### Release Notes

<details>
<summary>mudler/LocalAI (docker.io/localai/localai)</summary>

###
[`v2.19.1`](https://togithub.com/mudler/LocalAI/releases/tag/v2.19.1)

[Compare
Source](https://togithub.com/mudler/LocalAI/compare/v2.19.0...v2.19.1)


![local-ai-release-219-shadow](https://togithub.com/user-attachments/assets/c5d7c930-656f-410d-aab9-455a466925fe)

##### LocalAI 2.19.1 is out! :mega:

##### TLDR; Summary spotlight

- 🖧 Federated Instances via P2P: LocalAI now supports federated
instances with P2P, offering both load-balanced and non-load-balanced
options.
- 🎛️ P2P Dashboard: A new dashboard to guide and assist in setting up
P2P instances with auto-discovery using shared tokens.
- 🔊 TTS Integration: Text-to-Speech (TTS) is now included in the binary
releases.
- 🛠️ Enhanced Installer: The installer script now supports setting up
federated instances.
- 📥 Model Pulling: Models can now be pulled directly via URL.
- 🖼️ WebUI Enhancements: Visual improvements and cleanups to the WebUI
and model lists.
- 🧠 llama-cpp Backend: The llama-cpp (grpc) backend now supports
embeddings (https://localai.io/features/embeddings/#llamacpp-embeddings)
- ⚙️ Tool Support: Small enhancements to tools with disabled grammars.
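The new llama-cpp embeddings support is served through LocalAI's OpenAI-compatible `/embeddings` endpoint. A minimal request sketch follows; the model name `my-embedding-model` and port `8080` are assumptions for illustration, not taken from these notes:

```shell
# Hypothetical request against a locally running instance;
# the model name is an assumption — use one configured on your server.
curl http://localhost:8080/embeddings \
  -H "Content-Type: application/json" \
  -d '{
        "model": "my-embedding-model",
        "input": "A long time ago in a galaxy far, far away"
      }'
```

The response follows the OpenAI embeddings shape: a `data` array whose entries carry an `embedding` vector of floats.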

##### 🖧 LocalAI Federation and AI swarms

<p align="center">
<img
src="https://github.com/user-attachments/assets/17b39f8a-fc41-47d9-b846-b3a88307813b"/>
</p>

LocalAI is revolutionizing the future of distributed AI workloads by
making it simpler and more accessible. No more complex setups, Docker or
Kubernetes configurations – LocalAI allows you to create your own AI
cluster with minimal friction. By auto-discovering and sharing work or
weights of the LLM model across your existing devices, LocalAI aims to
scale both horizontally and vertically with ease.

##### How it works

Starting LocalAI with `--p2p` generates a shared token for connecting
multiple instances: that token is all you need to create AI clusters,
eliminating the need for intricate network setups. Simply navigate to
the "Swarm" section in the WebUI and follow the on-screen instructions.

For fully shared instances, start LocalAI with `--p2p --federated` and
follow the Swarm section's guidance. This feature is still experimental
and is offered as a tech preview.
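As a sketch, the two modes described above come down to two flags. The `--p2p` and `--federated` flags are taken from the notes; the `local-ai run` invocation is an assumption about how the binary is launched, so check the Swarm tab for the exact one-liner:

```shell
# Start a plain P2P instance; a shared token is generated at startup
# and can be used by other nodes to join the cluster.
local-ai run --p2p

# Start a federated instance that also shares incoming requests
# across the cluster.
local-ai run --p2p --federated
```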

##### Federated LocalAI

Launch multiple LocalAI instances and cluster them together to share
requests across the cluster. The "Swarm" tab in the WebUI provides
one-liner instructions on connecting various LocalAI instances using a
shared token. Instances will auto-discover each other, even across
different networks.


![346663124-1d2324fd-8b55-4fa2-9856-721a467969c2](https://togithub.com/user-attachments/assets/19ebd44a-20ff-412c-b92f-cfb8efbe4b21)

Check out a demonstration video: [Watch
now](https://www.youtube.com/watch?v=pH8Bv__9cnA)

##### LocalAI P2P Workers

Distribute weights across nodes by starting multiple LocalAI workers,
currently available only on the llama.cpp backend, with plans to expand
to other backends soon.


![346663124-1d2324fd-8b55-4fa2-9856-721a467969c2](https://togithub.com/user-attachments/assets/b8cadddf-a467-49cf-a1ed-8850de95366d)

Check out a demonstration video: [Watch
now](https://www.youtube.com/watch?v=ePH8PGqMSpo)
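A hedged sketch of the worker setup described above: the `worker p2p-llama-cpp-rpc` subcommand and the `TOKEN` variable are assumptions based on the WebUI's one-liner instructions, not verbatim from these notes:

```shell
# On the main node: start with P2P enabled and note the token
# printed at startup (also shown in the WebUI's Swarm tab).
local-ai run --p2p

# On each additional node: join as a llama.cpp worker using that token.
# Subcommand name is an assumption — copy the exact one-liner from the
# Swarm tab of your instance.
TOKEN=<shared-token> local-ai worker p2p-llama-cpp-rpc
```

Workers auto-discover the main node via the shared token, so no manual addressing is needed.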

##### What's Changed

##### Bug fixes :bug:

- fix: make sure the GNUMake jobserver is passed to cmake for the
llama.cpp build by [@&#8203;cryptk](https://togithub.com/cryptk) in
[https://github.com/mudler/LocalAI/pull/2697](https://togithub.com/mudler/LocalAI/pull/2697)
- Using exec when starting a backend instead of spawning a new process
by [@&#8203;a17t](https://togithub.com/a17t) in
[https://github.com/mudler/LocalAI/pull/2720](https://togithub.com/mudler/LocalAI/pull/2720)
- fix(cuda): downgrade default version from 12.5 to 12.4 by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2707](https://togithub.com/mudler/LocalAI/pull/2707)
- fix: Lora loading by [@&#8203;vaaale](https://togithub.com/vaaale) in
[https://github.com/mudler/LocalAI/pull/2893](https://togithub.com/mudler/LocalAI/pull/2893)
- fix: short-circuit when nodes aren't detected by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2909](https://togithub.com/mudler/LocalAI/pull/2909)
- fix: do not list txt files as potential models by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2910](https://togithub.com/mudler/LocalAI/pull/2910)

##### 🖧 P2P area

- feat(p2p): Federation and AI swarms by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2723](https://togithub.com/mudler/LocalAI/pull/2723)
- feat(p2p): allow to disable DHT and use only LAN by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2751](https://togithub.com/mudler/LocalAI/pull/2751)

##### Exciting New Features 🎉

- Allows to remove a backend from the list by
[@&#8203;mauromorales](https://togithub.com/mauromorales) in
[https://github.com/mudler/LocalAI/pull/2721](https://togithub.com/mudler/LocalAI/pull/2721)
- ci(Makefile): adds tts in binary releases by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2695](https://togithub.com/mudler/LocalAI/pull/2695)
- feat: HF `/scan` endpoint by
[@&#8203;dave-gray101](https://togithub.com/dave-gray101) in
[https://github.com/mudler/LocalAI/pull/2566](https://togithub.com/mudler/LocalAI/pull/2566)
- feat(model-list): be consistent, skip known files from listing by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2760](https://togithub.com/mudler/LocalAI/pull/2760)
- feat(models): pull models from urls by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2750](https://togithub.com/mudler/LocalAI/pull/2750)
- feat(webui): show also models without a config in the welcome page by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2772](https://togithub.com/mudler/LocalAI/pull/2772)
- feat(install.sh): support federated install by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2752](https://togithub.com/mudler/LocalAI/pull/2752)
- feat(llama.cpp): support embeddings endpoints by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2871](https://togithub.com/mudler/LocalAI/pull/2871)
- feat(functions): parse broken JSON when we parse the raw results, use
dynamic rules for grammar keys by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2912](https://togithub.com/mudler/LocalAI/pull/2912)
- feat(federation): add load balanced option by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2915](https://togithub.com/mudler/LocalAI/pull/2915)

##### 🧠 Models

- models(gallery): :arrow_up: update checksum by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2701](https://togithub.com/mudler/LocalAI/pull/2701)
- models(gallery): add l3-8b-everything-cot by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2705](https://togithub.com/mudler/LocalAI/pull/2705)
- models(gallery): add hercules-5.0-qwen2-7b by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2708](https://togithub.com/mudler/LocalAI/pull/2708)
- models(gallery): add
llama3-8b-darkidol-2.2-uncensored-1048k-iq-imatrix by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2710](https://togithub.com/mudler/LocalAI/pull/2710)
- models(gallery): add llama-3-llamilitary by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2711](https://togithub.com/mudler/LocalAI/pull/2711)
- models(gallery): add tess-v2.5-gemma-2-27b-alpha by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2712](https://togithub.com/mudler/LocalAI/pull/2712)
- models(gallery): add arcee-agent by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2713](https://togithub.com/mudler/LocalAI/pull/2713)
- models(gallery): add gemma2-daybreak by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2714](https://togithub.com/mudler/LocalAI/pull/2714)
- models(gallery): add L3-Stheno-Maid-Blackroot-Grand-HORROR-16B-GGUF by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2715](https://togithub.com/mudler/LocalAI/pull/2715)
- models(gallery): add qwen2-7b-instruct-v0.8 by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2717](https://togithub.com/mudler/LocalAI/pull/2717)
- models(gallery): add internlm2\_5-7b-chat-1m by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2719](https://togithub.com/mudler/LocalAI/pull/2719)
- models(gallery): add gemma-2-9b-it-sppo-iter3 by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2722](https://togithub.com/mudler/LocalAI/pull/2722)
- models(gallery): add llama-3\_8b_unaligned_alpha by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2727](https://togithub.com/mudler/LocalAI/pull/2727)
- models(gallery): add l3-8b-lunaris-v1 by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2729](https://togithub.com/mudler/LocalAI/pull/2729)
- models(gallery): add llama-3\_8b_unaligned_alpha_rp_soup-i1 by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2734](https://togithub.com/mudler/LocalAI/pull/2734)
- models(gallery): add hathor_respawn-l3-8b-v0.8 by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2738](https://togithub.com/mudler/LocalAI/pull/2738)
- models(gallery): add llama3-8b-instruct-replete-adapted by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2739](https://togithub.com/mudler/LocalAI/pull/2739)
- models(gallery): add llama-3-perky-pat-instruct-8b by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2740](https://togithub.com/mudler/LocalAI/pull/2740)
- models(gallery): add l3-uncen-merger-omelette-rp-v0.2-8b by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2741](https://togithub.com/mudler/LocalAI/pull/2741)
- models(gallery): add nymph\_8b-i1 by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2742](https://togithub.com/mudler/LocalAI/pull/2742)
- models(gallery): add smegmma-9b-v1 by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2743](https://togithub.com/mudler/LocalAI/pull/2743)
- models(gallery): add hathor_tahsin-l3-8b-v0.85 by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2762](https://togithub.com/mudler/LocalAI/pull/2762)
- models(gallery): add replete-coder-instruct-8b-merged by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2782](https://togithub.com/mudler/LocalAI/pull/2782)
- models(gallery): add arliai-llama-3-8b-formax-v1.0 by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2783](https://togithub.com/mudler/LocalAI/pull/2783)
- models(gallery): add smegmma-deluxe-9b-v1 by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2784](https://togithub.com/mudler/LocalAI/pull/2784)
- models(gallery): add l3-ms-astoria-8b by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2785](https://togithub.com/mudler/LocalAI/pull/2785)
- models(gallery): add halomaidrp-v1.33-15b-l3-i1 by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2786](https://togithub.com/mudler/LocalAI/pull/2786)
- models(gallery): add llama-3-patronus-lynx-70b-instruct by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2788](https://togithub.com/mudler/LocalAI/pull/2788)
- models(gallery): add llamax3 by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2849](https://togithub.com/mudler/LocalAI/pull/2849)
- models(gallery): add arliai-llama-3-8b-dolfin-v0.5 by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2852](https://togithub.com/mudler/LocalAI/pull/2852)
- models(gallery): add tiger-gemma-9b-v1-i1 by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2853](https://togithub.com/mudler/LocalAI/pull/2853)
- feat: models(gallery): add deepseek-v2-lite by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2658](https://togithub.com/mudler/LocalAI/pull/2658)
- models(gallery): :arrow_up: update checksum by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2860](https://togithub.com/mudler/LocalAI/pull/2860)
- models(gallery): add phi-3.1-mini-4k-instruct by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2863](https://togithub.com/mudler/LocalAI/pull/2863)
- models(gallery): :arrow_up: update checksum by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2887](https://togithub.com/mudler/LocalAI/pull/2887)
- models(gallery): add ezo model series (llama3, gemma) by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2891](https://togithub.com/mudler/LocalAI/pull/2891)
- models(gallery): add l3-8b-niitama-v1 by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2895](https://togithub.com/mudler/LocalAI/pull/2895)
- models(gallery): add mathstral-7b-v0.1-imat by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2901](https://togithub.com/mudler/LocalAI/pull/2901)
- models(gallery): add MythicalMaid/EtherealMaid 15b by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2902](https://togithub.com/mudler/LocalAI/pull/2902)
- models(gallery): add flammenai/Mahou-1.3d-mistral-7B by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2903](https://togithub.com/mudler/LocalAI/pull/2903)
- models(gallery): add big-tiger-gemma-27b-v1 by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2918](https://togithub.com/mudler/LocalAI/pull/2918)
- models(gallery): add phillama-3.8b-v0.1 by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2920](https://togithub.com/mudler/LocalAI/pull/2920)
- models(gallery): add qwen2-wukong-7b by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2921](https://togithub.com/mudler/LocalAI/pull/2921)
- models(gallery): add einstein-v4-7b by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2922](https://togithub.com/mudler/LocalAI/pull/2922)
- models(gallery): add gemma-2b-translation-v0.150 by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2923](https://togithub.com/mudler/LocalAI/pull/2923)
- models(gallery): add emo-2b by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2924](https://togithub.com/mudler/LocalAI/pull/2924)
- models(gallery): add celestev1 by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2925](https://togithub.com/mudler/LocalAI/pull/2925)

##### 📖 Documentation and examples

- :arrow_up: Update docs version mudler/LocalAI by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2699](https://togithub.com/mudler/LocalAI/pull/2699)
- examples(gha): add example on how to run LocalAI in Github actions by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2716](https://togithub.com/mudler/LocalAI/pull/2716)
- docs(swagger): enhance coverage of APIs by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2753](https://togithub.com/mudler/LocalAI/pull/2753)
- docs(swagger): comment LocalAI gallery endpoints and rerankers by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2854](https://togithub.com/mudler/LocalAI/pull/2854)
- docs: add a note on benchmarks by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2857](https://togithub.com/mudler/LocalAI/pull/2857)
- docs(swagger): cover p2p endpoints by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2862](https://togithub.com/mudler/LocalAI/pull/2862)
- ci: use github action by [@&#8203;mudler](https://togithub.com/mudler)
in
[https://github.com/mudler/LocalAI/pull/2899](https://togithub.com/mudler/LocalAI/pull/2899)
- docs: update try-it-out.md by
[@&#8203;eltociear](https://togithub.com/eltociear) in
[https://github.com/mudler/LocalAI/pull/2906](https://togithub.com/mudler/LocalAI/pull/2906)
- docs(swagger): core more localai/openai endpoints by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2904](https://togithub.com/mudler/LocalAI/pull/2904)
- docs: more swagger, update docs by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2907](https://togithub.com/mudler/LocalAI/pull/2907)
- feat(swagger): update swagger by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2916](https://togithub.com/mudler/LocalAI/pull/2916)

##### 👒 Dependencies

- :arrow_up: Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2700](https://togithub.com/mudler/LocalAI/pull/2700)
- :arrow_up: Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2704](https://togithub.com/mudler/LocalAI/pull/2704)
- deps(whisper.cpp): update to latest commit by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2709](https://togithub.com/mudler/LocalAI/pull/2709)
- :arrow_up: Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2718](https://togithub.com/mudler/LocalAI/pull/2718)
- :arrow_up: Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2725](https://togithub.com/mudler/LocalAI/pull/2725)
- :arrow_up: Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2736](https://togithub.com/mudler/LocalAI/pull/2736)
- :arrow_up: Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2744](https://togithub.com/mudler/LocalAI/pull/2744)
- :arrow_up: Update ggerganov/whisper.cpp by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2746](https://togithub.com/mudler/LocalAI/pull/2746)
- :arrow_up: Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2747](https://togithub.com/mudler/LocalAI/pull/2747)
- :arrow_up: Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2755](https://togithub.com/mudler/LocalAI/pull/2755)
- :arrow_up: Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2767](https://togithub.com/mudler/LocalAI/pull/2767)
- :arrow_up: Update ggerganov/whisper.cpp by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2756](https://togithub.com/mudler/LocalAI/pull/2756)
- :arrow_up: Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2774](https://togithub.com/mudler/LocalAI/pull/2774)
- chore(deps): Update Dependencies by
[@&#8203;reneleonhardt](https://togithub.com/reneleonhardt) in
[https://github.com/mudler/LocalAI/pull/2538](https://togithub.com/mudler/LocalAI/pull/2538)
- chore(deps): Bump dependabot/fetch-metadata from 2.1.0 to 2.2.0 by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2791](https://togithub.com/mudler/LocalAI/pull/2791)
- chore(deps): Bump llama-index from 0.9.48 to 0.10.55 in
/examples/chainlit by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2795](https://togithub.com/mudler/LocalAI/pull/2795)
- chore(deps): Bump openai from 1.33.0 to 1.35.13 in /examples/functions
by [@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2793](https://togithub.com/mudler/LocalAI/pull/2793)
- chore(deps): Bump nginx from 1.a.b.c to 1.27.0 in /examples/k8sgpt by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2790](https://togithub.com/mudler/LocalAI/pull/2790)
- chore(deps): Bump setuptools from 69.5.1 to 70.3.0 in
/backend/python/coqui by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2798](https://togithub.com/mudler/LocalAI/pull/2798)
- chore(deps): Bump inflect from 7.0.0 to 7.3.1 in
/backend/python/openvoice by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2796](https://togithub.com/mudler/LocalAI/pull/2796)
- chore(deps): Bump setuptools from 69.5.1 to 70.3.0 in
/backend/python/parler-tts by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2797](https://togithub.com/mudler/LocalAI/pull/2797)
- chore(deps): Bump setuptools from 69.5.1 to 70.3.0 in
/backend/python/petals by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2799](https://togithub.com/mudler/LocalAI/pull/2799)
- chore(deps): Bump causal-conv1d from 1.2.0.post2 to 1.4.0 in
/backend/python/mamba by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2792](https://togithub.com/mudler/LocalAI/pull/2792)
- chore(deps): Bump docs/themes/hugo-theme-relearn from `c25bc2a` to
`1b2e139` by [@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2801](https://togithub.com/mudler/LocalAI/pull/2801)
- chore(deps): Bump tenacity from 8.3.0 to 8.5.0 in
/examples/langchain/langchainpy-localai-example by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2803](https://togithub.com/mudler/LocalAI/pull/2803)
- chore(deps): Bump openai from 1.33.0 to 1.35.13 in
/examples/langchain-chroma by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2794](https://togithub.com/mudler/LocalAI/pull/2794)
- chore(deps): Bump setuptools from 69.5.1 to 70.3.0 in
/backend/python/bark by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2805](https://togithub.com/mudler/LocalAI/pull/2805)
- chore(deps): Bump streamlit from 1.30.0 to 1.36.0 in
/examples/streamlit-bot by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2804](https://togithub.com/mudler/LocalAI/pull/2804)
- chore(deps): Bump setuptools from 69.5.1 to 70.3.0 in
/backend/python/diffusers by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2807](https://togithub.com/mudler/LocalAI/pull/2807)
- chore(deps): Bump grpcio from 1.64.0 to 1.64.1 in
/backend/python/exllama2 by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2809](https://togithub.com/mudler/LocalAI/pull/2809)
- chore(deps): Bump grpcio from 1.64.0 to 1.64.1 in
/backend/python/common/template by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2802](https://togithub.com/mudler/LocalAI/pull/2802)
- chore(deps): Bump grpcio from 1.64.0 to 1.64.1 in
/backend/python/autogptq by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2800](https://togithub.com/mudler/LocalAI/pull/2800)
- chore(deps): Bump weaviate-client from 4.6.4 to 4.6.5 in
/examples/chainlit by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2811](https://togithub.com/mudler/LocalAI/pull/2811)
- chore(deps): Bump gradio from 4.36.1 to 4.37.1 in
/backend/python/openvoice in the pip group by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2815](https://togithub.com/mudler/LocalAI/pull/2815)
- chore(deps): Bump setuptools from 69.5.1 to 70.3.0 in
/backend/python/vall-e-x by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2812](https://togithub.com/mudler/LocalAI/pull/2812)
- chore(deps): Bump certifi from 2024.6.2 to 2024.7.4 in
/examples/langchain/langchainpy-localai-example by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2814](https://togithub.com/mudler/LocalAI/pull/2814)
- chore(deps): Bump setuptools from 69.5.1 to 70.3.0 in
/backend/python/transformers by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2817](https://togithub.com/mudler/LocalAI/pull/2817)
- chore(deps): Bump grpcio from 1.64.0 to 1.64.1 in
/backend/python/sentencetransformers by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2813](https://togithub.com/mudler/LocalAI/pull/2813)
- chore(deps): Bump grpcio from 1.64.0 to 1.64.1 in
/backend/python/rerankers by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2819](https://togithub.com/mudler/LocalAI/pull/2819)
- chore(deps): Bump grpcio from 1.64.0 to 1.64.1 in
/backend/python/parler-tts by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2818](https://togithub.com/mudler/LocalAI/pull/2818)
- chore(deps): Bump setuptools from 69.5.1 to 70.3.0 in
/backend/python/vllm by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2820](https://togithub.com/mudler/LocalAI/pull/2820)
- chore(deps): Bump grpcio from 1.64.0 to 1.64.1 in
/backend/python/coqui by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2825](https://togithub.com/mudler/LocalAI/pull/2825)
- chore(deps): Bump faster-whisper from 0.9.0 to 1.0.3 in
/backend/python/openvoice by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2829](https://togithub.com/mudler/LocalAI/pull/2829)
- chore(deps): Bump grpcio from 1.64.0 to 1.64.1 in
/backend/python/exllama by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2841](https://togithub.com/mudler/LocalAI/pull/2841)
- chore(deps): Bump scipy from 1.13.0 to 1.14.0 in
/backend/python/transformers-musicgen by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2842](https://togithub.com/mudler/LocalAI/pull/2842)
- chore: :arrow_up: Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2846](https://togithub.com/mudler/LocalAI/pull/2846)
- chore(deps): Bump langchain from 0.2.3 to 0.2.7 in /examples/functions
by [@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2806](https://togithub.com/mudler/LocalAI/pull/2806)
- chore(deps): Bump mamba-ssm from 1.2.0.post1 to 2.2.2 in
/backend/python/mamba by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2821](https://togithub.com/mudler/LocalAI/pull/2821)
- chore(deps): Bump pydantic from 2.7.3 to 2.8.2 in
/examples/langchain/langchainpy-localai-example by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2832](https://togithub.com/mudler/LocalAI/pull/2832)
- chore(deps): Bump langchain from 0.2.3 to 0.2.7 in
/examples/langchain-chroma by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2822](https://togithub.com/mudler/LocalAI/pull/2822)
- chore(deps): Bump grpcio from 1.64.0 to 1.64.1 in /backend/python/bark
by [@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2831](https://togithub.com/mudler/LocalAI/pull/2831)
- chore(deps): Bump grpcio from 1.64.0 to 1.64.1 in
/backend/python/diffusers by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2833](https://togithub.com/mudler/LocalAI/pull/2833)
- chore(deps): Bump setuptools from 69.5.1 to 70.3.0 in
/backend/python/autogptq by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2816](https://togithub.com/mudler/LocalAI/pull/2816)
- chore(deps): Bump gradio from 4.36.1 to 4.38.1 in
/backend/python/openvoice by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2840](https://togithub.com/mudler/LocalAI/pull/2840)
- chore(deps): Bump the pip group across 1 directory with 2 updates by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2848](https://togithub.com/mudler/LocalAI/pull/2848)
- chore(deps): Bump grpcio from 1.64.0 to 1.64.1 in
/backend/python/transformers by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2837](https://togithub.com/mudler/LocalAI/pull/2837)
- chore(deps): Bump sentence-transformers from 2.5.1 to 3.0.1 in
/backend/python/sentencetransformers by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2826](https://togithub.com/mudler/LocalAI/pull/2826)
- chore(deps): Bump grpcio from 1.64.0 to 1.64.1 in
/backend/python/vall-e-x by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2830](https://togithub.com/mudler/LocalAI/pull/2830)
- chore(deps): Bump setuptools from 69.5.1 to 70.3.0 in
/backend/python/rerankers by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2834](https://togithub.com/mudler/LocalAI/pull/2834)
- chore(deps): Bump grpcio from 1.64.0 to 1.64.1 in /backend/python/vllm
by [@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2839](https://togithub.com/mudler/LocalAI/pull/2839)
- chore(deps): Bump librosa from 0.9.1 to 0.10.2.post1 in
/backend/python/openvoice by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2836](https://togithub.com/mudler/LocalAI/pull/2836)
- chore(deps): Bump setuptools from 69.5.1 to 70.3.0 in
/backend/python/transformers-musicgen by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2843](https://togithub.com/mudler/LocalAI/pull/2843)
- chore(deps): Bump grpcio from 1.64.0 to 1.64.1 in
/backend/python/mamba by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2808](https://togithub.com/mudler/LocalAI/pull/2808)
- chore(deps): Bump llama-index from 0.10.43 to 0.10.55 in
/examples/langchain-chroma by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2810](https://togithub.com/mudler/LocalAI/pull/2810)
- chore(deps): Bump langchain from 0.2.3 to 0.2.7 in
/examples/langchain/langchainpy-localai-example by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2824](https://togithub.com/mudler/LocalAI/pull/2824)
- chore(deps): Bump numpy from 1.26.4 to 2.0.0 in
/backend/python/openvoice by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2823](https://togithub.com/mudler/LocalAI/pull/2823)
- chore(deps): Bump grpcio from 1.64.0 to 1.64.1 in
/backend/python/transformers-musicgen by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2844](https://togithub.com/mudler/LocalAI/pull/2844)
- build(deps): bump docker/build-push-action from 5 to 6 by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2592](https://togithub.com/mudler/LocalAI/pull/2592)
- chore(deps): Bump chromadb from 0.5.0 to 0.5.4 in
/examples/langchain-chroma by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2828](https://togithub.com/mudler/LocalAI/pull/2828)
- chore(deps): Bump torch from 2.2.0 to 2.3.1 in /backend/python/mamba
by [@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2835](https://togithub.com/mudler/LocalAI/pull/2835)
- chore: :arrow_up: Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2851](https://togithub.com/mudler/LocalAI/pull/2851)
- chore(deps): Bump setuptools from 69.5.1 to 70.3.0 in
/backend/python/sentencetransformers by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2838](https://togithub.com/mudler/LocalAI/pull/2838)
- chore: update edgevpn dependency by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2855](https://togithub.com/mudler/LocalAI/pull/2855)
- chore: :arrow_up: Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2859](https://togithub.com/mudler/LocalAI/pull/2859)
- chore(deps): Bump langchain from 0.2.7 to 0.2.8 in /examples/functions
by [@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2873](https://togithub.com/mudler/LocalAI/pull/2873)
- chore(deps): Bump langchain from 0.2.7 to 0.2.8 in
/examples/langchain/langchainpy-localai-example by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2874](https://togithub.com/mudler/LocalAI/pull/2874)
- chore(deps): Bump numexpr from 2.10.0 to 2.10.1 in
/examples/langchain/langchainpy-localai-example by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2877](https://togithub.com/mudler/LocalAI/pull/2877)
- chore: :arrow_up: Update ggerganov/whisper.cpp by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2885](https://togithub.com/mudler/LocalAI/pull/2885)
- chore: :arrow_up: Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2886](https://togithub.com/mudler/LocalAI/pull/2886)
- chore(deps): Bump debugpy from 1.8.1 to 1.8.2 in
/examples/langchain/langchainpy-localai-example by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2878](https://togithub.com/mudler/LocalAI/pull/2878)
- chore(deps): Bump langchain-community from 0.2.5 to 0.2.7 in
/examples/langchain/langchainpy-localai-example by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2875](https://togithub.com/mudler/LocalAI/pull/2875)
- chore(deps): Bump langchain from 0.2.7 to 0.2.8 in
/examples/langchain-chroma by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2872](https://togithub.com/mudler/LocalAI/pull/2872)
- chore(deps): Bump openai from 1.33.0 to 1.35.13 in
/examples/langchain/langchainpy-localai-example by
[@&#8203;dependabot](https://togithub.com/dependabot) in
[https://github.com/mudler/LocalAI/pull/2876](https://togithub.com/mudler/LocalAI/pull/2876)
- chore: :arrow_up: Update ggerganov/whisper.cpp by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2898](https://togithub.com/mudler/LocalAI/pull/2898)
- chore: :arrow_up: Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2897](https://togithub.com/mudler/LocalAI/pull/2897)
- chore: :arrow_up: Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2905](https://togithub.com/mudler/LocalAI/pull/2905)
- chore: :arrow_up: Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2917](https://togithub.com/mudler/LocalAI/pull/2917)

##### Other Changes

- ci: add pipelines for discord notifications by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2703](https://togithub.com/mudler/LocalAI/pull/2703)
- ci(arm64): fix gRPC build by adding googletest to CMakefile by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2754](https://togithub.com/mudler/LocalAI/pull/2754)
- fix: arm builds via disabling abseil tests by
[@&#8203;dave-gray101](https://togithub.com/dave-gray101) in
[https://github.com/mudler/LocalAI/pull/2758](https://togithub.com/mudler/LocalAI/pull/2758)
- ci(grpc): disable ABSEIL tests by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2759](https://togithub.com/mudler/LocalAI/pull/2759)
- ci(deps): add libgmock-dev by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2761](https://togithub.com/mudler/LocalAI/pull/2761)
- fix abseil test issue \[attempt 3] by
[@&#8203;dave-gray101](https://togithub.com/dave-gray101) in
[https://github.com/mudler/LocalAI/pull/2769](https://togithub.com/mudler/LocalAI/pull/2769)
- feat(swagger): update swagger by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2766](https://togithub.com/mudler/LocalAI/pull/2766)
- ci: Do not test the full matrix on PRs by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2771](https://togithub.com/mudler/LocalAI/pull/2771)
- Git fetch specific branch instead of full tree during build by
[@&#8203;LoricOSC](https://togithub.com/LoricOSC) in
[https://github.com/mudler/LocalAI/pull/2748](https://togithub.com/mudler/LocalAI/pull/2748)
- fix(ci): small fixups to checksum_checker.sh by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2776](https://togithub.com/mudler/LocalAI/pull/2776)
- fix(ci): fixup correct path for check_and_update.py by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2777](https://togithub.com/mudler/LocalAI/pull/2777)
- fixes to `check_and_update.py` script by
[@&#8203;dave-gray101](https://togithub.com/dave-gray101) in
[https://github.com/mudler/LocalAI/pull/2778](https://togithub.com/mudler/LocalAI/pull/2778)
- Update remaining git clones to git fetch by
[@&#8203;LoricOSC](https://togithub.com/LoricOSC) in
[https://github.com/mudler/LocalAI/pull/2779](https://togithub.com/mudler/LocalAI/pull/2779)
- feat(scripts): add scripts to help adding new models to the gallery by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2789](https://togithub.com/mudler/LocalAI/pull/2789)
- build: speedup `git submodule update` with `--single-branch` by
[@&#8203;dave-gray101](https://togithub.com/dave-gray101) in
[https://github.com/mudler/LocalAI/pull/2847](https://togithub.com/mudler/LocalAI/pull/2847)
- Revert "chore(deps): Bump inflect from 7.0.0 to 7.3.1 in
/backend/python/openvoice" by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2856](https://togithub.com/mudler/LocalAI/pull/2856)
- Revert "chore(deps): Bump librosa from 0.9.1 to 0.10.2.post1 in
/backend/python/openvoice" by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2861](https://togithub.com/mudler/LocalAI/pull/2861)
- feat(swagger): update swagger by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2858](https://togithub.com/mudler/LocalAI/pull/2858)
- Revert "chore(deps): Bump numpy from 1.26.4 to 2.0.0 in
/backend/python/openvoice" by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2868](https://togithub.com/mudler/LocalAI/pull/2868)
- feat(swagger): update swagger by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2884](https://togithub.com/mudler/LocalAI/pull/2884)
- fix: update grpcio version to match version used in builds by
[@&#8203;cryptk](https://togithub.com/cryptk) in
[https://github.com/mudler/LocalAI/pull/2888](https://togithub.com/mudler/LocalAI/pull/2888)
- fix: cleanup indentation and remove duplicate dockerfile stanza by
[@&#8203;cryptk](https://togithub.com/cryptk) in
[https://github.com/mudler/LocalAI/pull/2889](https://togithub.com/mudler/LocalAI/pull/2889)
- ci: add workflow to comment new Opened PRs by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2892](https://togithub.com/mudler/LocalAI/pull/2892)
- build: fix go.mod - don't import ourself by
[@&#8203;dave-gray101](https://togithub.com/dave-gray101) in
[https://github.com/mudler/LocalAI/pull/2896](https://togithub.com/mudler/LocalAI/pull/2896)
- feat(swagger): update swagger by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2908](https://togithub.com/mudler/LocalAI/pull/2908)
- refactor: move federated server logic to its own service by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2914](https://togithub.com/mudler/LocalAI/pull/2914)
- refactor: groundwork - add pkg/concurrency and the associated test
file by [@&#8203;dave-gray101](https://togithub.com/dave-gray101) in
[https://github.com/mudler/LocalAI/pull/2745](https://togithub.com/mudler/LocalAI/pull/2745)

##### New Contributors

- [@&#8203;a17t](https://togithub.com/a17t) made their first
contribution in
[https://github.com/mudler/LocalAI/pull/2720](https://togithub.com/mudler/LocalAI/pull/2720)
- [@&#8203;LoricOSC](https://togithub.com/LoricOSC) made their first
contribution in
[https://github.com/mudler/LocalAI/pull/2748](https://togithub.com/mudler/LocalAI/pull/2748)
- [@&#8203;vaaale](https://togithub.com/vaaale) made their first
contribution in
[https://github.com/mudler/LocalAI/pull/2893](https://togithub.com/mudler/LocalAI/pull/2893)

**Full Changelog**:
https://github.com/mudler/LocalAI/compare/v2.18.1...v2.19.0

###
[`v2.19.0`](https://togithub.com/mudler/LocalAI/releases/tag/v2.19.0)

[Compare
Source](https://togithub.com/mudler/LocalAI/compare/v2.18.1...v2.19.0)


![local-ai-release-219-shadow](https://togithub.com/user-attachments/assets/c5d7c930-656f-410d-aab9-455a466925fe)

##### LocalAI 2.19.0 is out! :mega:

##### TL;DR summary spotlight

- 🖧 Federated Instances via P2P: LocalAI now supports federated
instances with P2P, offering both load-balanced and non-load-balanced
options.
- 🎛️ P2P Dashboard: A new dashboard to guide and assist in setting up
P2P instances with auto-discovery using shared tokens.
- 🔊 TTS Integration: Text-to-Speech (TTS) is now included in the binary
releases.
- 🛠️ Enhanced Installer: The installer script now supports setting up
federated instances.
- 📥 Model Pulling: Models can now be pulled directly via URL.
- 🖼️ WebUI Enhancements: Visual improvements and cleanups to the WebUI
and model lists.
- 🧠 llama-cpp Backend: The llama-cpp (grpc) backend now supports
embeddings (https://localai.io/features/embeddings/#llamacpp-embeddings)
- ⚙️ Tool Support: Small enhancements to tools with disabled grammars.

##### 🖧 LocalAI Federation and AI swarms

<p align="center">
<img
src="https://github.com/user-attachments/assets/17b39f8a-fc41-47d9-b846-b3a88307813b"/>
</p>

LocalAI is revolutionizing the future of distributed AI workloads by
making them simpler and more accessible. No more complex setups, Docker,
or Kubernetes configurations: LocalAI lets you create your own AI
cluster with minimal friction. By auto-discovering peers and sharing the
workload or the weights of the LLM across your existing devices, LocalAI
aims to scale both horizontally and vertically with ease.

##### How does it work?

Starting LocalAI with `--p2p` generates a shared token for connecting
multiple instances, and that's all you need to create an AI cluster: no
intricate network setup is required. Simply navigate to the "Swarm"
section in the WebUI and follow the on-screen instructions.

For fully shared instances, start LocalAI with `--p2p --federated` and
follow the Swarm section's guidance. This feature is still experimental
and should be considered a tech preview.
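
The two startup modes described above can be sketched as plain CLI
invocations. Note that the `local-ai run` subcommand is an assumption
here; only the `--p2p` and `--p2p --federated` flags are taken directly
from these notes, and the shared token is produced by the instance
itself:

```bash
# Start a LocalAI instance in P2P mode: a shared token is generated
# and shown in the "Swarm" section of the WebUI.
local-ai run --p2p

# Or start a federated instance instead, so whole requests can be
# shared across instances using the same token.
local-ai run --p2p --federated
```

Other instances started with the same shared token will auto-discover
each other, even across different networks.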

##### Federated LocalAI

Launch multiple LocalAI instances and cluster them together to share
requests across the cluster. The "Swarm" tab in the WebUI provides
one-liner instructions on connecting various LocalAI instances using a
shared token. Instances will auto-discover each other, even across
different networks.


![346663124-1d2324fd-8b55-4fa2-9856-721a467969c2](https://togithub.com/user-attachments/assets/19ebd44a-20ff-412c-b92f-cfb8efbe4b21)

Check out a demonstration video: [Watch
now](https://www.youtube.com/watch?v=pH8Bv__9cnA)

##### LocalAI P2P Workers

Distribute the model weights across nodes by starting multiple LocalAI
workers. This is currently available only for the llama.cpp backend,
with plans to expand to other backends soon.


![346663124-1d2324fd-8b55-4fa2-9856-721a467969c2](https://togithub.com/user-attachments/assets/b8cadddf-a467-49cf-a1ed-8850de95366d)

Check out a demonstration video: [Watch
now](https://www.youtube.com/watch?v=ePH8PGqMSpo)
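
Joining the swarm from an additional node might look like the following
sketch. The `worker p2p-llama-cpp-rpc` subcommand name and the `TOKEN`
environment variable are assumptions for illustration, not taken from
these notes, and `<shared-token>` is a placeholder for the token shown
in the "Swarm" tab:

```bash
# Hypothetical: on each additional node, export the shared token and
# start a worker that serves a slice of the llama.cpp computation.
TOKEN=<shared-token> local-ai worker p2p-llama-cpp-rpc
```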

##### What's Changed

##### Bug fixes :bug:

- fix: make sure the GNUMake jobserver is passed to cmake for the
llama.cpp build by [@&#8203;cryptk](https://togithub.com/cryptk) in
[https://github.com/mudler/LocalAI/pull/2697](https://togithub.com/mudler/LocalAI/pull/2697)
- Using exec when starting a backend instead of spawning a new process
by [@&#8203;a17t](https://togithub.com/a17t) in
[https://github.com/mudler/LocalAI/pull/2720](https://togithub.com/mudler/LocalAI/pull/2720)
- fix(cuda): downgrade default version from 12.5 to 12.4 by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2707](https://togithub.com/mudler/LocalAI/pull/2707)
- fix: Lora loading by [@&#8203;vaaale](https://togithub.com/vaaale) in
[https://github.com/mudler/LocalAI/pull/2893](https://togithub.com/mudler/LocalAI/pull/2893)
- fix: short-circuit when nodes aren't detected by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2909](https://togithub.com/mudler/LocalAI/pull/2909)
- fix: do not list txt files as potential models by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2910](https://togithub.com/mudler/LocalAI/pull/2910)

##### 🖧 P2P area

- feat(p2p): Federation and AI swarms by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2723](https://togithub.com/mudler/LocalAI/pull/2723)
- feat(p2p): allow to disable DHT and use only LAN by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2751](https://togithub.com/mudler/LocalAI/pull/2751)

##### Exciting New Features 🎉

- Allows to remove a backend from the list by
[@&#8203;mauromorales](https://togithub.com/mauromorales) in
[https://github.com/mudler/LocalAI/pull/2721](https://togithub.com/mudler/LocalAI/pull/2721)
- ci(Makefile): adds tts in binary releases by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2695](https://togithub.com/mudler/LocalAI/pull/2695)
- feat: HF `/scan` endpoint by
[@&#8203;dave-gray101](https://togithub.com/dave-gray101) in
[https://github.com/mudler/LocalAI/pull/2566](https://togithub.com/mudler/LocalAI/pull/2566)
- feat(model-list): be consistent, skip known files from listing by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2760](https://togithub.com/mudler/LocalAI/pull/2760)
- feat(models): pull models from urls by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2750](https://togithub.com/mudler/LocalAI/pull/2750)
- feat(webui): show also models without a config in the welcome page by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2772](https://togithub.com/mudler/LocalAI/pull/2772)
- feat(install.sh): support federated install by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2752](https://togithub.com/mudler/LocalAI/pull/2752)
- feat(llama.cpp): support embeddings endpoints by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2871](https://togithub.com/mudler/LocalAI/pull/2871)
- feat(functions): parse broken JSON when we parse the raw results, use
dynamic rules for grammar keys by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2912](https://togithub.com/mudler/LocalAI/pull/2912)
- feat(federation): add load balanced option by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2915](https://togithub.com/mudler/LocalAI/pull/2915)

##### 🧠 Models

- models(gallery): :arrow_up: update checksum by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2701](https://togithub.com/mudler/LocalAI/pull/2701)
- models(gallery): add l3-8b-everything-cot by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2705](https://togithub.com/mudler/LocalAI/pull/2705)
- models(gallery): add hercules-5.0-qwen2-7b by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2708](https://togithub.com/mudler/LocalAI/pull/2708)
- models(gallery): add
llama3-8b-darkidol-2.2-uncensored-1048k-iq-imatrix by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2710](https://togithub.com/mudler/LocalAI/pull/2710)
- models(gallery): add llama-3-llamilitary by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2711](https://togithub.com/mudler/LocalAI/pull/2711)
- models(gallery): add tess-v2.5-gemma-2-27b-alpha by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2712](https://togithub.com/mudler/LocalAI/pull/2712)
- models(gallery): add arcee-agent by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2713](https://togithub.com/mudler/LocalAI/pull/2713)
- models(gallery): add gemma2-daybreak by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2714](https://togithub.com/mudler/LocalAI/pull/2714)
- models(gallery): add L3-Stheno-Maid-Blackroot-Grand-HORROR-16B-GGUF by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2715](https://togithub.com/mudler/LocalAI/pull/2715)
- models(gallery): add qwen2-7b-instruct-v0.8 by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2717](https://togithub.com/mudler/LocalAI/pull/2717)
- models(gallery): add internlm2\_5-7b-chat-1m by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2719](https://togithub.com/mudler/LocalAI/pull/2719)
- models(gallery): add gemma-2-9b-it-sppo-iter3 by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2722](https://togithub.com/mudler/LocalAI/pull/2722)
- models(gallery): add llama-3\_8b_unaligned_alpha by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2727](https://togithub.com/mudler/LocalAI/pull/2727)
- models(gallery): add l3-8b-lunaris-v1 by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2729](https://togithub.com/mudler/LocalAI/pull/2729)
- models(gallery): add llama-3\_8b_unaligned_alpha_rp_soup-i1 by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2734](https://togithub.com/mudler/LocalAI/pull/2734)
- models(gallery): add hathor_respawn-l3-8b-v0.8 by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2738](https://togithub.com/mudler/LocalAI/pull/2738)
- models(gallery): add llama3-8b-instruct-replete-adapted by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2739](https://togithub.com/mudler/LocalAI/pull/2739)
- models(gallery): add llama-3-perky-pat-instruct-8b by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2740](https://togithub.com/mudler/LocalAI/pull/2740)
- models(gallery): add l3-uncen-merger-omelette-rp-v0.2-8b by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2741](https://togithub.com/mudler/LocalAI/pull/2741)
- models(gallery): add nymph\_8b-i1 by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2742](https://togithub.com/mudler/LocalAI/pull/2742)
- models(gallery): add smegmma-9b-v1 by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2743](https://togithub.com/mudler/LocalAI/pull/2743)
- models(gallery): add hathor_tahsin-l3-8b-v0.85 by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2762](https://togithub.com/mudler/LocalAI/pull/2762)
- models(gallery): add replete-coder-instruct-8b-merged by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2782](https://togithub.com/mudler/LocalAI/pull/2782)
- models(gallery): add arliai-llama-3-8b-formax-v1.0 by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2783](https://togithub.com/mudler/LocalAI/pull/2783)
- models(gallery): add smegmma-deluxe-9b-v1 by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2784](https://togithub.com/mudler/LocalAI/pull/2784)
- models(gallery): add l3-ms-astoria-8b by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2785](https://togithub.com/mudler/LocalAI/pull/2785)
- models(gallery): add halomaidrp-v1.33-15b-l3-i1 by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2786](https://togithub.com/mudler/LocalAI/pull/2786)
- models(gallery): add llama-3-patronus-lynx-70b-instruct by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2788](https://togithub.com/mudler/LocalAI/pull/2788)
- models(gallery): add llamax3 by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2849](https://togithub.com/mudler/LocalAI/pull/2849)
- models(gallery): add arliai-llama-3-8b-dolfin-v0.5 by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2852](https://togithub.com/mudler/LocalAI/pull/2852)
- models(gallery): add tiger-gemma-9b-v1-i1 by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2853](https://togithub.com/mudler/LocalAI/pull/2853)
- feat: models(gallery): add deepseek-v2-lite by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2658](https://togithub.com/mudler/LocalAI/pull/2658)
- models(gallery): :arrow_up: update checksum by
[@&#8203;localai-bot](https://togithub.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2860](https://togithub.com/mudler/LocalAI/pull/2860)
- models(gallery): add phi-3.1-mini-4k-instruct by
[@&#8203;mudler](https://togithub.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2863](https://togithub.com/mudl

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Enabled.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about these
updates again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR has been generated by [Renovate
Bot](https://togithub.com/renovatebot/renovate).

<!--renovate-debug:eyJjcmVhdGVkSW5WZXIiOiIzNy40NDAuNiIsInVwZGF0ZWRJblZlciI6IjM3LjQ0MC42IiwidGFyZ2V0QnJhbmNoIjoibWFzdGVyIiwibGFiZWxzIjpbImF1dG9tZXJnZSIsInVwZGF0ZS9kb2NrZXIvZ2VuZXJhbC9ub24tbWFqb3IiXX0=-->