
feat(container): update image docker.io/ollama/ollama (0.15.6 ➔ 0.16.0) #4066

Merged
binaryn3xus merged 1 commit into main from renovate/docker.io-ollama-ollama-0.x
Feb 12, 2026

Conversation

@unsc-oni-ancilla
Contributor

This PR contains the following updates:

| Package | Update | Change |
| --- | --- | --- |
| docker.io/ollama/ollama | minor | 0.15.6 ➔ 0.16.0 |
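Renovate labels this bump "minor" because only the second version component changed. A minimal sketch of that classification in plain Python (a simplification for illustration; Renovate's actual semver handling also covers ranges, pre-releases, and digest pinning):

```python
def bump_type(old: str, new: str) -> str:
    """Classify a semver upgrade as major, minor, or patch
    by comparing version components left to right."""
    o = [int(part) for part in old.split(".")]
    n = [int(part) for part in new.split(".")]
    if n[0] != o[0]:
        return "major"
    if n[1] != o[1]:
        return "minor"
    return "patch"

# The update in this PR: second component changed, so it is minor.
print(bump_type("0.15.6", "0.16.0"))  # → minor
```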

Configuration

📅 Schedule: Branch creation - At any time (no schedule defined), Automerge - At any time (no schedule defined).

🚦 Automerge: Disabled by config. Please merge this manually once you are satisfied.

Rebasing: Whenever PR becomes conflicted, or you tick the rebase/retry checkbox.

🔕 Ignore: Close this PR and you won't be reminded about these updates again.


  • If you want to rebase/retry this PR, check this box

This PR has been generated by Renovate Bot.
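Behaviour like the disabled automerge and the labels on this PR is normally driven by a repository-level Renovate config. A hypothetical `renovate.json` fragment that would produce it (illustrative only, not taken from this repository; `packageRules`, `matchDatasources`, `matchPackageNames`, `matchUpdateTypes`, `automerge`, and `labels` are real Renovate options):

```json
{
  "packageRules": [
    {
      "matchDatasources": ["docker"],
      "matchPackageNames": ["docker.io/ollama/ollama"],
      "matchUpdateTypes": ["minor"],
      "automerge": false,
      "labels": ["renovate/container", "type/minor"]
    }
  ]
}
```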

@unsc-oni-ancilla bot added the renovate/container, type/minor, and area/kubernetes (changes made in the kubernetes directory) labels on Feb 12, 2026
@unsc-oni-ancilla
Contributor Author

--- kubernetes/apps/ai/ollama/app Kustomization: ai/ollama HelmRelease: ai/ollama

+++ kubernetes/apps/ai/ollama/app Kustomization: ai/ollama HelmRelease: ai/ollama

@@ -44,13 +44,13 @@

               OLLAMA_MODELS: /models
               OLLAMA_NUM_PARALLEL: 1
               OLLAMA_ORIGINS: '*'
               TZ: null
             image:
               repository: docker.io/ollama/ollama
-              tag: 0.15.6
+              tag: 0.16.0
             resources:
               limits:
                 memory: 12Gi
                 nvidia.com/gpu: 1
               requests:
                 cpu: 200m
@@ -90,13 +90,13 @@

             - name: OLLAMA_MODELS
               value: /models
             - name: OLLAMA_HOST
               value: 0.0.0.0
             image:
               repository: docker.io/ollama/ollama
-              tag: 0.15.6
+              tag: 0.16.0
     defaultPodOptions:
       nodeSelector:
         nvidia.com/gpu.present: 'true'
       runtimeClassName: nvidia
       tolerations:
       - effect: NoSchedule

@unsc-oni-ancilla
Contributor Author

--- HelmRelease: ai/ollama Deployment: ai/ollama

+++ HelmRelease: ai/ollama Deployment: ai/ollama

@@ -74,13 +74,13 @@

           wait $OLLAMA_PID 2>/dev/null || true
         env:
         - name: OLLAMA_MODELS
           value: /models
         - name: OLLAMA_HOST
           value: 0.0.0.0
-        image: docker.io/ollama/ollama:0.15.6
+        image: docker.io/ollama/ollama:0.16.0
         name: pull-models
         volumeMounts:
         - mountPath: /models
           name: models
       containers:
       - env:
@@ -99,13 +99,13 @@

         - name: OLLAMA_NUM_PARALLEL
           value: '1'
         - name: OLLAMA_ORIGINS
           value: '*'
         - name: TZ
           value: null
-        image: docker.io/ollama/ollama:0.15.6
+        image: docker.io/ollama/ollama:0.16.0
         name: app
         resources:
           limits:
             memory: 12Gi
             nvidia.com/gpu: 1
           requests:

@binaryn3xus binaryn3xus merged commit 10a4c61 into main Feb 12, 2026
11 checks passed
@binaryn3xus binaryn3xus deleted the renovate/docker.io-ollama-ollama-0.x branch February 12, 2026 23:29
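After the merge is reconciled, the rollout can be spot-checked from the cluster. A hedged sketch, assuming `kubectl` access, the `ai/ollama` Deployment shown in the diff above, and the in-cluster service DNS name (the service name and port are assumptions based on Ollama's default 11434):

```shell
# Wait for the Deployment in namespace "ai" to finish rolling out.
kubectl -n ai rollout status deployment/ollama

# Confirm the first container now runs the updated image tag.
kubectl -n ai get deployment ollama \
  -o jsonpath='{.spec.template.spec.containers[0].image}'
# expected: docker.io/ollama/ollama:0.16.0

# Optionally query Ollama's version endpoint from inside the cluster.
curl -s http://ollama.ai.svc.cluster.local:11434/api/version
```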
