diff --git a/_blog.yml b/_blog.yml
index b38d344fe4..a8ff097d7d 100644
--- a/_blog.yml
+++ b/_blog.yml
@@ -4977,3 +4977,10 @@
- text-to-speech
- voice
- voice-cloning
+
+- local: google-cloud
+ date: Nov 13, 2025
+ tags:
+ - partnerships
+ - google
+ - announcement
diff --git a/assets/google-cloud/google-cloud-thumbnail.png b/assets/google-cloud/google-cloud-thumbnail.png
new file mode 100644
index 0000000000..6b0cfc48cd
Binary files /dev/null and b/assets/google-cloud/google-cloud-thumbnail.png differ
diff --git a/gcp-partnership.md b/gcp-partnership.md
index bcb966bb05..c70162d4b5 100644
--- a/gcp-partnership.md
+++ b/gcp-partnership.md
@@ -10,6 +10,8 @@ authors:

+> [!TIP]
+> 11/13/2025 Update: we announced a [new and deeper partnership with Google Cloud](https://huggingface.co/blog/google-cloud) to enable companies to build their own AI with open models!
At Hugging Face, we want to enable all companies to build their own AI, leveraging open models and open source technologies. Our goal is to build an open platform, making it easy for data scientists, machine learning engineers and developers to access the latest models from the community, and use them within the platform of their choice.
diff --git a/google-cloud.md b/google-cloud.md
new file mode 100644
index 0000000000..c100301c7a
--- /dev/null
+++ b/google-cloud.md
@@ -0,0 +1,53 @@
+---
+title: "Building for an Open Future - our new partnership with Google Cloud"
+thumbnail: /blog/assets/google-cloud/google-cloud-thumbnail.png
+authors:
+- user: jeffboudier
+- user: pagezyhf
+---
+
+# Building for an Open Future - our new partnership with Google Cloud
+
+
+
+Today, we are happy to announce a new and deeper partnership with Google Cloud, to enable companies to build their own AI with open models.
+
+“_Google has made some of the most impactful contributions to open AI, from the OG transformer to the Gemma models. I believe in a future where all companies will build and customize their own AI. With this new strategic partnership, we’re making it easy to do on Google Cloud._”, says Jeff Boudier of Hugging Face.
+
+“_Hugging Face has been the driving force enabling companies large and small all over the world to access, use and customize now more than 2 million open models, and we’ve been proud to contribute over 1,000 of our models to the community_”, says Ryan J. Salva, Senior Director of Product Management at Google Cloud. “_Together we will make Google Cloud the best place to build with open models._”
+
+## A Partnership for Google Cloud Customers
+
+Google Cloud customers use open models from Hugging Face across many of Google Cloud’s leading AI services. In Vertex AI, the most popular open models are ready to deploy in a couple of clicks within Model Garden. Customers who want greater control over their AI infrastructure can find a similar model library in GKE AI/ML, or use pre-configured environments maintained by Hugging Face. Customers also run AI inference workloads with Cloud Run GPUs, enabling serverless open model deployments.
+
+The common thread: we work with Google Cloud to build seamless experiences that fully leverage the unique capabilities of each service, offering customers choice.
+
+
+
+## The Gateway to Open Models - A Fast Lane for Google Cloud Customers
+
+Usage of Hugging Face by Google Cloud customers has grown 10x over the last 3 years. Today, this translates into tens of petabytes of model downloads every month, across billions of requests.
+
+To make sure Google Cloud customers have the best experience building with models and datasets from Hugging Face, we are working together to create a CDN Gateway for Hugging Face repositories, combining Hugging Face’s Xet-optimized storage and data transfer technologies with Google Cloud’s advanced storage and networking capabilities.
+
+This CDN Gateway will cache Hugging Face models and datasets directly on Google Cloud, significantly reducing download times and strengthening model supply chain robustness for Google Cloud customers. Whether you’re using Vertex AI, GKE, Cloud Run, or building your own stack on Compute Engine VMs, you will benefit from faster time-to-first-token and simplified model governance.
+
+## A Partnership for Hugging Face Customers
+
+Hugging Face [Inference Endpoints](https://endpoints.huggingface.co/) is the easiest way to go from model to deployment in just a couple of clicks. Through this deepened partnership, we will bring the unique capabilities and cost performance of Google Cloud to Hugging Face customers, starting with Inference Endpoints. Expect more and newer instance types, as well as price drops!
+
+
+
+We will ensure all the fruits of our product and engineering collaboration become easily available to the 10 million AI Builders on Hugging Face. Going from a model page to a deployment on Vertex Model Garden or GKE should only take a couple of steps, and deploying a private model securely hosted in an Enterprise organization on Hugging Face should be as easy as working with public models.
+
+TPUs, Google’s custom AI accelerator chips, now in their seventh generation, have been steadily improving in performance and software stack maturity. We want to make sure Hugging Face users can fully benefit from the current and next generations of TPUs when they build AI with open models. We are excited to make TPUs as easy to use as GPUs for Hugging Face models, thanks to native support in our libraries.
+
+Additionally, this new partnership will enable Hugging Face to leverage Google’s industry-leading security technology to make the millions of open models on Hugging Face more secure. Powered by [Google Threat Intelligence](https://cloud.google.com/security/products/threat-intelligence) and [Mandiant](https://www.mandiant.com/), this joint effort aims to secure the models, datasets, and Spaces you use on the Hugging Face Hub every day.
+
+## Building the open future of AI together
+
+We want to see a future where every company can build their own AI with open models and host it within their own secure infrastructure, with full control. We are excited to make this future happen with Google Cloud. Our deep collaboration will accelerate this vision, whether you are using Vertex AI Model Garden, Google Kubernetes Engine, Cloud Run or Hugging Face Inference Endpoints.
+
+Is there something you want us to create or improve thanks to our partnership with Google? Let us know in the comments!
+
+