diff --git a/changelog/2024-12-05.mdx b/fern/changelog/2024-12-05.mdx
similarity index 55%
rename from changelog/2024-12-05.mdx
rename to fern/changelog/2024-12-05.mdx
index a10256759..ab73b8a55 100644
--- a/changelog/2024-12-05.mdx
+++ b/fern/changelog/2024-12-05.mdx
@@ -1,4 +1,6 @@
-1. **OAuth2 Support for Custom LLM Credentials and Webhooks**: You can now secure access to your [custom LLMs](https://docs.vapi.ai/customization/custom-llm/using-your-server#step-2-configuring-vapi-with-custom-llm) and [server urls (aka webhooks)](https://docs.vapi.ai/server-url) using OAuth2 (RFC 6749). Create a webhook credential with `CreateWebhookCredentialDTO` and specify the following information.
+1. **OAuth2 Support for Custom LLM Credentials and Webhooks**: You can now authorize access to your [custom LLMs](https://docs.vapi.ai/customization/custom-llm/using-your-server#step-2-configuring-vapi-with-custom-llm) and [server urls (aka webhooks)](https://docs.vapi.ai/server-url) using OAuth2 (RFC 6749).
+
+For example, create a webhook credential using `CreateWebhookCredentialDTO` with the following payload:
```json
{
@@ -13,4 +15,10 @@
}
```
+This returns a [`WebhookCredential`](https://api.vapi.ai/api) object as follows:
+
+![WebhookCredential](/static/images/changelog/webhook-credential.png)
+
+
+
3. **Removal of Canonical Knowledge Base**: The ability to create, update, and use canonical knowledge bases in your assistant has been removed from the API (custom knowledge bases and the Trieve integration support a superset of this functionality). Please update your implementations, as endpoints and models referencing canonical knowledge base schemas are no longer available.
\ No newline at end of file
diff --git a/fern/changelog/2024-12-06.mdx b/fern/changelog/2024-12-06.mdx
new file mode 100644
index 000000000..d9d6b49a7
--- /dev/null
+++ b/fern/changelog/2024-12-06.mdx
@@ -0,0 +1,24 @@
+1. **OAuth 2 Authentication for Custom LLM Models and Webhooks**: In addition to [authorization (AuthZ)](https://www.okta.com/identity-101/authentication-vs-authorization/), you can now authenticate users accessing your [custom LLMs](https://docs.vapi.ai/customization/custom-llm/using-your-server#step-2-configuring-vapi-with-custom-llm) and [server urls (aka webhooks)](https://docs.vapi.ai/server-url) using OAuth2 (RFC 6749). Use the `authenticationSession` dictionary, which contains an `accessToken` and an `expiresAt` datetime, to authenticate further requests to your custom LLM or server URL.
+
+For example, create a custom LLM credential using `CreateCustomLLMCredentialDTO` with the following payload:
+```json
+{
+ "provider": "custom-llm",
+ "apiKey": "your-api-key-max-10000-characters",
+ "authenticationPlan": {
+ "type": "oauth2",
+ "url": "https://your-url.com/your/path/token",
+ "clientId": "your-client-id",
+ "clientSecret": "your-client-secret"
+ },
+ "name": "your-credential-name-between-1-and-40-characters"
+}
+```
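+
+Per RFC 6749, the `clientId` and `clientSecret` are exchanged at the configured `url` for an access token. A standard OAuth2 token endpoint response looks like the following (a sketch; the values are illustrative, not actual output):
+
+```json
+{
+  "access_token": "your-access-token",
+  "token_type": "Bearer",
+  "expires_in": 3600
+}
+```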
+
+This returns a [`CustomLLMCredential`](https://api.vapi.ai/api) object as follows:
+
+![CustomLLMCredential](/static/images/changelog/custom-llm-credential.png)
+
+
+
+This can be used to authenticate successive requests to your custom LLM or server URL.
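+
+As a sketch (field values are illustrative, not actual API output), the `authenticationSession` on the returned credential has this shape:
+
+```json
+{
+  "authenticationSession": {
+    "accessToken": "your-access-token",
+    "expiresAt": "2024-12-06T01:00:00.000Z"
+  }
+}
+```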
diff --git a/fern/static/images/changelog/custom-llm-credential.png b/fern/static/images/changelog/custom-llm-credential.png
new file mode 100644
index 000000000..48076e161
Binary files /dev/null and b/fern/static/images/changelog/custom-llm-credential.png differ
diff --git a/fern/static/images/changelog/webhook-credential.png b/fern/static/images/changelog/webhook-credential.png
new file mode 100644
index 000000000..47f2159bf
Binary files /dev/null and b/fern/static/images/changelog/webhook-credential.png differ