Commit

Merge branch 'main' into vertexai-docs
Signed-off-by: Anais Urlichs <33576047+AnaisUrlichs@users.noreply.github.com>
AnaisUrlichs committed May 16, 2024
2 parents c4af680 + 7508683 commit b6ee171
Showing 13 changed files with 108 additions and 34 deletions.
4 changes: 2 additions & 2 deletions docs/explanation/integrations.md
@@ -2,13 +2,13 @@

Integrations in k8sGPT allow you to manage and configure various integrations with external tools and services within your repository's codebase.

These Integrations enhance the functionality of k8sGPT by providing additional capabilities for scanning, diagnosing, and triaging issues in the Kubernetes clusters.
These integrations enhance the functionality of k8sGPT by providing additional capabilities for scanning, diagnosing, and triaging issues in the Kubernetes clusters.

## Description

The `integration` command in k8sgpt enables seamless integration with external tools and services. It allows you to activate, configure, and manage integrations that complement the functionalities of k8sgpt.

Integrations are designed to interact with external systems and tools that complement the functionalities of k8sgpt. These integrations include vulnerability scanners, monitoring services, incident management platforms, and more
Integrations are designed to interact with external systems and tools that complement the functionalities of k8sgpt. These integrations include vulnerability scanners, monitoring services, incident management platforms, and more.

By using the following command, users can access all K8sGPT CLI options related to integrations:

6 changes: 3 additions & 3 deletions docs/getting-started/Community.md
@@ -1,16 +1,16 @@
## GitHub

the [k8sgpt source code](https://github.com/k8sgpt-ai/k8sgpt) and other related projects managed on github are in the [k8sgpt](https://github.com/k8sgpt-ai) organization
The [k8sgpt source code](https://github.com/k8sgpt-ai/k8sgpt) and other related projects managed on GitHub are in the [k8sgpt](https://github.com/k8sgpt-ai) organization.


## Slack Channel

You can join the Slack channel using this link: [slack](https://join.slack.com/t/k8sgpt/shared_invite/zt-276pa9uyq-pxAUr4TCVHubFxEvLZuT1Q)


## Community Meetings / Office Hours

these happen on 1st and 3rd Thursday of the month Time zone: Europe/London Time: 12:00 - 13:00 Joining Info:
These happen on the 1st and 3rd Thursday of the month. Time zone: Europe/London. Time: 12:00 - 13:00. Joining info:

Google Meet : [link](https://meet.google.com/beu-kbdx-dfa)

2 changes: 1 addition & 1 deletion docs/getting-started/in-cluster-operator.md
@@ -73,7 +73,7 @@ Those responses will appear as `details` within the `Result` custom resources th
The default backend in this example is [OpenAI](https://openai.com/) and allows for additional details to be generated and solutions provided for issues.
If you wish to disable out-of-cluster communication and any Artificial Intelligence processing through models, simply set `enableAI` to `false`.

_It should also be noted that `localai` and `azureopenai` is supported and in-cluster models will be supported in the near future_
_It should also be noted that `localai` and `azureopenai` are supported, and in-cluster models will be supported in the near future._

## Viewing the results

12 changes: 6 additions & 6 deletions docs/getting-started/installation.md
@@ -71,9 +71,9 @@ apk add k8sgpt_amd64.apk

## Windows

* Download the latest Windows binaries of **k8sgpt** from the [Release](https://github.com/k8sgpt-ai/k8sgpt/releases)
tab based on your system architecture.
* Extract the downloaded package to your desired location. Configure the system *path* variable with the binary location
* Extract the downloaded package to your desired location. Configure the system *path* variable with the binary location.

## Verify installation

@@ -92,7 +92,7 @@ Failing Installation on WSL or Linux (missing gcc)
When installing Homebrew on WSL or Linux, you may encounter the following error:

```bash
==> Installing k8sgpt from k8sgpt-ai/k8sgpt Error: The following formula cannot be installed from bottle and must be
built from source. k8sgpt Install Clang or run brew install gcc.
```

@@ -107,7 +107,7 @@ If you install gcc as suggested, the problem will persist. Therefore, you need t
When installing Homebrew on WSL or Linux, you may encounter the following error:

```
==> Installing k8sgpt from k8sgpt-ai/k8sgpt Error: The following formula cannot be installed from a bottle and must be
built from the source. k8sgpt Install Clang or run brew install gcc.
```

@@ -117,7 +117,7 @@ If you install gcc as suggested, the problem will persist. Therefore, you need t
sudo apt-get install build-essential
```

## Running K8sGPT through a container

If you are running K8sGPT through a container, the CLI will not be able to open the website for the OpenAI token.

@@ -137,7 +137,7 @@ services:

## Installing the K8sGPT Operator Helm Chart

K8sGPT can be installed as an Operator inside the cluster.
For further information, see the [K8sGPT Operator](in-cluster-operator.md) documentation.

## Upgrading the brew installation
67 changes: 62 additions & 5 deletions docs/reference/cli/filters.md
@@ -24,7 +24,7 @@ Also, please make sure that you are connected to a Kubernetes cluster.

**Prerequisites**

* Connected to a running Kubernetes cluster, any cluster will work for demonstration purposes
* Connected to a running Kubernetes cluster; any cluster will work for demonstration purposes.

To list all integrations, run the following command:
```bash
@@ -51,11 +51,11 @@ Once the Trivy Operator is installed inside the cluster, K8sGPT will have access
```bash
❯ k8sgpt filters list

Active:
> VulnerabilityReport (integration)
> Pod
> ConfigAuditReport (integration)
Unused:
> PersistentVolumeClaim
> Service
> CronJob
@@ -125,7 +125,7 @@ Once activated, K8sGPT will have access to new filters:
```bash
❯ k8sgpt filters list

Active:
> PersistentVolumeClaim
> Service
> ValidatingWebhookConfiguration
@@ -140,7 +140,7 @@ Active:
> StatefulSet
> PrometheusConfigReport
> ReplicaSet
Unused:
> HorizontalPodAutoScaler
> PodDisruptionBudget
> NetworkPolicy
@@ -207,6 +207,63 @@ at least one of the following label sets:
Note: the LLM prompt includes a subset of your Prometheus relabeling rules to
avoid using too many tokens, so you may not see every label set in the output.
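The truncation behaviour the note describes can be pictured roughly as follows. This is an illustrative Python sketch, not k8sgpt's actual implementation; the function name and the 4-characters-per-token estimate are assumptions made for the example:

```python
# Illustrative sketch: keep only as many Prometheus label sets as fit a rough
# token budget before they are embedded into the LLM prompt.

def truncate_label_sets(label_sets, token_budget=200):
    """Keep label sets in order until the estimated token count exceeds the budget."""
    kept, used = [], 0
    for labels in label_sets:
        text = ", ".join(f"{k}={v}" for k, v in sorted(labels.items()))
        cost = max(1, len(text) // 4)  # crude estimate: ~4 characters per token
        if used + cost > token_budget:
            break
        kept.append(labels)
        used += cost
    return kept

sets = [{"job": "node", "instance": f"host{i}:9100"} for i in range(100)]
subset = truncate_label_sets(sets, token_budget=50)
print(len(subset), "of", len(sets), "label sets kept")
```

The point is only that the prompt carries a prefix of the label sets, which is why some of yours may be missing from the output.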

## AWS

The AWS Operator is a tool that allows Kubernetes to manage AWS resources directly, making it easier to integrate AWS services with other Kubernetes applications. This integration lets K8sGPT interact with the AWS resources managed by the Operator. As a result, you can use K8sGPT to analyze and manage not only your Kubernetes resources but also the AWS resources under the Operator's management.

Activate the AWS integration:
```bash
k8sgpt integration activate aws
```
Once activated, you should see the following success message displayed:
```
Activated integration aws
```

This will activate the AWS Kubernetes Operator in the Kubernetes cluster and make it possible for K8sGPT to interact with the results of the Operator.

Once the AWS integration is activated inside the cluster, K8sGPT will have access to EKS:
```bash
❯ k8sgpt filters list

Active:
> StatefulSet
> Ingress
> Pod
> Node
> ValidatingWebhookConfiguration
> Service
> EKS (integration)
> PersistentVolumeClaim
> MutatingWebhookConfiguration
> CronJob
> Deployment
> ReplicaSet
Unused:
> Log
> GatewayClass
> Gateway
> HTTPRoute
> HorizontalPodAutoScaler
> PodDisruptionBudget
> NetworkPolicy
```

More information can be found on the official [AWS-Operator documentation](https://aws.amazon.com/blogs/opensource/aws-service-operator-kubernetes-available/).

### Using the new filters to analyze your cluster

Any of the filters listed in the previous section can be used as part of the `k8sgpt analyze` command.

> **Note:** Ensure the `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY` environment variables are set as outlined in the [AWS CLI environment variables documentation](https://docs.aws.amazon.com/cli/latest/userguide/cli-configure-envvars.html).

To use the `EKS` filter from the AWS integration, specify it with the `--filter` flag:
```bash
k8sgpt analyze --filter EKS
```

This command analyzes your cluster's EKS resources using K8sGPT. Make sure your EKS resources are running in the specified namespace. The report's results will vary based on the EKS reports available in your cluster.

## Adding and removing default filters

_Remove default filters_
2 changes: 1 addition & 1 deletion docs/reference/cli/index.md
@@ -14,7 +14,7 @@ _Run a scan with the default analyzers_

```
k8sgpt generate
k8sgpt auth new
k8sgpt auth add
k8sgpt analyze --explain
```

2 changes: 1 addition & 1 deletion docs/reference/guidelines/guidelines.md
@@ -9,4 +9,4 @@ The documentation is created with [mkdocs](https://www.mkdocs.org/), specificall

## Contributing projects in the K8sGPT organisation

All project in the K8sGPT organisation follow our [contributing guidelines.](https://github.com/k8sgpt-ai/k8sgpt/blob/main/CONTRIBUTING.md)
All projects in the K8sGPT organisation follow our [contributing guidelines.](https://github.com/k8sgpt-ai/k8sgpt/blob/main/CONTRIBUTING.md)
4 changes: 2 additions & 2 deletions docs/reference/guidelines/privacy.md
@@ -4,7 +4,7 @@ K8sGPT is a privacy-first tool and believe transparency is key for you to unders

## Data we collect

K8sGPT will collect data from Analyzers and either display it directly to you or
with the `--explain` flag it will send it to the selected AI backend.

The type of data collected depends on the Analyzer you are using. For example, the `k8sgpt analyze pod` command will collect the following data:
@@ -23,7 +23,7 @@ To learn more about the privacy policy of our default AI backend OpenAI please v

## Data we protect

When you are sending data through the `--explain` option, there is the capability of anonymising some of that data. This is done by using the `--anonymize` flag. In the example of the Deployment Analyzer, this will obfusicate the following data:
When you send data through the `--explain` option, some of that data can be anonymised by using the `--anonymize` flag. In the example of the Deployment Analyzer, this will obfuscate the following data:

- Deployment name
- Deployment namespace
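As an illustration of what this kind of obfuscation could look like, here is a minimal Python sketch. It is not k8sgpt's actual `--anonymize` implementation; the function name and the hash-based placeholders are assumptions made for the example:

```python
# Illustrative sketch of anonymization: sensitive identifiers are replaced with
# opaque placeholders before the text is sent to an AI backend.
import hashlib

def anonymize(text, secrets):
    """Replace each secret string with a stable, non-reversible placeholder."""
    masked = text
    for s in secrets:
        placeholder = "anon-" + hashlib.sha256(s.encode()).hexdigest()[:8]
        masked = masked.replace(s, placeholder)
    return masked

msg = "Deployment payments-api in namespace prod has 0/3 ready replicas"
out = anonymize(msg, ["payments-api", "prod"])
print(out)  # the real deployment name and namespace never leave the cluster
```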
10 changes: 5 additions & 5 deletions docs/reference/operator/advanced-installation.md
@@ -4,7 +4,7 @@ This documentation lists advanced installation options for the K8sGPT Operator.

## ArgoCD

ArgoCD is a continuous Deployment tool that implements GitOps best practices to install and manage Kubernetes resources.
ArgoCD is a continuous deployment tool that implements GitOps best practices to install and manage Kubernetes resources.

### Prerequisites

@@ -46,9 +46,9 @@ spec:
- CreateNamespace=true
```

Note:

* Ensure that the `namespace` is correctly set to your ArgoCD namespace
* Ensure that the `namespace` is correctly set to your ArgoCD namespace.
* Ensure that the `<VERSION>` is set to the [K8sGPT Operator Release Version](https://github.com/k8sgpt-ai/k8sgpt-operator/releases) that you want to use.
* Modify the `helm.values` section with the Helm Values that you would like to overwrite. Check the [values.yaml](https://github.com/k8sgpt-ai/k8sgpt-operator/tree/main/chart/operator) file of the Operator for options.

@@ -62,7 +62,7 @@ kubectl apply -f application.yaml

You will still need to install the

* K8sGPT Operator CRD
* K8sGPT secret to access the AI backend

that are both detailed in the Operator installation page. The above Application resource will only install the Operator pods themselves not additional resources. Note that you could manage those resources also through ArgoCD. Please refer to the official [ArgoCD documentation](https://argo-cd.readthedocs.io/en/stable/getting_started/) for further information.
that are both detailed in the Operator installation page. The above application resource will only install the Operator pods themselves, not additional resources. Note that you could also manage those resources through ArgoCD. Please refer to the official [ArgoCD documentation](https://argo-cd.readthedocs.io/en/stable/getting_started/) for further information.
27 changes: 22 additions & 5 deletions docs/reference/providers/backend.md
@@ -11,6 +11,7 @@ Currently, we have a total of 10 backends available:
- [Azure OpenAI](https://azure.microsoft.com/en-us/products/cognitive-services/openai-service)
- [Google Gemini](https://ai.google.dev/docs/gemini_api_overview)
- [Google Vertex AI](https://cloud.google.com/vertex-ai)
- [Hugging Face](https://huggingface.co)
- [LocalAI](https://github.com/go-skynet/LocalAI)
- FakeAI

@@ -24,7 +25,7 @@ OpenAI is the default backend for K8sGPT. We recommend using OpenAI first if you
```
- To set the token in K8sGPT, use the following command:
```bash
k8sgpt auth add
```
- Run the following command to analyze issues within your cluster using OpenAI:
```bash
@@ -91,11 +92,11 @@ Example how to deploy Amazon SageMaker with cdk is available in [llm-sagemaker-j
Azure OpenAI Provider provides REST API access to OpenAI's powerful language models. It gives the users an advanced language AI with powerful models with the security and enterprise promise of Azure.

- The Azure OpenAI Provider requires a deployment as a prerequisite. You can visit their [documentation](https://learn.microsoft.com/en-us/azure/cognitive-services/openai/how-to/create-resource?pivots=web-portal#create-a-resource) to create your own.
To authenticate with k8sgpt, you would require an Azure OpenAI endpoint of your tenant `https://your Azure OpenAI Endpoint`,the API key to access your deployment, the Deployment name of your model and the model name itself.
To authenticate with k8sgpt, you would require an Azure OpenAI endpoint of your tenant `https://your Azure OpenAI Endpoint`, the API key to access your deployment, the deployment name of your model and the model name itself.

- Run the following command to authenticate with Azure OpenAI:
```bash
k8sgpt auth --backend azureopenai --baseurl https://<your Azure OpenAI endpoint> --engine <deployment_name> --model <model_name>
k8sgpt auth add --backend azureopenai --baseurl https://<your Azure OpenAI endpoint> --engine <deployment_name> --model <model_name>
```
- Now you are ready to analyze with the Azure OpenAI backend:
```bash
@@ -141,6 +142,22 @@ Google [Gemini](https://blog.google/technology/ai/google-gemini-ai/#performance)
k8sgpt analyze --explain --backend googlevertexai
```

## HuggingFace

Hugging Face is a versatile backend for K8sGPT, offering access to a wide range of pre-trained language models. It provides easy-to-use interfaces for both training and inference tasks. Refer to the Hugging Face [documentation](https://huggingface.co/docs) for further insights into model usage and capabilities.

- To use the Hugging Face API in K8sGPT, obtain [the API key](https://huggingface.co/settings/tokens).
- Configure the HuggingFace backend in K8sGPT by specifying the desired model (see all available [models](https://huggingface.co/models)) using the auth command:
```bash
k8sgpt auth add --backend huggingface --model <model name>
```
> NOTE: Since the default gpt-3.5-turbo model is not available in Hugging Face, a valid backend model is required.
- Once configured, you can analyze issues within your cluster using the Hugging Face provider with the following command:
```bash
k8sgpt analyze --explain --backend huggingface
```

## LocalAI

LocalAI is a local inference server that exposes an OpenAI-compatible API. It uses llama.cpp and ggml to run inference on consumer-grade hardware. Models supported by LocalAI include, for instance, Vicuna, Alpaca, LLaMA, Cerebras, GPT4ALL, GPT4ALL-J and koala.
@@ -149,7 +166,7 @@ LocalAI is a local model, which is an OpenAI compatible API. It uses llama.cpp a
- To start the API server, follow the instruction in [LocalAI](https://github.com/go-skynet/LocalAI#example-use-gpt4all-j-model).
- Authenticate K8sGPT with LocalAI:
```bash
k8sgpt auth new --backend localai --model <model_name> --baseurl http://localhost:8080/v1
k8sgpt auth add --backend localai --model <model_name> --baseurl http://localhost:8080/v1
```
- Analyze with a LocalAI backend:
```bash
@@ -159,7 +176,7 @@ LocalAI is a local model, which is an OpenAI compatible API. It uses llama.cpp a
## FakeAI

FakeAI or the NoOpAiProvider might be useful in situations where you need to test a new feature or simulate the behaviour of an AI-based system without actually invoking it. It can help you with local development, testing and troubleshooting.
The NoOpAiProvider does not actually perfornm any AI-based operations but simulates them by echoing the input given as a problem.
The NoOpAiProvider does not actually perform any AI-based operations but simulates them by echoing the input given as a problem.
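The echo behaviour can be pictured in a few lines of Python. This is an illustrative stand-in only — the real NoOpAiProvider lives in k8sgpt's Go codebase, and the class and method names here are made up for the sketch:

```python
# Illustrative stand-in for a no-op AI provider: it performs no AI call and
# simply echoes the problem text back as the "analysis".
class NoOpAIProvider:
    def get_completion(self, prompt: str) -> str:
        # Simulate a model response by returning the input unchanged.
        return prompt

provider = NoOpAIProvider()
problem = "Pod default/my-pod is in CrashLoopBackOff"
print(provider.get_completion(problem))  # echoes the problem verbatim
```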

Follow the steps outlined below to learn how to utilize the NoOpAiProvider:

2 changes: 1 addition & 1 deletion docs/tutorials/content-collection/content-collection.md
@@ -1,6 +1,6 @@
# Content Collection

This section provides a collection of vidoes, blog posts and more on K8sGPT, posted on external sites.
This section provides a collection of videos, blog posts and more on K8sGPT, posted on external sites.

## Blogs
Have a look at the K8sGPT blog on the [website.](https://k8sgpt.ai/blog/)
2 changes: 1 addition & 1 deletion docs/tutorials/index.md
@@ -1,6 +1,6 @@
# Tutorials

This section provides
This section provides:

* end-to-end tutorials on specific use cases
* a collection of user and contributor created content
2 changes: 1 addition & 1 deletion docs/tutorials/playground.md
@@ -4,7 +4,7 @@ If you want to try out K8sGPT, we highly suggest you to follow this Killrcoda ex

Link: [**K8sGPT CLI Tutorial**](https://killercoda.com/matthisholleville/scenario/k8sgpt-cli)

This tutorials covers:
This tutorial covers:

- Run a simple analysis and explore possible options
- Discover how the AI explanation works
