chore(docs): add getting started guide to docs
Umaaz committed Jun 7, 2023
1 parent 277631b commit 327f592
Showing 19 changed files with 98 additions and 7,131 deletions.
@@ -6,4 +6,4 @@ a [deployment](https://github.com/intergral/deep/tree/master/examples/kubernetes
## Configure

Before you can use any of the examples that use AWS S3 you will need to create an IAM role to allow DEEP access to the
bucket. See [AWS/Permissions](./aws/permissions.md) for more info.
bucket. See [AWS/Permissions](permissions.md) for more info.
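
As a rough illustration only (the linked AWS/Permissions page is the authoritative reference), the role used by DEEP typically needs the usual object-storage actions on the bucket. A sketch using the AWS CLI, where the policy name, bucket name, and action list are all assumptions:

```bash
# Sketch only: see AWS/Permissions for the exact policy DEEP requires.
# "deep-s3-access" and "my-deep-bucket" are placeholders; the action list is an
# assumption (typical read/write/list/delete object-storage permissions).
aws iam create-policy \
  --policy-name deep-s3-access \
  --policy-document '{
    "Version": "2012-10-17",
    "Statement": [{
      "Effect": "Allow",
      "Action": ["s3:ListBucket", "s3:GetObject", "s3:PutObject", "s3:DeleteObject"],
      "Resource": ["arn:aws:s3:::my-deep-bucket", "arn:aws:s3:::my-deep-bucket/*"]
    }]
  }'
# The resulting policy would then be attached to the IAM role used by DEEP.
```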
2 changes: 1 addition & 1 deletion docs/docs/deploy/local.md
@@ -1,4 +1,4 @@
# Deploy DEEP Locally
# Deploy Locally

The easiest way to evaluate DEEP and to test the features is to deploy the stack locally. To do this there are a few
examples using docker compose available in the [main repo](https://github.com/intergral/deep/tree/master/examples/docker-compose).
85 changes: 85 additions & 0 deletions docs/docs/getstarted.md
@@ -0,0 +1,85 @@
# Getting Started

This page covers how to get started with DEEP using the local Docker Compose stack.

## Prerequisites

To follow this guide you will need to install:

- [Git](https://git-scm.com/book/en/v2/Getting-Started-Installing-Git)
- [Docker](https://docs.docker.com/engine/install/)
- [Docker Compose](https://docs.docker.com/compose/)
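
If you want to confirm the tools are installed and on your `PATH` before starting, a quick check:

```bash
# Print the installed versions of the prerequisites.
git --version
docker --version
docker compose version
```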

## Guide

To start DEEP using the local Docker stack, follow these steps:

1. Clone the repo
```bash
git clone https://github.com/intergral/deep.git
```
2. Open the example directory
```bash
cd deep/examples/docker-compose/local
```
3. Start the services
```bash
docker compose up -d
```

At this point Docker will start the required containers (a quick way to verify them is shown after this list). These will include:

- Deep - running in single-tenant, single-binary mode
- Prometheus - collecting metrics from Deep
- Grafana - with data sources connecting to Deep and Prometheus (this is currently a custom-built image with the Deep
  plugins installed)
- Test App - a simple Python test app that can be used as the target to debug
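
To verify that the stack came up, you can list the services and follow the test app's logs from the same `examples/docker-compose/local` directory (the `test_app` service name is taken from the compose file in this commit):

```bash
# List the containers started by the example stack.
docker compose ps

# Follow the bundled test app's logs to confirm it has started.
docker compose logs -f test_app
```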

## Using Deep

Now that we have DEEP running and a test app connected, let's see what we can do.

1. Open the Grafana [explore page](http://localhost:3000/explore).
2. Ensure that the Deep data source is selected (it should be the default).
3. Open the query input to create a tracepoint:

![Tracepoint Create](./images/Explore_TracepointCreate.png)

4. Here we can enter the file and line number at which to collect data. In this example we want to use:
   - File Path: simple_test.py
   - Line Number: 31

   Once set, click 'Create Tracepoint' to create the tracepoint.
5. If created successfully, you will see the result in the query results below:

![Tracepoint List](./images/Explore_TracepointList.png)

6. This new config will be sent to the connected client and will result in a Snapshot being created. To see the
   Snapshots, click on the link in the 'Tracepoint ID' column. This will open a split view and show the available
   Snapshots.

Note: It can take up to a minute before the Snapshot is available in the results.

![Snapshot Result](./images/Explore_SnapshotSplitView.png)

7. From here you can select a Snapshot from the list to view its full details.

![Snapshot Panel](./images/Explore_SnapshotPanel.png)

Now that you know how to create tracepoints and view the data, you can experiment with creating other tracepoints, changing the
settings to collect watches, or changing the fire count.

The source code for the test app is open source: [intergral/deep-python-client/examples](https://github.com/intergral/deep-python-client/tree/master/examples)
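
If you want to inspect or modify the test app itself, you can clone the client repo and browse the examples (paths assume the repository's default layout):

```bash
# Fetch the Python client repo and open the bundled examples.
git clone https://github.com/intergral/deep-python-client.git
cd deep-python-client/examples
```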


## Clean up
To clean up this example, you simply need to remove the compose stack.

```bash
docker compose down -v
```

You may also want to remove the stored data:
```bash
rm -Rf deep-data
```
Binary file added docs/docs/images/Explore_SnapshotPanel.png
Binary file added docs/docs/images/Explore_SnapshotSplitView.png
Binary file added docs/docs/images/Explore_TracepointCreate.png
Binary file added docs/docs/images/Explore_TracepointList.png
2 changes: 1 addition & 1 deletion docs/docs/index.md
@@ -11,7 +11,7 @@ on [GitHub](https://github.com/intergra/deep)
To use DEEP you need to deploy the service; there are a few options for this:

- [Local as a docker compose stack](./deploy/local.md)
- [Using AWS EKS](./deploy/aws-eks.md)
- [Using AWS EKS](deploy/aws/aws-eks.md)

Once the service is deployed you will need to set up a client and connect Grafana to the service.

10 changes: 10 additions & 0 deletions docs/mkdocs.yml
@@ -10,3 +10,13 @@ copyright: Intergral GmbH 2023

theme:
  name: material


markdown_extensions:
  - pymdownx.highlight:
      anchor_linenums: true
      line_spans: __span
      pygments_lang_class: true
  - pymdownx.inlinehilite
  - pymdownx.snippets
  - pymdownx.superfences
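
To preview the docs site with these extensions locally, a minimal sketch assuming the Material for MkDocs toolchain (which also provides the `pymdownx` extensions enabled above):

```bash
# Install the theme (pulls in pymdown-extensions) and serve the docs locally.
pip install mkdocs-material
cd docs
mkdocs serve
```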
4 changes: 0 additions & 4 deletions examples/docker-compose/debug/docker-compose.yaml
@@ -35,16 +35,12 @@ services:
- GF_AUTH_ANONYMOUS_ENABLED=true
- GF_AUTH_ANONYMOUS_ORG_ROLE=Admin
- GF_AUTH_DISABLE_LOGIN_FORM=true
- GF_FEATURE_TOGGLES_ENABLE=traceqlEditor
ports:
- "3000:3000"

test_app:
image: ghcr.io/intergral/deep-python-client:simple-app
environment:
- DEEP_SERVICE_SECURE=False
- DEEP_LOGGING_CONF=/etc/client_logging.conf
volumes:
- ../shared/client_logging.conf:/etc/client_logging.conf
depends_on:
- deep
4 changes: 0 additions & 4 deletions examples/docker-compose/distributed/docker-compose.yaml
@@ -110,9 +110,6 @@ services:
environment:
- DEEP_SERVICE_URL=distributor:43315
- DEEP_SERVICE_SECURE=False
- DEEP_LOGGING_CONF=/etc/client_logging.conf
volumes:
- ../shared/client_logging.conf:/etc/client_logging.conf
depends_on:
- distributor

@@ -135,6 +132,5 @@
- GF_AUTH_ANONYMOUS_ENABLED=true
- GF_AUTH_ANONYMOUS_ORG_ROLE=Admin
- GF_AUTH_DISABLE_LOGIN_FORM=true
- GF_FEATURE_TOGGLES_ENABLE=traceqlEditor
ports:
- "3000:3000"
4 changes: 0 additions & 4 deletions examples/docker-compose/local/docker-compose.yaml
@@ -34,16 +34,12 @@ services:
- GF_AUTH_ANONYMOUS_ENABLED=true
- GF_AUTH_ANONYMOUS_ORG_ROLE=Admin
- GF_AUTH_DISABLE_LOGIN_FORM=true
- GF_FEATURE_TOGGLES_ENABLE=traceqlEditor
ports:
- "3000:3000"

test_app:
image: ghcr.io/intergral/deep-python-client:simple-app
environment:
- DEEP_SERVICE_SECURE=False
- DEEP_LOGGING_CONF=/etc/client_logging.conf
volumes:
- ../shared/client_logging.conf:/etc/client_logging.conf
depends_on:
- deep
3 changes: 0 additions & 3 deletions examples/docker-compose/s3/docker-compose.yaml
@@ -30,9 +30,6 @@
image: ghcr.io/intergral/deep-python-client:simple-app
environment:
- DEEP_SERVICE_SECURE=False
- DEEP_LOGGING_CONF=/etc/client_logging.conf
volumes:
- ../shared/client_logging.conf:/etc/client_logging.conf
depends_on:
- deep

27 changes: 0 additions & 27 deletions examples/docker-compose/shared/client_logging.conf

This file was deleted.

