Add docker playground to practice tutorials #258

Open · wants to merge 2 commits into `main`
12 changes: 12 additions & 0 deletions demo/Dockerfile
@@ -0,0 +1,12 @@
FROM bitnami/spark:3.2.4

USER root

# Install vim and set up the /home/onetable working directory
RUN apt-get update \
&& apt-get install -y vim \
&& rm -rf /var/lib/apt/lists/* \
&& mkdir -p /home/onetable \
&& chmod -R 777 /home/onetable

WORKDIR /home/onetable
53 changes: 41 additions & 12 deletions demo/docker-compose.yaml
@@ -1,5 +1,46 @@
version: "3.9"
services:
spark-master:
Contributor:
We can currently run Spark from the notebook; should we just do that?

Contributor Author:
Leaving this PR open until I can confirm the quickstart examples in the demo Jupyter notebook work seamlessly.

build:
context: .
dockerfile: Dockerfile # Dockerfile creates /home/onetable dir and gives permissions to user
environment:
- SPARK_MODE=master
- SPARK_MASTER_PORT=7077
ports:
- "18080:18080" # Spark master web UI port
- "7077:7077" # Spark master port
volumes:
- ./jars/utilities-0.1.0-SNAPSHOT-bundled.jar:/home/onetable/utilities/target/utilities-0.1.0-SNAPSHOT-bundled.jar

spark-worker-1:
image: bitnami/spark:3.2.4
environment:
- SPARK_MODE=worker
- SPARK_MASTER_URL=spark://spark-master:7077
depends_on:
- spark-master

spark-worker-2:
image: bitnami/spark:3.2.4
environment:
- SPARK_MODE=worker
- SPARK_MASTER_URL=spark://spark-master:7077
depends_on:
- spark-master

hive-metastore:
container_name: hive-metastore
hostname: hive-metastore
image: 'apache/hive:4.0.0-alpha-2'
ports:
- '9083:9083' # Metastore Thrift
environment:
SERVICE_NAME: metastore
HIVE_METASTORE_WAREHOUSE_DIR: /home/data
volumes:
- ./data:/home/data

trino:
container_name: trino
ports:
@@ -21,18 +62,6 @@ services:
- ./presto/node.properties:/opt/presto-server/etc/node.properties
- ./data:/home/data

hive-metastore:
container_name: hive-metastore
hostname: hive-metastore
image: 'apache/hive:4.0.0-alpha-2'
ports:
- '9083:9083' # Metastore Thrift
environment:
SERVICE_NAME: metastore
HIVE_METASTORE_WAREHOUSE_DIR: /home/data
volumes:
- ./data:/home/data

jupyter:
container_name: jupyter
hostname: jupyter
5 changes: 3 additions & 2 deletions demo/start_demo.sh
@@ -1,10 +1,11 @@
#!/bin/bash
# Create the require jars for the demo and copy them into a directory we'll mount in our notebook container
cd .. && mvn install -am -pl core -DskipTests -T 2
# Create the required jars for the demo and copy them into a directory we'll mount in our notebook & spark containers
cd .. && mvn install -DskipTests -T 2
mkdir -p demo/jars
cp hudi-support/utils/target/hudi-utils-0.1.0-SNAPSHOT.jar demo/jars
cp api/target/onetable-api-0.1.0-SNAPSHOT.jar demo/jars
cp core/target/onetable-core-0.1.0-SNAPSHOT.jar demo/jars
cp utilities/target/utilities-0.1.0-SNAPSHOT-bundled.jar demo/jars

cd demo
docker-compose up
File renamed without changes.
36 changes: 36 additions & 0 deletions website/docs/docker/playground.md
@@ -0,0 +1,36 @@
---
sidebar_position: 2
title: "Docker Playground"
---

# OneTable Playground
This playground helps you install and work with OneTable in a dockerized environment.

## Pre-requisites
* Install Docker on your local machine
* Clone [OneTable GitHub repository](https://github.com/onetable-io/onetable)

:::note NOTE:
This demo was tested on an AArch64-based macOS machine
:::

## Setting up the playground
After cloning the OneTable repository, change into the `demo` directory and run the `start_demo.sh` script.
This script builds the OneTable jars required for the demo and then spins up Docker containers that start a Spark cluster,
a Jupyter notebook with a Scala interpreter, a Hive Metastore, Presto, and Trino.

```shell md title="shell"
cd demo
./start_demo.sh
```
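
To verify that everything came up, you can list the Compose services from the same `demo` directory; the Spark master and workers, Hive Metastore, Presto, Trino, and Jupyter containers should all be running:

```shell md title="shell"
docker-compose ps
```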

### Accessing the Spark cluster
You can access the Spark master by running `docker exec -it demo-spark-master-1 /bin/bash` in a separate terminal window.
This opens a bash shell on the master node and places you in the `/home/onetable` directory.
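
For convenience, the same command as a copy-pasteable snippet (the container name assumes the Compose project was started from the `demo` directory, as above):

```shell md title="shell"
docker exec -it demo-spark-master-1 /bin/bash
```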

### Running the tutorial
Once inside the master node, you can follow the [Creating your first interoperable table](/docs/how-to#steps) tutorial.

To run sync, create a `my_config.yaml` file in the same directory with `vi my_config.yaml`, copy-paste the contents
from the [run sync](/docs/how-to#running-sync) section, and then run
`java -jar utilities/target/utilities-0.1.0-SNAPSHOT-bundled.jar --datasetConfig my_config.yaml`, as sketched below.
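
A minimal sketch of those steps, assuming you are still in `/home/onetable` on the master node and that the configuration contents come from the how-to guide:

```shell md title="shell"
# Create the sync configuration; paste the contents from the "Running sync" section of the how-to guide
vi my_config.yaml

# Run sync with the bundled utilities jar that is mounted into the container
java -jar utilities/target/utilities-0.1.0-SNAPSHOT-bundled.jar --datasetConfig my_config.yaml
```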
14 changes: 11 additions & 3 deletions website/sidebars.js
@@ -29,7 +29,7 @@ module.exports = {
{
type: 'category',
label: 'Catalogs',
collapsed: false,
collapsed: true,
link: {
type: 'doc',
id: 'catalogs-index'
@@ -44,7 +44,7 @@ {
{
type: 'category',
label: 'Query Engines',
collapsed: false,
collapsed: true,
link: {
type: 'doc',
id: 'query-engines-index'
@@ -62,6 +62,14 @@ }
}
]
},
'demo/docker',
{
type: 'category',
label: 'Docker',
collapsed: false,
items: [
'docker/demo',
'docker/playground',
],
}
],
};