A streaming service built on Kubernetes with a Vue frontend
Explore the docs »
Report Bug
·
Request Feature
I started this project to get hands-on experience with Kubernetes and Docker. There is a lot that can be improved; you may suggest changes by opening an issue, or by forking this repo and creating a pull request.
The majority of the code is written in TypeScript. Here is a list of all the frameworks used:
- MySQL - Database
- TypeScript/Node.js/Python - Backend
- Vue/Nuxt.js - Frontend
- JWT - Authentication
- GraphQL/REST - API requests
- Elasticsearch - Search
- Apache Spark - Calculating user views, generating video recommendations and trending videos
- FFmpeg, Shaka Packager - Video processing
- RabbitMQ - Distributed transcoding
- Skaffold - Code reloads during development
- Kubernetes/Docker - Microservices management
There are some additional optional frameworks used to improve scalability; these will be covered in their individual guides.
- Material Design
- Login/Signup
- Adaptive video playback
- Video uploads
- Channel subscription
- Video likes, comments
- Turn off lights
- Trending, recent videos
- Recommended videos based on watch history
- Distributed video transcoding
- And many more.
To get a local copy up and running you need Docker and a Kubernetes cluster running. Ensure you have at least 8 GB of RAM available.
- Skaffold is needed for an easier build process. Download the binary and add it to your system path.
- Helm can also be installed to speed up the process.
- This guide assumes that both Helm and Skaffold are installed.
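Before continuing, it can save time to confirm that every required tool is actually on your `PATH`. This check is a convenience sketch, not part of the repo:

```sh
# Check that each required CLI tool is installed and on the PATH
for tool in docker kubectl helm skaffold; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found"
  else
    echo "$tool: MISSING"
  fi
done
```

Any tool reported as MISSING must be installed before the steps below will work.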
- Start by cloning the repo:

  ```sh
  git clone https://github.com/arjit95/vidstream
  cd vidstream
  ```

- Set up a new Kubernetes namespace:

  ```sh
  # Create a new kubernetes namespace
  $ kubectl create ns vidstream
  # Switch to the new namespace
  $ kubectl config set-context --current --namespace=vidstream
  ```

- Set up storage:
  ```sh
  # Create a new nfs volume
  # Add the stable repo for the nfs-server-provisioner chart
  $ helm repo add stable https://kubernetes-charts.storage.googleapis.com
  $ helm repo update
  $ helm install nfs-server stable/nfs-server-provisioner

  # Create a new volume and claims
  # These will be used to store your videos/assets
  # Before this step, change the hostPath in pv.yml to match your machine configuration
  $ kubectl apply -f deploy/kubectl/volume/pv.yml
  # Apply claim
  $ kubectl apply -f deploy/kubectl/volume/claim.yml
  ```

- RabbitMQ:
  ```sh
  $ helm repo add bitnami https://charts.bitnami.com/bitnami
  $ helm repo update
  $ helm install \
      --set auth.username=<username>,auth.password=<password>,persistence.enabled=false \
      rabbitmq bitnami/rabbitmq
  ```

- Elasticsearch:
  ```sh
  ## bare-bones es installation
  $ kubectl apply -f deploy/kubectl/elasticsearch.yml
  ```

- Update `deploy/kubectl/secrets.yml` with your RabbitMQ username/password and assign new values for the DB username and password. All values are base64 encoded. Here's the field mapping:
  - `SECRET_QUEUE_USERNAME`: RabbitMQ username
  - `SECRET_QUEUE_PASSWORD`: RabbitMQ password
  - `SECRET_DB_USERNAME`: MySQL username
  - `SECRET_DB_PASSWORD`: MySQL password
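Since all secret values are base64 encoded, each credential can be encoded on the command line before pasting it into `secrets.yml`. The value `myuser` below is a placeholder:

```sh
# Encode a value for secrets.yml; -n prevents a trailing newline
# from being encoded along with the value
echo -n 'myuser' | base64          # prints bXl1c2Vy
# Decode to double-check what you pasted
echo 'bXl1c2Vy' | base64 --decode  # prints myuser
```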
- MySQL. To deploy a sharded cluster follow this guide.

  ```sh
  ## bare-bones mysql installation
  $ kubectl apply -f deploy/kubectl/mysql.yml
  ## Get the mysql pod name using
  $ kubectl get pods
  ## Exec into the pod and create a new database. The DB name is defined in deploy/kubectl/configmap.yml
  $ kubectl exec -it <pod name> -- sh
  ```
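Once inside the pod shell, the database can be created with the mysql client. The database name `vidstream` below is only an example; use the name from `deploy/kubectl/configmap.yml` and the root password you configured:

```sh
# Run inside the MySQL pod (after kubectl exec).
# "vidstream" is a placeholder for the DB name in deploy/kubectl/configmap.yml
mysql -u root -p -e "CREATE DATABASE IF NOT EXISTS vidstream;"
# Confirm the database exists
mysql -u root -p -e "SHOW DATABASES;"
```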
- Update `SECRET_JWT_TOKEN` in `deploy/kubectl/secrets.yml` with a random value. This will be used as the secret key for signing JWT tokens.
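One way to produce such a random value (an illustration, not prescribed by the repo) is with openssl, then base64 encode it like the other secrets:

```sh
# Generate 16 random bytes as a 32-character hex string
secret=$(openssl rand -hex 16)
echo "raw value:      $secret"
# secrets.yml expects the base64-encoded form
echo "base64 encoded: $(echo -n "$secret" | base64)"
```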
- Get your minikube IP and your private IP and update `deploy/kubectl/configmap.yml`:
  - Update `CONFIG_API_SERVICE` with `<minikube ip>:32767`
  - Update `CONFIG_CORS_ALLOWED_ORIGINS` with `<private ip>:3000`
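On Linux, the two addresses can be looked up as follows (`minikube ip` requires a running cluster, and `hostname -I` is Linux-specific):

```sh
# Cluster address, used for CONFIG_API_SERVICE
command -v minikube >/dev/null 2>&1 && minikube ip
# Private (LAN) address of this machine, used for CONFIG_CORS_ALLOWED_ORIGINS;
# the first entry reported is usually the LAN address
hostname -I | awk '{print $1}'
```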
- Start all the services:

  ```sh
  $ skaffold run --tail
  # Or start in dev mode to enable code reloads on change.
  $ skaffold dev
  ```

- The first build will take some time depending upon your machine.
- Meanwhile, open a separate terminal and run the following commands to start your frontend service:
  ```sh
  $ cd src/web
  ## Replace with your minikube ip
  $ export API_SERVICE_ADDR="<minikube ip>:32767"
  # Start frontend server
  $ npm run dev
  # Or alternatively
  $ yarn dev
  ```

- With this setup, users will get related videos based on similar keywords. If you want user-based recommendations, view counter updates, trending video generation, etc., you need to set up some Spark jobs to process user data. Docs for the Spark jobs can be viewed here.
If everything has worked up to this point, you can visit the URL below in your browser.
http://<private_ip>:3000
Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.
- Fork the Project
- Create your Feature Branch (`git checkout -b feature/AmazingFeature`)
- Commit your Changes (`git commit -m 'Add some AmazingFeature'`)
- Push to the Branch (`git push origin feature/AmazingFeature`)
- Open a Pull Request
Most of the code is distributed under the MIT License, with parts of the code under the Apache License. See LICENSE for more information.
Arjit Srivastava - Email
