s3-query is a small HTTP service for querying the contents of S3 buckets.

Prerequisites
- Go
- Docker
- kubectl
- terraform
- Jenkins with the following plugins: Docker Pipeline, Terraform, Kubernetes CLI. GitHub needs to be configured as 'scm', and Docker Hub credentials need to be configured as 'docker-hub-credentials'.
- minikube
- kubeseal
Usage
./s3-query --help
Usage of ./s3-query:
-port string
HTTP port. HTTPPORT environment variable can also be used. (default "8080")
The port can also be set with the HTTPPORT environment variable; the -port parameter takes precedence.
Run the application.
./s3-query -port 8083
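A minimal sketch, assuming only what the help output above states (the -port flag, the HTTPPORT variable, and the 8080 default), of how this precedence could be implemented in Go; the actual main.go may differ.

```go
package main

import (
	"flag"
	"log"
	"net/http"
	"os"
)

func main() {
	// Fall back to the HTTPPORT environment variable, then to 8080.
	defaultPort := os.Getenv("HTTPPORT")
	if defaultPort == "" {
		defaultPort = "8080"
	}

	// The -port flag takes precedence: passing it on the command line
	// overrides whatever default was resolved above.
	port := flag.String("port", defaultPort, "HTTP port. HTTPPORT environment variable can also be used.")
	flag.Parse()

	log.Printf("listening on :%s", *port)
	log.Fatal(http.ListenAndServe(":"+*port, nil))
}
```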
Docker Build
sudo docker build -t mintojoseph/s3-query:1.0 .
sudo docker container run -p 8080:8080 mintojoseph/s3-query:1.0
AWS Credentials
kubeseal is used to manage the AWS credentials.
Base64-encode the credentials.
echo -n '<aws creds>' | base64
Update the credentials in the secret.yml file as shown below.
apiVersion: v1
kind: Secret
metadata:
  name: secret-basic-auth
data:
  username: <aws_access_key_id>
  password: <aws_secret_access_key>
Create the sealed secret using kubeseal.
kubeseal --format yaml <secret.yml >sealedsecret.yml
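For the application to use these credentials inside the cluster, the deployment presumably exposes the secret's keys to the container. The sketch below assumes they arrive as AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY environment variables and shows how a Go program could build an AWS session from them; the variable names and region are assumptions, not taken from this repository.

```go
package main

import (
	"log"
	"os"

	"github.com/aws/aws-sdk-go/aws"
	"github.com/aws/aws-sdk-go/aws/credentials"
	"github.com/aws/aws-sdk-go/aws/session"
)

func main() {
	// Assumption: the Kubernetes deployment maps the secret's
	// "username"/"password" keys into these environment variables.
	accessKey := os.Getenv("AWS_ACCESS_KEY_ID")
	secretKey := os.Getenv("AWS_SECRET_ACCESS_KEY")

	sess, err := session.NewSession(&aws.Config{
		Region:      aws.String("us-east-1"), // hypothetical region
		Credentials: credentials.NewStaticCredentials(accessKey, secretKey, ""),
	})
	if err != nil {
		log.Fatalf("failed to create AWS session: %v", err)
	}
	log.Println("AWS session created for region:", aws.StringValue(sess.Config.Region))
}
```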
Querying
The server can be queried using the following syntax.
curl <hostname>:<port>/list?name=<bucket name>
$ curl 192.168.39.142:8080/list?name=mintos-test-bucket
Name: sample.war
Storage class:STANDARD
Name: testfile
Storage class:STANDARD
Found 2 items in bucket
Use the EXTERNAL-IP from the following command as the hostname.
kubectl get svc
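For reference, here is a hedged sketch of what the handler behind /list?name=<bucket name> might look like, using aws-sdk-go's ListObjectsV2 call. The handler name, the hard-coded port, and the exact formatting are assumptions modeled on the example output above, not the actual main.go.

```go
package main

import (
	"fmt"
	"log"
	"net/http"

	"github.com/aws/aws-sdk-go/aws"
	"github.com/aws/aws-sdk-go/aws/session"
	"github.com/aws/aws-sdk-go/service/s3"
)

// listHandler answers /list?name=<bucket name> with each object's name and
// storage class followed by an item count, mirroring the example output above.
func listHandler(w http.ResponseWriter, r *http.Request) {
	bucket := r.URL.Query().Get("name")
	if bucket == "" {
		http.Error(w, "missing 'name' query parameter", http.StatusBadRequest)
		return
	}

	// Credentials and region are taken from the environment (see the secret above).
	svc := s3.New(session.Must(session.NewSession()))

	resp, err := svc.ListObjectsV2(&s3.ListObjectsV2Input{Bucket: aws.String(bucket)})
	if err != nil {
		http.Error(w, err.Error(), http.StatusInternalServerError)
		return
	}

	for _, obj := range resp.Contents {
		fmt.Fprintf(w, "Name: %s\nStorage class:%s\n", aws.StringValue(obj.Key), aws.StringValue(obj.StorageClass))
	}
	fmt.Fprintf(w, "Found %d items in bucket\n", len(resp.Contents))
}

func main() {
	http.HandleFunc("/list", listHandler)
	log.Fatal(http.ListenAndServe(":8080", nil)) // port handling omitted for brevity
}
```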
Files
- deployment/k8s/ - Kubernetes YAML files for the s3-query application.
- terraform/ - Terraform code for deploying an S3 bucket.
- main.go - The main program.
- Jenkinsfile - Jenkins pipeline to build and deploy.
- Dockerfile - To build the Docker image.
- Makefile - To build the program.