File Storage Service handles uploading, listing, and deleting files in object storage.
- Table of Contents
- Project Summary
- Architecture Diagram
- Installation
- Development workflow and guidelines
- CLI
- Project Structure
- GitHub Actions CI
- Documentation
Item | Description |
---|---|
Golang Version | 1.19 |
Object Storage | MinIO and minio-go |
Mock | mockgen |
Linter | GolangCI-Lint |
Testing | testing and testify/assert |
Load Testing | ghz |
API | gRPC and gRPC-Gateway |
CLI | flag |
Application Architecture | Clean Architecture |
Directory Structure | Standard Go Project Layout |
CI (Lint, Test, Generate) | GitHub Actions |
Visualize Code Diagram | go-callvis |
Sequence Diagram | Mermaid |
Protobuf Operations | buf |
Instrumentation | OpenTelemetry and Jaeger |
Logger | zap |
See the following page to download and install Golang.
You can install all tools for development and deployment for this service by running:
$ go mod download
$ make install
This project uses gRPC and Protocol Buffers, so everything the API needs, such as service definitions, RPC lists, and entities, is stored in the api/proto directory.
If you are unfamiliar with Protocol Buffers, please visit this link for details:
To generate the proto files, make sure these tools are installed on your system; please refer to these links:
- https://buf.build/
- https://grpc.io/docs/protoc-installation
- https://grpc.io/docs/languages/go/quickstart/
Validation for this API uses protoc-gen-validate; for details, please refer to this library:
This service also implements gRPC-Gateway using this library:
Generating the gRPC-Gateway and OpenAPI files requires additional packages:
- github.com/grpc-ecosystem/grpc-gateway/v2/protoc-gen-grpc-gateway
- github.com/grpc-ecosystem/grpc-gateway/v2/protoc-gen-openapiv2
Then, generating the Protobuf files can be done with this command:
$ make protoc
If you have difficulty installing the dependencies needed to generate the proto files, you can build the Docker image and use that instead.
Here are the 2 commands for building and generating:
make build-protoc
make docker-protoc
Instead of storing files on disk, this project uses MinIO as object storage. You can follow the installation guide on the official page, or simply run this docker-compose command to set up everything, including the bucket:
$ docker-compose -f ./development/docker-compose.yml up minio
To create the default bucket:
$ docker-compose -f ./development/docker-compose.yml up createbuckets
To create a backup:
$ docker-compose -f ./development/docker-compose.yml up createbackup
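As a rough sketch of how the service might talk to this MinIO instance, the snippet below uses minio-go (the client library listed above) with the docker-compose default credentials. The bucket, object name, and local path are only examples, not the project's actual code, and it requires the MinIO container to be running.

```go
package main

import (
	"context"
	"log"

	"github.com/minio/minio-go/v7"
	"github.com/minio/minio-go/v7/pkg/credentials"
)

func main() {
	// Connect using the local docker-compose defaults; adjust for your setup.
	client, err := minio.New("localhost:9000", &minio.Options{
		Creds:  credentials.NewStaticV4("minioadmin", "minioadmin", ""),
		Secure: false,
	})
	if err != nil {
		log.Fatalf("minio client: %v", err)
	}

	// Upload a local file into the "default" bucket created by createbuckets.
	info, err := client.FPutObject(context.Background(), "default", "test.txt",
		"/tmp/test.txt", minio.PutObjectOptions{ContentType: "text/plain"})
	if err != nil {
		log.Fatalf("upload: %v", err)
	}
	log.Printf("uploaded %s (%d bytes)", info.Key, info.Size)
}
```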
This service implements OpenTelemetry instrumentation to measure performance. The trace data is exported to Jaeger and can be viewed in the Jaeger UI at http://localhost:16686
To run Jaeger, use this docker-compose command:
$ docker-compose -f ./development/docker-compose.yml up jaeger
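For reference, wiring a tracer to that collector endpoint might look like the sketch below, using the OpenTelemetry Jaeger exporter. The service name and the exact setup in this project are assumptions; the endpoint matches the OTEL_AGENT value used later in this README.

```go
package main

import (
	"log"

	"go.opentelemetry.io/otel"
	"go.opentelemetry.io/otel/exporters/jaeger"
	"go.opentelemetry.io/otel/sdk/resource"
	sdktrace "go.opentelemetry.io/otel/sdk/trace"
	semconv "go.opentelemetry.io/otel/semconv/v1.17.0"
)

func main() {
	// Export spans to the Jaeger collector endpoint (see OTEL_AGENT below).
	exp, err := jaeger.New(jaeger.WithCollectorEndpoint(
		jaeger.WithEndpoint("http://localhost:14268/api/traces")))
	if err != nil {
		log.Fatalf("jaeger exporter: %v", err)
	}

	// Register a tracer provider that batches spans to the exporter.
	tp := sdktrace.NewTracerProvider(
		sdktrace.WithBatcher(exp),
		sdktrace.WithResource(resource.NewWithAttributes(
			semconv.SchemaURL,
			semconv.ServiceNameKey.String("file-storage"), // assumed service name
		)),
	)
	otel.SetTracerProvider(tp)
}
```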
You can simply execute the following command to run all test cases in this service:
$ make test
To run the linter, make sure these tools are already installed on your system:
Then check the Go and proto code style with this command:
$ make lint
This service uses mocks in several places, such as the repository, usecase, and pkg layers.
To regenerate the mocks when an interface changes, run go generate via this command:
$ make mock
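To illustrate how this fits together, an interface can carry a go:generate directive so that make mock (which wraps go generate) rebuilds its mock with mockgen. The interface, file names, and package paths below are hypothetical, not the project's actual code.

```go
package repository

//go:generate mockgen -source=storage.go -destination=./mock/storage_mock.go -package=mock

import "context"

// Storage is a hypothetical repository interface; running `go generate ./...`
// (via `make mock`) regenerates its mock whenever the interface changes.
type Storage interface {
	Upload(ctx context.Context, bucket, filename string, data []byte) error
	List(ctx context.Context, bucket string) ([]string, error)
	Delete(ctx context.Context, bucket, object string) error
}
```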
To run the service, you need MinIO running and a few environment variables set:
# app config
export APP_ENV=dev
export SERVER_PORT=8080
# minio config
export MINIO_HOST=localhost:9000
export MINIO_ACCESS_KEY_ID=minioadmin
export MINIO_SECRET_ACCESS_KEY=minioadmin
# tracing config
export OTEL_AGENT=http://localhost:14268/api/traces
Or you can just execute the shell script:
$ ./scripts/run.sh
Examples of how to call the gRPC service from Go can be seen in these 6 files:
- upload-from-file
- upload-from-url
- concurrent-upload-file
- concurrent-upload-url
- list-file
- delete-file
NOTE: Testing this service requires MinIO to be running in order to store the files.
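A bare-bones upload client might look like the sketch below. The generated package import path, message types, and field names are guesses based on the ghz payloads later in this README (type, filename, bucket, file{data, offset}); check the generated code under api/proto for the real names.

```go
package main

import (
	"context"
	"log"
	"time"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"

	// Hypothetical import path for the code generated from api/proto/service.proto.
	pb "github.com/example/file-storage/api/proto"
)

func main() {
	// Dial the gRPC server without TLS (matching the --insecure ghz examples).
	conn, err := grpc.Dial("localhost:8080",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		log.Fatalf("dial: %v", err)
	}
	defer conn.Close()

	ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
	defer cancel()

	// Field names are assumptions mirroring the ghz JSON payloads below.
	client := pb.NewFileStorageServiceClient(conn)
	resp, err := client.Upload(ctx, &pb.UploadRequest{
		Type:     1,
		Filename: "test.txt",
		Bucket:   "default",
		File:     &pb.File{Data: []byte("test"), Offset: 0},
	})
	if err != nil {
		log.Fatalf("upload: %v", err)
	}
	log.Printf("uploaded: %v", resp)
}
```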
If you want to test with a GUI client, you can use either BloomRPC (no longer actively maintained) or Postman. For details, please visit these links:
Basically you just need to import the api/proto/service.proto file if you want to test via BloomRPC / Postman.
NOTE: There may be issues when importing the proto file into BloomRPC or Postman, caused by import paths and the use of the gRPC-Gateway and protoc-gen-validate libraries. To work around this, the proto file needs a few modifications.
BloomRPC will have these issues when trying to import the proto file:
Error while importing protos
illegal name ';' (/path/file-storage/api/proto/service.proto, line 14)
Error while importing protos
no such type: e.Transaction
To fix them, remove the gRPC-Gateway-related annotations:
option (grpc.gateway.protoc_gen_openapiv2.options.openapiv2_swagger) = {
...
};
There are similar issues when importing into Postman. Do the same as for BloomRPC (remove the gRPC-Gateway-related annotations) and disable the protoc annotations import by changing this:
import "protoc-gen-openapiv2/options/annotations.proto";
To this:
// import "protoc-gen-openapiv2/options/annotations.proto";
Also, don't forget to set the import path, e.g. {YOUR-DIR}/file-storage/api/proto
This service also has an HTTP server built on gRPC-Gateway. If you prefer to test over plain HTTP instead of the HTTP/2-based gRPC protocol, copy the Swagger file at api/openapiv2/proto/service.swagger.json and paste it into https://editor.swagger.io/
By default, the HTTP server runs on the gRPC port + 1: if the gRPC port is 8080, the HTTP server will run on 8081.
If you have any difficulty running the service, you can simply run all dependencies with docker-compose, for example:
docker-compose -f ./development/docker-compose.yml up
Then you will have all services running: minio, createbuckets, createbackup, jaeger, and the file-storage server.
This project has a CLI to simplify interacting with the File Storage server. You need to build the binary before running any commands:
make build-cli
You will have a binary named fs-store.
NOTE: If the binary is not in your PATH, you need to run it directly, like this: ./fs-store list-files
Then, here are several commands available:
// Use default bucket, upload from file path
fs-store file-upload /path/test.txt
// Use default bucket, upload from URL
fs-store -source=url file-upload https://example.com/test.txt
// Use specific bucket
fs-store -bucket=my-bucket file-upload /path/test.txt
// Use specific filename
fs-store -filename=my-file.txt file-upload /path/test.txt
// Use validations
fs-store -content_type=image/jpeg,image/png -max_size=1000 file-upload /path/test.txt
// Use default bucket
fs-store list-files
// With specific bucket
fs-store -bucket=my-bucket list-files
// Use default bucket
fs-store file-delete test.txt
// With specific bucket
fs-store -bucket=my-bucket file-delete test.txt
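Since the project lists the standard library flag package as its CLI dependency, parsing these commands might look like the sketch below. The option names are taken from the examples above; the cliOptions struct and parseCLI function are hypothetical, not the real CLI code. Note that flag requires options to appear before positional arguments, as in the examples.

```go
package main

import (
	"flag"
	"fmt"
)

// cliOptions mirrors the flags shown in the examples above (hypothetical).
type cliOptions struct {
	bucket   string
	source   string
	filename string
	args     []string // remaining positional args: subcommand and its operands
}

// parseCLI parses argv (excluding the program name) with the standard
// library flag package.
func parseCLI(argv []string) (cliOptions, error) {
	fs := flag.NewFlagSet("fs-store", flag.ContinueOnError)
	opts := cliOptions{}
	fs.StringVar(&opts.bucket, "bucket", "default", "target bucket")
	fs.StringVar(&opts.source, "source", "file", "upload source: file or url")
	fs.StringVar(&opts.filename, "filename", "", "override stored filename")
	if err := fs.Parse(argv); err != nil {
		return opts, err
	}
	opts.args = fs.Args()
	return opts, nil
}

func main() {
	opts, err := parseCLI([]string{"-bucket=my-bucket", "file-upload", "/path/test.txt"})
	if err != nil {
		panic(err)
	}
	fmt.Println(opts.bucket, opts.args)
}
```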
To make sure the service is ready to handle heavy traffic, it is worth doing load testing to check its performance.
Since the service speaks gRPC, we need a tool that supports HTTP/2 requests. Here we use https://ghz.sh/ because it is simple and can generate reports in various output formats.
NOTE: As with importing the proto file into BloomRPC / Postman, running ghz will show errors because the tool cannot resolve the import paths and the validate library.
Here are some errors you may see when running ghz commands:
./api/proto/service.proto:5:8: open api/proto/proto/entity.proto: no such file or directory
./api/proto/service.proto:7:8: open api/proto/validate/validate.proto: no such file or directory
To fix this, comment out these imports in the proto file, changing this:
import "google/api/annotations.proto";
import "protoc-gen-openapiv2/options/annotations.proto";
To this:
// import "google/api/annotations.proto";
// import "protoc-gen-openapiv2/options/annotations.proto";
and remove gRPC Gateway related annotations:
option (grpc.gateway.protoc_gen_openapiv2.options.openapiv2_swagger) = {
...
};
Then you can run a ghz command to load-test a specific RPC, for example:
ghz --insecure --proto ./api/proto/service.proto --call FileStorageService.Upload -d '{ "type": 1, "filename": "test.txt", "bucket": "default", "file": { "data": "dGVzdA==", "offset": 0 } }' 0.0.0.0:8080 -O html -o load_testing_upload_file.html
ghz --insecure --proto ./api/proto/service.proto --call FileStorageService.List -d '{ "bucket": "default" }' 0.0.0.0:8080 -O html -o load_testing_list_files.html
ghz --insecure --proto ./api/proto/service.proto --call FileStorageService.Delete -d '{ "bucket": "default", "object": "file.txt" }' 0.0.0.0:8080 -O html -o load_testing_delete_file.html
This project follows https://github.com/golang-standards/project-layout
However, to give clearer direction when working in this project, here is a short guide to each directory:
- api: contains Protobuf files, generated protobuf, swagger, etc.
- build: Docker file for the service, migration, etc.
- cmd: main Go file for running the service, producer, consumer, etc.
- development: file to support development like docker-compose.
- docs: file about project documentations such as diagram, sequence diagram, etc.
- internal: internal code that can't be shared.
- internal/adapters/grpchandler: adapter layer that serve into gRPC service.
- internal/di: dependencies injection for connecting each layer.
- internal/usecases: business logic that connect to repository layer, RPC & HTTP client, etc.
- pkg: package code that can be shared.
- scripts: shell script, go script to help build or testing something.
- tools: packages tracked in go.mod so that development tools can be installed easily.
This project has GitHub Actions CI to do some automation such as:
- lint: check the code style.
- test: run unit tests and upload the code coverage artifact.
- generate-proto: generates protobuf files.
- generate-diagram: generates graph code visualization.
- push-file: commit and push generated proto, diagram as github-actions[bot] user.
To help give a better understanding of the code, such as the relations between packages and types, here are some diagrams generated automatically using https://github.com/ofabry/go-callvis
To help give a better understanding of each RPC flow, such as the relations between usecases and repositories, here are some automatically generated sequence diagrams, listed in a Markdown file and written in the Mermaid JS format: https://mermaid-js.github.io/mermaid/
To generate the RPC sequence diagrams, there is a Makefile command you can use:
- Run this command to generate a specific RPC:
make sequence-diagram RPC=GetData
- To generate multiple RPCs, add the other RPCs separated by commas:
make sequence-diagram RPC=GetData,GetList
- To generate all RPCs, use the wildcard * as the parameter:
make sequence-diagram RPC=*