# Beneficiary Claims Data API
To get started, install some dependencies:
- Install Go
- Install Docker
- Install Docker Compose
- Install Ansible Vault
- Ensure all dependencies installed above are on your PATH and can be executed directly from the command line.
## Sensitive Docker Configuration Files
The files committed in the `shared_files/encrypted` directory hold secret information and are encrypted with Ansible Vault.

- See a team member for the Ansible Vault password
- Create a file named `.vault_password` in the root directory of the repository
- Place the Ansible Vault password in this file
To avoid committing and pushing unencrypted secret files, use the included `ops/pre-commit` git pre-commit hook by running the following script from the repository root directory:

```
cp ops/pre-commit .git/hooks
```
The pre-commit hook will also ensure that any added, copied, or modified go files are formatted properly.
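As a sketch, a hook along these lines would cover both concerns (hypothetical: the real checks live in `ops/pre-commit` and may differ):

```shell
#!/bin/sh
# Hypothetical sketch of the checks a pre-commit hook like ops/pre-commit
# performs; the actual script may differ.

# 1. Refuse to commit files under shared_files/encrypted that lack the
#    Ansible Vault header (i.e., were left decrypted).
for f in $(git diff --cached --name-only -- shared_files/encrypted); do
  if [ -f "$f" ] && ! head -n1 "$f" | grep -q '^\$ANSIBLE_VAULT'; then
    echo "refusing to commit unencrypted file: $f" >&2
    exit 1
  fi
done

# 2. Refuse to commit added, copied, or modified Go files that are not
#    gofmt-formatted.
unformatted=$(git diff --cached --name-only --diff-filter=ACM -- '*.go' | xargs -r gofmt -l)
if [ -n "$unformatted" ]; then
  echo "gofmt required for: $unformatted" >&2
  exit 1
fi
```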
### Managing encrypted files
- Temporarily decrypt files by running the following command from the repository root directory:
- While files are decrypted, copy the files in this directory to the sibling directory
- Encrypt changed files with:

  ```
  ./ops/secrets --encrypt <filename>
  ```
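The `ops/secrets` script is presumably a wrapper around `ansible-vault`; the equivalent direct invocations, assuming the `.vault_password` file described above, would look like this (the file path is purely illustrative):

```shell
# Hypothetical direct ansible-vault equivalents of ops/secrets; the
# target file path is illustrative, not a real file in the repo.
ansible-vault decrypt --vault-password-file .vault_password shared_files/encrypted/local.env
ansible-vault encrypt --vault-password-file .vault_password shared_files/encrypted/local.env
```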
The project uses Go Modules, allowing you to clone the repo outside of the `$GOPATH`. This also means that running `go get` inside the repo will add the dependency to the project, not globally.
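For example (the package path below is purely illustrative), adding a dependency from anywhere inside the clone records it in `go.mod` rather than in a global workspace:

```shell
# Run from anywhere inside the cloned repo; go.mod and go.sum are updated,
# nothing is installed globally. The package path is illustrative.
go get github.com/example/somelib
go mod tidy   # prunes requirements that are no longer used
```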
## Build / Start
Build the images and start the containers:
- Build the images and load with fixture data
- Start the containers
## Run tests and produce test metrics
The items identified above in the Build / Start section are prerequisites to running tests.
In order to keep the test feedback loop optimized, the following items must be handled by the caller (and are not handled by the test targets):
- Ensuring the compose stack is up and running
- Ensuring the database has been seeded
- Managing images/containers (if Dockerfile changes have occurred, an image rebuild is required and won't occur as part of the test targets)
- Run golang linter and gosec:
- Run unit tests (this places results and a coverage report in `test_results/`):
- Run postman integration tests:

  ```
  make postman env=local
  ```

- Run smoke tests:
- Run full test suite (executes all of the items in 1-4 above):
- Run performance tests (primarily to be utilized by Jenkins in AWS):
### Updating seed data for unit tests
After the user has finished updating the Postgres db used for unit testing with the new data, the seed data can be updated by running the following command:
This script will update `dump.pgdata`. This file is used to initialize the Postgres db with all of the necessary data for the various unit tests. For more information on initialization, please see the initialization script; it is executed when the Postgres container is launched.

`dump.pgdata` should be committed with the other associated changes.
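The underlying dump step is presumably a `pg_dump` in Postgres custom format, which is what the container entrypoint can restore; a hand-rolled equivalent (host, port, user, and database name here are assumptions, not taken from the repo) would be:

```shell
# Assumed connection details; adjust to match the unit-test container.
# -Fc produces a custom-format archive suitable for pg_restore.
pg_dump -h localhost -p 5432 -U postgres -Fc bcda > dump.pgdata
```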
### Running unit tests locally
Spin up the Postgres unit test container:

```
$ make unit-test-db
```
Source the required environment variables from `./.vscode/settings.json` (under `go.testEnvVars`).

NOTE: Since we're connecting to Postgres externally, we need to use the local host/port instead.

For vscode users, these variables are already set by the workspace settings file (`.vscode/settings.json`).
### Auto-generating mock implementations
Testify mocks can be automatically generated using mockery. Installation and other runtime instructions can be found here. Mockery uses interfaces to generate the mocks. In the example below, the `Repository` interface in `repository.go` will be used to generate the mocks.

```
mockery --name Repository --inpackage --case snake
```
## Use the application
See: API documentation
NEVER PUT PASSWORDS, KEYS, OR SECRETS OF ANY KIND IN APPLICATION CODE! INSTEAD, USE THE STRATEGY OUTLINED HERE
In the project root `bcda-app/` directory, create a file called `.env.sh`. This file is ignored by git and will not be committed:

```
$ touch .env.sh
```
Edit `.env.sh` to include the bash shebang and any necessary environment variables like this:

```
#!/bin/bash
export BCDA_AUTH_PROVIDER=okta
export OKTA_OAUTH_SERVER_ID="<serverID>"
export OKTA_CLIENT_TOKEN="<apiKey>"
export OKTA_CLIENT_SECRET="<clientSecret>"
```
Lastly, source the file to add the variables to your local development environment:

```
$ source .env.sh
```
You're good to go! Use the environment variables in application code like this:

```go
apiKey := os.Getenv("OKTA_CLIENT_TOKEN")
```
Optionally, you can edit your `~/.bashrc` file to eliminate the need to source the file for each shell start by appending this line:

`[src-path]` is your relative path to the bcda-app repo.
Configure the `bcda` and `bcdaworker` apps by setting the following environment variables.

bcda:

```
BCDA_ERROR_LOG <file_path>
BCDA_REQUEST_LOG <file_path>
BCDA_BB_LOG <file_path>
BCDA_OKTA_LOG <file_path>
BB_CLIENT_CERT_FILE <file_path>
BB_CLIENT_KEY_FILE <file_path>
BB_SERVER_LOCATION <url>
OKTA_CLIENT_TOKEN <api_key>
OKTA_CLIENT_ORGURL <url>
OKTA_EMAIL <test_account>
FHIR_PAYLOAD_DIR <directory_path>
JWT_EXPIRATION_DELTA <integer> (time in hours that JWT access tokens are valid for)
```

bcdaworker:

```
BCDA_WORKER_ERROR_LOG <file_path>
BCDA_BB_LOG <file_path>
BB_CLIENT_CERT_FILE <file_path>
BB_CLIENT_KEY_FILE <file_path>
BB_SERVER_LOCATION <url>
FHIR_PAYLOAD_DIR <directory_path>
BB_TIMEOUT_MS <integer>
```
## Other things you can do
Use docker to look at the api database with psql:

```
docker run --rm --network bcda-app_default -it postgres psql -h bcda-app_db_1 -U postgres bcda
```
See docker-compose.yml for the password.
Use docker to run the CLI against an API instance:

```
docker exec -it bcda-app_api_1 sh -c 'bcda -h'
```
If you have no data in your database, you can load the fixture data with
Follow the installing go + vscode setup guide. Additional settings found under `.vscode/settings.json` allow tests to be run within vscode.