
Merge pull request #2515 from LiteFarmOrg/LF-3079-Add-MinIO-integration-to-development-environment-to-allow-localhost-document-export-and-download

LF-3079 Replace Digital Ocean Spaces with MinIO for the development environment
SayakaOno committed Mar 28, 2023
2 parents 6117589 + 8642d0a commit b8d4e3f
Showing 9 changed files with 142 additions and 28 deletions.
47 changes: 47 additions & 0 deletions README.md
@@ -120,6 +120,53 @@ To run [ESLint](https://eslint.org/) checks execute `pnpm lint`

Since this is a mobile web application, the webapp should be viewed in a mobile view in the browser.

## Export server

Certification document export is handled by a Node.js application that runs separately from the API and connects to a Digital Ocean Space (an S3-compatible object storage bucket). It can be tested locally by setting up a Redis database for the job queue and using MinIO, a free and open-source drop-in replacement for AWS S3 that can run on your own machine.

<details>
<summary>Full instructions for running the export server locally</summary>

1. Create and run a local Redis database on the default port (6379) with the password "test" (for one way to do steps 1 and 2 with Docker, see the sketch after this list).
2. Install and configure MinIO. You will want a Single-Node Single-Drive (Standalone) MinIO installation. The MinIO website lists instructions for [Docker](https://min.io/docs/minio/container/index.html) and [MacOS](https://min.io/docs/minio/macos/index.html) along with other operating systems. Use the default port.
3. Use the MinIO console to
   - create a new bucket and set its access policy to "public"
   - generate an access key, recording both the key and the secret
4. Download the [aws-cli](https://docs.aws.amazon.com/cli/latest/userguide/getting-started-install.html) and configure it with your MinIO access key and secret directly in the terminal using:
```
aws configure
```
Make sure the region name is either removed from your AWS configuration or matches the region set in your MinIO admin panel.
5. Connect MinIO to LiteFarm:

In `packages/api/.env`, make sure you have the following variables set:

```
MINIO_ENDPOINT=http://localhost:9000
PRIVATE_BUCKET_NAME=<MinIO bucket name here>
PUBLIC_BUCKET_NAME=<MinIO bucket name here>
```

In `packages/webapp/.env`:

```
VITE_DEV_BUCKET_NAME=<MinIO bucket name here>
VITE_DEV_ENDPOINT=localhost:9000
```

6. Add an `exports/` directory to `packages/api`.
7. Make sure both the LiteFarm API and webapp are already running, then run the export server from `packages/api` using:
```
npm run scheduler
```
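
If you prefer containers for steps 1 and 2, the following sketch shows one way to run both services with Docker. This is an illustration only; the container names and the MinIO root credentials are placeholders, and you should adapt the commands to the official Redis and MinIO instructions linked above.

```
# Redis on the default port, protected with the password "test"
docker run -d --name litefarm-redis -p 6379:6379 redis redis-server --requirepass test

# Single-node MinIO with the S3 API on port 9000 and the web console on port 9001
docker run -d --name litefarm-minio \
  -p 9000:9000 -p 9001:9001 \
  -e MINIO_ROOT_USER=minioadmin \
  -e MINIO_ROOT_PASSWORD=minioadmin \
  quay.io/minio/minio server /data --console-address ":9001"
```

With MinIO running this way, its console is available at http://localhost:9001 for creating the bucket and access key described in step 3.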

</details>

A [detailed walkthrough](https://lite-farm.atlassian.net/wiki/spaces/LITEFARM/pages/1190101039/The+export+jobs+pseudo-package#Running-the-export-server-locally) (with screenshots) is also available on the LiteFarm Confluence.

You can also use the same MinIO bucket to store documents (but not, currently, images) uploaded from the Documents view of the webapp. To configure this, set up MinIO as above and add the access key credentials to `packages/api/.env` under `DO_SPACES_ACCESS_KEY_ID` and `DO_SPACES_SECRET_ACCESS_KEY`.
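
For example (a sketch with placeholder values; substitute the key and secret generated in the MinIO console):

```
DO_SPACES_ACCESS_KEY_ID=<MinIO access key here>
DO_SPACES_SECRET_ACCESS_KEY=<MinIO secret key here>
```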

# ngrok

## Use cases for ngrok
3 changes: 3 additions & 0 deletions packages/api/.env.default
@@ -61,6 +61,9 @@ REDIS_HOST=localhost
REDIS_PORT=6379
REDIS_PASSWORD=test

# The default MinIO endpoint
MINIO_ENDPOINT=http://localhost:9000

# For document export of private SurveyStack responses
SURVEY_USER=<email>
SURVEY_TOKEN=<token>
13 changes: 12 additions & 1 deletion packages/api/src/jobs/certification/do_retrieve.js
@@ -13,10 +13,21 @@ export default (nextQueue, emailQueue) => (job, done) => {
`s3://${getPrivateS3BucketName()}/${farm_id}/document`, // location
`temp/${exportId}`, // destination
'--recursive',
'--endpoint=https://nyc3.digitaloceanspaces.com',
`--endpoint=${
process.env.NODE_ENV === 'development'
? process.env.MINIO_ENDPOINT
: 'https://nyc3.digitaloceanspaces.com'
}`,
'--exclude=*',
].concat(files.map(({ url }) => `--include=${url.split('/').pop()}`));

const awsCopyProcess = spawn('aws', args, { cwd: process.env.EXPORT_WD });

// Receive informative error messages from the child process
awsCopyProcess.stderr.on('data', (data) => {
console.error(`stderr: ${data}`);
});

awsCopyProcess.on(
'exit',
childProcessExitCheck(
12 changes: 11 additions & 1 deletion packages/api/src/jobs/certification/upload.js
@@ -13,9 +13,19 @@ export default (emailQueue) => (job, done) => {
'cp', //sub command
`temp/${exportId}.zip`, // location
`s3://${fileIdentifier}.zip`, // destination
'--endpoint=https://nyc3.digitaloceanspaces.com',
`--endpoint=${
process.env.NODE_ENV === 'development'
? process.env.MINIO_ENDPOINT
: 'https://nyc3.digitaloceanspaces.com'
}`,
];
const awsCopyProcess = spawn('aws', args, { cwd: process.env.EXPORT_WD });

// Receive informative error messages from the child process
awsCopyProcess.stderr.on('data', (data) => {
console.error(`stderr: ${data}`);
});

awsCopyProcess.on(
'exit',
childProcessExitCheck(
10 changes: 9 additions & 1 deletion packages/api/src/util/digitalOceanSpaces.js
@@ -35,11 +35,13 @@ function getImaginaryUrl(
}

const DO_ENDPOINT = 'nyc3.digitaloceanspaces.com';
const MINIO_ENDPOINT = process.env.MINIO_ENDPOINT;

const s3 = new S3({
endpoint: DO_ENDPOINT,
endpoint: process.env.NODE_ENV === 'development' ? MINIO_ENDPOINT : DO_ENDPOINT,
accessKeyId: process.env.DO_SPACES_ACCESS_KEY_ID,
secretAccessKey: process.env.DO_SPACES_SECRET_ACCESS_KEY,
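// Local MinIO is addressed with path-style URLs (e.g. http://localhost:9000/<bucket>) rather than bucket subdomains, so force path style in development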
s3ForcePathStyle: process.env.NODE_ENV === 'development' ? true : false,
});

async function imaginaryPost(
@@ -98,10 +100,16 @@ function getRandomFileName(file) {
}

function getPrivateS3Url() {
if (process.env.NODE_ENV === 'development') {
return `${MINIO_ENDPOINT}/${getPrivateS3BucketName()}`;
}
return `https://${getPrivateS3BucketName()}.${DO_ENDPOINT}`;
}

function getPublicS3Url() {
if (process.env.NODE_ENV === 'development') {
return `${MINIO_ENDPOINT}/${getPublicS3BucketName()}`;
}
return `https://${getPublicS3BucketName()}.${DO_ENDPOINT}`;
}

8 changes: 7 additions & 1 deletion packages/webapp/.env.default
@@ -10,6 +10,12 @@ VITE_GOOGLE_OAUTH_CLIENT_ID=?
VITE_WEATHER_API_KEY=?

VITE_ENV=development
VITE_DO_BUCKET_NAME=litefarm
NODE_ENV=development
VITE_API_URL=http://localhost:5001

# Do not change this
VITE_DO_BUCKET_NAME=litefarm

# Match to your MinIO setup
VITE_DEV_BUCKET_NAME=minio-test
VITE_DEV_ENDPOINT=http://localhost:9000
29 changes: 22 additions & 7 deletions packages/webapp/src/containers/ExportDownload/index.jsx
@@ -24,12 +24,27 @@ export default function DownloadExport({ match }) {
const dispatch = useDispatch();

useEffect(() => {
dispatch(downloadExport({
file: `https://${import.meta.env.VITE_DO_BUCKET_NAME}.nyc3.digitaloceanspaces.com/${fileSrc}.zip`,
from,
to,
}));
}, [])
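// In development, download directly from the local MinIO endpoint; otherwise use the Digital Ocean Spaces bucket URL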
const file =
import.meta.env.VITE_ENV === 'development'
? `http://${import.meta.env.VITE_DEV_ENDPOINT}/${
import.meta.env.VITE_DEV_BUCKET_NAME
}/${fileSrc}.zip`
: `https://${
import.meta.env.VITE_DO_BUCKET_NAME
}.nyc3.digitaloceanspaces.com/${fileSrc}.zip`;

return <p style={{ marginLeft: '8px', marginTop: '24px' }}>{i18n.t('CERTIFICATIONS.EXPORT_DOWNLOADING_MESSAGE')}</p>;
dispatch(
downloadExport({
file,
from,
to,
}),
);
}, []);

return (
<p style={{ marginLeft: '8px', marginTop: '24px' }}>
{i18n.t('CERTIFICATIONS.EXPORT_DOWNLOADING_MESSAGE')}
</p>
);
}
34 changes: 22 additions & 12 deletions packages/webapp/src/containers/ExportDownload/saga.js
@@ -21,20 +21,30 @@ import i18n from '../../locales/i18n';
export const downloadExport = createAction('downloadExportSaga');

export function* downloadExportSaga({ payload }) {
const {
farm_name
} = yield select(userFarmSelector);
const { farm_name } = yield select(userFarmSelector);
try {
const fileName = farm_name ? `${farm_name} ${i18n.t('CERTIFICATIONS.EXPORT_FILE_TITLE')} ${payload.from} - ${payload.to}.zip` : undefined;
const config = {
headers: {
Authorization: 'Bearer ' + localStorage.getItem('farm_token'),
},
responseType: 'arraybuffer',
method: 'GET',
};
const fileName = farm_name
? `${farm_name} ${i18n.t('CERTIFICATIONS.EXPORT_FILE_TITLE')} ${payload.from} - ${
payload.to
}.zip`
: undefined;

const url = new URL(payload.file);
url.hostname = 'images.litefarm.workers.dev';

let config = {};
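// Outside of development, the request is routed through the images.litefarm.workers.dev proxy with the farm token attached; in development the export zip is fetched directly from MinIO with default fetch options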

if (import.meta.env.VITE_ENV !== 'development') {
config = {
headers: {
Authorization: 'Bearer ' + localStorage.getItem('farm_token'),
},
responseType: 'arraybuffer',
method: 'GET',
};

url.hostname = 'images.litefarm.workers.dev';
}

const res = yield fetch(url, config);
if (res.status !== 403) {
const blob = yield res.blob();
14 changes: 9 additions & 5 deletions packages/webapp/src/containers/MediaWithAuthentication/index.jsx
@@ -21,11 +21,15 @@ export function MediaWithAuthentication({
const fetchMediaUrl = async () => {
try {
subscribed = true;
const url = new URL(fileUrl);
url.hostname = 'images.litefarm.workers.dev';
const response = await fetch(url.toString(), config);
const blobFile = await response.blob();
subscribed && setMediaUrl(URL.createObjectURL(blobFile));
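// In development, MinIO objects are loaded directly by URL; otherwise media is fetched through the images.litefarm.workers.dev proxy and served as an object URL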
if (import.meta.env.VITE_ENV === 'development') {
subscribed && setMediaUrl(fileUrl);
} else {
const url = new URL(fileUrl);
url.hostname = 'images.litefarm.workers.dev';
const response = await fetch(url.toString(), config);
const blobFile = await response.blob();
subscribed && setMediaUrl(URL.createObjectURL(blobFile));
}
} catch (e) {
console.log(e);
}
