This is a test project that uses Node streams to deliver on-demand CSV files to clients.
Packages used:

- `express` as the HTTP server
- `@faker-js/faker` to generate fake data
- `csv-stringify` to convert objects into CSV
It also uses `Readable.from` to create a Readable stream from a generator, and `AbortController` to stop the download if the client cancels the request.
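A minimal sketch of how these pieces might fit together, assuming an Express route for `/download-csv` with illustrative fake fields (this is not the project's exact code; the field names, app port, and error handling are assumptions):

```js
const express = require('express');
const { Readable } = require('node:stream');
const { pipeline } = require('node:stream/promises');
const { stringify } = require('csv-stringify');
const { faker } = require('@faker-js/faker');

const app = express();

// Async generator: yields one fake row at a time and stops early when aborted.
async function* generateRows(size, signal) {
  for (let i = 0; i < size; i += 1) {
    if (signal.aborted) return;
    yield {
      id: i,
      name: faker.person.fullName(), // faker v8+ API, assumed
      email: faker.internet.email(),
    };
  }
}

app.get('/download-csv', async (req, res) => {
  const size = Number(req.query.size) || 10_000;

  // Abort the pipeline if the client cancels the request.
  const controller = new AbortController();
  req.on('close', () => controller.abort());

  res.setHeader('Content-Type', 'text/csv');
  res.setHeader('Content-Disposition', 'attachment; filename="data.csv"');

  try {
    await pipeline(
      Readable.from(generateRows(size, controller.signal)), // objects...
      stringify({ header: true }),                          // ...to CSV text...
      res,                                                  // ...to the client
      { signal: controller.signal },
    );
  } catch (err) {
    // An abort means the client cancelled the download; anything else gets logged.
    if (err.name !== 'AbortError' && err.code !== 'ERR_STREAM_PREMATURE_CLOSE') {
      console.error(err);
    }
  }
});

app.listen(3000);
```

`pipeline` also handles backpressure: rows are only generated as fast as the client reads them, which keeps memory flat even for very large files.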
This project uses Docker Compose:

```bash
docker compose up -d
```

or

```bash
npm run docker:up
```

The startup sequence is:
```mermaid
stateDiagram-v2
    direction LR
    nginx: Load Balance
    db: Postgres
    seed: Seed
    app: Application
    [*] --> db
    db --> seed
    seed --> app
    state app {
        direction LR
        app1
        app2
    }
    app --> nginx
    nginx --> [*]
```
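A hedged compose sketch that would produce this startup order (the service names follow the diagram; the images, build contexts, commands, and ports are assumptions, not the project's actual `docker-compose.yml`):

```yaml
services:
  db:
    image: postgres:16                  # assumed image/tag
  seed:
    build: .
    command: ["npm", "run", "seed"]     # assumed one-off seeding job
    depends_on:
      - db
  app1:
    build: .
    depends_on:
      seed:
        condition: service_completed_successfully
  app2:
    build: .
    depends_on:
      seed:
        condition: service_completed_successfully
  nginx:
    image: nginx:alpine                 # load balancer in front of app1/app2
    ports:
      - "80:80"                         # matches http://localhost in the examples below
    depends_on:
      - app1
      - app2
```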
To destroy the services, including volumes and images:
```bash
docker compose down --volumes --rmi all
```

or

```bash
npm run docker:down
```

The URL http://localhost/download-csv will download a 10,000-line CSV file by default. The endpoint also accepts query params:

- `size` is the number of lines
Examples
http://localhost/download-csv/?size=50000 will download a CSV with 50k lines.
The URL and query params are the same when using curl:
```bash
# this command to see the lines building up
curl http://localhost/download-csv?size=500

# this command to download the file and see the progress
curl http://localhost/download-csv?size=500 -o ./file.csv
```

Because it uses Node streams and an asynchronous delay (which simulates upstream requests), this example can handle a lot of requests at once without blocking or running out of memory.
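The delay mentioned above is an artificial pause per generated row; a minimal sketch of how such a delay might sit inside the row generator (the 5 ms value and field names are assumptions, not the project's actual code):

```js
// Hedged sketch: an artificial async delay per row, standing in for a slow
// upstream call. Rows are pulled through the pipeline one at a time, so the
// delay only throttles generation; it never buffers the whole file in memory.
const { setTimeout: sleep } = require('node:timers/promises');

async function* generateRows(size) {
  for (let i = 0; i < size; i += 1) {
    await sleep(5); // simulated per-row request latency (value is an assumption)
    yield { id: i, value: `row-${i}` };
  }
}
```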