A simple project for learning and experimenting with Node.js streams through a practical file upload application. This project demonstrates how to handle large file uploads by breaking them into manageable chunks using streams.
This application consists of two main components:
- Server (`index.ts`): an Express server that receives file uploads in chunks, reassembles them, and saves the complete file.
- Client (`client.ts`): a command-line utility that breaks files into chunks and streams them to the server.
The project showcases several important Node.js stream concepts:
- Reading files in chunks with `fs.createReadStream`
- Writing files with `fs.createWriteStream`
- Piping data between streams
- Handling stream events
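The concepts above can be sketched in a minimal, self-contained example (file names here are placeholders, not part of the project): a readable stream is piped into a writable stream, and both emit events you can observe.

```typescript
import * as fs from "fs";
import * as path from "path";
import * as os from "os";

// Create a sample source file, then copy it with streams.
const src = path.join(os.tmpdir(), "stream-demo-src.txt");
const dest = path.join(os.tmpdir(), "stream-demo-dest.txt");
fs.writeFileSync(src, "hello streams\n".repeat(1000));

// Read in 64 KB chunks instead of loading the whole file into memory.
const readStream = fs.createReadStream(src, { highWaterMark: 64 * 1024 });
const writeStream = fs.createWriteStream(dest);

// Stream events: "data" fires once per chunk, "finish" when writing is done.
readStream.on("data", (chunk: Buffer | string) => {
  console.log(`read ${chunk.length} bytes`);
});
writeStream.on("finish", () => {
  console.log("copy complete");
});

// pipe() moves data from the readable to the writable and handles backpressure.
readStream.pipe(writeStream);
```

`pipe()` is the simplest way to connect two streams; for production code with error propagation, `stream.pipeline` is usually preferred.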
```bash
# Clone the repository
git clone <repository-url>
cd stream-studies

# Install dependencies
npm install
```

Start the server with:

```bash
npm run dev:file
```

This will launch the server on port 3000.
To upload a file using the client:
```bash
ts-node client.ts /path/to/your/file
```

The client will:
- Break the file into 1MB chunks
- Initialize an upload session with the server
- Upload each chunk individually
- Request the server to reassemble the chunks into the complete file
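The first two client steps can be sketched as follows. This is a hedged illustration, not the project's actual code: the helper names `chunkCount` and `chunkStream` are hypothetical, and only the 1 MB chunk size comes from the description above.

```typescript
import * as fs from "fs";

const CHUNK_SIZE = 1024 * 1024; // 1 MB, matching the client's chunk size

// Hypothetical helper: how many fixed-size chunks does this file need?
function chunkCount(filePath: string, chunkSize = CHUNK_SIZE): number {
  const { size } = fs.statSync(filePath);
  return Math.ceil(size / chunkSize);
}

// Hypothetical helper: stream one chunk by byte range. createReadStream
// accepts inclusive start/end offsets, so each chunk can be uploaded
// without ever loading the whole file into memory.
function chunkStream(filePath: string, index: number, chunkSize = CHUNK_SIZE) {
  const start = index * chunkSize;
  return fs.createReadStream(filePath, { start, end: start + chunkSize - 1 });
}
```

Each returned stream can then be sent as the request body of a chunk-upload request (e.g. with Axios).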
- The client calculates how many chunks are needed based on file size
- The server creates a unique ID for each upload and a directory structure
- Each chunk is streamed to the server and saved temporarily
- When all chunks are received, the server combines them in the correct order
- The final file is saved in the `upload/{id}/` directory
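The reassembly step might look like the sketch below. The function name, the `chunks/` subdirectory, and the numeric chunk file names are assumptions for illustration; the key idea from the steps above is concatenating chunks in the correct order via streams.

```typescript
import * as fs from "fs";
import * as path from "path";

// Hypothetical sketch: chunk files named 0, 1, 2, ... inside
// upload/{id}/chunks are concatenated in numeric order into the final file.
async function reassemble(uploadDir: string, fileName: string): Promise<string> {
  const chunksDir = path.join(uploadDir, "chunks");
  const chunkNames = fs
    .readdirSync(chunksDir)
    .sort((a, b) => Number(a) - Number(b)); // numeric, not lexicographic, order

  const finalPath = path.join(uploadDir, fileName);
  const out = fs.createWriteStream(finalPath);

  for (const name of chunkNames) {
    // Stream each chunk into the output, waiting for it to finish
    // before starting the next so ordering is preserved.
    await new Promise<void>((resolve, reject) => {
      const src = fs.createReadStream(path.join(chunksDir, name));
      src.on("error", reject);
      src.on("end", resolve);
      src.pipe(out, { end: false }); // keep the output open for the next chunk
    });
  }
  out.end();
  return finalPath;
}
```

Sorting numerically matters: a lexicographic sort would place `"10"` before `"2"` and corrupt the reassembled file.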
- `index.ts` - Server implementation with Express
- `client.ts` - Client implementation for uploading files
- `db.ts` - Simple in-memory database for tracking uploads
- `utils.ts` - Utility functions for file handling and ID generation
- TypeScript
- Node.js
- Express
- Node.js Streams API
- Axios for HTTP requests
This project demonstrates:
- How to efficiently handle large file uploads
- Stream-based processing to minimize memory usage
- Chunked file transfer with progress tracking
- Reassembling file chunks in the correct order