Node.js Stream Studies

A simple project for learning and experimenting with Node.js streams through a practical file upload application. This project demonstrates how to handle large file uploads by breaking them into manageable chunks using streams.

Project Overview

This application consists of two main components:

  1. Server (index.ts): An Express server that handles file uploads in chunks, reassembles them, and saves the complete file.
  2. Client (client.ts): A command-line utility that breaks files into chunks and streams them to the server.

The project showcases several important Node.js stream concepts:

  • Reading files in chunks with fs.createReadStream
  • Writing files with fs.createWriteStream
  • Piping data between streams
  • Handling stream events
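The core pattern behind these concepts can be sketched as follows — a minimal stand-alone example, not code taken from the project itself:

```typescript
import fs from "node:fs";

// Create a 4 MB sample file, then copy it without ever buffering it
// whole in memory: createReadStream emits chunks, pipe() forwards
// them, and createWriteStream flushes each chunk to disk.
fs.writeFileSync("input.bin", Buffer.alloc(4 * 1024 * 1024, 1));

const source = fs.createReadStream("input.bin", {
  highWaterMark: 1024 * 1024, // read in 1 MB chunks
});
const target = fs.createWriteStream("output.bin");

source.pipe(target);

// Stream events: errors on the readable side, completion on the writable side.
source.on("error", (err) => console.error("read failed:", err));
target.on("finish", () =>
  console.log("copied", fs.statSync("output.bin").size, "bytes")
);
```

Because `pipe` handles backpressure automatically, the writer is never flooded with more data than it can flush.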

Installation

# Clone the repository
git clone <repository-url>
cd node-streams-file-upload

# Install dependencies
npm install

Running the Server

Start the server with:

npm run dev:file

This will launch the server on port 3000.

Using the Client

To upload a file using the client:

ts-node client.ts /path/to/your/file

The client will:

  1. Break the file into 1MB chunks
  2. Initialize an upload session with the server
  3. Upload each chunk individually
  4. Request the server to reassemble the chunks into the complete file
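The chunking half of these steps can be sketched like this (a simplified illustration; the real `client.ts` also talks to the server over HTTP, and the helper names here are assumptions, not the project's actual functions):

```typescript
import fs from "node:fs";

const CHUNK_SIZE = 1024 * 1024; // 1 MB, matching the chunk size described above

// Number of chunks needed to cover the whole file.
function chunkCount(filePath: string): number {
  return Math.ceil(fs.statSync(filePath).size / CHUNK_SIZE);
}

// A bounded read stream for chunk `index`. Note that createReadStream's
// `end` option is an inclusive byte offset, hence the `- 1`.
function chunkStream(filePath: string, index: number): fs.ReadStream {
  const start = index * CHUNK_SIZE;
  return fs.createReadStream(filePath, { start, end: start + CHUNK_SIZE - 1 });
}
```

Each `chunkStream(...)` can then be sent as an HTTP request body (e.g. with Axios) without loading the chunk into memory first.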

How It Works

  1. The client calculates how many chunks are needed based on file size
  2. The server creates a unique ID for each upload and a directory structure
  3. Each chunk is streamed to the server and saved temporarily
  4. When all chunks are received, the server combines them in the correct order
  5. The final file is saved in the upload/{id}/ directory
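Step 4, combining chunks in order, can be sketched as below. The `chunk-${i}` file naming is an assumption for illustration; the key idea is keeping one write stream open (`pipe` with `end: false`) while appending each chunk sequentially:

```typescript
import fs from "node:fs";
import path from "node:path";

// Append each chunk file to the final file in index order.
async function reassemble(dir: string, total: number, outFile: string): Promise<void> {
  const out = fs.createWriteStream(outFile);
  for (let i = 0; i < total; i++) {
    // Hypothetical chunk file name; the real server may use a different scheme.
    const chunk = fs.createReadStream(path.join(dir, `chunk-${i}`));
    await new Promise<void>((resolve, reject) => {
      chunk.pipe(out, { end: false }); // keep `out` open for the next chunk
      chunk.on("end", resolve);
      chunk.on("error", reject);
    });
  }
  out.end();
  // Wait until the last bytes are flushed to disk.
  await new Promise<void>((resolve) => out.on("finish", () => resolve()));
}
```

Because each chunk is awaited before the next one is piped, the output is byte-for-byte identical to the original file regardless of the order the chunks arrived in.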

Project Structure

  • index.ts - Server implementation with Express
  • client.ts - Client implementation for uploading files
  • db.ts - Simple in-memory database for tracking uploads
  • utils.ts - Utility functions for file handling and ID generation
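As an illustration of what a minimal in-memory upload tracker like `db.ts` might look like — the record fields and function names here are assumptions, not the project's actual schema:

```typescript
// Hypothetical upload record; the real db.ts may track different fields.
interface Upload {
  id: string;
  totalChunks: number;
  receivedChunks: number;
}

const uploads = new Map<string, Upload>();

function createUpload(id: string, totalChunks: number): Upload {
  const record: Upload = { id, totalChunks, receivedChunks: 0 };
  uploads.set(id, record);
  return record;
}

// Returns true once every expected chunk has arrived, signalling
// that the server can start reassembly.
function markChunkReceived(id: string): boolean {
  const upload = uploads.get(id);
  if (!upload) return false;
  upload.receivedChunks += 1;
  return upload.receivedChunks === upload.totalChunks;
}
```

A `Map` keyed by upload ID is enough here because the server only needs the state for the lifetime of the process.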

Technologies Used

  • TypeScript
  • Node.js
  • Express
  • Node.js Streams API
  • Axios for HTTP requests

Learning Outcomes

This project demonstrates:

  • How to efficiently handle large file uploads
  • Stream-based processing to minimize memory usage
  • Chunked file transfer with progress tracking
  • Reassembling file chunks in the correct order
