A secure, feature-rich file transfer server with encryption support, priority queuing, and comprehensive status tracking.
- 🔐 AES-128-EAX Encryption - Optional end-to-end file encryption
- 📊 Live Progress Tracking - Real-time upload/download progress with speed and ETA
- 🧩 Chunked Upload/Download - Break large files into chunks with resume capability
- 🔄 Automatic Retry - Exponential backoff retry logic for failed transfers
- 👥 Multi-Client Tracking - Track which client uploaded/downloaded each file
- 🌐 Network Health Monitoring - Detect unstable connections and adapt transfer strategy
- ⚡ Speed Tracking - Display transfer speed in MB/s with network quality assessment
- 💾 Queue Persistence - Transfer queue survives server restarts
- 🎯 Priority Queue - Transfer priority management
- 🔌 WebSocket Real-Time Notifications - Live updates via WebSocket connections
- 📦 Batch File Operations - Upload/download multiple files in one request
- ❌ Transfer Cancellation - Cancel ongoing transfers mid-upload
- 🖼️ File Metadata & Thumbnails - Rich file information with image thumbnails
- 📈 Transfer History - Filterable transfer history with date and client filters
- 🔑 API Key Authentication - Secure access with API key authentication
- 📊 Transfer Statistics - Comprehensive analytics and performance metrics
- 🔒 Thread-safe - Concurrent access with file locking
- ✅ Checksums - SHA-256 file integrity verification
- 🌐 CORS Support - Ready for frontend integration
- 📝 Comprehensive Logging - Detailed error and event tracking
┌─────────────────────────────────────────────────────────────────┐
│ CLIENT LAYER │
│ ┌──────────────┐ ┌──────────────┐ ┌──────────────┐ │
│ │ React │ │ CLI Client │ │ Mobile App │ │
│ │ Frontend │ │ (client.py) │ │ (Future) │ │
│ └──────┬───────┘ └──────┬───────┘ └──────┬───────┘ │
└─────────┼──────────────────┼──────────────────┼────────────────┘
│ │ │
└──────────────────┼──────────────────┘
│
┌─────────────────────────────────────────────────────────────────┐
│ CONNECTION LAYER │
│ │
│ HTTP/REST APIs WebSocket (Real-time) │
│ └─ /upload └─ status_update │
│ └─ /download └─ transfer_update │
│ └─ /status └─ stats_update │
│ └─ /cancel │
└─────────────────────────────┬───────────────────────────────────┘
│
┌─────────────────────────────────────────────────────────────────┐
│ FLASK BACKEND SERVER │
│ ┌─────────────────────────────────────────────────────────┐ │
│ │ API Endpoints (server.py) │ │
│ │ /upload /download /status /cancel /batch /stats │ │
│ └────────────────────────┬────────────────────────────────┘ │
│ │ │
│ ┌────────────────────────┴────────────────────────────────┐ │
│ │ Core Services │ │
│ │ ┌──────────┐ ┌──────────┐ ┌──────────┐ ┌─────────┐│ │
│ │ │ Encrypt │ │ Hash │ │ Progress │ │ Status ││ │
│ │ │ Util │ │ Util │ │ Tracker │ │ Handler ││ │
│ │ └──────────┘ └──────────┘ └──────────┘ └─────────┘│ │
│ └─────────────────────────────────────────────────────────┘ │
└─────────────────────────────┬───────────────────────────────────┘
│
┌─────────────────────────────────────────────────────────────────┐
│ STORAGE LAYER │
│ ┌──────────────┐ ┌──────────────────┐ ┌─────────────────┐ │
│ │ Local Disk │ │ transfer_status │ │ Priority Queue │ │
│ │ (storage/) │ │ .json │ │ Management │ │
│ │ - Uploads │ │ - Metadata │ │ - Persistence │ │
│ │ - Encrypted │ │ - History │ │ - Retry Logic │ │
│ │ - Thumbnails │ │ - Stats │ │ │ │
│ └──────────────┘ └──────────────────┘ └─────────────────┘ │
└─────────────────────────────┬───────────────────────────────────┘
│
┌─────────────────────────────────────────────────────────────────┐
│ OPTIONAL CLOUD STORAGE (Future) │
│ ┌──────────┐ ┌──────────┐ ┌──────────┐ ┌──────────┐ │
│ │ AWS S3 │ │ Azure │ │ Google │ │ Dropbox │ │
│ │ │ │ Blob │ │ Cloud │ │ API │ │
│ └──────────┘ └──────────┘ └──────────┘ └──────────┘ │
└─────────────────────────────────────────────────────────────────┘
The system processes file transfers through an 8-step pipeline:
- Upload Request - Client initiates file upload via HTTP POST with optional encryption flag
- Chunking - Large files (>5MB) are automatically split into manageable chunks for reliability
- Integrity Check - Each chunk is hashed using SHA-256 to verify data integrity
- Encryption (optional) - Files are encrypted using AES-128-EAX with authenticated encryption
- Storage - Chunks are written to disk with atomic operations and file locking
- WebSocket Broadcast - Real-time progress updates pushed to all subscribed clients
- Queue Management - Transfer added to priority queue with persistent state tracking
- Download - Files are streamed back to clients with optional decryption
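The chunking and integrity steps above can be sketched in a few lines of Python. This is an illustrative helper, not the server's actual code; only the 5MB threshold comes from the pipeline description:

```python
import hashlib

CHUNK_SIZE = 5 * 1024 * 1024  # files larger than 5 MB are split into chunks

def split_and_hash(path, chunk_size=CHUNK_SIZE):
    """Yield (chunk_number, data, sha256_hex) for each chunk of a file."""
    with open(path, "rb") as f:
        number = 0
        while True:
            data = f.read(chunk_size)
            if not data:
                break
            # Hash each chunk independently so a single corrupted chunk
            # can be rejected and re-sent without restarting the transfer
            yield number, data, hashlib.sha256(data).hexdigest()
            number += 1
```

The per-chunk hash travels with the chunk, so the server can verify each piece individually instead of re-transferring the whole file.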
The backend implements multiple resilience features:
- Resume on Failure - Chunked uploads can resume from the last successful chunk via the `/resume_info` endpoint
- Automatic Retry - Failed transfers retry automatically with exponential backoff (delays of 1s, 2s, 4s, 8s, 16s)
- Network Monitoring - The `/ping` endpoint monitors latency and adapts chunk size to network quality
- Transfer Cancellation - In-flight transfers can be cancelled gracefully via the `/cancel` endpoint
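The retry delays above are plain exponential backoff: the delay doubles after each failed attempt. A minimal sketch of that logic, where `operation` is any callable that raises on failure (a simplified illustration, not the backend's actual retry code):

```python
import time

def retry_with_backoff(operation, max_retries=5, base_delay=1.0):
    """Call operation(), retrying on exception with delays 1s, 2s, 4s, 8s, 16s."""
    for attempt in range(max_retries + 1):
        try:
            return operation()
        except Exception:
            if attempt == max_retries:
                raise  # out of retries; surface the last error
            time.sleep(base_delay * (2 ** attempt))
```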
backend/
├── server.py # Flask HTTP API server with WebSocket support
├── client.py # CLI client for testing
├── config.py # Configuration management
├── requirements.txt # Python dependencies
├── transfer_status.json # Transfer status tracking
├── test_integration.py # Integration test suite
└── utils/
├── hash_util.py # SHA-256 checksums
├── encrypt_util.py # AES-128-EAX encryption
├── status_handler.py # Thread-safe status management
├── progress_tracker.py # Live progress tracking utilities
├── metadata_util.py # File metadata and thumbnail generation
└── auto_restart.py # Auto-restart monitor
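The thread safety in `utils/status_handler.py` boils down to serializing read-modify-write cycles on `transfer_status.json`. A simplified sketch of that pattern (not the module's actual code; field names follow the status examples later in this README):

```python
import json
import threading
from datetime import datetime

_lock = threading.Lock()  # serializes all access to the status file

def update_status(filename, status, status_file="transfer_status.json", **fields):
    """Read-modify-write the status JSON atomically with respect to other threads."""
    with _lock:
        try:
            with open(status_file) as f:
                data = json.load(f)
        except (FileNotFoundError, json.JSONDecodeError):
            # Start from an empty structure if the file is missing or corrupt
            data = {"transfers": {}, "queue": [], "metadata": {}}
        entry = data["transfers"].setdefault(filename, {})
        entry.update({"status": status,
                      "updated_at": datetime.now().isoformat(), **fields})
        with open(status_file, "w") as f:
            json.dump(data, f, indent=2)
```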
- Python 3.8+
- Windows or Linux/Ubuntu
1. Navigate to the backend directory:

cd backend

2. Install dependencies:

pip install -r requirements.txt

3. Configure (optional): edit `config.py` or use environment variables:

# Windows (PowerShell)
$env:SERVER_PORT = "8080"
$env:ENCRYPTION_KEY = "your-16-byte-key"

# Linux/Mac
export SERVER_PORT=8080
export ENCRYPTION_KEY="your-16-byte-key"
Basic start:

python server.py

With auto-restart monitor:

python utils/auto_restart.py

The server starts on http://localhost:8080 by default.
Upload a file:

python client.py upload document.pdf

Upload with encryption:

python client.py upload secret.txt --encrypt --priority 5

Upload with retry logic:

python client.py upload large_file.zip --retry 5

Chunked upload with resume:

python client.py upload huge_file.zip --chunked --chunk-size 2097152

Check all transfers:

python client.py status

Check a specific file:

python client.py status --file document.pdf

Download a file:

python client.py download document.pdf --output ./downloads/

Health check:

python client.py health

Full CLI help:

python client.py --help
python client.py upload --help

Upload a file with optional encryption.
Request (multipart/form-data):
- `file`: File to upload (required)
- `filename`: Custom filename (optional)
- `encryption`: Enable encryption - `true`/`false` (default: `false`)
- `priority`: Transfer priority 0-10 (default: 0)
Response (200 OK):
{
"success": true,
"filename": "document.pdf",
"hash": "abc123...",
"status": "completed",
"encryption": false,
"size": 1024,
"priority": 0
}

Errors:

- `400`: Missing file or invalid parameters
- `413`: File too large (>100MB)
- `500`: Server error
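Outside the CLI, the same upload can be issued from Python with the `requests` library. This is a sketch, not part of the repository; the form fields match the request table above, and the server URL is assumed to be the default:

```python
import requests

def upload_file(path, server="http://localhost:8080", encrypt=False, priority=0):
    """POST a file to /upload as multipart/form-data and return the JSON response."""
    with open(path, "rb") as f:
        resp = requests.post(
            f"{server}/upload",
            files={"file": f},  # multipart file field
            data={"encryption": str(encrypt).lower(), "priority": priority},
            timeout=30,
        )
    resp.raise_for_status()  # surface 400/413/500 as exceptions
    return resp.json()
```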
Get all transfer status entries.
Response (200 OK):
{
"transfers": [
{
"filename": "document.pdf",
"status": "completed",
"checksum": "abc123...",
"encryption": false,
"priority": 0,
"created_at": "2024-10-24T07:30:00",
"updated_at": "2024-10-24T07:30:05"
}
],
"queue": [
{"filename": "document.pdf", "priority": 5}
],
"metadata": {
"total_transfers": 10,
"last_updated": "2024-10-24T07:35:00"
}
}

Get status for a specific file.
Response (200 OK):
{
"filename": "document.pdf",
"status": "uploading",
"checksum": "abc123...",
"encryption": false,
"priority": 0,
"progress": 45,
"speed": 5242880,
"eta": 120,
"transferred_bytes": 1048576,
"total_bytes": 10485760,
"client_ip": "192.168.1.100",
"retry_count": 0,
"created_at": "2024-10-24T07:30:00",
"updated_at": "2024-10-24T07:30:05"
}

Errors:

- `404`: File not found
Download a file with optional decryption.
Query Parameters:
- `decrypt`: Decrypt encrypted files - `true`/`false` (default: `true`)
Response (200 OK):
File stream with Content-Disposition: attachment
Errors:
- `404`: File not found
- `403`: Permission denied
- `500`: Decryption failed
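A download can be streamed to disk from Python rather than buffered in memory. A sketch (the `/download/<filename>` path style and the use of `requests` are assumptions; adjust to the server's actual route):

```python
import requests

def download_file(filename, dest, server="http://localhost:8080", decrypt=True):
    """Stream a file from the server to dest, optionally decrypting server-side."""
    with requests.get(
        f"{server}/download/{filename}",  # assumed route shape
        params={"decrypt": str(decrypt).lower()},
        stream=True,  # avoid loading large files into memory
        timeout=30,
    ) as resp:
        resp.raise_for_status()
        with open(dest, "wb") as out:
            for block in resp.iter_content(chunk_size=64 * 1024):
                out.write(block)
    return dest
```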
Health check endpoint.
Response (200 OK):
{
"status": "ok",
"timestamp": "2024-10-24T07:30:00Z"
}

Upload a single chunk of a file with progress tracking.
Request (multipart/form-data):
- `chunk`: File chunk (binary, required)
- `filename`: Target filename (required)
- `chunk_number`: Current chunk number, 0-based (required)
- `total_chunks`: Total number of chunks (required)
- `chunk_hash`: SHA-256 hash of this chunk (required)
- `client_id`: Optional custom client identifier
Response (200 OK):
{
"success": true,
"chunks_received": 5,
"status": "completed"
}

Errors:

- `400`: Missing required fields or chunk integrity check failed
- `500`: Chunk upload failed
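A chunked upload client pairs this endpoint with `/resume_info`: it asks which chunks the server already holds, then sends only the missing ones. A sketch under assumptions (in particular, passing the filename to `/resume_info` as a query parameter is a guess about the route):

```python
import hashlib
import requests

def upload_chunked(path, filename, server="http://localhost:8080",
                   chunk_size=1024 * 1024):
    """Upload a file chunk by chunk, skipping chunks the server already has."""
    with open(path, "rb") as f:
        data = f.read()
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)] or [b""]
    # Ask which chunks were already received (resume support); the query
    # parameter style here is an assumption about the route.
    info = requests.get(f"{server}/resume_info",
                        params={"filename": filename}, timeout=10).json()
    received = set(info.get("received_chunks", []))
    for number, chunk in enumerate(chunks):
        if number in received:
            continue  # uploaded in a previous attempt
        resp = requests.post(
            f"{server}/upload_chunk",
            files={"chunk": ("blob", chunk)},
            data={
                "filename": filename,
                "chunk_number": number,
                "total_chunks": len(chunks),
                "chunk_hash": hashlib.sha256(chunk).hexdigest(),  # per-chunk integrity
            },
            timeout=30,
        )
        resp.raise_for_status()
```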
Get which chunks have been received for resuming upload.
Response (200 OK):
{
"received_chunks": [0, 1, 2, 4],
"can_resume": true
}

Get a list of all clients that have uploaded or downloaded files.
Response (200 OK):
{
"clients": [
{
"ip": "192.168.1.100",
"files": ["file1.txt", "file2.pdf"],
"total_uploads": 2,
"total_downloads": 0,
"last_activity": "2024-10-24T07:30:00",
"client_agent": "Mozilla/5.0...",
"client_id": "client_123"
}
]
}

Clients ping this endpoint to measure latency and network quality.
Request (JSON):
{
"timestamp": 1698123456.789
}

Response (200 OK):
{
"server_timestamp": 1698123456.790,
"latency_ms": 1.2,
"network_quality": "excellent",
"recommended_chunk_size": 1048576
}

Upload multiple files in one request.
Headers:
X-API-Key: API key for authentication (required)
Request (multipart/form-data):
- `files`: Multiple files to upload (required)
- `encryption`: Enable encryption - `true`/`false` (default: `false`)
- `priority`: Transfer priority 0-10 (default: 0)
- `client_id`: Custom client identifier (optional)
Response (200 OK):
{
"success": true,
"total_files": 3,
"successful": 3,
"failed": 0,
"results": [
{
"filename": "file1.txt",
"status": "success",
"hash": "abc123...",
"size": 1024,
"metadata": {...}
}
]
}

Download multiple files as a ZIP archive.
Headers:
X-API-Key: API key for authentication (required)
Request (JSON):
{
"filenames": ["file1.txt", "file2.pdf", "file3.jpg"]
}

Response (200 OK):
ZIP file stream with Content-Type: application/zip
Cancel an ongoing transfer.
Headers:
X-API-Key: API key for authentication (required)
Response (200 OK):
{
"success": true,
"message": "file.txt cancelled"
}

Get file metadata, including a thumbnail.
Response (200 OK):
{
"filename": "image.jpg",
"size": 1048576,
"created": "2024-10-24T07:30:00",
"modified": "2024-10-24T07:30:05",
"mime_type": "image/jpeg",
"extension": ".jpg",
"thumbnail": "image.jpg.thumb.jpg"
}

Serve a thumbnail image.
Response (200 OK):
JPEG image stream with Content-Type: image/jpeg
Get transfer history with optional filters.
Query Parameters:
- `status`: Filter by status (completed, failed, uploading, etc.)
- `client`: Filter by client IP
- `from`: Start date (YYYY-MM-DD)
- `to`: End date (YYYY-MM-DD)
- `limit`: Maximum results (default: 100)
- `offset`: Skip results (default: 0)
Response (200 OK):
{
"transfers": {
"file1.txt": {
"status": "completed",
"client_ip": "192.168.1.100",
"created_at": "2024-10-24T07:30:00"
}
},
"total_count": 50,
"returned_count": 10,
"offset": 0,
"limit": 100
}

Get comprehensive transfer statistics.
Response (200 OK):
{
"total_transfers": 100,
"total_bytes": 104857600,
"total_size_mb": 100.0,
"completed_count": 95,
"failed_count": 5,
"active_count": 0,
"success_rate": 95.0,
"average_speed_mbps": 10.5,
"total_clients": 15,
"file_types": {
".txt": 30,
".pdf": 20,
".jpg": 25
},
"encrypted_count": 40,
"queue_length": 0
}

Generate a new API key (admin only).
Response (200 OK):
{
"api_key": "new_generated_key_here"
}

const socket = io('http://localhost:8080');

socket.on('connect', () => {
  console.log('Connected to file transfer server');
});

// Subscribe to a specific file
socket.emit('subscribe_status', { filename: 'document.pdf' });

// Subscribe to all transfers
socket.emit('subscribe_all');

// Subscribe to statistics
socket.emit('subscribe_stats');

// File status updates
socket.on('status_update', (data) => {
  console.log('File update:', data.progress, '%');
});

// Transfer updates
socket.on('transfer_update', (data) => {
  console.log('Transfer update:', data.filename, data.progress);
});

// Statistics updates
socket.on('stats_update', (stats) => {
  console.log('Stats update:', stats.total_transfers);
});

1. Install dependencies:
npm install socket.io-client axios

2. Complete React component example:
import React, { useState, useEffect } from 'react';
import axios from 'axios';
import io from 'socket.io-client';
const FileUploader = () => {
const [file, setFile] = useState(null);
const [progress, setProgress] = useState(0);
const [speed, setSpeed] = useState(0);
const [eta, setEta] = useState(0);
const [encryption, setEncryption] = useState(false);
const [priority, setPriority] = useState(5);
const [status, setStatus] = useState('');
const [socket, setSocket] = useState(null);
const [isDragging, setIsDragging] = useState(false);
const API_URL = process.env.REACT_APP_API_URL || 'http://localhost:8080';
const API_KEY = process.env.REACT_APP_API_KEY;
useEffect(() => {
// Initialize WebSocket connection
const newSocket = io(API_URL);
setSocket(newSocket);
newSocket.on('connect', () => {
console.log('Connected to file transfer server');
newSocket.emit('subscribe_all');
});
newSocket.on('transfer_update', (data) => {
if (file && data.filename === file.name) {
setProgress(data.progress || 0);
setSpeed(data.speed || 0);
setEta(data.eta || 0);
setStatus(data.status || '');
}
});
return () => newSocket.close();
}, [API_URL, file]);
const handleDragOver = (e) => {
e.preventDefault();
setIsDragging(true);
};
const handleDragLeave = () => {
setIsDragging(false);
};
const handleDrop = (e) => {
e.preventDefault();
setIsDragging(false);
const droppedFile = e.dataTransfer.files[0];
if (droppedFile) setFile(droppedFile);
};
const handleFileChange = (e) => {
setFile(e.target.files[0]);
};
const handleUpload = async () => {
if (!file) {
alert('Please select a file first');
return;
}
const formData = new FormData();
formData.append('file', file);
formData.append('encryption', encryption);
formData.append('priority', priority);
try {
setStatus('uploading');
const response = await axios.post(`${API_URL}/upload`, formData, {
headers: {
'Content-Type': 'multipart/form-data',
'X-API-Key': API_KEY
},
onUploadProgress: (progressEvent) => {
const percentCompleted = Math.round(
(progressEvent.loaded * 100) / progressEvent.total
);
setProgress(percentCompleted);
}
});
setStatus('completed');
console.log('Upload successful:', response.data);
alert(`File uploaded successfully: ${response.data.filename}`);
} catch (error) {
setStatus('failed');
console.error('Upload failed:', error);
alert(`Upload failed: ${error.response?.data?.message || error.message}`);
}
};
const formatSpeed = (bytesPerSecond) => {
return (bytesPerSecond / (1024 * 1024)).toFixed(2) + ' MB/s';
};
const formatTime = (seconds) => {
if (seconds < 60) return `${seconds}s`;
const minutes = Math.floor(seconds / 60);
const secs = seconds % 60;
return `${minutes}m ${secs}s`;
};
return (
<div style={{ padding: '20px', maxWidth: '600px', margin: '0 auto' }}>
<h2>Smart File Transfer</h2>
{/* Drag-and-drop zone */}
<div
onDragOver={handleDragOver}
onDragLeave={handleDragLeave}
onDrop={handleDrop}
style={{
border: `2px dashed ${isDragging ? '#007bff' : '#ccc'}`,
borderRadius: '8px',
padding: '40px',
textAlign: 'center',
backgroundColor: isDragging ? '#f0f8ff' : '#f9f9f9',
cursor: 'pointer',
marginBottom: '20px'
}}
>
<input
type="file"
onChange={handleFileChange}
style={{ display: 'none' }}
id="file-input"
/>
<label htmlFor="file-input" style={{ cursor: 'pointer' }}>
{file ? (
<p>📄 {file.name} ({(file.size / (1024 * 1024)).toFixed(2)} MB)</p>
) : (
<p>Drag and drop a file here, or click to select</p>
)}
</label>
</div>
{/* Options */}
<div style={{ marginBottom: '20px' }}>
<label style={{ display: 'block', marginBottom: '10px' }}>
<input
type="checkbox"
checked={encryption}
onChange={(e) => setEncryption(e.target.checked)}
/>
🔐 Enable Encryption (AES-128)
</label>
<label style={{ display: 'block', marginBottom: '10px' }}>
⚡ Priority (0-10):
<input
type="range"
min="0"
max="10"
value={priority}
onChange={(e) => setPriority(parseInt(e.target.value))}
style={{ marginLeft: '10px', width: '200px' }}
/>
<span style={{ marginLeft: '10px' }}>{priority}</span>
</label>
</div>
{/* Upload button */}
<button
onClick={handleUpload}
disabled={!file || status === 'uploading'}
style={{
padding: '10px 20px',
backgroundColor: '#007bff',
color: 'white',
border: 'none',
borderRadius: '5px',
cursor: file && status !== 'uploading' ? 'pointer' : 'not-allowed',
fontSize: '16px',
width: '100%'
}}
>
{status === 'uploading' ? '⏳ Uploading...' : '📤 Upload File'}
</button>
{/* Progress bar */}
{status === 'uploading' && (
<div style={{ marginTop: '20px' }}>
<div
style={{
width: '100%',
backgroundColor: '#e0e0e0',
borderRadius: '10px',
overflow: 'hidden',
height: '30px'
}}
>
<div
style={{
width: `${progress}%`,
backgroundColor: '#28a745',
height: '100%',
display: 'flex',
alignItems: 'center',
justifyContent: 'center',
color: 'white',
fontWeight: 'bold',
transition: 'width 0.3s ease'
}}
>
{progress}%
</div>
</div>
{speed > 0 && (
<p style={{ marginTop: '10px', textAlign: 'center' }}>
🚀 Speed: {formatSpeed(speed)} | ⏱️ ETA: {formatTime(eta)}
</p>
)}
</div>
)}
{/* Status message */}
{status === 'completed' && (
<div style={{ marginTop: '20px', color: '#28a745', textAlign: 'center' }}>
✅ Upload completed successfully!
</div>
)}
{status === 'failed' && (
<div style={{ marginTop: '20px', color: '#dc3545', textAlign: 'center' }}>
❌ Upload failed. Please try again.
</div>
)}
</div>
);
};
export default FileUploader;

3. Start the backend:
cd backend
python server.py

4. Access the API:

- Base URL: http://localhost:8080
- API Key: set in environment variables or `config.py`
Create a .env file in your frontend project root:
# Backend API Configuration
REACT_APP_API_URL=http://localhost:8080
REACT_APP_API_KEY=your_api_key_here
# For production deployment
# REACT_APP_API_URL=https://your-backend.herokuapp.com

A complete React frontend with advanced features is under development:
Planned Features:
- 🎨 Drag-and-drop file upload - Intuitive file selection with visual feedback
- 📊 Real-time progress tracking - Live progress bars with speed and ETA
- 📋 Transfer history table - Sortable and filterable transfer records
- 📈 Analytics dashboard - Visual charts for transfer statistics
- 🌓 Dark/light theme toggle - User preference persistence
- 🔐 Encryption controls - Easy toggle for secure transfers
- ⚡ Priority management - Visual priority selector with queue preview
- 🔄 Resume functionality - Automatic retry and resume for failed transfers
- 📱 Responsive design - Mobile-friendly interface
- 🔔 Push notifications - Desktop notifications for completed transfers
Test individual modules:
# Hash utility tests
python utils/hash_util.py
# Encryption utility tests
python utils/encrypt_util.py
# Status handler tests
python utils/status_handler.py

Run the full integration test suite:
# Start server first
python server.py
# In another terminal, run tests
python test_integration.py

Auto-start server and test:

python test_integration.py --auto-start

Test a specific server:

python test_integration.py --url http://localhost:9000

The integration tests verify:
✅ Valid Requests:
- File upload (unencrypted)
- File upload with encryption
- Chunked upload with resume capability
- Live progress tracking
- Client identification and tracking
- Network quality monitoring
- Status retrieval (all and specific)
- File download (unencrypted)
- File download with decryption
- Health check
✅ Invalid Requests:
- Missing file upload (400 expected)
- Large file upload >100MB (413 expected)
- Nonexistent file status (404 expected)
- Nonexistent file download (404 expected)
- Invalid endpoint (404 expected)
✅ Error Handling:
- All error responses are JSON formatted
- User-friendly error messages
- Proper HTTP status codes
✅ Edge Cases:
- File corruption recovery
- Thread-safe concurrent access
- Empty file handling
- Unicode filename support
1. Test encryption roundtrip:
# Create test file
echo "Secret message" > test.txt
# Upload with encryption
python client.py upload test.txt --encrypt --filename encrypted_test.txt
# Download and verify
python client.py download encrypted_test.txt --output decrypted.txt
cat decrypted.txt
# Should output: "Secret message"

2. Test priority queue:
# Upload files with different priorities
python client.py upload file1.txt --priority 1
python client.py upload file2.txt --priority 10
python client.py upload file3.txt --priority 5
# Check queue order
python client.py status
# Queue should show highest priority first

3. Test large file handling:
# Create 10MB file
python -c "with open('large.bin', 'wb') as f: f.write(b'A' * (10*1024*1024))"
# Upload
python client.py upload large.bin
# Verify
python client.py status --file large.bin

4. Test error handling:
# Try to download nonexistent file
python client.py download nonexistent.txt
# Should show user-friendly error
# Try to upload without running server
# (stop server first)
python client.py upload test.txt
# Should show connection error with helpful message

5. Test new features:
# Batch upload multiple files
python client.py batch file1.txt file2.txt file3.txt --encrypt
# Cancel a transfer
python client.py cancel large_file.bin
# Get transfer history
python client.py history --status completed --limit 10
# Get statistics
python client.py stats
# Get file metadata
python client.py metadata image.jpg

| Variable | Description | Default |
|---|---|---|
| `SERVER_HOST` | Server host address | `0.0.0.0` |
| `SERVER_PORT` | Server port | `8080` |
| `UPLOAD_DIR` | Upload directory path | `storage/` |
| `LOG_FILE` | Status tracking file | `transfer_status.json` |
| `ENCRYPTION_KEY` | 16-byte AES key | Dev default (change in production!) |
| `MAX_FILE_SIZE` | Max upload size (bytes) | `104857600` (100MB) |
| `AUTO_RESTART_ENABLED` | Enable auto-restart | `true` |
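A `config.py` that reads these variables with the defaults from the table might look like this (a sketch; the repository's actual `config.py` may differ):

```python
import os

# Defaults match the configuration table above; override via environment variables.
SERVER_HOST = os.environ.get("SERVER_HOST", "0.0.0.0")
SERVER_PORT = int(os.environ.get("SERVER_PORT", "8080"))
UPLOAD_DIR = os.environ.get("UPLOAD_DIR", "storage/")
LOG_FILE = os.environ.get("LOG_FILE", "transfer_status.json")
MAX_FILE_SIZE = int(os.environ.get("MAX_FILE_SIZE", str(100 * 1024 * 1024)))
AUTO_RESTART_ENABLED = os.environ.get("AUTO_RESTART_ENABLED", "true").lower() == "true"
```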
Production setup:
# Generate secure key
openssl rand -base64 16
# Set environment variable
export ENCRYPTION_KEY="your-generated-key"

Additional security recommendations:
- Use HTTPS in production (reverse proxy with nginx/Apache)
- Implement authentication/authorization
- Rate limiting for upload endpoint
- Input sanitization (already included)
- Regular key rotation
- Secure key management (AWS KMS, Azure Key Vault, etc.)
Check port availability:
# Windows
netstat -ano | findstr :8080
# Linux/Mac
lsof -i :8080

Solution: change the port in config or kill the existing process.
Solution: Increase MAX_FILE_SIZE in config.py or environment:
export MAX_FILE_SIZE=209715200  # 200MB

The system automatically recovers from corruption using backups.
Manual recovery:
# Restore from backup
cp transfer_status.json.backup transfer_status.json
# Or reset to default
echo '{"transfers":{},"queue":[],"metadata":{"last_updated":null,"total_transfers":0,"version":"1.0"}}' > transfer_status.json

Windows:
# Run as administrator
python server.py

Linux/Mac:
# Fix permissions
chmod 755 server.py
chmod 755 client.py
chmod -R 755 utils/
# Create storage directories
mkdir -p storage/temp storage/encrypted
chmod 755 storage

Solution: ensure all dependencies are installed:

pip install -r requirements.txt --upgrade

Check if the server is running:

python client.py health

If not running:

python server.py

1. One person sets up the backend:
git clone <repo>
cd backend
pip install -r requirements.txt
python server.py

2. Share the server URL with the team:
http://<your-ip>:8080
# Find your local IP:
# Windows: ipconfig | findstr IPv4
# Linux/Mac: ifconfig | grep "inet "
3. The frontend team can start using the API immediately:
// Upload file from frontend
const formData = new FormData();
formData.append('file', fileInput.files[0]);
formData.append('encryption', 'true');
fetch('http://192.168.1.100:8080/upload', {
method: 'POST',
body: formData
})
.then(res => res.json())
.then(data => console.log('Uploaded:', data.filename));

Backend Team (this repo):

- ✅ Already done! Just run `python server.py`
- Optional: add authentication, custom endpoints
- Testing: use `client.py` for demos
Frontend Team:
- Build UI for file upload/download
- Display transfer status table
- Show encryption toggle
- Progress bars for uploads
DevOps Team:
- Deploy to cloud (see Cloud Hosting section)
- Set up domain and HTTPS
- Monitor with health check endpoint
Demo/Presentation:
- Use `client.py` for CLI demos
- Show encryption working (upload encrypted, download decrypted)
- Show status tracking in real-time
- Explain security features
1. Parallel testing:
# Terminal 1 - Backend
python server.py
# Terminal 2 - Test uploads
python client.py upload file1.txt &
python client.py upload file2.txt &
python client.py upload file3.txt &
# Terminal 3 - Monitor status
watch -n 1 'python client.py status'

2. Stress testing:
# Upload multiple files quickly
for i in {1..10}; do
echo "Test file $i" > test$i.txt
python client.py upload test$i.txt --priority $i &
done
wait

3. Integration testing:
# Run full test suite before demo
python test_integration.py --auto-start

4. Demo script:
#!/bin/bash
echo "=== Smart File Transfer Demo ==="

echo -e "\n1. Health Check"
python client.py health

echo -e "\n2. Upload File (Unencrypted)"
python client.py upload demo.txt

echo -e "\n3. Upload File (Encrypted)"
python client.py upload secret.txt --encrypt --priority 5

echo -e "\n4. Check Status"
python client.py status

echo -e "\n5. Download File"
python client.py download demo.txt -o downloaded.txt

echo -e "\n6. Verify Integrity"
diff demo.txt downloaded.txt && echo "✓ Files match!"

echo -e "\nDemo complete!"

Issue 1: "Can't connect to server from other computer"
# Solution: Bind to 0.0.0.0, not localhost
export SERVER_HOST="0.0.0.0"
python server.py
# Or edit config.py:
SERVER_HOST = '0.0.0.0'  # Not '127.0.0.1'

Issue 2: "Port already in use"
# Solution: Use different port
export SERVER_PORT="9000"
python server.py

Issue 3: "CORS error in browser"
- Already fixed! Flask-CORS is configured for localhost
- For other origins, edit `server.py`:
CORS(app, resources={
r"/*": {
"origins": ["http://your-frontend-url.com"],
# ...
}
})

Issue 4: "Files too large"
# Increase limit temporarily
export MAX_FILE_SIZE="524288000" # 500MB
python server.py

Issue 5: "Server keeps crashing"
# Use auto-restart
python utils/auto_restart.py
# Will automatically restart on crashes

1. Backend dev workflow:
# Create feature branch
git checkout -b feature/new-endpoint
# Make changes
edit server.py
# Test
python test_integration.py
# Commit and push
git add .
git commit -m "Add new endpoint: /list"
git push origin feature/new-endpoint
# Merge via PR

2. Frontend dev workflow:
# Frontend team uses stable backend
# Point to deployed URL, not local
const API_URL = 'http://stable-backend.herokuapp.com';

3. Shared environment variables:
# Create .env file (add to .gitignore!)
echo "ENCRYPTION_KEY=shared-dev-key-1234567890" > .env
echo "SERVER_PORT=8080" >> .env
# Share with team via secure channel (not git!)

Pros: Instant deployment, free tier, browser-based IDE
Steps:
- Go to https://replit.com
- Click "Create Repl"
- Choose "Import from GitHub"
- Paste repository URL
- Click "Run"
URL: https://your-repl-name.your-username.repl.co
Secrets (environment variables):
- Click "Secrets" icon (lock)
- Add `ENCRYPTION_KEY` with a secure value
Pros: Free tier, automatic HTTPS, easy scaling
Steps:
# Install Heroku CLI
# https://devcenter.heroku.com/articles/heroku-cli
# Login
heroku login
# Create app
heroku create your-app-name
# Set config
heroku config:set ENCRYPTION_KEY=$(openssl rand -base64 16)
# Deploy
git push heroku main
# Open
heroku open

URL: https://your-app-name.herokuapp.com
Logs:
heroku logs --tail

Pros: Free tier, GitHub integration, automatic deploys
Steps:
- Go to https://railway.app
- Click "New Project" → "Deploy from GitHub repo"
- Select your repository
- Add environment variables in Settings
- Railway auto-deploys on every git push
URL: Auto-generated or custom domain
Pros: $5/month, good performance, easy management
Steps:
- Go to https://cloud.digitalocean.com/apps
- Click "Create App"
- Connect GitHub repository
- Configure:
- Build Command: (leave empty)
- Run Command: `python server.py`
- HTTP Port: 8080
- Add environment variables
- Deploy
URL: https://your-app-random.ondigitalocean.app
Pros: Full control, free tier (12 months), scalable
Steps:
# 1. Launch EC2 instance (Ubuntu 22.04)
# 2. SSH into instance
ssh -i your-key.pem ubuntu@your-ec2-ip
# 3. Setup
sudo apt update && sudo apt install python3-pip git -y
git clone <your-repo>
cd backend
pip3 install -r requirements.txt
# 4. Set environment
export ENCRYPTION_KEY=$(openssl rand -base64 16)
# 5. Run in background
nohup python3 server.py > server.log 2>&1 &

URL: http://your-ec2-ip:8080
Security Group: Allow inbound on port 8080
Heroku:
heroku config:set ENCRYPTION_KEY="your-key"
heroku config:set MAX_FILE_SIZE="104857600"

Railway:
- Go to project → Variables
- Add `ENCRYPTION_KEY`, `MAX_FILE_SIZE`, etc.
DigitalOcean:
- App Settings → Environment Variables
- Add key-value pairs
AWS:
# Store in Systems Manager Parameter Store
aws ssm put-parameter --name /file-transfer/encryption-key \
--value "your-key" --type SecureString
# Retrieve in startup script
ENCRYPTION_KEY=$(aws ssm get-parameter --name /file-transfer/encryption-key \
--with-decryption --query Parameter.Value --output text)

Dynamic Port (Heroku, Railway):
# In server.py, change:
if __name__ == '__main__':
port = int(os.environ.get('PORT', SERVER_PORT)) # Use dynamic port
app.run(host=SERVER_HOST, port=port, threaded=True)

Fixed Port (EC2, DigitalOcean):
- Keep `SERVER_PORT = 8080`
- Configure the firewall to allow port 8080
Option 1: Allow specific domain
# In server.py
CORS(app, resources={
r"/*": {
"origins": [
"https://your-frontend.com",
"https://your-frontend.netlify.app",
"http://localhost:3000" # Local dev
],
# ...
}
})

Option 2: Allow all (hackathon only!)

CORS(app)  # Allows all origins

Problem: cloud platforms may not persist uploaded files across restarts.
Solutions:
1. Use cloud storage (recommended):
# AWS S3
pip install boto3

import boto3
s3 = boto3.client('s3')
# Upload to S3 instead of local storage
s3.upload_file('local_file.txt', 'bucket-name', 'remote_file.txt')
# Download from S3
s3.download_file('bucket-name', 'remote_file.txt', 'local_file.txt')

2. Use volume mounting (DigitalOcean, AWS):
# Mount persistent volume
sudo mkdir /mnt/storage
sudo mount /dev/vdb /mnt/storage
# Set UPLOAD_DIR
export UPLOAD_DIR="/mnt/storage"

3. Use a database for small files (Heroku):
pip install psycopg2-binary

# Store files in a PostgreSQL BYTEA column
import os
import psycopg2

conn = psycopg2.connect(os.environ['DATABASE_URL'])
cur = conn.cursor()
# Store file
with open('file.txt', 'rb') as f:
cur.execute("INSERT INTO files (name, data) VALUES (%s, %s)",
('file.txt', f.read()))
conn.commit()

Health check monitoring:
# Use cron or monitoring service
*/5 * * * * curl https://your-app.com/health || echo "Server down!"

Uptime monitoring services:
- UptimeRobot (free): https://uptimerobot.com
- Pingdom (free tier): https://www.pingdom.com
- StatusCake (free tier): https://www.statuscake.com
Log monitoring:
# Heroku
heroku logs --tail
# Railway
railway logs
# AWS CloudWatch
aws logs tail /aws/ec2/file-transfer --follow

Free Tier Options:
- Heroku: 550 hours/month (1 dyno always on)
- Railway: $5 credit/month (~700 hours)
- AWS: 750 hours/month (12 months)
- DigitalOcean: $200 credit for 60 days (new users)
- Replit: Always free (with limitations)
Paid Recommendations (Hackathon Scale):
- Heroku Hobby: $7/month (always on, custom domain)
- Railway Pro: $5/month (better resources)
- DigitalOcean Droplet: $6/month (1GB RAM)
backend/
├── server.py # Main API server
│ └── Routes: /upload, /download, /status, /health
├── client.py # CLI client
│ └── Commands: upload, download, status, health
├── config.py # Configuration
│ └── Validates on import
├── utils/
│ ├── hash_util.py # SHA-256 checksums
│ │ └── file_checksum(path) -> str
│ ├── encrypt_util.py # AES-128-EAX encryption
│ │ ├── encrypt_data(data, key) -> (nonce, ciphertext, tag)
│ │ └── decrypt_data(nonce, ciphertext, tag, key) -> plaintext
│ ├── status_handler.py # Status tracking
│ │ ├── update_status(filename, status, ...)
│ │ ├── get_status(filename) -> dict
│ │ └── get_all_status() -> dict
│ └── auto_restart.py # Process monitoring
│ └── monitor_server(script, interval=5)
└── test_integration.py # Integration tests
└── IntegrationTest.run_all_tests()
- Add an endpoint in `server.py`:

  @app.route('/new_endpoint', methods=['GET'])
  def new_endpoint():
      # Implementation
      return jsonify({'result': 'data'}), 200
- Add a client command in `client.py`:

  # In argparse section
  new_parser = subparsers.add_parser('new_command')

  # In main section
  elif args.command == 'new_command':
      new_command_function()
- Add tests in `test_integration.py`:

  def test_new_feature(self):
      self.print_test("New Feature Test")
      # Test implementation
- Follow PEP 8
- Docstrings for all functions/classes
- Type hints where appropriate
- Comprehensive error handling
- Logging for debugging
MIT License - see LICENSE file for details.
For issues, questions, or contributions, please open an issue or submit a pull request.
- 🆕 WebSocket Real-Time Notifications - Live updates via WebSocket connections
- 🆕 Batch File Operations - Upload/download multiple files in one request
- 🆕 Transfer Cancellation - Cancel ongoing transfers mid-upload
- 🆕 File Metadata & Thumbnails - Rich file information with image thumbnails
- 🆕 Transfer History - Filterable transfer history with date and client filters
- 🆕 API Key Authentication - Secure access with API key authentication
- 🆕 Transfer Statistics - Comprehensive analytics and performance metrics
- 🆕 Enhanced Client CLI - New commands for batch operations, cancellation, history, stats, and metadata
- 🆕 New API Endpoints - `/upload_batch`, `/download_batch`, `/cancel`, `/metadata`, `/thumbnail`, `/history`, `/stats`, `/generate_key`
- 🆕 Comprehensive Testing - Updated test suite for all new features
- 🆕 Live Progress Tracking - Real-time upload/download progress with speed and ETA
- 🆕 Chunked Upload/Download - Break large files into chunks with resume capability
- 🆕 Automatic Retry - Exponential backoff retry logic for failed transfers
- 🆕 Multi-Client Tracking - Track which client uploaded/downloaded each file
- 🆕 Network Health Monitoring - Detect unstable connections and adapt transfer strategy
- 🆕 Speed Tracking - Display transfer speed in MB/s with network quality assessment
- 🆕 Queue Persistence - Transfer queue survives server restarts
- 🆕 New API Endpoints - `/upload_chunk`, `/resume_info`, `/clients`, `/ping`
- 🆕 Enhanced CLI - New options for retry, chunked upload, and progress tracking
- 🆕 Comprehensive Testing - Updated test suite for all new features
- Initial release
- File upload/download with encryption
- Status tracking and queue management
- CLI client
- Auto-restart monitor
- Comprehensive test suite