Turn Discord into a datastore that can manage and store your files.
DDrive is a lightweight cloud storage system, written in Node.js, that uses Discord as its storage device. It supports unlimited file sizes and storage capacity, implemented with Node.js streams and multipart upload and download.
Current stable branch: 4.x
Live demo: ddrive.forscht.dev
- Unlimited file size through 24MB file chunks, using the Node.js streams API.
- Simple yet robust HTTP front end.
- REST API with OpenAPI 3.1 specifications.
- Tested with storing 4TB of data on a single Discord channel (max file size of 16GB).
- Basic auth with read-only public access to the panel.
- Easily deployable on Heroku/Replit for private cloud storage.
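To illustrate the chunking idea behind the unlimited-file-size feature, here is a minimal sketch. The function name and layout are hypothetical illustrations, not DDrive's actual internals:

```javascript
// Hypothetical illustration of DDrive's chunking idea, not its actual code:
// a large file is split into fixed-size parts, each small enough to be
// uploaded to Discord as a single attachment.
const CHUNK_SIZE = 24 * 1024 * 1024 // 24MB, matching the chunk size above

// Plan the byte range of each chunk for a file of the given size.
function planChunks(fileSize, chunkSize = CHUNK_SIZE) {
  const parts = []
  for (let start = 0; start < fileSize; start += chunkSize) {
    parts.push({ start, end: Math.min(start + chunkSize, fileSize) - 1 })
  }
  return parts
}

// A 100MB file becomes five parts: four full 24MB chunks plus a 4MB tail.
console.log(planChunks(100 * 1024 * 1024).length) // 5
```

In practice, each planned range can then be read with `fs.createReadStream(path, { start, end })` and streamed to Discord without ever loading the whole file into memory.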
Version 4.0 is a complete rewrite of DDrive, incorporating the most requested features and several improvements:
- Now uses `postgres` to store file metadata for improved performance and functionality.
- Supports file and folder `rename` operations.
- Enables `move` operations for files and folders via the API.
- Utilizes `webhooks` instead of bot/user tokens to bypass Discord rate limits.
- Parallel file chunk uploads for increased speed (a 5GB file uploaded in 85 seconds).
- Public access mode for read-only access.
- Batch file uploads directly from the panel.
- Bug fixes, including improved `download reset` for mobile devices.
- Optional file encryption for enhanced security.
- Proper REST API adhering to OpenAPI 3.1 standards.
- Dark/light mode support on the panel.
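As a rough sketch of how a webhook-based upload can look (the helper names here are hypothetical, not DDrive's actual implementation), a chunk can be posted to a webhook as a multipart form using the `fetch` and `FormData` globals available in Node.js 18+:

```javascript
// Hypothetical sketch of a webhook chunk upload, not DDrive's actual code.
// Requires Node.js 18+ for the global fetch, FormData, and Blob.

// Wrap a chunk buffer as a multipart form with a single file field.
function buildChunkForm(chunk, filename) {
  const form = new FormData()
  form.append('file', new Blob([chunk]), filename)
  return form
}

// POST the chunk to a Discord webhook; with ?wait=true the response is
// the created message JSON, which contains the attachment URL needed
// later to download the chunk back.
async function uploadChunk(webhookUrl, chunk, filename) {
  const res = await fetch(`${webhookUrl}?wait=true`, {
    method: 'POST',
    body: buildChunkForm(chunk, filename),
  })
  if (!res.ok) throw new Error(`upload failed with status ${res.status}`)
  return res.json()
}
```

Spreading consecutive chunks across several webhooks is what lets uploads run in parallel without tripping per-webhook rate limits.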
Your support is highly appreciated - Buy me a coffee
- Node.js v16.x or Docker
- PostgreSQL Database, Discord Webhook URLs
- Average technical knowledge
- Clone this project.
- Create several webhook URLs (at least 5 for optimal performance).
- Set up PostgreSQL using Docker:
  - Navigate to the `.devcontainer` directory.
  - Run `docker-compose up -d`.
  - Navigate back to the project root.
- Copy `config/.env_sample` to `config/.env` and update it with your details.
- (Optional) If you have many webhook URLs, list them in `webhook.txt`, separated by newlines.
- Run `npm run migration:up`.
- Start the server with `node bin/ddrive`.
- Open `http://localhost:3000` in your browser.
- Install pm2: `npm install -g pm2`.
- Start DDrive with pm2: `pm2 start bin/ddrive`.
- Check the status: `pm2 list`.
- View logs: `pm2 logs`.
| Variable | Description |
|---|---|
| `DATABASE_URL` | PostgreSQL database URL |
| `WEBHOOKS` | Discord webhook URLs, comma-separated |
| `PORT` | HTTP port for the DDrive panel |
| `REQUEST_TIMEOUT` | Timeout for Discord API requests (in milliseconds) |
| `CHUNK_SIZE` | Size of file chunks (in bytes) |
| `SECRET` | Secret for file encryption |
| `AUTH` | Basic auth credentials (`username:password`) |
| `PUBLIC_ACCESS` | Read-only access level (`READ_ONLY_FILE` or `READ_ONLY_PANEL`) |
| `UPLOAD_CONCURRENCY` | Number of parallel uploads to Discord |
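Putting the variables together, a filled-in `config/.env` might look like the sample below. Every value is a placeholder, not a working credential:

```shell
# Example config/.env — all values are placeholders
DATABASE_URL=postgres://user:password@localhost:5432/ddrive
WEBHOOKS=https://discord.com/api/webhooks/111/aaa,https://discord.com/api/webhooks/222/bbb
PORT=3000
REQUEST_TIMEOUT=60000
CHUNK_SIZE=25165824
SECRET=somerandomsecret
AUTH=admin:admin
PUBLIC_ACCESS=READ_ONLY_FILE
UPLOAD_CONCURRENCY=3
```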
```shell
docker run --rm -it -p 8080:8080 \
  -e PORT=8080 \
  -e WEBHOOKS={url1},{url2} \
  -e DATABASE_URL={database url} \
  --name ddrive forscht/ddrive
```
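Alternatively, the same container can be run next to PostgreSQL with Docker Compose. The file below is a hypothetical sketch, not an official compose file from the project; image tags and credentials are placeholders:

```yaml
# Hypothetical docker-compose.yml — image tags and credentials are placeholders
version: '3'
services:
  postgres:
    image: postgres:14
    environment:
      POSTGRES_USER: ddrive
      POSTGRES_PASSWORD: ddrive
      POSTGRES_DB: ddrive
  ddrive:
    image: forscht/ddrive
    depends_on:
      - postgres
    ports:
      - '8080:8080'
    environment:
      PORT: 8080
      WEBHOOKS: '{url1},{url2}'
      DATABASE_URL: postgres://ddrive:ddrive@postgres:5432/ddrive
```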
```shell
npm install @forscht/ddrive
```
```javascript
const { DFs, HttpServer } = require('@forscht/ddrive')

const DFsConfig = {
  chunkSize: 25165824,
  webhooks: 'webhookURL1,webhookURL2',
  secret: 'somerandomsecret',
  maxConcurrency: 3, // UPLOAD_CONCURRENCY
  restOpts: {
    timeout: '60000',
  },
}

const httpConfig = {
  authOpts: {
    auth: { user: 'admin', pass: 'admin' },
    publicAccess: 'READ_ONLY_FILE', // or 'READ_ONLY_PANEL'
  },
  port: 8080,
}

const run = async () => {
  // Create a DFs instance
  const dfs = new DFs(DFsConfig)
  // Create an HTTP server instance
  const httpServer = HttpServer(dfs, httpConfig)

  return httpServer.listen({ host: '0.0.0.0', port: httpConfig.port })
}

run().then()
```
Migrating DDrive v3 to v4 is a one-way process: once you migrate to v4 and add new files, those new files cannot be migrated back to v3, but you can still use v3 with your old files.
- Clone this project
- Create a few webhooks (one webhook per text channel). Do not create a webhook on an old text channel where you have already stored v3 data.
- Pull the latest DDrive v3.
- Start DDrive v3 with the option `--metadata=true`. Example: `ddrive --channelId {id} --token {token} --metadata=true`
- Open `localhost:{ddrive-port}/metadata` in a browser.
- Save the JSON as `old_data.json` in the cloned DDrive directory.
- Put a valid `DATABASE_URL` in `config/.env`.
- Run `node bin/migrate old_data.json`.
- After a few seconds, once the process is done, you should see the message `Migration is done`.
For support, join our Discord server or create a new issue.
DDrive is MIT licensed.