This repo contains a ready-to-apply patch that scaffolds a full Laravel application meeting the requirements for:
- Bulk CSV import (upsert by unique key) with a complete result summary
- Chunked & resumable image uploads with checksum validation
- Image variant generation at 256px, 512px, 1024px (aspect ratio preserved)
- Upload & Image records with one primary image per entity
- Idempotent primary image replacement
- Concurrency safety
- Unit test validating upsert behavior
⚠️ To keep the download size practical here, this bundle includes a setup script that creates a fresh Laravel app locally (via Composer) and then copies in all of the application code, migrations, routes, and tests.
Stack:
- Laravel 11.x
- PHP 8.2+
- MySQL 8+ (or MariaDB 10.6+); SQLite works for tests
- Intervention/Image for image processing
- Flysystem (local) for storage

Prerequisites:
- PHP 8.2+
- Composer
- Node.js & npm (for Laravel Vite; optional for this backend-only task)
- MySQL running locally (or use SQLite for quick tests)
Example `.env` values:

```env
APP_NAME="BulkUpload"
APP_KEY=
APP_ENV=local
APP_DEBUG=true
APP_URL=http://localhost

# DB (use your own values)
DB_CONNECTION=mysql
DB_HOST=127.0.0.1
DB_PORT=3306
DB_DATABASE=bulkupload
DB_USERNAME=root
DB_PASSWORD=

# For quick SQLite testing:
# DB_CONNECTION=sqlite
# DB_DATABASE=/absolute/path/to/app/database/database.sqlite

FILESYSTEM_DISK=local
QUEUE_CONNECTION=database # or redis; sync for dev
```
Create the database, then run migrations and link storage:

```shell
php artisan migrate
php artisan storage:link
```

Run the unit tests and start the dev server:

```shell
php artisan test --testsuite=Unit
php artisan serve
```
CSV columns:
- `sku` (required; unique key)
- `name` (required)
- `price` (required)
- `primary_image` (optional: filename that was uploaded via the chunked flow)

Rows with missing required columns are counted as invalid, but the import continues.
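A minimal input file matching these columns might look like this (values are illustrative, not from the repo):

```csv
sku,name,price,primary_image
SKU-000001,Red Mug,9.99,mug-red.jpg
SKU-000002,Blue Mug,10.49,
```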
- Multipart form with `file` (CSV).
- Streams rows and upserts by `sku`.
- Summary returned: `total`, `imported`, `updated`, `invalid`, `duplicates`.

If `primary_image` is present, the importer will attempt to attach the matching image upload (if one exists) as the product's primary image and trigger variant generation if not already generated. Re-attaching the same upload is a no-op.
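The summary counters can be sketched in plain PHP. This is a hypothetical helper (`summarizeRows` is not in the repo): `$existingSkus` stands in for the `products` table, and the spot marked in the comment is where the real importer calls `Product::updateOrCreate`.

```php
<?php
// Classify parsed CSV rows and build the import summary.
// $rows: associative arrays keyed by header; $existingSkus: SKUs already in the DB.
function summarizeRows(array $rows, array $existingSkus): array
{
    $summary = ['total' => 0, 'imported' => 0, 'updated' => 0, 'invalid' => 0, 'duplicates' => 0];
    $seen = [];
    foreach ($rows as $row) {
        $summary['total']++;
        // Missing required columns → row counted as invalid, import continues.
        if (empty($row['sku']) || empty($row['name']) || !isset($row['price'])) {
            $summary['invalid']++;
            continue;
        }
        // Duplicate sku within the same file: first occurrence wins.
        if (isset($seen[$row['sku']])) {
            $summary['duplicates']++;
            continue;
        }
        $seen[$row['sku']] = true;
        // In the real importer this branch is Product::updateOrCreate(['sku' => ...], [...]).
        if (in_array($row['sku'], $existingSkus, true)) {
            $summary['updated']++;
        } else {
            $summary['imported']++;
        }
    }
    return $summary;
}
```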
Flow:

POST /api/uploads/initiate
- Body JSON: `{ "filename": "foo.jpg", "size": 1234567, "checksum": "<sha256 hex>" }`
- Returns `{ "upload_uuid": "...", "chunk_size": 5242880 }` (5 MB default)
POST /api/uploads/chunk
- Form-data: `upload_uuid`, `index` (0-based), `total`, `chunk` (file), `chunk_checksum` (sha256)
- Safe to re-send the same chunk: it overwrites atomically.
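The atomic-overwrite behavior can be sketched in plain PHP. `storeChunk` is a hypothetical helper (the app uses Flysystem paths under `storage/app/chunks/`); the key idea is the deterministic path plus a temp-file `rename()`.

```php
<?php
// Write a chunk to its deterministic path via temp file + rename(),
// so a re-sent chunk simply overwrites the same path atomically.
function storeChunk(string $baseDir, string $uploadUuid, int $index, string $bytes): string
{
    $dir = "$baseDir/chunks/$uploadUuid";
    if (!is_dir($dir)) {
        mkdir($dir, 0775, true);
    }
    $final = "$dir/$index.part";      // deterministic: {upload_uuid}/{index}.part
    $tmp   = "$final.tmp";
    file_put_contents($tmp, $bytes);
    rename($tmp, $final);             // atomic on the same filesystem
    return $final;
}
```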
POST /api/uploads/complete
- Body JSON: `{ "upload_uuid": "...", "attach_to_sku": "SKU-123" }` (`attach_to_sku` is optional)
- Server assembles the chunks, verifies the full-file checksum, creates the `uploads` & `images` rows, and generates 256/512/1024 variants with aspect ratio preserved.
- If `attach_to_sku` is provided, sets/updates the product's primary image (idempotent).
- Body JSON:
Rules enforced:
- Re-sent chunks do not corrupt data (the chunk path is deterministic).
- A checksum mismatch blocks completion (HTTP 422).
- Concurrency: DB transactions plus unique constraints (`sku`; one image row per upload).
Storage layout:

```text
storage/app/chunks/{upload_uuid}/{index}.part
storage/app/uploads/original/{yyyy}/{mm}/{uuid}-(unknown)
storage/app/uploads/variants/{image_id}/{size}.jpg
```
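The actual resizing is done with Intervention/Image; the aspect-ratio math behind the 256/512/1024 variants can be illustrated with a small pure-PHP helper (`variantDimensions` is hypothetical, not in the repo):

```php
<?php
// Compute target dimensions for a variant: scale the longest edge down to
// $maxEdge, preserve aspect ratio, and never upscale a smaller image.
function variantDimensions(int $w, int $h, int $maxEdge): array
{
    if (max($w, $h) <= $maxEdge) {
        return [$w, $h]; // already small enough → no upscaling
    }
    $scale = $maxEdge / max($w, $h);
    return [(int) round($w * $scale), (int) round($h * $scale)];
}
```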
Schema:
- `products` (id, sku unique, name, price, primary_image_id nullable FK)
- `uploads` (id, uuid, filename, bytes, checksum, status, total_chunks, received_chunks json)
- `images` (id, upload_id FK, path_original, variants json)
- DB queue tables (`queue:table`) recommended for image jobs
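As a sketch (not the repo's actual migration), the unique `sku` and nullable primary-image FK might be declared like this in a Laravel migration:

```php
<?php

use Illuminate\Database\Migrations\Migration;
use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\Schema;

return new class extends Migration {
    public function up(): void
    {
        Schema::create('products', function (Blueprint $table) {
            $table->id();
            $table->string('sku')->unique();   // upsert key + concurrency guard
            $table->string('name');
            $table->decimal('price', 10, 2);
            // Assumes the images table exists; nulled out if the image is deleted.
            $table->foreignId('primary_image_id')->nullable()
                  ->constrained('images')->nullOnDelete();
            $table->timestamps();
        });
    }

    public function down(): void
    {
        Schema::dropIfExists('products');
    }
};
```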
`tests/Unit/UpsertTest.php` validates:
- Creating a new product via the importer
- Updating an existing product (same `sku`)
- Proper summary counters

Run:

```shell
php artisan test --testsuite=Unit
```
Helper scripts:
- `scripts/generate_large_csv.php` → generates 10,000+ rows
- `scripts/generate_images.php` → creates hundreds of simple placeholder images

Run from the project root:

```shell
php scripts/generate_large_csv.php storage/app/sample.csv 12000
php scripts/generate_images.php storage/app/upload_sources 400
```
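The CSV generator boils down to a loop over `fputcsv`. A minimal sketch of the idea (function name and row values are illustrative, not the script's actual contents):

```php
<?php
// Emit a header row plus $rows data rows in the importer's expected format.
function generateCsv(string $path, int $rows): void
{
    $fh = fopen($path, 'wb');
    fputcsv($fh, ['sku', 'name', 'price', 'primary_image']);
    for ($i = 1; $i <= $rows; $i++) {
        fputcsv($fh, [
            sprintf('SKU-%06d', $i),                                  // unique upsert key
            "Product $i",
            number_format(mt_rand(100, 99999) / 100, 2, '.', ''),     // random price
            '',                                                       // no primary image
        ]);
    }
    fclose($fh);
}
```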
Implementation notes:
- Upsert uses a unique index on `sku` and `updateOrCreate` (wrapped in a transaction).
- Chunked upload writes chunks to deterministic paths and uses `rename()` on completion, which prevents partial reads. Re-sending a chunk overwrites the same path.
- The checksum is verified using SHA-256 across the fully assembled file; a mismatch → 422 and no DB write.
- Setting the primary image uses `Product::update()` guarded by a check that skips the write when the product is already set to the same image (no-op). Attaching the same upload twice → no change (checked via the existing relation).
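The idempotency guard reduces to a simple compare-before-write. A sketch (`setPrimaryImage` is a hypothetical stand-in; in the app, `$product` is an Eloquent model and the write happens via `Product::update()` inside a transaction):

```php
<?php
// Assign a primary image idempotently: return true only when a change
// was actually made; a repeat call with the same image is a no-op.
function setPrimaryImage(object $product, int $imageId): bool
{
    if ($product->primary_image_id === $imageId) {
        return false; // already primary → skip the write entirely
    }
    $product->primary_image_id = $imageId;
    // In the app: $product->save() inside a DB transaction.
    return true;
}
```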
Import `docs/postman_collection.json` for ready-to-use requests.