This project is part of my first task in the Deloitte Virtual Experience Program, where I worked on a real-world industrial problem involving IIoT (Industrial Internet of Things) data.
The objective was to unify different telemetry data formats generated by machines across factories into a single, standardized structure for better analysis and consistency.
Daikibo Industrials collects telemetry data from multiple machines across global factories.
However:

- ⚠️ Data is generated in multiple formats
- ⚠️ Structures are inconsistent across machines
- ⚠️ The data is difficult to process and analyze

👉 The challenge was to convert the different input formats into one unified schema.
I developed a Python-based data transformation system that:
- 🔄 Converts multiple JSON formats into a standard format
- 🧩 Handles both nested and flat data structures
- ⏱️ Converts timestamps into milliseconds (epoch time)
- 📦 Ensures clean, consistent output for further processing
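A minimal sketch of the transformation step is shown below. The field names and input shapes (`device_id`, `iso_timestamp`, a nested `device` block, etc.) are illustrative assumptions, not the actual Daikibo schemas from the task files.

```python
from datetime import datetime

def iso_to_epoch_ms(iso_string):
    """Convert an ISO-8601 timestamp string to epoch milliseconds."""
    # fromisoformat() does not accept a trailing "Z", so map it to an
    # explicit UTC offset first.
    dt = datetime.fromisoformat(iso_string.replace("Z", "+00:00"))
    return int(dt.timestamp() * 1000)

def convert_nested(entry):
    """Assumed format 1: nested device block, timestamp already in epoch ms."""
    return {
        "deviceID": entry["device"]["id"],
        "deviceType": entry["device"]["type"],
        "timestamp": entry["timestamp"],
        "location": entry["location"],
    }

def convert_flat(entry):
    """Assumed format 2: flat keys, ISO-8601 timestamp string."""
    return {
        "deviceID": entry["device_id"],
        "deviceType": entry["device_type"],
        "timestamp": iso_to_epoch_ms(entry["iso_timestamp"]),
        "location": {
            "country": entry["country"],
            "city": entry["city"],
        },
    }
```

Each converter maps its input shape onto the same output keys, so downstream code only ever sees one schema.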
```
daikibo-telemetry/
│
├── main.py
├── data-1.json
├── data-2.json
└── data-result.json
```
- ✅ Schema unification across multiple formats
- ✅ Timestamp normalization (ISO → Epoch ms)
- ✅ Clean modular Python functions
- ✅ Automated testing using `unittest`

Run `python main.py`: ✔️ all test cases pass successfully.
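The `unittest` checks can be sketched as follows. The test case and the expected value are illustrative assumptions, not the repository's actual test suite; the helper is repeated here so the block runs standalone.

```python
import unittest
from datetime import datetime

def iso_to_epoch_ms(iso_string):
    """Convert an ISO-8601 timestamp string to epoch milliseconds."""
    dt = datetime.fromisoformat(iso_string.replace("Z", "+00:00"))
    return int(dt.timestamp() * 1000)

class TestTimestampNormalization(unittest.TestCase):
    def test_iso_utc_string(self):
        # 2021-04-29 05:09:10 UTC expressed as epoch milliseconds
        self.assertEqual(iso_to_epoch_ms("2021-04-29T05:09:10Z"),
                         1619672950000)

if __name__ == "__main__":
    # exit=False keeps the interpreter alive after the run,
    # which is convenient when the tests live inside main.py
    unittest.main(argv=["main"], exit=False)
```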
- Working with real-world industrial data
- Understanding data transformation pipelines
- Handling JSON structures and schema mapping
- Writing testable and reliable Python code
I would like to sincerely thank Deloitte for providing this opportunity through the Virtual Experience Program.
As my first task, it gave me valuable exposure to how real-world data problems are approached and solved in industry.
Always open to learning, building, and collaborating 🚀