Stock Management System - Comprehensive Documentation

Project Overview

An intelligent stock management solution designed for hospitals, public distribution systems, and NGOs to monitor inventory health, predict demand, and automate reordering processes. Built on Snowflake's data platform with real-time monitoring capabilities.

Business Problem

Healthcare facilities and distribution systems face critical challenges:

  • Stock-outs of essential medicines and supplies
  • Over-stocking leading to waste and expiration
  • Fragmented data across multiple systems
  • Reactive rather than proactive inventory management
  • Manual procurement processes prone to delays

Solution

A unified stock management system that provides:

  • Real-time inventory visibility across all locations
  • Early warning alerts for stock-outs
  • Automated demand forecasting
  • Data-driven reorder recommendations
  • Comprehensive audit trail of procurement actions

Architecture

Technology Stack

Data Platform: Snowflake

  • Worksheets/SQL: Core data transformations
  • Dynamic Tables: Auto-refreshing metrics and calculations
  • Streams: Change data capture for real-time alerts
  • Tasks: Scheduled automation and orchestration
  • Snowpark Python: Machine learning for demand forecasting
  • Streamlit: Interactive web dashboard

Programming Languages:

  • SQL (data layer)
  • Python 3.9 (forecasting & visualization)

Key Libraries:

  • snowflake-snowpark-python
  • pandas
  • plotly
  • streamlit

Data Model

Core Tables

1. daily_stock

Primary transaction table storing daily inventory movements.

Columns:
- record_date (DATE): Transaction date
- location (VARCHAR): Facility/warehouse identifier
- item_name (VARCHAR): Product identifier
- opening_stock (NUMBER): Stock at start of day
- received (NUMBER): Units received
- issued (NUMBER): Units dispensed/sold
- closing_stock (NUMBER): Stock at end of day
- lead_time_days (NUMBER): Supplier lead time
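
A minimal DDL sketch matching the column list above; the types follow the descriptions, but constraints and defaults in the provided script may differ:

-- Hedged sketch; types follow the column descriptions above.
CREATE TABLE IF NOT EXISTS daily_stock (
    record_date     DATE,
    location        VARCHAR,
    item_name       VARCHAR,
    opening_stock   NUMBER,
    received        NUMBER,
    issued          NUMBER,
    closing_stock   NUMBER,
    lead_time_days  NUMBER
);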

2. stock_health_metrics (Dynamic Table)

Auto-refreshed aggregated metrics.

Refresh Frequency: 1 hour
Calculates:
- avg_daily_usage: 30-day rolling average
- days_until_stockout: Current stock ÷ daily usage
- stock_status: CRITICAL/LOW/HEALTHY/OVERSTOCK

3. reorder_recommendations (Dynamic Table)

Automated purchase suggestions.

Refresh Frequency: 1 hour
Calculates:
- recommended_reorder_qty: Safety stock - current stock
- priority: 1 (Critical) to 4 (Overstock)

4. stock_alerts

Stream-powered alert log.

Populated by: process_stock_changes task
Triggers: Status changes to CRITICAL or LOW
Frequency: Every 5 minutes

5. demand_forecasts

ML-generated predictions.

Generated by: forecast_demand procedure
Horizon: 14 days ahead
Method: 7-day moving average with trend adjustment

Backend Implementation

Phase 1: Database Setup

CREATE DATABASE stock_management;
CREATE SCHEMA inventory;

Creates isolated workspace for all inventory objects.

Phase 2: Data Ingestion

Initial Load:

  • INSERT statements for sample data
  • Production: CSV upload via Snowflake UI or COPY INTO

Ongoing Updates:

  • Daily batch loads from operational systems
  • Real-time streaming for high-frequency updates
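
For the streaming path, Snowpipe is one common option. A hedged sketch reusing the stage referenced in the deployment guide (the pipe name is illustrative, and AUTO_INGEST assumes cloud-storage event notifications are configured):

-- Illustrative only; assumes new CSV files land in @my_stage.
CREATE PIPE IF NOT EXISTS daily_stock_pipe AUTO_INGEST = TRUE AS
  COPY INTO daily_stock
  FROM @my_stage
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);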

Phase 3: Transformation Layer

Dynamic Table: stock_health_metrics

Business logic:

  • Aggregates last 30 days of transactions
  • Calculates consumption velocity
  • Determines stock status thresholds:
    • CRITICAL: < lead_time_days coverage
    • LOW: < 1.5x lead_time_days coverage
    • OVERSTOCK: > 4x lead_time_days coverage
    • HEALTHY: Everything else
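
A hedged sketch of how this dynamic table might be defined. The CASE expression mirrors the thresholds above and the snippet under "Adjusting Stock Thresholds" later in this document; TARGET_LAG, the warehouse, and the exact aggregations are assumptions:

-- Illustrative sketch; the provided SQL script is authoritative.
CREATE OR REPLACE DYNAMIC TABLE stock_health_metrics
  TARGET_LAG = '1 hour'
  WAREHOUSE = COMPUTE_WH
AS
SELECT
    location,
    item_name,
    AVG(issued)                                 AS avg_daily_usage,
    MAX(lead_time_days)                         AS lead_time_days,
    MAX(closing_stock)                          AS current_stock,
    MAX(closing_stock) / NULLIF(AVG(issued), 0) AS days_until_stockout,
    CASE
        WHEN MAX(closing_stock) / NULLIF(AVG(issued), 0) < MAX(lead_time_days)       THEN 'CRITICAL'
        WHEN MAX(closing_stock) / NULLIF(AVG(issued), 0) < MAX(lead_time_days) * 1.5 THEN 'LOW'
        WHEN MAX(closing_stock) / NULLIF(AVG(issued), 0) > MAX(lead_time_days) * 4   THEN 'OVERSTOCK'
        ELSE 'HEALTHY'
    END AS stock_status
FROM daily_stock
WHERE record_date >= DATEADD(day, -30, CURRENT_DATE())
GROUP BY location, item_name;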

Dynamic Table: reorder_recommendations

Reorder formula:

recommended_qty = (avg_daily_usage × lead_time_days × 2) - current_stock

Safety factor of 2x provides buffer for demand variability.
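
A matching sketch for this dynamic table; the ROUND expression is the one shown under "Modifying Reorder Quantities" below, while the priority mapping and refresh settings are assumptions:

-- Illustrative sketch; builds on stock_health_metrics.
CREATE OR REPLACE DYNAMIC TABLE reorder_recommendations
  TARGET_LAG = '1 hour'
  WAREHOUSE = COMPUTE_WH
AS
SELECT
    location,
    item_name,
    current_stock,
    ROUND((avg_daily_usage * lead_time_days * 2) - current_stock, 0) AS recommended_reorder_qty,
    CASE stock_status
        WHEN 'CRITICAL' THEN 1
        WHEN 'LOW'      THEN 2
        WHEN 'HEALTHY'  THEN 3
        ELSE 4  -- OVERSTOCK
    END AS priority
FROM stock_health_metrics
WHERE (avg_daily_usage * lead_time_days * 2) > current_stock;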

Phase 4: Real-Time Monitoring

Stream: stock_health_changes

Captures all INSERT/UPDATE/DELETE operations on stock_health_metrics.

Task: process_stock_changes

Schedule: Every 5 minutes
Condition: WHEN SYSTEM$STREAM_HAS_DATA('stock_health_changes')
Action: Insert alerts into stock_alerts table

Benefit: Alerts fire only when status actually changes, avoiding spam.
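
A hedged sketch of the stream and task pair. The alert table's columns are assumptions, apart from alert_timestamp and stock_status, which appear in the audit query in the maintenance section:

-- Illustrative sketch; Snowflake supports streams on dynamic tables.
CREATE OR REPLACE STREAM stock_health_changes
  ON DYNAMIC TABLE stock_health_metrics;

CREATE OR REPLACE TASK process_stock_changes
  WAREHOUSE = COMPUTE_WH
  SCHEDULE = '5 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('stock_health_changes')
AS
  INSERT INTO stock_alerts (alert_timestamp, location, item_name, stock_status)
  SELECT CURRENT_TIMESTAMP(), location, item_name, stock_status
  FROM stock_health_changes
  WHERE stock_status IN ('CRITICAL', 'LOW');  -- alert only on problem statuses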

Phase 5: Demand Forecasting

Procedure: forecast_demand

Algorithm:

  1. Extract last 30 days of usage data
  2. Calculate 7-day moving average
  3. Detect linear trend (recent vs. older periods)
  4. Project 14 days forward
  5. Apply non-negativity constraint

Returns: Table of (forecast_date, predicted_usage) pairs
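
The procedure itself runs in Snowpark Python; purely as an illustration, the same five steps can be expressed in SQL (the location, item, and per-day trend divisor are assumptions):

-- Illustrative SQL rendering of the algorithm above, for one location/item.
WITH recent AS (
    SELECT record_date, issued
    FROM daily_stock
    WHERE location = 'Hospital A'
      AND item_name = 'Paracetamol 500mg'
      AND record_date >= DATEADD(day, -30, CURRENT_DATE())  -- step 1
),
stats AS (
    SELECT
        AVG(CASE WHEN record_date >= DATEADD(day, -7, CURRENT_DATE())
                 THEN issued END) AS recent_avg,             -- step 2: 7-day average
        AVG(CASE WHEN record_date <  DATEADD(day, -7, CURRENT_DATE())
                 THEN issued END) AS older_avg               -- step 3: older baseline
    FROM recent
),
days AS (
    SELECT ROW_NUMBER() OVER (ORDER BY SEQ4()) AS day_offset
    FROM TABLE(GENERATOR(ROWCOUNT => 14))                    -- step 4: 14-day horizon
)
SELECT
    DATEADD(day, d.day_offset, CURRENT_DATE()) AS forecast_date,
    GREATEST(0, ROUND(s.recent_avg
        + (s.recent_avg - s.older_avg) / 7 * d.day_offset))  -- step 5: non-negative
        AS predicted_usage
FROM stats s CROSS JOIN days d
ORDER BY forecast_date;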

Task: refresh_forecasts

Schedule: Daily at 6 AM UTC
Action: Regenerate forecasts for all active items
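
A hedged sketch of the task definition; how forecast_demand is parameterized to cover all active items is not shown in this document, so the call is illustrative:

CREATE OR REPLACE TASK refresh_forecasts
  WAREHOUSE = COMPUTE_WH
  SCHEDULE = 'USING CRON 0 6 * * * UTC'  -- daily at 6 AM UTC
AS
  CALL forecast_demand();                -- illustrative; actual arguments may differ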

Phase 6: Automation

Procedure: check_critical_stock

Manual execution to validate alert logic:

CALL check_critical_stock();

Returns count of items in CRITICAL status.


Frontend Implementation

Application Structure

Framework: Streamlit in Snowflake
Layout: Wide mode, 4-tab interface
Data Connection: Snowpark session (auto-authenticated)

Tab 1: Stock Health Heatmap

Purpose: Visual overview of inventory status across locations

Implementation:

pivot = df.pivot_table(index='ITEM_NAME', columns='LOCATION', values='DAYS_UNTIL_STOCKOUT')

Visualization: Plotly heatmap

  • Green: Healthy stock levels
  • Yellow: Approaching reorder point
  • Red: Critical/imminent stock-out

User Benefit: Instantly identify problem areas across entire inventory.

Tab 2: Real-Time Alerts

Purpose: Actionable notifications of stock issues

Data Source: stock_alerts table (last 50 records)

Key Metric: Critical Alerts count

Features:

  • Chronological display (newest first)
  • Status-based filtering
  • Direct link to problematic items

User Benefit: Prioritize immediate attention to critical items.
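
The data pull behind this tab is likely a simple ordered query; a hedged sketch (the alert_timestamp column name is taken from the audit query in the maintenance section):

SELECT *
FROM stock_alerts
ORDER BY alert_timestamp DESC  -- newest first
LIMIT 50;                      -- last 50 records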

Tab 3: Demand Forecasts

Purpose: Predict future consumption patterns

Interaction Flow:

  1. User selects location (dropdown)
  2. User selects item (filtered by location)
  3. Click "Generate Forecast" button
  4. System calls forecast_demand stored procedure
  5. Display 14-day projection as line chart
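
Step 4 boils down to a stored-procedure call; a hedged example of what the app might issue (the parameter order is an assumption):

CALL forecast_demand('Hospital A', 'Paracetamol 500mg');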

Visualization: Streamlit native line chart

  • X-axis: Future dates
  • Y-axis: Predicted usage units

Edge Cases:

  • < 7 days of data: Warning message displayed
  • Empty result: Graceful degradation

User Benefit: Plan procurement ahead of actual stock-outs.

Tab 4: Reorder Recommendations

Purpose: Export-ready purchase orders

Data Source: reorder_recommendations dynamic table

Metrics Display:

  • Total items needing reorder
  • Aggregate units to purchase

Table Columns:

  • Location
  • Item Name
  • Current Stock
  • Recommended Quantity
  • Priority (1-4 scale)

Export Feature:

st.download_button("Download Purchase Order", csv_data, "reorder_list.csv")

User Benefit: One-click generation of procurement worksheets.


Deployment Guide

Prerequisites

  1. Snowflake Account (Trial or Standard edition)
  2. Warehouse (COMPUTE_WH or equivalent)
  3. Permissions:
    • CREATE DATABASE
    • CREATE SCHEMA
    • CREATE TABLE/PROCEDURE/TASK
    • EXECUTE TASK
    • CREATE STREAMLIT
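
A hedged sketch of granting these privileges to a deployment role (the role name and grant scope are illustrative):

-- Illustrative only; an ACCOUNTADMIN-level role would typically run these.
GRANT CREATE DATABASE ON ACCOUNT TO ROLE deploy_role;
GRANT EXECUTE TASK    ON ACCOUNT TO ROLE deploy_role;
-- Once the database and schema exist:
GRANT CREATE TABLE, CREATE PROCEDURE, CREATE TASK, CREATE STREAMLIT
  ON SCHEMA stock_management.inventory TO ROLE deploy_role;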

Step-by-Step Deployment

Step 1: Execute Backend SQL

  1. Open Snowflake Web UI
  2. Navigate to: Worksheets > + New Worksheet
  3. Copy entire SQL script from provided document
  4. Execute sequentially (do not run all at once)
  5. Verify each section completes without errors

Key Checkpoints:

-- After table creation
SELECT COUNT(*) FROM daily_stock; -- Should return 7

-- After dynamic tables
SELECT COUNT(*) FROM stock_health_metrics; -- Should return 2-3

-- After tasks
SHOW TASKS; -- Verify process_stock_changes exists

Step 2: Activate Tasks

ALTER TASK process_stock_changes RESUME;
ALTER TASK refresh_forecasts RESUME;

Verify status:

SHOW TASKS;
-- STATE column should show 'started'

Step 3: Deploy Streamlit App

  1. Navigate to: Streamlit > + Streamlit App
  2. Name: "Stock Management Dashboard"
  3. Warehouse: COMPUTE_WH
  4. Database: stock_management
  5. Schema: inventory
  6. Copy Python code from frontend file
  7. Click "Run"

Troubleshooting:

  • Error: "Table not found" → Verify USE DATABASE/SCHEMA context
  • Error: "Procedure not found" → Check forecast_demand was created
  • Slow loading → Increase warehouse size (M → L)

Step 4: Load Production Data

Option A: CSV Upload

  1. Navigate to: Data > Add Data > Load Data
  2. Select daily_stock table
  3. Upload CSV file
  4. Map columns (must match schema exactly)

Option B: SQL COPY INTO

COPY INTO daily_stock
FROM @my_stage/stock_data.csv
FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

Usage Guidelines

Daily Operations Workflow

Morning Routine (8:00 AM):

  1. Open Streamlit dashboard
  2. Check "Alerts" tab for critical items
  3. Review "Heatmap" for overall health
  4. Export reorder list if items in CRITICAL status

Weekly Planning (Monday):

  1. Review "Forecasts" tab for high-usage items
  2. Generate 14-day projections
  3. Compare to current stock levels
  4. Schedule proactive orders

Monthly Review:

  1. Analyze alert frequency by location/item
  2. Adjust safety stock thresholds if needed
  3. Review forecast accuracy
  4. Update lead times if supplier performance changed

Alert Response Protocol

CRITICAL Status:

  • Action: Immediate emergency order
  • Timeline: Same-day processing
  • Communication: Notify facility managers

LOW Status:

  • Action: Regular order process
  • Timeline: Within 2 business days
  • Communication: Standard procurement workflow

Forecast Interpretation

High Confidence (stable patterns):

  • Historical usage shows consistent trend
  • Seasonal variations accounted for
  • Use for long-term planning

Low Confidence (erratic patterns):

  • New item with limited history
  • Recent demand spikes/drops
  • Verify with facility managers before major orders

Configuration & Customization

Adjusting Stock Thresholds

Edit stock_health_metrics dynamic table:

-- Current: CRITICAL if < lead_time_days coverage
-- To change to 1.5x lead time:
WHEN MAX(closing_stock) / NULLIF(AVG(issued), 0) < (MAX(lead_time_days) * 1.5) 
  THEN 'CRITICAL'

Modifying Reorder Quantities

Edit reorder_recommendations dynamic table:

-- Current: 2x safety factor
-- To change to 3x:
ROUND((avg_daily_usage * lead_time_days * 3) - current_stock, 0)

Adjusting Task Schedules

-- Tasks must be suspended before ALTER ... SET can change their schedule.

-- Change alert checking to every 15 minutes:
ALTER TASK process_stock_changes SUSPEND;
ALTER TASK process_stock_changes SET SCHEDULE = '15 MINUTE';
ALTER TASK process_stock_changes RESUME;

-- Change forecast refresh to twice daily:
ALTER TASK refresh_forecasts SUSPEND;
ALTER TASK refresh_forecasts SET SCHEDULE = 'USING CRON 0 6,18 * * * UTC';
ALTER TASK refresh_forecasts RESUME;

Adding New Locations/Items

Simply INSERT into daily_stock:

INSERT INTO daily_stock VALUES
('2026-01-09', 'New Clinic C', 'Aspirin 100mg', 500, 0, 45, 455, 10);

Dynamic tables will automatically include new entries on next refresh.


Monitoring & Maintenance

Health Checks

Daily:

-- Verify tasks are running
SELECT * FROM TABLE(INFORMATION_SCHEMA.TASK_HISTORY())
WHERE NAME = 'PROCESS_STOCK_CHANGES'
ORDER BY SCHEDULED_TIME DESC
LIMIT 5;

-- Check for errors
SELECT * FROM TABLE(INFORMATION_SCHEMA.TASK_HISTORY())
WHERE STATE = 'FAILED'
AND SCHEDULED_TIME > DATEADD(day, -1, CURRENT_TIMESTAMP());

Weekly:

-- Audit alert generation
SELECT 
    DATE_TRUNC('day', alert_timestamp) AS alert_date,
    stock_status,
    COUNT(*) AS alert_count
FROM stock_alerts
WHERE alert_timestamp > DATEADD(day, -7, CURRENT_TIMESTAMP())
GROUP BY 1, 2
ORDER BY 1 DESC;

Performance Optimization

If dashboard loads slowly:

  1. Increase warehouse size: ALTER WAREHOUSE COMPUTE_WH SET WAREHOUSE_SIZE = 'LARGE'
  2. Add result caching: wrap query calls in st.cache_data with a suitable TTL (e.g., one hour)
  3. Reduce data retention: Archive records older than 90 days

If tasks are delayed:

  1. Check warehouse auto-suspend: Set to minimum 5 minutes
  2. Reduce TARGET_LAG on dynamic tables if data allows
  3. Consider dedicated warehouse for task execution
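
A hedged sketch of the dedicated-warehouse option (names and sizing are illustrative); note that AUTO_SUSPEND is specified in seconds:

CREATE WAREHOUSE IF NOT EXISTS task_wh
  WAREHOUSE_SIZE = 'XSMALL'
  AUTO_SUSPEND = 300;  -- 5 minutes, per the guidance above

ALTER TASK process_stock_changes SUSPEND;
ALTER TASK process_stock_changes SET WAREHOUSE = task_wh;
ALTER TASK process_stock_changes RESUME;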

Data Retention Policy

Recommended retention periods:

  • daily_stock: 2 years (regulatory compliance)
  • stock_alerts: 6 months (operational history)
  • demand_forecasts: 3 months (validation tracking)

Archive older data:

CREATE TABLE daily_stock_archive AS
SELECT * FROM daily_stock
WHERE record_date < DATEADD(year, -2, CURRENT_DATE());

DELETE FROM daily_stock
WHERE record_date < DATEADD(year, -2, CURRENT_DATE());

Troubleshooting

Common Issues

Issue: Dynamic tables not refreshing

Solution:

-- Check target lag settings
SHOW DYNAMIC TABLES;

-- Manually refresh if needed
ALTER DYNAMIC TABLE stock_health_metrics REFRESH;

Issue: Forecasts returning empty results

Cause: Insufficient historical data (< 7 days)

Solution: Wait for more data to accumulate, or adjust the minimum-history threshold in the forecast_demand procedure

Issue: Tasks not executing

Solution:

-- Verify task is resumed
ALTER TASK process_stock_changes RESUME;

-- Check warehouse availability
SHOW WAREHOUSES;
-- Ensure COMPUTE_WH is not suspended

Issue: Streamlit session timeout

Solution: Refresh the browser page; the Snowpark session reconnects automatically


Security Considerations

Access Control

-- Create read-only role for dashboard users
CREATE ROLE stock_viewer;
GRANT USAGE ON DATABASE stock_management TO ROLE stock_viewer;
GRANT USAGE ON SCHEMA inventory TO ROLE stock_viewer;
GRANT USAGE ON WAREHOUSE COMPUTE_WH TO ROLE stock_viewer;
GRANT SELECT ON ALL TABLES IN SCHEMA inventory TO ROLE stock_viewer;
-- Dynamic tables are granted separately from tables and views
GRANT SELECT ON ALL DYNAMIC TABLES IN SCHEMA inventory TO ROLE stock_viewer;
GRANT SELECT ON ALL VIEWS IN SCHEMA inventory TO ROLE stock_viewer;

-- Create admin role for data management
CREATE ROLE stock_admin;
GRANT ALL ON DATABASE stock_management TO ROLE stock_admin;

Data Privacy

  • No personally identifiable information (PII) stored
  • Location codes instead of physical addresses
  • Generic item names without pricing information

Audit Logging

-- Track who accessed sensitive data
SELECT 
    user_name,
    query_text,
    start_time
FROM TABLE(INFORMATION_SCHEMA.QUERY_HISTORY())
WHERE query_text ILIKE '%stock_health_metrics%'
ORDER BY start_time DESC
LIMIT 100;

Future Enhancements

Planned Features

  1. Multi-tier Forecasting
    • ARIMA models for complex patterns
    • Seasonal decomposition
    • External factors (disease outbreaks, campaigns)
  2. Supplier Integration
    • API connections to vendor systems
    • Automated PO submission
    • Delivery tracking
  3. Mobile Notifications
    • SMS/email alerts for critical stock-outs
    • Push notifications via mobile app
    • Escalation workflows
  4. Advanced Analytics
    • ABC analysis (classify by importance)
    • Cost optimization
    • Expiry tracking and FEFO logic
  5. Collaboration Features
    • Comments on specific items
    • Approval workflows
    • Inter-facility transfers

Support & Resources

Sample Data Generation

For testing with larger datasets:

import pandas as pd
import numpy as np
from datetime import datetime, timedelta

locations = ['Hospital A', 'Hospital B', 'Clinic C', 'Warehouse D']
items = ['Paracetamol 500mg', 'Insulin 100IU', 'Amoxicillin 250mg', 
         'Ibuprofen 400mg', 'Metformin 500mg']

data = []
for loc in locations:
    for item in items:
        # Keep the supplier lead time constant per location/item pair
        lead_time = int(np.random.choice([5, 7, 10, 14]))
        for i in range(90):
            date = datetime.now() - timedelta(days=90 - i)
            # Carry the previous day's closing stock forward as today's opening
            opening = 1000 if i == 0 else data[-1][6]
            issued = np.random.randint(50, 200)
            # Replenish 500 units whenever stock would otherwise dip below 200
            received = 500 if opening - issued < 200 else 0
            closing = opening + received - issued

            data.append([date.date(), loc, item, opening, received,
                         issued, closing, lead_time])

df = pd.DataFrame(data, columns=['record_date', 'location', 'item_name',
                                 'opening_stock', 'received', 'issued',
                                 'closing_stock', 'lead_time_days'])
df.to_csv('sample_stock_data.csv', index=False)

Contributing

This project is designed for AI for Good initiatives. Contributions welcome:

  • Bug fixes and optimization
  • Additional forecasting algorithms
  • Integration with ERP/pharmacy systems
  • Localization and internationalization

License

Open source for healthcare and humanitarian applications.


Contact

For technical support or implementation questions, consult Snowflake community forums or engage professional services for enterprise deployments.


Version: 1.0
Last Updated: January 2026
Maintained By: Internal Development Team
