SOP-ORD-03: Dispatch Ingestion v3.0

Automated processing of dispatch confirmation CSV files, updating order status to FULFILLED, creating shipment records, and monitoring fulfilment health.

Document ID: SOP-ORD-03-v3.0
Version: 3.0
Status: ✅ Production Ready
Last Updated: 2026-03-19
Owner: Protocol Raw Operations
Replaces: SOP-0Y v2.0 (renamed to SOP-ORD-03)
Review Date: 2026-06-19


Key Changes in v3.0

  • ✅ Renamed from SOP-0Y to SOP-ORD-03 to align with order pipeline taxonomy (SOP-ORD-01 → ORD-02 → ORD-03)
  • ✅ Make.com "Process Dispatch Files" scenario replaced by native pg_cron → pg_net → poll-dispatch-files Edge Function
  • ✅ File-level idempotency added: processed_3pl_files table prevents duplicate processing (file name + SHA256 hash dual guard)
  • ✅ Quarantine system added: dispatch_file_failures table captures failed files with structured error types and controlled reprocessing via fn_reprocess_dispatch_file_v1()
  • ✅ Monitoring rebuilt to SOP-MON-01 pattern: fn_check_dispatch_processing_health_v2() registered in run-monitor, scheduled every 15 minutes
  • ✅ RLS enabled on processed_3pl_files (service_role bypasses, anon blocked)
  • ✅ ops_events audit logging added to file processing pipeline
  • ✅ processing_duration_ms and orders_fulfilled columns added to processed_3pl_files
  • ❌ Removed "production ready for 100k+" claims pending real-world validation at scale

Purpose

Process dispatch confirmation CSV files uploaded to Supabase Storage, creating shipment records with tracking information and transitioning orders from sent to fulfilled. The system handles files from self-fulfilment (via fulfillment.html), 3PL partners, and future in-house warehouse operations through a single CSV interface.

Scope

All CSV files uploaded to the dispatch-files/inbox/ Storage bucket containing dispatch confirmations with valid order references and tracking numbers.

Related SOPs:

  • SOP-ORD-01: Order Ingestion (webhook capture → database)
  • SOP-ORD-02: 3PL Order Export (database → 3PL, sets orders to sent)
  • SOP-DLV-01: Courier Watchdog (tracking status monitoring post-dispatch)
  • SOP-MON-01: Monitoring and Alerting Architecture (master pattern)
  • SOP-PACK-01: Intelligent Packing Operations


Architecture

dispatch-files/inbox/ (Supabase Storage)
pg_cron (every 5 min) → pg_net → poll-dispatch-files (Edge Function)
    ├── List inbox → Filter CSV files
    ├── Idempotency check (processed_3pl_files + dispatch_file_failures)
    ├── Download file content
    ├── Compute SHA256 hash
    ├── Call fn_import_3pl_dispatch(dispatch_data jsonb) via RPC
    │     ├── Parse and validate rows
    │     ├── Create shipment records
    │     ├── Update orders: export_state = 'fulfilled'
    │     └── Skip duplicates within file
    ├── Record in processed_3pl_files (file name, hash, duration, counts)
    ├── Log to ops_events
    └── Archive to dispatch-files/archive/

    On failure:
    ├── Insert into dispatch_file_failures (quarantine)
    ├── Alert via ops-alerter → Slack (#ops-alerts or #ops-urgent)
    └── File remains in inbox for investigation

Monitoring:
pg_cron (every 15 min) → pg_net → run-monitor
    → fn_check_dispatch_processing_health_v2()
    → ops-alerter → Slack (if thresholds breached)

Order State Context

paid → allocated → queued → sent → fulfilled
                             ↑        ↑
                         SOP-ORD-02  SOP-ORD-03

SOP-ORD-03 receives orders that have been exported to the 3PL (or self-fulfilment queue) by SOP-ORD-02 with export_state = 'sent'. Upon successful dispatch confirmation, the order transitions to export_state = 'fulfilled'.


CSV File Format

Required Structure:

external_order_id,dispatch_time_utc,courier,service,tracking_number,sku,qty,parcel_count,notes,status
12294292865399,2025-10-31T14:30:00Z,DPD,Next Day,DPD123456789GB,STARTER-8KG,1,1,,DISPATCHED

Field Specifications:

| Field | Type | Required | Description | Example |
|---|---|---|---|---|
| external_order_id | String | Yes | Shopify Order ID | 12294292865399 |
| dispatch_time_utc | ISO DateTime | Yes | When dispatched (UTC) | 2025-10-31T14:30:00Z |
| courier | String | Yes | Courier company | DPD, Royal Mail, Evri, UPS |
| service | String | No | Service level | Next Day, Standard |
| tracking_number | String | Yes | Tracking reference | DPD123456789GB |
| sku | String | Yes | Product SKU | STARTER-8KG |
| qty | Integer | Yes | Quantity dispatched | 1 |
| parcel_count | Integer | No | Number of parcels | 1 |
| notes | String | No | Additional notes | |
| status | String | Yes | Must be 'DISPATCHED' | DISPATCHED |
File Rules:

  • Location: dispatch-files/inbox/
  • Format: .csv only
  • Size limit: 10MB per file
  • Batch size: Unlimited orders per file
  • Processing frequency: Every 5 minutes via pg_cron
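Row-level validation of this format can be sketched as below. This is an illustrative check against the field specifications above, not the actual fn_import_3pl_dispatch logic; the DispatchRow shape and validateRow name are assumptions.

```typescript
// Illustrative validation of one parsed CSV row against the field spec above.
// The interface and function names are hypothetical, not the real implementation.
interface DispatchRow {
  external_order_id: string;
  dispatch_time_utc: string;
  courier: string;
  service?: string;
  tracking_number: string;
  sku: string;
  qty: string;
  parcel_count?: string;
  notes?: string;
  status: string;
}

function validateRow(row: DispatchRow): string[] {
  const errors: string[] = [];
  // Required fields per the table above.
  const required: (keyof DispatchRow)[] = [
    "external_order_id", "dispatch_time_utc", "courier",
    "tracking_number", "sku", "qty", "status",
  ];
  for (const field of required) {
    if (!row[field] || String(row[field]).trim() === "") {
      errors.push(`missing required field: ${field}`);
    }
  }
  // dispatch_time_utc must be a parseable ISO datetime.
  if (row.dispatch_time_utc && Number.isNaN(Date.parse(row.dispatch_time_utc))) {
    errors.push("dispatch_time_utc is not a valid ISO datetime");
  }
  // qty must be a positive integer.
  if (row.qty && !/^[1-9]\d*$/.test(row.qty)) {
    errors.push("qty must be a positive integer");
  }
  // status must be exactly DISPATCHED.
  if (row.status && row.status !== "DISPATCHED") {
    errors.push("status must be 'DISPATCHED'");
  }
  return errors;
}
```

A row matching the example above passes with an empty error list; a row with status SHIPPED would be rejected.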


File Processing Pipeline

1. File Discovery and Filtering

The poll-dispatch-files Edge Function lists all files in dispatch-files/inbox/, filters to .csv files only, and processes them in upload order (oldest first, up to 20 files per cycle).

2. Idempotency Check (Dual Guard)

Before processing, each file is checked against two tables:

  1. processed_3pl_files: If the file name exists here, the file has already been successfully processed. It is silently archived and skipped.
  2. dispatch_file_failures: If the file name exists here with status pending_review, it is already quarantined. It is skipped to avoid repeated failure attempts.

This dual guard prevents both duplicate processing and infinite retry loops on known-bad files.
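The dual-guard decision can be sketched as a pure function. The inputs stand in for lookups against processed_3pl_files and dispatch_file_failures; the names and the FileDecision type are hypothetical.

```typescript
// Illustrative decision logic for the dual idempotency guard described above.
// The Sets stand in for database lookups; names are hypothetical.
type FileDecision = "process" | "archive_and_skip" | "skip_quarantined";

function decideFile(
  fileName: string,
  processedFileNames: Set<string>,      // file names present in processed_3pl_files
  pendingReviewFileNames: Set<string>,  // failures with status = 'pending_review'
): FileDecision {
  // Guard 1: already successfully processed → silently archive and skip.
  if (processedFileNames.has(fileName)) return "archive_and_skip";
  // Guard 2: known-bad file awaiting review → skip to avoid retry loops.
  if (pendingReviewFileNames.has(fileName)) return "skip_quarantined";
  return "process";
}
```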

3. Download and Hash

The file content is downloaded from Storage. A SHA256 hash is computed for the audit record, providing content-level verification alongside the file name guard.
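A minimal sketch of the hash step, using Node's crypto module so it is self-contained; the Edge Function runs on Deno, where crypto.subtle.digest would be the equivalent.

```typescript
// Compute the hex-encoded SHA256 of downloaded file content.
import { createHash } from "node:crypto";

function sha256Hex(content: string | Buffer): string {
  return createHash("sha256").update(content).digest("hex");
}
```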

4. RPC Processing

The Edge Function calls fn_import_3pl_dispatch(dispatch_data jsonb), which handles:

  • CSV row parsing and validation
  • Order existence and state checks (must be export_state = 'sent')
  • Shipment record creation with tracking information
  • Order state transition to fulfilled
  • Duplicate row detection within the same file

5. Success Path

On successful processing:

  • Record inserted into processed_3pl_files with file name, SHA256 hash, rows processed, orders fulfilled, and processing duration
  • ops_events entry logged with processing metadata
  • File moved to dispatch-files/archive/ with a timestamp prefix
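The timestamp-prefixed archive naming could look like the sketch below. The exact prefix format used in production is not shown in this SOP, so the archivePath name and ISO-string layout are assumptions.

```typescript
// Hypothetical sketch of the timestamp-prefixed archive naming.
// Injecting `now` keeps the function deterministic for testing.
function archivePath(fileName: string, now: Date = new Date()): string {
  return `archive/${now.toISOString()}_${fileName}`;
}
```

Note that raw ISO timestamps contain `:` characters; a production implementation may sanitise these for storage-key compatibility.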

6. Failure Path (Quarantine)

On any failure:

  • Record inserted into dispatch_file_failures with a structured error type and message
  • Alert sent via ops-alerter to Slack (severity tiered by error type)
  • File remains in inbox for investigation
  • Other files in the same batch continue processing (one bad file does not block others)


Database Schema

Core Tables (Unchanged from v2.0)

raw_ops.shipments — Shipment records with tracking information (created by fn_import_3pl_dispatch)

New/Modified Tables in v3.0

raw_ops.processed_3pl_files (modified — added columns)

-- Existing table, enhanced with:
--   processing_duration_ms INTEGER
--   orders_fulfilled INTEGER
-- Provides file-level idempotency via UNIQUE constraint on file_name
-- RLS enabled in v3.0

raw_ops.dispatch_file_failures (new)

CREATE TABLE raw_ops.dispatch_file_failures (
  id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
  file_name TEXT NOT NULL,
  file_path TEXT NOT NULL,
  file_size_bytes INTEGER,
  error_type TEXT NOT NULL,
  error_message TEXT,
  rows_total INTEGER,
  rows_succeeded INTEGER DEFAULT 0,
  rows_failed INTEGER DEFAULT 0,
  failed_row_details JSONB,
  retry_count INTEGER DEFAULT 0,
  last_attempt_at TIMESTAMPTZ DEFAULT now(),
  status TEXT DEFAULT 'pending_review',
  resolved_at TIMESTAMPTZ,
  resolution_note TEXT,
  created_at TIMESTAMPTZ DEFAULT now(),

  CONSTRAINT chk_dispatch_file_failure_status
    CHECK (status IN ('pending_review', 'reprocessing', 'resolved', 'permanent_failure'))
);

CREATE INDEX idx_dispatch_file_failures_status
  ON raw_ops.dispatch_file_failures(status)
  WHERE status = 'pending_review';

CREATE INDEX idx_dispatch_file_failures_file
  ON raw_ops.dispatch_file_failures(file_name);

ALTER TABLE raw_ops.dispatch_file_failures ENABLE ROW LEVEL SECURITY;

Error Types:

| error_type | Severity | Description |
|---|---|---|
| download_failed | critical | Could not download file from Storage |
| rpc_failed | critical | fn_import_3pl_dispatch RPC call failed |
| unexpected_error | critical | Unhandled exception during processing |
| empty_file | warning | File contains no content |
| processing_failed | warning | RPC returned failure (e.g. validation errors) |

Core Processing Function (Unchanged)

raw_ops.fn_import_3pl_dispatch(dispatch_data jsonb) — Parses dispatch data, validates rows, creates shipment records, updates order state. Internal duplicate prevention for rows within a single file.

Reprocess Function (New)

raw_ops.fn_reprocess_dispatch_file_v1(p_failure_id UUID)

Controlled reprocessing of quarantined files:

  1. Validates the failure record exists and is in pending_review status
  2. Updates status to reprocessing and increments retry_count
  3. Clears any stale processed_3pl_files record for the file name
  4. Logs to ops_events
  5. The next pg_cron cycle picks up the file from inbox and reprocesses it

-- Usage:
SELECT raw_ops.fn_reprocess_dispatch_file_v1('failure-uuid-here');

Monitoring Function (New)

raw_ops.fn_check_dispatch_processing_health_v2()

SOP-MON-01-compliant health monitor checking:

  • Quarantined files awaiting review (count and age)
  • Orders stuck in sent state beyond 48 hours (count and age)

Alert Thresholds:

| Condition | Channel | Severity |
|---|---|---|
| Failed files pending > 24 hours | #ops-urgent | critical |
| Orders stuck in sent > 7 days | #ops-urgent | critical |
| Any failed files pending | #ops-alerts | warning |
| More than 5 orders stuck in sent | #ops-alerts | warning |
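The routing of monitor counts onto these thresholds can be sketched as a pure function. The metric names and Alert shape are illustrative; the real fn_check_dispatch_processing_health_v2() output format is not shown in this SOP.

```typescript
// Illustrative mapping of monitor counts onto the alert thresholds above.
// Field and type names are hypothetical.
interface Alert {
  channel: "#ops-urgent" | "#ops-alerts";
  severity: "critical" | "warning";
  message: string;
}

function evaluateThresholds(metrics: {
  failedFilesPendingOver24h: number;
  failedFilesPending: number;
  ordersStuckInSentOver7d: number;
  ordersStuckInSent: number;
}): Alert[] {
  const alerts: Alert[] = [];
  if (metrics.failedFilesPendingOver24h > 0) {
    alerts.push({ channel: "#ops-urgent", severity: "critical",
      message: `${metrics.failedFilesPendingOver24h} failed file(s) pending > 24 hours` });
  }
  if (metrics.ordersStuckInSentOver7d > 0) {
    alerts.push({ channel: "#ops-urgent", severity: "critical",
      message: `${metrics.ordersStuckInSentOver7d} order(s) stuck in sent > 7 days` });
  }
  if (metrics.failedFilesPending > 0) {
    alerts.push({ channel: "#ops-alerts", severity: "warning",
      message: `${metrics.failedFilesPending} failed file(s) pending review` });
  }
  if (metrics.ordersStuckInSent > 5) {
    alerts.push({ channel: "#ops-alerts", severity: "warning",
      message: `${metrics.ordersStuckInSent} orders stuck in sent` });
  }
  return alerts;
}
```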

Edge Functions

poll-dispatch-files

Purpose: Lists files in inbox, downloads each CSV, calls RPC, archives successes, quarantines failures.

Location: edge-functions/supabase/functions/poll-dispatch-files/

Trigger: pg_cron every 5 minutes

Dependencies: Supabase Storage (dispatch-files bucket), fn_import_3pl_dispatch RPC, ops-alerter Edge Function

Key Behaviours:

  • Processes up to 20 files per cycle, oldest first (FIFO)
  • Dual idempotency check before processing
  • SHA256 hash computed for audit
  • Tiered alerting on failures (infrastructure errors → #ops-urgent, data errors → #ops-alerts)
  • Archive path uses a timestamp prefix: archive/{ISO-timestamp}_{original file name}
  • Non-fatal archive failures do not block processing (idempotency prevents reprocessing)

run-monitor (Updated)

Dispatch map entry added:

dispatch_processing_health: "fn_check_dispatch_processing_health_v2",

Scheduled Jobs

| Job Name | Schedule | Target | Purpose |
|---|---|---|---|
| poll-dispatch-files-every-5-min | */5 * * * * | poll-dispatch-files Edge Function | Process CSV files from inbox |
| monitor-ord-03-dispatch-health | */15 * * * * | run-monitor → dispatch_processing_health | Monitor quarantine and stuck orders |

Fulfilment Interface

fulfillment.html

Purpose: Brand-aligned HTML interface for manual dispatch recording during self-fulfilment phase.

Access: Password-protected (protocol2025). Acceptable for Phase A behind Cloudflare Access. Should migrate to proper auth (Supabase Auth or Cloudflare Access gate) in Phase B.

Workflow:

  1. Enter Order ID → system fetches customer, products, batch codes
  2. Select courier, enter tracking number
  3. Add to batch (collect multiple dispatches in one session)
  4. Submit batch → creates a single CSV in dispatch-files/inbox/
  5. poll-dispatch-files processes it within 5 minutes

Key Features:

  • Order lookup prevents re-keying
  • Duplicate detection within session
  • Brand-styled (Protocol Raw visual identity)
  • Batch traceability (links dispatches to batch codes)
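The "submit batch → single CSV" step could be serialised as below. The BatchDispatch shape and buildDispatchCsv name are assumptions, and this sketch does not escape commas or quotes in field values; it simply emits the column order defined in the CSV File Format section.

```typescript
// Hypothetical serialisation of a session's batch into the dispatch CSV format.
// Column order matches the Required Structure in this SOP.
interface BatchDispatch {
  orderId: string;
  dispatchedAtUtc: string;  // ISO datetime, UTC
  courier: string;
  service: string;
  trackingNumber: string;
  sku: string;
  qty: number;
}

function buildDispatchCsv(batch: BatchDispatch[]): string {
  const header =
    "external_order_id,dispatch_time_utc,courier,service,tracking_number,sku,qty,parcel_count,notes,status";
  const rows = batch.map(d =>
    [d.orderId, d.dispatchedAtUtc, d.courier, d.service, d.trackingNumber,
     d.sku, String(d.qty), "1", "", "DISPATCHED"].join(","),
  );
  return [header, ...rows].join("\n");
}
```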


Monitoring

Automated Monitoring (SOP-MON-01 Pattern)

pg_cron (*/15) → pg_net → run-monitor
    → fn_check_dispatch_processing_health_v2()
    → monitoring_runs (result logged)
    → ops-alerter → Slack (if thresholds breached)

Key Metrics

| Metric | Query | Target |
|---|---|---|
| Files processed today | SELECT COUNT(*) FROM raw_ops.processed_3pl_files WHERE processed_at >= CURRENT_DATE | All files processed within 5 min |
| Pending quarantine files | SELECT COUNT(*) FROM raw_ops.dispatch_file_failures WHERE status = 'pending_review' | 0 |
| Orders stuck in sent | SELECT COUNT(*) FROM raw_ops.orders WHERE export_state = 'sent' AND exported_at < now() - interval '48 hours' | 0 |
| Fulfilment latency | Time from exported_at to dispatched_at | < 48 hours |
| Processing error rate | Failed files / total files | < 1% |

Metabase Dashboard (To Build)

| Question | Description |
|---|---|
| Files processed per day | Daily count from processed_3pl_files |
| Average processing time | Mean processing_duration_ms from processed_3pl_files |
| Failure rate | Failed / (failed + processed) by day |
| Orders stuck in sent | Count with age breakdown |
| Fulfilment latency distribution | Time from export to dispatch, grouped by day |
| Courier breakdown | Orders fulfilled by courier |
| Daily shipments | Count from shipments table by day |

Manual Diagnostic Queries

-- Recent file processing
SELECT file_name, rows_processed, orders_fulfilled, processing_duration_ms, processed_at
FROM raw_ops.processed_3pl_files
ORDER BY processed_at DESC LIMIT 10;

-- Quarantined files
SELECT file_name, error_type, error_message, retry_count, status, created_at
FROM raw_ops.dispatch_file_failures
ORDER BY created_at DESC;

-- Orders stuck in sent
SELECT shopify_order_id, exported_at, now() - exported_at AS stuck_duration
FROM raw_ops.orders
WHERE export_state = 'sent'
ORDER BY exported_at ASC;

-- Monitor execution history
SELECT run_at, status, duration_ms, result_summary, result_json
FROM raw_ops.monitoring_runs
WHERE check_name = 'dispatch_processing_health'
ORDER BY run_at DESC LIMIT 10;

-- Recent shipments
SELECT shopify_order_id, courier, tracking_number, dispatched_at
FROM raw_ops.shipments
ORDER BY dispatched_at DESC LIMIT 20;

-- Daily fulfilment metrics
SELECT
  DATE(dispatched_at) AS dispatch_date,
  COUNT(*) AS orders_fulfilled,
  COUNT(DISTINCT courier) AS couriers_used
FROM raw_ops.shipments
WHERE dispatched_at >= CURRENT_DATE - INTERVAL '7 days'
GROUP BY DATE(dispatched_at)
ORDER BY dispatch_date DESC;

Error Handling

Common Issues

| Issue | Cause | Resolution |
|---|---|---|
| Order not found | Invalid order ID in CSV | Check Shopify order exists, verify ID format |
| Order already fulfilled | Duplicate dispatch record | Skipped automatically (within-file duplicate prevention) |
| Missing tracking number | Empty or invalid tracking field | Row rejected, logged in processing log |
| Wrong order state | Order not in sent status | Ensure order exported via SOP-ORD-02 first |
| CSV format error | Wrong headers or field count | File quarantined; review and fix CSV structure |
| File download failed | Storage connectivity issue | Quarantined as download_failed; auto-retried on next cycle if moved back to inbox |
| RPC call failed | Database connectivity or function error | Quarantined as rpc_failed; alert sent to #ops-urgent |

Recovery Procedures

Quarantined file with data errors:

  1. Check dispatch_file_failures for error details
  2. Fix the CSV file
  3. Re-upload the corrected file to dispatch-files/inbox/ with a new file name
  4. Mark the failure as resolved:

UPDATE raw_ops.dispatch_file_failures
SET status = 'resolved',
    resolved_at = now(),
    resolution_note = 'Fixed and re-uploaded as new file'
WHERE id = 'uuid';

Quarantined file for reprocessing (same file):

SELECT raw_ops.fn_reprocess_dispatch_file_v1('failure-uuid');
-- File will be picked up on next 5-min cycle

Order stuck in wrong state:

-- Reset order for re-processing (use carefully)
UPDATE raw_ops.orders
SET export_state = 'sent'
WHERE shopify_order_id = '[order_id]';


Storage Structure

dispatch-files/              (Supabase Storage bucket)
├── inbox/                   # New files to process
└── archive/                 # Processed files: {ISO-timestamp}_{original file name}

Files are archived with timestamp prefix to preserve history while keeping inbox clean. Archive retention is unlimited.


Integration Points

Upstream: SOP-ORD-02 (3PL Order Export)

Orders must be exported with export_state = 'sent' before dispatch processing can fulfil them. SOP-ORD-03 rejects dispatch confirmations for orders not in sent state.

Downstream: SOP-DLV-01 (Courier Watchdog)

Once a shipment record is created with tracking information, SOP-DLV-01 monitors delivery status via courier APIs. The handoff is the shipment record in raw_ops.shipments.

Fulfilment Sources

| Source | Mechanism | Phase |
|---|---|---|
| Self-fulfilment | fulfillment.html → CSV → inbox | Phase A (Months 0-6) |
| 3PL partner | Automated CSV upload to inbox | Phase B (Months 6-18) |
| In-house warehouse | Barcode scanner → CSV → inbox | Phase C (18+ months) |

No system changes required between phases. The CSV interface scales across all fulfilment models.


Capacity

| Metric | Current | Notes |
|---|---|---|
| Orders per CSV file | 1,000+ | Tested Oct 2025 |
| Processing cycles per hour | 12 | Every 5 minutes |
| Daily capacity | 17,000+ orders | 12 cycles × ~1,400 orders |
| Processing time per file | < 5 seconds (50 orders) | Measured via processing_duration_ms |
| Files per cycle | Up to 20 | FIFO processing |

Outstanding Items

Must Do (Before Seed Deck)

| Item | Detail | Effort | Owner |
|---|---|---|---|
| Metabase dashboard | Build the 7 questions defined in the Monitoring section | 2-3 hours (Claude Code) | Anton |
| Investigate stuck orders | 8 orders in sent state for 150+ days detected by monitor. Likely stale test records. Resolve or archive | 30 min (manual) | Anton |
| Deactivate Make.com scenario | After 48 hours of stable operation, toggle OFF the "Process Dispatch Files" Make.com scenario. Do not delete | 5 min (manual) | Anton |

Should Do (Phase A)

| Item | Detail | Effort |
|---|---|---|
| Fulfilment interface auth | Migrate from password (protocol2025) to Supabase Auth or a Cloudflare Access gate | 2-3 hours |
| Storage event triggers | When Supabase adds native Storage event hooks, replace pg_cron polling with event-driven processing (file uploaded → trigger → process immediately) | 1-2 hours |

Parked (Phase B)

| Item | Detail | Trigger |
|---|---|---|
| Barcode scanner integration | Add barcode scanning to fulfilment interface for warehouse operations | In-house warehouse decision |
| Multi-station workflow | Shared fulfilment interface for multiple packing stations | Volume exceeds single-station capacity |
| Courier API direct integration | Replace CSV-based dispatch with direct courier API booking (DPD, etc.) | Volume justifies API integration cost |

Deployment Checklist

Database Objects

  • [x] shipments table (existing)
  • [x] processed_3pl_files table (enhanced with processing_duration_ms, orders_fulfilled)
  • [x] dispatch_file_failures quarantine table (new)
  • [x] fn_import_3pl_dispatch() core processing function (existing)
  • [x] fn_reprocess_dispatch_file_v1() controlled reprocess function (new)
  • [x] fn_check_dispatch_processing_health_v2() monitoring function (new)
  • [x] RLS enabled on processed_3pl_files
  • [x] RLS enabled on dispatch_file_failures
  • [x] Indexes on dispatch_file_failures (status partial, file_name)

Edge Functions

  • [x] poll-dispatch-files enhanced with quarantine, idempotency, audit logging
  • [x] run-monitor updated with dispatch_processing_health check
  • [x] ops-alerter available for Slack delivery

Scheduled Jobs

  • [x] poll-dispatch-files-every-5-min pg_cron job (every 5 min)
  • [x] monitor-ord-03-dispatch-health pg_cron job (every 15 min)

Integrations

  • [x] Supabase Storage: dispatch-files bucket (inbox + archive)
  • [x] Slack channels: #ops-alerts, #ops-urgent

Deactivated (Pending 48-Hour Verification)

  • [ ] Make.com "Process Dispatch Files" scenario (toggle OFF, do not delete)

Rollback

Rollback files saved in rollback/ord-03/: original poll-dispatch-files, run-monitor, fn_import_3pl_dispatch SQL, and cron job config.


Security and Compliance

Access Control

  • Service role authentication for all database operations
  • RLS enabled on all dispatch tables (anon blocked, service_role bypasses)
  • Password-protected fulfilment interface (Phase A; upgrade planned for Phase B)
  • Encrypted storage for all CSV files
  • No PII in error messages or Slack alerts

Audit Trail

  • ops_events entry for every file processed
  • SHA256 hash recorded for content verification
  • Processing duration tracked for performance monitoring
  • Complete chain: order → allocation → batch → dispatch → shipment
  • Quarantine records preserved for investigation

Version History

| Version | Date | Changes | Author |
|---|---|---|---|
| 3.0 | 2026-03-19 | Renamed from SOP-0Y to SOP-ORD-03. Replaced Make.com polling with pg_cron + Edge Function. Added file-level idempotency (dual guard). Added quarantine system with controlled reprocessing. Rebuilt monitoring to SOP-MON-01 pattern. Enabled RLS. Added ops_events audit logging. Added SHA256 file hashing. | Anton / Claude |
| 2.0 | 2025-10-31 | Added fulfillment.html interface, order lookup, batch processing, Protocol Raw brand styling, frozen food workflow guidance, error handling and monitoring. | Anton |
| 1.0 | 2025-10-25 | Initial CSV processing implementation, basic file upload/archive system. | Anton |

Escalation Path

  1. Check Slack alerts (#ops-alerts, #ops-urgent) and this documentation
  2. Query dispatch_file_failures for quarantined files
  3. Query monitoring_runs for execution history: SELECT * FROM raw_ops.monitoring_runs WHERE check_name = 'dispatch_processing_health' ORDER BY run_at DESC LIMIT 10;
  4. Review Supabase Dashboard logs (Edge Functions, Database)
  5. If unresolved: Escalate to Anton

End of SOP-ORD-03 v3.0