PI3 — Earth Observation Imager (Level 3: Interpretation & Trade-offs)
Mission Goal
Capture images on a schedule, store them with clear metadata, and extract meaning from the images (change detection, classification, or measurement) while acknowledging limitations.
Why This Matters
Earth observation is useful only when imaging is consistent, metadata is trustworthy, and interpretation is careful. Level 3 is about turning images into evidence.
What Data You Collect
- Image frames (JPEG/PNG)
- Capture time
- Optional: exposure settings, resolution
- Optional: paired sensor data (temp/light) for context
Hardware / Software Needed
- Raspberry Pi + camera (Pi Camera module or USB webcam)
- Python 3
- Optional: OpenCV for analysis
- Stable mounting (tripod/books) for repeatability
Inputs From Other Teams
- Data: Agree metadata naming and folder structure.
- Command & Control: Define mission question (e.g., “Detect change over time”).
- Risk & Safety: Define privacy boundaries (no faces, no personal spaces).
What You Must Produce (Deliverables)
- A capture script that takes images at fixed intervals.
- A dataset folder with at least 20 images + metadata CSV.
- A simple analysis output (change score, classification rule, or measurement).
- An interpretation note listing at least 2 limitations.
Step-by-Step Build
- Choose a mission target that is privacy-safe:
  - Sky brightness, cloud cover (no people)
  - Plant growth area, arrangement of desk objects
  - Shadow movement (sun position proxy)
- Mount the camera so the scene is stable and repeatable.
- Capture an image every 30–60 seconds for 20+ frames.
- Write one metadata CSV row per image:
  - filename, timestamp, notes
- Run a simple analysis:
  - Pixel difference between frames
  - Average brightness over time
  - Object count estimate (very simple thresholding)
- Write an interpretation: what changed, what might be misleading, what you’d do next.
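The capture and metadata steps above can be sketched as one script. This is a minimal sketch, assuming a USB webcam read through OpenCV (Pi Camera users would swap in their camera library); the function names, folder paths, and scene_id are our own, chosen to match the Data Format section:

```python
import csv
import os
import time
from datetime import datetime, timezone

def append_metadata_row(csv_path, filename, scene_id, notes=""):
    """Append one row (filename, ts_iso, scene_id, notes) to the metadata CSV."""
    new_file = not os.path.exists(csv_path)
    with open(csv_path, "a", newline="") as f:
        writer = csv.writer(f)
        if new_file:
            writer.writerow(["filename", "ts_iso", "scene_id", "notes"])
        ts = datetime.now(timezone.utc).isoformat(timespec="seconds")
        writer.writerow([filename, ts, scene_id, notes])
    return ts

def capture_frames(out_dir, csv_path, scene_id, interval_s=30, frames=20):
    """Grab `frames` images at a fixed interval and log each one."""
    import cv2  # deferred import: only needed when a camera is attached
    os.makedirs(out_dir, exist_ok=True)
    cap = cv2.VideoCapture(0)
    try:
        for i in range(frames):
            ok, frame = cap.read()
            if not ok:
                raise RuntimeError("camera read failed")
            name = f"{scene_id}_{i:04d}.jpg"  # predictable filenames
            cv2.imwrite(os.path.join(out_dir, name), frame)
            append_metadata_row(csv_path, name, scene_id)
            time.sleep(interval_s)
    finally:
        cap.release()

if __name__ == "__main__":
    capture_frames("data/pi3/images", "data/pi3/metadata.csv", "desk01")
```

Writing the CSV row at the moment of capture (rather than afterwards) is what keeps filenames and metadata from drifting apart.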
Data Format / Output
- Images in data/pi3/images/
- Metadata file: data/pi3/metadata.csv
- Recommended columns: filename,ts_iso,scene_id,notes
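A metadata file with these columns might look like this (the scene_id and timestamps are illustrative):

```
filename,ts_iso,scene_id,notes
desk01_0000.jpg,2024-05-01T10:00:00+00:00,desk01,first frame
desk01_0001.jpg,2024-05-01T10:00:30+00:00,desk01,
```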
Analysis Ideas
- Brightness vs time plot; discuss sensor/exposure effects.
- Change detection score per frame; identify “event moments.”
- Try two capture intervals and compare usefulness vs storage cost.
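The first two ideas reduce to simple per-frame statistics. A stdlib-only sketch of the arithmetic, working on flat lists of 0–255 grayscale pixel values (in practice you would load each frame and convert it to grayscale with OpenCV or Pillow; the function names and threshold are our own):

```python
def mean_brightness(pixels):
    """Average grayscale value (0-255) of one frame."""
    return sum(pixels) / len(pixels)

def change_score(prev, curr):
    """Mean absolute per-pixel difference between consecutive frames.

    0 means identical frames; larger values mean more change.
    """
    if len(prev) != len(curr):
        raise ValueError("frames must have the same number of pixels")
    return sum(abs(a - b) for a, b in zip(prev, curr)) / len(prev)

def event_frames(frames, threshold):
    """Indices of frames whose change vs the previous frame exceeds threshold."""
    return [i for i in range(1, len(frames))
            if change_score(frames[i - 1], frames[i]) > threshold]

# Tiny worked example: only the third frame differs from its predecessor.
# event_frames([[10, 10, 10], [10, 10, 10], [200, 10, 10]], threshold=5) -> [2]
```

Note that auto-exposure shifts the whole brightness distribution, so a high change score can reflect the camera rather than the scene; this belongs in the interpretation note.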
Success Criteria
- Dataset is consistent: stable framing + predictable filenames.
- Metadata matches images reliably.
- Interpretation includes uncertainty and trade-offs (exposure, lighting changes, camera auto-settings).
Evidence Checklist
- Folder listing showing images + metadata.csv
- 3 sample images (early/middle/late) and what you notice
- Analysis output (chart/table)
- Interpretation note with limitations
Safety & Privacy
- No faces, no private screens, no personal documents in frame.
- Store images locally; don’t upload without permission.
- Clearly label the camera area so others know it’s active.
Common Failure Modes
- Camera moves between captures → false “change.”
- Auto-exposure causes brightness shifts unrelated to the scene.
- Missing metadata rows or mismatched filenames.
- Analysis claims too much from noisy data.
Stretch Goals
- Lock camera settings (exposure/white balance) if supported.
- Add a “calibration target” in scene (grey card) to normalise brightness.
- Create a timelapse video from frames.
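For the timelapse stretch goal, one common route is ffmpeg (assumed to be installed on the Pi). A sketch that builds the command as a list so the glob and frame rate stay visible; the helper name is our own:

```python
import subprocess

def timelapse_cmd(image_glob, out_path, fps=10):
    """Build an ffmpeg command that stitches matching frames into a video."""
    return [
        "ffmpeg",
        "-framerate", str(fps),    # input frame rate (frames per second of video)
        "-pattern_type", "glob",   # let ffmpeg expand the wildcard
        "-i", image_glob,
        "-c:v", "libx264",         # widely playable codec
        "-pix_fmt", "yuv420p",     # compatibility with most players
        out_path,
    ]

if __name__ == "__main__":
    subprocess.run(timelapse_cmd("data/pi3/images/*.jpg", "timelapse.mp4"),
                   check=True)
```

Predictable filenames from the capture step are what make the glob pattern sort frames into the right order.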
Scaffolding Example (optional)
You are allowed to reuse structures and formats from other teams — but not their decisions.
Template: “Weather Satellite” data story
- Where was the sensor placed (shade/sun/indoors)?
- Sampling rate and duration
- Graph/table + one observation
- One limitation (noise, placement, calibration)
Example output files
data.csv, summary.txt, plot.png (optional)