Sensor fusion
Sensor fusion is a set of three drone-centric visualisation layers that ingest the selected aircraft’s live telemetry and paint it directly onto the Cesium 3D scene. Use sensor fusion to answer questions like: has the drone actually seen this patch of ground? Where did we get the most detections? What does the thermal image look like projected onto the terrain?
Sensor fusion is Cesium-only. It’s distinct from Panoptic — Panoptic is external feeds (ADS-B, earthquakes, CCTV); sensor fusion is our own drone’s sensor data.
The three layers
Open the Layers sidebar → FUSION tab. Each layer is an independent toggle:
1. Coverage grid (green / red)
Quantises the ground into ~22 m × 22 m cells and colours each cell green when the aircraft’s camera has actually looked at it. As the drone flies a grid pattern, the covered area fills in in real time.
When a SAR polygon is active, uncovered cells inside the SAR boundary are additionally tinted red, so you instantly see coverage gaps.
- Cells use a 0.0002° (≈22 m) quantisation — tight enough to track spot coverage, loose enough to keep the grid visible at Cesium’s default zoom.
- Covered cells are persisted for the operation; switching drones doesn’t erase what the previous drone already covered.
- Rendered as a canvas overlay on the globe via a custom Cesium primitive.
2. Detection density heatmap
Every AI-detected object (from YOLO or DJI SEI) has its screen-space bounding box projected through the live camera footprint into ground coordinates via bilinear interpolation. The resulting ground point is stamped into a grid with the same cell size as the coverage grid, but each cell accumulates a detection count instead of a covered/uncovered flag.
Rendered as a scatter-plot heatmap (blue → yellow → red as count grows). Great for:
- Quickly seeing where the drone has been accumulating hits.
- Identifying hot-spots during SAR — clusters of PERSON_DETECTED stamps point to where a missing person might be.
- Post-mission analytics — how much of the search effort actually produced detections.
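A sketch of the accumulate-and-colour step, assuming the same 0.0002° quantisation as the coverage grid. The function names and the exact ramp breakpoints are illustrative; the blue → yellow → red progression is from this page.

```typescript
const CELL = 0.0002; // same quantisation as the coverage grid

// Per-cell detection counts. Returns the new count for the stamped cell.
const counts = new Map<string, number>();

function stampDetection(lonDeg: number, latDeg: number): number {
  const key = `${Math.floor(lonDeg / CELL)}:${Math.floor(latDeg / CELL)}`;
  const next = (counts.get(key) ?? 0) + 1;
  counts.set(key, next);
  return next;
}

// Map a count to an RGB colour on a blue → yellow → red ramp,
// normalised against the current maximum count.
function heatColour(count: number, max: number): [number, number, number] {
  const t = Math.min(count / max, 1);
  return t < 0.5
    ? [Math.round(510 * t), Math.round(510 * t), Math.round(255 * (1 - 2 * t))] // blue → yellow
    : [255, Math.round(255 * (2 - 2 * t)), 0];                                  // yellow → red
}
```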
3. Thermal mosaic
Only available when the drone is in FLIR mode. As the thermal camera’s footprint paints the ground, the coverage cells render with a black → orange → white thermal ramp based on the thermal pixel values. Effectively a live mosaic of the area’s thermal image draped onto terrain.
Turn thermal off on the drone and this layer switches back to plain green for the coverage cells.
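The black → orange → white ramp could look like the sketch below. The intermediate orange stop (255, 165, 0) and the linear interpolation are assumptions; only the ramp endpoints come from this page.

```typescript
// Map a thermal pixel value normalised to [0, 1] onto a
// black → orange → white ramp (orange stop assumed at 255,165,0).
function thermalColour(t: number): [number, number, number] {
  const x = Math.min(Math.max(t, 0), 1);
  if (x < 0.5) {
    // black (0,0,0) → orange (255,165,0)
    return [Math.round(510 * x), Math.round(330 * x), 0];
  }
  // orange (255,165,0) → white (255,255,255)
  return [255, Math.round(165 + 180 * (x - 0.5)), Math.round(510 * (x - 0.5))];
}
```

When the drone leaves FLIR mode, cells would simply fall back to the plain green coverage colour instead of this ramp.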
Starting / stopping
Toggles are in the FUSION tab. Independent — you can have coverage on without heatmap, or vice versa. Default at the start of an operation: all three off.
When a layer is turned on:
- The manager starts listening to the selected drone’s FlightStatus + detection stream.
- The canvas overlay initialises and registers a per-frame render callback.
When a layer is turned off:
- Overlay is torn down.
- Collected data remains in memory — flip back on and you keep your history.
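The on/off lifecycle can be sketched as a tiny state holder: enabling creates the overlay, disabling destroys it, and the accumulated cell data deliberately survives the off/on cycle. The `Overlay` interface and class names here are illustrative, not the real manager’s API.

```typescript
interface Overlay {
  destroy(): void; // tear down the canvas + render callback
}

class FusionLayer {
  private overlay: Overlay | null = null;
  readonly cells = new Set<string>(); // survives off/on cycles

  enable(createOverlay: () => Overlay): void {
    if (!this.overlay) this.overlay = createOverlay();
  }

  disable(): void {
    this.overlay?.destroy();
    this.overlay = null; // data in `cells` is intentionally kept
  }

  get active(): boolean {
    return this.overlay !== null;
  }
}
```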
Interaction with drone selection
Sensor fusion follows the selected drone. When you select a different drone in the fleet tile:
- Coverage grid keeps the data collected by the previous drone (coverage is cumulative over the operation).
- Heatmap keeps its points.
- Thermal mosaic clears — the new drone may not be thermal, and the baked ramp from the old drone isn’t meaningful for the new one.
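The drone-switch rule above boils down to: carry coverage and heatmap state over, reset only the thermal mosaic. A minimal sketch, with the state shape and function name as assumptions:

```typescript
interface FusionState {
  coverage: Set<string>;          // covered cell keys — cumulative over the operation
  heatmap: Map<string, number>;   // per-cell detection counts — kept
  thermal: Map<string, number>;   // baked thermal values — cleared on switch
}

// On drone selection, keep coverage + heatmap; start the thermal mosaic fresh.
function onDroneSelected(state: FusionState): FusionState {
  return { ...state, thermal: new Map() };
}
```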
Simulation mode
With debug mode on in the settings drawer, you can start a sensor-fusion simulator:
- Generates synthetic coverage cells along a scripted flight path.
- Produces fake detections to exercise the heatmap.
- Auto-stops when you turn debug mode off.
Used for training and for demonstrating sensor fusion without a live drone.
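A simulator like the one described might generate cells along a serpentine (lawn-mower) sweep, one synthetic cell per grid step. This is a hypothetical sketch — the real simulator’s path script and API are not documented here; only the 0.0002° step matches the grid spec.

```typescript
// Yield synthetic coverage-cell centres along a serpentine sweep of
// `cols` × `rows` grid cells starting at (startLon, startLat).
function* simulatePath(
  startLon: number,
  startLat: number,
  cols: number,
  rows: number,
): Generator<{ lon: number; lat: number }> {
  for (let r = 0; r < rows; r++) {
    for (let c = 0; c < cols; c++) {
      const col = r % 2 === 0 ? c : cols - 1 - c; // reverse direction on odd rows
      yield { lon: startLon + col * 0.0002, lat: startLat + r * 0.0002 };
    }
  }
}
```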
What sensor fusion is NOT
- Not AIS / ADS-B / radar fusion. External-sensor fusion is a separate thing and lives in Panoptic (ADS-B only in the current build — AIS + radar are planned).
- Not a recording. Sensor fusion is live; closing the map tile or switching missions discards the in-memory canvas state (though the underlying Firestore records of detections remain in the mission).
- Not a replacement for video. The thermal mosaic is a projection; it has artifacts where the camera footprint estimate is off. For decision-making, stay on the live drone stream.
Performance notes
- Coverage + heatmap together add ~1 ms per frame on a modern laptop GPU — negligible.
- Thermal mosaic is more expensive — on GPU-constrained hardware, you may want to toggle it off during high-motion passes.
- Canvas overlays cull automatically when the camera zooms out beyond the data area.
Related
- Cesium 3D
- Panoptic — external feeds, complement to sensor fusion.
- Drone stream tile — the live video source that feeds detection data.
- Settings drawer → debug mode — gates the simulator.