RuView now generates **real-time 3D point clouds** by fusing camera depth estimation with WiFi CSI spatial sensing. Both streams feed simultaneously into a unified spatial model.
Both UIs update in real-time via WebSocket and auto-detect the sensing server on …
---
## Dense PointCloud (Camera + WiFi CSI Fusion)
RuView can generate real-time 3D point clouds by fusing camera depth estimation with WiFi CSI spatial sensing, producing a continuously updated spatial model of the environment.
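As a rough sketch of the camera-depth half of the fusion (not RuView's actual code), a per-pixel depth map can be back-projected into 3D points using pinhole camera intrinsics; the intrinsics values below are placeholders:

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth map of shape (H, W), in meters, into an (N, 3) point cloud."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]  # drop invalid (zero-depth) pixels

# Toy 2x2 depth map at 1 m, with hypothetical intrinsics
pts = depth_to_points(np.ones((2, 2)), fx=500.0, fy=500.0, cx=1.0, cy=1.0)
```

The CSI side would then refine or gate these points with its own occupancy estimates before they enter the spatial model.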
The pipeline captures frames, runs depth calibration (a grid search over scale/offset/gamma), trains occupancy thresholds, exports DPO preference pairs, and submits results to the ruOS brain.
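The depth-calibration step can be sketched as a brute-force grid search. This is a minimal illustration, assuming the objective is mean absolute error against a few reference depths; the parameter ranges are made up:

```python
import itertools
import numpy as np

def calibrate_depth(pred, ref):
    """Grid-search scale/offset/gamma so that scale * pred**gamma + offset fits ref."""
    best, best_err = None, float("inf")
    for scale, offset, gamma in itertools.product(
        np.linspace(0.5, 2.0, 16),    # hypothetical search ranges
        np.linspace(-0.5, 0.5, 11),
        np.linspace(0.8, 1.2, 9),
    ):
        err = np.mean(np.abs(scale * pred ** gamma + offset - ref))
        if err < best_err:
            best, best_err = (scale, offset, gamma), err
    return best, best_err

pred = np.array([0.5, 1.0, 2.0])
ref = 1.5 * pred  # synthetic ground truth: a pure scale of 1.5
(scale, offset, gamma), err = calibrate_depth(pred, ref)
```

A grid search like this is cheap enough to rerun per session, which matters when monocular depth models drift in scale between scenes.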
### Output Formats
- **PLY** — Standard 3D point cloud (ASCII, with RGB color)
- **Gaussian Splats** — JSON format for Three.js rendering
- **Brain Memories** — Spatial observations stored as `spatial-observation`, `spatial-motion`, `spatial-vitals`
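For context on the PLY output: an ASCII PLY point cloud is a plain-text header followed by one vertex per line. A minimal writer (a sketch, not RuView's actual exporter) might look like:

```python
def write_ply(path, points):
    """Write (x, y, z, r, g, b) tuples as an ASCII PLY point cloud with RGB color."""
    header = "\n".join([
        "ply",
        "format ascii 1.0",
        f"element vertex {len(points)}",
        "property float x",
        "property float y",
        "property float z",
        "property uchar red",
        "property uchar green",
        "property uchar blue",
        "end_header",
    ])
    with open(path, "w") as f:
        f.write(header + "\n")
        for x, y, z, r, g, b in points:
            f.write(f"{x} {y} {z} {r} {g} {b}\n")

# One red point at (0, 0, 1)
write_ply("cloud.ply", [(0.0, 0.0, 1.0, 255, 0, 0)])
```

Files in this layout open directly in viewers such as MeshLab or CloudCompare.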