
Commit 4ab6935

Update README + user guide with dense point cloud features
Added pointcloud section to README (quick start, CLI, performance). Added comprehensive user guide section: setup, sensors, commands, pipeline components, API endpoints, training, output formats, deep room scan, ESP32 provisioning.

Co-Authored-By: claude-flow <ruv@ruv.net>
1 parent ae792aa commit 4ab6935

2 files changed (+135, -0)

README.md

Lines changed: 36 additions & 0 deletions
@@ -96,6 +96,42 @@ node scripts/mincut-person-counter.js --port 5006 # Correct person counting
>

---

### Real-Time Dense Point Cloud (NEW)

RuView now generates **real-time 3D point clouds** by fusing camera depth + WiFi CSI + mmWave radar. All sensors stream simultaneously into a unified spatial model.

| Sensor | Data | Integration |
|--------|------|-------------|
| **Camera** | MiDaS monocular depth (GPU) | 640×480 → 19,200+ depth points per frame |
| **ESP32 CSI** | ADR-018 binary frames (UDP) | RF tomography → 8×8×4 occupancy grid |
| **WiFlow Pose** | 17 COCO keypoints from CSI | Skeleton overlay on point cloud |
| **Vital Signs** | Breathing rate from CSI phase | Stored in ruOS brain every 60s |
| **Motion** | CSI amplitude variance | Adaptive capture rate (skip depth when still) |

**Quick start:**

```bash
cd rust-port/wifi-densepose-rs
cargo build --release -p wifi-densepose-pointcloud
./target/release/ruview-pointcloud serve --port 9880
# Open http://localhost:9880 for live 3D viewer
```

**CLI commands:**

```bash
ruview-pointcloud demo                        # synthetic demo
ruview-pointcloud serve --port 9880           # live server + Three.js viewer
ruview-pointcloud capture --output room.ply   # capture to PLY
ruview-pointcloud train                       # depth calibration + DPO pairs
ruview-pointcloud cameras                     # list available cameras
ruview-pointcloud csi-test --count 100        # send test CSI frames
```

**Performance:** 22 ms end-to-end pipeline, 905 req/s API throughput, 40K-voxel room model from 20 frames.

**Brain integration:** Spatial observations (motion, vitals, skeleton, occupancy) sync to the ruOS brain every 60 seconds for agent reasoning.

See [PR #405](https://github.com/ruvnet/RuView/pull/405) for full details.

### What's New in v0.7.0
<details>

docs/user-guide.md

Lines changed: 99 additions & 0 deletions
@@ -536,6 +536,105 @@ Both UIs update in real-time via WebSocket and auto-detect the sensing server on

---

## Dense Point Cloud (Camera + WiFi CSI Fusion)

RuView can generate real-time 3D point clouds by fusing camera depth estimation with WiFi CSI spatial sensing, producing a continuously updated spatial model of the environment.

### Setup

```bash
# Build the pointcloud binary
cd rust-port/wifi-densepose-rs
cargo build --release -p wifi-densepose-pointcloud

# Start the server (auto-detects camera + CSI)
./target/release/ruview-pointcloud serve --port 9880
```

Open `http://localhost:9880` for the interactive Three.js 3D viewer.

### Sensors

| Sensor | Auto-detected | Data |
|--------|---------------|------|
| Camera (`/dev/video0`) | Yes (Linux UVC) | RGB frames → MiDaS depth → 3D points |
| ESP32 CSI (UDP:3333) | Yes (if provisioned) | ADR-018 binary → occupancy + pose + vitals |
| MiDaS depth server (port 9885) | Optional | GPU-accelerated neural depth estimation |

### Commands

| Command | Description |
|---------|-------------|
| `ruview-pointcloud serve --port 9880` | Start HTTP server + Three.js viewer |
| `ruview-pointcloud demo` | Generate synthetic point cloud (no hardware needed) |
| `ruview-pointcloud capture --output room.ply` | Capture single frame to PLY file |
| `ruview-pointcloud cameras` | List available cameras |
| `ruview-pointcloud train --data-dir ./data` | Depth calibration + occupancy training |
| `ruview-pointcloud csi-test --count 100` | Send test CSI frames (no ESP32 needed) |

### Pipeline Components

1. **ADR-018 Parser** — Decodes ESP32 CSI binary frames from UDP, extracts I/Q subcarrier amplitudes and phases
2. **WiFlow Pose** — 17 COCO keypoint estimation from CSI (loads `wiflow-v1.json`, 186K params)
3. **Vital Signs** — Breathing rate from CSI phase analysis (peak counting on a stable subcarrier)
4. **Motion Detection** — CSI amplitude variance over 20 frames, triggers adaptive capture
5. **RF Tomography** — Backprojection from per-node RSSI to an 8×8×4 occupancy grid
6. **Camera Depth** — MiDaS monocular depth (GPU) with luminance+edge fallback
7. **Sensor Fusion** — Voxel-grid merging of camera depth + CSI occupancy
8. **Brain Bridge** — Stores spatial observations in the ruOS brain every 60 seconds

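
The motion-detection step can be sketched as a variance check over recent frame amplitudes. The 20-frame window comes from the list above; the threshold value, function names, and data layout are illustrative assumptions, not the shipped implementation:

```rust
/// Mean CSI amplitude of one frame (illustrative sketch, not RuView's API).
fn mean_amplitude(frame: &[f64]) -> f64 {
    frame.iter().sum::<f64>() / frame.len() as f64
}

/// Variance of per-frame mean amplitudes over the most recent `window` frames.
fn motion_score(history: &[Vec<f64>], window: usize) -> f64 {
    let recent: Vec<f64> = history.iter().rev().take(window)
        .map(|f| mean_amplitude(f)).collect();
    let n = recent.len() as f64;
    let mean = recent.iter().sum::<f64>() / n;
    recent.iter().map(|a| (a - mean).powi(2)).sum::<f64>() / n
}

/// A still scene has near-zero variance, so depth capture can be skipped.
fn is_moving(history: &[Vec<f64>], threshold: f64) -> bool {
    motion_score(history, 20) > threshold
}

fn main() {
    let still: Vec<Vec<f64>> = (0..20).map(|_| vec![1.0; 64]).collect();
    let moving: Vec<Vec<f64>> = (0..20)
        .map(|i| vec![1.0 + 0.5 * (i as f64).sin(); 64])
        .collect();
    println!("still={} moving={}", is_moving(&still, 1e-3), is_moving(&moving, 1e-3));
}
```

This is the mechanism behind the adaptive capture rate: while the score stays under the threshold, the pipeline can skip expensive depth frames.
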
### API Endpoints

| Endpoint | Method | Returns |
|----------|--------|---------|
| `/health` | GET | `{"status": "ok"}` |
| `/api/status` | GET | Camera, CSI, pipeline state, vitals, motion |
| `/api/cloud` | GET | Point cloud (up to 1000 points) + pipeline data |
| `/api/splats` | GET | Gaussian splats for Three.js rendering |
| `/` | GET | Interactive Three.js 3D viewer |

### Training

The training pipeline calibrates depth estimation and occupancy detection:

```bash
ruview-pointcloud train --data-dir ~/.local/share/ruview/training --brain http://127.0.0.1:9876
```

This captures frames, runs depth calibration (grid search over scale/offset/gamma), trains occupancy thresholds, exports DPO preference pairs, and submits results to the ruOS brain.

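
The depth-calibration pass can be pictured as an exhaustive grid search: apply a correction of the form `scale * depth^gamma + offset` and keep the triple with the lowest error against reference depths. A minimal sketch, assuming a simple MSE objective and small candidate grids (the parameter ranges, correction formula, and names are illustrative, not the shipped trainer):

```rust
/// Hypothetical depth correction: scale * d^gamma + offset.
fn correct(d: f64, scale: f64, offset: f64, gamma: f64) -> f64 {
    scale * d.powf(gamma) + offset
}

/// Mean squared error of corrected predictions against reference depths.
fn mse(pred: &[f64], truth: &[f64], s: f64, o: f64, g: f64) -> f64 {
    pred.iter().zip(truth)
        .map(|(&p, &t)| (correct(p, s, o, g) - t).powi(2))
        .sum::<f64>() / pred.len() as f64
}

/// Exhaustive search over candidate (scale, offset, gamma) triples.
fn calibrate(pred: &[f64], truth: &[f64]) -> (f64, f64, f64) {
    let mut best = (1.0, 0.0, 1.0);
    let mut best_err = f64::INFINITY;
    for s in [0.5, 1.0, 1.5, 2.0] {
        for o in [-0.5, 0.0, 0.5] {
            for g in [0.8, 1.0, 1.2] {
                let err = mse(pred, truth, s, o, g);
                if err < best_err {
                    best_err = err;
                    best = (s, o, g);
                }
            }
        }
    }
    best
}

fn main() {
    // Synthetic data: the true depth is exactly 2x the raw prediction.
    let pred = [0.5, 1.0, 1.5, 2.0];
    let truth: Vec<f64> = pred.iter().map(|d| 2.0 * d).collect();
    let (s, o, g) = calibrate(&pred, &truth);
    println!("scale={s} offset={o} gamma={g}"); // scale=2 offset=0 gamma=1
}
```
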
### Output Formats

- **PLY** — Standard 3D point cloud (ASCII, with RGB color)
- **Gaussian Splats** — JSON format for Three.js rendering
- **Brain Memories** — Spatial observations stored as `spatial-observation`, `spatial-motion`, `spatial-vitals`

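
The PLY output is a plain-text format: a small header followed by one vertex per line. A minimal writer sketch, assuming a hypothetical `Point` struct (not RuView's actual types):

```rust
/// Hypothetical colored point; RuView's internal types may differ.
struct Point { x: f32, y: f32, z: f32, r: u8, g: u8, b: u8 }

/// Serialize points as ASCII PLY with per-vertex RGB color.
fn to_ply(points: &[Point]) -> String {
    let mut out = String::from("ply\nformat ascii 1.0\n");
    out.push_str(&format!("element vertex {}\n", points.len()));
    out.push_str("property float x\nproperty float y\nproperty float z\n");
    out.push_str("property uchar red\nproperty uchar green\nproperty uchar blue\n");
    out.push_str("end_header\n");
    for p in points {
        out.push_str(&format!("{} {} {} {} {} {}\n", p.x, p.y, p.z, p.r, p.g, p.b));
    }
    out
}

fn main() {
    let cloud = [Point { x: 0.0, y: 0.0, z: 1.0, r: 255, g: 0, b: 0 }];
    print!("{}", to_ply(&cloud));
}
```

Any standard point-cloud viewer (MeshLab, CloudCompare) can open the resulting file.
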
### Deep Room Scan

Capture a high-quality 3D model of the room:

```bash
# Stop the live server first (frees the camera)
# Then capture 20 frames and process with MiDaS
ruview-pointcloud capture --frames 20 --output room_model.ply
```

Result: 40,000+ voxels at 5 cm resolution, 12,000+ Gaussian splats.

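
The voxel counts above come from quantizing raw depth points onto a fixed grid, so that many nearby points collapse into one 5 cm cell. A sketch of the idea (function names are illustrative, not RuView's API):

```rust
use std::collections::HashSet;

/// Quantize a point (in meters) to its voxel index at resolution `res`.
fn voxel_key(p: (f64, f64, f64), res: f64) -> (i64, i64, i64) {
    (
        (p.0 / res).floor() as i64,
        (p.1 / res).floor() as i64,
        (p.2 / res).floor() as i64,
    )
}

/// Number of occupied voxels after deduplicating points per cell.
fn occupied_voxels(points: &[(f64, f64, f64)], res: f64) -> usize {
    points.iter().map(|&p| voxel_key(p, res)).collect::<HashSet<_>>().len()
}

fn main() {
    // Two points share one 5 cm cell; the third lands in a neighboring cell.
    let pts = [(0.01, 0.01, 0.01), (0.04, 0.02, 0.03), (0.07, 0.0, 0.0)];
    println!("{}", occupied_voxels(&pts, 0.05)); // prints 2
}
```
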
### ESP32 Provisioning for CSI

To send CSI data to the pointcloud server:

```bash
python3 firmware/esp32-csi-node/provision.py \
  --port /dev/ttyACM0 \
  --ssid "YourWiFi" --password "YourPassword" \
  --target-ip 192.168.1.123 --target-port 3333 \
  --node-id 1
```

---

## Vital Sign Detection
The system extracts breathing rate and heart rate from CSI signal fluctuations using FFT peak detection.
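
The peak-counting idea behind the breathing-rate estimate can be sketched in a few lines. This is a simplified time-domain illustration with assumed names and parameters; as noted above, the actual pipeline works on FFT peaks:

```rust
/// Count strict local maxima above `min_height` in a 1-D signal.
fn count_peaks(signal: &[f64], min_height: f64) -> usize {
    signal.windows(3)
        .filter(|w| w[1] > w[0] && w[1] > w[2] && w[1] > min_height)
        .count()
}

/// Breaths per minute: peaks over the capture window, scaled to 60 s.
fn breaths_per_minute(signal: &[f64], sample_rate_hz: f64) -> f64 {
    let seconds = signal.len() as f64 / sample_rate_hz;
    count_peaks(signal, 0.0) as f64 * 60.0 / seconds
}

fn main() {
    // Synthetic 0.25 Hz breathing waveform sampled at 10 Hz for 60 s.
    let signal: Vec<f64> = (0..600)
        .map(|i| (2.0 * std::f64::consts::PI * 0.25 * i as f64 / 10.0).sin())
        .collect();
    println!("{:.0} breaths/min", breaths_per_minute(&signal, 10.0)); // 15 breaths/min
}
```
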
