make mobile capture reliable enough for real-world robotics workflows: big files, long-running recordings, unstable device states, and no room for flaky pipelines.
- part-time mobile engineer
- nov 2025 to present
- flutter · dart
joining fpv labs
i joined fpv labs in nov 2025 as a part-time remote mobile engineer. the job wasn't a typical app surface made of forms and screens. the phone sat in the middle of a much harder workflow: record the world, move large datasets, and keep the session usable when devices, sensors, and networks behave badly.
that changed the standard for "done". a flow was only good if it survived field conditions, not just the simulator.
stability first, then speed
the first push was reliability. some modules were unstable enough to drag the app crash rate toward ~6%, which is unacceptable when the app is part of a recording workflow instead of a casual consumer session.
i refactored the shakier parts of the codebase and tightened error handling across ios and android, pushing the crash rate to under 1%. the benefit wasn't only cleaner logs. it meant operators could trust that a recording session would stay alive when something unexpected happened.
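the core of that hardening, in flutter terms, is making sure no uncaught error can take the process down mid-recording. a minimal sketch of the pattern, assuming a hypothetical `reportError` sink and `CaptureApp` widget (the real handlers did more than log):

```dart
import 'dart:async';
import 'package:flutter/material.dart';

// route every uncaught error, sync or async, through one handler so a
// bad sensor frame or platform-channel failure gets logged and the
// recording session stays alive instead of crashing.
void main() {
  runZonedGuarded(() {
    WidgetsFlutterBinding.ensureInitialized();
    // framework errors (build/layout/paint) go to the same sink
    FlutterError.onError = (details) {
      reportError(details.exception, details.stack);
    };
    runApp(const CaptureApp());
  }, (error, stack) {
    // uncaught async errors land here instead of killing the app
    reportError(error, stack);
  });
}

void reportError(Object error, StackTrace? stack) {
  // hypothetical: forward to crash reporting, keep the session going
  debugPrint('recoverable error: $error');
}

class CaptureApp extends StatelessWidget {
  const CaptureApp({super.key});
  @override
  Widget build(BuildContext context) =>
      const MaterialApp(home: Scaffold(body: Center(child: Text('capture'))));
}
```

the point is less the specific hooks and more the single funnel: one place where every failure is seen, classified, and recovered from.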
for capture tooling, reliability is the feature. everything else is decoration.
large uploads that actually finish
capture sessions get heavy fast, so i built a cross-platform upload pipeline for 10gb+ files with resumable uploads, background execution, and failure recovery.
the goal was simple: if an upload gets interrupted, the app should recover instead of making someone start from zero. that turned uploads from a best-effort feature into infrastructure the team could rely on.
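the shape of a resumable upload is simple even if the production pipeline isn't. a sketch of the idea, where `fetchCommittedOffset` and `uploadChunk` are hypothetical stand-ins for the real transport layer: upload fixed-size chunks, only advance the offset once a chunk is acknowledged, and on restart ask the server where to resume instead of starting at byte zero.

```dart
import 'dart:io';
import 'dart:math';

const chunkSize = 8 * 1024 * 1024; // 8 MiB per request

Future<void> resumableUpload(File file, Uri endpoint) async {
  final length = await file.length();
  // the server remembers how much of this file it already has
  var offset = await fetchCommittedOffset(endpoint);
  while (offset < length) {
    final end = min(offset + chunkSize, length);
    final bytes =
        await file.openRead(offset, end).expand((c) => c).toList();
    // a failure here is retried; it never restarts the whole file
    await uploadChunk(endpoint, offset, bytes);
    offset = end; // advance only after the chunk is acknowledged
  }
}

// hypothetical transport hooks, standing in for the real backend api
Future<int> fetchCommittedOffset(Uri endpoint) async => 0;
Future<void> uploadChunk(Uri endpoint, int offset, List<int> bytes) async {}
```

with background execution on top of this loop, a dropped connection costs one chunk, not ten gigabytes.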
from ar session to dataset
i also built a high-fidelity ar data capture system on top of arkit and arcore, recording video, pose, point cloud, depth, and imu streams from the same session.
on top of capture, i designed a structured recording pipeline that packages sessions into mcap datasets compatible with ros2. the system keeps sampling deterministic at roughly 30 fps and handles session lifecycle cleanly, so what comes out is useful for downstream robotics and physical intelligence workflows, not just raw sensor noise.
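"deterministic at roughly 30 fps" mostly means driving capture from one fixed-period clock rather than from whenever each sensor happens to fire. a simplified sketch, assuming a hypothetical `captureFrame` that snapshots the latest pose/depth/imu state and appends it to the mcap file:

```dart
import 'dart:async';

const framePeriod = Duration(microseconds: 33333); // ~30 fps grid

class RecordingSession {
  Timer? _timer;
  int _tick = 0;

  void start() {
    _timer = Timer.periodic(framePeriod, (_) {
      // deterministic timestamp: tick * period, not wall-clock jitter
      final t = framePeriod * _tick;
      captureFrame(sequence: _tick, timestamp: t);
      _tick++;
    });
  }

  void stop() {
    // clean lifecycle: stop the clock before finalizing the dataset
    _timer?.cancel();
    _timer = null;
  }
}

void captureFrame({required int sequence, required Duration timestamp}) {
  // hypothetical: sample current sensor state, write one mcap message
}
```

downstream tooling then sees an evenly spaced stream with monotonic sequence numbers, which is what makes the output usable as a dataset rather than a pile of sensor events.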

