Hello everyone!
I wanted to share BitePulse AI, a project I’ve been working on that uses real-time webcam input to detect bites and estimate eating pace, all on-device and privacy-preserving.
The app runs entirely in the browser using Streamlit and WebRTC, and relies on MediaPipe pose and face landmarks to detect intake events in real time. No video is stored or uploaded. Along the way, I also experimented offline with several temporal ML models, including TCNs, 3D CNNs, and MS-TCNs, to better understand window-level versus frame-level modeling. For the live app, I focused on a lightweight heuristic that performs well on typical laptops and keeps latency low.
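The post doesn't spell out the heuristic, so here is a minimal sketch of one common approach to landmark-based intake detection: flag a bite when a hand landmark approaches the mouth landmark, using hysteresis (separate enter/exit thresholds) plus a refractory period so one intake isn't counted twice. All names, thresholds, and the logic itself are illustrative assumptions, not taken from the BitePulse AI codebase.

```python
from dataclasses import dataclass, field

@dataclass
class BiteDetector:
    """Hypothetical hand-to-mouth heuristic over normalized (x, y) landmarks."""
    enter_thresh: float = 0.08   # distance below which a bite is registered
    exit_thresh: float = 0.15    # hand must retreat past this before re-arming
    refractory_frames: int = 15  # ~0.5 s at 30 fps; suppresses double counts
    _near: bool = field(default=False, init=False)
    _cooldown: int = field(default=0, init=False)

    def update(self, hand_xy, mouth_xy) -> bool:
        """Feed one frame of landmarks; returns True only on a new bite event."""
        if self._cooldown > 0:
            self._cooldown -= 1
        dx = hand_xy[0] - mouth_xy[0]
        dy = hand_xy[1] - mouth_xy[1]
        dist = (dx * dx + dy * dy) ** 0.5
        bite = False
        if not self._near and dist < self.enter_thresh and self._cooldown == 0:
            self._near = True          # hand has entered the mouth zone
            bite = True
            self._cooldown = self.refractory_frames
        elif self._near and dist > self.exit_thresh:
            self._near = False         # hand has clearly retreated; re-arm
        return bite
```

In a live pipeline, `hand_xy` and `mouth_xy` would come from per-frame MediaPipe landmarks; the hysteresis gap between the two thresholds keeps jittery landmarks near a single threshold from toggling repeatedly.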
What the app does:

- Live bite detection from a webcam
- Eating pace feedback and session-level statistics
- Fully on-device processing with privacy in mind
- Simple UI designed for quick experimentation
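For the pace feedback and session statistics above, a minimal sketch of what such a summary could compute from detected bite timestamps (total count, mean inter-bite gap, and pace over a trailing window). The function name, fields, and window length are my own illustrative assumptions:

```python
def pace_stats(bite_times, window_s=60.0):
    """Summarize eating pace from a sorted list of bite timestamps (seconds).

    Returns total bites, mean gap between consecutive bites, and
    bites-per-minute over the trailing `window_s` seconds.
    """
    if not bite_times:
        return {"bites": 0, "mean_gap_s": None, "pace_bpm": 0.0}
    gaps = [b - a for a, b in zip(bite_times, bite_times[1:])]
    mean_gap = sum(gaps) / len(gaps) if gaps else None
    now = bite_times[-1]
    recent = [t for t in bite_times if now - t <= window_s]
    pace_bpm = len(recent) * 60.0 / window_s  # bites per minute in the window
    return {"bites": len(bite_times), "mean_gap_s": mean_gap, "pace_bpm": pace_bpm}
```

A trailing window like this lets the UI show current pace ("slow down" feedback) separately from whole-session totals.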
- Try the app here
- Publication PDF
- GitHub repository
I would love feedback from anyone who has built real-time or computer-vision-heavy Streamlit apps, especially around WebRTC performance, UI patterns, or ideas for improving temporal signals in live settings.
Thanks!