Hi, I have a presentation tomorrow! We are simulating hand movements for a prosthetic arm, and so far we haven't been able to do it in real time with the actual sEMG sensor.
However, I'd like to simulate the different movements so I can display what each class corresponds to after classification.
Is there any way I can add my Blender animations for the various hand movements to the Streamlit app?
I don't mind if it's static… e.g. clicking a button for a label and showing the corresponding simulation (basically, just embedding the video and playing it with one button).
Or is there any other hacky or quick way to do this? I've got to demonstrate it tomorrow, so any idea would be helpful!