Hi, I have a presentation tomorrow! We are simulating hand movements for a prosthetic arm, and so far we haven't been able to do it in real time using the actual sEMG sensor.
However, I'd like to simulate the different movements for the arm, to display what each class corresponds to after classification.
Is there any way I can add my Blender animations for the various hand movements to the Streamlit app?
I don't mind if it's static… like clicking a button for a label and showing the simulation (basically, just embedding the video and playing it with one button).
Or, is there any other hacky or tricky way? I've got to demonstrate it tomorrow! Any idea will be helpful!