https://www.linkedin.com/in/erjieyong
Imagine controlling your computer using gestures alone!
There are also practical uses of gesture control when you cannot access your computer directly, such as:
- Hands-free control when your hands are dirty (e.g. cooking while referring to a recipe, or turning off the tap)
- Hands-free control when you are far away from the computer or any input device (e.g. controlling the audio from the back of the car, controlling the TV without the remote, or advancing presentation slides at a distance without a mouse or clicker)
- More flexible control of multi-axis objects such as drones
- Gloveless control in the Metaverse
There are also well-documented use cases of interpreting sign language into voice or text to facilitate conversation between sign language users and non-users.
This project serves only to explore the power of gesture control through transfer learning and to make user interaction easy through a user interface. Please contact me directly for other requests.
- Allow the window to always stay on top (lets first-time users familiarise themselves with the gestures while viewing the actions in play)
- Change the sleep interval between gesture actions
- Select from a pre-defined list of actions (keystrokes) to run upon detecting the corresponding gestures
- Save the keystrokes locally for further use
- One-click run to activate gesture detection and perform actions based on saved settings
- Real-time view of the webcam and its predicted gesture
- Real-time output of the actions performed
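Saving the gesture-to-keystroke mapping locally can be sketched with a small JSON settings file. This is a minimal illustration, not the app's actual schema: the file name, keys (`sleep_interval`, `actions`), and gesture labels below are all assumptions.

```python
import json
from pathlib import Path

# Hypothetical settings layout: a global sleep interval between actions,
# plus a mapping from gesture label to keystroke. Illustrative only.
DEFAULT_SETTINGS = {
    "sleep_interval": 1.0,  # seconds to wait between gesture actions
    "actions": {
        "thumbs_up": "volumeup",
        "thumbs_down": "volumedown",
        "fist": "playpause",
    },
}


def save_settings(settings: dict, path: str = "gesture_settings.json") -> None:
    """Persist the gesture-to-keystroke mapping locally for future runs."""
    Path(path).write_text(json.dumps(settings, indent=2))


def load_settings(path: str = "gesture_settings.json") -> dict:
    """Load saved settings, falling back to defaults if the file is missing."""
    p = Path(path)
    if p.exists():
        return json.loads(p.read_text())
    return DEFAULT_SETTINGS
```

A one-click run would then simply call `load_settings()` at startup and act on whatever was last saved.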
Demo video: Gesture.Control.mp4 (gesture control in action!)
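The detection loop shown in the demo can be sketched roughly as: predict a gesture per frame, look up its keystroke, send it, and wait the configured sleep interval before the next action. `predict_gesture` and `press_key` below stand in for the actual model and keyboard library (e.g. a `pyautogui.press` call); both are assumptions, not the project's real function names.

```python
import time
from typing import Callable, Optional


def run_gesture_loop(
    predict_gesture: Callable[[], Optional[str]],  # returns a label per frame, or None
    press_key: Callable[[str], None],              # sends the keystroke to the OS
    actions: dict,                                 # gesture label -> keystroke
    sleep_interval: float = 1.0,                   # seconds between triggered actions
    max_frames: int = 100,                         # stop condition for this sketch
) -> list:
    """Map predicted gestures to keystrokes, pausing between actions."""
    performed = []
    for _ in range(max_frames):
        label = predict_gesture()
        key = actions.get(label)
        if key is not None:
            press_key(key)
            performed.append(key)       # real-time output of actions performed
            time.sleep(sleep_interval)  # configurable pause between gesture actions
    return performed
```

Returning the list of performed actions mirrors the app's real-time action log; in the real UI this would be streamed to the window instead.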
There are two ways to run it.
The first is a standalone executable, made possible using the cx_Freeze library:
- Download the zip file from this link: https://bit.ly/3gKBFzC
- Unzip
- Double-click on Gesture_Control.exe
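For reference, packaging a script with cx_Freeze typically uses a small `setup.py` like the one below. This is an illustrative build sketch, not necessarily the script used in this repo; the name, version, and description are assumptions.

```python
# setup.py -- illustrative cx_Freeze build script (assumed, not the repo's own).
# Build the executable with: python setup.py build
from cx_Freeze import setup, Executable

setup(
    name="Gesture_Control",
    version="1.0",
    description="Control your computer with hand gestures",
    executables=[Executable("Gesture_Control.py")],
)
```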
Alternatively, to run from source:
- Clone this repository locally
- Create a new environment based on requirements.yml
- Navigate to the local folder where this repo is cloned and activate the new environment
- Run
python Gesture_Control.py