This is the world's very first wearable cognitive assistance application! We chose a deliberately simplified task (assembling 2D lego models) since it was our first attempt. The demo looks easy, but implementing it reliably was challenging, especially in handling flexible user actions and varying lighting conditions.
- OpenCV: 2.4.9.1 (does not work with more recent versions)
- numpy: 1.11.1
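
As a quick sanity check, the snippet below (an illustrative sketch, not part of the project) prints the installed OpenCV and numpy versions so you can compare them against the pinned versions above; the exact version strings reported by your build may differ slightly in the last component.

```python
# Quick check of the pinned dependency versions (illustrative only).
import cv2
import numpy

print("OpenCV:", cv2.__version__)   # expected 2.4.9.x; newer versions are known not to work
print("numpy: ", numpy.__version__)  # expected 1.11.1

assert cv2.__version__.startswith("2.4"), "This application requires OpenCV 2.4.x"
```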
An Android client is available on the Google Play Store.
Google Play and the Google Play logo are trademarks of Google LLC.
Running the server application using Docker is advised. If you want to install from source, see the Dockerfile for details.
We used the lego set when building this application, but any standard lego bricks will work. However, the bricks need to be placed on this particular lego board; printing the board on a piece of paper also works.
From the main activity, one can add servers by name and IP address or domain. Subtitles for audio feedback can also be toggled; this option is useful for devices that lack integrated speakers (such as the ODG R-7). Pressing the 'Play' button next to a server will initiate a connection to the Gabriel server at that address.
docker run --rm -it --name lego \
-p 0.0.0.0:9098:9098 -p 0.0.0.0:9111:9111 -p 0.0.0.0:22222:22222 \
-p 0.0.0.0:8080:8080 \
cmusatyalab/gabriel-lego:latest
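
Once the container is running, a quick way to confirm that the server is listening is to probe the published ports. The sketch below is not part of the project; it uses only the Python standard library, and the host name `localhost` is an assumption (replace it with your Docker host's address if the container runs on another machine).

```python
# Probe the ports published by the container above (illustrative sketch).
# Assumes the container runs on the local machine; adjust HOST otherwise.
import socket

HOST = "localhost"
PORTS = [9098, 9111, 22222, 8080]

for port in PORTS:
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.settimeout(2)
    try:
        sock.connect((HOST, port))
        print("port %d: open" % port)
    except (socket.timeout, socket.error) as exc:
        print("port %d: not reachable (%s)" % (port, exc))
    finally:
        sock.close()
```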