This is a proof-of-concept program. It shows how to use libusb and Haskell to write a simple video driver based on the USB Video Class specifications (aka UVC).
This example is mainly based on the old version 1.0a of the specification. Devices implementing the newer version 1.1 might work, but none has been tested so far.
Check that you have the right permissions on your USB video device node in /dev/bus/usb/*/*.
Run ./test_images.sh to retrieve a bunch of BMP images and display them (with feh).
Run ./test_video.sh to store a raw YUY2 video stream on disk and display it with gst-launch or ffplay (tools from gstreamer-tools or FFmpeg).
Run ./test_iter.sh to store a raw YUY2 video stream on disk and display it with gst-launch or ffplay (tools from gstreamer-tools or FFmpeg). This program is based on usb-iteratee, and you must compile with the Iteratee flag to test it.
You can also run ./dist/build/test/test (--help) directly to try different video frames. The resulting raw video file (typically /tmp/uvc_x_y.yuy2) should be readable with the read_video.sh helper script.
What currently works:

- UVC Video Control and Video Streaming descriptors are correctly parsed.
- Video probe and commit controls seem to work, even though the negotiation scheme is very limited (see the probe/commit sketch after this list).
- Video streams can be retrieved and the raw data is readily accessible. Two modules, Codec.UVC.RGBA and BMP, allow export to BMP files (see the decoding sketch after this list).
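For reference, the probe/commit negotiation boils down to three class-specific control transfers on the video streaming interface (UVC 1.0a, section 4.3.1.1). The sketch below is a simplification: the ClassTransfer type and the negotiate function are illustrative assumptions, not this package's actual API; only the request and selector codes come from the specification.

    import Data.ByteString (ByteString)
    import Data.Word (Word8, Word16)

    -- UVC request and control selector codes, taken from the 1.0a spec.
    setCur, getCur :: Word8
    setCur = 0x01   -- SET_CUR
    getCur = 0x81   -- GET_CUR

    vsProbeControl, vsCommitControl :: Word16
    vsProbeControl  = 0x0100  -- VS_PROBE_CONTROL  << 8, as sent in wValue
    vsCommitControl = 0x0200  -- VS_COMMIT_CONTROL << 8, as sent in wValue

    -- Placeholder for a class-specific control transfer on the video
    -- streaming interface: request code, wValue, payload -> device answer.
    type ClassTransfer = Word8 -> Word16 -> ByteString -> IO ByteString

    -- The (limited) negotiation: push the desired streaming parameters,
    -- read back what the device actually granted, then commit the result.
    negotiate :: ClassTransfer -> ByteString -> IO ByteString
    negotiate transfer wanted = do
      _       <- transfer setCur vsProbeControl  wanted   -- SET_CUR(PROBE)
      granted <- transfer getCur vsProbeControl  mempty   -- GET_CUR(PROBE)
      _       <- transfer setCur vsCommitControl granted  -- SET_CUR(COMMIT)
      return granted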
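To make the YUY2 layout concrete: the stream packs two pixels into four bytes (Y0 U Y1 V), both pixels sharing the chroma samples. The decoder below is a minimal sketch using the usual BT.601 coefficients and the bmp package's Codec.BMP writer; yuy2ToBMP and its internals are illustrative names, not the actual Codec.UVC.RGBA API, and depending on the BMP row order the image may come out vertically flipped.

    import Codec.BMP (packRGBA32ToBMP, writeBMP)
    import qualified Data.ByteString as BS
    import Data.Word (Word8)

    -- Decode a raw YUY2 frame to RGBA and write it out as a BMP file.
    yuy2ToBMP :: Int -> Int -> BS.ByteString -> FilePath -> IO ()
    yuy2ToBMP width height yuy2 path =
        writeBMP path (packRGBA32ToBMP width height rgba)
      where
        rgba = BS.pack (go (BS.unpack yuy2))

        -- Each 4-byte macropixel (Y0 U Y1 V) yields two RGBA pixels.
        go (y0:u:y1:v:rest) = pixel y0 u v ++ pixel y1 u v ++ go rest
        go _                = []

        pixel :: Word8 -> Word8 -> Word8 -> [Word8]
        pixel y u v =
          let y' = fromIntegral y :: Double
              u' = fromIntegral u - 128
              v' = fromIntegral v - 128
              clamp x = fromIntegral (max 0 (min 255 (round x :: Int)))
          in [ clamp (y' + 1.402 * v')               -- R
             , clamp (y' - 0.344 * u' - 0.714 * v')  -- G
             , clamp (y' + 1.772 * u')               -- B
             , 255 ]                                 -- A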
Known issues and TODOs:

- Background threads reading isochronous requests may throw exceptions and deadlock the whole program.
- A real stream interface is missing, e.g. a lazily built list, a Concurrent.Chan, or an iteratee (see the sketch after this list).
- The system load is too high when processing (as opposed to merely retrieving) raw video data. Improvements to the RGBA conversion would be welcome.
- One possible optimisation would be to tweak isopacket memory allocation as in a regular C program: in C, one usually allocates a small set of isopackets (6 * 64 packets) and submits them again and again, whereas this Haskell implementation allocates new isopackets for every submission (see the buffer-pool sketch after this list).
- A binding to gstreamer/xine/ffmpeg/whatever to directly display the video is missing.
- Only the probe/commit requests are implemented. Other requests (including camera pan/tilt, zoom, …, processing unit brightness, contrast, hue, saturation, …, and vendor-specific controls) are NOT implemented.
- Only the first video streaming interface (and its endpoint) is used; handling multiple video streaming interfaces remains to be done.
- The ColorMatching descriptor is not yet used to get the conversion to RGB(A) right.
- Only uncompressed streams are supported.
- Only YUY2 uncompressed data can be decoded to BMP images; NV12 support is experimental.
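As an illustration of the Chan-based stream interface mentioned above, a background thread could push frames into a Control.Concurrent.Chan. This is a sketch only; readFrame stands for the isochronous read loop and is not an existing function of this package.

    import Control.Concurrent (forkIO)
    import Control.Concurrent.Chan (Chan, newChan, writeChan)
    import Control.Monad (forever)
    import Data.ByteString (ByteString)

    -- Run the frame grabber in a background thread and expose the frames
    -- through a Chan; consumers simply block on readChan.
    streamFrames :: IO ByteString -> IO (Chan ByteString)
    streamFrames readFrame = do
      chan <- newChan
      _ <- forkIO . forever $ readFrame >>= writeChan chan
      return chan

A consumer would then loop on readChan to pull the frames as they arrive.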
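Along the same lines, here is a sketch of the isopacket-reuse idea: allocate a fixed pool of buffers once and recycle them across submissions instead of allocating fresh ones each time. submitAndWait is a placeholder for the actual isochronous submission, and a real implementation would keep several buffers in flight rather than waiting on each one.

    import Control.Concurrent.Chan (newChan, readChan, writeChan)
    import Control.Monad (forever, replicateM_)
    import Data.Word (Word8)
    import Foreign.Marshal.Alloc (mallocBytes)
    import Foreign.Ptr (Ptr)

    -- Allocate nBuffers buffers of bufSize bytes up front, then loop:
    -- take a buffer from the pool, submit it, and put it back afterwards.
    withBufferPool :: Int -> Int -> (Ptr Word8 -> IO ()) -> IO ()
    withBufferPool nBuffers bufSize submitAndWait = do
      pool <- newChan
      replicateM_ nBuffers (mallocBytes bufSize >>= writeChan pool)
      forever $ do
        buf <- readChan pool   -- reuse an already-allocated buffer
        submitAndWait buf      -- hand it to the transfer machinery
        writeChan pool buf     -- recycle it instead of reallocating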