Pages

Showing posts with label User Interface. Show all posts

Saturday, 11 September 2010

Real Time Regional Differencing

A little while ago I produced a method for extracting interesting regions from images, which I called regional differencing (see post). I have since optimised the code, and it now runs in real time. It works very well as a simple edge detector, with the ability to produce much thinner lines than the Sobel detector; notice in the image below that the maximum line thickness is one pixel. The algorithm works by comparing more and less pixelated versions of an image to find pixels that differ significantly from their local mean.
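The idea above can be sketched briefly: pixelate the image by replacing each block with its mean, then mark pixels that deviate from that local mean by more than a threshold. This is a reconstruction of the idea in Python/NumPy, not the original ActionScript; the block size and threshold values are illustrative.

```python
import numpy as np

def regional_difference(img, block=2, threshold=20):
    """Return a binary edge map: 255 where a pixel deviates from the
    mean of its block-sized cell by more than threshold, 0 elsewhere."""
    h, w = img.shape
    # Crop so the image tiles evenly into block x block cells.
    h2, w2 = h - h % block, w - w % block
    img = img[:h2, :w2].astype(np.float64)
    # Pixelated version: each cell replaced by its mean.
    cells = img.reshape(h2 // block, block, w2 // block, block)
    means = cells.mean(axis=(1, 3))
    pixelated = np.repeat(np.repeat(means, block, axis=0), block, axis=1)
    # Differencing against the local mean picks out fine edges.
    return np.where(np.abs(img - pixelated) > threshold, 255, 0).astype(np.uint8)
```

With a block size of 2, a response can only appear inside the cell straddling an intensity change, which is why the resulting lines stay so thin.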

To improve the algorithm I switched from the RGB colour space to NUV, which improved results (albeit reducing performance marginally). The detector doesn't quite have the quality of something like the Canny detector, but it requires a lot less processing, which in Flash is very important!
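The post names the colour space "NUV"; assuming this refers to the YUV family (a common choice for separating luma from chroma before edge detection), the conversion from RGB is a fixed linear transform. A minimal sketch using the BT.601 weights:

```python
def rgb_to_yuv(r, g, b):
    """Convert one RGB pixel (components 0-255) to Y, U, V.
    Assumes the BT.601 luma weights; the author's exact space may differ."""
    y = 0.299 * r + 0.587 * g + 0.114 * b  # luma (brightness)
    u = 0.492 * (b - y)                    # blue-difference chroma
    v = 0.877 * (r - y)                    # red-difference chroma
    return y, u, v
```

Differencing on the luma channel alone, with chroma as a tie-breaker, is one plausible reason a YUV-style space would improve edge results over raw RGB.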

Anyway, check out the demo by clicking on the link below; the slider changes the pixel size of the comparison bitmap, with some interesting results. The default setting uses a size of 2 pixels, producing lines at most one pixel thick.

Friday, 10 September 2010

Visual Human Interfaces

Making human interfaces intuitive, simple and accurate is a huge challenge, and some of the hardest problems lie in the field of motion and gesture detection: analysing visual data in real time can be very processor intensive.

This is a fairly simple implementation of an interface that can track on-screen button presses and swipes (it is part of an ongoing research project):
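The post doesn't describe how the tracking works; one common, lightweight way to detect an on-screen "button press" from a webcam is frame differencing restricted to the button's rectangle: if enough pixels in that region change between frames, register a press. This hypothetical sketch (not the author's code) shows the idea:

```python
import numpy as np

def button_pressed(prev_frame, cur_frame, region, diff_thresh=30, ratio=0.25):
    """region = (x, y, w, h) in pixel coordinates of a grayscale frame.
    Returns True when more than `ratio` of the region's pixels changed
    by more than `diff_thresh` between the two frames."""
    x, y, w, h = region
    prev_roi = prev_frame[y:y+h, x:x+w].astype(np.int16)
    cur_roi = cur_frame[y:y+h, x:x+w].astype(np.int16)
    changed = np.abs(cur_roi - prev_roi) > diff_thresh
    return changed.mean() > ratio
```

The thresholds are assumptions; in practice they would be tuned to the camera's noise level and lighting.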

Click to launch a demo video:

Sunday, 1 August 2010

Webcam Gestures in Flash

I'm currently creating a game that uses gesturing as the primary control method. For those who aren't familiar with the term, gesturing describes a set of motions carried out on a touch screen. I thought it would be pretty cool to make the keyboard and mouse completely obsolete, so I applied my gesturing algorithm to the webcam. Getting the timings right with the webcam was the hardest part: with a mouse you can choose when to start and stop, but how do you tell when one gesture ends and the next begins on a webcam?
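One common answer to the segmentation question above (not necessarily the author's) is a motion-energy gate: a gesture starts when per-frame motion exceeds a threshold, and ends after a run of quiet frames. A hedged sketch, with hypothetical threshold values:

```python
class GestureSegmenter:
    """Splits a motion-energy stream into gestures: 'start' when energy
    crosses start_thresh, 'end' after quiet_frames consecutive low frames."""

    def __init__(self, start_thresh=5.0, quiet_frames=10):
        self.start_thresh = start_thresh
        self.quiet_frames = quiet_frames
        self.active = False
        self.quiet = 0

    def update(self, motion_energy):
        """Feed one frame's motion energy; returns 'start', 'end', or None."""
        if not self.active:
            if motion_energy > self.start_thresh:
                self.active = True
                self.quiet = 0
                return "start"
        else:
            if motion_energy > self.start_thresh:
                self.quiet = 0  # still moving; reset the quiet counter
            else:
                self.quiet += 1
                if self.quiet >= self.quiet_frames:
                    self.active = False
                    return "end"
        return None
```

The quiet-frame count trades responsiveness against accidentally splitting one slow gesture in two, which is exactly the timing difficulty the post describes.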

Click below to watch a video of the experiment in action:


It works by calculating the angles travelled in a gesture and comparing them to lists of angles, each of which corresponds to a preset gesture. For example, a straight line in the positive horizontal direction would be 0-0-0-0-0, and in the negative horizontal direction 180-180-180-180. The program determines which preset the performed gesture is most like and then carries out an action based on that choice. The result is by no means stable, but a better set of thresholds and presets can improve the performance dramatically.
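The matching scheme above can be sketched directly: turn a stroke into a short sequence of segment angles, then score it against each preset's angle list, accounting for angular wrap-around. This is a Python reconstruction of the idea, not the original ActionScript, and the preset names are illustrative.

```python
import math

def stroke_angles(points, samples=5):
    """Angles (degrees, 0-360) of `samples` equal chunks along a stroke,
    given as a list of (x, y) points."""
    step = max(1, (len(points) - 1) // samples)
    angles = []
    for i in range(0, len(points) - step, step):
        (x0, y0), (x1, y1) = points[i], points[i + step]
        angles.append(math.degrees(math.atan2(y1 - y0, x1 - x0)) % 360)
    return angles[:samples]

def classify(angles, presets):
    """Return the preset name whose angle list is closest on average."""
    def diff(a, b):
        # Shortest angular distance, so 350 and 10 count as 20 apart.
        d = abs(a - b) % 360
        return min(d, 360 - d)
    return min(presets, key=lambda name: sum(
        diff(a, b) for a, b in zip(angles, presets[name])) / len(presets[name]))
```

Because `classify` always picks the nearest preset, every stroke maps to *some* gesture; rejecting poor matches with a distance threshold is one of the tuning knobs the post alludes to.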

Click here to launch the experiment and have a go yourself. Remember, you'll need a webcam to do so.