Saturday, 26 May 2007

Elephant in the cupboard

The Register has a good article called "Why do robot experts build such lousy robots?" which explores some of the less-talked-about aspects of this "cusp of a revolution", such as why "making and playing with robots evidently beats using and ignoring them" and Thrun's "apparently simple challenge: how a computer vision system might detect moving objects and predict their motion, a task at which frogs are still in the lead".
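To give a feel for why that challenge is harder than it sounds, here's a minimal toy sketch (plain Python, hypothetical helper names, nothing from the article itself) of the naive approach: difference two frames, threshold the change, take the centroid of the "moving" pixels, and extrapolate with constant velocity. Real scenes break every one of these assumptions, which is rather the point.

```python
def moving_centroid(prev, curr, thresh=30):
    """Centroid of pixels that changed by more than `thresh` between frames."""
    ys, xs = [], []
    for y, (row_p, row_c) in enumerate(zip(prev, curr)):
        for x, (p, c) in enumerate(zip(row_p, row_c)):
            if abs(c - p) > thresh:
                ys.append(y)
                xs.append(x)
    if not xs:
        return None  # nothing moved enough to register
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def predict_next(c0, c1):
    """Constant-velocity prediction: extrapolate the centroid one step ahead."""
    return (2 * c1[0] - c0[0], 2 * c1[1] - c0[1])

# Three 1x5 "frames" with a bright blob marching right one pixel per frame.
f0 = [[0, 200, 0, 0, 0]]
f1 = [[0, 0, 200, 0, 0]]
f2 = [[0, 0, 0, 200, 0]]

c01 = moving_centroid(f0, f1)  # change seen between x=1 and x=2
c12 = moving_centroid(f1, f2)  # change seen between x=2 and x=3
print(predict_next(c01, c12))  # → (3.5, 0.0)
```

Even this toy shows the trouble: the frame difference lights up where the blob *was* as well as where it *is*, lighting changes and camera jitter masquerade as motion, and the constant-velocity guess falls apart the moment anything accelerates or occludes. The frogs have nothing to worry about yet.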

Most of it is bang on, but there are some hopeful signs out there in machine learning (most of which fall into the overflowing "soon to be released" category):

Sentience stereo vision SLAM
Braintech Volts-IQ *** IT'S ALIVE! ***
SRI SLAM as a web service
Incremental Learning of Linear Model Trees PDF

[*** Update June 12th 2007 ***] Volts-IQ™ Community Technical Preview
is released. Looking forward to checking it out...


Bob Mottram said...

There have been so many false dawns in the robotics world that it's easy to be sceptical. One of the problems which I think has held back progress is that often researchers don't appreciate how crucial good quality perception is, instead believing that a few ultrasonics or bump switches will suffice. Also, although I have a lot of respect for Rod Brooks, I think his emphasis upon relatively simple perceptual cues directly linked to behaviours, rather than creating realistic models, has tended to lead researchers in the wrong direction for many years. Such simple forms of perception and action can only get you so far, perhaps not much further than insect-like intelligence.

Chris Kilner said...

The insect behaviours were the easy proofs of reactivity that gave the impression of life without needing understanding, context, intelligence or wisdom.

Even humans use these reactive behaviours, in their pupils, their laziness, their balance and their aversion to heat and pain etc., so the work was necessary and may well go on to form part of the fuller solution. But yes, areas such as vision, context and inference need all the attention they can get.

P.S. I've got a second matching cam, but didn't get a quick win on a first try with fishbowl and Sentience, so will come back to it after I've tied up a few other loose ends.

Rob Sim said...

Hi Chris,

As of today, you can count VOLTS-IQ as a real product. We've released a community technical preview of our real-time tracker. Details and downloads are at

Preston said...

I think the bump switches and the ultrasonics are perfect for robotics today. We are not out trying to build B9s and Rosies; we are trying to take the very first steps into the robotic age. We are not trying to build HKs (from Terminator) or anything of the sort. We are building basic, basic machines right now. Vacuum cleaners, lawn mowers, pool cleaners, snow shovelers: these do not require eyes and ears, just a few sensors to know where the machine is and how to get to the next point. What I am interested in seeing is mapping technology put into these machines, so that my lawn mower will know where the next tree is and that it needs to turn in a certain direction to get between the flower bed in front of it, rather than it being just a hit-or-miss thing. Those are the improvements I want in robotics: not eyes and ears and intelligence, but programming. For more reviews on robotic lawn mowers check out

Chris Kilner said...

Hi Preston,

Maps are great and hopefully just around the corner. I think the point about eyes is that they are cheaper than ultrasonics: one webcam provides far more information than ultrasonics costing ten times as much.

I hope the next couple of years may start to provide chip-based or networked interpretation of images, so as to avoid the cost of the big onboard CPU currently needed.

Stefan said...

Good Job!" :)