2.11.2006

Robot Love

In a recent New Scientist interview with artist Mari Velonaki, we get a glimpse of an exhibit where robots and people can interact.
Velonaki has collaborated with robotics scientists at the centre to create Fish-Bird, a live exhibition comprising a pair of moody, love-struck robots disguised as wheelchairs that can communicate through movement and written text.
What is strikingly different about this approach to machine intelligence is the emphasis on emotion over problem-solving. This is something I've pondered for a while. If we really want to be able to talk to computer-based intelligence, doesn't it have to at least attempt to emulate emotion? Click on the photo below to see one attempt.

But what are the limits for machine emotion? If we translate it into human terms, this becomes even more interesting. Computer programs are changeable in a way that human brains are not. A computer program can change itself if it's designed to. To a point we can do the same with our own brains--teaching ourselves to play piano, for example. More recently, we can swallow pretty little pills that adjust our brain chemistry. What are the limits of this trend? Suppose that we, like a flexible computer program, could change our own programming?

Suppose, for example, that you want to be happy all the time. If there were a way to modify brain chemistry by swallowing a pill to satisfy that desire, it seems inevitable that many would do just that. It's hard to argue otherwise, given the problem with drugs like crack.

Now suppose some general 'desire' emotion exists, and we can make it go away with a pill. In a few heartbeats we would become completely satisfied, if catatonic, citizens. What's to prevent that? Only other desires (the desire not to be catatonic, for example). But in the most general case, all desire is satisfied by this pill, so only an initial inclination not to take that first one could save you.

This initial inclination is a matter of chance, begetting an evolutionary mechanism that selects for specimens not inclined to take the first pill. Over time, evolution ossifies this into a real limit on behavior. In other words, behaviors that lead to severe disability will be strongly selected against. That's why it's hard to bring yourself to jump off of a bridge.

The conclusion is that these hard-coded safeguards are essential to survival. Therefore any program that can freely modify itself will be selected against in favor of one with built-in limits. So in order to build intelligent machines, we should be interested in what these limits should be. We could call them, say, emotions.
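The argument above can be sketched in code. What follows is a toy illustration only--not a real AI design--where all the names and numbers are made up for the example: an agent may erase its own "desires" at will, except that one hard-coded safeguard (the stand-in for an emotion) vetoes erasing the last one.

```python
# Toy sketch of the essay's argument (all names and values hypothetical):
# an agent that can rewrite its own desires, except for one immutable limit.

class Agent:
    # Hard-coded safeguard -- the "emotion" of the essay. Not self-modifiable.
    AVOID_CATATONIA = True

    def __init__(self):
        # Desires the agent is free to modify or erase in itself.
        self.desires = {"hunger": 5, "curiosity": 7}

    def take_pill(self, desire):
        """Try to erase a desire; the safeguard vetoes total erasure."""
        # Erasing the last remaining desire would mean catatonia,
        # so the hard-coded limit refuses.
        if len(self.desires) <= 1 and self.AVOID_CATATONIA:
            return False
        self.desires.pop(desire, None)
        return True

agent = Agent()
print(agent.take_pill("hunger"))     # allowed: another desire remains
print(agent.take_pill("curiosity"))  # vetoed by the hard-coded safeguard
```

A fully self-modifying agent could of course delete the safeguard too; the point of the essay is that lineages of such agents wouldn't last, so the surviving designs are the ones where the limit is out of reach.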

2.05.2006

Superbowl Tag

Over on Engadget they have a nice article about tech at the Super Bowl, with nice little tidbits like the fact that they have 120 footballs on hand for the game. That's a lot of footballs for one game. Even more interesting, from a geek POV, is the fact that each of them is being labeled with a particular strand of DNA so that the balls can be identified later. This is because they can sell for $1000 each, and apparently it has been easy to pass fakes off as the real thing.

2.02.2006

Where's George?

No, I don't mean a political slogan from the 90's. Somebody had the clever idea to track dollar bills on the internet. Go to www.wheresgeorge.com, where you can enter the serial number of a bill in the US or Canada and then follow it later if someone else also enters the number. There are about 76 million bills being tracked this way, and according to a Science Daily article, scientists are using the site as a way to predict how disease spreads. The map below shows one of the most traveled bills.

Nervous Crustacea

At Shug's Smokehouse down the road I had a dinner last Wednesday with some colleagues. At the door there was a folding blackboard with the day's specials on it. At the top it read:
ADD boiled shrimp $2.95
Now I knew that Attention Deficit Disorder was becoming ubiquitous, but this is really too much! I wonder how they caught the jumpy little fellows anyway? With Ritalin, perhaps.