Monday, July 18, 2011

Nona installation

1. Download the Hugin bundle from http://sourceforge.net/projects/hugin/
2. Copy the Hugin application to the Applications folder
3. Copy initialize_environment.txt to your home folder (e.g. /Users/angelica)
4. From your home folder, run
source ./initialize_environment.txt
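Steps 3 and 4 above can be condensed into a small snippet. The file name comes from the Hugin bundle; the `ENV_SRC` variable is my own placeholder for wherever you downloaded the file, so treat this as a sketch rather than the official procedure:

```shell
#!/bin/sh
# Copy the environment file into the home folder (step 3), then load it
# into the current shell (step 4). ENV_SRC is an assumed placeholder for
# the downloaded file's location; run this from the download directory,
# not from your home folder.
ENV_SRC="${ENV_SRC:-./initialize_environment.txt}"
cp "$ENV_SRC" "$HOME/initialize_environment.txt"
cd "$HOME"
. ./initialize_environment.txt
```

Note the `. ` (dot) instead of `source`: both work in bash, but the dot form also runs under plain `sh`.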

Robot accompanist


To-dos
-       test with a .wav file; the sound isn't loud enough
-       fix the media file location

Tuesday, July 12, 2011

Chapter 1 - Why should we make humanoid robots?

Translations from "Why I Make Humanoid Robots" by Hiroshi Ishiguro 

From Computer Vision to Robot Research

For ten years I've been completely absorbed in robot research and development, but in the beginning I studied computer vision. Some of that work involved writing Prolog, but computer vision research mainly meant analyzing images from a camera so that the computer could recognize what was in them.

As I dug deeper into computer vision, a question sprang to mind: "Can a computer recognize reality without a body?"

For a computer to recognize an image, it needs to be loaded with knowledge about the contents of that image. But how much knowledge should we store? For example, to recognize a chair, we'd have to teach the computer every type of chair in the world. How do humans accomplish this kind of thing?

Based on our own human experience, we recognize that a chair is basically something we can sit on. Even if it's our first time seeing a particular chair, we can recognize that it's a chair.

That is, humans can recognize objects through their bodies, without needing to match the exact shape in an image.

We notice the features that tell us whether we can sit on something, and that makes a kind of generalized recognition possible.

For computers to have an equivalent recognition ability, they should be able to move around in their environment, just like humans, and have a body that can touch things.

That's the reason I widened my research from the world of computer vision to the world of robotics.

Research on making robots with human-like vision

After that experience in computer vision, the first thing I tackled in robot research was giving robots a human-like sense of sight. That research falls into two categories: omni-directional vision and active vision.

Humans use two types of eye movement. The first, called omni-directional eye movement, surveys our environment: we first recognize where we are, then figure out how to reach our destination. The second continually fixes on an object of interest to examine it in detail; this is called active vision. Research on these eye movements is needed before robots can play an active role in our daily lives.

Saturday, July 2, 2011

Improvements

Some things I want to do to improve NAO's theremin playing:
1. Turn off his head fan during recording.