Brooks lectures on the future of robots, AI field

Dr. Rodney A. Brooks, professor and Director of the Artificial Intelligence Laboratory at the Massachusetts Institute of Technology, gave a lecture on Thursday, April 13, discussing and demonstrating some of the technological advances in the robotics field, as well as his views on how robots, given their contributions today, will shape our lives in the future.

Brooks’ lecture, part of this year’s faculty lecture series “Future Selves: New Technologies of Individuality,” was well attended and consisted of numerous clips of his robots in action.

Brooks began by illustrating how much machines had “defined human civilization.” He named agriculture and the Industrial Revolution as only a few examples of how machines have been essential to our lives over the past few millennia and how we have come to depend upon them.

Machines, he said, “have given us a level of autonomy that we had only dreamed of before.” Tasks that in the past took an exceptional amount of time are now easily completed with the aid of machines.

He showed numerous slides of robots that he and his students had designed, starting with a worm-like creature that crawled about on its own and was able to sense changes in the topography of the terrain it was crossing. It could independently crawl up "steps," a motion simulated with various books stacked against each other.

He was also integral in designing a six-legged robot that NASA has used to collect soil samples on Mars. The robot collects soil samples and brings them back to the shuttle by detecting the heat emitted from its base.

Brooks explained that after building robots in insect forms, he became interested in the challenge of developing robots closer to the human form.

Another robotic project was Polly, a tour guide robot. Polly was conceived because of the influx of tourists to the MIT labs. The audience was treated to a short clip of Polly in action. Polly was designed only with the intention of giving tours to interested persons.

He then explained how Polly was to identify visitors and offer them tours. He and his students assumed that visitors, unfamiliar with their surroundings, would be more likely to stand aimlessly in the hallways, and that while lab personnel would simply ignore the robot, visitors would be inclined to stop and stare at it.

Consequently, Polly was programmed to look for "vertical objects that were still in the middle of a corridor." She would then say, "Hello, my name is Polly, would you like a tour?"

The designers did not want the robot to have microphones, so it then said, "If so, please wave your foot around." Polly was equipped with the software necessary to sense the motion of the foot. If a person shook his foot, Polly would launch into her spiel and provide a tour of the facilities. Polly, Brooks mentioned, was unable to sense whether a visitor had stopped following her; she would continue the tour even if the person had left.

Brooks then focused on his latest project, Cog, a humanoid robot. Scientists have been trying to replicate the human form since 1947.

He said of Cog, "We're not trying to make a Disney thing here…we're really trying to emulate what happens in humans."

To make this a reality, he built some basic principles from his research into the robot. For example, he attempted to ensure that the robots had embodiment and physical coupling; that is, the robots resembled the human form.

He also aimed for the robots to have development capabilities so that they would, in essence, grow up and mature as they gained more knowledge. He mentioned that a more recent goal was to make the robots capable of social interaction.

A development of particular significance in Cog was its ability to follow and react to motion with its eyes. It could locate movements in its periphery and turn its head towards the motion, thus allowing it to watch people and objects.

Brooks has also enabled the coordination of Cog's extremities; that is, Cog's arms can move towards things that it sees. Cog can play the drums, and can give objects to and take objects from people who hold out their hands. Brooks and his associates are also working on, and have made much progress in, getting robots to replicate the facial expressions of humans.

His prototypes can show boredom if they have been provided with the same stimulus for too long. They can also show anger, happiness and various other facial expressions.

After his demonstrations, Brooks concluded by stating his opinion on the role that robots will play in human life. He thinks that humans will eventually "become the machines." He claims that every day we become more and more dependent on machines, and that eventually we will assimilate our way of life with them entirely.

Evidence of this, he says, can be seen in things like cochlear implants and leg and arm prostheses. He thinks that with technological advances, the gap that differentiates humans from robots and machines may be greatly reduced.
