Friday, November 2, 2012

Programmers Anonymous notes, 1110


Robot news:

Human edges out robot car on race track (BBC)

... 
The robot car in the race has been developed by researchers at the Centre for Automotive Research at Stanford University (Cars).
Called Shelley, the autonomous vehicle is fitted with sensors that work out its position on the road, feed back information about the grip of its tyres and help it plot the best route around the circuit.
Prof Chris Gerdes, head of the Cars Lab at Stanford, said Thunderhill was chosen because its 15 turns present the car's control systems with a wide variety of challenges. Some corners can be taken at high speed, some are chicanes, others are sharp and come at the end of long straights down which the car hit a top speed of 115mph (185kph).
... 
"As we set up these systems in the future, it's important not to build autonomous vehicles that are merely a collection of systems designed for human support but to think a little bit more holistically about making them as good as the very best human drivers," said Prof Gerdes. "It's not so much the technology as the capability of the human that is our inspiration now."

3D-printed custom exoskeleton
Two-year-old Emma wanted to play with blocks, but a condition called arthrogryposis meant she couldn't move her arms. So researchers at a Delaware hospital 3D printed a durable custom exoskeleton with the tiny, lightweight parts she needed. 



"The current methods we have for monitoring or interacting with living systems are limited," said Lieber. "We can use electrodes to measure activity in cells or tissue, but that damages them. With this technology, for the first time, we can work at the same scale as the unit of biological system without interrupting it. Ultimately, this is about merging tissue with electronics in a way that it becomes difficult to determine where the tissue ends and the electronics begin." 
The research addresses a concern that has long been associated with work on bioengineered tissue -- how to create systems capable of sensing chemical or electrical changes in the tissue after it has been grown and implanted. The system might also represent a solution to researchers' struggles in developing methods to directly stimulate engineered tissues and measure cellular reactions.


Robot learns to recognise itself in mirror

So far the robot has been programmed to recognise a reflection of its arm, but ultimately Mr Hart wants it to pass the "full mirror test".
The so-called mirror test was originally developed in 1970 and has become the classic test of self-awareness.
The test is usually performed on animals: the creature is given time to get used to the mirror, is then anesthetized, and is marked on the face with an odourless, non-tactile dye.
The animal's reaction to its reflection is used as a gauge of its self-awareness, based on whether it inspects the mark on its own body or reacts as if the mark were not there.
To date, only a few non-human species pass these tests, including some primates, elephants and dolphins. Human babies are unable to pass the test until they are 18 months old. Increasingly, scientists have used similar tests to analyse self-awareness in robots, but none has yet programmed a robot to fully recognise itself from appearance alone.



I Made the Robot Do It (NY Times, by Thomas L. Friedman)

And therein lie the seeds of a potential revolution. Rethink’s goal is simple: that its cheap, easy-to-use, safe robot will be to industrial robots what the personal computer was to the mainframe computer, or the iPhone was to the traditional phone. That is, it will bring robots to the small business and even home and enable people to write apps for them the way they do with PCs and iPhones — to make your robot conduct an orchestra, clean the house or, most important, do multiple tasks for small manufacturers, who could not afford big traditional robots, thus speeding innovation and enabling more manufacturing in America.



