TECH NEWS ALERT:
In five years, IBM thinks computers will touch, taste, smell, hear, and see. Sensing devices will aid online shoppers, parents, chefs (cooking a perfectly tasty and healthy meal), and doctors.
Watch the forecast here.
Image: Anthony Reeves
Forget Skype: The telepresence robots being developed in labs – such as this one being controlled at University College London by a person in Spain – suggest the technology will become ever more immersive. Eventually these surrogates will feed back a sense of touch to their controllers, and could be operated by thought alone.
Like a bored child who can’t be bothered to read, this robot flips from page to page. This odd contraption is actually a new way to scan and digitise the world’s books - at a speed of 250 pages per minute. (via One Per Cent: Book-riffling robot scans one page at a time)
'Watching sci-fi is a must for any innovation and science writer,' I told myself today, when my research for the book required me to grab a bowl of popcorn, pour a glass of white wine and get comfy in front of the telly to watch I, Robot. I had seen the movie several times before, but this time I was taking notes.
Released in 2004, I, Robot is a movie packed with technology and gadgets that were fiction then. Today, a mere eight years later, a whole host of these fictional items have turned into reality. Let’s have a look at them, shall we?
Service robots
Movie: The film is set in Chicago in 2035, and the first few scenes give an overview of what life looks like there. Robots deliver FedEx parcels, walk dogs and collect the shopping. They are part of people’s daily lives.
Reality: Today, researchers all over the world are eagerly developing robots that can help us with our daily chores. This year a range of successful so-called service robots have been deployed. Toyota, for example, unveiled a new single-arm robot to assist homebound residents with limited mobility. Then there is RoboCourier, an autonomous mobile carrier that can transport loads of up to 50 pounds. Designed specifically for use in hospital laboratories, it activates automatic doors, navigates hallways and knows how to avoid obstacles.
Robots - a new living species
Movie: I, Robot touches on a subject we have all, one way or another, been thinking about: Will robots one day take over our world? Are we heading towards a world in which humans and robots live together?
Reality: Professor Rosalind Picard at MIT has developed software that enables computers to distinguish a real smile from a fake one. That’s right, she is helping machines read emotions.
In February 2012 the first Global Future 2045 Congress was held in Moscow. There, over 50 world-leading scientists from multiple disciplines met to develop a strategy for the future of humankind. One of the Congress’s main goals was to build a global network of scientists to further research on cybernetic technology, with the ultimate goal of transferring a human’s individual consciousness to an artificial carrier.
Self-driving cars
Movie: Will Smith’s character, Detective Spooner, is leaning back in his seat behind the wheel of his Audi. He flips through pages in a folder, checks the windscreen, which displays the car’s speed of 125 mph, and then gets back to reading the document. All the while, the car drives itself.
Cashless, wireless payments
Movie: We see Spooner paying his bar tab by tapping a scanner with a fob-like device.
Reality: Signing credit card receipts is a thing of the past, and even PIN verification is on the decline. Integrated NFC technology in phones and credit cards means we will increasingly pay for things with just a tap. However, phones and credit cards can be stolen, so researchers are looking into a payment method that is unique to every single person and cannot be cloned or easily stolen: the human hand. Biometrics will make it possible to pay for items with a scan of our palms.
Augmented Reality Glasses
Movie: In another scene Spooner is seen racing on his motorbike. He is wearing sunglasses. We can see what Spooner can see through his lenses: an integrated digital display that not only shows him the speed he is driving at and a compass but also scans the road for obstacles.
Reality: In 2012 Google revealed its augmented reality glasses, which it calls Project Glass. Research and development continues on Project Glass products that will display information in a hands-free, smartphone-like format and interact with the Internet via natural-language voice commands.
There is a sequel planned for 2015 and I can’t wait to see what new gadgets and technologies will be used in I, Robot 2.
Reminiscent of the visual style of The Hunger Games, Absolut's latest commercial combines dance music (Swedish House Mafia), futuristic looks (oh, the fashion) and human-controlled racing robots (awesome interfaces). A toast to a fab video!
This is the latest work by Carl Erik Rinsch, a commercial director who has mastered the art of storytelling with strong images, minimal use of words, and an immersive, engaging style.
OK this is pretty damn cool!
Boston Dynamics Cheetah robot beats Usain Bolt
Cheetah Robot is a fast-running quadruped developed by Boston Dynamics with funding from DARPA. It just blazed past its previous speed record, getting up to 28.3 mph, about 0.5 mph faster than Usain Bolt’s fastest 20 meter split. This version of the Cheetah Robot runs on a treadmill with offboard power. Testing on an untethered outdoor version starts early next year.
Home made noodles made by robots.
Meet Chef Cui, the noodle shaving robot. Chef Cui Runquan was sick and tired of having to shave noodles by hand and pour them into a pot of boiling water, so he built a robot to do the job for him.
Here is his robot noodle chef army!
A floating robot has been deployed to track great white sharks in the Pacific as part of efforts to understand the giant predators.
The “wave glider”, which from above looks like a yellow surfboard, picks up signals from tagged fish up to 1,000 feet away in the ocean and then sends their positions to researchers via a satellite transmitter.
Scientists have only a hazy understanding of where great white sharks, portrayed as ferocious killers in films like “Jaws”, swim in the oceans. The new robot will give insights into their movements.
"Here we are in the 21st century and scientists have just put a rover on Mars. And we don’t understand what is going on in the oceans," said Barbara Block, a marine sciences professor at Stanford University in California in charge of the project.
"We will send a wave glider out to follow the sharks," she told Reuters. In one eight-day test, the glider, made by California-based Liquid Robotics and which moves at less than walking speed, made 200 detections of 19 individual sharks.
The glider, about 7 feet long with solar panels above and a wave-power system below, could also give clues to other tagged creatures ranging from mako sharks to tuna and salmon.
It can only notice creatures that have been previously tagged by scientists with tiny battery-powered acoustic transmitters that bleep once every two minutes. Thousands of creatures carry the tags, Block said.
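The detection scheme is simple in outline: tagged animals bleep periodically, and the glider logs any bleep it hears within its roughly 1,000-foot listening radius before relaying positions by satellite. A minimal sketch of that range check (hypothetical names and coordinates, not Liquid Robotics' actual software):

```python
import math

# Assumed listening radius: the glider picks up tag signals
# from up to about 1,000 feet away.
DETECTION_RANGE_FT = 1000.0

def in_range(glider_xy, tag_xy, max_ft=DETECTION_RANGE_FT):
    """True if a tag's bleep is close enough to the glider to be heard."""
    dx = glider_xy[0] - tag_xy[0]
    dy = glider_xy[1] - tag_xy[1]
    return math.hypot(dx, dy) <= max_ft

# A tag bleeping every two minutes is only logged while inside that radius:
pings = [(0, 800), (0, 1200)]  # tag positions in feet, glider at the origin
detections = [p for p in pings if in_range((0, 0), p)]
print(len(detections))  # 1
```

The real system, of course, works from acoustic signal strength rather than known coordinates; the sketch just illustrates why a tag drifting outside the radius silently drops off the log.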
The glider, and listening buoys in fixed positions chained to the seabed, are building on a previous project for the tagging of Pacific predators, which was part of an international census of marine life from 2000-10.
Block said that scientists already knew that great white sharks wandered across the Pacific from North America, often all the way to Hawaii.
There were also mysterious gatherings, including in one mid-ocean area dubbed the “white shark cafe”. Unlike the fixed buoys, the gliders can monitor such “ocean wi-fi hotspots”.
The researchers are setting up an app, “Shark Net”, allowing people to track the fish.
Block hopes to extend the ocean observing network down the west coast of the United States, likening the region to a “blue Serengeti” as rich in wildlife as Tanzania or Kenya.
NAO getting the dirt off its shoulders, Jay-Z style!
(from 5:12 in the video).
Judging from this awesome progress NAO’s made, jumping, spinning and even backflips will surely be added soon.
Probably one of the coolest recruitment videos ever - if you love robots that is…!
Aldebaran Robotics, maker of the NAO robot, has released a recruiting video seeking “Europe’s best engineering talents” to help it build the next phase of the NAO dream team.
Kahp-Yang Suh and colleagues at Seoul National University in South Korea wove together thousands of individual polymer nanohairs to make a flexible touch sensor that is more sensitive than human skin.
The idea for the device came from the way cells interlock in human hair and organs. These organically woven-together cells translate inputs of force into electrical signals that are then interpreted by the brain. Similar to their organic counterparts, the 50-nanometre-wide hairs of Suh’s device twist and bend against each other when an external force, like a beating heart or a soft touch, is applied.
The contact between the hairs generates an electrical current which the sensor identifies as specific changes in pressure, shear or torsion. These results are displayed on a computer monitor in real time.
Researchers demonstrated the sensor’s extreme sensitivity in more than 10,000 test cycles. It could detect the dynamic motion of a tiny water droplet bouncing on a hydrophobic plate and the physical force of a heartbeat. A skin of hairy sensors like these could clothe prosthetic limbs and robots.
When sandwiched together, two layers of tiny hairs can sense pressure, shear and torsion. Scale bar, 1µm.
It takes just a single millisecond for the rock-paper-scissors-playing robot to recognize what shape your hand is making, and only a few more for it to form the shape that beats you. It all happens so fast that it’s more or less impossible to tell the robot is waiting for you to commit before it makes its move, which is how it wins 100% of the time.
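The "always win" trick is pure reaction speed plus a trivial lookup: once the hand shape is recognized, choosing the beating move is a one-step table check. A minimal sketch of that counter-move logic (hypothetical names, not the lab's actual code):

```python
# Rock-paper-scissors: map each shape to the shape that beats it.
BEATS = {"rock": "paper", "paper": "scissors", "scissors": "rock"}

def counter_move(opponent_shape: str) -> str:
    """Return the shape that beats the opponent's (already recognized) shape."""
    return BEATS[opponent_shape]

# The vision system spots the human committing to "rock" mid-gesture;
# the winning reply is a constant-time dictionary lookup:
print(counter_move("rock"))  # paper
```

The hard part, of course, is the millisecond-scale vision that recognizes the hand; the move selection itself is trivial.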