An artificially intelligent virtual gamer created by computer scientists at The University of Texas at Austin has won the BotPrize by convincing a panel of judges that it was more human-like than half the humans it competed against. The competition was sponsored by 2K Games and was set inside the virtual world of “Unreal Tournament 2004,” a first-person shooter video game. The winners were announced this month at the IEEE Conference on Computational Intelligence and Games. “The idea is to evaluate how we can make game bots, which are nonplayer characters (NPCs) controlled by AI algorithms, appear as human as possible,” said Risto Miikkulainen, professor of computer science in the College of Natural Sciences. Miikkulainen created the bot, called the UT^2 game bot, with doctoral students Jacob Schrum and Igor Karpov.
Teaching a computer how to lip-read isn’t science fiction; it’s a reality, and it’s happening right now in Malaysia. There, researchers at the International University in Selangor are teaching a computer to interpret human emotions based on lip patterns, in order to improve the way people interact with computers.
The scientists developed their system using a genetic algorithm that improves with each iteration to match irregular ellipse fitting equations to the shape of a human mouth displaying different emotions. The team used photos of individuals from South-East Asia and Japan to train the computer to recognize the six commonly accepted human emotions — happiness, sadness, fear, anger, disgust, surprise — and neutrality. The algorithm then analyzed the upper and lower lip as two separate ellipses.
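The iterative fitting loop described above can be sketched as a simple genetic algorithm. This is a minimal illustration, not the researchers’ actual system: their irregular-ellipse equations and training data are not public here, so this sketch just fits one plain ellipse (center and two semi-axes) to sample lip-contour points, improving the population each generation.

```python
import random

def ellipse_error(params, points):
    # Sum of squared algebraic residuals of ((x-cx)/a)^2 + ((y-cy)/b)^2 = 1
    cx, cy, a, b = params
    return sum((((x - cx) / a) ** 2 + ((y - cy) / b) ** 2 - 1.0) ** 2
               for x, y in points)

def fit_ellipse_ga(points, pop_size=60, generations=300, seed=0):
    """Evolve ellipse parameters (cx, cy, a, b) to fit the given points."""
    rng = random.Random(seed)
    # Random initial population: center near the origin, semi-axes in (0.5, 3)
    pop = [[rng.uniform(-1, 1), rng.uniform(-1, 1),
            rng.uniform(0.5, 3.0), rng.uniform(0.5, 3.0)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: ellipse_error(p, points))
        survivors = pop[: pop_size // 2]           # truncation selection (elitist)
        children = []
        while len(survivors) + len(children) < pop_size:
            mum, dad = rng.sample(survivors, 2)
            child = [rng.choice(genes) for genes in zip(mum, dad)]  # uniform crossover
            i = rng.randrange(4)                   # mutate one gene with Gaussian noise
            child[i] += rng.gauss(0, 0.1)
            child[2] = max(0.1, child[2])          # keep semi-axes positive
            child[3] = max(0.1, child[3])
            children.append(child)
        pop = survivors + children
    pop.sort(key=lambda p: ellipse_error(p, points))
    return pop[0]
```

Treating the upper and lower lip as two separate ellipses, as the team did, would simply mean running a fit like this once per lip and feeding both parameter sets to the emotion classifier.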
In the current study, the researchers’ algorithm successfully classified all six emotions as well as the neutral expression. The researchers suggested that initial applications of such an emotion detector could involve, for instance, helping disabled patients who lack speech to interact more effectively with computer-based communication devices.
Image Credit: Dmitriy Kiryushchenkov / Shutterstock
Remember Will Smith in the 2004 movie I, Robot, reclining in the seat of his futuristic Audi while it drove him to his destination? Well, it seems it’s not so futuristic any longer. Less than 10 years later, Google’s self-driving Prius is going the distance, recently passing 300,000 miles, and Stanford’s self-driving Audi TTS is showing off its speed.
Google’s cars will most likely be the first self-driving cars to legally take passengers, but they still can’t slide into parking spaces James Bond style.
The Audi self-driving car, known as Shelley, sped around the Thunderhill Raceway track north of Sacramento, topping 120 miles per hour on the straightaways. It completed the 3-mile course in less than two and a half minutes, comparable to times achieved by professional drivers.
It’s only a matter of time before robotic cars outperform their human-driven counterparts. But the Stanford team’s ultimate goal is not simply to create yet another robot that can best its human competitors. Extreme speeds on a racetrack can simulate perilous conditions a car might encounter on an ordinary road: the algorithm that helps a spinning wheel regain traction could be very similar to the one that straightens the car out after sliding on an icy surface. And as we all know, even everyday driving in fine weather can suddenly turn into a death-defying act of avoidance and control. Lessons learned on the racetrack could better prepare Shelley and her successors for those split-second reactions.
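The traction-recovery idea can be illustrated with a toy slip-ratio controller. This is a hypothetical sketch, not Stanford’s actual control law: the target slip value, the gain, and the interfaces are all invented for illustration. The core idea is that a tire grips best near a small slip ratio, so the controller backs off the throttle when measured slip exceeds that target.

```python
def slip_ratio(wheel_speed, vehicle_speed):
    """Longitudinal slip: 0 = pure rolling, 1 = wheel spinning freely."""
    if wheel_speed <= 0:
        return 0.0
    return max(0.0, (wheel_speed - vehicle_speed) / wheel_speed)

def throttle_command(wheel_speed, vehicle_speed, target_slip=0.1, gain=2.0):
    """Proportional controller: reduce throttle as slip exceeds the target
    (roughly the peak-friction slip ratio), so the tire regains traction."""
    error = slip_ratio(wheel_speed, vehicle_speed) - target_slip
    # Clamp to [0, 1]: full throttle when slip is at or below target.
    return min(1.0, max(0.0, 1.0 - gain * error))
```

The same structure works for the icy-road case mentioned above: only the measured speeds change, not the control logic.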
The road ahead for Shelley and Google’s self-driving car is still a long one, and the two teams seem to be taking different routes toward providing the world with a safe self-driving car. In the meantime, watch Shelley tear it up in this video.
June 23rd was the centenary of the birth of Alan Turing, father of computer science and artificial intelligence, who committed suicide just shy of 42. (King’s College, University of Cambridge).
View a short video bio here.
What is Neurorobotics?
Neurorobotics, the combined study of neuroscience, robotics, and artificial intelligence (AI), is the science and technology of embodied autonomous neural systems. In other words, to endow robots with human-like abilities to characterize and identify objects, they must be provided with tactile sensors and intelligent algorithms to select, control, and interpret data from useful exploratory movements.
Creating a robot that can feel.
Researchers at the University of Southern California’s Viterbi School of Engineering published a study in Frontiers in Neurorobotics showing that a specially designed robot can outperform humans in identifying a wide range of natural materials according to their textures, paving the way for advancements in prostheses, personal assistive robots and consumer product testing.
The robot was equipped with a new type of tactile sensor built to mimic the human fingertip. It also used a newly designed algorithm to make decisions about how to explore the outside world by imitating human strategies. The sensor captures other aspects of human touch as well: it can tell where and in which direction forces are applied to the fingertip, and even sense the thermal properties of an object being touched.
Like the human finger, the group’s BioTac® sensor has a soft, flexible skin over a liquid filling. The skin even has fingerprints on its surface, greatly enhancing its sensitivity to vibration. As the finger slides over a textured surface, the skin vibrates in characteristic ways. These vibrations are detected by a hydrophone inside the bone-like core of the finger.
The specialized robot was trained on 117 common materials gathered from fabric, stationery and hardware stores. When confronted with one material at random, the robot could correctly identify the material 95% of the time, after intelligently selecting and making an average of five exploratory movements. It was only rarely confused by a pair of similar textures that human subjects making their own exploratory movements could not distinguish at all.
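The idea of intelligently selecting exploratory movements can be sketched as a simple Bayesian loop: keep a belief over candidate materials, pick the movement whose predicted readings best separate the leading candidates, then update the belief from the measurement. This is a minimal stand-in, not the published system; the materials, the single vibration feature per (material, movement) pair, and the noise level below are all invented for illustration.

```python
import math

# Hypothetical training data: mean vibration feature per (material, movement).
MEANS = {
    "denim":     {"light_slide": 0.8, "hard_press": 0.3},
    "sandpaper": {"light_slide": 0.9, "hard_press": 0.9},
    "glass":     {"light_slide": 0.1, "hard_press": 0.3},
}
SIGMA = 0.1  # assumed measurement noise (standard deviation)

def likelihood(x, mean):
    """Unnormalized Gaussian likelihood of reading x given a material's mean."""
    return math.exp(-((x - mean) ** 2) / (2 * SIGMA ** 2))

def update(posterior, movement, observation):
    """Bayes' rule: reweight each material by how well it explains the reading."""
    new = {m: p * likelihood(observation, MEANS[m][movement])
           for m, p in posterior.items()}
    z = sum(new.values())
    return {m: p / z for m, p in new.items()}

def pick_movement(posterior):
    """Choose the movement whose predicted readings differ most between the two
    currently most-probable materials (a crude stand-in for information gain)."""
    top2 = sorted(posterior, key=posterior.get, reverse=True)[:2]
    return max(MEANS[top2[0]],
               key=lambda mv: abs(MEANS[top2[0]][mv] - MEANS[top2[1]][mv]))
```

Repeating pick-then-update for an average of five movements, as in the study, would sharpen the posterior until one material dominates.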
Vodafone xone™, the incubation center of Vodafone R&D (one of the world’s largest mobile communications companies by revenue), and Expertmaker, a leading artificial intelligence (AI) software company, announced the winners of the first AI Hackathon, held by the two companies over the weekend. A total of ten AI-integrated apps were created, and three winners were chosen:
First Place: Politicart, a mobile app that allows consumers to buy with their political convictions. Using several forms of AI technology such as text classification and recommendations, Politicart makes it simple to influence politics via your product purchases. Team members Olav Strawe, Shakila Hameed and Bill Tang found that building the app was straightforward and successful, commenting, “The Politicart team was able to analyze and structure several data sources with Expertmaker’s platform after just a few hours of trying the technology. This allowed us to build an intelligent application service to provide a great mobile user experience, all within the two days of the AI Hackathon.” They look forward to bringing the app to market in the near future.
Second Place: BrainyCraig, a mobile app with a dynamic learning component for discovering products on Craigslist, using common language to map concepts to products. Effectively, this app helps consumers locate relevant products without knowing the exact term the Craigslist seller is using. Team members were Roger Pincombe, Alec Green and Brantley Beaird. This exciting app will be available soon.
Third Place: Disaster Expert, a news app for urgent disaster updates and notifications that uses advanced filtering and deduplication of messages, allowing targeting by location and personal interest. The timely, high-precision results presented in this app help people learn exactly what they need to know in the chaotic time of a disaster. The team members, Niveditha Jayasekar, Shanthi Sivanesan, Sivakumar Ramanathan and Hari Kunamneni, plan to bring this app to market in the near future.
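Deduplication of near-identical alert messages, as Disaster Expert does, is commonly built on text shingling plus Jaccard similarity. This is a minimal sketch under that assumption, not the app’s actual method; the shingle size and similarity threshold are arbitrary choices for illustration.

```python
def shingles(text, k=3):
    """Set of k-word shingles from a lower-cased message."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(max(1, len(words) - k + 1))}

def jaccard(a, b):
    """Jaccard similarity of two shingle sets: |A ∩ B| / |A ∪ B|."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

def dedupe(messages, threshold=0.5):
    """Keep each message only if it is not a near-duplicate of one kept earlier."""
    kept, kept_shingles = [], []
    for msg in messages:
        s = shingles(msg)
        if all(jaccard(s, t) < threshold for t in kept_shingles):
            kept.append(msg)
            kept_shingles.append(s)
    return kept
```

For example, two wire reports of the same earthquake that differ by a word or two share most of their shingles and collapse to one entry, while a report about a different event passes through untouched.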
According to a recently published report by TechSci Research, “United States Smartphones Market Forecast & Opportunities, 2017,” the majority of smartphone sales in the United States are driven by the number of applications a phone offers and the operating system it runs. However, the buying pattern is changing: innovations in artificial intelligence and gestural-interface-based applications are playing a vital role in driving smartphone sales. Apple Inc. was the first company to launch artificial intelligence software, Siri, followed by Samsung and Nokia with their S Voice and Vlingo apps. The overall response to artificial intelligence has been overwhelming, and the report estimates that in the coming years, innovation in these applications will act as a major factor in deciding a smartphone vendor’s growth path.
A new force hits the beat: RoboCop.
The NR02 Patrol Robot Concept by Kim Jun Pyo gives a glimpse of what robotic law-enforcement tools might look like in the future.
The concept patrol robot unit for police departments, based on the UCAS (Unmanned Combat Air System), has an ominous look, and it features unpleasant (but non-lethal) tools like tear-gas shells and rubberized bullets to effectively suppress militant situations. The unmanned vehicle operates autonomously using artificial intelligence plus a combination of GPS and sonar, keeping law enforcement officials out of harm’s way.
On May 19th, in a park in Moscow, the battle for the title of Absolute World Robot Chess Champion was fought between CHESSka (also called the Chess Terminator) and the German-built KUKA Monster. CHESSka, the first chess robot to defeat grandmasters at quick chess, was created by Russian trainer and inventor Konstantin Kosteniuk, and it won the match.
Organizers plan to hold such events every year, creating a major league of chess-playing robots (ChessRoboLiga-1) for industrial robotics, similar to Formula 1 for automakers.
Artificial Intelligence - The world’s first eCycling kiosk: ecoATM
We have an average of 26 electronic devices in our homes, and new innovations only add to that number. More importantly, new features drive shorter and shorter upgrade cycles every year, tempting consumers to buy newer versions. What happens to all the older-model gadgets? Take them to an ecoATM for eCycling. Fitted with AI software, the kiosk will evaluate your old gadget and give you cash for it based on its condition. ecoATM could dramatically alter the current life cycle of consumer electronics, and in recognition of that, this innovation has won multiple awards. Beautiful!
Last week we witnessed a robotic arm being moved with thought, enabling a 58-year-old woman, paralyzed by a stroke for almost 15 years, to grasp a bottle of coffee, serve herself a drink, and return the bottle to the table.
The Global Future 2045 Congress predicts fully functioning thought-controlled avatars in less than 8 years, and even flying cars.
In February of 2012 the first Global Future 2045 Congress was held in Moscow. There, over 50 world leading scientists from multiple disciplines met to develop a strategy for the future development of humankind. One of the main goals of the Congress was to construct a global network of scientists to further research on the development of cybernetic technology, with the ultimate goal of transferring a human’s individual consciousness to an artificial carrier.
The three-day event concluded with the finalization of a resolution that will be submitted to the United Nations demanding the implementation of committees to discuss life extension Avatar projects as a necessary tool in the preservation of humankind, as well as defining ethical parameters for scientists worldwide.
The next GF2045 International Congress will be held in June 2013, in New York City.
Here is a timeline published by the Global Future 2045 Congress.
2012-2013. The global economic and social crises are exacerbated. The debates on the global paradigm of future development intensify.
2013-2014. New centers working on cybernetic technologies for the development of radical life extension arise. The ‘race for immortality’ starts.
2015-2020. The Avatar is created — A robotic human copy controlled by thought via ‘brain-computer’ interface. It becomes as popular as a car.
2020. Several breakthrough projects appear, in testing mode, in Russia and around the world:
Android robots replace people in manufacturing tasks; android robot servants for every home; thought-controlled Avatars provide telepresence anywhere in the world and abolish the need for business trips; flying cars; thought-driven mobile communications built into the body or sprayed onto the skin.
2020-2025. An autonomous system providing life support for the brain and allowing it to interact with the environment is created. The brain is transplanted into an Avatar B. With Avatar B, man receives a new, expanded life.
2025. The new generation of Avatars provides complete transmission of sensations from all five sensory robot organs to the operator.
2030-2035. ReBrain — The colossal project of brain reverse engineering is implemented. World science comes very close to understanding the principles of consciousness.
2035. The first successful attempt to transfer one’s personality to an alternative carrier. The epoch of cybernetic immortality begins.
2040-2050. Bodies made of nanorobots that can take any shape arise alongside hologram bodies.
2045-2050. Drastic changes in social structure, and in scientific and technological development. All the prerequisites for space expansion are established. For the man of the future, war and violence are unacceptable. The main priority of human development is spiritual self-improvement.