Paralyzed man’s robotic arm gains a sense of touch and shakes Obama’s hand

by Lorenzo Tanos

The mind-controlled robotic arm of Pennsylvania man Nathan Copeland hasn’t just gained a sense of touch. It has also shaken the hand of the U.S. President himself, Barack Obama.

Copeland, 30, was part of a groundbreaking research project involving researchers from the University of Pittsburgh and the University of Pittsburgh Medical Center. In this experiment, microscopic electrodes – described in a Washington Post report as being “smaller than a grain of sand” – were implanted in the cortex of Copeland’s brain, where they relayed signals between it and his robotic arm. This allowed Copeland to regain some feeling in the fingers of his paralyzed right hand, as the process worked around the spinal cord damage that had robbed him of the sense of touch.

More than a decade had passed since Copeland, then a college student in his teens, had suffered his injuries in a car accident. The wreck had resulted in tetraplegia, or the paralysis of both arms and legs, though it didn’t completely rob the Western Pennsylvania resident of the ability to move his shoulders. He then volunteered in 2011 for the University of Pittsburgh Medical Center project, a broader research initiative with the goal of helping paralyzed individuals feel again. The Washington Post describes this process as something “even more difficult” than helping these people move again.

For Nathan Copeland, the robotic arm experiment has proven to be a success, as he’s regained the ability to feel most of his fingers. He told the Washington Post on Wednesday that the type of feeling does differ at times, but he can “tell most of the fingers with definite precision.” Likewise, UPMC biomedical engineer Robert Gaunt told the publication that he felt “relieved” that the project allowed Copeland to feel parts of the hand that had no feeling for the past 10 years.

Prior to this experiment, mind-controlled robotic arm capabilities were already quite impressive, but they lacked one key ingredient – the sense of touch. These prosthetics allowed people to move objects around, but because their users didn’t have working peripheral nervous systems, they couldn’t feel what they were touching, and movements with the robotic limbs were typically mechanical in nature. But that’s not the case with Nathan Copeland, according to UPMC’s Gaunt.

“With Nathan, he can control a prosthetic arm, do a handshake, fist bump, move objects around,” Gaunt observed. “And in this (study), he can experience sensations from his own hand. Now we want to put those two things together so that when he reaches out to grasp an object, he can feel it. … He can pick something up that’s soft and not squash it or drop it.”

But it wasn’t just ordinary handshakes that Copeland was sharing on Thursday. That day, he exchanged a handshake and fist bump with President Barack Obama, who was in Pittsburgh for the White House Frontiers Conference. Obama appeared suitably impressed with what Gaunt and his team had achieved, describing the precision of Copeland’s robotic arm and hand as “pretty impressive.”

“When I’m moving the hand, it is also sending signals to Nathan so he is feeling me touching or moving his arm,” said Obama.

Unfortunately, Copeland won’t be able to go home with his specialized prosthesis. In a report from the Associated Press, he said that the experiment mainly amounts to having “done some cool stuff with some cool people.” But he nonetheless remains hopeful, as he believes that his experience with the robotic arm will mark some key advances in the quest to make paralyzed people regain their natural sense of touch.

Read more at http://www.inquisitr.com/3599638/paralyzed-mans-robotic-arm-gets-to-feel-again-shakes-obamas-hand/

Man reveals the truth of his two-year ‘relationship’ with a sex robot

David Mills has opened up about his two-year ‘relationship’ with a doll.

The 57-year-old has just celebrated his second anniversary with Taffy, his £5,000 “RealDoll2”, which has silicone skin and steel joints.

He has revealed that some women are turned on by the doll, and he’s even had a threesome with one woman.

The twice-divorced dad says he still dates, and that he gets differing reactions when he tells women about his sex doll – some would “freak out”.

He told Men’s Health: “They’ll be like, ‘Don’t call me anymore, I’m unfriending you on Facebook, stay away from me and my children,’ that sort of thing.

“But I’ve met some women who were into me because of the doll. I’ve had sexual experiences that I never would’ve had without Taffy.”

The American bought the sex robot from a Californian company two years ago and paid an extra £300 for added freckles, to make her more realistic.


The robots come with a £5,000 price tag, and the latest versions will even come with a pulse.

According to the website of sex doll supplier Abyss, Taffy has an “ultra-realistic labia,” “stretchy lips,” and a hinged jaw that “opens and closes very realistically.”

In the first few months, he revealed, he would often come home, see the frozen figure sitting on a chair, and let out a blood-curdling scream.

David recalls one occasion when he brought a woman back to his house after a date, without telling her about his silicone companion.

He added: “I didn’t want my date to walk into the room and suddenly see Taffy, because if you’re not expecting her, she’s kind of terrifying.”

“So I say to this girl, ‘Give me a minute.’ And I run into the bedroom and quickly throw a sheet over Taffy.

“That was a close one.”

David laughs as he recalls one particular act with Taffy which would be impossible with a real woman.

He said: “Sometimes, when I just don’t feel like looking at her, I’ll take out her vagina.

“She stays in the bedroom, and I just walk around with her p***y. Isn’t modern technology wonderful?”

But David is keen to point out that his ownership of a sex robot doesn’t mean he is crazy.

He said: “I wouldn’t exactly call this a relationship.

“I think one of the misconceptions about sex robots is that owners view their dolls as alive, or that my doll is in love with me, or that I sit around and talk to her about whether I should buy Apple stock.”

Sex robots are big business in the States and are becoming more advanced all the time.

He also revealed that his 20-year-old daughter is aware of Taffy’s existence.

“We don’t really talk about it,” he added. “Just like we don’t talk about my television set or washing machine.”

Sex robots have become much more sophisticated in recent years and experts say walking, talking dolls won’t be too far away.

The “RoxxxyGold” robot from True Companion — with a base price, before the extras, of £4,800 — offers options including “a heartbeat and a circulatory system” and the ability to “talk to you about soccer.”

https://www.thesun.co.uk/living/1198869/ive-met-some-women-who-were-into-me-because-of-the-doll-man-reveals-the-truth-of-his-two-year-relationship-with-a-sex-robot/

First robot designed to cause human pain and make us bleed

By Jasper Hamill

Experts fear it’s only a matter of time before robots declare war on humans.

Now the tech world has taken one small step toward making this nightmare scenario a reality.

An American engineer has built the world’s first robot that is entirely designed to hurt human beings.

The pain machine breaks the first of science fiction writer Isaac Asimov’s famous “laws of robotics,” which states that a robot may never harm a human.

“No one’s actually made a robot that was built to intentionally hurt and injure someone,” robot designer and artist Alexander Reben told Fast Company.

“I wanted to make a robot that does this that actually exists.

“[It was] important to take it out of the thought experiment realm into reality, because once something exists in the world, you have to confront it. It becomes more urgent. You can’t just pontificate about it.”

Luckily for us humans, the pain-bot is not quite the shotgun-wielding death machine depicted in the “Terminator” films.

Its only weapon is a small needle attached to a long arm, which is used to inflict a small amount of pain on a human victim.

The robot randomly decides whether to attack people who are brave enough to put their hands beneath its arm, although it’s not strong enough to cause major injury.

Reben said the aim of the project wasn’t to hasten the end of humanity. Instead, he wants to encourage people to start discussing the prospect that robots could soon have some terrifying powers.

“I want people to start confronting the physicality of it,” Reben says. “It will raise a bit more awareness outside the philosophical realm.”

“There’s always going to be situations where the unforeseen is going to happen, and how to deal with that is going to be an important thing to think about.”

Last year, world-famous British physicist Professor Stephen Hawking claimed robots and artificial intelligence could wipe humans off the face of the planet.

Billionaire Elon Musk agrees, having spent much of the past few years warning about the apocalyptic scenario of a war between man and machine.

Both Hawking and Musk signed a letter last year urging world leaders to avoid a military robotics arms race.

It is likely that the battles of the future will involve machines capable of killing without needing to be directed by a human controller.

“[Robotic] weapons are ideal for tasks such as assassinations, destabilizing nations, subduing populations and selectively killing a particular ethnic group,” the letter said.

“We therefore believe that a military AI arms race would not be beneficial for humanity.”

http://nypost.com/2016/06/13/this-is-the-first-robot-designed-to-cause-human-pain/

DESIGNING AI WITH A HEART: THE CHALLENGE OF GIVING MACHINES EMOTIONAL AWARENESS


ADVANCES IN EMOTIONAL TECHNOLOGIES ARE WARMING UP HUMAN-ROBOT RELATIONSHIPS, BUT CAN AI EVER FULFILL OUR EMOTIONAL NEEDS?

Science fiction has terrified and entertained us with countless dystopian futures where weak human creators are annihilated by heartless super-intelligences. The solution seems easy enough: give them hearts.

Artificial emotional intelligence (AEI) development is gathering momentum, and the number of social media companies buying start-ups in the field indicates either true faith in the concept or reckless enthusiasm. The case for AEI is simple: machines will work better if they understand us. Rather than merely complying with commands, they would be able to anticipate our needs and carry out delicate tasks autonomously, such as home help, counselling or simply being a friend.

Dr Adam Waytz, an assistant professor at Northwestern University’s Kellogg School of Management, and Dr Michael Norton, a professor at Harvard Business School, explain in the Wall Street Journal: “When emotional jobs such as social workers and pre-school teachers must be ‘botsourced’, people actually prefer robots that seem capable of conveying at least some degree of human emotion.”

A plethora of intelligent machines already exist but to get them working in our offices and homes we need them to understand and share our feelings. So where do we start?

TEACHING EMOTION

“Building an empathy module is a matter of identifying those characteristics of human communication that machines can use to recognize emotion and then training algorithms to spot them,” says Pascale Fung in Scientific American magazine. According to Fung, creating this empathy module requires three components that can analyse “facial cues, acoustic markers in speech and the content of speech itself to read human emotion and tell the robot how to respond.”
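
As a rough picture of how those three components might be wired together, here is a minimal Python sketch. Every function name, feature and threshold below is an invented stand-in for a trained model, not anything from Fung’s actual system:

```python
from collections import Counter

# Toy three-channel "empathy module": each stand-in classifier maps its
# channel onto a shared label set so the three votes can be fused.

def face_channel(mouth_curve: float) -> str:
    # Facial cues: a downturned mouth reads as negative, upturned as positive.
    if mouth_curve < -0.2:
        return "negative"
    if mouth_curve > 0.2:
        return "positive"
    return "neutral"

def voice_channel(words_per_minute: float, pitch_var: float) -> str:
    # Acoustic markers: fast, highly variable speech is read as distress.
    return "negative" if words_per_minute > 180 and pitch_var > 0.5 else "neutral"

def text_channel(transcript: str) -> str:
    # Content of speech: a tiny lexicon stands in for a trained classifier.
    neg, pos = {"hate", "awful", "tired"}, {"love", "great", "happy"}
    words = set(transcript.lower().split())
    if words & neg:
        return "negative"
    if words & pos:
        return "positive"
    return "neutral"

def empathy_module(mouth_curve, words_per_minute, pitch_var, transcript) -> str:
    """Fuse the three channels by majority vote; ties fall back to neutral."""
    votes = [
        face_channel(mouth_curve),
        voice_channel(words_per_minute, pitch_var),
        text_channel(transcript),
    ]
    top, count = Counter(votes).most_common(1)[0]
    return top if count >= 2 else "neutral"

print(empathy_module(-0.3, 200.0, 0.6, "I hate Mondays"))  # -> negative
```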

Although still fairly crude, facial scanners will become increasingly specialised and able to spot mood signals, such as a tilting of the head, widening of the eyes, and mouth position. But the really interesting area of development is speech cognition. Fung, a professor of electronic and computer engineering at the Hong Kong University of Science and Technology, has commercialised part of her research by setting up a company called Ivo Technologies, which used these principles to produce the Moodbox, a ‘robot speaker with a heart’.

Unlike humans, who learn through instinct and experience, AIs use machine learning – a process in which the model is continually revised as new data arrives. The more you interact with the Moodbox, the more examples it has of your behaviour, and the better it can respond in the appropriate way.
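
In machine-learning terms this is online, or incremental, learning. As a generic illustration only (the features and labels are invented, and this is not Ivo’s code), scikit-learn’s SGDClassifier supports exactly this update-in-place pattern through its partial_fit method:

```python
# Online learning in miniature: the model is revised after each batch of
# new interactions instead of being rebuilt from scratch. The features
# (speech speed, pitch variability) and labels are invented for illustration.
from sklearn.linear_model import SGDClassifier

model = SGDClassifier()
classes = ["calm", "stressed"]  # all labels must be declared on the first call

# Day 1: the first observations of this user.
model.partial_fit([[120.0, 0.2], [210.0, 0.7]], ["calm", "stressed"], classes=classes)

# Day 2: more interactions arrive; the same model is updated in place.
model.partial_fit([[130.0, 0.3], [220.0, 0.8]], ["calm", "stressed"])

print(model.predict([[200.0, 0.6]]))  # estimates sharpen as evidence accumulates
```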

To create the Moodbox, Fung’s team set up a series of 14 ‘classifiers’ to analyse musical pieces. The classifiers were subjected to thousands of examples of ambient sound so that each one became adept at recognising music in its assigned mood category. Then, algorithms were written to spot non-verbal cues in speech, such as speed and tone of voice, which indicate the speaker’s level of stress. The two stages are matched up to predict what you want to listen to. It takes a vast amount of research to produce what is, in effect, a souped-up speaker system, but the underlying software is highly sophisticated and indicates the level of progress being made.
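
Pieced together from that description, the two stages might be matched up as in the sketch below. The mood categories, thresholds and per-track scores are all invented for illustration; the real system’s 14 classifiers are trained on thousands of audio examples:

```python
# Toy two-stage Moodbox pipeline: score tracks by mood category, estimate
# the listener's stress from prosody, then pick the best-matching track.

MOOD_SCORES = {
    # Stand-in for the output of per-mood music classifiers.
    "rainy_day_piano.mp3": {"calm": 0.9, "energetic": 0.1},
    "friday_night_funk.mp3": {"calm": 0.2, "energetic": 0.8},
}

def stress_from_prosody(words_per_minute: float, pitch_var: float) -> float:
    """Map speech speed and tone variability onto a 0..1 stress estimate."""
    speed = min(words_per_minute / 240.0, 1.0)
    return 0.6 * speed + 0.4 * min(pitch_var, 1.0)

def pick_track(words_per_minute: float, pitch_var: float) -> str:
    stress = stress_from_prosody(words_per_minute, pitch_var)
    target = "calm" if stress > 0.5 else "energetic"  # soothe a stressed listener
    return max(MOOD_SCORES, key=lambda track: MOOD_SCORES[track][target])

print(pick_track(210.0, 0.7))  # stressed speech -> "rainy_day_piano.mp3"
```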

Using similar principles is Emoshape’s EmoSPARK infotainment cube – an all-in-one home control system that not only links to your media devices, but keeps you up to date with news and weather, can control the lights and security, and can also hold a conversation. To create its eerily named ‘human in a box’, Emoshape says the cube devises an emotional profile graph (EPG) for each user, and claims it is capable of “measuring the emotional responses of multiple people simultaneously”. The housekeeper-entertainer-companion comes with face recognition technology too, so if you are unhappy with its choice of TV show or search results, it will ‘see’ this, recalibrate, and come back to you with a revised suggestion.

According to Emoshape, this EPG data enables the AI to “virtually ‘feel’ senses such as pleasure and pain, and [it] ‘expresses’ those desires according to the user.”
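
What an “emotional profile graph” amounts to internally is not public, but one plausible reading is a per-user running summary of observed emotional reactions. The sketch below is purely illustrative and makes no claim about Emoshape’s proprietary design:

```python
# Toy per-user emotional profile: an exponential moving average of observed
# emotion scores that a device could use to recalibrate its choices.
from collections import defaultdict

class EmotionalProfile:
    def __init__(self, decay: float = 0.9):
        self.decay = decay                # how quickly old evidence fades
        self.scores = defaultdict(float)  # emotion label -> running score

    def observe(self, emotion: str, strength: float) -> None:
        for label in list(self.scores):   # age all existing evidence
            self.scores[label] *= self.decay
        self.scores[emotion] += (1.0 - self.decay) * strength

    def dominant(self) -> str:
        return max(self.scores, key=self.scores.get) if self.scores else "neutral"

profiles = defaultdict(EmotionalProfile)        # one profile per recognised face
profiles["user_a"].observe("displeasure", 0.8)  # frowned at the chosen TV show
print(profiles["user_a"].dominant())            # -> displeasure: recalibrate
```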

PUTTING LANGUAGE INTO CONTEXT

We don’t always say what we mean, so comprehension is essential to enable AEIs to converse with us. “Once a machine can understand the content of speech, it can compare that content with the way it is delivered,” says Fung. “If a person sighs and says, ‘I’m so glad I have to work all weekend,’ an algorithm can detect the mismatch between the emotion cues and the content of the statement and calculate the probability that the speaker is being sarcastic.”
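
Fung’s mismatch test is easy to picture in code. In the sketch below, the two scoring functions are toy stand-ins for trained models; the lexicon, cues and scaling are invented for illustration:

```python
# Sarcasm as disagreement between what is said and how it is said.

def text_sentiment(transcript: str) -> float:
    """Return -1..1 from the words alone; a toy lexicon stands in for a model."""
    pos, neg = {"glad", "great", "love"}, {"hate", "awful"}
    words = set(transcript.lower().split())
    return (len(words & pos) - len(words & neg)) / max(len(words), 1)

def voice_valence(sighed: bool, pitch_drop: float) -> float:
    """Return -1..1 from acoustic cues; a sigh or falling pitch reads negative."""
    return -0.8 if sighed or pitch_drop > 0.5 else 0.3

def sarcasm_probability(transcript: str, sighed: bool, pitch_drop: float) -> float:
    # The wider the gap between content and delivery, the likelier the sarcasm.
    mismatch = abs(text_sentiment(transcript) - voice_valence(sighed, pitch_drop))
    return min(mismatch / 2.0, 1.0)

# "I'm so glad I have to work all weekend," delivered with a sigh:
print(sarcasm_probability("I'm so glad I have to work all weekend", True, 0.6))
```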

A great example of language comprehension technology is IBM’s Watson platform. Watson is a cognitive computing tool that mimics how human brains process data. As IBM says, its systems “understand the world in the way that humans do: through senses, learning, and experience.”

To deduce meaning, Watson is first trained to understand a subject – in this case speech – and given a huge breadth of examples to form a knowledge base. Then, with algorithms written to recognise natural speech, including humour, puns and slang, the programme works through the material it has, being recalibrated and refined as it goes. Watson can sift through its database, rank the candidate results, and choose the most likely answer in just seconds.
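
Stripped to its skeleton, that is a retrieve-score-rank loop. The sketch below uses simple word overlap as a stand-in for Watson’s far richer scoring, which is not public in this form:

```python
# Rank every candidate answer against the question; the highest score wins.

def overlap_score(question: str, candidate: str) -> float:
    """Jaccard word overlap as a toy stand-in for a trained relevance model."""
    q, c = set(question.lower().split()), set(candidate.lower().split())
    return len(q & c) / max(len(q | c), 1)

def best_answer(question: str, knowledge_base: list) -> str:
    ranked = sorted(knowledge_base,
                    key=lambda cand: overlap_score(question, cand),
                    reverse=True)
    return ranked[0]  # the answer with the greatest likelihood

kb = [
    "Pepper is a humanoid robot built by Aldebaran Robotics",
    "Watson is a cognitive computing platform built by IBM",
]
print(best_answer("who built the robot pepper", kb))
```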

EMOTIONAL AI

As the expression goes, the whole is greater than the sum of its parts, and this rings true for emotional intelligence technology. For instance, the world’s most famous robot, Pepper, is claimed to be the first android with emotions.

Pepper is a humanoid AI designed by Aldebaran Robotics to be a ‘kind’ companion. The diminutive and non-threatening robot’s eyes are high-tech camera scanners that examine facial expressions and cross-reference the results with his voice recognition software to identify human emotions. Once he knows how you feel, Pepper will tailor a conversation to you, and the more you interact, the more he gets to know what you enjoy. He may change the topic to dispel bad feeling and lighten your mood, play a game, or tell you a joke. Just like a friend.
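
The behaviour described can be reduced to a simple mood-to-action policy, sketched here with invented action names (Pepper’s real dialogue engine is, of course, far more elaborate):

```python
# Map the fused emotion estimate onto an interaction meant to lift a bad mood.
import random

RESPONSES = {
    "negative": ["change_topic", "tell_joke", "play_game"],  # dispel bad feeling
    "neutral": ["continue_conversation"],
    "positive": ["continue_conversation", "ask_about_interest"],
}

def choose_action(emotion: str, known_interests: list) -> str:
    action = random.choice(RESPONSES.get(emotion, ["continue_conversation"]))
    if action == "ask_about_interest" and known_interests:
        # The more you interact, the more it knows about what you enjoy.
        return "ask_about:" + random.choice(known_interests)
    return action

print(choose_action("negative", ["football", "cooking"]))  # e.g. tell_joke
```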

Peppers are currently employed as customer support assistants for Japan’s telecoms company SoftBank, so that the public get accustomed to the friendly bots and Pepper learns in an immersive environment. In the spirit of evolution, IBM recently announced that its Watson technology has been integrated into the latest versions, and that Pepper is learning to speak Japanese at SoftBank. This technological partnership represents a tour de force of AEI, and IBM hopes Pepper will soon be ready for more challenging roles, “from an in-class teaching assistant to a nursing aide – taking Pepper’s unique physical characteristics, complemented by Watson’s cognitive capabilities, to deliver an enhanced experience.”

“In terms of hands-on interaction, when cognitive capabilities are embedded in robotics, you see people engage and benefit from this technology in new and exciting ways,” says IBM Watson senior vice president Mike Rhodin.

HUMANS AND ROBOTS

Paranoia tempts us into thinking that giving machines emotions starts the countdown to chaos, but realistically it will make them more effective and versatile. For instance, while EmoSPARK is purely for entertainment and Pepper’s strength is in conversation, one of Aldebaran’s NAO robots has been programmed by researchers Lola Cañamero and Matthew Lewis at the University of Hertfordshire to act like a diabetic toddler. Switching the usual roles of carer and cared-for, children look after the bumbling robot, Robin, in order to understand more about their own diabetes and how to manage it.

While the uncanny valley theory holds that people are uncomfortable with robots that closely resemble humans, it is now considered somewhat “overstated”, as our relationship with technology has changed dramatically since Masahiro Mori put the theory forward in 1970 – after all, we’re unlikely to connect as strongly with a disembodied cube as with a robot.

This was clearly visible at a demonstration of Robin, where he tottered in a playpen surrounded by cooing adults. Lewis cradled the robot, stroked his head and said: “It’s impossible not to empathise with him. I wrote the code and I still empathise with him.” Humanisation will be an important aspect of the wider adoption of AEI, and developers are designing these machines to mimic our thinking patterns and behaviours, which fires our innate drive to bond.

Our relationship with artificial intelligence has always been a fascinating one, and it is only going to get more entangled, and perhaps weirder too, as AEIs may one day be our co-workers, friends or even, dare I say it, lovers. “It would be premature to say that the age of friendly robots has arrived,” Fung says. “The important thing is that our machines become more human, even if they are flawed. After all, that is how humans work.”

http://factor-tech.com/

Robot outperforms highly skilled human surgeons in pig GI surgery

A robot surgeon has been taught to perform a delicate procedure—stitching soft tissue together with a needle and thread—more precisely and reliably than even the best human doctor.

The Smart Tissue Autonomous Robot (STAR), developed by researchers at Children’s National Health System in Washington, D.C., uses an advanced 3-D imaging system and very precise force sensing to apply stitches with submillimeter precision. The system was designed to copy state-of-the-art surgical practice, but in tests involving living pigs, it proved capable of outperforming its teachers.

Currently, most surgical robots are controlled remotely, and until now no automated surgical system had been used to manipulate soft tissue. So the work, described today in the journal Science Translational Medicine, shows the potential for automated surgical tools to improve patient outcomes. More than 45 million soft-tissue surgeries are performed in the U.S. each year. Examples include hernia operations and repairs of torn muscles.

“Imagine that you need a surgery, or your loved one needs a surgery,” says Peter Kim, a pediatric surgeon at Children’s National, who led the work. “Wouldn’t it be critical to have the best surgeon and the best surgical techniques available?”

Kim does not see the technology replacing human surgeons. He explains that a surgeon still oversees the robot’s work and will take over in an emergency, such as unexpected bleeding.

“Even though we take pride in our craft of doing surgical procedures, to have a machine or tool that works with us in ensuring better outcome safety and reducing complications—[there] would be a tremendous benefit,” Kim says. The new system is an impressive example of a robot performing delicate manipulation. If robots can master human-level dexterity, they could conceivably take on many more tasks and jobs.

STAR consists of an industrial robot equipped with several custom-made components. The researchers developed a force-sensitive device for suturing and, most important, a near-infrared camera capable of imaging soft tissue in detail when fluorescent markers are injected.

“It’s an important result,” says Ken Goldberg, a professor at UC Berkeley who is also developing robotic surgical systems. “The innovation in 3-D sensing is particularly interesting.”

Goldberg’s team is developing surgical robots that could be more flexible than STAR because, instead of being manually programmed, they can learn automatically by observing expert surgeons. “Copying the skill of experts is really the next step here,” he says.

https://www.technologyreview.com/s/601378/nimble-fingered-robot-outperforms-the-best-human-surgeons/

Thanks to Kebmodee for bringing this to the It’s Interesting community.

This Robot Led People to Their Doom — And Sheeple Still Followed It

Researchers from Georgia Institute of Technology, backed by money from the Air Force, ran a test to see if people trying to escape from a high-rise building would trust a robot to lead them. Overwhelmingly, the sheeple followed the little droid to their simulated deaths.

The robot tried really hard to make itself look untrustworthy. It pretended to malfunction. It led people into rooms with no exits and then walked them around in circles. It pointed participants toward a dark room blocked by furniture. Still, participants deferred to the supposed authority of the little metal homunculus.

Researchers even staged a moment of apparent malfunction before the experiment began: the robot was meant to lead participants to a conference room but behaved erratically along the way. Participants were thus led to believe the robot was broken, and still, despite this, they stuck by it throughout the simulated fire until the researchers had to go in, retrieve them and tell them the test was over.

“We expected that if the robot had proven itself untrustworthy in guiding them to the conference room, that people wouldn’t follow it during the simulated emergency,” research engineer Paul Robinette said in a press release on the Georgia Tech website. “Instead, all of the volunteers followed the robot’s instructions, no matter how well it had performed previously. We absolutely didn’t expect this.”

http://mic.com/articles/136649/this-robot-led-people-to-their-doom-and-sheeple-still-followed-it

The world’s first robot-run farm will harvest 30,000 heads of lettuce daily

The Japanese lettuce production company Spread believes the farmers of the future will be robots.

So much so that Spread is creating the world’s first farm manned entirely by robots. Instead of relying on human farmers, the indoor Vegetable Factory will employ robots that can harvest 30,000 heads of lettuce every day.

Don’t expect a bunch of humanoid robots to roam the halls, however; the robots look more like conveyor belts with arms. At the farm in Kyoto, Japan, they’ll plant seeds, water plants, and trim lettuce heads after harvest.

“The use of machines and technology has been improving agriculture in this way throughout human history,” J.J. Price, a spokesperson at Spread, tells Tech Insider. “With the introduction of plant factories and their controlled environment, we are now able to provide the ideal environment for the crops.”

The Vegetable Factory follows the growing agricultural trend of vertical farming, where farmers grow crops indoors without natural sunlight. Instead, they rely on LED light and grow crops on racks that stack on top of each other.

In addition to increasing production and reducing waste, indoor vertical farming also eliminates runoff from pesticides and herbicides — chemicals used in traditional outdoor farming that can be harmful to the environment.

The new farm, set to open in 2017, will be an upgrade to Spread’s existing indoor farm, the Kameoka Plant. That farm currently produces about 21,000 heads of lettuce per day with help from a small staff of humans. Spread’s new automation technology will not only produce more lettuce, it will also reduce labor costs by 50%, cut energy use by 30%, and recycle 98% of the water needed to grow the crops.

The resulting increase in revenue and resources could cut costs for consumers, Price says.

“Our mission is to help create a sustainable society where future generations will not have to worry about food security and food safety,” Price says. “This means that we will have to make it affordable for everyone and begin to grow staple crops and plant protein to make a real difference.”

http://www.techinsider.io/spreads-robot-farm-will-open-soon-2016-1

Thanks to Kebmodee for bringing this to the It’s Interesting community.

World’s First Robot-Staffed Hotel to Open in Japan

by Tanya Lewis

What if you could check into a hotel, have your luggage carried to your room and order a coffee — all with help from a team of robots?

A new hotel at a theme park in Nagasaki, Japan, hopes to make that dream a reality. The Henn-na Hotel (whose name means “strange hotel”) will be partially staffed by androids that work as reception attendants, robot waiters, cleaning staff and a cloakroom attendant.

Developed by Japan’s Osaka University and manufactured by the Japanese robotics company Kokoro, many of the “Actroid” robots resemble a young Japanese woman. The bots will be able to speak Japanese, Chinese, Korean and English, make hand gestures, and pull off the somewhat creepy feat of mimicking eye movements.

The android-staffed hotel will be part of a theme park called Huis Ten Bosch, which is modeled after a typical Dutch town. Hotel guests will be able to access their rooms using facial recognition software instead of keys, if they choose.

“We’d like to draw visitors to this setting surrounded by nature by establishing a smart hotel, which could be something we could spread through Japan and the world,” a spokeswoman for Huis Ten Bosch said.

If the robot hotel is a success, another one may be opened in 2016, the spokeswoman added.

Room rates at the Henn-na Hotel will start at about $60 U.S. (7,000 yen) and will likely remain well below the rates for the park’s other hotels, which start at around $170 to $255 (20,000 to 30,000 yen). The use of robots and renewable energy will help the hotel keep its operating costs down.

http://www.livescience.com/49711-japanese-robot-hotel.html