Google’s new app blunders by calling black people ‘gorillas’


Google’s new image-recognition program misfired badly this week by identifying two black people as gorillas, delivering a mortifying reminder that even the most intelligent machines still have a lot to learn about human sensitivity.

The blunder surfaced in a smartphone screen shot posted online Sunday by a New York man on his Twitter account, @jackyalcine. The images showed the recently released Google Photos app had sorted a picture of two black people into a category labeled as “gorillas.”

The accountholder used a profanity while expressing his dismay about the app likening his friend to an ape, a comparison widely regarded as a racial slur when applied to a black person.

“We’re appalled and genuinely sorry that this happened,” Google spokeswoman Katie Watson said. “We are taking immediate action to prevent this type of result from appearing.”

A tweet to @jackyalcine requesting an interview hadn’t received a response several hours after it was sent Thursday.

Despite Google’s apology, the gaffe threatens to cast the Internet company in an unflattering light at a time when it and its Silicon Valley peers have already been fending off accusations of discriminatory hiring practices. Those perceptions have been fed by the composition of most technology companies’ workforces, which mostly consist of whites and Asians with a paltry few blacks and Hispanics sprinkled in.

The mix-up also surfaced amid rising U.S. racial tensions that have been fueled by recent police killings of blacks and last month’s murder of nine black churchgoers in Charleston, South Carolina.

Google’s error underscores the pitfalls of relying on machines to handle tedious tasks that people have typically handled in the past. In this case, the Google Photos app, released in late May, uses recognition software to analyze the content of pictures and sort them into a variety of categories, including places, names, activities and animals.

When the app came out, Google executives warned it probably wouldn’t get everything right — a point that has now been hammered home. Besides mistaking humans for gorillas, the app also has been mocked for labeling some people as seals and some dogs as horses.

“There is still clearly a lot of work to do with automatic image labeling,” Watson conceded.
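One plausible stopgap, consistent with Google’s promise to “prevent this type of result from appearing,” is to post-filter a classifier’s raw predictions against a blocklist of sensitive labels. The sketch below, including the label set and data format, is an illustrative assumption, not Google’s actual implementation:

```python
# Hypothetical post-hoc filter: suppress sensitive labels from an
# image classifier's raw output. The blocklist and the
# (label, confidence) format are illustrative assumptions.

SENSITIVE_LABELS = {"gorilla", "ape", "monkey"}  # assumed blocklist

def filter_labels(predictions):
    """predictions: list of (label, confidence) pairs from a classifier."""
    return [
        (label, conf)
        for label, conf in predictions
        if label.lower() not in SENSITIVE_LABELS
    ]

raw = [("gorilla", 0.91), ("person", 0.88), ("outdoors", 0.75)]
print(filter_labels(raw))  # the sensitive label is dropped
```

A blocklist does not fix the underlying model; it only hides one symptom, which is why the labeling problem itself remains “a lot of work to do.”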

Some commentators in social media, though, wondered if the flaws in Google’s automatic-recognition software may have stemmed from its reliance on white and Asian engineers who might not be sensitive to labels that would offend black people. About 94 percent of Google’s technology workers are white or Asian and just 1 percent are black, according to the company’s latest diversity disclosures.

Google isn’t the only company still trying to work out the bugs in its image-recognition technology.

Shortly after Yahoo’s Flickr introduced an automated service for tagging photos in May, it fielded complaints that the service had identified black people as “apes” and “animals.” Flickr also mistakenly identified a Nazi concentration camp as a “jungle gym.”

Google reacted swiftly to the mess created by its machines, long before the media began writing about it.

Less than two hours after @jackyalcine posted his outrage over the gorilla label, one of Google’s top engineers had posted a response seeking access to his account to determine what went wrong. Yonatan Zunger, chief architect of Google’s social products, later tweeted: “Sheesh. High on my list of bugs you never want to see happen. Shudder.”

http://bigstory.ap.org/urn:publicid:ap.org:b31f3b75b35a4797bb5db3a987a62eb2

Comedy club uses facial recognition to charge by the laugh


One Barcelona comedy club is experimenting with using facial recognition technology to charge patrons by the laugh.

The comedy club, Teatreneu, partnered with the advertising firm The Cyranos McCann to implement the new technology after the government hiked taxes on theater tickets, according to a BBC report. In 2012, the Spanish government raised taxes on theatrical shows from 8 to 21 percent.

Cyranos McCann installed tablets on the back of each seat that used facial recognition tech to measure how much a person enjoyed the show by tracking when each patron laughed or smiled.

Each laugh costs approximately 30 euro cents (about $0.38). However, once a patron hits the 24-euro mark, roughly 80 laughs, the rest of their laughs are free of charge.
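The pricing rule described above is simple enough to sketch in a few lines. A minimal illustration; the per-laugh price and cap come from the article, everything else is assumed:

```python
# Per-laugh pricing with a cap: 0.30 EUR per laugh, capped at
# 24 EUR (about 80 laughs); laughs beyond the cap are free.

PRICE_PER_LAUGH = 0.30  # EUR
CAP = 24.00             # EUR

def ticket_price(laugh_count: int) -> float:
    """Total charge in EUR for a given number of detected laughs."""
    return round(min(laugh_count * PRICE_PER_LAUGH, CAP), 2)

print(ticket_price(10))   # 3.0 EUR
print(ticket_price(100))  # 24.0 EUR, capped
```

The cap effectively turns a heavy laugher’s bill into a flat ticket price, which is presumably the point: nobody pays more than a conventional admission would cost.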

There’s also a social element: at the end of the show, patrons can check their laughter account and share their tally on social networks. The comedy club, in conjunction with its advertising partner, even created a mobile app to be used as a system of payment.

While law enforcement has been developing and using facial recognition technology for quite some time, more industries are beginning to experiment with it.

Some retailers, for example, are considering using the technology to gauge how people might feel while shopping in a certain section of a store.

The U.K. company NEC IT Solutions is even working on technology that would help retailers identify V.I.P. patrons, such as celebrities or preferred customers.

According to a recent report on EssentialRetail.com, the premium department store Harrods has been testing facial recognition for the last two years, although primarily for security reasons.

Facebook also uses facial recognition technology to suggest tags of people who are in images posted on its site.
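Identification systems like NEC’s and Facebook’s typically compare a face embedding computed from a photo against a gallery of known embeddings. A toy nearest-neighbour sketch; the vectors, names and threshold below are purely illustrative assumptions, and real systems use high-dimensional embeddings from a trained face-recognition model:

```python
import numpy as np

# Toy nearest-neighbour face matching by cosine similarity.
# Gallery entries map a known identity to its (assumed) embedding.
GALLERY = {
    "vip_customer": np.array([0.9, 0.1, 0.2, 0.4]),
    "staff_member": np.array([0.1, 0.8, 0.5, 0.2]),
}
THRESHOLD = 0.9  # minimum similarity to declare a match

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify(probe):
    """Return the best-matching gallery name, or None if no match is close enough."""
    name, score = max(((n, cosine(probe, e)) for n, e in GALLERY.items()),
                      key=lambda t: t[1])
    return name if score >= THRESHOLD else None

probe = np.array([0.88, 0.12, 0.22, 0.38])  # close to "vip_customer"
print(identify(probe))
```

The threshold is the privacy-relevant knob: set it low and the system confidently “recognises” strangers; set it high and V.I.P.s walk past unnoticed.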

http://www.cnbc.com/id/102078398

Stealth Wear fashion to shield people from drones and face-recognition software

As debate over the use of unmanned aerial vehicles in the U.S. rages on, a fashion designer introduces clothing that blocks drone-mounted infrared cameras.

As the U.S. government draws up plans to use surveillance drones in domestic airspace, opposition to what many consider an unwarranted and significant invasion of privacy is mounting across the country, from rural Virginia to techopolis Seattle. While officials debate anti-drone legislation at the federal, state and local levels, one man is fighting back with high-tech apparel.

A New York City privacy advocate-turned-urban-guerilla fashion designer is selling garments designed to make their wearers invisible to infrared surveillance cameras, particularly those on drones. And although Adam Harvey admits that his three-item Stealth Wear line of scarves and capes is more of a political statement than a money-making venture, the science behind the fashion is quite sound.

“Fighting drones is not my full-time job, but it could be,” says Harvey, an instructor of physical computing at Manhattan’s School of Visual Arts and the creator of the CV Dazzle project, which seeks to develop makeup and hairstyles that camouflage people from face-recognition cameras and software.

Harvey’s newest medium, metalized fabric, has been around for more than 20 years. It holds in body heat that would burn bright for infrared cameras—a characteristic that could prove attractive to those who do not want unmanned aerial vehicles spying on them.

Metalized fabric
Metal is very good at absorbing and scattering infrared light, says Cheng Sun, a Northwestern University assistant professor of mechanical engineering. In that sense there is nothing exotic in how metalized fabric works—it “would strongly attenuate the [infrared] light,” he says. The metal would dissipate heat to surroundings as well, making the wearer harder to pinpoint.

To date, the fabric has primarily been used in tape and gaskets to protect electronics and communications equipment from static electricity and electromagnetic interference, according to Larry Creasy, director of technology for metalized fabric-maker Laird Technologies, based in Saint Louis.

Here’s how metalizing works, at least at Laird: Woven fabric, commonly nylon or polyester, is coated with a special catalyst—a precious metal Creasy declined to specify—that helps copper bind to the fiber. Once dry, the fabric is submerged in a copper sulfate–plating bath and dried. A nickel sulfamate bath follows to help the finished fabric withstand the elements and abrasions. The result is a flexible, breathable fabric that can be cut with ordinary tools but that protects against electromagnetic interference and masks infrared radiation, Creasy says. The process adds weight to the original fabric. An untreated square yard of nylon weighs about 42.5 grams. Treated, the same patch weighs more than 70 grams.

The fashion
Harvey’s fabric is coated with copper, nickel and silver, a combination that gives his scarves, head-and-shoulders cloak and thigh-length “burqa” a silvery and “luxurious” feel. The material blocks cell signals, as well, adding an element of risk to tweeting, texting and other mobile activities, as the wearer must break cover to communicate.

Stealth Wear is sold only via a U.K. Web site. The burqa goes for about $2,300, the “hoodie” is $481 and the scarf is $565—luxury items, but so, too, is privacy today, Harvey says.

The impetus
The high cost and limited availability are significant drawbacks—Harvey says he’s only sold one Stealth Wear item online, a scarf. But the Federal Aviation Administration (FAA) predicts 10,000 commercial drones will ply domestic airspace by 2017—almost twice the size of the U.S. Air Force’s current fleet of unmanned aircraft. The number of drones flying in the U.S. today is hard to pin down because not every company and agency that gets FAA approval to fly a drone actually puts one in the air. In fact, 1,428 private-sector and government requests have been approved since 2007, according to the FAA. A Los Angeles Times report states that 327 of those permits are still active. Meanwhile, President Obama signed a law in February 2012 that gives the FAA until September 2015 to draw up rules that dictate how law enforcement, the military and other entities may use drones in U.S. airspace.

As of October 2012, 81 law agencies, universities, an Indian tribal agency and other entities had applied to the FAA to fly drones, according to documents released by the FAA to the Electronic Frontier Foundation following a Freedom of Information Act lawsuit. Government entities as diverse as the U.S. Department of State and Otter Tail County, Minn., are among them.

Discomfort rising
Although Harvey’s anti-drone fashions are not currently flying off the shelves, he could soon find himself leading a seller’s market if recent events are any metric:

•The Charlottesville, Va., city council has passed a watered-down ordinance that asks the federal and commonwealth governments not to use drone-derived information in court. Proponents had sought to make the city drone-free.

•Virginia, Minnesota, Oregon, Montana, Arizona and Idaho legislators are trying to regulate, or even prohibit, drones in their skies.

•Seattle Mayor Mike McGinn returned the city’s two surveillance drones after a hostile public reception.

•A bipartisan pair of U.S. Representatives has introduced legislation to limit information-gathering by government-operated drones as well as prohibit weapons on law-enforcement and privately owned unmanned aerial vehicles.

Drone advocates defend the use of the technology as a surveillance tool. “We clearly need to do a better job of educating people about the domestic use of drones,” says Ben Gielow, government relations manager for the Association for Unmanned Vehicle Systems International. Gielow says U.S. voters must decide the acceptability of data collection from all sources, adding, “Ultimately, an unmanned aircraft is no different than gathering data from the GPS on your phone or from satellites.”

GPS does not use infrared cameras, however, and satellites are not at the center of the current privacy debate brewing in Washington—factors that could make Harvey’s designs all the more fashionable.

http://www.scientificamerican.com/article.cfm?id=drone-proof-anti-infrared-apparel&page=2

The mannequin that spies on you

Mannequins in fashion boutiques are now being fitted with secret cameras to ‘spy’ on shoppers’ buying habits.

Benetton is among the High Street fashion chains to have deployed the dummies equipped with technology adapted from security systems used to identify criminals at airports.

From the outside, the $3,200 (£2,009) EyeSee dummy looks like any other mannequin, but behind its blank gaze it hides a camera feeding images into facial recognition software that logs the age, gender and race of shoppers.

This information is fed into a computer and is ‘aggregated’ to offer retailers using the system statistical and contextual information they can use to develop their marketing strategies.

Its makers boast: ‘From now on you can know how many people enter the store, record what time there is a greater influx of customers (and which type) and see if some areas risk being overcrowded.’

However, privacy campaigners have denounced the system as ‘creepy’ and said that such surveillance is an instance of profit trumping privacy.

The device is marketed by Italian mannequin maker Almax and has already spurred shops into adjusting window displays, floor layouts and promotions, Bloomberg reported.

With growth slowing in the luxury goods industry, the technology taps into retailers’ desperation to personalise their offers to reach increasingly picky customers.

Although video profiling of customers is not new, Almax claims its offering is better at providing data because it stands at eye level with customers, who are more likely to look directly at the mannequins.

The video surveillance mannequins have been on sale for almost a year, and are already being used in three European countries and in the U.S.

Almax claims information from the devices led one outlet to adjust window displays after they found that men shopping in the first two days of a sale spent more than women, while another introduced a children’s line after the dummy showed youngsters made up more than half its afternoon traffic.

A third retailer placed Chinese-speaking staff by a particular entrance after it found a third of visitors using that door after 4pm were Asian.

Almax chief executive Max Catanese refused to name which retailers were using the new technology, telling Bloomberg that confidentiality agreements meant he could not disclose the names of clients.

But he did reveal that five companies – among them leading fashion brands – are using ‘a few dozen’ of the mannequins, with orders for at least that many more.

Almax is now hoping to update the technology to allow the mannequins – and by extension the retailers who operate them – to listen in on what customers are saying about the clothes on display.

Mr Catanese told Bloomberg the company also plans to add screens next to the dummies to prompt passers-by about products that fit their profile, similar to the way online retailers use cookies to personalise web browsing.

Almax insists that its system does not invade the privacy of shoppers since the camera inside the mannequin is ‘blind’, meaning that it does not record the images of passers-by, instead merely collecting data about them.

In an emailed statement, Mr Catanese told MailOnline: ‘Let’s say I pass in front of the mannequin. Nobody will know that “Max Catanese” passed in front of it.

‘The retailer will have the information that a male adult Caucasian passed in front of the mannequin at 6:25pm and spent 3 minutes in front of it. No sensible/private data, nor image is collected.

‘Different is the case if a place (shop, department store, etc.) is already covered by security cameras (by the way, basically almost every retailer in the world today).

‘In those cases we could even provide the regular camera as the data and customers images are already collected in the store which are authorised to do so.

‘In any case, just to avoid questions, so far we only offer the version with blind camera.’
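The kind of anonymised record Catanese describes (a demographic label, a timestamp and a dwell time, but no image) could be modelled and aggregated along these lines; the field names and categories below are illustrative assumptions, not Almax’s actual schema:

```python
from collections import Counter
from dataclasses import dataclass

# Assumed shape of one anonymised "blind camera" sighting:
# no image is stored, only coarse attributes and timing.
@dataclass
class Sighting:
    hour: int            # hour of day, 0-23
    gender: str          # e.g. "male" / "female"
    age_band: str        # e.g. "adult" / "child"
    ethnicity: str
    dwell_seconds: int   # time spent in front of the mannequin

def traffic_by_hour(sightings):
    """Count sightings per (hour, gender, age_band) bucket for retailers."""
    return Counter((s.hour, s.gender, s.age_band) for s in sightings)

log = [
    Sighting(18, "male", "adult", "caucasian", 180),
    Sighting(18, "female", "adult", "asian", 45),
    Sighting(18, "male", "adult", "caucasian", 60),
]
print(traffic_by_hour(log)[(18, "male", "adult")])  # 2
```

Aggregates like these are what drive the window-display and staffing decisions described above, and they are also why campaigners object: the individual records may be anonymous, but the profiling is not.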

Nevertheless, privacy groups are concerned about the roll-out of the technology. Emma Carr, deputy director of civil liberties campaign group Big Brother Watch, said: ‘Keeping cameras hidden in a mannequin is nothing short of creepy.

‘The use of covert surveillance technology by shops, in order to provide a personalised service, seems totally disproportionate.

‘The fact that the cameras are hidden suggests that shops are fully aware that many customers would object to this kind of monitoring.

‘It is not only essential that customers are fully informed that they are being watched, but that they also have real choice of service and on what terms it is offered.

‘Without this transparency, shops cannot be completely sure that their customers even want this level of personalised service.

‘This is another example of how the public are increasingly being monitored by retailers without ever being asked for their permission. Profit trumps privacy yet again.’

Read more: http://www.dailymail.co.uk/sciencetech/article-2235848/The-creepy-mannequin-stares-Fashion-retailers-adapt-airport-security-technology-profile-customers.html#ixzz2CsSISqiB

Thanks to Kebmodee for bringing this to the attention of the It’s Interesting community.

Project Glass

 

Google says, “We think technology should work for you—to be there when you need it and get out of your way when you don’t. A group of us from Google[x] started Project Glass to build this kind of technology, one that helps you explore and share your world, putting you back in the moment. We’re sharing this information now because we want to start a conversation and learn from your valuable input. So we took a few design photos to show what this technology could look like and created a video to demonstrate what it might enable you to do.”

https://plus.google.com/u/0/111626127367496192147/posts#111626127367496192147/posts

 

Facial Recognition Technology Now Readily Accessible

Imagine being able to sit down in a bar, snap a few photos of people and quickly learn who they are, who their friends are, where they live, what kind of music they like … even predict their Social Security number. 

Now, imagine you could visit one of those anonymous online dating sites and quickly identify nearly every person there, just from their photos, despite efforts to keep their online romance search a secret.

Such technology is so creepy that Google, after developing it, withheld it from the world: the one initiative that Google deemed too dangerous to release, according to former CEO Eric Schmidt.

Too late, says Carnegie Mellon University researcher Alessandro Acquisti. 

“That genie is already out of the bottle,” he said Thursday, shortly before a presentation at the annual Las Vegas Black Hat hackers’ convention that’s sure to trouble online daters, bar hoppers and anyone who ever walks down the street.

Using off-the-shelf facial recognition software and simple Internet data mining techniques, Acquisti says he’s proven that most people can now be identified simply through a photograph of their face — and anyone can do the sleuthing. In other words, our faces have become our identities, and there is little hope of remaining anonymous in a world where billions of photographs are taken and posted online every month.

http://redtape.msnbc.msn.com/_news/2011/08/04/7254996-your-face-and-the-web-can-tell-everything-about-you

Google Mobile Facial Recognition Application

 

Google has announced plans to introduce a mobile application that would allow users to snap pictures of people’s faces in order to access their personal information.

In order to be identified by the software, people would have to check a box agreeing to give Google permission to access their pictures and profile information.

Google has had the technical capabilities to implement this type of search engine for years, but has delayed its release due to concerns about how privacy advocates might receive the product.

http://www.cnn.com/2011/TECH/mobile/03/31/google.face/index.html?hpt=C2