Google and Indian government partner for toilet-finder tool

by Matt Hickman

World Toilet Day, an annual United Nations-sanctioned day of observance drawing attention to the 2.4 billion people around the world without access to clean and safe sanitation, dropped this past weekend in typically splashy fashion: a Coldplay and Jay-Z concert, the unveiling of a Gates Foundation poop smell-blocking perfume and enough well-meaning potty puns to last well into the new year.

While World Toilet Day is global in scope, much of the awareness-raising, activism-inspiring action this year — aforementioned Coldplay and Jay-Z concert included — was centered around India, a country where an estimated 70 percent of households in both rural and urban areas don’t enjoy the luxury of having a functioning commode. For a majority of India’s 1.2 billion citizens, defecating and urinating in the open is the norm.

As in other developing nations, cellphones are far more prevalent than toilets in India. As backwards as this may seem to Westerners, it’s a reality for millions of Indian households. According to a 2012 census, 60 percent of Indian households surveyed have one or more mobile devices while only 36.4 percent of households have a toilet.

Given these statistics, a new partnership between Google and India’s Ministry of Urban Development (MoUD) seems like a match in clean sanitation heaven: the introduction of a Google Maps tool that points users in the direction of toilets that are clean, safe and open for public use. As reported by the International Business Times India, the toilet-finder tool is due to launch this month in Delhi, India’s second most populous city, before potentially becoming available in other major cities, although the timeline is unclear.

How the app works

Of course, the tool, dubbed Google Toilet Locator, won’t solve India’s underlying toilet shortage problem or reverse cultural attitudes regarding al fresco urination. However, it does help on-the-go Delhi residents more easily find somewhere to go if need be. While we’ve written about urban toilet-finder apps in the past, those have been more or less spurred by convenience (and excessive drinking). Google Toilet Locator, piloted in a city of 25 million where public toilets are few and far between, is driven more by necessity.

An unnamed official with the MoUD explains to the IBTimes India that the Google Toilet Locator will pull up all known public lavatories — sulabh shauchalays — across the National Capital Region along with harder-to-find loos located inside of shopping malls, gas stations, hospitals, etc. Listing both deluxe flush situations and standard no-frills squat options, the tool itself is integrated into Google Maps. Mobile users need only open the app and enter one of numerous keywords in English or Hindi — “toilet,” “restroom,” “lavatory,” “swachhata,” “shauchalay,” etc. — and they’ll be directed to the nearest option based on their location.

Just like a restaurant or retail establishment, Delhi residents — and visitors — can use Google Toilet Locator to rate and comment on specific public restrooms, either providing a glowing recommendation or warning others to stay away.

Explains an official with the MoUD: “The system being put in place relies heavily on crowdsourcing, with people’s feedback helping fuel it. Therefore, if a person finds that a toilet is not clean, he or she can give it a bad review or rating, the facility for which is available on Google Maps.”

Considering that many Delhi residents who will be potentially using the app don’t have a toilet of their own at home, knowing if a public restroom is clean — or even open — is all the more important. Foreign tourists aside, for a large majority of folks using Google Toilet Locator, there isn’t the option of “holding it until I get home.”

Google Toilet Locator is just one of many events and initiatives launched in conjunction with World Toilet Day, which as is tradition, boasts an annual theme. This year, in order to spotlight the oft-overlooked link between economic livelihoods and sanitation, the theme is “Toilets and Jobs.”

For most, the topic of toilets and jobs usually revolves around ill-timed toilet paper shortages, privacy peccadilloes, rude noises or knowing to avoid the men’s room for at least 15 minutes after Ron from accounting goes in. For others, the workplace — and perhaps home, as well — might completely lack a clean, safe bathroom option. Poor sanitation has a direct link to economic well-being — that is, things like absenteeism, exhaustion and decreased productivity rise when employees don’t have access to a toilet at work or at home. In addition to impacting performance, the illnesses associated with poor sanitation keep workers off the job, sometimes temporarily and sometimes for good.

As the World Toilet Day website stresses, providing women with adequate and private bathroom facilities is of particular importance in developing areas.

And because it just wouldn’t be World Toilet Day without a video featuring dancing animated poos, here’s this year’s offering, which in keeping with the jobs theme, also features a variety of hard-working, life-saving “professional” toilets.

http://www.theverge.com/2016/11/16/13651882/google-maps-toilet-locator-india

Google’s AI DeepMind to get smarter by taking on video game Starcraft II

by Jeremy Kahn

Google’s DeepMind AI unit, which earlier this year achieved a breakthrough in computer intelligence by creating software that beat the world’s best human player at the strategy game Go, is turning its attention to the sci-fi video game Starcraft II.

The company said it had reached a deal with Blizzard Entertainment Inc., the Irvine, California-based division of Activision Blizzard, which makes the Starcraft game series, to create an interface to let artificial intelligence researchers connect machine-learning software to the game.

London-based DeepMind, which Google purchased in 2014, has not said it has created software that can play Starcraft expertly — at least not yet. “We’re still a long way from being able to challenge a professional human player,” DeepMind research scientist Oriol Vinyals said in a blog post Friday. But the company’s announcement shows it’s looking seriously at Starcraft as a candidate for a breakthrough in machine intelligence.

Starcraft fascinates artificial intelligence researchers because it comes closer to simulating “the messiness of the real world” than games like chess or Go, Vinyals said. “An agent that can play Starcraft will need to demonstrate effective use of memory, an ability to plan over a long time and the capacity to adapt plans to new information,” he said, adding that techniques required to create a machine-learning system that mastered these skills in order to play Starcraft “could ultimately transfer to real-world tasks.”

Virtual Mining

In the game, which is played in real-time over the internet, players choose one of three character types, each of which has distinct strengths and weaknesses. Players must run an in-game economy, discovering and mining minerals and other commodities in order to conquer new territory. A successful player needs to remember large volumes of information about places they’ve scouted in the past, even when those places are not immediately observable on their screen.

The player’s view of what an opposing player is doing is limited — unlike chess or Go, where opponents can observe the whole board at one time. Furthermore, unlike in a game where players take turns, a machine-learning system has to deal with an environment that is constantly in flux. Starcraft in particular also requires the ability both to plan a long-term strategy and to make very quick tactical decisions to stay ahead of an opponent — and designing software that is good at both types of decision-making is difficult.

Facebook, Microsoft

Researchers at Facebook Inc. and Microsoft Corp. have also published papers on ways to interface artificial intelligence systems with earlier versions of Starcraft. And some Starcraft-playing bots have already been created, but so far these systems have not been able to defeat talented human players.

Microsoft Chief Executive Officer Satya Nadella has taken swipes at Google’s focus on games in its AI research, telling the audience at a company event in Atlanta in September that Microsoft was “not pursuing AI to beat humans at games” and that Microsoft wanted to build AI “to solve the most pressing problems of our society and economy.”

Games have long served as important tests and milestones for artificial intelligence research. In 1997, International Business Machines Corp.’s supercomputer Deep Blue defeated world chess champion Garry Kasparov in a six-game match. IBM’s Watson artificial intelligence beat top human players in the game show Jeopardy in 2011, an achievement that showcased IBM’s strides in natural language processing. In 2015, DeepMind developed machine-learning software that taught itself how to play dozens of retro Atari games, such as Breakout, as well as or better than a human. Then, in March 2016, DeepMind’s AlphaGo program, trained in a different way, defeated Go world champion Lee Sedol.

In the nearly two decades since Starcraft debuted, the game has acquired a massive and devoted following. More than 9.5 million copies of the original game were sold within the first decade of its release, with more than half of those being sold in Korea, where the game was especially popular. Starcraft II shattered sales records for a strategy game when it was released in 2010, selling 1.5 million copies within 48 hours. Pitting two players against one another in real-time, Starcraft was a pioneer in professional video game competitions and remains an important game in the world of e-sports, although its prominence has since been eclipsed by other games.

http://www.detroitnews.com/story/business/2016/11/05/deepmind-master-go-takes-video-game-starcraft/93370028/

Google voice search records and keeps conversations people have around their phones – but the files can be deleted


Just talking is enough to activate the recordings – but thankfully there’s an easy way of hearing and deleting them.

by Andrew Griffin

Google could have a record of everything you have said around it for years, and you can listen to it yourself.

The company quietly records many of the conversations that people have around its products.

The feature works as a way of letting people search with their voice, and storing those recordings presumably lets Google improve its language recognition tools as well as the results that it gives to people.

But it also comes with an easy way of listening to and deleting all of the information that it collects. That’s done through a special page that brings together the information that Google has on you.

It’s found by heading to Google’s history page (https://history.google.com/history/audio) and looking at the long list of recordings. The company has a specific audio page and another for activity on the web, which will show you everywhere Google has a record of you being on the internet.

The new portal was introduced in June 2015 and so has been active for the last year – meaning that it is now probably full of various things you have said that you thought were private.

The recordings can function as a kind of diary, reminding you of the various places and situations that you and your phone have been in. But it’s also a reminder of just how much information is collected about you, and how intimate that information can be.

You’ll see more if you have an Android phone, whose voice search can be activated at any time just by saying “OK, Google”. But you may well also have recordings there from whatever devices you’ve used to interact with Google.

On the page, you can listen through all of the recordings. You can also see information about how the sound was recorded – whether it was through the Google app or elsewhere – as well as any transcription of what was said if Google has turned it into text successfully.

But perhaps the most useful – and least cringe-inducing – reason to visit the page is to delete everything from there, should you so wish. That can be done either by selecting specific recordings or deleting everything in one go.

To delete particular files, you can click the check box on the left and then move back to the top of the page and select “delete”. To get rid of everything, you can press the “More” button, select “Delete options” and then “Advanced” and click through.

The easiest way to stop Google recording everything is to turn off the virtual assistant and never to use voice search. But that solution also gets at the central problem of much privacy and data use today – doing so cuts off one of the most useful things about having an Android phone or using Google search.

http://www.independent.co.uk/life-style/gadgets-and-tech/news/google-voice-search-records-and-stores-conversation-people-have-around-their-phones-but-files-can-be-a7059376.html

Google invents cyborg lenses for our eyes

by David Goldman

Google has patented a new technology that would let the company inject a computerized lens directly into your eyeball.

The company has been developing smart glasses and even smart contact lenses for years. But Google’s newest patented technology would go even further — and deeper.

In its patent application, which the U.S. Patent and Trademark Office approved last week, Google says it could remove the lens of your eye, inject fluid into your empty lens capsule and then place an electronic lens in the fluid.

Once equipped with your cyborg lenses, you would never need glasses or contacts again. In fact, you might not even need a telescope or a microscope again. And who needs a camera when your eyes can capture photos and videos?

The artificial, computerized lenses could automatically adjust to help you see objects at a distance or very close by. The lenses could be powered by the movement of your eyeball, and they could even connect to a nearby wireless device.

Google says that its patented lenses could be used to cure presbyopia, an age-related condition in which people’s eyes stiffen and their ability to focus is diminished or lost. It could also correct common eye problems, such as myopia, hyperopia and astigmatism.

Today, we cure blurry vision with eyeglasses or contact lenses. But sometimes vision is not correctable.

And there are clear advantages to being a cyborg with mechanical eyes.

Yet Google (GOOGL, Tech30) noted that privacy could become a concern. If your computerized eyes are transmitting data all the time, that signal could allow law enforcement or hackers to identify you or track your movements. Google said that it could make the mechanical lenses strip out personally identifying information so that your information stays secure.

Before you sign up for cyborg eyes, it’s important to note that Google and many other tech companies patent technologies all the time. Many of those patented items don’t end up getting made into actual products. So it’s unclear whether Google will ever actually implant computers into your eyes.

http://money.cnn.com/2016/05/04/technology/google-lenses/index.html

Google Unveils Neural Network with “Superhuman” Ability to Determine the Location of Almost Any Image

Here’s a tricky task. Pick a photograph from the Web at random. Now try to work out where it was taken using only the image itself. If the image shows a famous building or landmark, such as the Eiffel Tower or Niagara Falls, the task is straightforward. But the job becomes significantly harder when the image lacks specific location cues or is taken indoors or shows a pet or food or some other detail.

Nevertheless, humans are surprisingly good at this task. To help, they bring to bear all kinds of knowledge about the world such as the type and language of signs on display, the types of vegetation, architectural styles, the direction of traffic, and so on. Humans spend a lifetime picking up these kinds of geolocation cues.

So it’s easy to think that machines would struggle with this task. And indeed, they have.

Today, that changes thanks to the work of Tobias Weyand, a computer vision specialist at Google, and a couple of pals. These guys have trained a deep-learning machine to work out the location of almost any photo using only the pixels it contains.

Their new machine significantly outperforms humans and can even use a clever trick to determine the location of indoor images and pictures of specific things such as pets, food, and so on that have no location cues.

Their approach is straightforward, at least in the world of machine learning. Weyand and co begin by dividing the world into a grid consisting of over 26,000 cells of varying sizes, depending on the number of images taken in each location.

So big cities, which are the subjects of many images, have a more fine-grained grid structure than more remote regions where photographs are less common. Indeed, the Google team ignored areas like oceans and the polar regions, where few photographs have been taken.
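Purely as an illustration of this adaptive idea — not the authors’ actual partitioning scheme, and with made-up thresholds and coordinates — here is a toy quadtree-style sketch that subdivides photo-dense cells into finer ones and drops empty regions entirely:

```python
# Toy sketch: adaptive spatial partitioning that splits dense cells.
# Hypothetical thresholds; an illustration, not PlaNet's real scheme.

def partition(photos, bounds, max_per_cell=2, min_size=1.0):
    """Recursively split a (lat, lon) bounding box until no cell holds
    more than max_per_cell photos (or cells reach min_size degrees).
    bounds = (lat_min, lat_max, lon_min, lon_max)."""
    lat0, lat1, lon0, lon1 = bounds
    inside = [(la, lo) for la, lo in photos
              if lat0 <= la < lat1 and lon0 <= lo < lon1]
    if len(inside) <= max_per_cell or (lat1 - lat0) <= min_size:
        return [bounds] if inside else []  # drop empty cells (oceans, poles)
    lat_mid, lon_mid = (lat0 + lat1) / 2, (lon0 + lon1) / 2
    cells = []
    for sub in [(lat0, lat_mid, lon0, lon_mid),
                (lat0, lat_mid, lon_mid, lon1),
                (lat_mid, lat1, lon0, lon_mid),
                (lat_mid, lat1, lon_mid, lon1)]:
        cells.extend(partition(inside, sub, max_per_cell, min_size))
    return cells

# A dense cluster (a "city") plus one remote photo: the cluster ends up
# in a tiny cell, the remote photo in a huge one, empty areas in none.
photos = [(10.1, 10.1), (10.2, 10.2), (10.3, 10.1), (10.1, 10.3),
          (-40.0, 100.0)]
cells = partition(photos, (-90.0, 90.0, -180.0, 180.0))
print(len(cells), "cells")
```

The output grid then becomes the label set for a classifier: each photo is assigned the cell it falls in, and the network predicts a cell rather than raw coordinates.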

Next, the team created a database of geolocated images from the Web and used the location data to determine the grid square in which each image was taken. This data set is huge, consisting of 126 million images along with their accompanying Exif location data.

Weyand and co used 91 million of these images to teach a powerful neural network to work out the grid location using only the image itself. Their idea is to input an image into this neural net and get as the output a particular grid location or a set of likely candidates.

They then validated the neural network using the remaining 34 million images in the data set. Finally they tested the network—which they call PlaNet—in a number of different ways to see how well it works.

The results make for interesting reading. To measure the accuracy of their machine, they fed it 2.3 million geotagged images from Flickr to see whether it could correctly determine their location.

“PlaNet is able to localize 3.6 percent of the images at street-level accuracy and 10.1 percent at city-level accuracy,” say Weyand and co. What’s more, the machine determines the country of origin in a further 28.4 percent of the photos and the continent in 48.0 percent of them.

That’s pretty good. But to show just how good, Weyand and co put PlaNet through its paces in a test against 10 well-traveled humans. For the test, they used an online game that presents a player with a random view taken from Google Street View and asks him or her to pinpoint its location on a map of the world.

Anyone can play at http://www.geoguessr.com. Give it a try—it’s a lot of fun and more tricky than it sounds.

Needless to say, PlaNet trounced the humans. “In total, PlaNet won 28 of the 50 rounds with a median localization error of 1131.7 km, while the median human localization error was 2320.75 km,” say Weyand and co. “[This] small-scale experiment shows that PlaNet reaches superhuman performance at the task of geolocating Street View scenes.”
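The localization errors quoted above are great-circle distances between the guessed and true coordinates. A standard haversine computation — a generic sketch, not the authors’ code — recovers kilometres from two lat/lon pairs:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two (lat, lon) points,
    using the haversine formula and a mean Earth radius of 6371 km."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# Eiffel Tower vs. Niagara Falls -- roughly 6,000 km apart.
print(round(haversine_km(48.8584, 2.2945, 43.0896, -79.0849)), "km")
```

Taking the median of such distances over all test rounds gives the single-number scores reported for PlaNet and the human players.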

An interesting question is how PlaNet performs so well without being able to use the cues that humans rely on, such as vegetation, architectural style, and so on. But Weyand and co say they know why: “We think PlaNet has an advantage over humans because it has seen many more places than any human can ever visit and has learned subtle cues of different scenes that are even hard for a well-traveled human to distinguish.”

They go further and use the machine to locate images that do not have location cues, such as those taken indoors or of specific items. This is possible when images are part of albums that have all been taken at the same place. The machine simply looks through other images in the album to work out where they were taken and assumes the more specific image was taken in the same place.

That’s impressive work that shows deep neural nets flexing their muscles once again. Perhaps more impressive still is that the model uses a relatively small amount of memory unlike other approaches that use gigabytes of the stuff. “Our model uses only 377 MB, which even fits into the memory of a smartphone,” say Weyand and co.

Ref: arxiv.org/abs/1602.05314 : PlaNet—Photo Geolocation with Convolutional Neural Networks

Thanks to Kebmodee for bringing this to the attention of the It’s Interesting community.

Man buys ‘google.com’ domain for 12 dollars and owns it for one minute before Google cancels transaction

An administrative oversight allowed US student Sanmay Ved to buy the right to control the domain on 29 September.

The oversight left him in charge of Google.com for about a minute until Google caught on and cancelled the transaction.

Now Mr Ved has been given a cash reward for spotting the error, which he has decided to donate to charity.

Google declined to comment on the story.

Mr Ved detailed his experience in a post on the LinkedIn site saying that he had been keeping an eye on Google-related web domains for some time because he used to work at the search giant. Mr Ved is currently an MBA student at a US college.

In the early hours of 29 September he noticed a for sale sign next to the Google.com name while browsing sites on Google’s own website-buying service.

He used a credit card to pay the $12 fee to grab google.com and got emails confirming he was the owner. Almost immediately he started getting messages intended for Google’s own web administration team.

This was followed by a cancellation message sent by the website buying service which said he could not take over Google.com because someone else had already registered it and his $12 payment was refunded.

Now it has emerged that Mr Ved has been given a “bug bounty” by Google’s security team for revealing the weakness in the domain buying system. The internal emails Mr Ved received while in charge of google.com have been passed to this team.

Mr Ved decided to give the cash to an Indian educational foundation and in response, Google doubled the reward.

http://www.bbc.com/news/technology-34504319

Google’s new app blunders by calling black people ‘gorillas’


Google’s new image-recognition program misfired badly this week by identifying two black people as gorillas, delivering a mortifying reminder that even the most intelligent machines still have a lot to learn about human sensitivity.

The blunder surfaced in a smartphone screen shot posted online Sunday by a New York man on his Twitter account, @jackyalcine. The images showed the recently released Google Photos app had sorted a picture of two black people into a category labeled as “gorillas.”

The accountholder used a profanity while expressing his dismay about the app likening his friend to an ape, a comparison widely regarded as a racial slur when applied to a black person.

“We’re appalled and genuinely sorry that this happened,” Google spokeswoman Katie Watson said. “We are taking immediate action to prevent this type of result from appearing.”

A tweet to @jackyalcine requesting an interview hadn’t received a response several hours after it was sent Thursday.

Despite Google’s apology, the gaffe threatens to cast the Internet company in an unflattering light at a time when it and its Silicon Valley peers have already been fending off accusations of discriminatory hiring practices. Those perceptions have been fed by the composition of most technology companies’ workforces, which mostly consist of whites and Asians with a paltry few blacks and Hispanics sprinkled in.

The mix-up also surfaced amid rising U.S. racial tensions that have been fueled by recent police killings of blacks and last month’s murder of nine black churchgoers in Charleston, South Carolina.

Google’s error underscores the pitfalls of relying on machines to handle tedious tasks that people have typically handled in the past. In this case, the Google Photo app released in late May uses recognition software to analyze images in pictures to sort them into a variety of categories, including places, names, activities and animals.

When the app came out, Google executives warned it probably wouldn’t get everything right — a point that has now been hammered home. Besides mistaking humans for gorillas, the app also has been mocked for labeling some people as seals and some dogs as horses.

“There is still clearly a lot of work to do with automatic image labeling,” Watson conceded.

Some commentators in social media, though, wondered if the flaws in Google’s automatic-recognition software may have stemmed from its reliance on white and Asian engineers who might not be sensitive to labels that would offend black people. About 94 percent of Google’s technology workers are white or Asian and just 1 percent are black, according to the company’s latest diversity disclosures.

Google isn’t the only company still trying to work out the bugs in its image-recognition technology.

Shortly after Yahoo’s Flickr introduced an automated service for tagging photos in May, it fielded complaints about identifying black people as “apes” and “animals.” Flickr also mistakenly identified a Nazi concentration camp as a “jungle gym.”

Google reacted swiftly to the mess created by its machines, long before the media began writing about it.

Less than two hours after @jackyalcine posted his outrage over the gorilla label, one of Google’s top engineers had posted a response seeking access to his account to determine what went wrong. Yonatan Zunger, chief architect of Google’s social products, later tweeted: “Sheesh. High on my list of bugs you never want to see happen. Shudder.”

http://bigstory.ap.org/urn:publicid:ap.org:b31f3b75b35a4797bb5db3a987a62eb2

An Android robot is urinating on an Apple logo in Google Maps

In the outskirts of Rawalpindi, a Pakistani city less than 10 miles southwest of Islamabad, is what appears to be a park in the shape of an Android robot peeing on an Apple logo.

At least, that’s what shows up if you look up Rawalpindi in Google Maps.

The park is not actually there — it’s an illustration.

It’s not clear how long the image has been there. When you look at “satellite view,” you’ll see a few residential roads, a bit of green space and some hills — nothing that looks remotely like an Android peeing on an apple.

It was discovered Friday by Ahmad Babar, a former Samsung employee living in Lahore, Pakistan.

On Facebook (FB, Tech30), Babar posted that he came across the Android image while looking for a place in Rawalpindi.

Google said the image was not created by an employee. The company has a group of vetted contributors who add to the Maps tool in order to keep Google Maps up to date, and one of those contributors drew the image.

“The vast majority of users who edit our maps provide great contributions, such as mapping places that have never been mapped before,” said Caroline Matthews, a spokeswoman for Google. “We’re sorry for this inappropriate user-created content; we’re working to remove it quickly.”

Google (GOOGL, Tech30) is no stranger to so-called Easter eggs — hidden treasures in its products. Just try typing “tilt,” “do a barrel roll,” “recursion,” “anagram,” “once in a blue moon,” or “answer to life the universe and everything” into Google’s search engine.

In Google Images, type “Atari breakout.” Google Translate has Pirate, Elmer Fudd, Klingon and Pig Latin options.

There are literally dozens of Google Maps Easter eggs in addition to the Android peeing on an Apple (AAPL, Tech30) logo, including a tie-dyed Street View character in Berkeley, California and a “royal carriage” transportation option when asking for directions to Windsor Castle.

A spokeswoman for Apple did not respond to requests for comment.

http://money.cnn.com/2015/04/24/technology/android-peeing-on-apple-google-maps/index.html?iid=TL_Popular

Woman who bared her breasts to Google Street View charged with disorderly conduct

Karen Davis was photographed on Google Street View flashing her breasts.
Police reported her for disorderly behaviour and she must report to court.
Police said her ‘actions were the same as someone flashing their genitals.’
SA country town mum hit back at critics saying they are insecure.
She plans to do a topless skydive for her 40th birthday next year.

A woman who notoriously flashed her K-cup breasts on Google Street View has been charged by police with disorderly behaviour.

Karen Davis, from Port Pirie in South Australia, was captured streaking by a camera car for the popular Google Maps app, which allows users to zoom in on certain streets and towns in cities all over the world with a 360-degree view.

Police released a statement alleging the 38-year-old mother ‘pursued’ the Google car to make sure she was captured exposing herself, and that it was an illegal act.

‘The woman’s actions were the same as someone flashing their genitals and the public expectation is that we take action,’ said Superintendent Scott Denny of Port Pirie police.

‘Recently in Port Pirie we arrested a man for exposing himself in public – this incident is no different,’ he said.

‘It is not appropriate for anyone to expose themselves in public places. Our community should be able to expect a bit of decency.’

Ms Davis will be summonsed to appear in the Port Pirie Magistrates Court at a date to be determined.

In the image, Ms Davis can be seen holding her arms up in the air with her T-shirt bunched up around her neck, baring her breasts, as she follows the Google camera car around the street.

Her sons are playing in the background and an unknown man stands at the fence watching.

Across the road, a neighbour is lounging on her outdoor furniture, watching the whole thing unfold.

The 38-year-old, who plans to skydive topless for her 40th birthday, has hit back at the controversy over her actions, claiming that ‘flat-tittie chicks’ are not confident enough with their own bodies and should focus on how they look.

Speaking to Daily Mail Australia, Ms Davis was in tears over the nasty comments coming from her community after she was branded a ‘bad mother’ and ‘pure filth’ for her raunchy behaviour.

‘They are narrow-minded people who are not happy with their own bodies,’ she said.

Posting on her Facebook account, Ms Davis addressed the fact that she pursued the car through Barry Street in Port Pirie until they got the perfect shot and believes locals are jealous of her antics.

‘Haters hate, you got the guts to do it?’ she posted on Facebook after the photo went public.

‘All the flat-tittie chicks think I am disgusting. Big-boob envy has hit Port Pirie.’

Taking to Facebook, disgusted commenters attacked Ms Davis’ parenting skills after it became clear that her two sons were in the background of the picture.

‘I’m sure your children will be proud of their mother that is probably going to cause them a lot of embarrassment,’ one Facebook commenter said.

‘Oh goodness. Can’t even begin to imagine how her children are feeling,’ another user said.

However, a select few came out in support of Ms Davis’s show on Google Maps.

‘Let her go, she’s having some fun, Pirie people need to lighten up a bit. if more lovely ladies would get them out more often the world would be a much happier place,’ one commenter said.

Ms Davis told Daily Mail Australia that she thought the act would be funny and that it was an item she has now ticked off her bucket list.

She also said that she has a friend in the United Kingdom and she thought it would brighten up his day if he saw the image online.

‘I have a friend in the UK. If he looks on there he will smile,’ she said.

Ms Davis wasn’t sure that the photo would make it on to Google Maps but she said she is delighted that it did.

‘I think maybe some need to start their own bucket list and leave mine alone,’ she said.

She also revealed that since the photo has been released she has attracted a whole host of new friend requests on Facebook.

Many young men have tried to befriend her but she has not accepted any of them.

Ms Davis said she has only learnt to embrace her size-K breasts in the last few years after spending her youth hiding them away.

‘I always got picked on and it wasn’t until late in my 20’s that I became confident in myself,’ she said.

She also revealed that she has to buy her bras online from the UK as they do not make size-K bras in Australia.

‘It would be nice if they made my size bra in Australia,’ she said.

Ms Davis said that she would do it all again, even considering the backlash the image has received.

‘It’s my life not theirs,’ she said.

‘When you point your finger at me, you have 4 pointing back at yourself.’

Some people online have suggested that she should be formally charged for her display but she has contacted the police who have confirmed that they have ‘no concerns’.

Read more: http://www.dailymail.co.uk/news/article-3020958/Fun-police-gone-far-Woman-exposed-size-k-boobs-Google-street-view-CHARGED-disorderly-behaviour.html

The eternity drive: Why DNA could be the future of data storage

By Peter Shadbolt, for CNN

How long will the data on your hard drive or USB stick last? Five years? Ten years? Longer?

Already a storage company called Backblaze is running 25,000 hard drives simultaneously to get to the bottom of the question. As each hard drive coughs its last, the company replaces it and logs its lifespan.

While this census has only been running five years, the statistics show a 22% attrition rate over four years.
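As a rough sketch, the Backblaze figure can be converted into an implied yearly failure rate. This is a back-of-the-envelope calculation assuming a constant annual rate, not a figure from the company itself:

```python
# A 22% cumulative attrition rate over four years implies roughly a
# 6% annualized failure rate, if failures occur at a constant yearly rate.

cumulative_attrition = 0.22   # fraction of drives dead after four years
years = 4

survival_4yr = 1 - cumulative_attrition          # 0.78 still alive
annual_survival = survival_4yr ** (1 / years)    # constant-rate assumption
annual_failure = 1 - annual_survival

print(f"Implied annual failure rate: {annual_failure:.1%}")
```

Real drive failures are not uniform over time (Backblaze's own data shows higher failure rates in a drive's first and later years), so this is only an average.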

Some may last longer than a decade, the company says, others may last little more than a year; but the short answer is that storage devices don’t last forever.

Science is now looking to nature, however, to find the best way to store data in a way that will make it last for millions of years.

Researchers at ETH Zurich, in Switzerland, believe the answer may lie in the data storage system that exists in every living cell: DNA.

So compact and complex are its strands that just 1 gram of DNA is theoretically capable of containing all the data of internet giants such as Google and Facebook, with room to spare.

In data storage terms, that gram would be capable of holding 455 exabytes, where one exabyte is equivalent to a billion gigabytes.
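To put that density claim in perspective, a quick sketch of the arithmetic, using the article's figures; the 10 TB hard drive used for comparison is an illustrative assumption, not something from the article:

```python
# Storage-density claim: 1 gram of DNA holds about 455 exabytes,
# where 1 exabyte = 1 billion gigabytes = 10**18 bytes.

EXABYTE = 10**9 * 10**9           # one billion gigabytes, in bytes
capacity_bytes = 455 * EXABYTE    # per gram of DNA, per the article

DRIVE_TB = 10                     # assumed size of one hard drive
drive_bytes = DRIVE_TB * 10**12
drives_equivalent = capacity_bytes / drive_bytes

print(f"{capacity_bytes:.2e} bytes per gram")
print(f"Equivalent to {drives_equivalent / 1e6:.1f} million 10 TB drives")
```

That is, a single gram would stand in for tens of millions of large hard drives.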

Fossilization has been known to preserve DNA in strands long enough to recover an animal’s entire genome — the complete set of genes present in a cell or organism.

So far, scientists have extracted and sequenced the genome of a 110,000-year-old polar bear and more recently a 700,000-year-old horse.

Robert Grass, a lecturer at ETH Zurich’s Department of Chemistry and Applied Biosciences, said the problem with DNA is that it degrades quickly. The project, he said, wanted to find ways of combining the large storage density of DNA with the stability of the DNA found in fossils.

“We have found elegant ways of making DNA very stable,” he told CNN. “So we wanted to combine these two stories — to get the high storage density of DNA and combine it with the archaeological aspects of DNA.”

The synthetic process of preserving DNA actually mimics processes found in nature.

As with fossils, keeping the DNA cool, dry and encased (in this case, in microscopic spheres of glass) could keep the information contained in its strands intact for thousands of years.

“The time limit with DNA in fossils is about 700,000 years but people speculate about finding one-million-year storage of genomic material in fossil bones,” he said.

“We were able to show that the decay of our DNA, and of the information stored in it, happens at the same rate as with fossil DNA, so we get to similar time frames of close to a million years.”

Fresh fossil discoveries are throwing up new surprises about the preservation of DNA.

Human bones discovered in the Sima de los Huesos cave network in Spain show maternally inherited “mitochondrial” DNA that is 400,000 years old, a new record for human remains.

The fact that the DNA survived in the relatively cool climate of a cave, rather than in a frozen environment as with the DNA extracted from mammoth remains in Siberia, has added to the mystery about DNA longevity.

“A lot of it is not really known,” Grass says. “What we’re trying to understand is how DNA decays and what the mechanisms are to get more insight into that.”

What is known is that water and oxygen are the enemies of DNA survival. DNA in a test tube and exposed to air will last little more than two to three years. Encasing it in glass (an inert, neutral agent) and cooling it increases its chances of survival.

Grass says sol-gel technology, which produces solid materials from small molecules, has made it a relatively easy process to get the glass around the DNA molecules.

While the team’s work invites immediate comparison with Jurassic Park, where DNA was extracted from amber fossils, Grass says that prehistoric insects encased in amber are a poor source of prehistoric DNA.

“The best DNA comes from sources that are ceramic and dry — so teeth, bones and even eggshells,” he said.

So far the team has tested its storage method by preserving two historic documents amounting to just 83 kilobytes of data.

“The first is the Swiss Federal Charter of 1291 — it’s like the Swiss Magna Carta — and the other was the Archimedes Palimpsest, a copy of an ancient Greek mathematics treatise made by a monk in the 10th century but overwritten by other monks in the 15th century.

“We wanted to preserve these documents to show not just that the method works, but that the method is important too,” he said.

He estimates that the information will be readable in 10,000 years’ time, and if frozen, as long as a million years.

Encoding just 83 kilobytes of data cost about $2,000, making it a relatively expensive process, but Grass is optimistic that the price will come down over time. Advances in technology for medical analysis, he said, are likely to help with this.
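Those two figures imply a per-kilobyte cost, worth sketching out; this is only an estimate derived from the article's numbers, not a published price:

```python
# Rough per-kilobyte cost implied by the pilot: 83 kilobytes
# encoded for about $2,000.

total_cost_usd = 2000
data_kb = 83

cost_per_kb = total_cost_usd / data_kb
print(f"~${cost_per_kb:.0f} per kilobyte")
```

At roughly $24 per kilobyte, DNA storage is many orders of magnitude more expensive per byte than hard drives, which is why Grass frames it as an archival technology whose cost still has to fall.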

“Already the prices for human genome sequences have dropped from several millions of dollars a few years ago to just hundreds of dollars now,” Grass said.

“It makes sense to integrate these advances in medical and genome analysis into the world of IT.”

http://www.cnn.com/2015/02/25/tech/make-create-innovate-fossil-dna-data-storage/index.html