
By Vanessa Bates Ramirez

A Norwegian container ship called the Yara Birkeland will be the world’s first electric, autonomous, zero-emissions ship.

With a capacity of up to 150 shipping containers, the battery-powered ship will be small by modern standards (the biggest container ship in the world holds 19,000 containers, and an average-size ship holds 3,500), but its launch will mark the beginning of a transformation of the global shipping industry, one that could heavily impact global trade as well as the environment.

The Yara Birkeland is being jointly developed by two Norwegian companies: Yara International, an agricultural firm, and Kongsberg Gruppen, which builds guidance systems for both civilian and military use.

The ship will be equipped with GPS and various sensors, including lidar, radar, and cameras, much like a self-driving car. It will be able to steer itself through the sea, avoid other ships, and dock independently.
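To illustrate the idea, here is a purely hypothetical sketch of how detections from those different sensor types might be fused into a single decision about whether something lies in the ship's path. The class, function, and threshold values below are invented for illustration; this is not Kongsberg's navigation software.

```python
# Hypothetical illustration of fusing lidar/radar/camera contacts into one
# "obstacle on course?" decision. Not Kongsberg's actual software.
from dataclasses import dataclass
from typing import List

@dataclass
class Detection:
    sensor: str         # "lidar", "radar", or "camera"
    bearing_deg: float  # direction of the contact relative to the bow
    range_m: float      # distance to the contact in meters

def obstacle_on_course(detections: List[Detection],
                       corridor_deg: float = 10.0,
                       safe_range_m: float = 500.0) -> bool:
    """Flag any contact sitting inside the ship's forward corridor."""
    return any(abs(d.bearing_deg) <= corridor_deg and d.range_m <= safe_range_m
               for d in detections)

# A radar contact nearly dead ahead at 300 m would trigger an avoidance maneuver.
print(obstacle_on_course([Detection("radar", 2.0, 300.0)]))  # True
```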

The Wall Street Journal states that building the ship will cost $25 million, about three times the cost of a similarly sized conventional ship. However, the savings will kick in once the ship starts operating, since it won't need traditional fuel or a large crew.

Self-driving cars aren’t going to suddenly hit the streets straight off their production line; they’ve been going through multiple types of road tests, refining their sensors, upgrading their software, and generally improving their functionality little by little. Similarly, the Yara Birkeland won’t take to the sea unmanned on its first voyage, nor any of its several first voyages, for that matter.

Rather, the ship’s autonomy will be phased in. At first, says the Journal, “a single container will be used as a manned bridge on board. Then the bridge will be moved to shore and become a remote-operation center. The ship will eventually run fully on its own, under supervision from shore, in 2020.”

Kongsberg CEO Geir Håøy compared the ship’s sea-to-land bridge transition to flying a drone from a command center, saying, “It will be GPS navigation and lots of high-tech cameras to see what’s going on around the ship.”

Interestingly, there’s currently no legislation around autonomous ships (which makes sense since, well, there aren’t any autonomous ships yet, either). Lawmakers are getting to work, though, and rules will likely be in place by the time the Yara Birkeland makes its first fully autonomous trip.

The ship will sail between three ports in southern Norway, delivering Yara International fertilizer from a production facility to a port called Larvik. The planned route is 37 nautical miles, and the ship will stay within 12 nautical miles of the coast.

The United Nations’ International Maritime Organization estimates over 90 percent of the world’s trade is carried by sea, and states that maritime transport is “by far the most cost-effective way to move en masse goods and raw materials around the world.”

But ships are also to blame for a huge amount of pollution; one study showed that just 15 of the world’s biggest ships may emit as much pollution as all the world’s cars, largely due to the much higher sulfur content of ship fuel. Oddly, shipping emission regulations weren’t included in the Paris Agreement.

Besides cutting emissions by running on batteries, the Yara Birkeland is expected to replace 40,000 truck trips a year through southern Norway. Once regulations are in place and the technology has been tested and improved, companies are expected to start building larger ships that can sail longer routes.

https://singularityhub.com/2017/07/30/the-worlds-first-autonomous-ship-will-set-sail-in-2018/?utm_source=Singularity+Hub+Newsletter&utm_campaign=23e95e4fd1-Hub_Daily_Newsletter&utm_medium=email&utm_term=0_f0cf60cdae-23e95e4fd1-58158129


By Vanessa Bates Ramirez

3D printing is being used to produce more and more novel items: tools, art, even rudimentary human organs. What all those items have in common, though, is that they’re small. The next phase of 3D printing is to move on to things that are big. Really big. Like, as big as a house.

In a small town in western Russia called Stupino, a 3D printed house just went up in the middle of winter and in a day’s time.

Pieces of houses and bridges have been 3D printed in warehouses or labs, then transported to their permanent locations to be assembled, but the Stupino house was printed entirely on-site by a company called Apis Cor. The company used a crane-sized mobile 3D printer and a specially developed mortar mix, and covered the whole operation with a heated tent.

The 38-square-meter (409-square-foot) house is circular, with three right-angled protrusions that add space and divide the interior. Counterintuitively for a country not known for mild, snow-free winters, the house’s roof is completely flat. Made of welded polymer membranes and insulated with solid plates, though, it was designed to withstand heavy snow loads.

Apis Cor teamed up with partners for the house’s finishing details, like insulation, windows, and paint. Samsung even provided high-tech appliances and a TV with a concave-curved screen to match the curve of the interior wall.

According to the company, the house’s total building cost came to $10,134, or approximately $275 per square meter, which equates to about $25 per square foot. A recent estimate put the average cost of building a 2,000 square foot home in the US at about $150 per square foot.
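As a quick check that those figures hang together (the numbers below are simply the ones quoted above; small differences come from rounding):

```python
# Rough cost-per-area check using only the figures quoted above.
total_cost_usd = 10_134
area_m2 = 38
area_ft2 = 409  # 38 square meters is roughly 409 square feet

print(round(total_cost_usd / area_m2))    # ~267 dollars per square meter
print(round(total_cost_usd / area_ft2))   # ~25 dollars per square foot

# For comparison, a conventional 2,000-square-foot US home at ~$150 per square foot:
print(2_000 * 150)                        # 300,000 dollars
```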

The homes of the future?

Since these houses are affordable and fast to build, is it only a matter of time before we’re all living in 3D printed concrete circles?

Probably not—or, at least, not until whole apartment buildings can be 3D printed. The Stupino house would be harder (though not impossible) to plop down in the middle of a city than in the Russian countryside.

While cities like Dubai are aiming to build more 3D printed houses, what many have envisioned for the homes of the future are environmentally friendly, data-integrated ‘smart buildings,’ often clad with solar panels and including floors designated for growing food.

Large-scale 3D printing does have some very practical applications, though. Take disaster relief: when a hurricane or earthquake destroys infrastructure and leaves thousands of people without shelter, 3D printers like Apis Cor’s could be used to quickly rebuild bridges, roads, and homes.

Also, given their low cost and high speed, 3D printed houses could become a practical option for subsidized housing projects.

In the US, tiny houses have been all the rage among millennials lately—what if that tiny house could be custom-printed to your specifications in less than a week, and it cost even less than you’d budgeted?

Since software and machines are doing most of the work, there’s less room for human error—gone are the days of “the subcontractor misread the blueprint, and now we have three closets and no bathrooms!”

While houses made by robots are good news for people looking to buy a basic, low-cost house, they could be bad news for people employed in the construction industry. Machines have been pouring concrete for decades, but technologies like Apis Cor’s giant printer will take a few more human workers out of the equation.

Nonetheless, the company states that part of their mission is “to change the construction industry so that millions of people will have an opportunity to improve their living conditions.”

https://singularityhub.com/2017/03/05/watch-this-house-get-3d-printed-in-24-hours/?utm_source=Singularity+Hub+Newsletter&utm_campaign=12834f7547-Hub_Daily_Newsletter&utm_medium=email&utm_term=0_f0cf60cdae-12834f7547-58158129

By Vanessa Bates Ramirez

In recent years, technology has been producing more and more novel ways to diagnose and treat illness.

Urine tests will soon be able to detect cancer: https://singularityhub.com/2016/10/14/detecting-cancer-early-with-nanosensors-and-a-urine-test/

Smartphone apps can diagnose STDs: https://singularityhub.com/2016/12/25/your-smartphones-next-big-trick-to-make-you-healthier-than-ever/

Chatbots can provide quality mental healthcare: https://singularityhub.com/2016/10/10/bridging-the-mental-healthcare-gap-with-artificial-intelligence/

Joining this list is a minimally-invasive technique that’s been getting increasing buzz across various sectors of healthcare: disease detection by voice analysis.

It’s basically what it sounds like: you talk, and a computer analyzes your voice and screens for illness. Most of the indicators that machine learning algorithms can pick up aren’t detectable to the human ear.

When we do hear irregularities in our own voices or those of others, the fact we’re noticing them at all means they’re extreme; elongating syllables, slurring, trembling, or using a tone that’s unusually flat or nasal could all be indicators of different health conditions. Even if we can hear them, though, unless someone says, “I’m having chest pain” or “I’m depressed,” we don’t know how to analyze or interpret these biomarkers.

Computers soon will, though.

Researchers from various medical centers, universities, and healthcare companies have collected voice recordings from hundreds of patients and fed them to machine learning software that compares the voices to those of healthy people, with the aim of establishing patterns clear enough to pinpoint vocal disease indicators.
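To make the approach concrete, here is a minimal, purely illustrative sketch of such a pipeline in Python, using the open-source librosa and scikit-learn libraries. The file names, labels, and feature choice are hypothetical; this is not the software used in any of the studies described here.

```python
# Illustrative voice-classification sketch: extract acoustic features from
# recordings, then train a simple classifier on patient vs. control labels.
import numpy as np
import librosa
from sklearn.linear_model import LogisticRegression

def voice_features(path):
    """Summarize a recording as its mean MFCCs, a common acoustic feature set."""
    audio, sr = librosa.load(path, sr=16_000)
    mfcc = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=13)
    return mfcc.mean(axis=1)  # one 13-dimensional vector per recording

# Hypothetical dataset: short clips labeled 1 (diagnosed) or 0 (healthy control).
paths = ["patient_001.wav", "control_001.wav"]  # ...and many more in practice
labels = [1, 0]

X = np.array([voice_features(p) for p in paths])
y = np.array(labels)

clf = LogisticRegression(max_iter=1000).fit(X, y)

# Screen a new (also hypothetical) recording for the disease indicator.
print(clf.predict(voice_features("new_recording.wav").reshape(1, -1)))
```

In the actual studies the feature sets are far richer and the models are validated on held-out patients, but the shape of the pipeline is the same: record, extract acoustic features, classify.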

In one particularly encouraging study, doctors from the Mayo Clinic worked with Israeli company Beyond Verbal to analyze voice recordings from 120 people who were scheduled for a coronary angiography. Participants used an app on their phones to record 30-second intervals of themselves reading a piece of text, describing a positive experience, then describing a negative experience. Doctors also took recordings from a control group of 25 patients who were either healthy or getting non-heart-related tests.

The doctors found 13 different voice characteristics associated with coronary artery disease. Most notably, the biggest differences between heart patients’ and non-heart patients’ voices occurred when they talked about a negative experience.

Heart disease isn’t the only illness for which voice diagnosis shows promise. Researchers are also making headway with the conditions below.

ADHD: German company Audioprofiling is using voice analysis to diagnose ADHD in children, achieving greater than 90 percent accuracy in identifying previously diagnosed kids based on their speech alone. The company’s founder gave speech rhythm as an example indicator for ADHD, saying children with the condition speak in syllables of less uniform length.
PTSD: With the goal of decreasing the suicide rate among military service members, Boston-based Cogito partnered with the Department of Veterans Affairs to use a voice analysis app to monitor service members’ moods. Researchers at Massachusetts General Hospital are also using the app as part of a two-year study to track the health of 1,000 patients with bipolar disorder and depression.
Brain injury: In June 2016, the US Army partnered with MIT’s Lincoln Lab to develop an algorithm that uses voice to diagnose mild traumatic brain injury. Brain injury biomarkers may include elongated syllables and vowel sounds or difficulty pronouncing phrases that require complex facial muscle movements.
Parkinson’s: Parkinson’s disease has no biomarkers and can only be diagnosed via a costly in-clinic analysis with a neurologist. The Parkinson’s Voice Initiative is changing that by analyzing 30-second voice recordings with machine learning software, achieving 98.6 percent accuracy in detecting whether or not a participant suffers from the disease.

Challenges remain before vocal disease diagnosis becomes truly viable and widespread. For starters, there are privacy concerns over the personal health data identifiable in voice samples. It’s also not yet clear how well algorithms developed for English speakers will perform with other languages.

Despite these hurdles, our voices appear to be on their way to becoming key players in our health.

https://singularityhub.com/2017/02/13/talking-to-a-computer-may-soon-be-enough-to-diagnose-illness/?utm_source=Singularity+Hub+Newsletter&utm_campaign=14105f9a16-Hub_Daily_Newsletter&utm_medium=email&utm_term=0_f0cf60cdae-14105f9a16-58158129

By Vanessa Bates Ramirez

Drivers on Colorado’s Interstate 25 may have gotten a good scare last Thursday, and it wasn’t a Halloween prank—glancing into the cab of an Otto 18-wheeler loaded with a beer delivery, they’d have been stunned to notice there was no one at the wheel.

In the first-ever commercial shipment completed using self-driving technology, the truck drove itself 120 miles from Fort Collins to Colorado Springs while its human driver sat in the sleeper cab. The driver did have control of the truck from departure until it got on the highway, and took over again when it was time to exit the highway.

Uber acquired Otto in August for $680 million. The company partnered with Anheuser-Busch for its first autonomous delivery, which consisted of 50,000 cans of beer—cargo many would consider highly valuable.

How the trucks work

Because of the relatively constant speed and less-dense surroundings, highway driving is much simpler for a driverless vehicle than city driving. There are no stop signs or pedestrians to worry about, and it’s not even necessary to change lanes if the delivery’s not on a tight schedule.

To switch from human driver to self-driving mode, all the driver had to do was press a button labeled “engage,” which kicked the truck’s $30,000 worth of retrofitted technology into action: three lidars mounted on the cab and trailer, a radar attached to the bumper, and a high-precision camera above the windshield.
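Conceptually, that handoff is a mode switch with preconditions. The sketch below is hypothetical (the state names and checks are invented for illustration, not Otto's actual software), but it captures the engage-on-highway, hand-back-at-exit flow described here.

```python
# Hypothetical sketch of a highway-only autonomy handoff. Not Otto's software.
from enum import Enum, auto

class DriveMode(Enum):
    MANUAL = auto()      # human drives: ramps, surface streets, exits
    AUTONOMOUS = auto()  # system drives: highway cruising only

class TruckController:
    def __init__(self):
        self.mode = DriveMode.MANUAL

    def press_engage(self, on_highway: bool, sensors_ok: bool) -> DriveMode:
        # The "engage" button only hands over control when preconditions hold.
        if on_highway and sensors_ok:
            self.mode = DriveMode.AUTONOMOUS
        return self.mode

    def approaching_exit(self) -> DriveMode:
        # Control returns to the human driver before leaving the highway.
        self.mode = DriveMode.MANUAL
        return self.mode

truck = TruckController()
print(truck.press_engage(on_highway=True, sensors_ok=True))  # DriveMode.AUTONOMOUS
print(truck.approaching_exit())                              # DriveMode.MANUAL
```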

The company made sure to plan the trip at a low-traffic time and on a day with clear weather, carefully studying the route to make sure there wouldn’t be any surprises the truck couldn’t handle along the way.

Why they’re disruptive

Though self-driving cars certainly get more hype than self-driving trucks do, self-driving trucks fill a more immediate need and could have an equally disruptive, if not larger, effect on the economy. Anheuser-Busch alone estimates it could save $50 million a year (and that’s just in the US) by deploying autonomous trucks across its distribution network.

Now extrapolate those savings over the entire trucking industry, extending the $50 million estimate to every company that delivers a similar volume of cargo throughout the US via trucks. The total easily leaps into the billions.
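The arithmetic behind that claim is simple. The sketch below uses the article's per-company figure and a made-up placeholder count of comparable shippers, purely to show the order of magnitude rather than an actual industry estimate.

```python
# Back-of-the-envelope extrapolation. The per-company savings comes from the
# article; the count of comparable shippers is a hypothetical placeholder.
savings_per_company_usd = 50_000_000   # Anheuser-Busch's own US estimate
comparable_shippers = 40               # hypothetical, for illustration only

total_usd = savings_per_company_usd * comparable_shippers
print(f"${total_usd / 1e9:.0f} billion per year")  # $2 billion per year
```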

But what about all those jobs?

This doesn’t mean these companies would fire all their drivers; the savings would come primarily from reduced fuel costs and more efficient delivery schedules.

As of September 2016, the trucking industry employed around 1.5 million people. Trucks move 70 percent of cargo in the US, and total freight tonnage is predicted to grow 35 percent over the next ten years.

That’s a lot of freight. And as it turns out, the industry is sorely lacking in drivers to move it. The American Trucking Associations estimates the current shortfall of drivers at 48,000. So rather than displacing jobs, autonomous trucking technology may actually help lift some of the burden off a tightly stretched workforce.

Rather than pulling over to sleep when they get tired, drivers could simply time their breaks to coincide with long stretches of highway, essentially napping on the job and saving valuable time, not to mention getting their deliveries to their destinations faster.

In an interview with Bloomberg, Otto president and co-founder Lior Ron assured viewers that trucking jobs aren’t going anywhere anytime soon: “The future is really those drivers becoming more of a copilot to the technology, doing all the driving on city streets manually, then taking off onto the highway, where the technology can help drive those long and very cumbersome miles… for the foreseeable future, there’s a driver in the cabin and the driver is now safer, making more money, and can finish the route faster.”

Besides taking a load off drivers, self-driving trucks will likely make the roads far safer. According to the Insurance Institute for Highway Safety, about one in ten highway deaths occurs in a crash involving a large truck, and over 3,600 people were killed in large truck crashes in 2014.

The biggest culprit? Human error.

It’s not a done deal just yet

Otto’s trucks are considered Level 4 autonomous vehicles, which means human drivers are unnecessary in reasonably controlled environments; on the highway, drivers can actually take a nap if they want to. In comparison, Tesla’s Autopilot system is considered Level 2, meaning it helps the driver by maintaining speed and avoiding obstacles, but the driver still needs to be engaged and paying close attention.
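For reference, the scale being invoked here is the SAE driving-automation scale, which runs from 0 to 5. The summary below is a paraphrase for orientation, not the official SAE wording.

```python
# Paraphrased summary of the SAE driving-automation levels referenced above
# (not the official SAE J3016 text).
sae_levels = {
    0: "No automation: the human does all of the driving",
    1: "Driver assistance: the system handles steering OR speed",
    2: "Partial automation: steering AND speed, but the driver must stay engaged",   # e.g., Tesla Autopilot
    3: "Conditional automation: the driver must take over when the system asks",
    4: "High automation: no driver needed within a defined domain, such as highways", # e.g., Otto's trucks
    5: "Full automation: no driver needed under any conditions",
}

for level, summary in sae_levels.items():
    print(level, summary)
```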

Besides the fact that the technology has a ways to go before being ready for large-scale deployment, barriers like regulation and plain old resistance to change could slow things down.

Drivers interviewed for a New York Times article were far from endorsing the co-pilot idea, due both to safety concerns and the degree to which self-driving technology would change the nature of their jobs.

If it were me, I know a whole lot of testing would have to be done before I’d be okay with falling asleep inside a vehicle moving at 60 miles an hour without a driver.

Once the technology has proven itself reliable, however, truckers may slowly come around to the idea of being able to drive 1,200 miles in the time it used to take to drive 800.

An Uber Self-Driving Truck Just Took Off With 50,000 Beers