Thursday 30 June 2016

Self-Driving Cars Are Coming, but Is That a Good Thing?

I have been excited ever since I heard that Google was experimenting with self-driving cars. This technology seems set to change our lives forever. But I have just learned that these changes are potentially fraught with unseen, shocking threats. Apparently we do have real cause for concern. Read on:
Self-Driving Cars
Google’s Self-Driving Car Project is paving the way for a future in which millions of cars drive passengers all around the world without any of us controlling the vehicles. We will simply input our destination, and the car will navigate its way there, using its sensors to detect objects in its path and adjusting its speed and direction accordingly. It’s estimated that some 10 million cars will be self-driving by 2020.
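To make that sense-and-adjust cycle concrete, here is a purely illustrative sketch in Python. None of these names come from Google’s actual software; every class and function is invented just to show the shape of the loop described above.

```python
# A toy sense-decide-act loop, NOT any manufacturer's real control code.
from dataclasses import dataclass

@dataclass
class SensorReading:
    obstacle_ahead: bool       # did the sensors detect an object in the path?
    distance_to_goal_m: float  # how far the car still has to travel

def drive_to(destination: str, readings: list[SensorReading]) -> None:
    """Crude cycle: brake for obstacles, otherwise resume cruising speed."""
    speed_kmh = 50.0
    for reading in readings:
        if reading.obstacle_ahead:
            speed_kmh = max(0.0, speed_kmh - 20.0)   # slow down for the obstacle
        else:
            speed_kmh = min(50.0, speed_kmh + 10.0)  # speed back up when clear
        print(f"{reading.distance_to_goal_m:6.0f} m to {destination}, "
              f"speed {speed_kmh:4.1f} km/h")
        if reading.distance_to_goal_m <= 0:
            print("Arrived.")
            break

# Simulated journey: clear road, then an obstacle, then arrival.
drive_to("Home", [
    SensorReading(False, 300),
    SensorReading(True, 150),
    SensorReading(False, 0),
])
```

A real system would fuse data from cameras, radar and lidar and plan steering as well as speed, but the basic idea is the same: sense, decide, act, repeat.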

What are the benefits?

The principal benefit humankind stands to gain from these technological wonders is safety. Human drivers make mistakes frequently enough to cause many unnecessary accidents, leading to horrific injuries and fatalities. Computers, however, can reach a pitch of almost infallible proficiency, without deadly tiredness, anger or drunkenness ever becoming a factor. Self-driving cars can also interact and share data with each other, making collisions less likely. Pedestrians and passengers alike can rejoice in a future of greater road safety.

What are the risks?

The great human inventions are a testament to mankind’s ingenuity, but they can also work to our detriment. Every time a task is made easier by technological discoveries, we gradually lose, through lack of practice, our ability to do things for ourselves. How many people can start a fire using just natural materials? How many can cook a family meal from scratch? Soon, we may be asking, ‘Who can drive?’ This will matter in situations where technology fails, suddenly placing an unpractised driver at the wheel, face to face with a responsibility that technology has never allowed them to prepare for. Our generation will be okay, but what about the next?
Like any computerized technology, things can go wrong with self-driving cars. Software can become corrupted, and hackers could take control of a vehicle, effectively kidnapping the ‘driver.’ Several cars could be manipulated remotely to cause deliberate traffic jams or serve other devious purposes. The possibilities for crooks and terrorists could be endless.
 
Ethical Dilemmas
Another problem with self-driving cars is that ethical dilemmas are taken from the hands of humans and given to machines to resolve, causing some uncomfortable questions to arise. In crisis situations on the road, drivers must make split-second decisions about their own lives and others’. A driver may face five people in the road whom he cannot avoid hitting without swerving onto a pedestrian on the pavement. Would a computer be programmed to make such utilitarian decisions, choosing the greater number over the lesser? Is it right for a computer to make these life-and-death decisions for us?

Would such a car prefer females over males, or children over the elderly? Would this mean the car accepts that death is sometimes necessary? These are all hard questions, and yet the programming work is already being done behind closed doors; we have no idea what instructions these cars contain. This should be of grave concern to all citizens.
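To see why this is so unsettling, consider how crudely such a ‘utilitarian’ rule could be written down. The sketch below is entirely hypothetical; no manufacturer has published anything like it, and the function is invented purely to illustrate the head-count logic the questions above are about.

```python
# A deliberately crude, hypothetical "utilitarian" crash rule.
# This is an illustration of the dilemma, not anyone's real code.

def choose_path(people_straight_ahead: int, people_if_swerving: int) -> str:
    """Pick whichever path endangers fewer people: a pure head-count rule."""
    if people_if_swerving < people_straight_ahead:
        return "swerve"
    return "stay in lane"

# Five people ahead, one pedestrian on the pavement: the head-count
# rule swerves, sacrificing the one to save the five.
print(choose_path(people_straight_ahead=5, people_if_swerving=1))  # swerve
```

A few lines of arithmetic end up deciding who lives, which is exactly the point: the moral weight is hidden inside whatever numbers and comparisons the programmers chose.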

In the near future, perhaps all our manifold devices – phones, watches and things as yet unheard of – will contain immeasurable data about our lives and our health. Will cars use this data to judge whether we should be sacrificed for a ‘superior’ person in a crisis situation? The possibility is alarmingly Orwellian. It’s enough to make me regret the advance of this particular piece of technology.

Watch this video, which makes the case against self-driving cars and argues that they are something for us to be afraid of. And remember that driving is a human skill to be cherished.

Fatal crash of Tesla Model S in autopilot mode leads to investigation by federal officials

James F. Peltz and Samantha Masunaga
Federal regulators opened a preliminary probe into the autopilot feature on a Tesla Model S electric car after a fatal crash involving the technology, Tesla said Thursday.
The fatality – thought to be the first in the auto industry related to an autopilot feature – sparked questions about the limitations of the technology and its place in what is seen as an inevitable march toward self-driving vehicles. It followed other recent incidents in which drivers reported collisions while using such technology.
The male driver died in a May 7 crash in Williston, Fla., when a big rig made a left turn in front of his Tesla.
In a blog post, Tesla Motors Inc. said the 2015 car passed under the trailer, with the bottom of the trailer hitting the Model S’ windshield.
“Neither autopilot nor the driver noticed the white side of the tractor-trailer against a brightly lit sky, so the brake was not applied,” Tesla said.
Tesla said it immediately reported the fatal crash to the National Highway Traffic Safety Administration. The automaker emphasized that its autopilot feature is still in a beta phase of introduction and has limitations.
“Autopilot is getting better all the time but it is not perfect and still requires the driver to remain alert,” the Palo Alto-based company said in the post.
Tesla noted that the autopilot technology comes disabled and requires “explicit acknowledgement” from the driver that the system is still in a public beta phase before it can be activated.
When autopilot is activated, “the car reminds the driver to ‘Always keep your hands on the wheel. Be prepared to take over at any time.’”
Analysts generally agreed that the Tesla fatality would be more a wake-up call to motorists that autopilot features are fallible than a major hit to Tesla’s brand reputation.
“We do not yet have fully autonomous cars,” said Karl Brauer, a senior analyst at Kelley Blue Book. “It might be this tragic event starts a kind of movement of educating consumers.”
NHTSA said it would evaluate “the design and performance of automated driving systems in the Tesla Model S” along with investigating the vehicle and the crash scene.
The agency said it also would “gather additional data regarding this incident and other information regarding the automated driving systems.”
Scott Galloway, founder of the brand research firm L2, said the momentum of self-driving technology likely will continue despite the tragedy.
“Unless statistics show this is not only as dangerous but more dangerous than traditional modes of driving, I don’t think you’re going to see a slowdown” in using the technology, he said.
George Peterson, president of the research firm AutoPacific Inc. in Tustin, said the failure of the Tesla Model S’ sensors to pick up the white truck on a bright day is similar to the trouble autonomous vehicles have in snow, when it can be difficult to detect the center line of the road or lane markers.
“Those are technology issues that the manufacturers are trying to sort out right now,” Peterson said. “They’re making a lot of progress really quickly, but it’s not a foolproof system, as this shows.”
He said adding beta options in cars such as Tesla’s autopilot feature is “pretty unusual” but that Tesla adds disclaimers before use. Peterson also said that he didn’t think the crash would deter Tesla customers.
“Tesla seems to be a Teflon company,” he said. “Even though Tesla doesn’t sell a tremendous amount of cars, their stock price is hugely high, and there’s a huge group of Tesla fans out there. I don’t think it’ll put them off – I think it’ll make them more cautious when they’re using autopilot.”
Tesla’s stock price, which closed at $212.28 on Thursday, fell 2.5% in after-hours trading.
The Model S is Tesla’s mainstay vehicle so far, and the sedan accounted for most of the 50,580 vehicles Tesla delivered last year.
Rob Enderle, president and principal analyst at technology strategy firm Enderle Group, said the fatality was “a reminder that cars on the road haven't been deployed as self-driving cars, and people shouldn't be driving them hands-off at freeway speeds on the open road.”
“Autopilot as it’s positioned is the next generation of cruise control where the driver is supposed to stay engaged, where you can't depend on this to drive the car for you,” he said. “This could've been easily driver error. This isn't a self-driving car.”
One motorist, Alex Roy, said he and two friends drove a Tesla Model S from Los Angeles to New York in 2015 and spent 96% of the drive on autopilot. Roy, a New York City editor at large at TheDrive, a car website, said the only time he felt unsafe was when he took his hands off the wheel for too long. He blamed himself for that.
“I found myself getting overconfident in the system,” said Roy, adding that Tesla makes it clear that the car cannot operate completely autonomously but that it’s tempting to let go of the wheel because the system generally works so well.
“I took my hands off the wheel [at one point] and thought the car was still on autopilot and it wasn’t, and I almost hit something,” he said.
The need to maintain control became clear to Aaron Souppouris when he test-drove a Model S in April. Souppouris, a senior editor at the blog Engadget, said Tesla loaned him the car for an article about the autopilot feature and he drove it about 500 miles around England.
There were times at night, he said, when the car wandered back and forth within a lane and seemed “skittish.”
Once while on autopilot, the car tried to change lanes but then suddenly reverted; another time it disengaged autopilot mode in the middle of a lane switch, Souppouris said. The car did better during the day than at night, he said.
Times staff writers Natalie Kitroeff and Paresh Dave contributed to this report.