03 April 2018

WOULD YOU TRUST YOUR LIFE TO A SELF-DRIVING CAR?

American Lawrence Sperry created the world's first successful aircraft autopilot in 1912, more than a century ago. Whenever we take a commercial flight today, we place our lives in the hands of a latter-day, computer-controlled autopilot linked to the aircraft's flight management system, which enables the aircraft to be flown on autopilot from take-off to landing. At any time, however, the pilot can intervene to alter the autopilot's settings, or immediately disengage it and fly the plane manually if the need arises.

Many trains today, especially urban transit trains such as those on the Klang Valley's Kelana Jaya Line, are driverless, being driven instead under computer control, much like lifts (or "elevators" to Americans) which automatically stop at the required floors.

In recent years there have been trials of self-driving cars on public roads, and there have already been some accidents. For example, on the night of Wednesday, 21 March 2018, Elaine Herzberg, 49, was hit head on by an Uber self-driving vehicle whilst pushing her bicycle across Mill Avenue near Curry Road in Tempe, Arizona, and died as a result.

Below is a screen cap from the video released by the Tempe Police, showing Elaine Herzberg pushing her bicycle moments before she was hit by the oncoming car.
(If you cannot see the embedded image, please enable "View Images" or "Load Images" in your e-mail client.)



And below is a link to the full video on YouTube.

CLICK TO VIEW VIDEO

Then there is the Bloomberg report "Tesla Driver Died Using Autopilot, With Hands Off Steering Wheel" of 31 March 2018, in which a Tesla Model X being driven on Autopilot collided with a highway barrier on 23 March 2018 and caught fire, resulting in the death of its driver, Wei Huang.
CLICK HERE TO READ

The above Bloomberg article referred to a blog post by the Tesla Team on 30 March 2018 which stated:-

"In the moments before the collision, which occurred at 9:27 a.m. on Friday, March 23rd, Autopilot was engaged with the adaptive cruise control follow-distance set to minimum. The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision. The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken."

"The reason this crash was so severe is because the crash attenuator, a highway safety barrier which is designed to reduce the impact into a concrete lane divider, had been crushed in a prior accident without being replaced. We have never seen this level of damage to a Model X in any other crash."

"Over a year ago, our first iteration of Autopilot was found by the U.S. government to reduce crash rates by as much as 40%. Internal data confirms that recent updates to Autopilot have improved system reliability."

"In the US, there is one automotive fatality every 86 million miles across all vehicles from all manufacturers. For Tesla, there is one fatality, including known pedestrian fatalities, every 320 million miles in vehicles equipped with Autopilot hardware. If you are driving a Tesla equipped with Autopilot hardware, you are 3.7 times less likely to be involved in a fatal accident."

"Tesla Autopilot does not prevent all accidents – such a standard would be impossible – but it makes them much less likely to occur. It unequivocally makes the world safer for the vehicle occupants, pedestrians and cyclists."
CLICK HERE TO READ
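For what it's worth, Tesla's "3.7 times less likely" figure is simply the ratio of the two quoted mileage-per-fatality numbers. A quick sketch in Python, using the figures as stated in Tesla's blog post (Tesla's own numbers, not independently verified):

```python
# Sanity-check the "3.7 times less likely" claim from Tesla's quoted
# fatality rates (figures are Tesla's own, as quoted above).
us_miles_per_fatality = 86_000_000      # all vehicles, all manufacturers
tesla_miles_per_fatality = 320_000_000  # Teslas with Autopilot hardware

# Miles driven per fatality is higher for Tesla; the ratio of the two
# is the factor Tesla cites.
ratio = tesla_miles_per_fatality / us_miles_per_fatality
print(f"Ratio of miles per fatality: {ratio:.1f}")  # ~3.7
```

Of course, the arithmetic checking out says nothing about whether the comparison itself is fair, since the two fleets differ in vehicle age, price bracket and where they are driven.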

Whilst I don't have any details at hand to refute or confirm Tesla's claim of fewer crashes and fatalities in cars driven using its Autopilot, and whilst some may blame Elaine Herzberg rather than the Uber car's self-driving system, since she crossed the road where there was no designated pedestrian crossing, the road is another environment altogether. Unlike with aircraft flown by autopilot or computer-controlled trains, road traffic, including both vehicles and pedestrians, is far less predictable and more erratic. This is all the more so in countries such as Malaysia, where cars and motorcycles suddenly cut across lanes, brake without warning, beat traffic lights, drive or ride on the wrong side of the road against regular traffic flow, and drivers don't respect the right of way of others. Pedestrians cross the road at undesignated crossings, as in the case of Elaine Herzberg. Then there is the recent "Mat Lajak" youth sub-culture in Malaysia, in which teenagers ride their bicycles in a herd in a dangerous manner, such as the eight teenagers in Johor Baru who died when they were hit by a car whilst riding their bicycles in the middle of a dark road at around 3 am on Saturday, 17 February 2018, a tragedy shown in the video below.

CLICK TO VIEW VIDEO - Car rams into cycling teens, eight dead

So I wonder how advanced car autopilot technology will have to become to cope adequately with such erratic road behaviour, especially in heavily congested traffic and on dark roads, considering that the road Elaine Herzberg crossed that fateful night had relatively little traffic at the time, and she was still run down by that self-driving car.

Proponents and defenders of self-driving cars may argue that the technology is still very new and needs to be improved and enhanced with more sensors and smarter control systems to cope with such erratic and unpredictable road traffic and road users. That is true, but unlike aircraft and trains, which are professionally managed and maintained subject to strict regulations, how sure can we be that all motorists will religiously maintain their vehicles to ensure all these additional advanced safety features are in proper working order, when many already don't do so with the existing, more basic features, whether due to financial constraints, a lackadaisical attitude or some other reason?

If many do not maintain their vehicles in tip-top working order now, can we expect that they will maintain all the additional safety sensors, on-board computers, GPS navigation and other systems on their self-driving vehicles in tip-top working order in the future?

It's easy for the marketing types of the ICT industry and for tech-futurists to imagine and preach how things should be, when the realities on the ground don't work that way.

The New York Times article below, which questions the claimed safety of self-driving cars, made me realise why I have always felt so uncomfortable with all these claims by information and communication technology marketers and tech-futurists that advancements in technology which replace or reduce the role of humans will somehow make the world some kind of paradise on earth and solve problems resulting from human habits and behaviour.

The few paragraphs below especially jibe with my own feelings about the many starry-eyed claims I have heard and read over the last two decades:-

“Technology does not eliminate error, but it changes the nature of errors that are made, and it introduces new kinds of errors,” said Chesley Sullenberger, the former US Airways pilot who landed a plane in the Hudson River in 2009 after its engines were struck by birds and who now sits on a Department of Transportation advisory committee on automation. “We have to realize that it’s not a panacea.”

"Experts who are skeptical about the unceasing forward march of technology say fatalities are rising because public officials have become so enamored with the shiny new thing, self-driving cars, that they have taken their eyes off problems they could be solving today. In the federal government and most states, there appears to be little interest in or patience for doing the tedious work of identifying and implementing policies and technologies with proven track records of saving lives now, as opposed to some time in the distant future."

"Silicon Valley technologists would argue that algorithms and machine learning will simply leapfrog what they might dismiss as the legacy problem of human fallibility. But Mr. Sullenberger, for one, is worried that the rush to develop automated cars will lead to many unforeseen problems. “Even though there is a sense of urgency to prevent human-caused accidents,” he told me, “we need to do it in a responsible way, not the fastest way.”

Unlike Sullenberger, who has real-world experience on the ground, the Silly Con Valley technologists and tech-futurists have their heads in the clouds, and it's claims like these which have irked me through over two decades of writing about the ICT industry, especially when I find little change, or even a worsening, of the reality on the ground compared to the rosy picture of a "glorious future" painted by the industry's marketers and futurists - whether it be about online education and computer-based learning versus traditional teacher-based learning, or about gee-whiz traffic information and data analysis systems solving traffic congestion on our roads without any real-world modifications or remedies to the causes of such congestion being implemented on the ground.

Basically, people with experience of helping to solve real-world problems on the ground find the claims of the Silly Con Valley technologists, marketers and tech-futurists rather pie in the sky, such as the creation of Bitcoin to circumvent financial problems caused by central banks such as the Federal Reserve, instead of dealing with the problems caused by these central banks head on.

Well, Bitcoin's price has been going sideways at around US$7,000 since 30 March 2018 and is showing no signs of shooting up to US$1 million by 2020, so John McAfee had better start thinking of the sauce to go with his penis, which he said he would eat if Bitcoin's price does not reach US$1 million by then.


The New York Times article referred to follows below.

nytimes.com

Opinion | The Bright, Shiny Distraction of Self-Driving Cars





The promise of self-driving cars can be alluring — imagine taking a nap or watching a movie in a comfortable armchair while being shuttled safely home after a long day at work. But like many optimistic images of the future, it is also a bit of an illusion.

Automated cars may indeed make commuting more pleasurable while preventing accidents and saving tens of thousands of lives — someday. But a recent fatal crash in Tempe, Ariz., involving a car operated by Uber that was tricked out with sensors and software meant to turn it into a latter-day version of K.I.T.T. from the TV show “Knight Rider” suggests that at least some of these cars are not ready for the hustle and bustle of American roads. In fact, the technology that powers these vehicles could introduce new risks that few people appreciate or understand. For example, when a computer controlling the car does not hit the brakes to avoid a collision, the person in the driver’s seat — many automated cars on the road today still require someone to be there in case of an emergency — may also fail to intervene because the driver trusts the car too much to pay close attention to the road. According to a video released by Tempe police, that is what appears to have happened in the Uber crash.

“Technology does not eliminate error, but it changes the nature of errors that are made, and it introduces new kinds of errors,” said Chesley Sullenberger, the former US Airways pilot who landed a plane in the Hudson River in 2009 after its engines were struck by birds and who now sits on a Department of Transportation advisory committee on automation. “We have to realize that it’s not a panacea.”

Mr. Sullenberger is hardly a technophobe. He has flown passenger jets crammed with advanced electronics and software and has a keen professional interest in technology. What concerns him and other safety experts is that industry executives and government officials are rushing headlong to put self-driving cars on the road without appropriate safeguards and under the unproven hypothesis that the technology will reduce crashes and fatalities. The Senate, for instance, is considering a bill that would exempt self-driving cars from existing federal regulations and pre-empt state and local governments from regulating them. And Arizona became a hotbed of self-driving testing by telling auto and technology companies — like Uber — that it will not ask too many questions or institute a lot of new rules.

Even as officials place a big bet that autonomous cars will solve many of our safety problems, American roads are becoming less safe. More than 37,000 people were killed on American roads in 2016, up 5.6 percent from 2015, according to government data. The National Safety Council, a research and advocacy organization, estimates that the death toll was more than 40,000 in 2017.

Consider automatic braking systems. The Insurance Institute for Highway Safety estimates that there is a 42 percent reduction in rear-end crashes that cause injuries when this technology is installed on cars. Advocates for Highway and Auto Safety and other public interest groups asked the Transportation Department in 2015 to require that all new trucks, buses and other commercial vehicles have such systems, which have been around for years. The department accepted that petition but has yet to propose a rule. The government did reach a voluntary agreement with 20 automakers to make automatic braking a standard feature on cars and light trucks by September 2022.

Even as American regulators have dragged their feet, other industrialized countries have made great strides in reducing traffic crashes over the last two decades. Road fatality rates in Canada, France, Germany and Sweden, for example, are now less than half the rate in the United States. And no, these countries don’t have fleets of self-driving cars. They have reduced accidents the old-fashioned way. Some of them have worked to slow down traffic — speed is a leading killer. They have added medians and made other changes to roads to better protect pedestrians. And European regulators have encouraged the use of seatbelts by putting visual reminders even in the back seat. Germany, which has the high-speed autobahn, also requires much more rigorous driver education and testing than most American states do.

“The things that have been killing us for decades are still killing us: speed, impaired driving, not using seatbelts,” said Deborah Hersman, the former chairman of the National Transportation Safety Board who now heads the National Safety Council. “The things that we know can save lives, some of them don’t cost any money, like seatbelts.”

Silicon Valley technologists would argue that algorithms and machine learning will simply leapfrog what they might dismiss as the legacy problem of human fallibility. But Mr. Sullenberger, for one, is worried that the rush to develop automated cars will lead to many unforeseen problems. “Even though there is a sense of urgency to prevent human-caused accidents,” he told me, “we need to do it in a responsible way, not the fastest way.”

https://www.nytimes.com/2018/03/31/opinion/distraction-self-driving-cars.html

Yours truly

IT.Scheiss
http://itsheiss.blogspot.my/