The ineptitude of tech culture in managing real-world engineering is approaching a pathetic, comedic level of complete surrender, as the tally of accidents continues to pile up, caused by flawed self-driving technology and gullible drivers who put their complete faith and trust in a raw, still-developing technology.
The unbelievable list of hedonistic driver no-nos committed in a moving automated vehicle is no laughing matter, and includes sleeping, texting, cooking a rich curry, watching movies, changing wardrobes in the back seat, exercising, and playing a street-race video game. Apparently, the trendy objective in operating a self-driving vehicle is to engage in any activity that does not involve hands on the steering wheel or eyes on the road, as paying attention is no longer cool. Just let the computer handle everything. This disturbing, lackadaisical approach to driving smart cars, combined with the questionable testing and engineering protocols Google and Tesla employ in using public streets as live proving grounds, creates an unacceptably dangerous situation for every other driver, cyclist, and pedestrian sharing the streets.
Nearly two months after ride-sharing giant Uber suspended testing of self-driving vehicles in North America in the wake of a horrendous tragedy, in which a pedestrian was struck and killed in the Phoenix area, another sentient automobile and its driver have catapulted to the front page, as BBC News reports that a Tesla in "autopilot" mode crashed into a parked police vehicle in California on Tuesday. Fortunately, the driver suffered only minor injuries, and the investigation rages on as to who was at fault: the car, the operator, or a combination of the two. Certain self-driving systems allegedly underperform in differentiating between moving and non-moving objects, which was reportedly the primary cause of Tuesday's incident, a glitch that should completely disqualify these vehicles from interacting with live traffic.
While the precarious blame-allocation balancing act wavers between human and machine and vice versa, the report released by the National Transportation Safety Board (NTSB) on the deadly crash is interesting, if not disconcerting. NTSB officials conclude that the Uber driver had over six seconds to avoid the collision with the pedestrian, an eternity in a read-and-react situation at arterial speeds. Given that concrete data, three alarming scenarios could have taken place: either the driver was completely oblivious to the situation until the fatal contact was made, they saw the person too late, or they attempted to disengage and override the technology but failed to gain full manual control of the vehicle in executing a swerve or stop maneuver. Option one is horrific and chilling, and option three is utterly sickening. The haunting monotone of HAL in Arthur C. Clarke's 2001: A Space Odyssey, and its life-or-death chess match between man and machine, is fair warning against awarding a computer control over real life.
The most telling issue facing the future of the self-driving vehicle market is voiced eloquently by a driver turned crash-test dummy:
“The vigilance required to use the software, such as keeping both hands on the wheel and constantly monitoring the system for malfunctions or abnormal behaviour, arguably requires significantly more attention than just driving the vehicle normally.”
Society can live with just a clear mind, a steering wheel, a gas pedal, and a brake. Anything more is pathologically reprehensible.
Read the BBC News story here.