Blog: The Growing Risk of the Fix-It-Later Culture

By Brian Santo

Some in the high-tech industry mistake irresponsibility for "disruption." We can't let those people run the auto industry.

The automotive industry is being “disrupted” by the electronics industry, and together they’re on the verge of doing something dangerous and irresponsible. As car makers and their high-tech suppliers imbue motor vehicles with increasing levels of autonomy, they are heading toward absolving themselves of all responsibility for their products. Tesla and its cheerleaders are pushing the issue right now. It’s behavior that should not be tolerated.

Corporations have long practiced the art of limiting their legal culpability. Is there anyone unfamiliar with the advisory to “use at your own risk”? Kitchen cutlery, playground equipment, sewing machines — there are any number of products that have some inherent element of danger. And it is eminently reasonable to expect people to use such products with caution and be responsible for what happens when using them.

There are other products, such as chainsaws, firearms, and chemical sprayers, which if operated improperly can be a danger not only to the user but to others as well. Even so, the operator is still largely responsible — again, reasonably so. As part of that responsibility it is also reasonable to expect the user to maintain such products so that they function properly and are safe — keep them oiled, store them unloaded, use as directed. 

Then there is a separate category reserved for products so large, complex, and sophisticated that their safe operation is beyond the control of their operators — systems too complex for the user to be expected to maintain and be entirely responsible for, and products that can be dangerous to anyone who happens to be in, on, or near them (purposefully or merely coincidentally). Responsibility for airplanes reasonably goes beyond pilots to include ground crews, airlines, airports, manufacturers, and government overseers. It's similar for explosives and demolition experts, and for construction cranes and their operators.

Motor vehicles have long been in the category of products which, when operated improperly, can be a danger to others. For the most part, individual drivers are entirely responsible for the operation of their vehicles, and also for their maintenance.

But upon adding autonomous functions, motor vehicles shift into that third category — or they should. Vehicles with autonomy become too complex for drivers alone to be responsible for. And that’s where the worst of Silicon Valley comes in.  


Over the years, the technology industry has achieved something extraordinary: it treats its customers as guinea pigs and makes them happy about it. There are millions upon millions of people — smartphone users and gamers — who have grown up eager to be beta testers, happy to risk getting hung up on flaws in both software and hardware in exchange for the honor of access to a product before it reaches the average consumer.

App culture has made consumers comfortable with software errors. “There’s a bug? Someone will report it, and a patch will go out overnight. No problem!” Of course, that same culture encourages some companies to tolerate slapdash code-writing and to adopt a cavalier attitude toward product testing. “Of course there’ll be a bunch of bugs! Nobody cares! We’ll patch ‘em later!”

The fix-it-later mentality has become acceptable, especially when the product is software.

There is only so much damage that bug-ridden software can do on a consumer electronics device. But bug-ridden software in a vehicle (or any other autonomous system) can lead to physical injury and death. That’s a key distinction, and it seems to be shrugged off by technology geeks who treat the fix-it-later way of doing things as an intrinsic element of technological progress.

It is not. Commercializing defective products is sloppy and irresponsible, and nothing more. That sloppiness and irresponsibility have been tolerable — acceptable, even — because they haven’t had any serious consequences. Yet.

Junko Yoshida wrote about Tesla’s Smart Summon feature. It behaves like a Level 4 feature, one in which the vehicle operates with nobody behind the wheel, which under the SAE’s autonomy taxonomy ordinarily would mean that the OEM (in this case Tesla) is largely responsible for anything that goes wrong when it’s activated. But Tesla is pretending it’s a Level 2 feature, one that requires constant human supervision, which means Tesla is shifting responsibility for anything that goes wrong onto Tesla customers. Semicast analyst Colin Barnden told Yoshida he thinks Smart Summon is just a gimmick, meant to show off (their conversation on the topic is in a recent podcast: AVs and the Blame Game ● Indian IC Industry Ascendent ● The Artistry of AI).

But I’m of a more conspiratorial bent. Is it a gimmick? Yes. Is it just a gimmick? No, I don’t think so. I think Tesla is probing to see how much tolerance we have for shifting what should be Tesla’s responsibility onto Tesla drivers.

So far, judging by the reaction of Tesla fanboys, it’s working.

One of the most insidious practices mastered by the high-tech industry is the imposition of end-user license agreements. EULAs tend to be pages-long masterpieces of mind-numbing legalese, much of it devoted to companies absolving themselves of any imaginable responsibility, liability, or culpability for any possible shortcoming in the functioning or operation of their products. Because they’re unreadable, nobody reads them; yet agreeing to them is obligatory, and that is how EULAs have come to be seen as trivial.

They are not trivial.

I think it is inevitable that an automotive OEM — most likely one of the “innovative” high-tech companies that prides itself on “disruption” — is going to try to get away with foisting EULAs on its customers, absolving itself of responsibility for the operation of its Level 3 to Level 5 autonomous vehicles, vehicles that anyone sensible should consider in the same category as jet aircraft or sticks of dynamite.

That avenue should be blocked off now, before it’s too late. If it’s not done by common agreement now, it will have to be done by legislation later, and it will have to be legislated at the federal level. It can start at the state level (California often leads in this area), but if safety regulation remains isolated within the states, we end up with a patchwork of rules. At that point we don’t have safety; we just have more legal mess waiting to happen.

I’m worried that we, as a society, have become so habituated to being beta guinea pigs, and to reflexively agreeing to EULAs, that we’ll end up letting auto OEMs foist off upon us the legal responsibility for the operation of autonomous devices — making us liable for risks that we couldn’t possibly be truly responsible for unless we mindlessly and stupidly volunteer to assume them. That will end in sorrow.
