Two more reasons Apple doesn’t sell a self-driving automobile

From the New York Times’s “2 Killed in Driverless Tesla Car Crash, Officials Say” posted Monday:

Two men were killed in Texas after a Tesla they were in crashed on Saturday and caught fire with neither of the men behind the wheel, the authorities said.

Mark Herman, the Harris County Precinct 4 constable, said that physical evidence from the scene and interviews with witnesses led officials “to believe no one was driving the vehicle at the time of the crash.”

The vehicle, a 2019 Model S, was going at a “high rate of speed” around a curve at 11:25 p.m. local time when it went off the road about 100 feet and hit a tree, Constable Herman said. The crash occurred in a residential area in the Woodlands, an area about 30 miles north of Houston.

The men were 59 and 69 years old. One was in the front passenger seat and one in the rear seat, Constable Herman said.

My take: It’s a tossup who was more reckless: the two guys who let a Tesla drive itself into a tree, or Elon Musk, who sold them the thing. I’m reminded of the piece in EE Times by Colin Barnden that I quoted in January:

Uber seemingly didn’t think, or didn’t care, about the risks of hurling 4,500-pound Volvo XC90s around the streets of Tempe, Arizona, using test-level software and poorly trained safety drivers. It foolishly pursued the Silicon Valley mantra of “move fast and break things” in its efforts to be first to market with a self-driving robotaxi service and succeeded in killing Elaine Herzberg in March 2018…

For the last five or so years, we have been watching a multi-billion-dollar game of poker play out under the guise of “self-driving.” Uber was the robotaxi joker and Tesla the sucker that showed everyone the liability risks associated with poorly designed automated driving systems and inadequate driver monitoring in mass-market vehicles.

Apple, in comparison, is neither stupid nor a sucker. Apple did what Apple does and sat on the sidelines and watched developments unfold. It waited. And waited. And waited…

34 Comments

  1. Horace Dediu said:
    “No one was driving the vehicle at the time of the crash”? Tesla has the option for “full self-driving,” so if that was enabled, then the officials are wrong and there certainly was a driver.

    8
    April 19, 2021
    • Rodney Avilla said:
      It appears you may be applying your own definitions to ‘Musk’s words’ (the Tesla operating manual). Your definition implies the car IS the driver, no human needed. Tesla’s definition is well stated: ‘the car is driving, BUT a human MUST be at the controls to take over if need be.’ I realize that politicians and lawyers excel at swapping definitions, but it is wrong and unfair to do so. If your goal is to understand and communicate accurately, then seek to understand the intended message. To do otherwise causes deception and misunderstanding, which is why lawyers and politicians do it so often; the purpose is usually to demonize a person or point of view.

      2
      April 19, 2021
    • John Butt said:
      Most manufacturers have ACC (adaptive cruise control). Tesla calls the same thing Autopilot, and it’s unusable on my Model 3 compared with the ACC on my four-year-old Hyundai Ioniq.

      Tesla has a long way to go to catch up with existing manufacturers’ safety efforts. This is not an iPhone-like disruption.

      0
      April 19, 2021
    • Steven Noyes said:
      FYI: The car did not even have FSD available.

      0
      April 20, 2021
  2. David Emery said:
    A friend whose daughter goes to school in Texas observed, “Don’t see many Teslas in Texas.” (I wrote back, “Fewer now…”) I’m guessing range would be a concern in Texas and other “wide open places.”

    2
    April 19, 2021
    • Arthur Cheng said:
      Actually there are quite a few in Houston. I spend enough time there to know. They are driven around town rather than long distances.

      3
      April 19, 2021
      • Jerry Doyle said:
        @Arthur Cheng: You are correct. Texas has a great many Teslas on its roads, and they drive long distances too, such as Houston to San Antonio (197 miles), Houston to Dallas (239 miles) and Houston to Austin (165 miles), routes linking cities that rank among the ten most populous in the nation. The corridor from San Antonio to Austin is only 82 miles.

        0
        April 19, 2021
  3. Kirk DeBernardi said:
    I’ve heard it said that you can’t fix stupid.

    10
    April 19, 2021
    • Gregg Thurman said:
      Can’t upvote you more than once. Wish I could.

      This accident occurred because humans didn’t follow the manufacturer’s instructions regarding self-driving mode. But let’s ignore that and blame the manufacturer.

      Self-driving has been statistically safer than human drivers, despite the stupids, for a couple of years now.

      Would I use self-driving? Yes. But would I trust it to the extent that I wouldn’t sit behind the wheel? No.

      5
      April 19, 2021
      • Steven Noyes said:
        I agree 100% with this. For this to happen, the two occupants had to intentionally override built-in safety measures. Blaming Musk for this is intellectually dishonest, similar to blaming Ford for someone driving too fast around a tight curve.

        1
        April 19, 2021
  4. Jerry Doyle said:
    “… However, it (Tesla) warns that ‘current Autopilot features require active driver supervision and do not make the vehicle autonomous.’”

    Tesla cannot preclude stupidity. Who in their right mind believes we have fully autonomous (Level 5) vehicles on today’s roads? I regret the loss of life here, but driving at a high rate of speed on those curvy roads in the Woodlands (which I know well) late at night borders on the idiotic. Apple will be fully cognizant of this accident, but this avoidable accident will no more derail Apple’s car plans than 9/11 stopped people from flying. Tesla may now modify the feature so the car will not operate unless someone is sitting in the driver’s seat. This accident is in no way the fault of Tesla. This unfortunate and avoidable accident evolved from arrogance, self-importance and overbearing pride.

    8
    April 19, 2021
    • Bart Yee said:
      One issue with a person sitting in the driver’s seat is that they could be completely inattentive to “driving.” They could be on their phone or a tablet, watching in-car entertainment, or even asleep. So a person in the driver’s seat is no guarantee that anyone would be aware enough to take over if something were to happen.

      In this Tesla crash, it is unclear why the car approached a known curve at a speed that could cause it to lose control. With modern traction control, speed-limiting software and map coordination, and sensors, it is baffling why the Tesla would lose control, unless it simply didn’t and drove straight off the road instead. Assuming the control box/CPU/black box survived an intense, recurring, and very hot lithium-fed battery fire (lithium reacts with water to produce hydrogen, which can self-ignite from the heat of the reaction), which isn’t likely, the car’s actual behavior can be analyzed and its reaction to the curve understood. This is where the failure of sensors, algorithms, and decision-making gets examined.

      It is not clear whether the owner could input adverse driving parameters (exceeding speed limits, for example) or whether weather and road conditions played a role. A very slippery slope indeed. (For what that kind of log analysis might look like, see the illustrative sketch at the end of the thread.)

      2
      April 19, 2021
      • Jerry Doyle said:
        I will repost an earlier comment of mine here: we may use all the sophisticated word substitutions or euphemisms we choose, but folks, there is a reason why every Tesla sold has a steering wheel.

        0
        April 19, 2021
      • Steven Noyes said:
        Given that neither Autopilot nor FSD (not purchased for this car) was engaged….

        Your comment makes no sense.

        0
        April 20, 2021
  5. Jerry Doyle said:
    Came across this excellent comment on MarketWatch:
    Miguel White
    14 minutes ago

    Two allegedly “educated” participants, supposedly an engineer and a doctor, get into a Tesla, one in the back seat and the other in the front passenger seat. They engage the “autopilot” and start to move; a short time later they hit a tree at high speed. This is a Darwin Award if there ever was one.

    Tesla, on its site and in its operating manuals, CLEARLY states in multiple places that “autopilot” and other driver-assist features are only to be used with a human driver whose hands are on the wheel, ready to take over operation at any time.

    These factors absolve Tesla of ALL liability. Two people used the vehicle in a way that was not supported, and the manufacturer clearly stated this. The two people are wholly to blame for the outcome, despite what any law firm may claim.

    There should be no reason the Tesla stock should be down.

    2
    April 19, 2021
    • Steven Noyes said:
      Don’t forget, they had to actively defeat the measures in place to prevent this. I think Forrest Gump had a saying for this.

      1
      April 19, 2021
    • Bart Yee said:
      “There should be no reason the Tesla stock should be down.”
      Jerry, while I somewhat agree with your comments, Tesla stock is going to do what it does in response to what logical and illogical traders, investors, and other people do with their Tesla investment, short, and trading positions. It matters not one whit to them who or what is responsible for this terrible accident; they care only about making or losing money in the short term. They are Mr. Market, not accident investigators.

      2
      April 19, 2021
  6. Peter Kropf said:
    I guess the big issues are:

    1 – Is Tesla still supporting a mode Musk calls “full self-driving”?

    2 – When did Musk stop calling it “full self-driving”?

    Finally, how strong is the legal team bringing a class action suit against Musk’s “full self-driving”?

    PS – Of course, bank robbers say they rob banks because that’s where the money is. Lawyers say the same for suing Tesla.

    5
    April 19, 2021
  7. Rodney Avilla said:
    “My take: It’s a tossup who was more reckless”

    In London, where the murder rate has at times passed New York City’s, the weapon of choice is the knife. England must be home to a lot of reckless knife sellers.

    3
    April 19, 2021
  8. THOMAS E FARRIS JR said:
    Gregg is right. Now if we can only get gun owners to be responsible for their actions and leave the gun manufacturers out of it. It seems like a simple recognition: “Teslas don’t kill people, people do.”

    3
    April 19, 2021
  9. Rodney Avilla said:
    Because of a recent murder case, Apple decides to stop selling some computer models (those weighing over 10 lbs.). Details? An angry wife recently killed her husband with her Mac Pro when she discovered his multiple affairs. The method, the weapon, and the motive were all Mac Pro-related. The suspect discovered the affairs on her Mac Pro (social media), learned how much force and weight were needed to cause significant damage (Wikipedia), and was motivated by a recent Mac Pro article, “The New Mac Pro Enables You to Do Things You Never Imagined.”
    Footnote: Defense lawyers argued that nowhere in the manual did it state the Mac Pro should not be used in this way.

    5
    April 19, 2021
  10. Rodney Avilla said:
    Sorry, but one last comment. Twenty minutes ago I was driving my Tesla on Autopilot, unbuckled the seat belt, and the car immediately started slowing down with loud alarms going off. While stopped, I tried engaging Autopilot, even while sitting in the driver’s seat. Nothing happened. Nothing. This story is definitely missing crucial information.

    3
    April 19, 2021
  11. Ken Cheng said:
    I looked at the address where this took place.
    30.154814, -95.531329
    https://maps.apple.com/?address=Hammock%20Dunes%20Pl,%20Spring,%20TX%20%2077389,%20United%20States&ll=30.154814,-95.531329&q=Hammock%20Dunes%20Pl&_ext=EiYpldmScHsmPkAxA1wMYFbiV8A5kYTeKMgoPkBB5zAFKazhV8BQBA%3D%3D

    FSD (Full Self-Driving) and/or AP (Autopilot) cannot activate on an unlined road. Not only was this an unlined road, it’s a short residential dead-end lane in a mega-mansion gated community.

    Even if it could be activated, the speed is limited to 5 mph over the posted limit. A residential neighborhood is typically 25 mph, so 30 mph is the cap for AP/FSD.

    It’s clear to me that the deceased driver likely ended up in the back seat.

    I now see a report that this was two elderly friends taking the car out for a test drive. It’s likely just unfamiliarity. Late at night, they might have had a few drinks beforehand. A tragedy, but not likely an Autopilot or FSD one.

    Very likely a lot of bad reporting, and investigators who don’t really know how Teslas operate.

    1
    April 19, 2021
  12. Ken Cheng said:
    My original comment above has a link, so it was “awaiting moderation”; it’s the same comment, minus the link to where the accident took place.


    0
    April 19, 2021
  13. Jerry Doyle said:
    Update: “… A tweet by Tesla (TSLA) Chief Executive Elon Musk appeared to boost shares of the electric-car maker in late trade. ‘Data logs recovered so far show Autopilot was not enabled & this car did not purchase FSD [Full Self-Driving],’ he said on Twitter (TWTR).”

    1
    April 20, 2021
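
Addendum: Bart Yee’s comment about the car’s black box and Musk’s tweet about “data logs recovered so far” both turn on what the vehicle’s recorded telemetry shows in the seconds before impact. Purely as an illustration, and not as a description of Tesla’s actual, proprietary log format, here is a minimal sketch of that kind of analysis: read a recorder export, isolate the final few seconds, and check speed, braking, steering, and whether any driver-assist flag was ever set. The file name and every column name below are hypothetical.

    # Illustrative only: Tesla's real event-data-recorder format is proprietary.
    # This sketch assumes a hypothetical CSV export with columns
    # time_s, speed_mph, steering_deg, brake_pedal, autopilot_engaged.
    import csv

    def summarize_final_seconds(path: str, window_s: float = 5.0) -> dict:
        """Summarize the last `window_s` seconds of a hypothetical telemetry log."""
        with open(path, newline="") as f:
            rows = [
                {
                    "t": float(r["time_s"]),
                    "speed": float(r["speed_mph"]),
                    "steering": float(r["steering_deg"]),
                    "brake": r["brake_pedal"] == "1",
                    "autopilot": r["autopilot_engaged"] == "1",
                }
                for r in csv.DictReader(f)
            ]
        if not rows:
            raise ValueError("empty log")
        t_impact = rows[-1]["t"]  # treat the last recorded sample as the moment of impact
        final = [r for r in rows if r["t"] >= t_impact - window_s]
        return {
            "samples": len(final),
            "max_speed_mph": max(r["speed"] for r in final),
            "brake_applied": any(r["brake"] for r in final),
            "autopilot_ever_on": any(r["autopilot"] for r in final),
            "max_abs_steering_deg": max(abs(r["steering"]) for r in final),
        }

    if __name__ == "__main__":
        # Hypothetical file; no such export is publicly available for this crash.
        print(summarize_final_seconds("edr_export.csv"))

In a real investigation, NHTSA and the NTSB would work from the manufacturer’s own recovered data rather than a CSV like this; the point is only that the “who, or what, was driving” question is ultimately settled by this kind of record, not by seating positions alone.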
