Why The New York Times Tesla Model 3 Review Is Nonsense

© Alex Roy

Remember back in 2016 when Elon Musk suggested that critics of self-driving cars were killing people? He may yet be right, but not until self-driving cars are both commercially available and demonstrably safer than humans. Between now and then we have a deeper problem, which is that the majority of self-driving media coverage is absolute nonsense. The stench of clickbait around self-driving is so thick, it’s almost impossible to find a story containing accurate information. Between the fools at Business Insider and the shills grasping for the 40% of global automotive media traffic with “Tesla” in the headline, I thought the toilet bowl of coverage was full.

I was wrong. The latest offender? The New York Times’ absurd Tesla Model 3 review.

What is happening at the New York Times? One would expect the paper of record to have people with basic knowledge of their topic areas, if not expertise. Sadly, when it comes to automotive coverage, there’s no there there. They fired everyone with sector knowledge last year, including The Drive’s very own expert Lawrence Ulrich. That error became glaringly clear with the Times’ October 2017 outrage, “Driverless Cars Made Me Nervous. Then I Tried One”, in which the esteemed David Leonhardt contradicted the headline in the first sentence, calling the Volvo S90 “semi-driverless”, a term that exists nowhere except in the pages of publications too cheap to keep experts on the masthead.

[Screenshot © Provided by TIME Inc.]

A car is self-driving, or it isn’t.

A car is driverless, or it isn’t.

A car has semi-autonomous features, or it doesn’t.

That the widely recognized Society of Automotive Engineers (SAE) automation levels are vague and need revamping is no excuse; Leonhardt doesn’t even cite them. The headline and article are so misleading as to make the New York Times complicit in the very storm of self-driving disinformation they would seek to clear.
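For reference, here is a minimal sketch of the SAE J3016 taxonomy Leonhardt never cites. Under it, Tesla Autopilot is a Level 2 driver-assistance system, nowhere near “driverless”; the one-line descriptions are my paraphrases, not the standard’s exact wording.

```python
# Sketch of the SAE J3016 driving-automation levels (0-5).
# Descriptions are paraphrased, not quoted from the standard.
SAE_LEVELS = {
    0: "No Driving Automation: the human does everything",
    1: "Driver Assistance: steering OR speed support (e.g. adaptive cruise)",
    2: "Partial Automation: steering AND speed support; human must supervise",
    3: "Conditional Automation: system drives, human must take over on request",
    4: "High Automation: no human fallback needed within a limited domain",
    5: "Full Automation: the car drives anywhere a human could",
}

def describe(level: int) -> str:
    """Return the description for an SAE level, or raise on nonsense input."""
    if level not in SAE_LEVELS:
        raise ValueError(f"SAE defines levels 0-5, not {level}")
    return SAE_LEVELS[level]

# Tesla Autopilot sits at Level 2: it assists, it does not drive.
print(describe(2))
```

Note there is no level called “semi-driverless” anywhere in that dictionary.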

Musk might yet be half-right, but he was also half wrong. It’s not just the critics of self-driving that are killing people. It’s also the supporters. When the Times is foolish enough to conflate “driverless”, “self-driving”, “semi-autonomous” and “semi-driverless”, people who don’t know better might start believing Tesla Autopilot actually is an autonomous system.

Despite a flurry of Twitter criticism, the New York Times didn’t resolve its total lack of automotive expertise; it doubled down with the most irresponsible and inaccurate Tesla Autopilot article yet from a mainstream publication, “With Tesla in a Danger Zone, Can Model 3 Carry It to Safety?”

This is a really bad article. © Provided by TIME Inc.

Let’s deconstruct the offending paragraphs:

[Screenshot © Provided by TIME Inc.]

What is author James B. Stewart trying to suggest? That “Autopilot” is a feature set that can be transferred from aviation to cars? Tesla Autopilot is a brand, not a technology, and as the 793,000 news stories surfaced by a 0.54-second Google search point out, it isn’t the same thing as an aviation autopilot for the ground.

[Screenshot © Provided by TIME Inc.]

Huh? I searched the Tesla website and couldn’t find any reference to them claiming Autopilot is “an autonomous driving system.” It isn’t. It’s semi-autonomous, at best. That second sentence about full self-driving capabilities adds context, but not the good kind.

[Screenshot © Provided by TIME Inc.]

At least the author uses “semi-autonomous” this time, but then he makes a factual error. Gripping the wheel doesn’t disable Autopilot and transfer control back to the driver. Neither does laying your hand on it. Torque does. There is no capacitive sensor in the wheel that senses human touch; you have to apply approximately 1 newton-meter (N·m) of torque to the wheel to disengage Autopilot.
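The distinction matters: a capacitive rim would fire on mere contact, while a torque-based system only notices a deliberate tug. A minimal sketch of the difference, with the caveat that the 1 N·m threshold is the approximate figure cited above, not an official Tesla specification:

```python
HANDS_ON_TORQUE_NM = 1.0  # approximate threshold cited above, not an official spec

def capacitive_detects_hands(touching_wheel: bool) -> bool:
    # A capacitive sensor would register mere skin contact.
    # The Model 3 wheel has no such sensor.
    return touching_wheel

def torque_detects_hands(applied_torque_nm: float) -> bool:
    # Tesla infers hands from torque on the steering column,
    # so a hand resting lightly (near-zero torque) is invisible to it.
    return abs(applied_torque_nm) >= HANDS_ON_TORQUE_NM

# A hand resting on the wheel applies almost no torque: undetected.
print(torque_detects_hands(0.1))
# A deliberate tug crosses the threshold: detected.
print(torque_detects_hands(1.2))
```

This is exactly why “laying your hand on it” does nothing: the system measures force, not touch.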

Bad New York Times. Bad.

And then things go from uninformed to inaccurate.

[Screenshot © Provided by TIME Inc.]

If I knew nothing, I would assume that the author told the Tesla Model 3 where to go, it drove there, and then it parked, all without his intervention. He actually uses the phrase “without my intervention.” That sounds like a voice-activated self-driving car. A car that can do everything door-to-door. There is nothing in that paragraph that suggests the Tesla Model 3 is anything but a self-driving car.

Newsflash: there are no self-driving cars for sale today. Not one.

What the author describes is not what actually happened.

Here’s what actually happened:

  1. The author pressed a button on the steering wheel to activate Voice Command.
  2. The author said “Navigate to Garden State Plaza”.
  3. The navigation system listed the Garden State Plaza on the center screen.
  4. The author tapped on the correct destination on the center screen.
  5. The author began driving under human control.
  6. Using a stalk left of the steering wheel, the author set the radar cruise control distance to 3 car lengths. Unless it was already set there.
  7. When conditions permitted, the author pulled twice on the same stalk to engage Autopilot, which remained engaged for some length of time, one or more times.
  8. When the author approached a light, exit, or intersection, he disengaged Autopilot by tapping the brakes, applying torque to the steering wheel, and/or using the same stalk.
  9. The author then manually drove into the parking lot.
  10. Upon finding a spot, the author stopped the car, then engaged the Automatic Parking functionality, which parked the car.
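To underline how much human input that sequence involves, the steps above can be sketched as a crude state machine. The state and event names are illustrative labels of mine, not anything from Tesla’s firmware:

```python
# Illustrative state machine for the drive described above.
# Every transition is triggered by the human driver (or the car
# finishing a parking maneuver the human explicitly requested).
TRANSITIONS = {
    ("parked", "driver_starts_driving"): "manual",
    ("manual", "driver_double_pulls_stalk"): "autopilot",
    ("autopilot", "driver_brakes"): "manual",
    ("autopilot", "driver_torques_wheel"): "manual",
    ("autopilot", "driver_pulls_stalk"): "manual",
    ("manual", "driver_engages_autopark"): "autoparking",
    ("autoparking", "car_finishes_parking"): "parked",
}

def step(state: str, event: str) -> str:
    # Unknown events leave the state unchanged.
    return TRANSITIONS.get((state, event), state)

state = "parked"
for event in ["driver_starts_driving", "driver_double_pulls_stalk",
              "driver_brakes", "driver_engages_autopark",
              "car_finishes_parking"]:
    state = step(state, event)
print(state)  # back to "parked", and a human triggered every transition
```

Notice that nothing in the table gets the car out of “parked” or into “autopilot” without a driver action; that is the opposite of “without my intervention.”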

Almost everything in that paragraph is wrong. That is not a fully self-driving car. That is a car in which a person has a lot of choices to make, and has to make them, or the car isn’t going anywhere. And if it is going somewhere while the driver fails to make some real decisions, it’s going into a ditch, or worse.

[Screenshot © Provided by TIME Inc.]

Does the author understand what he’s saying? The Tesla does not eliminate the danger of a blind spot. That would require sensors the Model 3 lacks, like a rear radar. Strangely, he points out that he’s had problems in other cars with blind spots, then says this:

[Screenshot © Provided by TIME Inc.]

So what’s his conclusion?

[Screenshot © Provided by TIME Inc.]

Tesla’s wireless over-the-air (OTA) updates are brilliant, but they can’t magically conjure up the necessary hardware to resolve the problem he describes.

The author isn’t familiar with the most basic concepts of automation or autonomous systems, or the difference between series and parallel semi-autonomy. #WouldYouLikeToKnowMore? Here’s an article about it from last year which explains the difference, and how misguided Stewart is when he claims Autopilot “enhances (rather than supplants) human performance.”

Virtually everyone, from car companies to Silicon Valley to the Department of Transportation and NHTSA, has failed to educate consumers about the realities of automation and autonomy, which are not the same thing. If safety is a desired goal of self-driving cars, it’s not being served by misinformation.

This is the legacy of cost cutting in newsrooms.

When one of the most respected papers in the country is publishing drivel of this level, what does that suggest about their other coverage?

Signed,

A lifelong New York Times fan and reader.

P.S. If you want to read a serious Model 3 review, read “Tesla Model 3, The First Serious Review.”
