Tesla and GM diverge on autonomous driving offerings

With GM’s Super Cruise, drivers don’t have to hold the steering wheel or touch it periodically.

DETROIT — Autopilot. The name connotes a futuristic vision, the automotive equivalent of soaring carefree through open skies.

Contrast that with Super Cruise, which suggests an enhanced version of cruise control, relief from road-trip tedium, like a good audiobook.

The two brand names reflect the diverging visions of autonomy that Tesla Inc. and General Motors are offering today’s consumers: Tesla’s Autopilot, defined by a promise to stretch the boundaries of technology, vs. GM’s Super Cruise, characterized by the limits it places on both man and machine.

Which vision consumers embrace could have broad implications for how quickly and aggressively autonomous technology is deployed across the industry. And the events of recent weeks and months are sure to shape that conversation.

Autopilot — released as a beta technology in 2015 — has come under growing scrutiny following a string of crashes involving Tesla vehicles with the system engaged and reports of drivers misusing the system, including a Briton who was punished for sitting in the passenger seat while Autopilot guided his car.

Federal investigators looking into a fatal crash in California in March determined that an Autopilot-guided Tesla Model X sped up to 71 mph in the seconds before it slammed into a highway barrier, and that the driver's hands were off the wheel, Bloomberg reported last week. Safety advocates have pointed to the Autopilot name itself as a risk factor, given its well-known meaning in aviation.

Super Cruise, meanwhile, has experienced little to no controversy since launching last year on the 2018 Cadillac CT6. GM markets it as a limited-scope “hands-free” driving technology but avoids any comparisons to systems used on airplanes.

The smooth launch allowed GM to announce last week that the technology will grow across Cadillac’s lineup beginning in 2020, followed by GM’s other brands. GM product chief Mark Reuss, announcing the plans here at the Intelligent Transportation Society of America conference, called Super Cruise “a giant leap” along the path to true autonomous vehicles.

Different systems

Super Cruise and Autopilot, while designed for similar uses, operate differently and use distinct mechanisms to discourage abuse.

GM limited the use of Super Cruise to more than 130,000 miles of divided highways in the U.S. and Canada that were previously mapped using lidar — a key differentiator for GM that provides detail down to each lane on a highway and prepares the vehicle's controls for twists and bumps in the road. The mapping is in addition to the sensors, radar and cameras that help the vehicle analyze its surroundings.

GM also installed a facial recognition system that monitors the driver's alertness. Drivers don't have to hold the steering wheel or touch it periodically, as they do with other companies' systems, but they do have to keep their eyes open and on the road most of the time; otherwise, Super Cruise initiates a series of escalating alerts that can ultimately bring the vehicle to a stop.

Like GM’s system, Autopilot uses sensors, radar and cameras to “see” its surroundings. But it can be used on all highways and city streets and has an automatic lane-change feature activated by a turn signal — a feature GM engineers decided against.

And it’s not intended to be hands-free. Tesla uses the steering wheel touch sensor to try to ensure that drivers are behind the wheel and paying attention. But some drivers reportedly have figured out ways to subvert that safeguard by placing objects on the steering wheel to simulate a human’s touch.

Tesla has continually updated Autopilot since releasing it to consumers in 2015. An upgraded version was released in October 2016, months after a fatal crash in a Tesla Model S — the first known death involving a semi-autonomous driving system.

‘Demonstrably dangerous’

But accidents are still happening, and criticism is growing.

In a letter last month, two U.S. consumer advocacy groups urged the Federal Trade Commission to investigate what they call the “deceptive and misleading” use of the name Autopilot in the company’s marketing.

“It gives the impression of something that flies itself, or in the case of an automobile, drives itself,” said Jason Levine, executive director of the Center for Auto Safety, which along with fellow nonprofit Consumer Watchdog submitted the letter.

The letter was sent the same week Tesla agreed to settle a class-action lawsuit with buyers of its Model S and Model X who alleged that Autopilot was “essentially unusable and demonstrably dangerous,” according to Reuters.

Tesla has strongly defended the safety of Autopilot and its vehicles broadly, touting their renowned crashworthiness and attributing the accidents involving Autopilot to driver inattentiveness. In the case of the March crash in California, Tesla said in a statement that the driver didn’t respond to “several visual and one audible hands-on warning” earlier in the drive and the driver’s hands were not detected on the wheel for six seconds before the collision.

The company, citing vehicle logs, says the driver had about five seconds and 150 meters of unobstructed view of the concrete divider but did not take action.

Experts say all automakers should be concerned about crashes involving automated systems. Such crashes can undermine consumer confidence in driver-assistance and crash-prevention technologies of all kinds, as well as in the coming generation of fully autonomous vehicles that automakers and safety advocates see as potential lifesavers.

Said Levine: “It is only hurting the long-term ability to move this technology forward.”
