Daniel Sinclair @_DanielSinclair Building for young people. Not reading @danielsunread. Lurking behind likes and thinking about social media, communication, & China. Oct. 01, 2019 6 min read

Tesla's new V10 update includes Smart Summon, a further step along the Autopilot path toward true Level 5 autonomy and an automated taxi fleet.  https://www.youtube.com/watch?v=nlCQG2rg4sw 

Smart Summon and Autopilot are EARLY and that's kind of frightening, especially when the cars are failing to stop at stop signs or avoid other drivers in contained environments 😬

Usually the cars just get confused, and confuse other drivers, causing the real-world equivalent of a thread deadlock.  https://www.youtube.com/watch?v=-dfpnL9OpzM 
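The deadlock analogy is apt: two cars each waiting for the other to move first is the physical version of a classic circular wait. A minimal sketch of that failure mode (purely illustrative, nothing from Tesla's stack):

```python
# Illustrative sketch: two overly conservative agents each wait
# for the other to commit first, so neither ever moves. This is
# the real-world analogue of two threads each holding the lock
# the other one needs.

class Car:
    def __init__(self, name):
        self.name = name
        self.moved = False

    def try_move(self, other):
        # Policy: only proceed once the other car has committed.
        if other.moved:
            self.moved = True
            print(f"{self.name} proceeds")
        else:
            print(f"{self.name} waits for {other.name}")

a, b = Car("Tesla"), Car("Other driver")
for tick in range(3):  # three simulation steps, zero progress
    a.try_move(b)
    b.try_move(a)
# Neither precondition is ever satisfied: a circular wait.
```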

But this Model 3 literally drove into another car at 1 MPH while in Smart Summon, never activating accident avoidance.

I can't really speak to why Smart Summon seems to have so many flaws in these small environments while Autopilot performs so well on the highway, even in questionable, dynamic circumstances.

But there are some hints in that Smart Summon commercial, and in the diagnostics point cloud data that is displayed in the Tesla app.

Other than being a design decision that screams FUTURE and makes the app feel cooler, I don't know why that raw Tesla data is displayed at all. I believe there are two interfaces: one for near-range, the other w/ a waypoint. But there appear to be no manual intervention controls.

I could see this data being used for something like manual intervention as you babysit the car during these early versions. But, again, I don't see any of those controls. All of these videos have owners just watching their car crash.

Nonetheless, that diagnostics data is helpful in understanding the broader capabilities of Autopilot. I overlaid that visual data with the area I think it is describing: a Tesla backed up to a building, with a sidewalk on the left, cars in front, and a round-walled exit loop.

Given the point density, you can see that Autopilot has pretty high confidence that the car is facing away from that wall (an object to avoid) and is approaching a walled lane, another thing it really wants to avoid. It uses that to pathfind.
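One plausible reading of that pipeline: bin the point cloud into an occupancy grid, treat point density as confidence that a cell is blocked, and search for a route around the high-confidence cells. A rough sketch under those assumptions (the grid size, density threshold, and BFS planner are my own simplifications, not Tesla's implementation):

```python
from collections import deque

def occupancy_grid(points, min_hits=3):
    """Count points per cell; dense cells become obstacles."""
    hits = {}
    for x, y in points:
        cell = (int(x), int(y))
        hits[cell] = hits.get(cell, 0) + 1
    return {c for c, n in hits.items() if n >= min_hits}

def find_path(start, goal, blocked, size=10):
    """Breadth-first search over free grid cells."""
    queue, seen = deque([(start, [start])]), {start}
    while queue:
        (x, y), path = queue.popleft()
        if (x, y) == goal:
            return path
        for nxt in ((x+1, y), (x-1, y), (x, y+1), (x, y-1)):
            if (0 <= nxt[0] < size and 0 <= nxt[1] < size
                    and nxt not in blocked and nxt not in seen):
                seen.add(nxt)
                queue.append((nxt, path + [nxt]))
    return None  # no confident route: the car stalls

# A dense "wall" of returns along x == 5, plus one stray hit.
wall = [(5, y + 0.1 * i) for y in range(8) for i in range(4)]
stray = [(2, 2)]  # a single return: too sparse to register
blocked = occupancy_grid(wall + stray)
print(find_path((0, 0), (9, 9), blocked))
```

Note that the stray single-hit cell is ignored entirely, which is exactly the behavior that would make sparse returns, like a low curb, invisible to the planner.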

But the car is not seeing everything. This data is certainly a limited window into Autopilot's current understanding of this environment. It is missing some important elements, however, like the parked cars (which could just be clipped from this nadir window) & the sidewalk boundaries.

I believe those dots I highlighted are referencing the sidewalk poles. You can see their distance from the building wall, and that looks like the outer boundary of a sidewalk.

Tesla hasn't released many really in-depth diagnostics videos since that Hardware 2.0 video in California. Hardware 3.0 does everything faster w/ Tesla's own processor, but a lot of the cameras and sensors are the same between those versions.

I slowed down an important part of that 2.0 video that gives us some hints as to the limitations we may be seeing. First the full view, then just the medium-range (MR) camera. Watch the lane detection: the red lines are over-projected on the curve, and delayed until the camera sees the center lines.

At least back in 2016, using 2.0 hardware, Autopilot was most confident about lane detection using markers. It was much less confident about outer-lane lines & curbs. On roads & highways, where Autopilot excels, it is most likely hugging the left because of this disparity.
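If that disparity is real, a confidence-weighted centering rule reproduces the hugging behavior: pad each boundary inward in proportion to its uncertainty, then aim for the midpoint of what remains. A toy sketch (my guess at the mechanism; the padding rule and numbers are hypothetical):

```python
def lane_center_target(left_x, left_conf, right_x, right_conf,
                       margin=0.6):
    """Pad each boundary inward by its uncertainty, then aim for
    the midpoint of what remains. A fuzzy curb gets a big safety
    pad, shifting the target toward the well-detected marker."""
    left_edge = left_x + margin * (1.0 - left_conf)
    right_edge = right_x - margin * (1.0 - right_conf)
    return (left_edge + right_edge) / 2.0

# Confident left lane marker, fuzzy right curb: the target sits
# left of the true geometric center (1.75 m in a 3.5 m lane).
print(lane_center_target(0.0, 0.95, 3.5, 0.4))  # ~1.59, hugging left
```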

In the limited window we saw for Smart Summon, the car is confidently hugging the right wall. As the frame begins to see the parked cars, it would likely avoid them easily. But when looking at low curbs, the car appears far less confident, and doesn't infer them.

The odd thing about that distinction is that Autopark is quite good. At least in that mode, we have more confidence that the car can see and infer curbs. But on a scale of caution, it is possible that Autopark is drastically less aggressive, more hesitant, and thus sees more.
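One way to square that: the perception feed may be shared across modes, but each mode could apply its own confidence threshold before acting. A crawling Autopark can afford to react to faint curb returns; a faster mode can't. A hypothetical sketch of that idea (the mode names and cutoffs are mine, not Tesla's):

```python
# Hypothetical: one detection feed, filtered through per-mode
# caution thresholds. A hesitant mode "sees more" simply because
# it acts on weaker evidence.

THRESHOLDS = {
    "autopark":     0.2,  # crawling speed: react to faint returns
    "smart_summon": 0.5,
    "highway":      0.8,  # act only on high-confidence obstacles
}

def actionable(detections, mode):
    cutoff = THRESHOLDS[mode]
    return [d for d in detections if d["conf"] >= cutoff]

feed = [
    {"label": "car",        "conf": 0.95},
    {"label": "pedestrian", "conf": 0.85},
    {"label": "low curb",   "conf": 0.30},
]

print(actionable(feed, "autopark"))      # includes the low curb
print(actionable(feed, "smart_summon"))  # the curb drops out
```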

When watching that 2016 video, you can see that Autopilot is seeing more than just lanes. It picks up the planes of the road (the sidewalk plane is notably absent), as well as environment geography (like hills and tree canopies in green streaks). The rear cameras are better.

That last video showed the rearward cameras. This one shows the medium-range forward-looking camera, and how it appears to be tasked more aggressively with seeing the road planes & lanes. Again, sidewalks aren't highlighted here. That doesn't necessarily mean it doesn't see them.

But given the visual weight applied to the lane, as well as cars and pedestrians, we can probably interpret that Autopilot (at least then) had more confidence in seeing & avoiding those pedestrians and accidents than it did in seeing where the road stops and the sidewalk starts.

Years later, that same confidence weighting appears in Smart Summon. This could just be a distinction between how camera-dependent cars see & drive versus their LIDAR counterparts.

Waymo, hugely reliant on LIDAR, appears to have some problems w/ curbs too. While LIDAR (in good conditions) can see curbs in high fidelity, it is tasked w/ smoothing that data to pathfind. Sometimes Waymo interprets phantom localization events via curbs & the wheels stutter.

To clarify what I think I see: Waymo uses static landmarks like sidewalks to localize the vehicle in space, allowing it to stay centered in lanes. That data can be erroneous, maybe because of uneven or changing sidewalks, & the car thinks it's in the wrong place & adjusts.
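Put as code, the correction is only as good as the landmark map, so a moved or remodeled curb produces a phantom pose jump. A crude sketch of the assumed mechanics (not Waymo's actual localizer; the gain and offsets are invented):

```python
# Crude sketch: the car corrects its lateral position from the
# measured offset to a mapped curb. If the curb has changed since
# mapping, the "correction" is a phantom jump sideways.

MAPPED_CURB_OFFSET = 2.0  # meters from lane center, per the map

def localize(measured_curb_offset, estimated_lateral_pos, gain=0.8):
    """Nudge the pose estimate so the measured curb offset
    matches the mapped one."""
    error = MAPPED_CURB_OFFSET - measured_curb_offset
    return estimated_lateral_pos + gain * error

# Curb exactly where the map says: no correction.
print(localize(2.0, 0.0))  # 0.0

# Construction moved the curb 0.3 m: the car now "corrects"
# itself 0.24 m off-center, and the wheels stutter as the filter
# fights a steady stream of bad fixes.
print(localize(1.7, 0.0))  # 0.24
```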

There can be real problems w/ an over-emphasis on LIDAR. Sometimes the data is wrong, like when it's raining, foggy, or dusty. We can see that & the misinterpretation at the most basic levels of Waymo's driving. Waymo is even quietly reducing that dependence.

Between these bets, you have Tesla trying not to see or interpret sidewalks, instead using lane markers & larger objects for localization, and you have Waymo over-trusting those sidewalks & its sensors' ability to see them as core building blocks of driving.

It is becoming clearer that Tesla's path will prove brighter. A hybrid may win out as new types of radar & sensors appear, but LIDAR will more likely fade away. These visible qualms & decisions show us the importance of behavior, & the stark contrasts between Tesla's modes.

A lot is going wrong in the field tests of Smart Summon. It wasn't ready. But in its point cloud, we can see what Autopilot cannot, & what modes like this think they shouldn't; the right answer may demand more resolution than 720p & less confidence in the walls & markers.

There is also a lot that Tesla needs to do to describe these products & their immaturity. There is too much trust in these features. That's a real flaw of automation, of handing over the reins, but we can design better. We need those oddly absent controls.

Every insurer watching those Smart Summon videos

No matter the hand-holding, Smart Summon is just not ready yet. Yeesh. It zigzagged through parking spots, and didn't stop for a pedestrian in this video. I'm starting to wonder if it might be a camera FOV problem, like we may be seeing in curb behavior too.  https://www.youtube.com/watch?v=enkRALcdPb0 

The problems of new technology generally get more attention & coverage, and there are a lot of people unfortunately incentivized to market those issues, but are there really any Smart Summon videos out there that don't make you nervous? Sadly, it feels like Tesla's first dud.

The pinhole camera in this robot is more responsive to pedestrians, even when I try to walk into it, than what we're seeing in the Smart Summon tests. Tesla will win Level 4, but this is a real setback. It adds more weight to those early Summon demos w/ a secret driver.

