
I don't think it will ever be shown as "raw data" like that, though. They will most likely only show the current street view, with cars and any unknown objects that are in the road.



Absolutely. The point, though, is that the visualisation isn't the occupancy network. And the occupancy network is a lot closer to a substitute for ultrasonics than anything shown on car screens. How much closer is hard to guess, and likely unknown by anyone commenting in this thread. But I dare say it's probably closer than some critics assume.


> The point is though that the visualisation isn’t the occupancy network. And the occupancy network is a lot closer to a substitute for ultrasonics than anything shown on car screens.

What is it, then? Because my car is detecting the occupancy of something that isn't a car, something it previously showed no visuals for, and the release notes don't mention anything that would enable this visualization aside from the occupancy network [0].

> Added control for arbitrary low-speed moving volumes from Occupancy Network. This also enables finer control for more precise object shapes that cannot be easily represented by a cuboid primitive. This required predicting velocity at every 3D voxel. We may now control for slow-moving UFOs.

> Upgraded Occupancy Network to use video instead of images from single time step. This temporal context allows the network to be robust to temporary occlusions and enables prediction of occupancy flow. Also, improved ground truth with semantics-driven outlier rejection, hard example mining, and increasing the dataset size by 2.4x.
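
A minimal sketch of the data structure those notes seem to describe: a voxel grid where each cell carries an occupancy probability plus a velocity estimate (the "occupancy flow"). The grid dimensions, voxel size, and thresholds below are my own assumptions, not anything Tesla has published.

    import numpy as np

    # Assumed grid: 160 x 160 x 16 voxels around the car (all sizes are guesses).
    GRID = (160, 160, 16)

    occupancy = np.zeros(GRID)        # P(voxel is occupied), output of the network
    flow = np.zeros(GRID + (3,))      # per-voxel velocity estimate in m/s

    def slow_moving_volumes(occ, vel, occ_thresh=0.5, speed_max=1.5):
        """Select occupied voxels moving below a speed cutoff -- roughly the
        'arbitrary low-speed moving volumes' the notes say the planner can
        now control for, without fitting a cuboid to them first."""
        speed = np.linalg.norm(vel, axis=-1)
        mask = (occ > occ_thresh) & (speed < speed_max)
        return np.argwhere(mask), vel[mask]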

Additionally, the wording they used when presenting the research was vague: they say it isn't currently shown, but then proceed to show raw visual data. What I take this to mean is that they won't ever show the raw data, but will build visuals from it. They even detail this by mapping occupied space to "the nearest available item" [2].
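
In other words, something like the sketch below, where each detected volume is snapped to whichever display asset best fits its bounding box. The asset catalog and the fit metric are purely my guesses at how such a mapping could work:

    # Hypothetical asset catalog: (length, width, height) in metres.
    ASSETS = {
        "car": (4.5, 1.8, 1.5),
        "pedestrian": (0.5, 0.5, 1.8),
        "trash_bin": (0.6, 0.6, 1.2),
    }

    def nearest_asset(extent):
        """Return the asset whose dimensions best match a detected volume's
        bounding-box extent (sum of squared differences -- a guessed metric)."""
        def fit(dims):
            return sum((a - b) ** 2 for a, b in zip(dims, extent))
        return min(ASSETS, key=lambda name: fit(ASSETS[name]))

    print(nearest_asset((0.6, 0.5, 1.7)))   # -> "pedestrian"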

FWIW, _if_ this is the occupancy network in action, it's lacking a lot. It can't detect the thin (~4" square, floor to ceiling) poles in my garage, but my ultrasonic sensors can. (I know this because my car showed a path going straight into one.) I also can't find anyone else discussing the visuals I'm seeing on my car.
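
For what it's worth, a thin pole failing while ultrasonics succeed is at least consistent with a coarse voxel grid. Assuming a voxel of, say, 0.3 m per side (my assumption; the real resolution isn't public), a ~0.1 m pole fills only about a tenth of a voxel's footprint and could easily fall under an occupancy threshold:

    # Back-of-envelope with an ASSUMED 0.3 m voxel:
    voxel_side = 0.3   # m per side (assumed, not published)
    pole_side = 0.1    # ~4 inches
    print((pole_side / voxel_side) ** 2)  # ~0.11: pole covers ~11% of a voxel footprint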

Additionally, the cameras on Teslas have a large blind spot directly in front of the car, exactly where the ultrasonics were. [1]

[0]: https://www.notateslaapp.com/software-updates/version/2022.2...

[1]: https://nitter.net/greentheonly/status/1216194238146383879#m

[2]: https://youtu.be/jPCV4GKX9Dw?t=779



