
Edge means on-premises (on the robot), as you said.

But 'edge,' as used in the context of AI, is also a wink and a nod that the device is inference-only (no learning, no training). The term "inference only" just doesn't sound very marketing-friendly.




The AGX Xavier can do training on-device just fine - and run any CUDA workload. It's just not the fastest device for that; you'd prefer a desktop GPU for such a purpose if you can.
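To illustrate the point: nothing about a CUDA-capable edge device forces inference-only; on-device training is just an ordinary training loop. A minimal sketch (assuming PyTorch with CUDA support is installed on the device, and using dummy data in place of real inputs):

    # Ordinary PyTorch training step; runs the same way on a Jetson
    # AGX Xavier as on a desktop GPU, just slower.
    import torch
    import torch.nn as nn

    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

    model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 10)).to(device)
    optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)
    loss_fn = nn.CrossEntropyLoss()

    # Dummy batch standing in for real sensor data.
    x = torch.randn(32, 128, device=device)
    y = torch.randint(0, 10, (32,), device=device)

    for step in range(100):
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()      # gradients computed on the device itself
        optimizer.step()

The only practical difference from a desktop is speed and memory headroom, which is the point above.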


I assume what fizixer means is that, if you're making an Amazon-Alexa-type thing, one model trained on a million users' data will work better than a million models each trained on one user's data.

AFAIK the "Roomba learns the layout of your house" type of edge learning is generally done with SLAM rather than neural networks. There might be other applications for edge learning, of course.



