
I'm curious: is it possible to train a model elsewhere and send that model to a Raspberry Pi or something, to make something like this realistic to use? Then take the results from the Raspberry Pi, send them elsewhere to be trained into the model, and repeat?



Yes, this is very frequently done. Training is far more computationally expensive than evaluation in most cases.

I don't know of any specific cases of this being done with a Raspberry Pi, but many phone apps, for example, have this sort of architecture: train a model on a powerful server/cluster, send the model to phones for use, have the phones collect more training data that is sent back to the server/cluster, repeat.
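Roughly, the server side of that loop looks something like this. This is just a minimal sketch assuming TensorFlow/Keras, with a toy architecture and a placeholder export path, not any particular app's pipeline:

    # Server side: train on collected data, then export a compact
    # model for devices to run. Architecture and paths are placeholders.
    import tensorflow as tf

    def train_and_export(x_train, y_train, export_path="model.tflite"):
        model = tf.keras.Sequential([
            tf.keras.Input(shape=(x_train.shape[1],)),
            tf.keras.layers.Dense(64, activation="relu"),
            tf.keras.layers.Dense(10, activation="softmax"),
        ])
        model.compile(optimizer="adam",
                      loss="sparse_categorical_crossentropy",
                      metrics=["accuracy"])
        model.fit(x_train, y_train, epochs=5, batch_size=32)

        # Convert to TensorFlow Lite so the device only has to run inference.
        converter = tf.lite.TFLiteConverter.from_keras_model(model)
        with open(export_path, "wb") as f:
            f.write(converter.convert())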


I have successfully run a TensorFlow model on my Android phone. I haven't tried the Raspberry Pi bits, but TensorFlow does have examples for running models on it: https://github.com/tensorflow/tensorflow/tree/master/tensorf...

You can certainly send data from your Raspberry Pi to a central server, then periodically retrain and push the updated model back to the Pi through whatever update mechanism you create, but that will require your own infrastructure.
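For illustration, the device side of that infrastructure might look like the sketch below: load the exported model, run inference locally, upload collected samples, and periodically pull a retrained model. The server URL and endpoints are hypothetical, and the input is assumed to already match the model's expected shape (batch dimension included):

    # Device side (e.g. a Raspberry Pi). Uses the TFLite interpreter;
    # SERVER and its endpoints are made up for this example.
    import numpy as np
    import requests
    import tflite_runtime.interpreter as tflite  # or tf.lite.Interpreter

    SERVER = "http://example.com"  # hypothetical central server

    def predict(sample, model_path="model.tflite"):
        interpreter = tflite.Interpreter(model_path=model_path)
        interpreter.allocate_tensors()
        inp = interpreter.get_input_details()[0]
        out = interpreter.get_output_details()[0]
        interpreter.set_tensor(inp["index"], np.asarray(sample, dtype=np.float32))
        interpreter.invoke()
        return interpreter.get_tensor(out["index"])

    def upload_sample(sample, label):
        # sample must be JSON-serializable (e.g. a plain Python list)
        requests.post(f"{SERVER}/samples", json={"x": sample, "y": label})

    def fetch_latest_model(model_path="model.tflite"):
        r = requests.get(f"{SERVER}/model")
        with open(model_path, "wb") as f:
            f.write(r.content)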


Yes, I train my models on my GPU cluster, and then transfer them to my iOS app, which does image recognition.

Some of the models can be quite large for, say, a 3G connection (225 MB), but I'm working on various compression techniques now too.
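One common compression approach is post-training quantization. This is a generic TensorFlow Lite sketch rather than the parent poster's actual pipeline (an iOS app may well use Core ML instead), and "saved_model_dir" is a placeholder path:

    # Shrink a trained model via post-training weight quantization.
    import tensorflow as tf

    converter = tf.lite.TFLiteConverter.from_saved_model("saved_model_dir")
    converter.optimizations = [tf.lite.Optimize.DEFAULT]  # quantize weights
    tflite_quantized = converter.convert()

    with open("model_quantized.tflite", "wb") as f:
        f.write(tflite_quantized)

This typically cuts the file size to roughly a quarter of the float32 original, usually with a small accuracy cost.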



