
Why can't you update it?

A new model could be released every six months or so (even that is probably too often, since the statistical distribution of images being compressed isn't likely to shift much over time), and you'd keep copies of all the old models, or lazily download them the way applications pull in specific versions of Microsoft's Visual C++ redistributables at install time.
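The keep-every-version idea can be sketched in a few lines. This is a hypothetical helper, not any real codec's API: `model_path` and the cache location are made-up names, and `fetch` stands in for whatever download mechanism the application uses.

```python
import os

MODEL_CACHE = "/tmp/nn-codec-cache"  # hypothetical cache location

def model_path(version, fetch, cache_dir=MODEL_CACHE):
    """Return the local path of decoder model `version`, fetching it on first use.

    `fetch(version)` returns the model bytes (e.g. an HTTP download).
    Every version ever released stays cached side by side, so a file
    compressed with an old model still decodes years later, much like
    keeping several C++ runtime redistributables installed at once.
    """
    path = os.path.join(cache_dir, f"model-{version}.bin")
    if not os.path.exists(path):
        os.makedirs(cache_dir, exist_ok=True)
        with open(path, "wb") as f:
            f.write(fetch(version))
    return path
```

A compressed file would then only need to record which model version it was encoded with.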

The models themselves aren't very large.



I don't know why this comment was downvoted - it's a legitimate question.

One scenario I can picture is the Netflix app on your TV. First, they train a neural network on the video data in their library and ship it to all their clients while those clients are idle. They could then stream very high-quality video at lower bandwidth than they currently use and, assuming decoding can be done quickly enough, provide a great experience for their users. Any updates to the neural network could be rolled out gradually and in the background.
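For gradual rollout to work, each stream chunk has to say which decoder network it was encoded against. Here's a minimal sketch of such a container framing; the magic bytes and field layout are invented for illustration, not any real Netflix format:

```python
import struct

# Hypothetical container format: each chunk carries the id of the decoder
# network it was encoded against, so clients upgraded to a newer model can
# still play back streams encoded with an older one.
MAGIC = b"NNV1"

def pack_chunk(model_id: int, payload: bytes) -> bytes:
    # 4-byte magic, then big-endian u32 model id and u32 payload length.
    return MAGIC + struct.pack(">II", model_id, len(payload)) + payload

def unpack_chunk(chunk: bytes) -> tuple[int, bytes]:
    assert chunk[:4] == MAGIC, "not an NNV1 chunk"
    model_id, length = struct.unpack(">II", chunk[4:12])
    return model_id, chunk[12:12 + length]
```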


Google used to do something called SDCH (Shared Dictionary Compression over HTTP), in which a delta-compression dictionary was downloaded to Chrome.

The dictionary had to be updated from time to time to maintain a good compression ratio as Google's pages changed, and there was a whole protocol for negotiating which dictionary the client already had.
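The core mechanism is easy to demonstrate with zlib's pre-shared dictionary support (`zdict`). This is only an analogy for SDCH, which used its own delta format and negotiation headers; the dictionary and page bytes below are made up:

```python
import zlib

# Bytes both sides already have: boilerplate that recurs across many pages.
shared_dict = b"<html><head><title>Example</title></head><body>"

page = b"<html><head><title>Example</title></head><body>Hello, world</body></html>"

# Compress without the dictionary...
c_plain = zlib.compressobj()
plain = c_plain.compress(page) + c_plain.flush()

# ...and with it: the boilerplate becomes a back-reference into the dictionary.
c_dict = zlib.compressobj(zdict=shared_dict)
with_dict = c_dict.compress(page) + c_dict.flush()

# The receiver must supply the identical dictionary to decompress, which is
# why SDCH needed a protocol to agree on which dictionary the client held.
d = zlib.decompressobj(zdict=shared_dict)
assert d.decompress(with_dict) + d.flush() == page
```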


Not just that, but you could take a page out of the compression book and treat the NN itself as a sort of dictionary that is effectively part of the compressed payload. Maybe not the whole NN, but perhaps deltas from a reference model, assuming the network structure stays the same or similar.
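A quick illustration of why shipping deltas pays off, assuming (as the comment does) that the network structure is unchanged and most weights barely move between releases. Everything here is synthetic data, not a real model format:

```python
import array
import random
import zlib

random.seed(0)

# Reference model the client already has: 10k random float32 weights.
reference_weights = array.array("f", (random.uniform(-1, 1) for _ in range(10_000)))

# An update that nudges only 100 of those weights.
updated_weights = array.array("f", reference_weights)
for i in random.sample(range(len(updated_weights)), 100):
    updated_weights[i] += 0.01

# Delta from the reference: almost entirely zeros, so it compresses very well,
# whereas the full weight array is random-looking and barely compresses.
delta = array.array("f", (u - r for u, r in zip(updated_weights, reference_weights)))
full_payload = zlib.compress(updated_weights.tobytes(), 9)
delta_payload = zlib.compress(delta.tobytes(), 9)

# Client side: reference + delta reconstructs the updated model
# (up to float32 rounding).
recovered = array.array("f")
recovered.frombytes(zlib.decompress(delta_payload))
rebuilt = array.array("f", (r + d for r, d in zip(reference_weights, recovered)))
```

In practice you'd also want exact (e.g. integer or bit-level) deltas rather than float subtraction, so the client's model matches the encoder's bit for bit.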



