
You can also build and train models in JS. This is one of the features I'm looking at. Currently there is a lot of information that websites are throwing away, or they're feeding into centralized aggregation mechanisms to provide personalization, but TensorFlow.js changes that dynamic.

I think small AI models that run completely in the browser and provide personalization by learning from how the user interacts with a given website are the future. This empowers the user and puts them in charge of how their data is used. The example I mentioned previously to demonstrate this was about ranking new submissions on HN: https://news.ycombinator.com/item?id=23407549. I'll quote the relevant part:

> TensorFlow.js is a pretty nifty piece of software and it's underutilized. If the model parameters can be stored in IndexedDB, then users could train TensorFlow.js-based site augmentation to suit their own needs. For example, what if HN had a TensorFlow.js model for ranking new submissions based on the user's preferences? This model could be trained like a spam filter and would eventually learn the types of articles that someone likes to see, but they would be in charge of the model's evolution and so would be empowered to use it however they saw fit. Maybe I don't care about politics; then my model parameters would eventually converge on downgrading all political posts, and the more technical submissions would rise to the top based on how I upvoted new and front page submissions.
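To make the spam-filter analogy concrete, here's a minimal plain-JavaScript sketch of the idea (no TensorFlow.js dependency; the vocabulary, titles, and upvote/skip labels are all made up for illustration). A real version would be a TensorFlow.js model whose weights persist in IndexedDB, but the training loop is the same shape: logistic regression over title features, updated from the user's own interactions.

```javascript
// Hypothetical bag-of-words vocabulary for HN titles.
const VOCAB = ["senate", "election", "policy", "rust", "compiler", "gpu"];

function features(title) {
  const words = title.toLowerCase().split(/\W+/);
  return VOCAB.map(w => (words.includes(w) ? 1 : 0));
}

const sigmoid = z => 1 / (1 + Math.exp(-z));

function makeRanker() {
  const w = new Array(VOCAB.length).fill(0); // per-word weights
  let b = 0;                                 // bias
  return {
    // Probability-like score in [0, 1]: "how much will this user like it?"
    score(title) {
      const x = features(title);
      return sigmoid(x.reduce((s, xi, i) => s + xi * w[i], b));
    },
    // label = 1 for "upvoted / want more like this", 0 for "skipped".
    // One step of gradient descent on logistic loss.
    train(title, label, lr = 0.5) {
      const err = this.score(title) - label;
      features(title).forEach((xi, i) => { w[i] -= lr * err * xi; });
      b -= lr * err;
    },
  };
}

// Simulated interaction history: this user upvotes technical posts
// and skips political ones.
const ranker = makeRanker();
for (let epoch = 0; epoch < 50; epoch++) {
  ranker.train("Rust compiler internals", 1);
  ranker.train("GPU programming notes", 1);
  ranker.train("Senate election policy debate", 0);
}
```

After training, `ranker.score()` can sort the /newest page client-side; the learned weights are just numbers the user owns and can inspect, export, or delete.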



This is a cool idea, but it requires some hooks. Otherwise you also need API access and have to remake the UI. I guess you could intercept API calls with a browser extension, but having a hook for your personalization would be great!
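For the no-hooks case, the extension approach might look something like this: a content script wraps `fetch` so API responses can be observed and fed to the local model without changing what the page sees. This is only a sketch; the `/v0/item/` URL filter is illustrative, and a real extension would also need to handle XHR and page-world injection.

```javascript
// Wrap globalThis.fetch so matching JSON responses are passed to a callback.
// Returns a function that restores the original fetch.
function interceptFetch(onItem) {
  const original = globalThis.fetch;
  globalThis.fetch = async (url, opts) => {
    const res = await original(url, opts);
    if (String(url).includes("/v0/item/")) {
      // clone() so the page still gets an unread response body.
      res.clone().json().then(onItem).catch(() => {});
    }
    return res;
  };
  return () => { globalThis.fetch = original; };
}

// Usage (hypothetical): feed observed items to a local ranking model.
// const restore = interceptFetch(item => model.observe(item));
```

The key point is that the page's own code is untouched; the interceptor only taps a copy of the data the user's browser already received.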


Cool idea, but why not have that client-side model operate across websites? If I switch from Hacker News to Reddit, why not carry over that data to figure out what I'll like on the other platform?


Excellent idea. Yes, you could do that as well. Nothing would prevent copying the parameters and models from one site and using them at another, since at the end of the day it would all just be data controlled by the user.

One can even imagine a decentralized sharing mechanism where users can create ensembles of models by combining models trained by different users.
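The ensemble part doesn't need much machinery. Assuming each shared model exposes a `score(title)` in [0, 1] (an interface I'm inventing here for illustration), the simplest combination is a weighted average of predictions; weighted voting or stacking would also work.

```javascript
// Combine several user-trained models into one ranker by averaging
// their scores. Weights default to uniform.
function ensemble(models, weights = models.map(() => 1 / models.length)) {
  return {
    score(title) {
      return models.reduce((s, m, i) => s + weights[i] * m.score(title), 0);
    },
  };
}

// Usage: blend a friend's shared model with your own (toy stand-ins).
const mine = { score: t => (t.includes("Rust") ? 0.9 : 0.2) };
const theirs = { score: t => (t.includes("GPU") ? 0.8 : 0.3) };
const combined = ensemble([mine, theirs]);
```

Since the models are just user-controlled data, "sharing" is nothing more than copying parameters around; the ensemble is computed entirely on the client.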

All the building blocks are there. Just requires mindshare and a few killer applications.



