
It does happen.

It starts as a box into which the user puts all of their texts, recordings, emails, and other content, along with a comprehensive survey covering items such as accuracy, temperament, and "what would so-and-so do in this situation". Think of it like reverse-takeout: the box arrives, you fill it, then send it back.

That box ships the data off to be 'curated' (remote training and buildup of an ad hoc model, read: taking the data provided and supplementing it based on region, familial background, and community), then the curator provides a sample window for the user via their browser or phone. If they choose to keep the cultivated persona representing their loved one (or a marketed persona), they pay and a device arrives, pre-programmed with the model they've ordered. At first these are dumb and only have knowledge of what they've been provided, but eventually they're able to assimilate new data and grow or evolve the persona as if it were still a person.

Few buy the full body; some stick with just the interaction provided by their Alexa, some with a painting or an app. The medium is transient and offers degrees of expression for the proxy model: a mother may want to be able to hold the child she lost, while someone who lost a friend may find it adequate to have their friend in an app. It's personal choice.




Egan's Quarantine also has exactly this, though it's not part of the plot.



