Have you thought about how autodiff could be baked into the lang? Given that it already seems to represent all computation as graphs of operations, building AD in seems like a natural extension that could open up some pretty useful applications and bring in some scientific computing interest.
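
To be concrete about what I mean by AD over a graph of operations, here's a toy reverse-mode sketch in plain Python (purely illustrative, not a claim about how the language actually represents things internally):

    # Toy reverse-mode AD over an explicit expression graph.
    # Each node stores its inputs and a rule for pushing gradients backward.
    class Node:
        def __init__(self, value, parents=(), backward=None):
            self.value = value        # forward result
            self.parents = parents    # upstream nodes
            self.backward = backward  # fn(grad) -> gradient for each parent
            self.grad = 0.0

    def add(a, b):
        return Node(a.value + b.value, (a, b), lambda g: (g, g))

    def mul(a, b):
        return Node(a.value * b.value, (a, b),
                    lambda g: (g * b.value, g * a.value))

    def backprop(out):
        # Visit the graph in reverse topological order, accumulating grads.
        order, seen = [], set()
        def visit(n):
            if id(n) in seen:
                return
            seen.add(id(n))
            for p in n.parents:
                visit(p)
            order.append(n)
        visit(out)
        out.grad = 1.0
        for n in reversed(order):
            if n.backward:
                for p, g in zip(n.parents, n.backward(n.grad)):
                    p.grad += g

    x, y = Node(2.0), Node(3.0)
    z = add(mul(x, x), mul(x, y))  # z = x^2 + x*y
    backprop(z)
    print(x.grad, y.grad)          # 7.0, 2.0  (dz/dx = 2x + y, dz/dy = x)

The backward pass is literally information flowing against the direction of the original computation, which is why it seems like such a natural fit here.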

I’m also curious how the “time travel” might interact (if at all) with (1) basic algebra, (2) building on that, systems of equations, and (3) building on those, something like implicit/Crank-Nicolson finite difference schemes.

Without having thought about it much, it feels intuitively like there might be an elegant way of expressing such algorithms using this notion of information traveling backwards, but perhaps not.
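
For example, a Crank-Nicolson step makes every unknown value at the next time level depend on its neighbours at that same future level, so you end up solving a system rather than marching forward. A rough NumPy sketch of one step for the 1-D heat equation (illustrative only, nothing language-specific, and with crude zero boundaries):

    # One Crank-Nicolson step for u_t = u_xx on a 1-D grid.
    # The implicit half puts u_new on both sides: A @ u_new = B @ u_old.
    import numpy as np

    def crank_nicolson_step(u_old, r):
        n = len(u_old)
        # Discrete Laplacian with zero Dirichlet boundaries.
        L = (np.diag(-2.0 * np.ones(n))
             + np.diag(np.ones(n - 1), 1)
             + np.diag(np.ones(n - 1), -1))
        A = np.eye(n) - 0.5 * r * L   # couples the unknown future values
        B = np.eye(n) + 0.5 * r * L   # acts on the known present values
        return np.linalg.solve(A, B @ u_old)

    u = np.zeros(50)
    u[25] = 1.0                        # initial heat spike
    for _ in range(100):
        u = crank_nicolson_step(u, r=0.5)  # r = dt / dx**2

The implicit half (the A matrix) is where “information from the future” shows up, which is why I wonder whether the time-travel construct could express that coupling more directly than an explicit linear solve.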

As an aside, although I’m not really a fan, the representation of time travel in Tenet seems like a natural analogy to include in your docs for fun, if desired. More relevantly, talking about things like backprop in standard ML packages might be one way of providing a shared reference point for contextualizing information flowing backward through a computational graph.

Finally, I’ll echo what others have said about not burying the lede in the docs, and about wanting to see more motivation/examples/discussion of interaction nets.



