Hacker News

Their development on ShardedTensor seems to have slowed or stopped recently

What makes you say that?



Just looking at commit frequency in the main branch, and activity on the relevant RFCs in GitHub issues. I have no visibility into what's going on at Meta, of course, so it could just be that there's a lot of development happening internally, or in a different branch, or maybe they're waiting on things to land in torch's RPC support. I'm not sure. Or it could be they're just waiting to release what they have as a beta in the 1.13 or 1.14 release?

I’ll note that checking right now there was a commit 15 hours ago, but the last commit before that seems to be from 28 days ago. So some work is still going on at least, thankfully :)


ShardedTensor was merged / subsumed into DTensor: https://dev-discuss.pytorch.org/t/rfc-pytorch-distributedten...

Lots of development and traffic happening here: https://github.com/pytorch/tau/


So is DTensor a Google thing like XLA or more of an open standard?


Looks like it's just a name collision. It's a tensor used in distributed models, thus Distributed Tensor, or DTensor for short.
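For what it's worth, the PyTorch DTensor prototype from the RFC linked above exposes a small API for sharding an ordinary tensor across a device mesh. A minimal single-process sketch (module path `torch.distributed._tensor` and the exact names come from the prototype and may have moved in later releases):

```python
import os
import torch
import torch.distributed as dist
from torch.distributed._tensor import DeviceMesh, Shard, distribute_tensor

# Single-rank gloo process group so the example runs on one CPU machine.
os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
os.environ.setdefault("MASTER_PORT", "29500")
dist.init_process_group("gloo", rank=0, world_size=1)

# A 1-D mesh with one device; real use would list multiple ranks/GPUs.
mesh = DeviceMesh("cpu", [0])

t = torch.randn(8, 4)
# Shard dim 0 of the tensor across the mesh.
dt = distribute_tensor(t, mesh, [Shard(0)])

# With world_size=1 the local shard is the whole tensor.
print(dt.to_local().shape)

dist.destroy_process_group()
```

With more ranks, each rank's `to_local()` would hold only its slice of dim 0, and ops on the DTensor dispatch to the appropriate collectives.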


Awesome, thanks for pointing this out!



