Hacker News

If I want to use one of the hyper* tools to partially synchronize files p2p on a LAN, which one should I use?

The hyper* world seems to be very fragmented right now. There is the dat project [0], which started in 2013 and shares files between computers p2p. In May 2020 the dat protocol was renamed to the hypercore protocol [1], and dat "will continue as the collective of teams and projects that have evolved from the original Dat CLI project". hypercore-protocol [2] links to multiple applications for file sharing (none of them the dat CLI tool).

Hyperdrive [3] "help[s] you share files quickly and safely, directly from your computer [...] -- all P2P". The GitHub repo [4] mentions the hyperdrive daemon as a batteries-included experience, but the hyperdrive daemon has a deprecation notice telling you to use hyperspace instead.

Hyperspace [5] "provides remote access to Hypercores and a Hyperswarm instance" and "exposes a simple RPC interface". The documentation is very technical and seems to be aimed at developers rather than at people who just want to use it as a tool.

Digging around in the GitHub organisation [6], or by stumbling upon the Patreon [7], one can find the hyp CLI tool, "A CLI for peer-to-peer file sharing (and more) using the Hypercore Protocol". The first commit is 9 days old. On the author's Twitter one can also find hyperbeam [8], which is integrated into hyp.

Here on HN one can also find an announcement for "uplink" [9].

The tech looks pretty cool, but the vast number of different projects makes it difficult to grasp. Of all these tools, the dat CLI seems to be the most advanced, but it is not actively maintained. It's not linked on any of the hyper* sites and doesn't seem to be the recommended way to use the hypercore protocol to share files p2p. While I would like to use the tech, I'm pretty lost on how to do that today.

    [0] https://docs.datproject.org/ 
    [1] https://blog.datproject.org/2020/05/15/dat-protocol-renamed-hypercore-protocol/
    [2] https://hypercore-protocol.org/
    [3] https://hypercore-protocol.org/#hyperdrive
    [4] https://github.com/hypercore-protocol/hyperdrive
    [5] https://github.com/hypercore-protocol/hyperspace
    [6] https://github.com/hypercore-protocol/cli
    [7] https://www.patreon.com/posts/hyp-command-line-44923749
    [8] https://github.com/mafintosh/hyperbeam
    [9] https://www.patreon.com/posts/paul-reveals-is-44665610



First off, as someone who has spent many hours trying to use dat, I agree 100% with your assessment.

I have been using the dat command line tool to p2p synchronize files on a LAN (and also over the internet) for a year or two now. The latest version is more stable for me than the previous one, and I really love it. It synchronizes files across multiple computers nearly instantly.


I'm glad the dat CLI is still working for you. It's on an older protocol codebase and won't be maintained; the hyp CLI is meant to replace it going forward.

https://github.com/hypercore-protocol/cli/


If someone were to decide to use either dat or hyper as a base for a project today, what are the differences?

Is hyper going to replace dat as an advancing network protocol? Is peer discovery different, or did hyperdrive change...or...?

(I am very confused about dat/hyper efforts, as I've been following it a bit over the last year)

I've seen that beaker changed to hypercore/hyperdrive...so my current assumption is that the hyper protocol is the web browser effort, rather than the other parts of infrastructure.

Are there incompatibilities between the two protocols, or is hypercore trying to refactor its codebase to, say, remove legacy dependencies that might not be necessary anymore?


Dat is essentially the old version of Hypercore Protocol (aka "Hyper"). We had to make a set of breaking changes and decided to rebrand. So, if you're looking for a base, use the hyper stuff.

The changes included a switch to a DHT for peer lookup and data-structure reworks that improved speed and scaling by quite a bit.
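For context on how DHT-based peer lookup can avoid leaking the drive key: peers typically announce a *hash derived from* the public key (a "discovery key") rather than the key itself, so knowing where a drive is seeded doesn't grant read access. A rough Python sketch of that idea (illustrative only; the exact construction Hypercore uses may differ, and `discovery_key` is a name I made up here):

```python
import hashlib

def discovery_key(public_key: bytes) -> bytes:
    """Illustrative sketch: derive a DHT announce topic from a drive's
    public key with a keyed BLAKE2b hash. Peers can rendezvous on this
    value without ever revealing the read key itself."""
    return hashlib.blake2b(b"hypercore", key=public_key,
                           digest_size=32).digest()

# Two different drives map to two different DHT topics,
# and the topic is deterministic per drive:
a = discovery_key(b"A" * 32)
b = discovery_key(b"B" * 32)
assert a != b and len(a) == 32
```

The one-way hash is what makes the lookup privacy-preserving: a DHT observer sees only which 32-byte topics are popular, not which drives they correspond to.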


Does the new CLI allow multi-writer yet? I'd like multiple computers to be able to modify the dataset.

Similarly, it seems the protocol supports keeping around all versions of the hyperdrive, but it looks like that isn't made available via the CLI. Is this correct?


That sounds cool. Would you mind telling a bit more about your setup? How does it compare to Syncthing? I'm still looking for a p2p file sync solution that can mix locally and remotely stored files (and actually works at all).


I don't know Syncthing so can't comment on that. My setup is very simple: `dat share` running on my laptop and `dat <hash>` on all of my servers. The servers are running inside docker/k8s in a daemonset, so every server in the cluster has a recent version of the files. Some benefits of this over, say, a shared NFS mount hosted in the cluster:

- I can use my local editor online or offline with the same config

- my editor doesn't hang while saving over a network connection

- live reload apps running on the servers will pick up the changes immediately. inotify doesn't work on NFS-mounted drives.

One main disadvantage in some cases is that it doesn't currently support multi-writer: the servers mounting the dat are read-only. I'm pretty sure this is changing in newer versions, but I'm not sure about the roadmap.
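The daemonset arrangement described above could be sketched roughly like this (a hypothetical config, not from the thread: the names, image, and host path are all made up; it just runs the read-only `dat <hash>` mirror on every node):

```yaml
apiVersion: apps/v1
kind: DaemonSet
metadata:
  name: dat-sync                # hypothetical name
spec:
  selector:
    matchLabels:
      app: dat-sync
  template:
    metadata:
      labels:
        app: dat-sync
    spec:
      containers:
        - name: dat
          image: my-registry/dat-cli:latest   # hypothetical image with the dat CLI
          # Each node follows the shared drive read-only;
          # the laptop running `dat share` is the single writer.
          args: ["dat", "<hash>", "/data"]
          volumeMounts:
            - name: data
              mountPath: /data
      volumes:
        - name: data
          hostPath:
            path: /var/lib/dat-sync           # hypothetical host directory
```

Because a DaemonSet schedules one pod per node, every node in the cluster ends up with its own local, inotify-friendly copy at the hostPath.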


The tech is solid, there is no doubt. However, despite following Dat for about six years, I've found the general organization has always been lacking. I took a break from following it and all “tech” for all of 2020, as I needed a breather and focused on other offline interests. The new Beaker Browser v1 was finally released, which was a big milestone, and this news hit my radar. It inspired me to delve back into this particular technology, and I went down the hyper rabbit hole. I read about the change from Dat to Hyper and spent the last week or so trying to grok all the reorg and docs. It’s dizzying. I’m both very interested in using this tech again and sadly also frustrated with the org issues, which seem not so much better than before, just different.

On a high level, there are improvements, from the rebrand to the abstraction and generalization of components, that allow for a better path forward. I can definitely see a developer who never followed Dat having a lot of problems grasping the Hypercore Protocol without investing a lot of upfront time... since it’s been difficult even for me, who used Dat pretty extensively and followed all things Dat prior to 2020.

I think they are going about this the right way, though, and it will just take time for it all to get calibrated and become more cohesive. I’m sure they will take all the help they can get, so hopefully more people will get involved, because the tech has definitely improved a lot and IMO is a big deal in p2p.


Hey sull, appreciate the thoughts. I think that's a pretty good take, and I believe/hope you're right that we'll have things ironed out soon.


I've never heard of any of these tools, but they all individually sound very useful and cool. This is one of those projects that would probably benefit from a technical writer and a marketing person.



