WebRTC file sharing broker using Elixir Phoenix (zohaib.me)
110 points by zabi_rauf on Aug 13, 2015 | hide | past | favorite | 23 comments


They say "secure", but it's trusting the website not to deliver malicious JS. It's also trusting the numerous third-party domains the JavaScript is included from (and if you block them, the entire thing breaks).


Yeah. It's "secure" as long as none of Filecha, Cloudflare, Google, Facebook, jQuery or Akamai get hacked, compromised or coerced.

There's no reason he can't host all of these javascript resources on the same domain, substantially reducing the attack surface area.

I know people like to use CDNs and third party hosted analytics software, but can we at least come to the compromise that if you're going to say your app is "secure" or "private", that you at least attempt to host what you can on your own domain...

[edit] I'm probably being unfair. He makes the code available so you can host it yourself. I'm sure most people who install it will leave the CDNs in place though.


You're not being unfair at all, IMO. The instance he hosts says secure, yet it includes lots of 3rd party resources, and of course you're still trusting his server every time you run it.


> you're still trusting his server every time you run it.

Is there any way to initiate transfers between two browsers over WebRTC without the use of a server?


Yes, but that's not the point here. The point is getting the code that knows what to do with the WebRTC connection.

Since you asked: https://github.com/cjb/serverless-webrtc/


What would it take to get the W3C/HTML5 folks to simply add a src-hash="$algo:$value" attribute to any tag that can load remote resources?

Seems like a low-impact way to significantly boost the usefulness and security of CDNs. If the source page (requested over HTTPS, and presumably not MITM'ed already) declares "I want to load that resource over there, and I expect it to hash to this value", then we get all the benefits of caching plus trust that it hasn't been tampered with.


This exists; it's called Subresource Integrity: http://www.w3.org/TR/SRI/
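For reference, an SRI declaration looks like `<script src="..." integrity="sha384-..." crossorigin="anonymous">`, where the integrity value is the algorithm name plus a base64-encoded digest of the expected bytes. A minimal sketch of computing that value (the function name is mine, not from the spec):

```python
import base64
import hashlib

def sri_digest(content: bytes, algo: str = "sha384") -> str:
    """Compute a Subresource Integrity value for use in an integrity="..." attribute."""
    digest = hashlib.new(algo, content).digest()
    return f"{algo}-{base64.b64encode(digest).decode()}"

# A page author would run this over the exact bytes the CDN serves
# and paste the result into the script tag.
print(sri_digest(b"alert('hello');"))
```

The browser then refuses to execute the resource if the fetched bytes don't hash to the declared value, which is exactly the caching-plus-trust property described above.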


Last time I saw this mentioned, someone posted a link to the W3C pointing out that it is in fact being discussed for future implementation.


Thanks for pointing it out. It is still in development. The reason I was using CDNs was bandwidth, and reusing the cached JS if the user already has it. What would be the best way to tackle this? If I put all of those under one CDN, then I guess the bandwidth problem will be solved, but I won't be able to reuse the JS that the user might already have. At the end of the day, I still have to trust the third-party JS.


The general consensus is that you won't (meaningfully) be able to "reuse the JS that a user might already have" anyway. The idea of well-known-domain.com/well-known-framework-well-known-version-0-1.js being cached is interesting, but a) it doesn't appear to work too well in practice (I seem to recall a 30% hit rate for the "best case" of jquery-latest on some big CDN), and b) it's not that much data anyway.

Let's say you have 500 KB of compressed JS. If you have 2 million unique visitors, that'll eat just 1 TB of bandwidth, and if you don't screw up your own cache headers, that'll be that: it'll be cached.
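The arithmetic above checks out; a quick sanity check, assuming decimal units (1 KB = 1000 bytes):

```python
visitors = 2_000_000            # unique visitors
payload_bytes = 500 * 1000      # 500 KB of compressed JS per first-time visitor
total_bytes = visitors * payload_bytes

print(total_bytes / 10**12)     # total transfer in TB -> 1.0
```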


Reminds me of http://file.pizza/

Seems like P2P apps are the new Hello World programs of WebRTC applications.


P2P apps are certainly the go-to Hello World application for WebRTC, but that makes sense considering that it's a P2P API.

Not trying to be a troll; I just see these comments ("X is the new Hello World!", where X is anything from messaging to algorithmic trading) all the time now. At least this use case makes sense for the tools, I suppose.


Isn't that the whole reason WebRTC was invented?


Hey look, I made one too https://fuego.link


Could you deploy an automatic mirror like this, so you can turn your clients into a temporary CDN while they have your page loaded?


I think it's not really advisable to do that, for privacy reasons:

https://news.ycombinator.com/item?id=9112717

https://news.ycombinator.com/item?id=9893561

Also your CDN'ing clients could serve different content, giving some of your visitors a random surprise...


You can easily avoid the "surprise" by hashing content, but malicious clients will still be able to delay content loads.


You can authenticate the content to prevent that.


Yep, you can. There was http://peercdn.com/ that did that, and I remember a post on HN where a page was served from other clients, but it used WebSockets instead.


>This page exists only if someone is looking at it

https://news.ycombinator.com/item?id=9531265


Sounds a lot like https://sharefest.me/ :)


Sharefest has been in Alpha for years. Why is that still the case? Is it not under development anymore?


All WebRTC apps are like that. I suppose it's because the standard is considered unstable?



