
There is no such thing as a communication channel you can give to a black box that only allows the black box to communicate things you approve of. It's either give it a communication channel or don't.

If you want to trust it you have to be able to audit its workings. There is no magic sauce from a network layer that gets around this step, and after this step nothing else is needed.



> If you want to trust it you have to be able to audit its workings.

This is the fundamental problem; we provably cannot even know whether the black box will halt[1], so we obviously cannot audit it for behaviors far more complex than simply halting. Even worse, if the black box is equivalent to a Turing machine with more than ~8k states, whether it halts cannot even be settled within ZF set theory[2].

> There is no magic sauce from a network layer

As long as the black box is Turing complete, any noisy channel[3] to the network that it can influence can be used as the foundation for a reliable digital channel. The solution is to limit the black box such that it is not Turing complete. The decision problems about its behavior must be decidable.
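
A rough sketch of why [3] matters here, in Python, with made-up numbers (flip probability, repetition count) purely for illustration: if the box can influence a channel at all, redundancy lets it turn that influence into a reliable bit stream.

    import random

    def noisy_channel(bits, flip_prob=0.2):
        """Simulate a channel that flips each bit with some probability."""
        return [b ^ 1 if random.random() < flip_prob else b for b in bits]

    def encode(bits, reps=9):
        """Repetition code: send each bit `reps` times."""
        return [b for bit in bits for b in [bit] * reps]

    def decode(received, reps=9):
        """Majority vote over each group of `reps` received bits."""
        return [int(sum(received[i:i + reps]) > reps // 2)
                for i in range(0, len(received), reps)]

    message = [1, 0, 1, 1, 0, 0, 1, 0]
    recovered = decode(noisy_channel(encode(message)))
    print(message == recovered)  # almost always True despite the noise

More repetitions drive the error rate as low as the sender likes, so adding noise or jitter at the network layer does not remove the channel, it only lowers its bandwidth.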

A good example of a limited design is the original World Wide Web. HTML with 3270-style forms (or an updated, better-looking version with modernized form tags) was decidable, with the undecidable complexity sandboxed on the server. The instructions to the client (HTML) and the communications with the server (links/URLs, POSTed form fields) are understandable by both humans and machines.
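
As a toy illustration of "understandable by both humans and machines" (the page below is invented, Python stdlib only): every request a static forms-based page could ever make can be enumerated without executing anything.

    from html.parser import HTMLParser

    PAGE = """
    <form action="/search" method="post">
      <input type="text" name="query">
      <input type="submit" value="Go">
    </form>
    <a href="/about">About</a>
    """

    class Audit(HTMLParser):
        """List every URL and form field a static page can reach or submit."""
        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "a" and "href" in attrs:
                print("link:", attrs["href"])
            elif tag == "form":
                print("form:", attrs.get("method", "get").upper(), attrs.get("action"))
            elif tag == "input" and attrs.get("type") != "submit":
                print("  field:", attrs.get("name"))

    Audit().feed(PAGE)

With arbitrary script on the page, the same question ("what can this page send, and where?") is undecidable.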

Today's web requires trusting a new set of undecidable software on each page load. We're supposed to trust 3rd parties even though trust is not transitive. We're supposed to accept the risk of running 3rd party software even though risk is transitive. Now these problems are leaking into other areas like DNS, and the Users lose yet another battle in the War On General Purpose Computing[4][5].

[1] https://en.wikipedia.org/wiki/Halting_problem

[2] https://www.scottaaronson.com/blog/?p=2725

[3] https://en.wikipedia.org/wiki/Noisy-channel_coding_theorem

[4] https://boingboing.net/2012/01/10/lockdown.html

[5] https://boingboing.net/2012/08/23/civilwar.html


Black box? I'm def not proposing any black box or "magic".

It's open middleware, just like the glibc resolver. For example, it's entirely possible to force applications to use the glibc resolver: just don't let them open sockets to anything but 127.0.0.1:53. They wouldn't be able to use http/https either in that case, but that's the point.

If you are thinking about side-channels like HTTP over DNS(S), then fine, but the middleware can see the traffic because that's its job. If the app starts making encrypted requests, at least you would know, and since it's open source the user can fix it and tell everyone the application is using a side-channel to subvert the user.
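
A minimal sketch of "the middleware can see the traffic" for the DNS case: a local forwarder that logs every queried name before relaying the raw packet. The upstream resolver, the port (5353, since 53 needs privileges), and the lack of any caching or validation are all assumptions to keep it short.

    import socket

    UPSTREAM = ("9.9.9.9", 53)    # assumed upstream resolver
    LISTEN = ("127.0.0.1", 5353)  # port 53 would require privileges

    def qname(packet):
        """Extract the queried name from a raw DNS query packet."""
        labels, i = [], 12  # question section starts after the 12-byte header
        while packet[i]:
            n = packet[i]
            labels.append(packet[i + 1:i + 1 + n].decode("ascii", "replace"))
            i += n + 1
        return ".".join(labels)

    srv = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    srv.bind(LISTEN)
    while True:
        query, client = srv.recvfrom(512)
        print("app asked for:", qname(query))        # the middleware sees every name
        up = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        up.settimeout(3)
        up.sendto(query, UPSTREAM)
        try:
            srv.sendto(up.recvfrom(512)[0], client)  # relay the answer back to the app
        except socket.timeout:
            pass
        up.close()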

_But that misses the point._ The app wouldn't have DNS code in it. It would only be able to ask to map a name to a record. And even then, that misses the point too. In the end it wants to fetch a URL, and what I am talking about does that. Firefox parses a GET it was handed, and if it wants to make additional GET/POSTs, it hands them over. No DNS or networking code needed in the browser. Linking to an SSL lib would be a bug.
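
A rough sketch of what "hand the GET over" could look like; the socket path and the one-line "GET <url>" protocol are invented here, nothing settled. The app side has no resolver, no network sockets and no SSL lib; only the middleware touches DNS/TCP/TLS.

    # middleware side: the only process that does DNS, TCP and TLS
    import os, socket, urllib.request

    SOCK = "/tmp/fetchd.sock"      # assumed rendezvous point
    ALLOW = lambda url: True       # policy hook; see the blocker sketch further down

    def serve():
        if os.path.exists(SOCK):
            os.unlink(SOCK)
        srv = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
        srv.bind(SOCK)
        srv.listen(8)
        while True:
            conn, _ = srv.accept()
            line = conn.makefile().readline().strip()  # e.g. "GET https://example.com/"
            if line.startswith("GET "):
                url = line[4:]
                if ALLOW(url):
                    with urllib.request.urlopen(url, timeout=10) as resp:
                        conn.sendall(resp.read())
            conn.close()

    # app side: no resolver, no sockets to the network, no SSL lib
    def app_fetch(url):
        c = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
        c.connect(SOCK)
        c.sendall(f"GET {url}\n".encode())
        c.shutdown(socket.SHUT_WR)
        chunks = []
        while data := c.recv(65536):
            chunks.append(data)
        return b"".join(chunks)

Run serve() in the middleware process; the app calls app_fetch("https://example.com/") and gets bytes back, and every request it ever makes is visible (and deniable) at that one choke point.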

Reaching into an arbitrary open source app and getting hold of its SSL machinery to MITM it is always a moving target (aka a deliberate problem), and that's an anti-user feature.

Common middleware that handles the comms (SSL etc., at the OS or application level) levels the playing field. The recent DoH changes would have been up to the user, because that code isn't in the browser any more. Users are leveraged by the browser vendors ("want the latest version?", "hey, I see you are using a 0-day browser?") and forced to swallow or fork. I realize users can disable DoH, but again, that's the point. It's a moving target. They can just keep "fixing" the defaults.

Same thing with Chrome's recent changes regarding the DOM blocking API. If Chrome were forced to ask for URLs instead of fetching them directly, it wouldn't matter. The blockers would operate in the middleware.
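
Assuming the request-broker sketch above, a blocker is then just the policy function the middleware consults before any fetch, and no change to a browser extension API can route around it. The blocklist here is invented for illustration.

    from urllib.parse import urlparse

    BLOCKED_HOSTS = {"tracker.example", "ads.example"}  # hypothetical blocklist

    def allow(url):
        """Policy hook the middleware calls before performing any fetch."""
        host = urlparse(url).hostname or ""
        return not any(host == b or host.endswith("." + b) for b in BLOCKED_HOSTS)

    print(allow("https://news.example/story"))        # True
    print(allow("https://ads.example/banner.js"))     # False
    print(allow("https://cdn.ads.example/banner.js")) # False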

As I mentioned in my original comment, the point is to axe the networking code from the applications and force them to make requests a layer up. This is not like forcing them through a SOCKS proxy. It's deduplicating the code and making the parts separable. The monolithic nature of browsers isn't some accident.


And if an app decides to send encrypted/compressed/encoded data over the TLS socket you gave it, then what? The assumption is the app is malicious; how are you ensuring the data underneath the communication layer you provide is actually clean?


They don't get a socket. What's the socket for?


Please don't get hung up on terminology - for the sake of conversation, assume the way the black box app gets information from the outside world is termed a "socket". It's a name, it doesn't really matter. What matters is that you are taking information from the outside and giving it to the black box.

So say you give the black box in your sandbox a socket (from the above definition). Say it is a news app. This news app loads headlines and articles. How does your sandbox know if the headline/article text contains some form of encoding? How does it know the timing of requests isn't leaking information out about your viewing patterns?
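
To make the timing question concrete, a toy example (entirely invented) of how a clean-looking refresh loop can leak a bit per request through nothing but the gaps between requests:

    import time

    SECRET = [1, 0, 1, 1, 0]         # anything the app learned about you

    def refresh_headlines():
        """Stand-in for a perfectly legitimate, clean network request."""
        print("fetch /headlines at", round(time.monotonic(), 1))

    for bit in SECRET:
        time.sleep(2 if bit else 1)  # the gap between requests encodes the bit
        refresh_headlines()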


Ah, I misunderstood your mention of black boxes. This is not for black boxes. It only makes sense for open source. I agree it won't work for a black box that you can't audit. If the browser is side-channeling by asking for scheme://some.thing/enc_data_sent_here -> enc_data_received when the middleware hands the GET back, then at least the user can decide if they want to honor that GET request, figure out why the browser made it, and remove that code. The browser should have no crypto code in it. Linking to an SSL/crypto lib would be a bug, as I mentioned.

I spend quite a bit of time railing on JS. Executing arbitrary code is fundamentally a bad idea (Halting Problem).

https://hn.algolia.com/?dateRange=all&query=jakeogh%20%22dis...


> https://hn.algolia.com/?dateRange=all&query=jakeogh%20%22dis...

> This page will only work with JavaScript enabled

Really!? Why the hell would you link to that?


I know, it's rough! I had to start FF to generate it. For some reason it won't work with webkit (surf: ctrl-shift-s to toggle JS).



