
I'm using vercel/pkg as well. I have a Node.js server which generates HTML and opens a browser on Windows, which then asks the server for that HTML at 127.0.0.1.

So the browser will be my GUI and Node.js packaged with vercel/pkg will be my back-end. It is more flexible than, say, Electron because the GUI can be anything I want it to be.

My only concern is whether users will accept a local server running on their desktop. I've tried to configure the executable so that the server accepts connections only from the same host the HTTP requests are coming from.

I assume the same situation would exist with Deno, if you build a product with it and want to use the browser as your front-end. Are users OK with a server running on their PC?
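
In outline, the setup looks something like this (a sketch in Node-style TypeScript; the port and served HTML are placeholders, not my actual code):

    // main.ts -- sketch of the setup described above.
    import { createServer } from "node:http";
    import { exec } from "node:child_process";

    const PORT = 8123; // placeholder

    const server = createServer((req, res) => {
      res.writeHead(200, { "Content-Type": "text/html" });
      res.end("<h1>Generated by the local Node.js back-end</h1>");
    });

    // Bind to the loopback interface only, so nothing outside this
    // machine can connect.
    server.listen(PORT, "127.0.0.1", () => {
      // On Windows, `start` opens the URL in the user's default browser.
      exec(`start http://127.0.0.1:${PORT}/`);
    });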




> My concern is only will users accept a local server running on their desktop

I would say that users don't mind. They want a working application that does what it promises and is dependable. Things like having an underlying server are just implementation details; nobody really cares (talking about the general public here, not the more technical users of HN, who might have a say about the technology choices but are overall a very small portion of potential users).

That doesn't mean the implementation should be sloppy, though! Make sure to iron out all those security concerns around 127.0.0.1, because a slip-up could cause a data breach or some other severe damage. Other than that, just aim to provide the best user experience.


Right. One additional complication might be that some web tech in the browser may require HTTPS, which would bring its own complications with security certificates.


PSA: If you have an HTTPS web site that needs to talk to your server on localhost, that localhost server does not have to run HTTPS, since localhost is considered potentially trustworthy.

It will not trigger mixed content errors (in Chrome and Firefox; Safari does not do this for now, but it is in the works: https://bugs.webkit.org/show_bug.cgi?id=171934 )

I have seen some companies do the following too: they embed a snake-oil certificate and instruct people to disable web security. Don't do this. Other companies purchase something like a foolocal.com HTTPS certificate (Spotify and Dropbox do this, I believe), but that certificate can be revoked, AFAIK.

As for CORS, you need to check that the incoming request comes from your own remote origin; if you enable CORS for all origins, other sites can scan the port and read your users' data (eBay was recently doing exactly this kind of port scanning: see https://blog.nem.ec/2020/05/24/ebay-port-scanning/ )
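
A sketch of the kind of origin check I mean (Node-style TypeScript; https://myapp.example is a made-up origin, and the port is a placeholder):

    import { createServer } from "node:http";

    // Only echo back an Access-Control-Allow-Origin for origins we trust;
    // never "*".
    const ALLOWED = new Set(["https://myapp.example"]);

    createServer((req, res) => {
      const origin = req.headers.origin;
      if (typeof origin === "string" && ALLOWED.has(origin)) {
        res.setHeader("Access-Control-Allow-Origin", origin);
        res.setHeader("Vary", "Origin");
      }
      // Without that header the browser blocks cross-origin reads, which
      // is what stops random sites from scanning this port for data.
      res.end(JSON.stringify({ ok: true }));
    }).listen(8123, "127.0.0.1");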


> It is more flexible than say Electron because GUI can be anything I want it to be.

Feels like one of us is misunderstanding something. Electron lets you run any web technology - so when users open your app, they are greeted with whatever you can show on a webpage.

You can also have Electron run in the background as a server if that's something you're into ;)


I believe the idea is that if you wanted a client built with C# WinForms, wxWidgets, JS+HTML, or anything else, they could all run against the same backend server. So it'd be more flexible than Electron in that you wouldn't need to use web technology at all, if you don't want to.


Right. Also each Electron app "comes with a copy of two large software frameworks, Node.js and Chromium" ( https://medium.com/dailyjs/put-your-electron-app-on-a-diet-w... )

Whereas if you build starting from Node.js (or Deno I assume) you can skip the Chromium part. Instead of packaging Chromium with your app you assume that users have a browser and can use that to talk to your app.

The benefit of using Node.js is that you don't have to use a different language for the backend. It helps.

BTW, "Electrino" in the linked-to article seems interesting too.


The additional benefit of this approach is that you can then build your application such that it works with all standards-compatible browsers, such as Firefox.


That's not a benefit - that's a downside. When writing a front-end for Electron, you only worry about one browser, and you have a guarantee that it will work the same way across every OS.

If you have to write a front end (website) that works with more browsers, you have to put in more work.


It's a benefit from the point of view of the user and for the openness of the web. We shouldn't rely on features exclusive to a single browser anyway.


> My concern is only will users accept a local server running on their desktop.

This sounds pretty cool -- is there a way for the Deno runtime to restrict the server process to only be able to generate the legal types of HTTP traffic that a browser could generate if allowed to open a connection to a specific URL (or set of URLs)?

If so, I'd feel more comfortable about running the embedded server -- ideally the security risk of running the server would be closer to the risk of running an embedded set of daemon browser tabs, rather than the risk of exposing all the unfettered power of a Unix process ...
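
(Deno's --allow-net allowlist would cover part of this, I believe -- it pins which hosts the process may reach, though it doesn't constrain the shape of the traffic itself. A sketch, with a hypothetical port:)

    // server.ts -- run with a network allowlist:
    //
    //     deno run --allow-net=127.0.0.1 server.ts
    //
    // The process may then only bind/connect to 127.0.0.1; anything else
    // fails with a PermissionDenied error at runtime.
    const listener = Deno.listen({ hostname: "127.0.0.1", port: 8123 }); // ok
    // await Deno.connect({ hostname: "example.com", port: 443 }); // would throw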


This reminds me of the server that Zoom used to have. Accepting connections only from 127.0.0.1 alone isn't enough, since any request from the browser would match that IP, even if the request was being made through an XSS attack.

I'm sure someone with more security knowledge can chime in with better advice.


What I do is generate a random token, pass it to the browser I spawn, and only accept requests that include the token.
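
In sketch form (Node-style TypeScript; the header name is just my own convention):

    import { createServer } from "node:http";
    import { randomBytes, timingSafeEqual } from "node:crypto";

    // A fresh secret per run -- never baked into the shipped code.
    const TOKEN = randomBytes(32).toString("hex");

    createServer((req, res) => {
      const sent = req.headers["x-app-token"]; // hypothetical header name
      const ok =
        typeof sent === "string" &&
        sent.length === TOKEN.length &&
        timingSafeEqual(Buffer.from(sent), Buffer.from(TOKEN));
      if (!ok) {
        res.writeHead(403);
        res.end();
        return;
      }
      res.end("authorized");
    }).listen(8123, "127.0.0.1");
    // TOKEN still has to reach the spawned browser somehow -- see the
    // replies below for options.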


What is a good, secure way to pass a random token to the browser? If the token is part of the URL, which is on the command line, then it appears other users can see the token. What I do for DomTerm (https://domterm.org) is create a small, only-user-readable HTML file which sets the secret token and then does a 'location.replace(newkey)', where newkey is an http URL to localhost that includes the secret token. I spawn the browser with a file: URL, so the browser can only read the file if it is running as the same user. Better suggestions?
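
In sketch form, the bootstrap looks something like this (illustrative only, not DomTerm's actual code; the path, port, and token are placeholders):

    import { writeFileSync } from "node:fs";
    import { join } from "node:path";
    import { tmpdir } from "node:os";

    const token = "generated-per-run"; // placeholder for the real secret
    const page = join(tmpdir(), "bootstrap.html"); // path is illustrative

    // mode 0o600: only this user may read the file. location.replace()
    // also keeps the token-carrying URL out of the browser history.
    writeFileSync(
      page,
      `<script>location.replace("http://127.0.0.1:8123/#${token}")</script>`,
      { mode: 0o600 },
    );
    // Then spawn the browser on file://${page}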


I use Chrome's DevTools protocol, so I get a pipe to the browser that I can issue commands over. The token is sent to the browser via that; there is no URL on the command line for other users to snoop via 'ps' or the task manager.
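
A rough sketch of the pipe approach (assuming Chromium's --remote-debugging-pipe flag; the browser path, port, and token are placeholders):

    import { spawn } from "node:child_process";

    const token = "generated-per-run"; // placeholder for the real secret

    // --remote-debugging-pipe makes Chromium speak the DevTools protocol
    // over file descriptors 3 (input) and 4 (output), as NUL-terminated
    // JSON messages -- nothing secret ever appears in argv.
    const chrome = spawn("/path/to/chrome", ["--remote-debugging-pipe"], {
      stdio: ["ignore", "ignore", "ignore", "pipe", "pipe"],
    });

    // Target.createTarget opens a tab on the token-carrying URL.
    const msg = {
      id: 1,
      method: "Target.createTarget",
      params: { url: `http://127.0.0.1:8123/#${token}` },
    };
    (chrome.stdio[3] as NodeJS.WritableStream).write(JSON.stringify(msg) + "\0");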


pgAdmin does something like this. It also makes it easy for you to get another URL with that token, in case you want to open the same page in a different browser.


Wouldn't proper CORS be enough? I guess you would have to avoid putting any sensitive data in GET requests.


No, because CORS can only restrict which origins (scheme, domain, port combinations) are able to access the site's data. But you're not even connecting from a web origin but from localhost, and you're trying to defend against all access except by your own frontends. For this, you need a shared secret between the server and the frontend.

A further limitation of CORS is that certain "simple" requests (plain GETs, form POSTs) are still sent and executed by the server even if they come from a disallowed origin; only reading the response is blocked.

To conclude, you definitely need a secret.


Is there a good, simple way to keep it secret?

I assume someone could have a look at the JavaScript in the browser and see "hey, this must be the secret stored here, because it is passed to the server on every request", then write their XSS attack to use that.


The secret would have to be non-static (not baked into the code).


I'm so fuzzy on the details but isn't this what client certs are for?


The cert obviously couldn't be static in this case (otherwise it would be trivial to get the private key).

Creating a cert during "install" probably adds a good bit of complexity (especially if the app has multiple env targets).


You could accomplish this with client certs too, sure. A random secret is a simpler solution in many ways and accomplishes the same goal, though.


One problem is that we developers are lazy and use Access-Control-Allow-Origin: * (the wildcard) instead of an actual hostname (i.e. allowing all origins to access the backend).

No modern browser allows access to localhost without that header.

But it's still possible to forge a request using curl or whatever to bypass CORS. So, as the parent post suggests, use a token of some sort.

I also recommend using a strict Content-Security-Policy to stop cross-site injection attacks (e.g. someone adding an image to your page/app with src="/api/cmd=rm -rf /").
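
Something along these lines on the local server (a sketch in Node-style TypeScript; the policy, origin, and port are just examples to adapt):

    import { createServer } from "node:http";

    createServer((req, res) => {
      // Restrict what the served page may load; tighten or loosen per app.
      res.setHeader(
        "Content-Security-Policy",
        "default-src 'self'; img-src 'self'",
      );
      // And no wildcard CORS: only an origin you trust may read responses.
      res.setHeader("Access-Control-Allow-Origin", "https://myapp.example");
      res.end("<p>hello</p>");
    }).listen(8123, "127.0.0.1");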


That's a good trick. Thanks for the tip.


Do you have any trouble with users being prompted by the Windows firewall to allow the server to listen on localhost? I like this way of doing things, but I've had prompts like that for my own dev tools, and I think that would put off my users.


Anecdotal, but I maintain a mildly popular Spotify player (https://github.com/dvx/lofi) which needs a similar flow for OAuth authentication. I've never had an issue created or complaint about the Windows Firewall popup.


Nice player BTW, never heard of it!


I can see that could be an issue. I guess one solution is to make sure users understand that that is going to happen.

These days everybody is used to their smartphone asking things like "Do you want to allow App-X to use your microphone" etc.


Interesting. Are you sure the firewall prompt comes when the server is listening on 127.0.0.1 as opposed to 0.0.0.0?


I'm not actually sure in which cases the prompt happens these days - I meant that to be part of what I was asking but didn't word it very well. Hopefully listening on 127.* doesn't trigger it!


I did the same but used a server coded in Go [0]. I don't think it's a security problem as long as you take the standard security measures that apply to any web server. Mine is something that just scratches my own itch, so I explicitly enabled CORS (if a malicious website can guess the endpoint, it's possible to get my stupid dev logs as JSON during the times this logger is running).

[0]: https://github.com/egeozcan/json-tail


I am trying to do this too, but I would like to make it work on macOS as well. The problem there is that Safari does not let you connect to localhost over http. So if Safari is the user's preferred browser, you need to convince them to change, which is hard.

Or is there a better way there?


I can connect to localhost on Safari Version 14.0.1. Make sure you also load index.htm from the same host (localhost)! So you need a little web server in your app that serves that file.

One problem, however, is that there is no easy way to get Safari to run in chromeless/app mode (without the browser/URL bar etc.), which you can do in most other browsers using the --app=url flag.
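
e.g. spawning a Chromium-based browser in app mode (a sketch; "chrome" is a placeholder for the resolved browser binary, and the port is illustrative):

    import { spawn } from "node:child_process";

    // --app opens a chromeless, single-page window with no URL bar.
    spawn("chrome", ["--app=http://127.0.0.1:8123/"], { detached: true });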


Thank you, I will try. I was actually trying to mix a web app (https://mydomain.com) and connecting to localhost. And yes, a localhost URL in the bar looks ugly... Trade-offs...


See https://bugs.webkit.org/show_bug.cgi?id=171934

I hope they'll fix it soon.


I may be misunderstanding the thread, but it looks to me like they don't want to fix it, even though it is in the spec and other browsers do follow it.


In this case, why not just use a Go binary as the backend? Go is natively designed to produce a single executable, so you don't need any packaging tools, and it does everything Node.js can do on your OS while using your browser as a GUI frontend.


Except you'll have to develop with another ecosystem of packages and cannot do generic programming "natively".



