
Off the top of my head;

* No Active Directory/SAML/other SSO.

* Gaudy/gamer-orientated look and feel - the messages and notifications are hard to sell to a business.

* No/poor implementations of "enterprise features" (read: eDiscovery, DLP, etc.)

Of those, the first and second are arguably the most important, as they're often requirements for businesses of any size. The third is more of a concern for larger/enterprise-y businesses.


This was exactly my intended use case - as a gift for relatives. The problem I've had is that I need it enclosed to prevent tiny little fingers from getting hurt or destroying the thing. I tried soak testing it one night and experienced the same thing :/

Currently I'm hoping that I can modify the case this weekend to fulfill my needs.

As something intended to be used by kids to learn about computing, I do wonder how many are going to get hurt fingers from the heat output by accidentally maxing the CPU and/or touching the USB ports, etc.


> As something intended to be used by kids to learn about computing, I do wonder how many are going to get hurt fingers from the heat output by accidentally maxing the CPU and/or touching the USB ports, etc.

Sounds like one of the unexpected lessons will be heat dissipation :)

Probably outside of the realm of weekend hacking, but I wonder if something like this would work for you: https://flirc.tv/more/raspberry-pi-4-case

Best of luck!


I hadn't come across that case. I'll give it a go if my DIY attempts don't work (or are crap) :D


This is my beef with the implementation (my understanding may be incomplete, in which case I apologise).

Internal DNS and split-brain DNS aren't catered for without disabling support? I don't want my internal names leaking to the internet, and they aren't necessarily the same as what external resolvers return. Now yes, the latter is a hack, but it's one still widely used today.

The idea is laudable. But it feels hostile. I can disable support, but for how long?


I guess it would be possible to run my own local DNS server that connects to these DoH servers. Does any DNS server support DoH? This could also allow the user to override domains using their /etc/hosts file in case DoH on Firefox doesn't support it.


I use pihole + dnscrypt-proxy 2.* [0]

A fair number of resolvers support DoH, and dnscrypt-proxy also supports DoT. It's fairly feature-rich; you can configure a hosts file and then some.

[0] https://github.com/jedisct1/dnscrypt-proxy


I'm running my own DNS-over-HTTPS instance at home. I have Apache running with HTTP/2 support, some self-signed certificates, and a CGI script that accepts the DoH request and makes a DNS call to my local instance of bind. I found RFC 8484 quite easy to follow, and I've set network.trr.mode to 4 (use DNS, but also send DoH queries for testing) and network.trr.allow-rfc1918 to true (so local addresses can be resolved locally).
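The CGI piece is only a handful of lines. Roughly something like this - a simplified sketch rather than my exact script, with the local resolver assumed to be bind listening on 127.0.0.1:53:

  #!/usr/bin/env python3
  # Sketch of an RFC 8484 CGI handler: read the DoH request (GET or POST),
  # forward the wire-format query to the local resolver over UDP, return the answer.
  import base64, os, socket, sys

  def read_query():
      if os.environ.get("REQUEST_METHOD") == "POST":
          return sys.stdin.buffer.read(int(os.environ.get("CONTENT_LENGTH", "0")))
      # GET carries the query as unpadded base64url in ?dns=...
      params = dict(p.split("=", 1) for p in os.environ.get("QUERY_STRING", "").split("&") if "=" in p)
      b64 = params.get("dns", "")
      return base64.urlsafe_b64decode(b64 + "=" * (-len(b64) % 4))

  query = read_query()
  sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
  sock.settimeout(2)
  sock.sendto(query, ("127.0.0.1", 53))  # assumed: bind listening locally on port 53
  answer, _ = sock.recvfrom(4096)

  sys.stdout.write("Content-Type: application/dns-message\r\n")
  sys.stdout.write("Content-Length: %d\r\n\r\n" % len(answer))
  sys.stdout.flush()
  sys.stdout.buffer.write(answer)

Apache handles the TLS and HTTP/2 side; the script only shuffles bytes between the HTTP request and the UDP resolver.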

I do the occasional test with network.trr.mode set to 3 (only use DoH), but I seem to have issues resolving GitHub. I haven't looked that far into it.

EDIT: there do exist solutions to run locally.


Thanks, I'll have to look it up and give it a read. I'll be honest, I've not read the actual RFC in this instance; I've pieced together what I know from articles, reported behaviour, etc.

I know it's lazy and I should've done more work. But, burnout.


In that case you are better off running local DNS and using a different subdomain (internal.companyname.com or whatever) for internal DNS entries; the DNS-over-HTTPS query will go out and fail, then Firefox will fall back to traditional UDP DNS on port 53, hit the local resolver on the LAN, and away you go. It will presumably cause a short delay the first time a host is queried, but after that I assume Firefox is smart enough to cache the result, so unless you have absurdly short TTLs the performance impact should be pretty low.
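If it helps to picture the ordering, here's a toy sketch of that behaviour - not Firefox's actual code path, and Cloudflare's JSON endpoint is used purely as an example public resolver:

  # Toy model of "try DoH first, then fall back to the OS/LAN resolver".
  import json, socket, urllib.request

  def resolve(hostname):
      try:
          req = urllib.request.Request(
              "https://cloudflare-dns.com/dns-query?name=%s&type=A" % hostname,
              headers={"Accept": "application/dns-json"})
          answers = json.load(urllib.request.urlopen(req, timeout=2)).get("Answer", [])
          ips = [a["data"] for a in answers if a.get("type") == 1]
          if ips:
              return ips  # the public resolver knew the name
      except Exception:
          pass
      # internal.companyname.com and friends end up here and are answered
      # by the LAN resolver over plain port-53 DNS via the OS.
      return [ai[4][0] for ai in socket.getaddrinfo(hostname, None, socket.AF_INET)]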

The positives certainly outweigh the negatives of inconveniencing some IT admins who, as you correctly point out, are implementing a dirty hack anyway.


> the DNS-over-HTTPS query will go out and fail

You completely missed the point of the parent, which is to NOT let internal hostnames out of the network.

> The positives certainly outweigh the negatives of inconveniencing some IT admins who, as you correctly point out, are implementing a dirty hack anyway.

This is a perfect example of the irritating attitude I see from people pushing hostile features like this. Everyone wants their network to operate the way they want, and yet you think you know better than the actual owners of those networks.


The owners of the networks have time and time again done user-hostile things that have compromised their users' security and privacy.

Corporate networks are a small percentage of network traffic and their use cases are less important in the grand scheme of the internet.

DNS over HTTPS is a solution because network owners can't be trusted - either they block things, or they take their DNS logs and sell them to advertisers.


I am the owner of my network, as is everyone who has one at home, and you are saying that I can't be trusted. WTF!?


I think Jonnax meant ISPs, not you.


In which case he is wrong. You are the operator of your own network.

If applications decide to bypass you, they are hostile and cannot be trusted.


The DNS protocol is old and needs to be updated. Security considerations need to be forward-looking rather than clinging to the past.

DNS is unencrypted and a security risk for the user. It's an old technology that needs to be updated.

DNS over TLS/HTTPS allows the browser to get a trusted record of an IP from what is effectively a public register.

The bypass here is looking up what the corresponding IP address for a hostname is.

DNS-based blocking isn't as effective as IP blocking.

You can resolve a DNS-based blocklist to the IPs you want to block, feed those into a firewall, and get the exact same behaviour.
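For example, a rough sketch of that approach (the blocklist file name and the iptables chain are just placeholders):

  # Rough sketch: resolve a domain blocklist and emit firewall drop rules.
  import socket

  def blocked_ips(path="blocklist.txt"):  # placeholder file: one domain per line
      ips = set()
      with open(path) as f:
          for line in f:
              domain = line.strip()
              if not domain or domain.startswith("#"):
                  continue
              try:
                  for info in socket.getaddrinfo(domain, None, socket.AF_INET):
                      ips.add(info[4][0])
              except socket.gaierror:
                  pass  # domain no longer resolves; skip it
      return sorted(ips)

  for ip in blocked_ips():
      print("iptables -A FORWARD -d %s -j DROP" % ip)

You'd want to re-run something like that on a schedule, since the IPs behind the names change over time.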

Both Firefox and Chrome have the ability to set enterprise user settings that can force certain configurations.

So you should have the ability to disable it if you want in your network.

If you're worried about the security of your network, don't allow devices that you don't trust onto it, and restrict internet access properly.

What's more, anyone can configure a custom DNS resolver on their device when connecting to a network.


You seem to forget that domain name resolution became a problem after the more generic name resolution (i.e. Novell/LAN Manager/NetBIOS). The generic name resolution system used lmhosts, which became hosts to more easily associate IPs and names. [0]

> Originally these names were stored in and provided by a hosts file but today most such names are part of the hierarchical Domain Name System (DNS).

[0] https://en.wikipedia.org/wiki/NetBIOS#NetBIOS_name


The lack of trust I mentioned was about ISP-provided DNS servers. You don't own your WAN, and the majority of people use the DNS provided by their ISP.

On your own network, if you feel that doing a DNS lookup against what amounts to a public address book is unethical, then don't allow arbitrary clients on the network.

If you want to do blocking based on a DNS list, configure your firewall to do that.


Knowing better than the owners is a matter of tradeoff.

There are whole ISPs and even countries (including the UK shortly) which mess with DNS requests. Helping the millions of users who are in that situation, and don't even know what DNS is, seems like a net good. As you say, experts can choose to disable it.


> As you say, experts can choose to disable it.

As long as they can. The problem with these ideas is that it can get increasingly difficult to work around them. How many hoops do you have to jump through to pcap your own software on your own machines now that certificate pinning is becoming popular? What happens when someone has the bright idea of implementing certificate pinning for DoH inside browsers, "because security"?

(I could live with the choice between having to somehow acquire Chrome Enterprise Edition vs. switching to Firefox, to have a browser I can control. I'm worried now that Firefox might be turning into Chrome, though.)


> including the UK shortly

If you're implying the porn filter, no - the porn filter has been shelved 'indefinitely' because a) it's against EU law, and b) it was May's personal project (she pushed heavily for it when she was Home Secretary, and it became a thing under her PM-ship).


Firefox could provide an opt-out flag.


A likely opt-out mechanism [1] that will prevent Firefox from enabling DoH in any future update:

  1. about:config
  2. network.trr.mode = 5

5 means explicitly disabled. 0 (the default) is whatever is considered the default for now.

[1] https://news.ycombinator.com/item?id=20373444


Once Firefox starts to ignore DNS resolvers configured at the OS level, other apps are sure to look at it and think it must be a good idea because Firefox is doing it. Soon there will be a multitude of applications needing this disabled in their own unique way.

If the Mozilla Foundation see this as an issue, they should instead be developing a separate solution to provide this system-wide. If you must, bundle it with Firefox and offer to install it at browser installation or upgrade time. Don't install it by default, and certainly don't enable it without user permission.


You can absolutely configure IPSec tunnels on Fortinet.

Can I ask, do you perform updates and maintenance on these boxes? How do they perform in terms of throughput with OpenVPN?


I'd be more interested in whether WireGuard is an option, and how it performs, since they're already on Linux/OSS.


IPSec is still pretty much the standard for interop - pretty much everything talks it: Cisco, Juniper, pfSense, Fortinet, etc. You name it, I'd be surprised if it didn't have support.

I'd love to see WireGuard implemented in the networking world, but I think it may take some time to get there :/


FWIW, in MSSQL there's always the `option (maxrecursion 0)` trick


On one hand I kind of agree. It's a special syntax, which is one of the reasons why I personally dislike Vue and Angular, etc.

However, pragmatically, it's minimal and all the guts will be in Elixir - where the people probably most interested in LiveView will be spending most of their time anyway. So it's a net win imho.


Definitely not just a gimmick. It's aimed at smaller contractors imho.

I bought my first house in the UK less than a year ago. I ripped absolutely everything out. For weeks at a time I had only 1 working sink, with 1 working electrical socket. During this time I had local contractors in to deal with stuff I either didn't have the time, expertise or legal certifications to manage myself. Barring a carpenter, every single one brought a Makita radio/wifi unit, and every single one needed to leave the house to get hot drinks because the single socket could not safely take more devices plugged into it (chargers on an extension), etc.


The contractors I had were bringing their coffee in thermoses until they were able to connect a kettle.


> renamed the web folder a couple of times and so on

Just to add a bit of information to this, they were all in RC or pre-releases. I was bitten by this myself, but if you choose to run RC code then you need to own that a little bit.


Story time;

I've recently started working with an SME (<100 users) who took this approach after abandoning the vendor's reporting recommendations for their line-of-business application (apparently it was just crap).

Other areas of the business cannot be without these reports under any circumstances. Negotiation is not an option.

Since about 2012 they've had various employees generating reports, alerts and data warehouses as objects in the database. Naturally some of these have been superseded. Very often the old objects were left behind "just in case". Almost all of the original authors have left, and documentation has been lost or just didn't exist in the first place.

Many of these objects are dense and difficult (because SQL was the wrong tool for the job, or because they go many layers deep like matryoshka dolls).

Demands for changes to reports are frequent (weekly), and some reports functionally overlap, but produce wildly different results for different areas of the business based on various "rules".

Honestly, the database is a fucking mess (approximately 1,900 objects relating to reports, alerts and data warehouses - some of which are dolls going many, many levels deep). There's a huge sense of shame and fear over trying to regain control.

I totally understand that this was entirely a human problem. With more restraint, both inside and outside of the IT team, this wouldn't have happened. However, I believe a tool like PopSQL, metabase, etc. would've helped with:

1. Ad-hoc queries that didn't really need to be full objects

2. Discoverability through prettier annotations, etc. - reduction of reports being duplicated

3. Auditing - last access/run times, etc. would be much easier to find to reduce the cruft

4. If the tool is more appropriate it could offload some of the things that SQL is not good at

There are undeniably good reasons for storing reports, alerts and data warehouses (developer maintained objects) in version control. If the report creation/alteration requests don't come often, then I'd argue it's ideal.

However, living in the real world, I personally feel it needs to be balanced in conjunction with tools like PopSQL, metabase, etc. Right tool for the right job.


I've seen this same story at every job I've ever had or contracted for - Fortune 100s down to mom and pops.

I think metabase/redash/etc. is a great fit for the development of reports and the kinds of things that don't need to be anything more than a SQL query. IME fast turnaround can be huge for lots of businesses.

I agree with all of your points. A lot of the problem is organizational and political and historical - but the right tool can make a big difference.


Any tips on prying people away from Excel, and into something like metabase/Tableau/whatever?

That seems to be one of the biggest hurdles I think I'm going to have to figure out, rather than the technical side.


A decade ago I would have had a lot of ideas for you, but as time goes on I have learned to embrace Excel.

There are two directions you could be coming from, though. If it's a problem of users producing data and storing it in Excel instead of your database, that's more of an application problem.

But if it's a matter of Excel being where your users want to look at and work with the data - I say embrace that. Use your metabase/tableau/redash to provide them with that data and let them get it out into Excel and do whatever they want. Give them a little time and then let them show you what they're doing with it.

IME about half the time they're actually doing things with it that you can't really help them with unless you spend a ton of time. The other half, you can take what they're doing and either apply those changes to the query itself, saving them time and sharing that work with others - or teach them a better/easier/faster way of achieving whatever it is they're after. I use it as a long-tail way of gathering requirements a lot of the time: let me give you all the data you might need for this, and you figure out how it can be useful for you.


I believe Duplicati is the tool du jour


Aha!

This looks interesting.

Just to confirm, you mean this Duplicati? (https://github.com/duplicati/duplicati)

Also, it says it's a backup client which supports cloud and local servers.

On their page it says it supports:

"Amazon S3, OneDrive, Google Drive, Rackspace Cloud Files, HubiC, Backblaze (B2), Amazon Cloud Drive (AmzCD), Swift / OpenStack, WebDAV, SSH (SFTP), FTP, and more!"

The only local options I see are SFTP/FTP. Do you know of other supported ones? I can't seem to find references to other local solutions, so if you have more info that would be great!

I mean, setting up an FTP server is pretty simple, and with such a client most of the hassle of actually backing up would apparently be solved, but are there any other options for local servers that you know of (just for comparison's sake)?

Any input is appreciated. Thanks!


Yeah, that's the one.

SFTP is SSH (i.e. it's not FTPS), so you don't have to stand up an FTP server :)


Ah.

I always thought SFTP was FTP. Will need to read up more on that.

Thanks!

