
I would argue it's even worse than that. Spoof your UA string or other behavior to achieve the best user experience, then find yourself blocked by 50K Cloudflare sites for spoofing:

https://community.cloudflare.com/t/browser-integrity-check-b...

It's a ridiculous thing to have to say, but we need a regulation stating that web sites designed for use by the public must be accessible to the public. Of course I don't mean compatibility checks against the last 5 editions of three dozen different browsers; I mean simply avoiding the situations that lead, directly or indirectly, to

if browser$ == a {render site}
else if browser$ == b {render site}
else if browser$ == c {render site}
else if browser$ == d {render site}
else {echo "go away"}
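
The saner alternative is feature detection: test for the capability a page actually needs instead of the browser's name. A minimal sketch in TypeScript, where IntersectionObserver is just a stand-in for whatever feature you actually care about:

  // Ask whether the capability exists, rather than matching the
  // User-Agent string against an allow list of "approved" browsers.
  function canLazyLoad(): boolean {
    return typeof IntersectionObserver !== "undefined";
  }

  if (canLazyLoad()) {
    // use the modern API for lazy loading
  } else {
    // fall back to eager loading; the page still works everywhere
  }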

Yes, IMHO it's the old browser wars resurfaced, and I don't really understand why. Arrogance cloaked in the veil of security? We've learned nothing. At worst I consider this a corollary to net neutrality... and we know where that currently stands. I would love for the EU to take notice, but they seem so obsessed with cookies. I briefly tried getting the EFF to take a stance, but I don't think they bit. Maybe it's because everybody IS using Chrome?



I'm going out on a limb, but I can't see a downside to making an "ADA for the web". Any for-profit company above a certain size or revenue would have to make its content available as plain text, including transcripts of audio and video. Content should also be available without artificial restrictions (like DRM), so you can use whatever software you like to process the text.

If everything is going digital, this will be necessary for visually impaired people to access critical information.


The only thing I can think of as a negative is that it raises the bar for new players to join.

You already have a very tough time hosting your own materials without using a major player as an intermediary. Email in particular is far more likely to be blacklisted simply because it doesn't originate from a known, approved source, regardless of your DKIM, SPF, or DMARC settings (example records below). On top of that, requiring every new website to be fully accessible to the visually impaired, when there isn't a single collectively approved standard to measure yourself against, would be an enormous hassle and would stop a lot of people from even taking a chance on starting a new website.

However, that is remediable with an approved standard to meet and clear guidelines on how to meet it, so it's not as big a deal as it can be made out to be.
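
For reference, the three mail-authentication mechanisms mentioned above are just DNS TXT records. A hedged sketch for a hypothetical example.com (the IP, selector, and policy values are illustrative, and the DKIM public key is truncated):

  example.com.                  TXT  "v=spf1 ip4:203.0.113.25 -all"
  sel1._domainkey.example.com.  TXT  "v=DKIM1; k=rsa; p=MIIBIjANBg..."
  _dmarc.example.com.           TXT  "v=DMARC1; p=quarantine; rua=mailto:postmaster@example.com"

Publishing all three correctly still doesn't guarantee deliverability, which is exactly the point above.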


Also, it's quite unfortunate that bots can easily use the same interfaces designed for screen readers, so just exposing a lightly standardized REST API or plain old HTML can open you up to bot abuse (DDoS, spam, etc.) once you reach a certain size.
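
The usual mitigation that doesn't involve browser sniffing is per-client rate limiting. A minimal token-bucket sketch in TypeScript; keying on IP and the RATE/BURST numbers are arbitrary assumptions:

  // Per-client token bucket: refill steadily, spend one token per request.
  const RATE = 5;   // tokens refilled per second (illustrative)
  const BURST = 20; // bucket capacity (illustrative)
  const buckets = new Map<string, { tokens: number; last: number }>();

  function allowRequest(ip: string, now: number = Date.now()): boolean {
    const b = buckets.get(ip) ?? { tokens: BURST, last: now };
    b.tokens = Math.min(BURST, b.tokens + ((now - b.last) / 1000) * RATE);
    b.last = now;
    buckets.set(ip, b);
    if (b.tokens < 1) return false; // over budget: reject or challenge
    b.tokens -= 1;
    return true;
  }

This throttles abusive clients by behavior (request volume) rather than by what browser they claim to be.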


> A browser that is intentionally deviating from how the rest of the common browsers work cannot expect to receive the same level of support

There should be no need to "support" it. The server should just send the data requested and be done with it. Whether or not it works is the user's problem.
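
A sketch of that ideal, assuming plain Node with no framework: the handler never inspects the User-Agent at all, so every client gets the same bytes.

  // Serve the same response to every client: no UA sniffing, no block list.
  import { createServer } from "node:http";

  createServer((_req, res) => {
    res.writeHead(200, { "Content-Type": "text/html; charset=utf-8" });
    res.end("<h1>Same page for every client</h1>");
  }).listen(8080);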

What has the web turned into?


I know CloudFlare is an increasingly popular bogeyman, but that's like blaming the Linux kernel for dropping your request to a website after its owner added a rule to block your IP...

CloudFlare does tell users of its service how the different levels of protection will impact their visitors. The site owners CHOOSE to add those barriers. CloudFlare didn't force them; it's perfectly happy being a mostly passive CDN if that's what the website owner wants.


if browser$ == a {render site A}
else if browser$ == b {render site B}
else if browser$ == c {render site C}
else if browser$ == d {render site D}
else {echo "go away"}

is the nightmare scenario we're circling the drain toward.



