
Think it's funny that the person who drew the Pizza Hut logo for Domino's is considered more accurate than the person who drew a domino.


"One man alone can be pretty dumb sometimes, but for real bona fide stupidity there ain't nothing can beat teamwork."

Seldom Seen Smith, from The Monkey Wrench Gang by Edward Abbey.


"A person is smart. People are dumb, panicky, dangerous animals."

-- Agent Kay (Men in Black)


There are other needs for a shortened URL. Perhaps you have to hardcode a link in code and don't want to be tied to a page whose contents may break. If you control your URL shortener, you can hardcode your shortened one, and update the redirect as needed.
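The indirection can be as simple as a lookup table behind your short domain. A minimal sketch (the slugs and target URLs here are made-up examples):

```javascript
// Minimal sketch of a self-hosted shortener: the slug stays stable
// in your hardcoded links, while the target can be updated
// server-side at any time. Slugs and URLs are illustrative only.
const redirects = new Map([
  ["docs", "https://example.com/documentation/v2"],
  ["blog", "https://example.com/news"],
]);

// Resolve a slug to a redirect target, or null if unknown.
function resolve(slug) {
  return redirects.get(slug) ?? null;
}

// When the target page moves, only the mapping changes;
// every hardcoded short link keeps working.
redirects.set("docs", "https://example.com/documentation/v3");
```

The code that embeds the short link never has to change; you just repoint the mapping.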


> Perhaps you have to hardcode a link in code and don't want to be tied to a page whose contents may break. If you control your URL shortener, you can hardcode your shortened one, and update the redirect as needed.

And when your URL shortener goes down, all links are dead. Shortening URLs is the last thing you want to do when you need to preserve pages that might go down. Caching is a better solution.


http://perma.cc addresses this problem.


Why would one use this over archive.is?


I wish that instead of a protocol improvement focused solely on network resources, the next version would also include improvements for users, such as encryption by default and doing away with cookies.


It's unfortunate that TLS ended up being optional in HTTP/2, but if the big browsers only support HTTP/2 over TLS (as FF/Chrome have said they will do) then we might see very little non-TLS deployment.


You can begin today to do away with cookies on your own sites and services. Start implementing richer clients and leveraging OpenID Connect and OAuth2.

Cookies solve real problems. Unless we all start building, experiencing, and improving the alternatives, progress won't be made.

That said, good luck getting rid of cookies altogether.


Excuse my ignorance, but how can I do session management without using cookies?

I tried searching the net, but I can't seem to find any concrete/valid results.

Can you give me any pointers?

Edit: I do use OAuth 2.0 on my services and use Mozilla Persona to manage user logins, but I am not clear on how I can keep sessions between requests if I don't use cookies.


You can carry the session ID in the URL. This also has the benefit of eliminating XSRF. The downside is that you have a horrendous URL, if that type of thing bothers you, and you can't have a "remember me" checkbox in your login.
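A minimal sketch of the idea (the "sid" parameter name is an arbitrary choice):

```javascript
// Sketch: append the session ID to every internal link instead of
// setting a cookie. The server reads it back from the query string.
function withSession(url, sessionId) {
  const u = new URL(url);
  u.searchParams.set("sid", sessionId);
  return u.toString();
}
```

Every link the server renders gets run through something like this, which is exactly why the URLs get so ugly.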


This approach has some massive downsides - the session ID is sent via Referer to outbound links, URLs are logged all over the place (including browser histories), it's easy for people to publicly share it without thinking which then ends up in Google as well...


Partying like it's LITERALLY 1999...


That's a horrible suggestion, it's not 1999 anymore…


HTML5 has local storage, so you can put auth tokens in there and only send them when you need them, versus on every request.
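A rough sketch of that pattern (the "authToken" key and header shape are my own choices, not anything standard; in a browser you'd pass window.localStorage, here `storage` is just any object with getItem/setItem):

```javascript
// Sketch: keep the auth token in Web Storage and attach it only to
// requests that need it, via an Authorization header, instead of a
// cookie the browser sends on every request automatically.
function saveToken(storage, token) {
  storage.setItem("authToken", token);
}

// Build headers for an authenticated XHR/fetch call; returns an
// empty object if no token has been stored yet.
function authHeaders(storage) {
  const token = storage.getItem("authToken");
  return token ? { Authorization: "Bearer " + token } : {};
}
```

The tradeoff discussed below applies: you now depend on JS to attach the token, where cookies did it for free.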


So rather than just using cookies effectively, you can make your application absolutely dependent on both JavaScript and XHR?


"Applications" on the web are inherently dependent on JavaScript, and most often XHR too, but I do agree that using Local Storage has little to no advantage over cookies.


> "Applications" on the web are inherently dependent on JavaScript

No, they are not.


Then we clearly have different definitions of "application". For me, a web application runs in the browser, rather than merely exposing an API over HTTP that can be used by HTML from a browser.


You have a very narrow definition of an application then.


Yeah, him and every other user who expects basic features like "when I select the first step in a sequence of steps, the UI immediately responds instead of waiting several hundred ms to fetch an entirely new set of markup".

Every time I use Tor, I appreciate your viewpoint. But trying to pretend that most developers are better off spending their time maintaining a separate renderer for a few edge case users is not really reflective of reality.


I didn't say JavaScript can't be used to ENHANCE an application, I'm saying it isn't necessary and apps should work without it.


Absolutely. I'm sick and tired of idiot developers fucking up the web with their bullshit "apps" which are slow, crash all the time, and break everything.

I've worked on more "web apps" than I can count, and the reality is, only 2 of them were legitimate use cases for a pure JS solution. And we gained nothing from it. I've just spent all morning debugging an issue with a complicated Angular directive (and not for the first time) that would have been a few lines of jQuery a couple of years ago. Probably because a bored dev wanted to play with a new toy.

As you imply, we were writing sophisticated web apps long before AJAX was popularised and those apps were way more reliable and predictable than what we have now, and they worked in Lynx if you wanted.


Even in 1999, using JavaScript to make far more usable UIs was common. While I'm not a fan of these bloated apps that need a couple megs of stuff just to render some basic looking page, let's not pretend that requiring user action and a full round trip to render even the smallest change was some golden era.

>that would have been a few lines of jquery a couple of years ago

Irony?


Please elaborate.


I agree with you completely. Cookies are an older technology and are well supported in all kinds of browsers. Plus, they handle the client side of session management for you (sending the tokens on every request). Local Storage is a newer technology and might not be feasible in all situations. Plus, JS+XHR are also not available to all kinds of users (people using Tor, NoScript, etc.).

Also, I don't see the advantage of storing session/auth tokens in Local Storage over cookies. Both are stored in plain text, and can be read if somehow obtained. Also, using Local Storage means writing your own client-side implementation of session management.

I also don't see the advantage of putting session tokens in URLs. Cookies are included as part of the HTTP request headers anyway, so you don't have to have your application send session trackers itself. I think both are functionally the same, and tokens in URLs just don't look good!

And a public/private key-based signing system is still not there yet; unless we simplify some UX issues around having a private/public key pair for every user, we are not getting there.

So it looks like, to me, there is no really effective alternative to cookies for doing sessions (even in HTTP/2)?!


Maybe you can get around not using cookies for an AJAX application that keeps a constant connection open. But for the other 99% of the web, you still need cookies to get the statefulness required for any kind of authentication or tracking.

Cookies aren't auth tokens anyway, just session trackers.


Theoretically, we could move to a private key based system, where your browser encrypts/signs with a private key for each site, but there's neither the will to do it, nor the means to make it simple for the room temperature IQs. Shame, as the privacy and security benefits would be amazing.


This could be done today with TLS Client Certificates. There is already browser support (through either <KeyGen /> or the MS alternative, which is an API rather than an element, I believe) for creating a private/public key pair, and sending the public key to the server.

Unfortunately, it's not fantastically simple to move to a new device (particularly not a mobile device, where client certs are even harder to install).


>richer clients

Please, no. The internet works because it's compatible, and installing a local client for everything just prevents use of a service.


What tools would you suggest for doing this? Or even what algorithms to implement for doing this sort of work?


Open your user key-bindings (in OSX this is Sublime Text -> Preferences -> Key Bindings - User), and put

  [
    { "keys": ["ctrl+tab"], "command": "next_view" },
    { "keys": ["ctrl+shift+tab"], "command": "prev_view" }
  ]


Thanks. I've been living with not having this for a LONG time. Just never had time to deal with it.


Thank you so much for this...

you have just made my day


NewsBlur[0] is a great open-source[1] feed reader that not only beats Feedly in ease of use, but also has a bunch of great features on top of the standard RSS reader, with apps for both iOS and Android to complement the killer web app. I highly recommend it.

[0] https://newsblur.com/

[1] https://github.com/samuelclay/NewsBlur


I went with NewsBlur b/c I didn't like Feedly requiring my gmail address.

After 5 months I finally subscribed, because it seems to work well enough and I want to see it survive. I'd easily have paid Google the same amount or more for a Reader subscription, but hey, with Reader gone, I'll have to migrate to Google Plus, right?

Wrong: NewsBlur it is.


11 days? Two weeks? A month? Never?

Not every company values this sort of feedback from their users. Some go out of their way to prosecute those that break their services this way.


That is sad. There should be some kind of TOS for public internet companies. While this kind of exploit does not hurt Google, it can be very dangerous for users.


Sounds like it would be a good extension to Data Protection / Computer Abuse/Misuse Acts depending on what it allows access to. Anyone pointing out a failing in your system should not be prosecuted unless they've actually committed a crime.


That doesn't actually help; you need to think more like a prosecutor. By investigating the vulnerability they have, in fact, committed a crime - see for example http://blog.erratasec.com/2013/09/how-weevs-prosecutors-are-...


It doesn't say anything about her backtracking. It says she defends the program and that her "reform" bill maintains the status quo.


Another great Bloom Filter library not mentioned in this article is Bitly's implementation on GitHub[0]. It has the best of both worlds: a C implementation and higher-level wrappers (Golang, PHP, Python, and Common Lisp). If you don't want to play around with a fancier DBMS and really just want the damn filters, I would look here.

[0] https://github.com/bitly/dablooms
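For anyone who hasn't used one, the core trick a library like dablooms wraps is tiny. A toy sketch of a plain Bloom filter (sizes and the seeded FNV-style hash here are illustrative, not what dablooms actually uses; dablooms also adds counting and scaling on top):

```javascript
// Toy Bloom filter: k hash probes into a bit array. Answers are
// "maybe present" or "definitely absent"; false positives are
// possible, false negatives are not.
class BloomFilter {
  constructor(bits = 1024, hashes = 3) {
    this.bits = bits;
    this.hashes = hashes;
    this.bitset = new Uint8Array(Math.ceil(bits / 8));
  }
  // Simple seeded FNV-1a over the string; a real filter would use
  // a stronger hash family.
  hash(item, seed) {
    let h = 2166136261 ^ seed;
    for (let i = 0; i < item.length; i++) {
      h ^= item.charCodeAt(i);
      h = Math.imul(h, 16777619);
    }
    return (h >>> 0) % this.bits;
  }
  add(item) {
    for (let s = 0; s < this.hashes; s++) {
      const idx = this.hash(item, s);
      this.bitset[idx >> 3] |= 1 << (idx & 7);
    }
  }
  mightContain(item) {
    for (let s = 0; s < this.hashes; s++) {
      const idx = this.hash(item, s);
      if (!(this.bitset[idx >> 3] & (1 << (idx & 7)))) return false;
    }
    return true;
  }
}
```

The appeal of dablooms is getting this behavior, plus counting (deletes) and scaling, from battle-tested C rather than a toy like the above.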


It is elliptically mentioned, via the link to this wonderful pull request: https://github.com/bitly/dablooms/pull/19

