Stop asking users for their timezone – use per-device settings and JavaScript (trevoro.net)
96 points by neilk on Aug 7, 2013 | hide | past | favorite | 64 comments



This pretty much breaks progressive enhancement. If anything, the user's time zone should be first estimated from their IP address with a setting to override it. The client side solution to further refine it should be piled on top of this last, if possible. JavaScript is useful, but nobody should assume it's present.

Also, it should be noted which time zone a time is in. A naked date/time is about as useful as saying that you're 3 units away from me. What unit does "unit" represent? shrug


By your logic I'd be running a WAP site over Gopher to satisfy the progressive enhancement ethos. At a certain point it's absurd to assume javascript isn't there. [1]

What brought on this particular use case was a JSON response where all the times were in UTC. I'm not really into forcing a technique on someone when it isn't required. If that's not your use case, then you don't have the problem!

[1] http://www.mozilla.org/en-US/firefox/23.0/releasenotes/


No, that's patently absurd, although I have been known to use w3m while doing work in vim, and I'm super grateful when a site is usable from a text only browser. Screen readers and bots (including search engine spiders) have an easier time with well written HTML.

Progressive enhancement is far better for the end user and not all that hard to implement. Some people choose to disable JavaScript, and there are compelling reasons to do this especially if, for example, you're using TBB and want to minimize your attack surface, as has been demonstrated recently.

JavaScript isn't evil by any means, and it's really important for creating cool shit, but it should never be required. If I can't use your site with scripts disabled, neither can a lot of other people. Sucks to be them though, right?


> it should never be required

What a ridiculous statement. Lots of things are "required" if you want certain products to be useful. An XBox is required if I want to play XBox games.

Hell, you require a web browser just to be able to enjoy the pleasure of checking that "disable JavaScript" box (though be careful you don't use Mozilla) in the first place (I mean, you could use curl, but then you don't actually get to disable JavaScript).

The point is, let's stop kidding around about JavaScript here, no-javascript guy.

> If I can't use your site with scripts disabled, neither can a lot of other people.

People who need to disable JavaScript are a vanishingly small portion of any potential market; there are many more use cases, as far as web apps go, where JavaScript is essential to any decent user experience. Not everything built on web technology is only about content these days. If you don't want to use that technology, be my guest, but don't pretend you're an important market to people developing on that technology.

> Screen readers and bots (including search engine spiders) have an easier time with well written HTML.

True but two things.

1) Web applications rarely rely on SEO outside of landing pages and a content strategy neither of which should rely on javascript for obvious reasons.

2) Designing applications for disabilities is extremely hard. Much harder than simply using "well written" HTML. Do you know how hard it is for blind people to play most video games? Again, we're talking about interactive applications, not simply content. With content there are no problems, but as we already went over, the web is not only content now.

Edit: One more thought: technically, translating the time to the local timezone setting of the browser is "progressively" enhanced. You could always just display UTC time if JS is disabled, assuming supporting users who disable JS (or bots) is important enough to you.


  People who need to disable Javascript are a vanishingly small portion of any potential market.
Depends on who your target demographic is. I'm a "no-javascript guy" and my cousin is one too (well, "gal") for completely different reasons. She's on a rubbish computer and doesn't live in the U.S. (read: developing country), so it's often just faster and smoother to browse on a wireless connection with JS disabled. You'd be surprised at how common this is.

I frequently disable it while browsing casually and, for about a month or so, it was policy at work until we sorted out a few in-house security issues (namely browsing etiquette for some of the folks). Our CMS was broken during this time and the front UI was quickly re-written in plain HTML. After that, we sorta left it that way.

  1) Web applications...
The original post makes no mention of "application" or "web site" for that matter.


> You'd be surprised at how common this is.

Actually, I would. I would be completely surprised if it is common (by that I mean at least more than IE6 usage) at all. However, you've presented no evidence of this, only anecdote. Seriously, there is absolutely no evidence that there is a growing population of no-JS people out there. Perhaps for very specific demographics, in which case I certainly hope whoever is building product for them knows their customers well enough to know that, or God help them.

Either way, they are not going to be building the latest in interactive experiences or web-based gaming for your cousin with her rubbish computer, are they? That doesn't discount the fact that many people are building exactly that these days.

I'm tired of people on Hacker News making blanket arguments like "never do this" especially something as bland as requiring javascript. They have absolutely no clue what they are talking about.

Bottom line is if I (and Google and 37 signals and countless others) can choose to build something that doesn't support even IE8 I can quite happily choose to require Javascript to use my web application. I would be an idiot however if I required it for my blog.


Yahoo Blog on users with JS disabled (stats as of 2010):

http://developer.yahoo.com/blogs/ydn/many-users-javascript-d...

For clients in the UK, it's a legal thing:

https://www.gov.uk/definition-of-disability-under-equality-a...

http://www.rnib.org.uk/professionals/webaccessibility/lawsan...

FYI: Screen readers nowadays can run JS, but hiccups abound. Also, I don't recall seeing "never do this", though I did see this: "JavaScript isn't evil by any means, and it's really important for creating cool shit, but it should never be required" from https://news.ycombinator.com/item?id=6176036 . Which, in the context of the article, seems pretty reasonable to me.

This all started with when kintamanimatt mentioned that the article's technique breaks Progressive Enhancement, "JavaScript is useful, but nobody should assume it's present."

Obviously, I'll need to get an Xbox to play an Xbox game. I'll need Flash enabled on the browser to play a game built on that. Same goes for JS. But, again, the OP makes no mention of "applications" or "web sites" so I'm not sure why you're bringing up "the latest interactive experiences or web-based gaming" into this.


The Yahoo link says JavaScript was disabled on 1% of visitors as of 2010.

That percentage should be even lower now, not higher.


Honestly, this is getting completely absurd, so I'll try to be brief.

1. Your article actually states how small this is, so point taken, I guess.

2. 99% of screen readers today actually support JavaScript [1] and, as I stated, if you want to talk hiccups, effectively supporting the visually impaired is a veritable minefield of challenges; just supporting JavaScript being disabled isn't even the tip of the iceberg.

3. Finally just this: Also I don't recall seeing "never do this", though I did see this: "JavaScript isn't evil by any means, and it's really important for creating cool shit, but it should never be required". Just read it back slowly, you'll find the word "never" in there if you're more careful.

You are also choosing to presume the author is incompetent regarding these issues, where I see no such thing demonstrated. He even bothered to explain his (fairly common these days) use case: a JSON response processed on the client with JavaScript. If you want to argue against this sort of thing, be my guest, but you're not really going to reverse the trend. Either way, it isn't a debate I'm very interested in.

Reductio ad absurdum as they say.

[1]: http://www.punkchip.com/2011/03/why-support-javascript-disab...


Woah woah woah - can we talk about what is really going on here and note that Firefox no longer supports <blink>


1) Exactly. This doesn't help at all with server-side rendering of times, which is usually the use-case (times associated with forum comments, etc.). IP address detection is pretty annoying to do (and doesn't help with VPN's etc.), but a quick <script> in the <head> of any page that does an instant "redirect" to put the JavaScript timezone info into a server-side session variable is a functional, albeit hackish, solution. (If you don't have JavaScript enabled, then just serve up UTC times, it's an edge case.)
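A minimal sketch of that redirect hack, with illustrative names (nothing here is from the comment): an inline script in the <head> stashes the browser's UTC offset in a cookie and reloads once, so the server can render local times on every later request.

```javascript
// Hypothetical helper: build the cookie value the inline script would set.
function buildTzCookie(date) {
  // getTimezoneOffset() is minutes *behind* UTC (e.g. 240 for UTC-4).
  return 'tzoffset=' + date.getTimezoneOffset() + '; path=/';
}

// In the page's <head> (browser-only part of the sketch):
//   if (!/(^|; )tzoffset=/.test(document.cookie)) {
//     document.cookie = buildTzCookie(new Date());
//     location.reload();
//   }
```

The server reads `tzoffset` from the session/cookie and shifts UTC timestamps before rendering; visitors without JS (or cookies) simply get UTC, as the comment suggests.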

2) Usually your times are stored server-side in UTC, so you never have to worry about timezones anywhere. It simplifies things a great deal. "Naked" = UTC. You assume that, whenever it's stored, it's already been translated from the user's timezone when the date was initially generated, or else the time it happened on the server was the actual time.
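As a concrete illustration of the store-everything-in-UTC convention (my example, not the commenter's): in JavaScript, `Date.prototype.toISOString` always serializes in UTC regardless of the machine's timezone, which makes it a safe storage and wire format.

```javascript
// Whatever zone the server or client is in, the serialized form is UTC
// (note the trailing "Z"), so stored values compare and sort consistently.
var stored = new Date(Date.UTC(2013, 7, 7, 18, 30)).toISOString();
// stored === "2013-08-07T18:30:00.000Z"
```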


Sorry, I meant naked date/times that are displayed on a page. It might be that site A gets the user's local time zone correct, but sites B, C, and D don't. Displaying the time zone abbreviation next to the time removes all ambiguity and possibility for error. Also, if it's a time zone abbreviation that's displayed (e.g. EDT), make sure to mark this up properly and fully explain which time zone and country it's for. I can't remember which, but there are some time zones that share the same initialisms. Some don't have initialisms at all, and users from, say, Poland might not know what PDT is either.


This is what we did on a recent project. The times were left at UTC with a hover tooltip (JS with a fallback to title="Coordinated Universal Time") and we used the "timeago" plugin for jQuery http://timeago.yarp.com .

So far, no complaints.
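For readers unfamiliar with the plugin, the core of the "timeago" pattern is small enough to sketch without jQuery (a rough approximation, not the plugin's actual code):

```javascript
// Turn a UTC timestamp (ms) into a rough relative string. The server-
// rendered UTC text in the title attribute remains the no-JS fallback.
function timeAgo(thenMs, nowMs) {
  var mins = Math.floor((nowMs - thenMs) / 60000);
  if (mins < 1) return 'just now';
  if (mins < 60) return mins + ' minutes ago';
  var hours = Math.floor(mins / 60);
  if (hours < 24) return hours + ' hours ago';
  return Math.floor(hours / 24) + ' days ago';
}
```

Relative times sidestep timezones entirely, which is much of the appeal.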


Progressive enhancement of dates is easy. Output UTC, update to local time zone with JS. Simple!
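That pattern can be sketched in a few lines (the element structure and function name are my own, not the commenter's): the server emits UTC inside a `<time>` element, and a script rewrites it to the viewer's locale; without JavaScript the UTC text still renders.

```javascript
// Convert a UTC ISO-8601 string to the browser's local representation.
function localize(utcIso) {
  return new Date(utcIso).toLocaleString();
}

// Server renders: <time datetime="2013-08-07T12:00:00Z">2013-08-07 12:00 UTC</time>
// Enhancement step (browser-only):
//   document.querySelectorAll('time[datetime]').forEach(function (el) {
//     el.textContent = localize(el.getAttribute('datetime'));
//   });
```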


An option to override this is a must. I frequently use VPNs and as a result appear in many different countries throughout the day. It's bad enough with Google repeatedly blocking my accounts for suspicious logins, without having to worry about timezone differences too.

Also, it'd be nice to ALERT me when my timezone has changed and give me an option to keep the old one, or use what you think my current one is.


In this case progressive enhancement is offering a GMT date or other sensible default, and offer another date after javascript kicks in.

Personally, I think for anything where the timezone is relevant, the best move is to always let the user manually enter the timezone he/she wants, or have it match the timezone of the event/service he's looking for, and of course allow for override, so changing the timezone has to be done client side.

I.e. if I am in Germany and search for a flight taking off from Taipei two days from now, I don't care about the departure time in the German timezone. Same thing if I reserve a restaurant in Germany while I am in Taipei: you'll want the German time for the reservation. For something internationally broadcast, showing the UTC time as default and providing a way to adapt it to other timezones is OK, as the event has no local anchoring anyway.

People can do timezone calculations themselves, and some even opt not to change their phone's timezone so as to keep in sync with other, more important schedules, etc. Trying to do 'smart' things seamlessly is often just screwing with the user.


Geolocating the timezone based on IP is so fragile and likely to fail that IMHO it's not nearly worth the time to implement.


Indeed, and it will no doubt get worse with the IPv6 rollout.


Oh, I hate when websites follow your approach.

To start, it doesn't work if you use VPNs. Second, it doesn't update when you travel.

Here's one example. Even though I'm logged into Google and used Google Maps' 'pinpoint my location' feature, Google Reader is currently emailing me my daily schedule (a feature I enabled recently) in the afternoon. That's also despite my appointments being annotated with timezone information.

I call this bad engineering.


I started to write a response, but it grew too long, so I wrote this blog post instead: http://tech.bluesmoon.info/2013/08/dont-guess-at-timezones-i...

tl;dr: the timezone you use depends not just on the user and their environment, but also the event and its duration.


I looked into getTimezoneOffset when implementing pre-orders here at Gumroad and realized that technically this is incorrect:

"Luckily, the browser already knows what timezone the user is currently in, so we can make use of the getTimezoneOffset() function"

Getting the timezone offset isn't the same as getting the timezone. Granted that in certain cases having the offset is good enough, but sometimes you need the actual timezone. In my example I needed to know when a merchant wants their pre-order to be released. For that to work predictably I needed to know their exact timezone in order to adjust the time for daylight saving properly. Remember that there are quite a few timezones that are UTC-7 and they can behave differently in regards to daylight saving.
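The distinction is easy to demonstrate: an offset is just a number of minutes, while a timezone is a named set of rules. Where the Intl API is available (an assumption; it postdates some browsers of this era), the actual IANA zone name can be read directly:

```javascript
// The offset can't distinguish zones that merely coincide right now:
// America/Phoenix (no DST) and America/Denver (DST) are both UTC-7 each
// winter, yet a pre-order release time in each would diverge in summer.
var offsetMinutes = new Date().getTimezoneOffset(); // minutes behind UTC
var zoneName = Intl.DateTimeFormat().resolvedOptions().timeZone; // e.g. "America/Denver"
```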


Fortunately operating systems on laptops and mobile devices all adjust timezones automatically whenever they connect to the internet.

My laptop has never changed timezone automatically. And that is a feature in my eyes.

How can you assume that just because I'm doing a weekend trip that I want my whole life to be centered around the timezone in that particular place?


Most people want their clocks to be correct, local time, including their laptop clock. It's a better assumption to make for most people.

Of course, it can always be turned off for others, like yourself! :)


Indeed I do want it to be correct. Geolocation of IP's has placed me all over the country while I sit at my desk. My desk doesn't move.

You can reliably tell where a cell phone is by the tower it connects to, guessing by IP is nowhere near as accurate.


Citation needed yourself. I'd say that the vast majority (that actually cares) wouldn't want the OS to change the timezone behind their back on a laptop.

On a phone (which also acts as a wrist watch for most people), yes. Laptop, just no.

What OS does this? And how?


OSX for sure. "Set time zone automatically using current location". Checkbox inside system preferences.

I think it's pretty safe to say most people want their laptop's clock to match their local time, because how is a wrong clock useful? Of course, if you're travelling for business with complicated schedules and meetings in home times, you clearly might have different needs, but most people just want their clocks to be right.

And you certainly don't want to keep your original timezone but change the clock setting -- then just everything's wrong! :)


On my laptop, any time that isn't the current time in my hometown is the wrong time.

And it is moronic for the OS to make this change behind my back (I don't know how OS X presents this, but thankfully Windows 7 doesn't do this madness by default).

How do I know that my OS got the time right? (did it perhaps get the time from my VPN location? was the network I connected to badly configured?) Since a laptop isn't connected to the net at all times I also have to consciously be aware of whether I've even been connected yet in this particular time zone.


So if you travel to Tokyo for a week, you still want your laptop displaying the time from your hometown? That seems rather unusual.

And for the record, OS X does it by sniffing for wifi access points. The particular access points available at any given location are a pretty reliable means of tracking location, assuming there are any (if there are no access points, it can't find your location, and therefore won't touch your clock). The only exceptions are access points that themselves move, e.g. wifi on planes. I would assume that these access points are disregarded.

This is the same system that iOS uses to enhance the GPS (especially on iPod Touches and Wifi iPads, where you have no cell towers).


if there are no access points, it can't find your location, and therefore won't touch your clock

This is the part that scares me and has gotten me into trouble in the past. Basically it means I do not know if my clock is local time, home time, time of the last timezone I visited or something random and thus I cannot trust it.

This isn't just hypothetical; it has bitten me in the past. Fortunately it led to me showing up for my train an hour too early rather than an hour too late. But after that I never let my OS play around with my timezone settings.


OSX doesn't use IP geolocation to determine location... they use wifi triangulation. VPN connection-state shouldn't make a difference one way or another.

http://en.wikipedia.org/wiki/Wi-Fi_positioning_system


In the one out of one trials where I've expected OS X to automatically set the clock ("automatically change time zone"), it hasn't.

This touches on to the relevant point of DST. I no longer have to worry about changing clocks; the worry has been replaced by the confusion over whether all my clocks have changed or none have.

At a bare minimum I'd expect any change of time zone to prompt the user.


Yes, DST is a pain as well (and completely unnecessary, just ditch DST already).

Especially when you set the alarm for the next morning - and you know that DST kicks in but you have no way of knowing whether your phone knows that. And you can't change time beforehand because you don't know whether the phone will respect that or change the time back or make the change twice...

I found out that the stock Android alarm app solves this (somewhat): when an alarm is set, a notification tells you how many hours (and minutes) remain until the alarm goes off, and this takes DST changes into account, so you can see that it got it right.

Apple has succeeded in getting this wrong several years in a row with the iPhone, which is quite embarrassing, and it's made much worse by the fact that you as a user cannot prepare for it.


I feel that you are taking a narrow-minded, absolutist view here.

"How do I know that my OS got the time right?" Well, have a look. I've often changed the DST switch manually after noting it not being correct. It wasn't then overridden by the OS "knowing better".


Why should I have to "have a look"? If I know that the time zone is always the one from my hometown, I never have to look or worry about it being wrong. I'd rather calculate the time difference in my head (or take a look at my watch/phone) than "have a look".

So, I have never ever (after the OS installation) changed the time zone for any of my computers and knowing that the OS doesn't do this behind my back I've never had to look or worry about it.

But even if it were perfect, why would you want your laptop to reflect the time zone you are currently in? Shouldn't the time on your laptop be the time you are the most familiar with and that you most deal with? For your day-to-day needs you have your phone and/or watch anyway, if anything a reference to the home time would be preferable instead of having all devices say the same thing.


Citation needed yourself. For the claim that people with devices that show the time actively and largely don't want the time to reflect the location's timezone.

Phone, yes.

Laptop, yes. Times are still stored internally, and the timezone represents a localization. And if I go across the world for the weekend I see that locally, it is UTC+10, while entries in a DB are timestamped accordingly. Then I return home and I can look at the clock in my menu bar and see my local UTC-8, and all the while entries in a DB are timestamped accordingly.

I'd love to hear your use case for "if location and correct timezone can be reliably determined, it still should not be adjusted".


He's claiming that the number of people who don't want their time automatically changed on their laptops is not insignificant. You really need a citation for that?


I do. One presumes that Microsoft, Apple and other major OS developers did exhaustive UX studies, and that this - "determine my location and set the timezone representation of the clock to that local time, rather than the original" - wasn't something that a significant portion of people felt.

Why do I say this? There are very few use cases I can think of - though I'm happy to be educated otherwise - for doing this differently. In addition, Outlook and Calendar on my Mac both pick up TZ changes and automatically re-render my calendar appropriately.

I really struggle to picture a use case of "I fly from my home in Seattle back to Australia, and yet I want the clock on my menu bar to display a wall-time 18 hours behind my current location, rather than displaying the current time where my eyeballs are at, and changing back upon return".

Actually, I can think of one - the fear that poorly programmed or tested applications may handle this incorrectly. In which case I can understand this desire, even as a facet of "solving the wrong problem"/shooting the messenger.


One presumes that Microsoft, Apple and other major OS developers did exhaustive UX studies, and that this - "determine my location and set the timezone representation of the clock to that local time, rather than the original" - wasn't something that a significant portion of people felt.

Would these be the same studies that Microsoft cites to justify force-feeding Metro to desktop users?


(Shrug) It seems like a no-brainer that people would want their laptop clocks to be correct. The problem is purely technical: there's no good way to ensure that any change made is correct.


I wrote a JavaScript library that detects the standard time and daylight saving time offsets for the current time zone: https://github.com/dsimard/timezonedetect

Garry Tan, cofounder of Posterous, said of it: "Finally, timezones in javascript done right. The world has been made a better place via this fine javascript library."


I want to agree, but there are times when you have to ask for timezones, like when you want to collect and present data in discrete time units like days. If a user wants to know how many app installs they have per 'day', you have to know when the day they care about begins and ends before you start dumping data into those time-delineated buckets. It's not always the timezone their browser uses, especially not with agencies that are providing these services for clients that are somewhere else.


UTC, not GMT.


I had the same reaction. Whenever I read GMT I know instantly that the author is not a big time nut and to take the rest with a grain of salt...


UTC offset is well and good if you are talking about today, but as soon as you start displaying past or future datetimes, DST will bludgeon you over the head.

And depending on your app, users might want control over the timezone. When a teacher creates an assignment in canvas-lms, the due date defaults to midnight of the selected day. It would be surprising if it set it to 2am, just because the teacher happened to be traveling when the assignment was created.


Exactly. I have run into this problem with event calendars. For example: schedule a recurring event for 1pm. It happens to be EDT at the moment, so you store the event date/time in UTC by adding four hours. Then, in days, weeks, or months, we change to EST. The 1pm meeting is now showing at noon.

To fix this I run a cron job on the ST/DT boundaries that increments or decrements stored date/times in the event calendar.
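An alternative sketch that avoids mutating stored rows (the helper and the approach are mine, not the commenter's): keep the recurring event as wall time plus an IANA zone name, and resolve each occurrence to UTC at read time. One common trick for finding a zone's offset at a given instant uses toLocaleString, which assumes the runtime ships full timezone data:

```javascript
// Offset (in minutes, east-positive) of `timeZone` at the given instant.
// Formats the same instant as wall time in the target zone and in UTC,
// then diffs the two re-parsed dates; the machine's own zone cancels out.
function zoneOffsetMinutes(instant, timeZone) {
  var inZone = new Date(instant.toLocaleString('en-US', { timeZone: timeZone }));
  var inUtc = new Date(instant.toLocaleString('en-US', { timeZone: 'UTC' }));
  return (inZone - inUtc) / 60000;
}
```

With this, a "1pm America/New_York" event resolves to 17:00 UTC in summer and 18:00 UTC in winter without any cron job touching the database.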


> Luckily, the browser already knows what timezone the user is currently in

Incorrect (though usually true, I'll grant). I've often seen people with older Windows laptops (newer Windows variants - 7 and 8 certainly, Vista maybe - are much better at this) that are set to display UK time but think that they are in UTC-0700. This can be fun when using file synchronisation tools... If the OS doesn't know its timezone then the browser won't either. This is a moot point when you are talking to just one user, until that user fixes their clock settings of course, but it becomes a significant issue as soon as you need to coordinate timed actions between users.

Detecting the local time offset from your reference clock isn't always useful either: the machine's clock could be completely wrong so it says 01:23 where the user's wall-clock says 21:00 and your reference time matches neither.

Also some users may wish to fix your interface to a timezone other than the one currently set on the device they are currently using. Someone on holiday or on a business trip might want to keep some things linked to local time back home so they find it easier to coordinate with people there. Conversely someone currently in their home timezone might want to force your interface to match a non-local timezone for coordinating with teams/friends elsewhere in the world. You can't rely on them adjusting their clock in this circumstance as they may not be using their own machine and so might not be able to adjust the time/timezone settings (or may wish to keep them accurate to current local for other reasons).

tl;dr: Dealing with time values when there are any international concerns is harder than we tend to assume.


While we're at it, why (if I'm in the US) do I have to keep selecting my state, town etc., in addition to my zipcode?


It depends on the application that you are working on. The more you rely on geographic information the more strict you must be. So let's review why you need all that info. If you don't include the zipcode with all the rest of your information, you run into the Washington Township problem. http://en.wikipedia.org/wiki/Washington_Township

You can't infer (reliably) your exact location from a zipcode because of the reasons listed in the other comments linked to your parent comment.


Because zip codes do not equate to cities 1 to 1. Many cities have multiple zip codes, and some zip codes span multiple cities.

Zip codes are based on postal routes, cities are not.


Zipcode plus address is usually plenty though. And it's certainly good enough to obviate the tedious selection of the state from a 52-item dropdown menu.


ZIP codes can be shared among multiple towns. Quite possibly multiple states, though I can't think of an example offhand.


That's correct; via wiki -

For example, ZIP code 42223 spans Christian County, KY and Montgomery County, TN, and ZIP code 97635 spans Lake County, OR and Modoc County, CA.

Zip codes are also non-contiguous in some areas, and not bounded-polygons in others which makes them really bad for geo purposes.


It's less work for the developers, who might have more important things to do?


More important than accommodating the users, aka customers? Hmmm.


If you do localise content, make sure it's what your users want.

When it comes to Rotten Tomatoes, I'm glad it asks my location because I prefer US film reviewers to my local film reviewers. If I let Rotten Tomatoes know my time zone, I can't access US reviewers (that I'm aware of).


jstz is a great JS library that we use at my startup for automatic timezone detection http://pellepim.bitbucket.org/jstz/


A timezone is not just an offset! Proper use of timezones is unfortunately not so simple, and the browser usually doesn't provide a correct timezone name that could be used reliably (like Europe/Berlin).


It seems most people resort to storing time in UTC, but this isn't your only option: any fixed time zone will do. If you're in a corporate setting and the majority of your users are in the same time zone as your servers, then it makes sense to use that time zone. There are fewer date conversions that way. For half the year there will be basically no conversions, and for the other half it's a one-hour DST conversion, which many time-handling libraries have optimized (I'm in the United States in an area that observes DST for half the year).


>any fixed time zone will do.

Theoretically, yes. In practice, you should just use UTC.

>If you're in a corporate setting and the majority of your users are in the same time zone as your servers, then it makes sense to use that time zone.

No it doesn't. Use UTC. The Internet does not have a time zone.

>There's fewer date conversions that way.

Your library is probably "converting" the date anyway, UTC or not.

>for the other half it's a 1 hour DST conversion - which many time handling libraries have optimized

Is it really that slow to convert a date?


Storing datetimes in fixed timezones that do not observe DST will do (but it is still not recommended), but you can't say any fixed timezone will do. Say you're storing datetimes in EST/EDT and you have a global audience. On the day the US goes back to standard time, how do you know whether 1:30AM in your database is the "first" 1:30AM or the "second" 1:30AM when you need to display the event to someone in Dubai? Are you going to store some extra metadata to indicate whether DST was in effect beside the datetime? Just store UTC.
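To make the ambiguity concrete (the dates are mine, assuming America/New_York and the 2013 US fall-back on November 3): two distinct instants both read "1:30 AM" on the wall clock, so a bare local timestamp can't tell them apart, while their UTC forms stay distinct.

```javascript
// 1:30 AM EDT (UTC-4), before the clocks fall back...
var firstOneThirty = Date.UTC(2013, 10, 3, 5, 30);
// ...and 1:30 AM EST (UTC-5), one real hour later, after they do.
var secondOneThirty = Date.UTC(2013, 10, 3, 6, 30);
```

Stored as "2013-11-03 01:30" in an EST/EDT column, both collapse to the same value; stored as UTC they remain an hour apart.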


A fixed time zone doesn't shift for DST, otherwise it wouldn't be fixed - by definition. All you need is a fixed reference point. There's nothing sacrosanct about UTC.


It wasn't clear what 'fixed' was, since that isn't normal terminology. I took 'fixed' to mean "pick a single timezone and stick to it".

If you need to pick a timezone which does not observe DST, that rules out (generally speaking) most of North America and Europe and a lot of other places. So maybe you want your new time base to be Asia/Tokyo, Asia/Shanghai, Asia/Kolkata, or Europe/Moscow? You're still at the whim of governments who constantly tinker with DST -- sure, Europe/Moscow is "fixed" now, but it wasn't as recently as 2 years ago.

To each their own, but I would not get into the business of depending on any particular zone in tzdata staying as it is currently defined for any amount of time in the future.


I meant fixed as in unvarying. So we store our time data in EST (UTC-05:00), instead of UTC. For half the year the stored time matches the local time it's to be displayed in and so no conversion is required. The other half of the year when local time is UTC-04:00 we only need to shift the stored time by one hour to display it properly.

Many time handling libraries are extremely efficient at adding/subtracting an hour from time. This is important because we have many legacy systems using this data (think mainframe COBOL to today's latest technologies with a smattering of everything in between). So it's not just a single time handling library in use.


GMT? Surely they mean UTC as GMT is no longer precisely defined by the scientific community.



