From a technical point of view this "redesign" scares me a bit:
All the old stories and user accounts 404 now. That is a major loss to their SEO: 14 million pages just thrown away, passing no link juice.
They don't employ canonical URLs or a robots.txt.
They have character encoding issues. ("'Superbird' Discovered")
They use a meta keywords tag, and it contains "celebrity news", which doesn't appear anywhere on their site.
They hardcode presentation with inline style attributes on divs. They don't use text-transform, but instead type headers in all caps.
They use the HTML5 doctype, but none of the new tags or practices (like ditching the obsolete 'type="text/javascript"' on script tags).
The layout breaks with JavaScript off, and the "upcoming stories" section doesn't get loaded. They provide no warning as to why the site's functionality stops working.
They don't combine or compress resources like CSS and Javascript.
They left debugging statements and TODOs in their JavaScript.
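To make a few of those points concrete, here is a sketch of the kind of head markup the list is about. The URLs and filenames are invented for illustration, not taken from Digg's actual source:

```html
<head>
  <!-- Declaring the encoding up front avoids mojibake like the
       "Superbird" headline mentioned above -->
  <meta charset="utf-8">

  <!-- A canonical link tells search engines which URL is the real
       one, so duplicate or legacy URLs can still pass their juice -->
  <link rel="canonical" href="http://digg.com/news/example-story">

  <!-- HTML5: no type="text/javascript" needed; serve one combined,
       minified file instead of many separate scripts -->
  <script src="/static/js/app.min.js"></script>
</head>
```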
I see this bandied about on reviews of various sites, and I continue to believe it doesn't matter. If you have Javascript turned off and a page doesn't load, you should know why that page isn't loading. It's because you deliberately broke it for yourself.
You are free to believe that. But let us say I want to cater both to you and to another user that browses the web with NoScript (assume it is my goal to cater to as many users as possible). I have two options: A) make the layout accessible without JavaScript B) break the layout without JavaScript. Only with option A do I cater to both of you -- Regardless of deliberate breaking or bandied opinions ("Thou shalt never mix style and script").
>assume it is my goal to cater to as many users as possible
The thing is, it's not your goal to cater to as many users as possible. It's your goal to optimize your ROI on development time. The reason we have to have laws about accessibility is that your average fast food place knows that wheelchair-bound hamburger sales will never make back the cost of constructing a wheelchair ramp. The economics of the situation wouldn't provide for disabled people, so we have to have regulation to make sure they get taken care of.
(And before anyone jumps on that and misinterprets me, I am not comparing disabilities to turning off JavaScript. I'm simply showing by example that it is not good business to try to grab every possible customer at any cost.)
Indeed, I think your accessibility argument is spot-on. The same issues apply for making websites accessible. Thankfully, more often than not, it behooves a site to follow accessibility, because it falls in line with semantic web design, which in turn plays better with Google. Google search, in a way, is the regulatory body to keep sites semantic and accessible.
That's true, and at some point, I assume they'll make the site play well with noscript, but they've written the new version of the site from the ground up in six weeks with ten people. I'll give them some time to get things right.
I'm tired of having this argument over and over again, so let's approach this Javascript issue from a different angle:
It is bad practice to couple content (the "what") with functionality (the "how"). The content itself should not depend on how it is displayed. From an engineering standpoint, this makes your code unmanageable and unmaintainable in the long run. These stupid shortcuts of forcing users to have JavaScript on just to display content cause me only minor inconvenience, but they will cause you much more harm when maintenance or upgrade time comes.
But neither you nor anyone else on the JS bandwagon needs to listen to me - I'll "temporarily allow" only the minimal amount of scripts on your page to see what I want to see, and I will move on. You will have to live with this code through the next update cycle.
That's a rather egregious false dichotomy. You can separate your content completely from your JavaScript code, but that does not mean that your content is formatted as plain HTML. In fact, I would argue that storing your content within a page's HTML structure while separating content and functionality is exceedingly difficult, and that if you want to maintain that separation of concerns, you need to store your data in a totally separate format, and then write code that generates the HTML structure based on that format.
Then, taking that HTML and enhancing it with JavaScript (aka progressive enhancement) is also a violation of separation of concerns, and I would argue against it in any but the very simplest of cases. Instead, you should use JavaScript to operate on the raw data and display it to the user. You can then also write server-side code that generates a plain-HTML rendering of that same content, but that is more work, and that means it's a trade-off.
And I think in many cases that trade-off is worthwhile. But this idea that it somehow saves you time in the long run and doesn't end up costing you is pure fantasy.
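To make that concrete, here is a minimal sketch of the second approach, with hypothetical story fields (not Digg's actual schema): the content lives as plain data, and a small render layer turns it into markup.

```javascript
// Content as plain data, fully separated from presentation.
// The fields here are hypothetical, not Digg's actual schema.
const stories = [
  { title: "Digg relaunches", url: "http://digg.com/", diggs: 42 },
];

// The only layer that knows about HTML. The same function (or a
// server-side twin) can produce the no-JavaScript fallback page.
function renderStory(story) {
  return '<article>' +
         '<a href="' + story.url + '">' + story.title + '</a>' +
         '<span class="diggs">' + story.diggs + '</span>' +
         '</article>';
}

const html = stories.map(renderStory).join("\n");
```

Whether the server-side twin is worth writing is exactly the trade-off described above.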
>But you or everyone else on the JS bandwagon
God, that's annoying. My pointing out a flaw in a particular argument does not imply that I am "on the bandwagon" opposing that argument. I think that making your content accessible without JavaScript is a good idea whenever possible, but I do not think it is useful to pretend that you can do so without trade-offs. I don't have to agree with an argument just because I agree with its conclusion.
There are certain design elements that translate into a single line of JS, but require a couple of pages worth of CSS voodoo. Guess what's more maintainable.
You brought up the maintenance angle, and the response to that is that in some cases it's far more practical to use JS for styling than to rely purely on CSS. I gave an example below - a sticky page footer - do show me how to do it within the confines of your abstraction model.
Abstracting things for the sake of abstracting is a rather naive approach. It's a good starting point, but some abstractions complicate things way beyond what's needed.
While I agree that exceptions are needed in very rare cases, exceptions are becoming the norm and, like I said, are coupling functionality and content. This is why CSS was created, to decouple style from content, and now? We are throwing all that work away because an entire generation of web devs is just too lazy to look it up.
What if the "what" goes hand in hand with the "how"? A distinctive layout could be of more value than the content, especially when it comes to social media (where the content is roughly the same everywhere you go). What's better, a site that doesn't display until you enable JS (then it displays perfectly) or a site that works without JS, but looks awful and drives away users? If you can't understand that the site doesn't load because you have JS disabled, you won't understand that the site looks like shit for the same reason.
You realize many countries, most importantly the U.S., have laws regarding making web sites accessible? Section 508 requires that they work with JavaScript off. This applies mostly to government sites, but it was also used in a case against Target, which Target lost, and they were forced to make their website more accessible.
Target's situation was a bit of a special case, however, as it has a brick and mortar store, which the court found their website was essentially a component of. If your business is completely web-based, the question of whether section 508 applies to you is, as far as I know, untested.
The problem with NoScript is that it intentionally breaks the intended functionality of a website. I just went into the Firefox element inspector and deleted the code for the list of links on the HN homepage. Now they're gone! How dare PG not cater to my will to randomly disable parts of the website?
You're free to browse the web with parts of it turned off. But you should expect that parts of it might not work, and if they don't work it should be immediately obvious why. Could Digg make it fall back? Yes. Should they? Only if they really want to. It seems they don't really want to. No one wants to cater to IE6 users, either.
No, the web is constructed of separate markup, stylesheets, and code for good reasons. If you prefer a tangled, dependent mess enjoy the results.
The advantages to a separation of concerns and the security risks of javascript are real. Otherwise, why not just go back to the bad old days of single, giant pages containing everything?
> You're free to browse the web with parts of it turned off. But you should expect that parts of it might not work, and if they don't work it should be immediately obvious why.
Yes, it should be immediately obvious, but often it isn't, and that's my biggest complaint with sites that rely on scripting for basic behavior.
Too many times I've filled in a form, hit the Submit button, only to find that proper form submission does not work without scripting. Nothing on the page warned me, nothing stopped me from filling in the form, even though there are snake-simple ways to tell no-scripters of missing behavior, or to simply prevent bad behavior when scripting is turned off.
(The other complaint I have is why sites rely on scripting to do things that are baked into HTML. For example, using divs as text areas instead of, say, using a textarea as a text area.)
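For what it's worth, one of those snake-simple ways is just a <noscript> warning inside the form itself; the markup here is illustrative:

```html
<form action="/comments" method="post">
  <noscript>
    <!-- Shown only when scripting is off: warn before the user
         invests time filling in fields that will not submit -->
    <p><strong>Heads up:</strong> submitting this form requires
       JavaScript. Please enable it for this site first.</p>
  </noscript>
  <textarea name="comment"></textarea>
  <input type="submit" value="Post comment">
</form>
```

A stricter variant disables the submit button in the markup and re-enables it from the page's script, so no-script users cannot even attempt a submission that is doomed to fail.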
Your final note is accurate, yes. But honestly, why should a web developer have to account for your niche-case decisions? Javascript is a perfectly valid language, and it's perfectly valid to assume that Javascript works in a browser since... it does. Normally. Why should I or anyone else have to add code specifically to tell users of NoScript that they need to be reasonable? You don't have to tell users of Mosaic that they need to upgrade to a browser made in the last 15 years. It's pretty obvious at this point.
I use AdBlock and Ghostery to disable ads and Facebook buttons. But when I get to a sign in page and there's no links, I know that it's a Facebook login button. When I watch a video on Hulu and it doesn't load, I know that AdBlock is preventing me from seeing that show. But I don't go on forums and complain that developers should cater to me breaking their websites. I say to myself "oh hey, I've made a choice that broke this site. Let me take 5 seconds to fix it myself."
"Your site doesn't work with NoScript" means the same thing to me as "they don't speak German at the Chicago McDonald's". I mean, that sucks for you, but honestly do you ever step back and wonder why that is?
> "Your site doesn't work with NoScript" means the same thing to me as "they don't speak German at the Chicago McDonald's".
Not by a long shot. More like a sign in Braille explaining how to get further assistance if needed, because not every part of the place may be accessible to the sightless.
> Why should I or anyone else have to add code specifically to tell users of NoScript that they need to be reasonable?
Because so many sites use unreasonable scripting. NoScript is a reasonable defensive move because of too much aggressive scripting.
Honestly, do you ever step back and wonder why NoScript even exists?
Bottom line, though, is that nobody has to do much of anything for anyone, and it's entirely up to the site owner to decide how they treat people with NoScript and whether it's worth adding an additional 20 characters or so of boilerplate text to a page.
If nothing else it's a courtesy.
It's just like smiling at people you deal with in stores or holding doors open for strangers. Do it, don't do it, whatever you think is proper. Make the world you want to live in.
I get that my choices are niche, and that's OK. I can live with the downsides. There are very, very few sites I can't simply close a browser on if I don't care for how it's presented for me. No one loses any sleep.
I suspect most people with NoScript use default blocking and whitelist sites as needed. So if I get a clear sign that I need scripting before I start doing anything, and it looks worth it, it's easy to allow the domain. That doesn't require any real effort from a developer, and if that's considered catering then we're all screwed.
Catering to NoScript users is dead simple - show them a broken website and they will whitelist it.
Seriously though, the NoScript user base is - and let me be really blunt about it - a vocal minority with a bloated sense of entitlement. Catering to them makes no practical sense. There are obviously edge cases, but the best option is to put up a note that JS is required. If Digg doesn't have one, they should add it, but beyond that it's money wasted.
I see it the opposite - website designers have decided that they are entitled to decide what my keys and mouse clicks do, among other immensely annoying things.
Well, there you go - a sense of entitlement. You are visiting other people's virtual property and yet you have a nerve to demand how they should be serving you their content. Do you not realize how ridiculous it sounds?
I bet you're less than 1% of digg's traffic and you just have to disable noscript to be able to browse it. It's not like you're stuck on IE because of your job.
I was pointing out that from my perspective, the entitlement of web developers has gotten so out of control that disabling JavaScript is the best way of making the general web tolerable to browse.
IE has to be specifically coded for because it doesn't support modern web design fundamentals. NoScript users have to be specifically coded for because of the same reason.
How often do you run into a website with Javascript that breaks your experience? Would it not be better to have a blacklist than a whitelist?
When I run into a website that breaks my experience, it slows down my computer, crashes the browser and takes away the comment I was about to write. No, thanks.
So then we're back to the previous comment: what does IE have anything to do with this? If your browser crashes because of a misbehaving script... I'm not saying you're using IE, but you're surely not using a browser that fits your needs. Chrome simply crashes the tab, and Firefox hangs for a few seconds until it asks if you want to stop the script.
What you're looking for is browser sandboxing, and it exists.
If you need code to display a text document, you're doing it wrong.
There is a philosophy (with some merit) of worse is better.
So, let's be honest... you're promoting sloppy coupling in order to save developer time. The problem is that the technical debt will have to be paid back in the future.
This is a silly argument. You could just as easily put a typo in your Apache/nginx config and nuke your site. That has nothing to do with degrading gracefully with respect to JavaScript.
Not worth keeping your brand looking right? This is not about code; this is about their brand and how they portray themselves. If people see a broken website (for any reason, be it cookies, JS, or whatever), they will think less of the brand.
Like you said, "...you compare the cost of the effort against the value added". A fallback does not add anything to the brand, but not having one takes away from it. And given Digg's current reputation, they need all they can get.
This is what I do. Most of my stuff degrades gracefully, but for the few things which can't (JS demos/game things), I generally do something like this:
<noscript>example.com is an HTML5/ES5/CSS3 3D graphics demo. Unfortunately, your user-agent doesn't seem to support ES5, sorry about that.</noscript>
Or you could phrase it without all the academically correct technical buzzwords, so the normal people who are your real audience will understand what you're trying to communicate. Like using "JavaScript" instead of "ES5" and "web browser" instead of "user agent".
> Or you could phrase it without all the academically correct technical buzzwords, so normal people who are your real audience will understand what you're trying to communicate.
Normal people who are my real audience probably don't even know how to disable javascript, and most likely would enable it with no fuss if something looked broken.
That's not entirely true, there are hundreds of millions, if not billions, of people who use the Web only through a phone, usually a featurephone, and mobile phone Javascript is broken or nonexistent on many of these platforms. These are lower-revenue users, for sure, but there are a LOT of them.
In that situation, wouldn't it make more sense to show them a completely different site based on browser id? Not just one that doesn't rely on Javascript, but has a completely different layout entirely?
I would still question how many of the millions or billions of users who view the web only on a non-smartphone will be using social media like Digg.
I know from firsthand experience, ignoring Digg for the moment, that there are at least 500 million people who use a featurephone in India to check news, cricket scores, horoscopes, make payments, send emails, and send text messages. Their threshold for device frustration is much higher than mine.
Apart from all the old pages being 404s (which is a huge mishap), nitpicking on HTML is a fun and amusing pastime, but unless the page is deformed in some way, it falls into the "shit nobody cares about" category.
It falls into the "shit that will bite you in the ass one day" category. It's about engineering quality, and it goes all the way down to the stuff we don't see.
Given Digg's history, it's a bit of a red flag. However, it all depends on how this technical debt is managed. If this is an alpha release under the heading "release early, release often" and the tech debt is known, there's nothing alarming about it. But if this is truly considered a final product, then it makes you question whether these people are capable of reviving Digg.
Either way, some of these issues suggest simply a lack of skill/knowledge rather than deliberate tech debt. That's not a good sign.
This is an MVP. The team is (apparently, from what I've read) doing quick iteration and fast cycles, and this is an MVP.
In fact, I think these nitpicks are an excellent sign for digg. They discarded anything which was useless for their launch. Very well done - an excellent MVP.
To get a website up and running, most of what I listed does indeed fall in the 20% category (80/20).
But if you care about accessibility, usability and SEO you should probably care about HTML issues like pagespeed and validation.
Web Content Accessibility Guidelines 1.0: "Create documents that validate to published formal grammars."

http://www.useit.com/alertbox/response-times.html : "Users still hate slow sites and don't hesitate telling us."

Google Webmaster Guidelines (Design & Content): "Check for broken links and correct HTML."
So perhaps it falls into the "shit nobody but front-end engineers should care about" category.
That's not the point. Why rush the release when you know that historically Digg had problems with losing people because they severely botched a redesign?
Didn't Digg at one point purge their comments, and wasn't there a huge uproar? The new management shouldn't have gone online without the migrated data, or at the very least, the user profiles.
Was the data included in the sale?
I am looking at the login page and it seems to require Facebook authentication, without even an option to use my old Digg.com account. The FAQ implies that eventually the old data will return.
I'm starting to feel like they decided to release a work-in-progress to capitalize on the wave of press they've received due to the sale. Most people had high hopes for the new management, and releasing early with a product that is less than half finished really smells.
The data was bundled with the domain sale, thankfully. It's a shitty move that they took it offline for an unspecified duration, and that they don't plan to handle existing URLs.
That's completely insane. A new design is one thing, but there needs to be a clear and concise reason to break away from URLs that have been indexed, and are likely huge in page rankings.
I would agree if the criticisms were super nitpicky and obscure, but honestly, any site a front-end professional made in a day or a week would not include many of these errors.
These are pretty strong signs pointing to the fact that at least one person coding this site is very much an amateur. Which is a pretty serious implication for the future.
I was listing stuff that scared me a bit as I browsed the source, not stuff to criticize the website with. As for criticism: I think the design and social integration are fresh and well executed. I make websites that contain more errors and look worse than that. It is more of a web application than a web site.
I listed microformats because, as a foundation for a modern website, they are a great idea. Digg has the traffic and authority to make rich mark-up work. And I think adding stuff like rel="me" to your social profile links is kind of standard.
It was certainly not a prioritized list of what to work on next. That would likely be more like: Deploy descriptive 404 page, Get old content redirected to new location, Comment functionality, non-Facebook log-in option, categories, option for a non-masonry list view, work on page speed.
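The "get old content redirected" item is mostly a URL-mapping exercise. A hypothetical sketch (the path patterns here are invented; the real mapping would have to come from Digg's actual old and new URL schemes):

```javascript
// Map an old story path to its new location, so the server can
// issue a 301 and preserve existing links and search rankings.
function redirectTarget(oldPath) {
  // Hypothetical old scheme: /news/<slug> moved to /story/<slug>
  const story = oldPath.match(/^\/news\/(.+)$/);
  if (story) {
    return "/story/" + story[1];
  }
  // No mapping known: the server should show a descriptive 404
  // page rather than a bare error.
  return null;
}
```

So `redirectTarget("/news/some-old-story")` yields `"/story/some-old-story"`, and anything unmapped falls through to the descriptive 404 page.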
Add to that that the old RSS feeds are broken, so all those RSS reader apps have stopped working with Digg.
I also really don't understand the decision behind breaking all the older stories; it seems ludicrous not to take advantage of them.
Even worse, you now have to log in with Facebook! They did not bring back the old accounts, which seems like a very frustrating issue. Unless they want a completely new userbase...
I think they just threw the last remaining digger to reddit.
What about my data from the old Digg?
We believe that users own their data. We’re working on a system that will extract all user data from the old Digg infrastructure. In August we’re launching an archive website for users of the old Digg to find, browse, and share a history of their submissions, diggs, and comments.
from http://www.digg.com/faq
Thing is, is it worth it to relaunch without user accounts and without a proper login solution?
It seems to me you are really looking at losing the users that were on the site for a long time; it's not like Digg and Facebook really go hand in hand...
When I landed on the page I thought "Nice, they made a responsive layout, like bostonglobe.com." So I resized the window, and... not. It just happened that my window was originally sized to their fixed width.
From what I've read, their turnaround was pretty fast, so I'm not surprised by this. I suppose this was a "hack it out, get it out, refactor later maybe" type of job.
Unfortunately, that ends up being the standard, with the "refactor later maybe" staying a maybe.
Seriously, the only time I've seen the "refactor later maybe" style work is when there are devs brave enough to ignore deadlines and use the weakest of excuses to refactor a class or system.
They say they know it's not perfect. But if you cut corners like using style attributes, you'll never get the code into a proper production-ready state. New features will always win over cleanups.
Start right; it's really no extra effort to use CSS classes over style attributes anymore. It is in fact quicker.
Just curious why you think they should use text-transform: uppercase instead of just typing in all caps. If you localize your site, text-transform: uppercase can render some things meaningless in other languages, or just cause issues in general with some languages (Turkish i, certain Greek accented letters).
In short: The all-caps serve a visual function, not a semantic one. It is akin to using linebreaks ("<br>") to create a visual margin.
If you write "... arrested by the FBI" as "... ARRESTED BY THE FBI." you lose information (which screen readers could use for better pronunciation). If you style it with CSS, this information (capitalization, acronyms, stress words) is retained.
I didn't know about the issues with some foreign languages.
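For reference, the CSS version of this looks something like the following. The selector is illustrative, and whether a given locale needs the opt-out depends on how well the browser's case mapping handles it:

```css
/* Markup keeps "arrested by the FBI"; only the rendering is
   uppercased, so screen readers and copy/paste see the original. */
.story-title {
  text-transform: uppercase;
}

/* Opt out where the uppercase mapping is risky, e.g. the Turkish
   dotted/dotless i. */
.story-title:lang(tr) {
  text-transform: none;
}
```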
Seriously, it's been what, a month since they started working? They wanted to ship, and ship fast. It's not ideal (no software is ever ideal) but it works. I'm happy they managed to release something after just a month of work. They will now get the feedback and see how users are welcoming the new version.
Their model of going in lean, quickly deploying, and quickly iterating, all the while doing so with good communication and transparency is something I'm surprised is getting as much flak as it is from HN of all places...
It's almost like they don't care about all the factors that matter when doing a major relaunch of a site; it reminds me of the lack of care in the big initial design revamp that caused the exodus of Digg users, including myself.
A UI that looks good doesn't mean it functions well.
OT, but this seems like an appropriate place to ask.
I'm a competent computer programmer who hasn't done any front-end web development since 1999. Does anyone have a link to a good explanation of how to avoid these "obvious" mistakes?
Most of what I've found via google has been along the lines of "put interesting content above the fold" and "have consistent site navigation area," whereas I'm more interested in character encoding, where and how to specify CSS, when to use span vs div, and that sort of thing.
Personally, and I'm probably in a minority, I hate these huge, slabby, infinitely scrolling displays of photos/images with precious few items on the screen. It's a lot of work to try to find any sensible number of items of interest.
I seriously prefer an interface like HN - spare, clean, and information rich.
It's great for pictures. In fact, it's better than great for pictures - you can digest thumbnails dozens at a time, and decide what you really want to look at.
But for textual content, I agree with you. They're rubbish. Let me read the headlines, middle-click links, and leave it at that.
I'd like something in between myself, maybe more like Slashdot but with better summaries. Or shorter ones, even, but more than the title. The problem I have with HN's titles-only index page is that there's far too much emphasis on the title as attention-grabber, which tends to favor either linkbaity things, or else stuff that is just relatively simple to state ("Company A does thing B", "Why you should use C"). In particular it disfavors things that need any context to explain, even 2 sentences of context: it has to be immediately clear from the title why you should click.
Agreed. It works with sites like snip.it because no one post is more important than the other, but in Digg's case, you want people landing on the page and knowing right away what they should be looking at.
This is great. The large scrolling format (similar to pinterest) is a good idea as the last thing that was needed was something that looked like reddit or HN. Sites like the Daily Mail have been very successful with this style.
I'm not a fan. To me, this looks like a magazine, not social news. It looks curated, like I'm not meant to interact with it, but merely consume. I closed the tab pretty quickly, and I guess I can put to rest any notion of ever returning to Digg.
I have to agree, I like it. While they certainly have their work cut out for them on re-building the community and on the technical front, this redesign shows a willingness to be different from the other news aggregators. It's popular to hate on digg, but I wish them the best of luck!
It is impressive how much hate and negativity the comments include. Sometimes we forget that while traffic keeps up, community quality can easily degenerate. They will also have to put a lot of effort into rebuilding the community.
This again shows me how hard and impressive the task they picked is. It's social software. It's not only rewriting the software part but also rewiring the social part.
Truly inspiring!
Quite frankly, if it upsets all those people going "Facebook login?! I'M NEVER COMING BACK" and they really don't come back, I'd probably be more interested in using Digg again.
No, but the people who rail against what they've said is a temporary measure with all the reasonableness of a bull smashing a china shop aren't the sort of community I'd be pleased to participate in.
These comments are potentially made by people who have been using Digg for many years and have continued to be loyal in spite of v4. Imagine yourself in their position when one day you visit your favourite site and everything is gone.
OK, I fire up the site. I see 6 big pictures, 6 links to actual content, and minimal navigation. A whole screen and only 6 bits of actual content? It's like a children's book. And more and more sites are going for this style. Is this the web now? The BBC Olympic site has gone this route and it's awful to navigate.
Why is this the new way to go? IMHO, it is terrible.
Look at this site: a simple and clear list of links to stories and associated comments. Simple, easy, and lots of content on one screen; in my case, 27 linked stories. Please don't ever let HN go this way. I don't want big pretty pictures taking up my screen when I'm trying to look for written stories to read. I am not a child.
The team had to make drastic changes. They wanted people to forget about the old Digg and its fiasco. The best way to do it? Become completely different. They also needed to become different from their rivals.
That's why I don't understand why people here are complaining about the new format. I personally find it great and novel. The clean design, with only six stories in sight, allows you to focus and actually pay more attention to each of the stories.
When I see 30 stories clumped into one page, I run through their titles like a madman, often reading just 50% of the words. Having just 6 stories eliminates the haste; it allows me to relax and slowly study all the stories before deciding if I want to read them further.
I like the new Digg. I think the team made the right move with the new format. Now all they need is really good content on the front page and users will come. I know I will check it again tomorrow.
No, they absolutely didn't. All the team needed to do was put a fresh coat of paint on Digg, make it similar to what it was in the beginning, and capitalize on the traffic before making any drastic changes.
Digg's whole problem started when they tried to become the Twitter of news. That's not the original intent of the site. If you purchased all that data and all that SEO linking, you'd have to be absolutely crazy not to migrate at least the stories into your new design.
So, put a fresh coat of paint on a product that failed miserably? Why is that a good idea?
They didn't pay for the data and SEO linking, they paid for the name and buzz in the media. In my opinion they are not using the old data on purpose, they don't want to have anything in common with the v4 at all. It's not modified Digg, it's a completely new Digg. This is how I'm reading their message.
I'll be honest, I'm surprised by how much I like it. For a quick "give me something - anything! - to read over lunch" it serves the purpose for me very well.
Same here. Looks like a good front page for general news, whereas something like reddit mostly shows it benefits when you get actively into the subreddits.
I don't mind the whole Facebook sign-up process. They have a justification for it. But then when they ask whether they can start posting to my feed, it makes me feel like they were fibbing to me the whole time!
They just want free advertising via my social feed. Pft.
I must admit I haven't even visited digg in over four years, but wasn't it originally intended to feature user-submitted content? The front page pretty much screams the exact opposite to me. I instantly pictured a bunch of hired goons sitting in an office, submitting content in return for their weekly pay.
The headlines, photos, and layout remind me of Ars Technica without the original content.
There's far too much scrolling for far too little content.
This just wont be able to compete with reddit in its current format. People who used Digg before wont come back to Digg because this Digg isn't what they want.
Also, as I have the tab open, I have a notification at the bottom of the screen:
"We're hard at work on finding new stories for you... Try refreshing in a few minutes."
Things I wanted to do but couldn't/couldn't immediately figure out:
1. Comment. What is the point of an aggregator without comments?
2. Go to a specific category. Are those things above the titles categories? Why aren't they clickable? WTF?
Their argument is that commentary on the original Digg was essentially the first of its kind, whereas now commentary happens "everywhere", so they need to spend time aggregating commentary too. Not sure how they can do this effectively, but presumably it will appear over the next few releases.
Honestly, it's like releasing a car without wheels. Comments are the bread and butter of social news sites. I'm shocked and horrified that they omitted comments. It's a glorified Pinterest at the moment.
I'm really disappointed - I really hoped they had learned some lessons.
Comments are where community forms.
Without community, you have nothing but a shiny, editor-controlled front page. Congrats, CNN 2.0 (No wait, even CNN allows comments...)
This thread makes me want to gouge my eyes out with a toothpick. The nitpicking is outrageous. Think about your last project with a tight deadline. OMG, a bad meta tag made it in! Should we delay the release? If you are running a business, not a personal site, the answer should ALWAYS be no. This is a website, not shrink-wrapped software. It can be updated a hundred times a day. Meta tags can be changed. It's OK to roll out a site that isn't 100% where you want it to be. I, for one, welcome the new Digg and look forward to them improving their meta tags.
If anyone from Digg is reading this, please make #site-header-container 2px wider. The right edge of the header is bothering the crap out of me.
Other than that, great job new Digg team!
Edit: after a few refreshes and zooming in/out, the header appears to be centered correctly. It seems to be offset when I resize my window and/or at other random times. The header CSS is kinda janky, but who cares; you guys did great for a six-week build from scratch. :)
The new Digg doesn't look terrible, by any stretch. But - the developers' insistence on an arbitrary six week push deadline when there clearly remain unsolved problems is fairly amateurish and reeks of self-indulgence.
Users /with existing expectations/ (which Digg inevitably has) do not care about "release early, release often." They want something that kicks ass and gets it right on the first try.
Kind of disagree: if users are still on the site to voice opinions, that means they are pretty resilient. They aren't going to give up right away just because the first release doesn't have everything.
I find the look interestingly different to what I expected. The loss of colour is brave (given the Digg green) but is at least a clean break, which is perhaps what's needed.
It would be nice to have a small bar at the top that, when clicked, would show the most popular and upcoming stories, rather than having to scroll to the bottom (although I'd keep the material at the bottom as well, for those who don't notice the bar).
The Popular section looks too big for my liking.
Of course the big thing is the content. How relevant is it? Sadly, at face value, it looks like it fails (but then again it would, as the content will be the same as v4's).
In conclusion it looks nice and is a welcome break, but the content doesn't make me want to sign up. I do wish the new Digg team the best of luck though, and will check back in a while to see if the content's improved.
Design-wise, it's clean and lacks personality, which I presume is the point: impartiality.
This makes me think of the phrase 'go big or go home.' This is not a big enough departure from the old Digg to really warrant a whole rewrite and the loss of the original posts.
To be honest, it seems only a step away from traditional newspaper websites, which at least have the benefit of a zero-hour news feed. They need to inject yet more community back into this.
As someone who's spent a decent amount of time leading the redesign of a newspaper site, I remember putting far more innovation into a traditional newspaper site than this shows.
Digg, Reddit, and Hacker News are easy to browse for a reason.
I like the redesign, but I don't think it fits with social news Digg-style. When I go to a site like reddit or Digg, I know I'm not going to like every article posted. Mostly, I want to find what is interesting to me as fast as possible. Sites with large images and scattered, newspaper-style layouts make it hard for me to find what I want. This layout would work on a focused blog where I know that I am interested in most of the articles, but not on a mass news aggregator.
I opened the page, was overwhelmed with images, scrolled down a bit, saw that the information overload continued, and closed it.
When I open it, I have no idea what I am supposed to do. Information dense pictures are being shown to me across two dimensions, my eyes have no clue what flow to follow.
Newspapers have had hundreds of years to learn how to make these sorts of layouts work; it seems like Digg said "screw that, we're just going to shove it all out there!"
I'm surprised to discover I actually like the layout. Mainly because, with sites like Reddit, much of the time the content is just an image, and you're just interested in the top couple of comments. If they can put this content straight onto the main page, it's actually more efficient to consume.
I hope they implement threaded, votable comments soon, though -- the community discussion is the main thing for a lot of people!
Seems good; I like the layout. I can already see myself visiting for a daily dose of news. Reading through the comments on the blog makes me think they are going to have problems with the community: lots of complaints about the use of Facebook login, even after Digg explained it was a temporary system so they could launch more quickly (they are working on a system of their own).
The design is quite ok for me, but what about the provided value?
I skimmed through the links, and I had already seen many of them on the other social sites I visit (HN, Techmeme and reddit). What would draw me to Digg is a community of great commenters, like the one on Slashdot, which would add value by providing insightful/funny commentary. Without that, it's a bit redundant to me.
Personally, I don't think that looking like a newspaper/magazine and adopting a Techmeme-like share mechanism is a significant enough change to meet what the market needs.
Looks like http://www.likabl.es and http://dropula.com. I still prefer the Reddit/HN style layout. Feels like reading a magazine (no sense of rank) now rather than a list of ranked articles.
It looks like a plain version of yahoo. What is their goal here?
If I were Digg, I'd just hire a bunch of engineers and let them put out a new product under the old name, something not directly related to their past. Though I would keep the archives online for SEO.
I would point to Best Buy as a prime example of this "release now, worry about the flaws later" philosophy. Their website has been a mess for years mostly because of this very approach.
It's an old adage, but you should really do it right the first time.
I'm personally not a fan of this layout. It looks like Pinterest. That's fine if you have some time to spare and are looking for pretty images, like most Pinterest users, but for power users it's too much distraction.
"502 Bad Gateway". I see the HTTP standard has already captured everything you need to know about digg's relationship to the internet in one succinct phrase.
From a technical point of view, there are more issues: the site tries to load content from a discontinued website, there are a lot of empty placeholder divs for JavaScript to load content into, and templates (TPLs) are also stored inside the page contents.
URLs are not properly encoded (a literal space instead of %20).
Links to the same article appear both with 'target="_blank"' and without it.
They don't use microformats or schema.org.
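For what it's worth, the URL-encoding point is trivial to fix; the browser ships the tools. A minimal JavaScript sketch (the URL here is made up for illustration, not taken from Digg's markup):

```javascript
// encodeURI escapes characters that are illegal in a full URL (such as
// spaces) while leaving the URL's structure (scheme, slashes, query
// delimiters) intact.
const raw = "http://example.com/story/new digg review";
console.log(encodeURI(raw)); // "http://example.com/story/new%20digg%20review"

// encodeURIComponent is the stricter variant, meant for individual path
// segments or query-string values; it also escapes "/", "?", "&", etc.
console.log(encodeURIComponent("new digg review")); // "new%20digg%20review"
```

Use encodeURIComponent when building a URL out of pieces (e.g. a story title going into a path or query string), and encodeURI only as a last-resort pass over an already-assembled URL.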