Hacker News
SvN: Design Decisions: The new Highrise signup chart (37signals.com)
54 points by sjs382 on Dec 30, 2008 | 11 comments



Once again, more wild-west driven design from Jason Fried. He may be a great gunslinger, but walking into town unannounced and starting to shoot isn't always the best way to clean up the home front.

What I mean to say is that while Jason is damn good at clean, usable, good-looking design, he is also very "gut-feeling" driven. Though some people -do- have killer instincts, you should always "trust, but verify". A little unbiased data collection thrown behind the redesign wouldn't hurt.


Yeah, I think that testing a lot of the ideas they threw out would be worthwhile, but that might take more than 4 days a week. ;-)

And, honestly-- they're keepin' it Real. Focus on low-hanging fruit, don't work too much, work on the stuff that you WANT to work on. They could instrument the hell out of it with A/B testing or they could move on to other products/ideas.

They're probably leaving a few bucks on the table by gunslinging, but I bet they enjoy their jobs more than if they went the other direction.


Oh no, collecting data is so dull and boring. I'm a decider!

(Sorry, your comment really bugs me. I don't actually adhere to running A/B testing for everything, but come on, looking at analytics data doesn't make my site any less "fun".)


He's also very goal-oriented. Keeping focus on the goals of a project is something lots of designers (and people who hire designers) seem to forget.


FYI from the comments:

"JF: While the lack of a free plan lead to increased paid signups, we decided we’d bring it back because we’d be missing out on a lot of upgrades from free -> pay. That’s a lucrative pay path for us."


I've done some reading on split testing, where you show two different designs and analyze the results over a period of time to determine which design performs the best. Has anyone here done anything like this and if so can you elaborate on your findings? If not, I'm going to try it with a new product we're launching in about 2 weeks (hopefully!)
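If you do try it, the first piece is just splitting traffic consistently. A minimal sketch of that bucketing step (the function name and the choice of hashing are my own illustration, not any particular tool's API):

```python
import hashlib

def assign_variant(user_id: str, variants=("A", "B")) -> str:
    """Deterministically bucket a visitor into a design variant.

    Hashing the user id (rather than picking randomly on each request)
    keeps the same visitor in the same variant across page loads,
    so their behavior is attributed to one design only.
    """
    digest = hashlib.md5(user_id.encode("utf-8")).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]
```

Then you log signups per variant and compare rates once you have enough traffic.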


Anyone who considers themselves a professional designer or user experience expert has done this. Google Website Optimizer makes it easy for anyone to do, and there are more advanced systems like Omniture's offerings that provide more options and more data on the results.

This is how a lot of "best practices" are determined. I know I've found that changing button colors or icons can increase conversion rates on email newsletters, and changing a sign-up process can vastly decrease the number of people who bail out at some point. Testing designs is very important.
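For deciding whether a conversion-rate difference like that is real rather than noise, the standard tool is a two-proportion z-test. A minimal sketch (my own hand-rolled version, assuming hypothetical counts; tools like Google Website Optimizer do this math for you):

```python
import math

def z_test_two_proportions(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test for a split test.

    conv_*: number of conversions per variant
    n_*:    number of visitors per variant
    Returns the z statistic; |z| > 1.96 means the difference is
    significant at roughly the 95% confidence level.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # rate under "no difference"
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# e.g. 100/1000 signups on design A vs. 150/1000 on design B
z = z_test_two_proportions(100, 1000, 150, 1000)
```

With numbers like those, z comes out well above 1.96, so you'd keep design B; with a smaller gap you'd keep collecting data.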

I am guessing that if 37signals' conversion rate dropped significantly, they'd redesign the page again - it's kind of like a rudimentary form of split testing :)


This is meant as a comment about personality types, not in any way to disparage the blog post:

Considering:

1. My unusually high level of apathy over the discussion.

2. The level of detail others here care about it in.

3. My own lack of skills in web design.

I'm definitely going to have to look into hiring a right-brained designer for any web startup I do in the future :-)


They should have done A/B testing with all these different designs, no?


I think all of the designs look nice. If I were interested in that product, I'd buy it from a page that looked like any of those.

It would be nice to have some data on what actual users thought, though.


While I like this change (it is more pleasing), it really just looks like a stylesheet change with a light, touch-and-go markup alteration. That's not to say it's a bad thing.



