Once again, more wild-west driven design from Jason Fried. He may be a great gunslinger, but walking into town unannounced and starting to shoot isn't always the best way to clean up the home front.
What I mean to say is that while Jason is damn good at clean, usable, good-looking design, he is also very "gut-feeling" driven. Though some people -do- have killer instincts, you should always "trust, but verify". A little unbiased data collection behind the redesign wouldn't hurt.
Yeah, I think that testing a lot of the ideas they threw out would be worthwhile, but that might take more than 4 days a week. ;-)
And, honestly-- they're keepin' it Real. Focus on low-hanging fruit, don't work too much, work on the stuff that you WANT to work on. They could instrument the hell out of it with A/B testing or they could move on to other products/ideas.
They're probably leaving a few bucks on the table by gunslinging, but I bet they enjoy their jobs more than if they went the other direction.
Oh no, collecting data is so dull and boring. I'm a decider!
(Sorry, your comment really bugs me. I don't actually adhere to running A/B tests for everything, but come on: looking at analytics data doesn't make my site any less "fun".)
"JF: While the lack of a free plan lead to increased paid signups, we decided we’d bring it back because we’d be missing out on a lot of upgrades from free -> pay. That’s a lucrative pay path for us."
I've done some reading on split testing, where you show two different designs and analyze the results over a period of time to determine which design performs the best. Has anyone here done anything like this and if so can you elaborate on your findings? If not, I'm going to try it with a new product we're launching in about 2 weeks (hopefully!)
Anyone who considers themselves a professional designer or user experience expert has done this. Google Website Optimizer makes it easy for anyone to do, and there are more advanced systems like Omniture's offerings that provide more options and more data on the results.
This is how a lot of "best practices" are determined. I know I've found out changing button colors or icons can increase conversion rates on email newsletters and changing a sign up process can vastly decrease the number of people who bailout at some point. Testing designs is very important.
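For anyone trying this for the first time: the core of evaluating a split test is just comparing two conversion rates and asking whether the gap could be chance. A minimal sketch, using a standard two-proportion z-test with only the Python standard library (the visitor and signup counts below are made-up numbers for illustration, not anything from 37signals):

```python
# Compare conversion rates of two design variants with a two-proportion z-test.
# Counts here are illustrative placeholders, not real data.
from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Return (rate_a, rate_b, two-sided p-value) for variants A and B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no difference)
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_a, p_b, p_value

# e.g. variant A: 120 signups from 2400 visitors; variant B: 165 from 2400
rate_a, rate_b, p = two_proportion_z_test(120, 2400, 165, 2400)
print(f"A: {rate_a:.1%}  B: {rate_b:.1%}  p = {p:.3f}")
```

The usual caveat applies: decide your sample size up front and don't stop the test the moment p dips under 0.05, or you'll "win" on noise.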
I am guessing if 37signals' conversion rate dropped significantly, they'd redesign the page again - it's kind of like a rudimentary form of split testing :)
While I like this change (it is more pleasing), it really just looks like a stylesheet change with a light, touch-and-go markup alteration. That's not to say that's a bad thing.