The Tim Ferris way of testing ideas and how I did it. (docleyblog.com)
82 points by dawie on Oct 20, 2009 | 37 comments



There is an ocean of difference between 14% of visitors committing their email address to something, and that same figure actually pulling out their credit card.

A more prudent assumption would be that 10% of that 14% will actually pay for your product when it is launched.
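
To put numbers on that (a rough sketch; the traffic figure is hypothetical, and the 14% and 10% are the assumptions above, not measurements):

    # Back-of-envelope funnel math from the figures above
    visitors = 1000                       # hypothetical traffic
    email_rate = 0.14                     # 14% leave an email address
    pay_rate_of_emails = 0.10             # prudent guess: 10% of those pay

    emails = visitors * email_rate        # 140 addresses
    buyers = emails * pay_rate_of_emails  # ~14 paying customers
    print(f"Implied paid conversion: {buyers / visitors:.1%}")  # 1.4%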

I learned this first hand. Your mileage may vary, of course, but it's incredibly easy to drive traffic and stimulate interest, enough that people will give you their email address or whatever other personal information you want. The moment you ask them to pay, though, you're in another world in terms of conversions :) This MVP-type Google ad testing doesn't really translate that well into real-world data like purchase conversions. I don't think that's the point anyway.

It's good for seeing what people might be interested in, which is often something that startups aren't sure of :)


I agree with you. Maybe a good idea would be to ask for a credit card, but not to charge it. I think this borders on being illegal though.

I am still pretty happy with 10%, since this was my cut-off for going ahead and building the product.


Instead of asking for credit card information (which does sound illegal), I've heard of others who have placed a "Buy Now"-type button on the landing page, or had users fill out form fields right up to the credit card fields (say, for a multistep payment process). A user who clicks "Buy Now" or fills out the first part of the payment form is a pretty convincing datapoint.
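
Measuring that could look roughly like this (a minimal Flask sketch; the route and the log label are hypothetical):

    # Fake-door "Buy Now" test: the click itself is the datapoint
    from flask import Flask, render_template_string

    app = Flask(__name__)

    @app.route("/buy")
    def buy():
        # Log the intent event before showing the apology page;
        # count these against total visitors to estimate purchase intent.
        app.logger.info("purchase_intent")
        return render_template_string(
            "Sorry, we're not quite ready. Leave your email and we'll tell you first."
        )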

I haven't tried any of this myself since it raises a big issue for me: Doing this makes a promise that you don't intend to keep (at that moment). If trust is what gets users to buy, isn't it counter-productive to violate a user's act of trust?

I'd like to get over this because I can see how experiments like this can be useful and I hope you'll keep us all posted with your success rate. Some things I'm curious to know are:

* How much time passed between your AdWords test and your MVP launch?

* Of the emails captured, how many participated in the beta and eventually became paying users?

Best of luck!


I will definitely do a follow-up on this. It seems like there is quite a bit of interest in this type of testing. I will provide some numbers as soon as I have them.


You can preorder many video games. But maybe it's illegal if you don't yet know whether you're really going to build the product.


What would it take to make you feel comfortable about it?


That's a good question. My reservations mostly have to do with this:

Of the emails captured, how many participated in the beta and eventually became paying users?

Which is really asking: Will those users really come back after you've broken that initial promise?

Other success stories say yes. And I speculate that there's a certain window of time that you need to deliver in to make it work.

There'll always be another question though, so I guess it can't hurt to just give it a shot.


About a month ago, I filled out an application/request with an NGO (Acumen Fund) about opening a local chapter where I live (Melbourne). My understanding is that if they get enough requests, they'll try to get a local chapter going.

Now I've got a response saying they want to extend a 'special invitation' to 'register on our brand new community site' where I can get involved and maybe new chapters will materialise.

The timing seems pretty convenient. I suspect that the request/suggestion form was really a list-building exercise for the community site. It's a Ning. Those things need to cross a substantial chicken-and-egg hurdle before they are any use to anyone.

I admit, there was no blatant lie. It isn't really exactly the same thing. But it is sending out feelers & building email lists. They could have also put up a big 'join our community' button on the site that led to a 'sorry, under construction' page after you "sign up." Maybe they did. It was a little bit sneaky. I still joined & even started a group. I'm not angry with them. I'll even mention it to others.

BTW: http://community.acumenfund.org/


In his book, I believe that Tim says that it's not illegal if you don't store a user's credit card information when selling a product that isn't fully-baked. But I agree that it is a bit shady...


"This product is currently in private beta. Enter your credit card details to guarantee your spot in the next phase of release. You won't be charged a cent until you get access"

Then do the Authorize.net CC storage thing where you don't even need to worry about securing private details (pre-authorize, and don't capture payment till you have a product to sell).
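
The authorize-now, capture-later flow looks roughly like this (sketched with Stripe's Python library purely as an illustration; the parent names Authorize.net, and the amount, key, and payment method here are placeholders):

    import stripe

    stripe.api_key = "sk_test_..."  # placeholder test key

    # Authorize (reserve) the charge without capturing it
    intent = stripe.PaymentIntent.create(
        amount=2900,                    # e.g. $29.00, hypothetical price
        currency="usd",
        capture_method="manual",        # authorize now, capture later
        payment_method="pm_card_visa",  # test payment method
        confirm=True,
    )

    # Later, once you actually have a product to sell:
    stripe.PaymentIntent.capture(intent.id)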


One real-world example of this is Fitbit's order page (https://www.fitbit.com/order): if you try to order the Fitbit, they say they will not charge your credit card until the item actually ships.


Maybe do a pre-order the way they do with video games sometimes? You'd have to be pretty serious about hitting the release date though. (And the customer would have to be pretty serious about buying your product.)


Per fookyang, the conversion is just 1.4%.


That seems hard to believe. I tried the same testing strategy with my job-applier software (http://fastjobapplier.weebly.com/) and only got around 30 ad clicks total, and no one filled out the form. (I was thoroughly disappointed.)

Granted maybe I just suck at sales copy, or perhaps there just isn't demand out there for software that helps you apply for jobs.

Still though, a 10%+ conversion rate seems too high.

Do you find you're still getting a similar conversion rate now that you're actually selling the product?


> perhaps there just isn't demand out there for software that helps you apply for jobs.

The problem might be that you're targeting an audience that by definition doesn't have much spare money to spend.


I am not selling the product yet. I will definitely post about my conversion rate once I do.


How long has it been since you started that AdWords campaign?

How long did you tell those who signed up it would take for you to get back to them?

I, too, would love to see some more hard data. I have actually been thinking about doing this for some time now, but am hesitant about the quality of the feedback.


Yeah, 10% seems pretty high. I think it might have to do with saying the first month is free and you can cancel any time.

Therefore, the 10% rate isn't really a 10% sales rate but more of a 10% willing-to-try-your-demo rate. But 10% is still pretty good for that, I think.


Actually, I found it pretty believable that this percentage put their email in. I ran something similar: a SurveyMonkey questionnaire about the product, where the last question was "If you'd like to be notified when this launches, put your email here." 25-30% of those who finished the survey gave their email.

This is ONLY email, of course, not actual signups. I'm expecting 90+% of those people will drop out once we actually email them.


I agree that 1st month free and cancel anytime might have something to do with it. If I do a test like this again, I will probably leave that out.

I will have these terms in my live product though...


The core issue isn't that you offer a month free, the issue is that your form doesn't measure purchase intent. You have no idea if 1% or 10% of your leads will convert into sales.


Please don't take this the wrong way, but this is a terrible idea in general. Essentially you're selling a product that many people would liken to, say, spam.

I don't think your issue is in the execution of the trial but in the product itself.


I've been itching to try this for some time. I would appreciate any war stories HNers could share.


Email addresses don't convert on a 1 to 1 basis to sales. (If. Only.)

After the autoresponder runs out (6 days in my case), they've got only a moderately higher chance of conversion than my average trial user. (~5% vs. 2.8%)

I really, really don't like the Put Up Fake AdWords Ad, See Who Gets Suckered minimum viable product. Aside from misleading customers and hurting the AdWords ecosystem (they don't actually get offered what they clicked to get offered, which Google will come down on you like a ton of bricks for if they detect, incidentally), it doesn't test your product. It tests your ability to write AdWords copy.

I could get 14% conversion to stabbing toothpicks under your fingernails, but what does that tell you? The world has weird people in it, and I'm good at qualifying non-weird people out with my ad text and keyword selection? Whee?

Similarly, if I delivered an AdWords campaign that was awful (low CTRs, low conversions), would that tell you the product is awful? Because I've done that before, too -- the first draft of my AdWords campaign for Bingo Card Creator was terrible. It took me almost a year before I figured out how to write AdWords decently. That was not an indication that the product was terrible (I sell it quite successfully), it was just an indication that I needed to adjust to a new way of copywriting.

If it can't tell you if an idea is good, and you can't trust it if it says the idea is bad -- what is the point, again?


A minimum viable product experiment is good at measuring whatever a minimum viable product experiment measures. You definitely can't make exact projections, but it is a piece of info that is useful. At the very least it can tell you not to build something.

It might also be good to test several ideas with this method to decide which one to pursue. Again, it shouldn't be the only factor, but all else being equal, a better fake conversion rate is better.

It isn't an accurate projection, but it is information. It is also a chance to get an understanding of the AdWords environment, click costs, etc.


I think you've hit on some good points, but missed a big one. Running AdWords drives traffic to a site that currently has no traffic; better yet, that traffic has shown some interest in the product you're claiming to have. If some of those people try to buy (or express interest in) this non-existent product, that says something.

You're not measuring how much traffic AdWords brings in. You're using AdWords as a traffic generator (ideally generating interested traffic), and of the traffic that comes to your site, you can determine whether people are clicking through.

For ~$100 you can determine if there is any interest at all. This can be powerful for people who really have no idea what the market will do. It's not perfect, but not useless.
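
A quick sanity check on what ~$100 can actually resolve (the CPC and signup rate here are assumptions, not data from the post):

    import math

    budget, cpc = 100.0, 1.00   # assume ~$1 average cost per click
    clicks = budget / cpc       # ~100 visitors
    conv = 0.10                 # suppose 10% of them sign up

    se = math.sqrt(conv * (1 - conv) / clicks)
    print(f"{conv:.0%} +/- {1.96 * se:.0%} at 95% confidence")
    # ~10% +/- 6%: enough to separate "some interest" from "none",
    # far too noisy to project revenue.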


This can be powerful for people who really have no idea what the market will do.

These people should take their hands off the keyboard and not put it back on until they have spoken to at least one Real Person who has the problem that they are trying to solve. After they have done this, they will know if there is any interest at all. This produces actionable insights: if they tell you they have a problem, then at least some people have the problem. If they tell you that the industry's favorite way for solving the problem costs $1,000, that immediately establishes that there is a viable market for solving the problem.

Compare this to the AdWords campaign: can you tell me what numbers you can get that say "This is a good idea -- do this" and what numbers you can get that say "This is not a good idea -- don't do this"? No -- subjectively "good" results don't imply purchasing intent, subjectively "bad" results don't imply lack of it. You'll get noise -- very precise noise, noise you can measure down to the click, but still noise.


Suppose I came up with an idea that I absolutely knew was brilliant, let's say edible shoelaces. Unfortunately these edible shoelaces aren't cheap to produce. I'm a one man shop and have almost no money.

I make a fully functional website to sell this product, only I don't have the product yet. When people browse the site, go through the pricing page, etc., and finally click "buy", I simply tell them that we're out of stock.

Now I want to see if people are actually trying to buy this thing, so I start marketing but have a limited budget. I throw $1000 at AdWords. If I get 100 people trying to buy my edible shoelaces at $20 each, then I know that maybe it's worth investing in manufacturing the product.

How is this not useful data? Instead of going all out and spending money on manufacturing and a website, you only pay for the website and hold off on the manufacturing expense until you're confident it'll pay off.
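
Running the numbers in that example (all figures are the hypotheticals above):

    ad_spend = 1000.0      # AdWords budget
    attempted_buys = 100   # people who clicked "buy"
    price = 20.0           # per pair of edible shoelaces

    implied_cac = ad_spend / attempted_buys    # $10 to find each would-be buyer
    signalled_demand = attempted_buys * price  # $2,000 of attempted purchases
    print(implied_cac, signalled_demand)
    # Whether that justifies manufacturing depends entirely on unit cost,
    # which this test doesn't measure.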


Except that we are talking document management here, not edible shoelaces. Or anything that requires "manufacturing" in the made-in-a-factory sense of the word.

A fake ad survey for a document management system from someone who "works in the electronic Document Management and Records Management industry" is shouting out loud, "I'm too scared to actually sit and code this thing because it's hard and I might take too long to do it or completely fail to do it."

And I don't know why people like throwing money away. With $100, I'd rather buy 5 months of Linode server time than 12 email addresses. When the best entrepreneurs are saying be cheap, I think we ought to listen and be cheap. And with 2 weeks of time, I'd rather sit and code the most basic functionality of my app and show it to people I know who might need it.


> If it can't tell you if an idea is good, and you can't trust it if it says the idea is bad -- what is the point, again?

The point is not to prove the idea is good, but to prove that the idea isn't bad. In other words you can't prove that email addresses will convert to sales, but you can prove that no email addresses will convert to no sales. It's useful to know if your idea will produce no click-throughs, which will produce no sales.


Because I've done that before, too -- the first draft of my AdWords campaign for Bingo Card Creator was terrible. It took me almost a year before I figured out how to write AdWords decently.

An SEO specialising in AdWords and PPC would have saved you a lot of time ... I thought all this SEO stuff was obvious! ;0)>


Those are really good points. Is there another type of market testing you recommend?


I worked at a software company where we should have run an experiment like this. There was an upgrade on user accounts that we charged a one-time $10 for, and people bought it all the time. We decided to try it as a subscription.

We spent a month getting the subscription code ready, then deployed it to some % of users, only to discover that nobody wanted it. What we should have done was build the front-end only, get people to buy subscriptions, and then just grant them the one-time permanent version.
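
A sketch of that front-end-only test (every function name here is made up for illustration):

    # Present a subscription; behind the scenes, charge once and grant the
    # permanent upgrade so the promise is still honored.
    def handle_subscribe(user):
        record_metric("subscription_intent", user_id=user.id)  # the experiment's datapoint

        charge_one_time(user, amount_cents=1000)  # reuse the proven $10 flow
        grant_permanent_upgrade(user)             # instead of billing monthly
        notify(user, "You've been upgraded permanently, on the house.")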


Like I said in my blog post, it worked pretty well for me.

From here, I basically went on and created a minimum viable product.


Here's another (cheaper, although possibly less realistic) approach to this: I recently did some market research using Amazon's Mechanical Turk, and added a lead collection form as a bonus.

I put up a simple survey asking users about relevant background information and their experience with the problem my application is trying to solve. Included in this survey was a field to collect the user's email address. This field was very clearly marked as optional (it even appeared after the confirmation code that allowed turkers to complete the "HIT") and the label included something like "Your answers to this survey will help us to evolve the website to better meet your needs. If you'd like to be notified when the new site launches, please enter your email address here." along with an indication that users would receive exactly one email from us due to this form.

I was pretty pleased with the results of the survey -- I received a lot of actionable information very quickly and cheaply (at about $0.10 per response), and I was pleasantly surprised that a little more than 10% of the respondents entered what looks like a valid email address.

I don't expect many of those 10% to convert (and lead generation wasn't the point of the exercise anyway), but I was very happy with the ROI on this survey.

Seeing who clicks on ads (and with what copy) and later "soft" converts may be a more realistic test, but the MTurk approach is an order of magnitude less expensive.
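
For reference, posting such a survey HIT would look something like this with boto3 (a modern client shown purely as an illustration; the survey URL, title, and quantities are placeholders):

    import boto3

    mturk = boto3.client("mturk", region_name="us-east-1")

    # ExternalQuestion pointing at a self-hosted survey page (hypothetical URL)
    question = """
    <ExternalQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2006-07-14/ExternalQuestion.xsd">
      <ExternalURL>https://example.com/survey</ExternalURL>
      <FrameHeight>600</FrameHeight>
    </ExternalQuestion>"""

    hit = mturk.create_hit(
        Title="Short survey (optional email at the end)",
        Description="A few questions about your experience with the problem.",
        Reward="0.10",                    # ~$0.10 per response, as above
        MaxAssignments=100,
        LifetimeInSeconds=3 * 24 * 3600,
        AssignmentDurationInSeconds=15 * 60,
        Question=question,
    )
    print(hit["HIT"]["HITId"])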


Actually, the figures here may be pessimistic. It takes a certain kind of person to submit their e-mail address when there is quite obviously no deliverable explicitly promised for immediate purchase and download. If there were one, conversion might have been higher.


I don't get it. So you "sell a product", including a 30 day trial, but once the user signs up there's no product?



