This will not, in fact, produce reliable results. Your conversion rate will tend to change over time anyway, regardless of the "treatment," during that second "statistically significant amount of time," because your conversion rate is sensitive to things like traffic mix, PR, and seasonality, which are not uncorrelated with when you take the measurements.
This is why we don't do medical trials by giving people aspirin, measuring symptoms, then giving the same people a sugar pill, then measuring symptoms again. Instead, we give different people the two treatments at the same time, such that one population functions as a control group for the other. This is the essence of A/B testing, too.
The right way to measure this if you want reliable results is to put a load balancer or something in front of the page at issue and split half the people into the old architecture and half the people into the new architecture, then measure their conversion rates simultaneously. 37Signals knows this, and they allude to it in their blog post. That's OK though. You don't need to apologize for not gathering good data on whether making your site faster is better. Testing costs money, and testing known-to-be-virtually-universally-superior things is rarely a good allocation of resources.
You just probably shouldn't attribute your increase in conversion rates to the change you made without testing.
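If you did want to run the test properly, the mechanics are simple. Here's a minimal sketch in Python of the two pieces: a deterministic 50/50 split (in practice your load balancer or framework would do this; `assign_variant` is just an illustrative helper) and a standard two-proportion z-test to check whether the difference between the groups is plausibly more than noise. The names and numbers here are hypothetical, not anything from 37Signals' data.

```python
import hashlib
import math

def assign_variant(user_id: str) -> str:
    """Deterministically bucket a user into 'old' or 'new' by hashing
    their ID. Deterministic assignment means a returning visitor always
    sees the same variant."""
    digest = hashlib.sha256(user_id.encode("utf-8")).digest()
    return "new" if digest[0] % 2 == 0 else "old"

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Z-score for the difference between two conversion rates,
    using the pooled-proportion standard error. Valid for reasonably
    large samples; |z| > 1.96 is roughly p < 0.05 (two-sided)."""
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical example: old page converts 100/1000, new page 150/1000.
z = two_proportion_z(150, 1000, 100, 1000)
```

The point of hashing the user ID rather than, say, alternating by request is that both groups are measured over the same calendar period, so traffic mix and PR spikes hit both variants equally, which is exactly the property the before/after measurement lacks.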