Very nice, the suggestions seem more useful than some of the other tools I've seen. Is it possible to combine this with an SEO checker, spellcheck, and browser incompatibility check?
PS: running slowcop on slowcop.com yields a few areas of improvement ;-)
Really helpful. Using this to improve www.khanacademy.org results right now. I find this slightly easier to parse than pagespeed/yslow and look forward to the "performance changes over time" reports.
I like it! It's nice and fast and comparable services make me wait in a queue before receiving a report.
One issue is that I'm going to forget about your service by tomorrow. I only optimize my website when I make significant changes to the design or template, which is just a handful of times per year.
It would be nice if there were some kind of hook that would remind me about your useful service in the future. Maybe if you were able to detect when I change my website layout or add a JavaScript widget, you could send an email notification like "we've detected some changes in your website, visit us again to optimize page load times."
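Detecting that could be as simple as fingerprinting the external resources a page references and comparing between crawls. A rough Go sketch of the idea (the regexp extraction is deliberately crude, and none of this is Slowcop's actual code):

    package main

    import (
        "crypto/sha1"
        "fmt"
        "io/ioutil"
        "net/http"
        "regexp"
    )

    // fingerprint fetches a page and hashes the script/CSS URLs it
    // references, so adding a widget or reworking the layout changes the hash.
    func fingerprint(url string) (string, error) {
        resp, err := http.Get(url)
        if err != nil {
            return "", err
        }
        defer resp.Body.Close()

        body, err := ioutil.ReadAll(resp.Body)
        if err != nil {
            return "", err
        }

        re := regexp.MustCompile(`(?:src|href)="([^"]+\.(?:js|css))"`)
        h := sha1.New()
        for _, m := range re.FindAllSubmatch(body, -1) {
            h.Write(m[1])
        }
        return fmt.Sprintf("%x", h.Sum(nil)), nil
    }

    func main() {
        fp, err := fingerprint("http://example.com/")
        if err != nil {
            panic(err)
        }
        // Compare fp against the fingerprint stored from the last crawl
        // and send the "your site changed" email when they differ.
        fmt.Println(fp)
    }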
Awesome! Automated YSlow is definitely something I would like to have. If I pushed something and it's making my pages load slower and that's costing me users, I need to know.
I know some Mozilla folks were working on an automated YSlow tool called Cesium, but progress seems to have stopped there, so I'm glad someone picked up the torch.
http://blog.mozilla.com/webdev/2009/07/09/cesium-01/
Interesting to note that a lot of the suggestions I received were about minifying external JS files. It is kind of ridiculous how much external JS every page has now.
Pretty useful. I did some quick mods and went from an 89 to a 98 pretty quickly, which isn't too bad.
Only complaint is that I had to click to expand the page speed problems. You might want to add a disclosure triangle or some other visual indicator that there is more to look at. Either that, or expand them all by default but allow people to close them.
Only issue was that it suggested I could minify my JS and gain a 0% reduction in a few cases.
There is always a fair amount of noise, which I just tend to ignore. YSlow always penalises me for not using a CDN, for example.
Something useful to add would be links to the compressed versions of the images you use to work out how much space compression could save. A side-by-side comparison would be pretty useful so I could see how the compression changes the look of the page.
Just for fun I ran it over http://duckduckgo.com which comes back with 100. Google comes back with 98. Interesting.
Thanks for creating this. It helps a novice 'see the forest for the trees' to make significant improvements easier. I love some of these suggestions, too.
How about this one: make it game-ish. "Your rank of 88/100 means your site loads faster than 75% of the sites we've tested," or "Congratulations! You've unlocked the Road Runner Badge! Make the following improvements to unlock Speedy Gonzales."
Better than tools.pingdom.com in that it loads all the JavaScript and code from the site.
Next features that would make it very useful to me are (in order):
- Display timings on the timeline (in Chromium I can't see them; maybe display them on click or hover)
- Recurring checks (of course this is your core; you're already working on this, I imagine)
- Different locations in the world (including being able to slice up my reporting based on location)
- Custom alerts on specific URLs (URL X cannot take more than Y seconds to load inside my page, beyond more classic ones like total page load time and such)
- Hot cache vs. cold cache comparison
- In case of an alert, also generate a tcptraceroute and compare it to one collected every X minutes.
- Ability to set the Host header separately (so I can use the IP address in the site URL and the Host header for a specific virtual host; this is useful when a site is geographically distributed and you just want to cut out the DNS lookup; see the sketch at the end of this comment).
Also have a look at many of your potential competitors like Gomez.
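To illustrate the Host header item above: the request goes straight to an IP address (so no DNS lookup happens), while the Host header selects the virtual host. A rough Go sketch (the IP and hostname are made-up examples):

    package main

    import (
        "fmt"
        "net/http"
    )

    func main() {
        // Connect by IP so no DNS lookup is included in the measurement.
        req, err := http.NewRequest("GET", "http://203.0.113.10/", nil)
        if err != nil {
            panic(err)
        }
        // req.Host overrides the Host header sent to the server,
        // selecting the right virtual host on a shared IP.
        req.Host = "www.example.com"

        resp, err := http.DefaultClient.Do(req)
        if err != nil {
            panic(err)
        }
        defer resp.Body.Close()
        fmt.Println(resp.Status)
    }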
Nice site and design. It seems it doesn't really add much to YSlow, but it's nice anyway.
It does have the same problem as all the other speed checkers I've tried. When you have a decently optimized page, most of the errors or problems it finds have to do with external services over which you do not have much control.
For example, Facebook widgets and Google APIs (Analytics, Charts, ...).
It is always a little frustrating when you are told to change something that you cannot really influence:
There are 5 JavaScript files served from static.ak.fbcdn.net. They should be combined into as few files as possible.
In the Resource Timeline, when hovering, it would be nice to have the exact milliseconds in addition to the existing proportional colored rectangles. The absolute total time could be added to the black hover div on the left.
Thanks to the tool I discovered that the DNS time of my domains was far from perfect, thanks! (My host is on Amazon EC2 West, but my DNS is French Gandi.net...)
Also nice would be the performance on reload (i.e. with a hot cache instead of a cold one).
Given that you're the guy behind web.go, is Slowcop written using Go and web.go? If so, how are you finding writing production web code in Go? Can you share any info/tips on the hosting setup behind Slowcop?
(FWIW I'm an avid Go programmer and filed an issue on httplib.go when it was broken by release.2011-02-15. Thanks for the quick fix!)
One little tip: many people use Google Analytics, and there's no use listing "http://www.google-analytics.com/ga.js" under "Leverage browser caching" since we can't do anything about it, and caching it would defeat the purpose anyway.
Oh, and by the way, what do you use to losslessly compress PNGs to check for image optimization? Pngcrush didn't yield nearly as big improvements as slowcop suggested...
The main problem with using YSlow is it doesn't show performance trends over time. One of my goals for Slowcop is to give a dashboard where you can track performance across deploys.
Also, there are a bunch of other tools and features I'm planning to add, like measurements from different regions and tools that track lower-level HTTP issues.
Tool looks really nice. I expect you'll be adding resources to the Academy and then linking to them from the results page? E.g. for "losslessly compress my images," recommend some tools, etc.
Nice start, and some good comments/suggestions here for you to follow up on.
My own suggestion: make sure to reference relevant tutorials in the 'Academy' section from within the reports. I made a couple of reports before really finding the 'Academy', which could be a very valuable resource.
- On the report page, the call to action should be highlighted more.
- The call to action could be positioned at the bottom of the page. Since I immediately want to scroll down to see my site's result, I will most probably skip the one at the top.
- The graph does not show time (at least for me), just colored bars and the legend.
- I don't know if you're heading toward the page-execution side of the problem, but if you do, when you show specific vertical bars, add a hint on the graph explaining the significance of that bar (this is where jQuery's document ready fired, etc.). Hover popups would be much better for inexperienced folks.
- Allow me to exclude some warnings from this report and future reports. The Google Analytics script gets a caching warning, but that's a given and not something I can improve.
I've been using http://www.webpagetest.org which is pretty well known in WPO circles. It's not as "pretty" as your site, but offers more features like recording video, Dynatrace recordings, firstview vs 2ndview, etc.
+1 for http://www.webpagetest.org , it's probably one of the best, if not THE best, performance testing webapp on the web. I've been using it for a while and it's just amazing. I love your Academy section, as well as the simplicity of the site's landing page.
Would it be possible to show sites that scored in the same range? The site I tested garnered a score of 100/100. How is that score calculated? There must be sites served faster/better than mine - who are they? Also, how about a "Top 10" list of fastest, slowest, etc?
I like what you've done. As an aside, I noticed you're using Slicehost. You can get better performance, for cheaper from Linode if you want. I've switched many a client to Linode and never heard a complaint.
Make it clear that a higher score means a faster page. I think this is true, but there's no real indication of it on the page.
Some kind of explanatory histogram showing relative performance vs. other web sites would be useful. It might even be useful to group load time by page type and size (landing page vs. web app internal page, for example).
For the tune-up items at the bottom of the page, you've reversed the scale -- high numbers are now less significant than low ones. Also, it seems that a 100-point scale here might be overkill.
The suggestion may exist because it's possible to do nasty things if a user (particularly a logged-in user) goes to your http address.
If you don't have cookies configured to use https only (you don't), an attacker can grab a user's cookies. They can also intercept your redirect and send users to another site (see sslstrip).
I don't understand what you're talking about. My entire site is HTTPS, I also use Strict-Transport-Security, have a ruleset in HTTPS-Everywhere, and I don't use cookies at all. If I were to use cookies, I'd make sure to add the secure and httpOnly flags.
Me putting a redirect on http to https is no less secure than not providing http at all.
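In Go, for example, setting those flags would look roughly like this (a minimal sketch; the handler and cookie names are just illustrative):

    package main

    import "net/http"

    func handler(w http.ResponseWriter, r *http.Request) {
        // Secure: the browser only sends the cookie over HTTPS, so an
        // attacker on the plain-HTTP side never sees it.
        // HttpOnly: the cookie is not readable from JavaScript.
        http.SetCookie(w, &http.Cookie{
            Name:     "session",
            Value:    "opaque-session-id",
            Secure:   true,
            HttpOnly: true,
        })
        w.Write([]byte("ok"))
    }

    func main() {
        http.HandleFunc("/", handler)
        // A real deployment would serve this over TLS (ListenAndServeTLS);
        // plain HTTP here only keeps the sketch self-contained.
        http.ListenAndServe(":8080", nil)
    }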
Nice - how closely have you looked at yotaa.com? They seem to be doing a very similar play but are further along with UI and with testing from multiple locations.
Yeah I've seen them. There are a few tools that do similar things. My long term goal is to be a full stack tool (both client and server side). I'm not sure where Yotta is going.
Very good tool; it gives fairly optimistic results (slowcop.com gets 95/100) and quite a few interesting tips. Going to use this next time instead of YSlow.
The only thing is that it really is into minifying the CSS and gives high numbers of potential savings. It'd be more helpful to display a number that takes gzip compression into account, something like "Minify this CSS to save 31% (5% after compression)".
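That number is easy to compute: gzip both the original and the minified CSS and compare the compressed sizes, since that's what actually goes over the wire. A quick Go sketch (the sample CSS is made up):

    package main

    import (
        "bytes"
        "compress/gzip"
        "fmt"
    )

    // gzippedSize returns the length of data after gzip compression.
    func gzippedSize(data []byte) int {
        var buf bytes.Buffer
        zw := gzip.NewWriter(&buf)
        zw.Write(data)
        zw.Close()
        return buf.Len()
    }

    func main() {
        original := []byte("body {  margin: 0px;  padding: 0px;  color: #000000; }\n")
        minified := []byte("body{margin:0;padding:0;color:#000}")

        raw := 100 * (len(original) - len(minified)) / len(original)
        wire := 100 * (gzippedSize(original) - gzippedSize(minified)) / gzippedSize(original)
        fmt.Printf("Minify this CSS to save %d%% (%d%% after compression)\n", raw, wire)
    }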
One more site that runs Google's "Page Speed" test and presents the result as its own? Do you think your Firefox+Firebug+Page Speed backend will be able to handle heavy load?
Looks like http://www.webpagetest.org and http://siteloadtest.com are the only honest tests out there
No, I'm sorry, this is my brain-fart. We recently switched our static pages to a Heroku instance in anticipation of a web app launch in a few days, but keep our server stuff on the Rackspace cloud. Proceed with down-voting.
Obvious UI; it works fast and presents the information in a very clean and clear format.
The blog & academy pages could do with some of the polish of the main site, but that is understandable! I also wasn't totally clear on what the NN/100 numbers on the report page represented.
Neat. Another great tool for measuring performance is webpagetest.org. It gives you the ability to choose a browser and a location to test from -- both of which are really important factors to consider when timing a site. It's open-source too!
The long-term goal is to have a subscription service that tracks performance issues. There are other possibilities, like ads, referral programs, consulting, etc.
One problem with following your suggestion is that you cannot refer to an older test again. By including the test id in the URL, I was able to email myself the link, make some changes to my site, then run it again and compare the new results side-by-side against the old results. If only referenced by the site URL, no such comparison would be possible.
A site-based URL could run the test on www.slowcop.com at the time the link is clicked, while still offering a way to retrieve a permalink URL to that particular test.
Thanks for the suggestion, this is a great idea. I couldn't settle on a link structure for reports, so I just used IDs. But more meaningful links would be a useful feature.
If you built this out so it handled multiple locations, browsers, pages, etc. we would pay big bucks for it.