
Last time I measured, on a largely Firefox userbase, AdBlock was used by about 6%. Not enough to really bother about.

I'd expect the number who use NoScript to be 1-2% if that.

Note this is out of a mainly Firefox userbase. For the general web those percentages would be far lower.


  <script>
   // Stash the real alert and replace it with a no-op,
   // so the hotlinked script can't pop anything up while it loads.
   oldalert = window.alert;
   window.alert = function() {};
  </script>
  <script src="http://json.org/json.js"></script>
  <script>
   // Restore the real alert once json.js has loaded.
   window.alert = oldalert;
  </script>


If someone knows enough to do that, they probably also know enough to copy the file onto their own server.


Heh. I didn't know you could do that.



Crockford would then reply (inside his js file) with:

  window.location = "http://www.meatspin.com"; //that'll teach em


If you care about IE6 these days, and aren't selling to corporate users forced to use it, then you're doing something wrong.

Also, you can just enable it when it's available, and fall back to Comet etc. when it's not.


I think the main issue here is not IE6 but all the modern browsers (FF 3.5, Safari 4, IE8) that do not support this feature. Targeting Chrome users only is not that appealing.


The feature was added to WebKit, so Safari, Chrome, and other WebKit-based browsers will get it, and Firefox is working on it too: https://bugzilla.mozilla.org/show_bug.cgi?id=472529


This is part of HTML5; they will support it soon if they don't already.


That's why you would use it where available, instead of targeting.

  if (window.WebSocket) useWebSocket(); else useComet();
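
A rough sketch of what those two paths could look like, assuming a hypothetical /stream endpoint and a handleMessage callback defined elsewhere (the real Comet side would depend on your server setup):

  function useWebSocket() {
    // Native WebSocket where the browser supports it.
    var ws = new WebSocket("ws://example.com/stream");
    ws.onmessage = function (event) { handleMessage(event.data); };
  }

  function useComet() {
    // Long poll: hold a request open, handle the response, reconnect.
    // (Old IE would need ActiveXObject instead of XMLHttpRequest.)
    var xhr = new XMLHttpRequest();
    xhr.open("GET", "/stream", true);
    xhr.onreadystatechange = function () {
      if (xhr.readyState === 4) {
        if (xhr.status === 200) handleMessage(xhr.responseText);
        useComet();
      }
    };
    xhr.send();
  }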


While the numbers are falling, IE6 users still account for a sizable percentage of the people who buy things from my site. Ignoring them would be "doing something wrong". Guess it just depends on what you're doing.


Also, if you're doing something that spreads virally, not supporting IE6 can tip the viral loop from growing to shrinking (you could lose 5% of invites/feed messages/emails/whatever, depending on the app).


It really depends.

I recently did a bit of consulting work for a guy who has a large number of domains receiving primarily organic search traffic. Around 20% of the people hitting those (several million a month) are still using IE6.


The bidding seems like a complete lottery. What may work well one day may completely fail the next. You could be throwing money away.

It's a sort of interesting idea, but completely impossible to calculate ROI etc. before you spend.

Also, WTH is with "only accepting US credit cards"? Hate it when sites pull that one.


Agreed. I once tried to order something from Newegg with a European credit card (I'm in the US at the moment), and after ordering I got an email saying my order had been canceled. I had to go to their customer chat and talk with one of their reps to find out they don't allow foreign credit cards. Went to Amazon and purchased the same product without problems. That was a $100 loss for them, and I'm sure I'm not the only one.


Also, they all run AdBlock etc.


I don't think that's likely to be an issue in this case, because the ads aren't presented in a standard ad format; instead they're part of the usual list of Reddit items.


Sure. I meant more that they may not be easy to monetize once they click through to your website.


Yeah, probably not good for ad-based sites, but those usually don't have high enough revenue/user to make advertising cost effective anyway.

For a cool new consumer product or service that makes money directly, though, this could be great. When I tried it, the CPC was very low. Whether it's worth it depends on how good a fit the thing you want to advertise is with the audience of Reddit.


I guess it would be a good way to get some interest in a new service; think of it as similar to an HN "review my app" post that you can keep at the top for more than a couple of hours.

I can't imagine making a great return selling something directly on there unless your product was so cool that a regular post would have had nearly the same effect.


Does the WebSocket spec allow for gzip/deflate compression of streams as well? I can't see anywhere whether it does or not.


A quick Google search (http://www.google.com/search?q=websocket+compression) revealed that it's not supported yet; however, it might get added in a future version: http://www.ietf.org/mail-archive/web/hybi/current/msg00789.h...


There are some pretty sluggish websites out there due to poor use of JS libs. Browse them on a phone or netbook and it all adds up.

You'd certainly manipulate hundreds of DOM elements at a time: consider, say, a Twitter-style stream where each post has a "10 seconds ago" marker and they all need updating.


The bottleneck in that situation would be finding the elements, not updating them. If there is any worry about performance, the change would be to cache that list of elements rather than searching again and again, regardless of the library used. (Unless your library was very clever, and could cache the results for you. I'm not sure what browser support there is for ondomupdated events, which you would need to watch for this to work in a general fashion.)
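
A minimal sketch of that caching, assuming made-up markup where each "... ago" marker is a span with a data-posted attribute holding the post time in epoch milliseconds:

  // Query the document once and keep the matching nodes in a plain array.
  var spans = document.getElementsByTagName("span");
  var cached = [];
  for (var i = 0; i < spans.length; i++) {
    if (spans[i].getAttribute("data-posted")) cached.push(spans[i]);
  }

  // Every refresh only touches the cached nodes, no re-querying.
  setInterval(function () {
    var now = new Date().getTime();
    for (var i = 0; i < cached.length; i++) {
      var posted = parseInt(cached[i].getAttribute("data-posted"), 10);
      var seconds = Math.round((now - posted) / 1000);
      cached[i].innerHTML = seconds + " seconds ago";
    }
  }, 10000);

When new posts are inserted you'd push their spans onto the cached array at insert time instead of re-querying the whole document.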


>> "The jQuery example, from the beginning, was creating DOM elements from HTML strings, while RightJS was wrapping the document.createElement API. This is not the same thing and you cannot learn anything from comparing apples to oranges."

What you can learn though, is that using the built-in DOM methods, or wrapping createElement if you need to, is far faster than using some other abstraction over the DOM.
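
For example, a thin made-up wrapper like this stays one step away from the native calls:

  // Minimal element builder: tag name, attribute map, optional text content.
  function el(tag, attrs, text) {
    var node = document.createElement(tag);
    for (var key in attrs) node.setAttribute(key, attrs[key]);
    if (text) node.appendChild(document.createTextNode(text));
    return node;
  }

  document.body.appendChild(el("p", { title: "greeting" }, "Hello"));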


I would assume that adding a layer of abstraction would be slower than native calls, regardless of the language.


Actually, using HTML strings is quite a bit faster than using the DOM methods in many browsers. Strange but true. (I guess it's because the extra time spent parsing the string is trivial compared to the extra time you spend mucking around in the JS interpreter when you use the DOM.)
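
A rough way to check it in any given browser (assumes a <ul id="list"> already on the page and a console; which approach wins varies by browser and by how you batch the work):

  var container = document.getElementById("list");

  // Approach 1: build one big HTML string, parse it with a single innerHTML assignment.
  function buildWithStrings(n) {
    var parts = [];
    for (var i = 0; i < n; i++) parts.push("<li>item " + i + "</li>");
    container.innerHTML = parts.join("");
  }

  // Approach 2: create and append each node through the DOM API.
  function buildWithDom(n) {
    container.innerHTML = "";
    for (var i = 0; i < n; i++) {
      var li = document.createElement("li");
      li.appendChild(document.createTextNode("item " + i));
      container.appendChild(li);
    }
  }

  var start = new Date().getTime();
  buildWithStrings(1000);
  console.log("strings: " + (new Date().getTime() - start) + "ms");

  start = new Date().getTime();
  buildWithDom(1000);
  console.log("DOM: " + (new Date().getTime() - start) + "ms");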


Write your own specific tests if you suspect bias.

I doubt there is any real bias though; JS libs have different aims. Some aim for complete browser support at the expense of speed and size, while others assume a certain level of browser and so come in much faster.

The fastest lib though is always going to be no lib. It's far easier to optimize your own code.


The fastest lib though is always going to be no lib.

This is only true if you already know all of the performance hacks in each browser. By using a library, you can outsource the need to worry about that to a third party. Also, only about 5% of your code is ever going to underperform, so deciding not to use a library means you're wasting time on the other 95%.


Why would someone write their own tests when all they're trying to do is point out why the current ones are flawed?


To see if it performs better for their particular use case?

You know, choosing something based on how well it fits the problem?


Why should he write his own tests? jQuery has proven performance and reliability, whereas all RightJS has is these disputed tests.


It depends on your use case. I don't think you should be against something just because you have a solution you assume is the best it can be.


It depends on what you're writing, and on your target market. Most people using webapps/modern websites are doing so with good, recent browsers.


[citation needed]

Are you related to the project somehow? You seem to be vigorously defending it, but I see no affiliation mentioned (and your profile here is blank).


How is my above comment defending it? :/ I actually use JS libs extremely rarely; I'd rather just do it myself. And no, I have nothing to do with the project.

