Hacker News new | past | comments | ask | show | jobs | submit | yhlasx's comments

Clickbait title; the content is about why something like this won't happen. Screw this.


I thought the same. I was genuinely curious to read about this story and then bam:

"Well, that story above? Never happened. Totally made it up."


To people not in the developer community it makes a stronger point.

They think that exact story is actually possible, and drawing them in ("see, this is exactly what I'm afraid of!") goes a long way towards convincing them otherwise.


>>People love not to think

O_O

Judging by the comment you wrote right after that, I would assume you are one of the people who like not to think.

They are making people click checkboxes and deviating from the old model of recognition. Your comment makes no sense.


Yeah every once in a while... a little bit of heuristics, a little bit of laziness.


Seriously, guys? This made it to the top of the front page? First of all, to all the people saying "HUR DUR GOOGLE WANTS YOUR BROWSING DATA": well, they already fucking have/had it for a looong time.

Secondly, if you tell me that one dude [the author] dismissed a year-plus of work by Google's engineering team as a flaw, reduced it to [So what Google is trying to sell us as a comprehensive bot detecting algorithm is simply a whitelist based on your previous online behavior, CAPTCHAs you solved.], and that you believe it, I would question your intelligence.

This is supposed to be a tech-savvy community, at least to some degree. What the fuck.

Now, Google's blog post reads: [Advanced Risk Analysis backend for reCAPTCHA that actively considers a user’s entire engagement with the CAPTCHA—before, during, and after—to determine whether that user is a human.]

[However, CAPTCHAs aren't going away just yet. In cases when the risk analysis engine can't confidently predict whether a user is a human or an abusive agent, it will prompt a CAPTCHA to elicit more cues, increasing the number of security checkpoints to confirm the user is valid.]

So my guess would be they analyze users' behaviour on the page where the CAPTCHA is located, things like mouse movements, the time it takes to type out the words, spelling mistakes corrected, and whatever else humans do differently from bots, and only then combine that with your historical cookies. Maybe it is much more complicated than that; I, as well as you, don't know the details.
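To make this guess concrete, here is a toy sketch of how behavioral cues might be combined with cookie history into one risk score. Every signal name, weight, and threshold here is made up for illustration; nothing below is Google's actual algorithm.

```python
# Toy risk score combining behavioral cues with cookie history.
# All signals, weights, and the threshold are invented for illustration.
def risk_score(mouse_path_entropy, typing_seconds, corrections, solved_captchas):
    score = 0.0
    score += 0.4 * min(mouse_path_entropy, 1.0)     # humans move the cursor erratically
    score += 0.2 * min(typing_seconds / 10.0, 1.0)  # bots "type" instantly
    score += 0.1 * min(corrections, 3) / 3          # humans fix typos
    score += 0.3 * min(solved_captchas, 5) / 5      # cookie/history component
    return score  # closer to 1.0 = more likely human

def needs_challenge(score, threshold=0.6):
    # Low-confidence cases fall back to a visible CAPTCHA,
    # matching the quoted blog post's description.
    return score < threshold
```

The key idea this sketch captures is the blog post's quoted fallback behavior: when the combined score is not confidently human, a CAPTCHA is shown to elicit more cues.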

Do you really think they would go ahead and implement such a system without rigorous testing of its effectiveness? I am sure they tested it extensively with users AND with bots, decided it was better than the current system, and ONLY then deployed it. Rant off.


>So my guess would be they analyze users' behaviour on the page where the CAPTCHA is located, things like mouse movements

If they can track mouse movements, why am I not a human to them anymore in incognito mode? I was expecting the same, but from what I see it's just a whitelist. And that's OK. The problem, which you probably didn't care to read about, is that it's vulnerable to simple clickjacking, which opens another weakness: I can use your click on my page to get your reCAPTCHA token and feed it to my spam bot.

I'm actually happy with No CAPTCHA, because it's progress. But it's not good enough (see the rest of the comments; it could be a background AJAX request instead).
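To see why a clickjacked token works: when a site backend verifies a token, it sends only its secret and the token (plus, optionally, the user's IP) to Google's documented siteverify endpoint. Nothing in that request ties the token to the page where the click actually happened. A minimal sketch that only builds the request (the endpoint and field names are from Google's public reCAPTCHA docs; the helper itself is illustrative):

```python
# Sketch of the server-side verification request for a reCAPTCHA token.
# Endpoint and field names come from Google's public documentation;
# this helper only builds the request, it does not send it.
import urllib.parse

SITEVERIFY_URL = "https://www.google.com/recaptcha/api/siteverify"

def build_verify_request(secret, token, remote_ip=None):
    """Build the POST body a site backend sends to validate a token."""
    fields = {"secret": secret, "response": token}
    if remote_ip:
        fields["remoteip"] = remote_ip
    # Note what is NOT here: no origin, no referrer, no proof of
    # which page hosted the widget when the user clicked.
    return SITEVERIFY_URL, urllib.parse.urlencode(fields).encode()
```

Because the request carries no information about where the click happened, a token harvested via clickjacking verifies just like a legitimate one, until it expires or is consumed.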


>>which you probably didn't care to read

I did read it. My point is that you, or I, or anyone else for that matter, does not know the inner details of how it works.

>>If they can track mouse movements, why am I not a human to them anymore in incognito mode?

Maybe a clean cookie history is not good enough during the risk assessment.

Look, my entire point is: Google is not a joke company. I am certain they tested it for effectiveness before deploying.


> I did read it.

So what do you think about the clickjacking issue? I made an assumption about their algo, and maybe I'm wrong and they do track your mouse, but there's an exploitable weakness. My post is 1) your algo seems simple, 2) here's a bug in it.


The curious thing is, I could not replicate the clickjacking issue. Every time I click on the original WordPress registration page, I am verified as a human immediately.

If I click on your GitHub page, I get a challenge. My clicks were never accepted as human on your GitHub page; my clicks were always accepted as human on the WordPress page.


No incognito tab? Maybe they fixed it.


Yes, they fixed it, but I don't know how. Likely there's a way to bypass it.


> one dude

Since you obviously don't know who Homakov is I can't take your post very seriously.

Homakov has exposed several serious security flaws at Facebook and Google before. I'm pretty sure Google is actively trying to headhunt him since he is one of the best in the web security field.


He's probably best known to HN for his GitHub exploit with Rails in 2012. I wrote a profile of him earlier this year (http://jobtipsforgeeks.com/2014/03/27/homakov/) which talks about his background a bit more.


> Do you really think that they would go ahead and implement such a system without rigorous testing of effectiveness? I am sure that they tested it extensively with users, AND with bots, and decided that it is better than the current system, and ONLY then deployed it.

I think the gap between the marketing material for No CAPTCHA (a simplified website, a YouTube video with animations) and the seemingly lacking actual implementation is why this blog post was relevant to me.

Like other tech people around here, I was hyped up by the "smarts" of a system that uses cursor detection etc. to silently validate that I am a human. This blog post seems to indicate that the validation is a much simpler matter of previously passed tests and the amount of data Google has associated with the user.


That's exactly why I wrote this post. I wish Google would prove me wrong and demonstrate to us how they use cool tech to detect bots, instead of user.isGoogleUser? and user.acceptedCaptchas > 5.
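Spelled out as a toy Python sketch, the suspected whitelist looks something like this (the field names come straight from the joke above; this is entirely hypothetical, not anyone's real code):

```python
# A deliberately naive sketch of the whitelist the post hypothesizes.
# All names are hypothetical; this is NOT Google's actual algorithm.
from dataclasses import dataclass

@dataclass
class User:
    is_google_user: bool
    accepted_captchas: int

def looks_human(user):
    # Pure history-based whitelisting: no behavioral analysis at all.
    return user.is_google_user or user.accepted_captchas > 5
```

The point of the joke is that such a check consults only past history, never the current interaction, which is exactly what the incognito-mode observation suggests.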


>>>So what Google is trying to sell us as a comprehensive bot detecting algorithm is simply a whitelist based on your previous online behavior, CAPTCHAs you solved.

That is a bold statement, presented as a fact, not a hypothesis.


Half of the post is about how the new technique is vulnerable to clickjacking.


Google's blog post says that 98-something percent of the old distorted text could be deciphered by AI. My point is, regardless of the new system's vulnerabilities, I am certain it is more effective than the old alternative. They would have tested it.


IMO they should take some cues from iOS and let users allow/disallow such things when apps make the requests, rather than dumping all permissions as all-or-nothing when users install the apps.


IMO this all-or-nothing mentality when it comes to installing apps is the single biggest problem with Android right now, trumping even the perceived fragmentation problem.


I think this is what people overlook most of the time. I did not use the calendar app because it was overloaded with things that were completely unnecessary for ME. Now it is much simpler; I might start using it.

The majority of their users are probably better off with the new calendar app, so they readjusted it for a better fit. If it turns out this is not the case, they will change it further, maybe bring some old functionality back.

They focus on what will please the majority of their users, as any other company would and should, not their power users.


> I did not use the calendar app because it was overloaded with things that were completely unnecessary for ME. Now, it is much simpler, might start using it.

What's wrong with just ignoring the features you don't need (yet)? All software has a learning curve. The more you use it, the more features you should discover and find useful. Trying to flatten the learning curve by removing features (or making it much harder to discover them) just keeps more users from ever advancing beyond the "novice" stage, because there isn't all that much more to discover.

Power users became power users because they explored and discovered more functionality, and this is because they were motivated to explore in the first place; extremely simplified UIs reduce much of this motivation, which results in fewer power users - and eventually fewer developers. Maybe "we don't want power users and want them to disappear" is their goal (I hope not, but wouldn't be surprised if it implicitly is), but this constant trend of dumbing apps down to the lowest common denominator is, in the long term, not beneficial to anyone.

Software should have defaults that make it easy to use initially for the beginner, but it should also encourage the growth of the user's knowledge. Instead we get software that's easy to use initially - and then effectively stunts that growth, and all the aesthetics they throw at it cannot compensate for functionality. I think it's really quite sad; although if you believe the theory that Google is aiming to take away users' control over their devices and replace it with Google's, then it makes a lot of sense to keep users as blissfully ignorant as possible. ("You can't easily check your schedule anymore... but look, we gave you beautiful rounded buttons!")

From the parent:

> is essentially now unusable on the phone for anything other than a near-term agenda/itinerary.

Looking at the way it doesn't even show the year anymore, "near-term" is right!


All things considered, no surprises here. The vast majority of Google's profits depend on how well their search works. The same is not true for Apple with Siri, or Microsoft with Bing.


I have good internet. I am in Korea, though, so that might have been part of the problem, but the video was VERY laggy and annoying.


Pay me 1 BTC and I will make it for you. 1CgZ2qaZdY5UxZi5RBkq5426a7zxMmiZaa


That's not even hexadecimal...
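Right: Bitcoin addresses use Base58Check encoding, not hex. Base58 deliberately drops the visually confusable characters 0, O, I, and l. A quick alphabet check (a sketch; it validates the character set only, not the address checksum):

```python
# Bitcoin addresses are Base58Check-encoded, not hexadecimal.
# Base58 omits 0, O, I and l to avoid visual confusion.
BASE58_ALPHABET = "123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz"

def is_base58(s):
    return all(c in BASE58_ALPHABET for c in s)

print(is_base58("1CgZ2qaZdY5UxZi5RBkq5426a7zxMmiZaa"))  # True: every char is in the alphabet
print(is_base58("0xDEADBEEF"))                          # False: '0' is excluded from Base58
```

So the address in the parent is a plausible Base58 string, just not a hex one.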


I don't see anything wrong with this. If the option is selected, someone at the shipping location (or retail location) will probably manually install Firefox, or a Windows image with Firefox on it. It is perfectly fine. Even if automated, automation is not free. Are they overcharging? Probably.

I disagree that it violates Mozilla's terms; they are selling the service, as stated, not the browser.


The 4-rupee price was from before, when he first went to buy them. Years passed, India has had crazy amounts of inflation, and now it costs 2.5 rupees. If you calculate it, he probably got an effective discount of 70% or more.
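A back-of-the-envelope check of that "70% or more" figure, assuming roughly 7% average annual Indian inflation over 1998-2014 (a rough guess for illustration, not an official statistic):

```python
# Rough check of the discount claim: 4 rupees in 1998 vs 2.5 rupees now.
# The 7% average annual inflation rate is an assumption, not official data.
old_price, new_price, years, inflation = 4.0, 2.5, 16, 0.07

inflation_adjusted = old_price * (1 + inflation) ** years  # 1998 price in today's rupees
discount = 1 - new_price / inflation_adjusted

print(f"4 rupees in 1998 is about {inflation_adjusted:.1f} rupees today")
print(f"effective discount: {discount:.0%}")  # roughly 79%
```

Under that assumption the real discount comes out near 79%, consistent with the "70+%" ballpark.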


Yeah, it was 4 rupees in 1998.

