Selfie-based authentication raises eyebrows among infosec experts (theregister.com)
51 points by alexahn 6 months ago | 25 comments



This is common in Brazil and there's already a scam based on it: unsolicited delivery of flowers and a selfie to confirm receipt, that is actually used to validate transactions done with stolen personal data from the mark.


Don't want to praise scammers, but that is a very clever trick still. Especially since it preys on the mark being too happy about getting flowers for free to think straight. One could go the extra mile and search up data on people to do it on their birthdays or around significant dates.


Is it a matter of not being able to think straight? "Selfie authentication" is no less legitimate for flower delivery than for anything else. And you wouldn't expect to know anything about the company. We don't know anything about most of these companies that hold so much information on us.


Nowadays it's a good rule not to let delivery people photograph you anyway. Anything I order comes by mail and there's no such nonsense. Unsolicited whatever, lemme just take your pic? You can take it back, thanks.


Some delivery apps like Rappi sometimes require the delivery person to take a photo of the order upon delivery. Unfortunately, mail doesn't deliver a hot cheeseburger in 10 minutes.


A picture of the delivery is different from a picture of your face.


"Selfie" authentication, or any data gathering, is not legitimate when it comes to unsolicited things. Just because someone knocks on my door with a surprise gift doesn't mean I owe that person anything.


I also hate selfie-based authentication because some of the tasks its “liveness checks” ask for are downright impossible for some people. For example, my smile simply doesn’t pass as one for some reason, and that has prevented me from getting a bank account in the past.


For some people, smiling is not a natural thing.

https://www.tumblr.com/intj-explained/10233359295/the-death-...


The article seems to suggest that “liveness checks” are an effective counter to stolen selfies.

But aren’t all those checks just running against videos? Why can’t those videos also be stolen/mocked?

All in all: yikes.


Speaking as a GCash user: sometimes the instructions say "blink", sometimes "stay still", sometimes "look right", so it isn't possible to predict the instruction or its timing well enough to pass with a pre-recorded video.
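
Not the actual GCash flow, but the basic shape of such a randomized challenge is easy to sketch (the challenge list, the window length, and the video-classifier step are all assumptions here):

    import secrets, time

    CHALLENGES = ["blink", "stay still", "look right", "look left"]

    def issue_challenge():
        # Server side: pick a random instruction and note when it was issued.
        return {"action": secrets.choice(CHALLENGES), "issued_at": time.time()}

    def verify_response(challenge, detected_action, received_at, max_window=5.0):
        # Accept only if the action detected in the uploaded video matches the
        # prompt and the clip arrived inside a short window, so a pre-recorded
        # video of the wrong action (or the right one, too late) fails.
        fresh = (received_at - challenge["issued_at"]) <= max_window
        return fresh and detected_action == challenge["action"]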


I guess the solution is to render fake video on the fly, doing pose and expression emulation, which is possible with a few static shots, open-source tools, and a halfway decent machine. It might not fool a human, but you don't have to, and I suspect it won't be long before fooling the average human is pretty easy too.


Nothing a little faceswap connected to the webcam feed can't fix.
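
The plumbing for that is mundane: read the real webcam, run each frame through a swap, and republish it as a virtual camera device. A sketch with the swap left as a placeholder (assumes OpenCV, pyvirtualcam, and an installed virtual-camera driver such as OBS's):

    import cv2
    import pyvirtualcam

    def swap_face(frame):
        # Placeholder for an actual face-swap model; passes the frame through.
        return frame

    cap = cv2.VideoCapture(0)  # the real webcam
    with pyvirtualcam.Camera(width=640, height=480, fps=30) as cam:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            frame = swap_face(cv2.resize(frame, (640, 480)))
            # pyvirtualcam expects RGB; the auth app just sees another camera.
            cam.send(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            cam.sleep_until_next_frame()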


Yup, but it’s an arms race.

They will come next.


It's funny to extend this to the max. Prick my finger to take a blood sample into a DNA sequencer enclave within the processor of each device?


At some point I expect a physical presence at a given location is what will matter.

If someone identifiable has to physically be at, say, a police station or a DMV, it cuts out 95% of the complexity involved in all the security stuff we’re dealing with.

Just from an attack scalability perspective.


In an AI-powered world it’s just a matter of time before the liveness checks used today (take a selfie video and perform actions as instructed, like “turn your head to the right”) can be perfectly replicated. Presumably the liveness checks will get more complex and try to stay ahead of the AI scams.

In my opinion we logically end up needing something that looks like a pre-arranged Proof of Identity (Aadhaar, Clear) or Proof of Humanity (World Coin, etc.)


I recently took my kids to a child-focused restaurant outside Mexico City where they implemented a funny "selfie-based" authentication mechanism.

The place has lots of activities for kids to run around and do, but Mexicans are scared of child kidnapping (rightly or wrongly I do not know).

So upon entry to the restaurant, the whole family has to take a selfie (on their device), and they need to show it when exiting. So in theory kids can only leave with the people they came in with.

Of course, the staff doesn't really check the timestamp, so I suppose a kidnapper could just take a selfie with the target kid, rendering the whole thing useless... but I nonetheless find it interesting how businesses in emerging markets roll their own half-baked, low-tech security solutions.


As someone who is much more concerned about privacy issues than most, I think this solution is excellent (ignoring whether or not it solves a real problem). I'd do it without hesitation.

The photo would be taken by me, on my device, and no data about it ever enters the control of any other entity. It's hard to fault.

I agree that it would be improved if the staff also required the photo to have a timestamp attached and actually confirmed the time on exit. But even just the photo alone probably gets you 98% of the benefit. Security is never 100%. The main point of any security system is to increase the difficulty of doing the bad thing, and this system does that.
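
Closing that gap wouldn't take much: the staff device could sign a timestamp at entry and verify it on the way out, alongside eyeballing the selfie. A hypothetical sketch (the key name and token format are invented for illustration):

    import hashlib, hmac, time

    STAFF_KEY = b"door-device-secret"  # hypothetical key held only by the restaurant

    def entry_token(party_id: str, entered_at: float) -> str:
        # Issued at the door together with the family selfie.
        msg = f"{party_id}:{entered_at}".encode()
        return hmac.new(STAFF_KEY, msg, hashlib.sha256).hexdigest()

    def check_on_exit(party_id: str, entered_at: float, token: str) -> bool:
        # The signature proves the timestamp wasn't fabricated on the spot;
        # staff still compare the faces in the selfie by eye.
        expected = entry_token(party_id, entered_at)
        return hmac.compare_digest(expected, token) and entered_at <= time.time()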


I've suffered through the opposite situation: a neighbor I was helping lost access to an important account because the automated selfie authentication always failed with an error saying that the image is "low quality", no matter what we tried.


I’ve been a customer of Twilio for over 10 years and they recently started requiring something like this where you have to upload a picture of your drivers license and let them look at you on a webcam. They were also in the news a week ago for getting hacked. I’m sad to have to drop them because I really enjoyed using the service.


I have dropped online services that asked for this ridiculous requirement and gone elsewhere.


IMO, biometrics are a user experience and an authorization method, but not an authentication method, as they are not consistent enough to provide cryptographic inputs themselves. I read the paper showing how to do it years ago, but the entropy of the inputs (after sampling the image) reduces the security to a dependency on the encoding system rather than on the key, which runs against Kerckhoffs's principle. Sure, you can use them to unlock a key, but that's just a ritualistic UX based on magical thinking. That you have to effectively dance for some bureaucrat to transact makes me unsympathetic to biometrics anywhere, and the adoption of alternatives more urgent.
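
For what it's worth, the "unlock" pattern being criticized looks roughly like this: the key is ordinary random bytes and the biometric match merely gates its release, so the system's strength rests on the matcher and the key storage rather than on any entropy in the face itself (the toy matcher and threshold below are made up):

    import secrets

    stored_key = secrets.token_bytes(32)    # the real secret, generated once
    enrolled = [0.12, 0.87, 0.45]           # toy face embedding from enrollment

    def matches(a, b, threshold=0.1):
        # Stand-in for a real face matcher: Euclidean distance under a threshold.
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5 < threshold

    def unlock(presented):
        # The key is never derived from the biometric; a successful match only
        # releases it, which is the distinction the comment above is drawing.
        if matches(enrolled, presented):
            return stored_key
        raise PermissionError("biometric match failed")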


Please drink verification can.


Crypto exchanges have this now too. :D Thank god for KYC regs.



