Hacker News

Probably not going to work for the same reason that the police generally won't go around and arrest someone based purely on dashcam footage: who is to say that the video Person A has of Person B driving illegally is actually real? It is all too easy to doctor videos, and if insurers/police just blindly trusted videos people find on the internet/record on their phone/dashcam/etc then it would be pretty easy for Person A to frame Person B for crimes they did not commit.

Innocent until proven guilty, not innocent until some grainy YouTube footage surfaces that kinda maybe looks like you were speeding. Police equipment is calibrated, and the evidence stored appropriately, for a reason.

That said, at least in the UK, the police do use video footage from the public to a certain degree. I believe it is often used to go and "have a quiet word" with the driver in question, i.e.: <unexpected knock knock on the front door> "Was this you, sir/madam?" <shows video on smartphone of sir/madam driving dangerously>, followed by a warning if they own up to it. That sort of intervention is probably enough on its own without going any further: having a police officer on your doorstep with video footage of you driving like a prick, and "getting away with" just a warning/telling off, makes the point. But I do not believe the footage alone is enough evidence, since it is so easy to fake.




I agree. Thing is, most actuarial risk factors are really bad predictors individually. Combined, though, they work well enough for a functioning market. This would be just another factor. Don't exclude any of those license plates outright, just charge 5 or 10% higher premiums. Some of these people will stay and pay a bit more, some will leave for other insurers and stop cutting into your bottom line. No such thing as a bad risk, only wrong premiums.


You can make secure dashcams that add a hashsum/cryptographic signature based on all GPS signals received at the moment of recording.

That's a quite general problem around deepfakes: how to generate video that's guaranteed to be real. Some form of DRM or blockchain is probably needed, to no one's liking.
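To make the idea concrete, here is a minimal sketch of what such a signing dashcam might do: hash each video segment, bind the hash to the GPS fix and timestamp, and sign the bundle with a key provisioned on the device. All names here are hypothetical, and a stdlib HMAC stands in for what would realistically be a per-device asymmetric signature held in a secure element.

```python
import hashlib
import hmac
import json

# Hypothetical per-device key. Assumption: a real design would use an
# asymmetric keypair in tamper-resistant hardware, not a shared secret.
DEVICE_KEY = b"key-provisioned-at-manufacture"

def sign_segment(frame_bytes: bytes, gps_fix: dict) -> dict:
    """Bind a video segment to its GPS fix so neither can be swapped out."""
    record = {
        "frame_sha256": hashlib.sha256(frame_bytes).hexdigest(),
        "gps": gps_fix,                 # e.g. {"lat": ..., "lon": ..., "t": ...}
    }
    msg = json.dumps(record, sort_keys=True).encode()
    record["sig"] = hmac.new(DEVICE_KEY, msg, hashlib.sha256).hexdigest()
    return record

def verify_segment(frame_bytes: bytes, record: dict) -> bool:
    """Check both the frame hash and the signature over hash + GPS data."""
    claimed = {k: v for k, v in record.items() if k != "sig"}
    if hashlib.sha256(frame_bytes).hexdigest() != claimed.get("frame_sha256"):
        return False  # video was altered after signing
    msg = json.dumps(claimed, sort_keys=True).encode()
    expected = hmac.new(DEVICE_KEY, msg, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, record["sig"])
```

Editing the footage, or re-attaching the signature to a different clip or location, breaks verification; of course this only authenticates the camera's output, it doesn't identify the driver.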


Guaranteeing that the video is genuine still doesn’t solve the problem that [person] != [car] != [license plate]


In my country, [person] != [car] is one of the exceptions to innocent until proven guilty. You are responsible for who has access to your car, and it is on you to reasonably prove that it was not you driving.

Typically happens with family members.


>who is to say that the video Person A has of Person B driving illegally is actually real?

Is this ever a real concern with CCTV footage?

But sure, a dashcam won't necessarily ID the driver, just the car.





