Terrifying startup prospect: Imagine a small dashcam-like consumer device that was able to perform the same type of license plate recognition coupled with GPS location. A data collection company could provide these devices for free or at a low cost to anyone, and then pay drivers for every license plate scan their box sends in.
Obvious downfall: someone will figure out how to spoof a GPS signal and randomly generate license plate images on a computer monitor and make a bunch of money for uploading junk data.
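Just to make the data flow concrete, a record like the sketch below is roughly all such a box would need to send; the field names, endpoint, and payment model here are my own invention for illustration.

    import json, time, urllib.request

    # Hypothetical scan record: plate text, OCR confidence, GPS fix, timestamp.
    scan = {
        "plate": "ABC1234",
        "ocr_confidence": 0.91,
        "lat": 37.7749,
        "lon": -122.4194,
        "timestamp": int(time.time()),
        "device_id": "box-0001",
    }

    # Upload to the (imaginary) data collection company's endpoint.
    req = urllib.request.Request(
        "https://plates.example.com/api/scans",
        data=json.dumps(scan).encode(),
        headers={"Content-Type": "application/json"},
    )
    # urllib.request.urlopen(req)  # each accepted scan earns the driver a micro-payment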
You're hinting at another major problem in this area, too:
It's not just that people have the data and use it for amoral/unfair purposes. It's that people come to trust the data, believe in it, and use it to make decisions. This is disastrous because there are huge margins of error.
Errors are possible at every level in systems like this, including sabotage, incrimination, and the junk data you suggest.
Take the motorcycle example cited above: it's quite possible a motorcyclist needed to stretch their legs, so they pulled into a driveway for a moment, stretched, Google snapped a photo, and then it was used as evidence. It's not likely or common, but it's entirely possible and not anticipated by these systems.
For the EU, Data Protection Directive 95/46 (and the incoming changes from 2012) attempts to offer protection against automated decisions.
That document introduces the concept of consent, and the consent can only cover the purpose the data was collected for. So should some clever company sell widgets at a loss to collect data, and should that data then serve an extra purpose in aggregate that does not fall into a protected area (like national security, government functions and all that), and should the person not have consented to that purpose, then you should (in theory) be able to bubble that argument up to the relevant courts for further inspection.
That's the theory; how well it works in practice... only time will tell.
In the licence plate situation the point would be that this isn't personal data, and thus the data protection directive and the consent requirement don't apply.
In the motorcycle example, there is no personal data involved whatsoever - it's just a photo showing that some motorcycle was there; again, the DPD doesn't restrict anything at all in such cases.
Did you read my comment or just scan the second paragraph and assume I was saying license plates weren't public data? I was talking about automated decision making using collected data; I was not saying license plates were or were not public information (in fact, I have previously made some of the same points you are making, that a license plate is public information and that an individual has no expectation of privacy in public).
So if a private company were to automate the collection of license plates, building a trove of information about the times and places a car has been, and that system were then to grow to the point where insurance companies used the data to calculate rates, say through Bayesian classification of that data to determine risk, then that would be an automated decision based on that data, AND THAT is protected (in theory) by the directive and the laws based on it in the member states.
That was the point of my comment: you are protected against automated decisions unless you consent to them.
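To make the "automated decision" in the insurance example concrete, here is a rough sketch; the features, labels, and use of a naive Bayes classifier are all my own assumptions, not anything claimed in the directive or the article.

    from sklearn.naive_bayes import GaussianNB

    # Hypothetical features derived from plate-sighting data for each driver:
    # [night-time sightings per month, distinct cities seen in, sightings near bars]
    X = [
        [ 2, 1, 0],
        [25, 4, 9],
        [ 5, 2, 1],
        [30, 6, 12],
    ]
    y = [0, 1, 0, 1]  # 0 = "normal risk", 1 = "high risk" (invented labels)

    model = GaussianNB().fit(X, y)

    # A new driver's sighting profile sets their premium bracket with no human
    # in the loop -- exactly the kind of automated decision the directive covers.
    print(model.predict_proba([[20, 3, 7]]))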
Okay, yes, automated decisions are prohibited in the EU when based on unverified, unconsented data - but that still doesn't prohibit anyone from automatically gathering that data without consent, storing it forever, distributing it to third parties, and then having some human take an interest in you, or become prejudiced against you, based on that info.
When Google Glass was first announced, I imagined a future where it would become more common to rent out access to one's vision. I imagined a service, called BuyMyEyes, that would set up per-minute or per-user fees that sellers could use to offer access to their line of sight to interested viewers. Then there could be a chatbox where the viewers input what the "eyes" should go look at next. In such a world, I could imagine a system where police departments send a notification: "The person 50 feet in front of you is a potential person of interest in an investigation. Keep him in your line of sight for 10 minutes and win a free iPad!"
>I could imagine a system where police departments send a notification: "The person 50 feet in front of you is a potential person of interest in an investigation. Keep him in your line of sight for 10 minutes and win a free iPad!"
I'd be surprised if it doesn't go more like "The person 50 feet in front of you is a potential person of interest in an investigation. Keep him in your line of sight for 10 minutes or you'll be arrested for not cooperating with our investigation."
You realize that's essentially what's already happening in the article, right? Large database companies like DRN are providing free or low-cost ALPRs to repo men who submit a minimum number of records per month.
I'm not sure consumerizing it makes it much more terrifying, given the tremendous size these databases have already achieved.
...and someone else would pay a clown good money to put on a fake license plate cover showing their business rival's plate and commit aggressive acts, so vigilantes and a mob would go after the rival.
We need time-varying crypto license plates, apparently.
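For what it's worth, a "time-varying crypto plate" could work like a TOTP token: the plate displays a rolling code that only the issuing authority can map back to the registration. A toy sketch - the scheme, period, and code length are all made up:

    import hashlib, hmac, time

    def rolling_plate(secret: bytes, period_s: int = 3600) -> str:
        """Code shown on the (hypothetical) e-ink plate for the current time window."""
        window = int(time.time()) // period_s
        mac = hmac.new(secret, str(window).encode(), hashlib.sha256).hexdigest()
        return mac[:7].upper()  # e.g. "4F9A0C2"

    # Only the DMV, which holds the per-car secret, can link successive codes
    # to one vehicle; a roadside ALPR camera just sees an hourly-changing string.
    print(rolling_plate(b"per-car secret held by the issuing authority"))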
How about a Google Glass-like device that we all wear all the time, with the camera running constantly, sending face pictures of people along with GPS coordinates and timestamps?
This is all fun to talk about, but none of it is going to end well.
>Terrifying startup prospect: Imagine a small dashcam-like consumer device that was able to perform the same type of license plate recognition coupled with GPS location.
Your car already has GPS, a "parking" cam, and Wi-Fi/Bluetooth (for the engine computer at least, not necessarily one you know about). You're just not being paid for the info your car collects.
>Obvious downfall: someone will figure out how to spoof a GPS signal and randomly generate license plate images on a computer monitor and make a bunch of money for uploading junk data.
Cross-referencing with streams from other cams would immediately identify "low trust" streams.
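Roughly what I mean; the scoring, bucketing, and example data below are invented purely to illustrate the cross-referencing idea.

    from collections import defaultdict

    # Each sighting: (plate, coarse GPS cell, hour bucket, reporting device).
    sightings = [
        ("ABC1234", (377, -1224), 14, "box-001"),
        ("ABC1234", (377, -1224), 14, "box-002"),  # corroborated by a second cam
        ("ZZZ9999", (400, -1000), 14, "box-bad"),  # nobody else ever sees these
        ("ZZZ9998", (401, -1001), 15, "box-bad"),
    ]

    seen_by = defaultdict(set)
    for plate, cell, hour, device in sightings:
        seen_by[(plate, cell, hour)].add(device)

    corroborated = defaultdict(int)
    total = defaultdict(int)
    for plate, cell, hour, device in sightings:
        total[device] += 1
        if len(seen_by[(plate, cell, hour)]) > 1:
            corroborated[device] += 1

    # Devices whose reports are never confirmed by another camera score low
    # and their uploads stop earning payouts.
    for device in total:
        print(device, corroborated[device] / total[device])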