Still reading the paper and forming an opinion. But my initial thought is: what exactly is new here that couldn't be done through some other means? I'm sure there will be interesting implications, but right now nothing seems particularly novel.
But is there really anything to report? What's new here that isn't doable with low-level automation? I've been able to drive a browser with Python for a while.
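To make that concrete, here's a minimal sketch of the kind of low-level automation I mean, using only the stdlib (no Selenium, no ML). The URL and page content are placeholders:

```python
# Sketch: fetch a page, extract its links, follow one -- the whole
# "navigate a website" loop, in plain stdlib Python.
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collect href attributes from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html, base_url=""):
    """Return absolute URLs for every link found in the HTML."""
    parser = LinkExtractor()
    parser.feed(html)
    return [urljoin(base_url, href) for href in parser.links]

# A real crawler would loop: urllib.request.urlopen(url).read()
# -> extract_links -> pick a link -> fetch again.
```

Nothing about this needed to wait for modern AI.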
> Human-like denial-of-service. Imitating human-like behavior (e.g. through human-speed click patterns and website navigation)
It's already easy to simulate activity; clicking a random link on a page would be basically as good. Are there even DoS mitigations that this kind of human-like behavior would actually evade?
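"Human-speed click patterns" is a few lines of jitter, not a research result. A toy sketch (the mean/sd numbers are made up, just chosen to look plausibly human):

```python
import random

def human_delays(n_clicks, mean=2.5, sd=0.8, floor=0.4, rng=None):
    """Generate n_clicks dwell times in seconds that look vaguely human:
    Gaussian jitter around a mean, clamped to a minimum reaction time."""
    rng = rng or random.Random()
    return [max(floor, rng.gauss(mean, sd)) for _ in range(n_clicks)]

# A bot would time.sleep(d) between clicks; even this trivial jitter
# defeats naive fixed-interval bot detection.
```

That's the entire "imitating human-like behavior" technique as far as timing goes.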
> Prioritising targets for cyber attacks using machine learning. Large datasets are used to identify victims more efficiently, e.g. by estimating personal wealth and willingness to pay based on online behavior.
So, like, sum and sort all the spending data?
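Because "estimating willingness to pay from online behavior" can reduce to exactly that. Toy sketch with a made-up transaction format:

```python
from collections import defaultdict

def rank_targets(transactions):
    """transactions: iterable of (user, amount) pairs.
    Returns users sorted by total spend, highest first --
    i.e. the crude 'willingness to pay' ranking."""
    totals = defaultdict(float)
    for user, amount in transactions:
        totals[user] += amount
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)
```

Group, sum, sort. Calling that "prioritising targets using machine learning" is generous.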
But then again, maybe policy makers just didn't know what was already possible?