This is part of a much bigger evolution in the UX space. The change is part of Apple trying to unify its UX across tablets (where "paging" with a swipe is nice) and laptops. One of the core tenets of UX is consistency, and this is an attempt to start adding some consistency between mouse pointers and gestures. Long term, this consistency is a good thing. Short term, it's going to cause some pain.
Here's the thing: it's only going to get worse as more and more devices and interaction models work their way in. My company does bespoke app design and development across a lot of emerging platforms. We're building apps that work on connected TVs, game consoles, and mobile devices. Right now we have a project with an app on LG/Samsung connected TVs that needs to be navigated via both a traditional up/down/left/right/enter remote and the LG TV "Smart Remote" (basically a Wiimote-style pointer, i.e., a mouse). We're porting this app to the Xbox, where we need to support the console controller, Kinect gestures, and voice. Parts of the design are also making their way onto tablet devices with touch gestures. One app/design, five different input mechanisms.
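To make the idea concrete, a common way to tame this kind of input fan-out is to normalize every device into a small set of abstract navigation intents before anything reaches the UI layer. Here's a minimal sketch of that pattern; the names (`NavIntent`, `InputAdapter`, the adapter classes) are hypothetical, and this is not the poster's actual architecture:

```typescript
// Hypothetical input-abstraction sketch: each physical input source is
// mapped to a small set of abstract navigation intents, so the UI only
// ever sees intents, never raw device events.

type NavIntent = "up" | "down" | "left" | "right" | "select" | "back";

// Each device-specific adapter translates its raw events into intents.
interface InputAdapter {
  // Called by the app shell; the adapter invokes `emit` for each intent.
  attach(emit: (intent: NavIntent) => void): void;
}

// D-pad remote: key codes map one-to-one onto intents.
class RemoteAdapter implements InputAdapter {
  attach(emit: (intent: NavIntent) => void): void {
    const keyMap: Record<string, NavIntent> = {
      ArrowUp: "up",
      ArrowDown: "down",
      ArrowLeft: "left",
      ArrowRight: "right",
      Enter: "select",
    };
    window.addEventListener("keydown", (e) => {
      const intent = keyMap[e.key];
      if (intent) emit(intent);
    });
  }
}

// Pointer-style remote (or mouse): clicking a focus target collapses to
// the same "select" intent the d-pad produces.
class PointerAdapter implements InputAdapter {
  attach(emit: (intent: NavIntent) => void): void {
    window.addEventListener("click", () => emit("select"));
  }
}

// The UI consumes one stream of intents regardless of device.
function wireInputs(
  adapters: InputAdapter[],
  onIntent: (intent: NavIntent) => void
): void {
  for (const adapter of adapters) adapter.attach(onIntent);
}

wireInputs([new RemoteAdapter(), new PointerAdapter()], (intent) => {
  console.log("navigate:", intent); // route to focus management / selection
});
```

The payoff is that adding a sixth input mechanism means writing one more adapter, not touching every screen in the app.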
Consistency between gestures and the mouse is a terrible goal, absolutely terrible, because they're so different. Pinch-to-zoom and swipe-to-scroll work well on iOS devices, but they're horrible on trackpads because of the presence of a cursor. On a trackpad or a mouse you use your fingers to move a cursor, not to swipe and gesture.
Maybe it's just me, but mouse gestures, swiping, and scrolling are far too easily activated. Take the worst offender in this area: Google. There is no worse UI inconsistency, IMHO, than scroll-to-zoom on Google Maps. Absolutely atrocious. No matter how long I've used Google Maps, I cannot stop myself from using the scroll wheel (or a swipe) to move around the page, inadvertently zooming instead. Not only that, but trackpads and Apple's Magic Mouse are impossibly sensitive, scrolling, swiping, and gesturing seemingly at random. How anyone uses these, I cannot say.
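For what it's worth, the usual mitigation for inadvertent scroll-zoom is to gate the zoom behind a modifier key, so a plain scroll pans and only an explicit Ctrl+scroll zooms. A minimal browser-side sketch of that idea; the `map` object and its `zoom`/`pan` stubs are hypothetical, and this is not how Google Maps is actually implemented:

```typescript
// Hypothetical map object with zoom/pan stubs, just to make the sketch
// self-contained; a real map widget would supply these.
const map = {
  zoom(delta: number): void { console.log("zoom", delta); },
  pan(dx: number, dy: number): void { console.log("pan", dx, dy); },
};

const mapEl = document.getElementById("map")!;

mapEl.addEventListener(
  "wheel",
  (e: WheelEvent) => {
    if (e.ctrlKey) {
      // Explicit Ctrl+scroll zooms. (Most browsers also report a
      // trackpad pinch as a ctrl+wheel event, so pinch still zooms.)
      e.preventDefault(); // keep the browser from page-zooming
      map.zoom(e.deltaY < 0 ? 1 : -1);
    } else {
      // A plain scroll gesture just pans, so habitual scrolling
      // can never zoom by accident.
      map.pan(e.deltaX, e.deltaY);
    }
  },
  { passive: false } // required so preventDefault() is honored on wheel
);
```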