Hacker News

It's because macOS implements kinetic scrolling in the toolkit, not in the driver (like xf86-input-synaptics and some Windows drivers do). The driver itself on macOS isn't really exceptional.


>It's because macOS implements kinetic scrolling in the toolkit

Which is the proper way to do it.

Other vendors wanted to emulate macOS’s inertial scrolling but they had to stay compatible with all sorts of legacy APIs that only knew mouse events, so some of them cut corners and started sending these fake mouse events from the driver, to be the first on the market with “macOS-like scrolling”™.
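The driver-side hack being described can be sketched in a few lines: after the fingers lift, the driver keeps ticking a decaying velocity and synthesizes discrete wheel "clicks" that legacy clients cannot distinguish from a real mouse wheel. This is an illustrative model only — the constants and names are invented, not from any actual driver:

```c
#include <math.h>

/* Illustrative constants; real drivers tune these per device. */
#define DECAY_PER_TICK 0.95   /* exponential friction per tick */
#define CLICK_DISTANCE 15.0   /* accumulated pixels per synthetic wheel click */
#define MIN_VELOCITY   1.0    /* stop ticking below this speed (px/tick) */

/* Returns the number of synthetic wheel events emitted until the
 * inertia dies out, given the velocity at finger lift-off (px/tick). */
int emit_fake_wheel_events(double release_velocity)
{
    double velocity = release_velocity;
    double accumulated = 0.0;
    int clicks = 0;

    while (fabs(velocity) >= MIN_VELOCITY) {
        accumulated += velocity;
        while (fabs(accumulated) >= CLICK_DISTANCE) {
            /* in a real driver: post a wheel event to the input queue */
            clicks++;
            accumulated -= copysign(CLICK_DISTANCE, accumulated);
        }
        velocity *= DECAY_PER_TICK;
    }
    return clicks;
}
```

Note what this model loses: the client only ever sees coarse wheel clicks, with no pixel deltas, no phase information, and no way to tell inertia from a real wheel — which is exactly why it feels bolted on.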

Of course this is very much a bolted-on solution, which leads to all sorts of nuisances like what’s described in the parent comment, and contributes to the general sad state of trackpad behaviour on Windows and Linux.

They’ve pretty much cornered themselves, because fixing it would require a coordinated effort from the hardware/driver vendors, the OS vendors and the UI toolkit vendors. It’s just unlikely to happen unless you own the whole stack like Apple does.


Not really, at least not in the GNU/Linux case. The proper way is to do it in the driver (or in some HID abstraction of the operating system — so perhaps the compositor), while clearly differentiating such events in the client API, so clients can interpret them properly and ignore them when not relevant. Otherwise you fail to handle the case where you rest a finger on the pad to stop the scrolling, holding it steadily enough not to generate any cursor movement (the libinput+GTK combo fails here, btw). To fix that you would have to move all gesture processing (including basic scrolling) into the toolkit, which is a bad idea in an environment where multiple toolkits can be used at once — basically a recipe for a UX disaster.
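The differentiation being proposed can be sketched as scroll events that carry a source tag, so a client can treat synthesized inertia differently from real finger motion — including the "finger held perfectly still" case above. The enum and struct names here are illustrative, not any real API (Wayland's wl_pointer protocol takes roughly this shape with its axis_source and axis_stop events):

```c
#include <stdbool.h>

/* Illustrative event model: every scroll event says where it came from. */
enum scroll_source {
    SCROLL_SOURCE_FINGER,   /* fingers are physically on the pad */
    SCROLL_SOURCE_KINETIC,  /* synthesized by the OS after lift-off */
    SCROLL_SOURCE_WHEEL
};

struct scroll_event {
    enum scroll_source source;
    double delta;           /* pixels; zero-delta finger events are valid */
};

struct scroll_state {
    bool flinging;          /* kinetic scroll currently in progress */
    double position;
};

/* Called when the OS signals lift-off with residual velocity. */
void begin_fling(struct scroll_state *s) { s->flinging = true; }

/* A client that handles the "finger held still to stop the fling" case:
 * any finger-sourced event, even with zero delta, cancels inertia. */
void handle_scroll(struct scroll_state *s, const struct scroll_event *e)
{
    switch (e->source) {
    case SCROLL_SOURCE_FINGER:
        s->flinging = false;     /* touching the pad kills the fling */
        s->position += e->delta;
        break;
    case SCROLL_SOURCE_KINETIC:
        if (!s->flinging)
            break;               /* stale inertia after a cancel: ignore */
        s->position += e->delta;
        break;
    case SCROLL_SOURCE_WHEEL:
        s->position += e->delta;
        break;
    }
}
```

With plain mouse-wheel emulation none of this is possible: a motionless finger produces no event at all, so the client has nothing to react to.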


Well, at least Apple’s gesture handling API has a proven track record.

You can’t abstract away gesture processing because it’s deeply intertwined with the UI logic: inertial scrolling, the rubber band effect, concurrent gestures with complex exclusion rules between them, etc. If you try to abstract them away from the UI library, you’ll end up with a bloated meta-toolkit that still won’t cover all use cases, yet is inflexible and prone to ossification.


Apple's gesture handling API lives in a completely different environment.

But I generally agree with you. It's up to the client/toolkit to handle complex gestures. However, even Apple abstracts pointer movement and scrolling at the input level, and that abstraction includes inertial scrolling. The rubber band effect doesn't really intertwine input with UI logic (it can easily be encapsulated entirely in the widget itself), and for other gestures you're going to use some gesture-specific API anyway. The scrolling inertia is a function of the input, so the proper place to calculate it is on the OS side, not in the client/toolkit — it can live in the toolkit only when the toolkit itself is part of the OS. There is no such toolkit in GNU/Linux land.
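The claim that inertia is a function of the input can be made concrete: the velocity at lift-off is derived purely from the last few touch samples, which is information the OS input layer already has, with no UI state involved. A minimal sketch, with an invented sample format and history length:

```c
/* Illustrative: estimate release velocity (px/ms) from the most recent
 * touch samples. This needs nothing but raw input history, so it can
 * live in the OS input layer rather than in any particular toolkit. */
#define HISTORY 4

struct touch_sample {
    double pos;   /* finger position along the scroll axis, px */
    double t_ms;  /* timestamp, milliseconds */
};

double release_velocity(const struct touch_sample hist[HISTORY])
{
    double dt = hist[HISTORY - 1].t_ms - hist[0].t_ms;
    if (dt <= 0.0)
        return 0.0;
    return (hist[HISTORY - 1].pos - hist[0].pos) / dt;
}
```

Everything downstream of this number — the friction curve, the rubber band, when to cancel — is policy that the toolkit or widget can still own; only the velocity estimate itself belongs with the input.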


GNOME has never seemed afraid of changing the whole stack.



