There are underlying reasons, many of which can be gathered under the umbrella term "virtualisation". Modern computers use layers upon layers of software and hardware, each layer presenting the one above it with an interface that pretends things are simpler than they really are.
These abstractions, however, are leaky, and almost all of them are leaky in the temporal sense. There's a pretend continuity or constant throughput that's not really there.
Everyone knows that operating systems run short time slices of processes on physical cores, so that each program can "pretend" that it runs continuously on bare metal. Of course it doesn't really, so there are gaps in the flow of execution that end-users can occasionally perceive.
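You can see those gaps from inside a process without any special tooling. Here's a minimal sketch (the 0.1 ms threshold and the 2-second run are arbitrary choices, and timer interrupts or SMIs will show up alongside genuine preemption):

```python
# Spin on a high-resolution clock and record every iteration where the gap
# between consecutive readings is far larger than the loop's own cost.
# Those spikes are moments when this process did not run.
import time

def find_gaps(duration_s=2.0, threshold_ns=100_000):
    gaps = []
    end = time.perf_counter_ns() + int(duration_s * 1e9)
    prev = time.perf_counter_ns()
    while prev < end:
        now = time.perf_counter_ns()
        if now - prev > threshold_ns:
            gaps.append(now - prev)
        prev = now
    return gaps

if __name__ == "__main__":
    gaps = find_gaps()
    worst_ms = max(gaps, default=0) / 1e6
    print(f"{len(gaps)} gaps over 0.1 ms; worst: {worst_ms:.2f} ms")
```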
If that were the only sin of pretense, it could be worked around or carefully tuned, but in reality it's just one of many layers.
The garbage collector of a managed language (e.g. JavaScript in Electron) pauses execution within the process too.
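CPython's collector is not a JavaScript engine's, but the way you would observe the effect is the same: hook the collector and time how long it runs while the rest of the program waits. A rough sketch using Python's gc.callbacks:

```python
# Time each cyclic-collector run via gc.callbacks while a loop churns out
# short-lived objects that contain reference cycles.
import gc
import time

pauses = []
start_ns = [0]

def on_gc(phase, info):
    if phase == "start":
        start_ns[0] = time.perf_counter_ns()
    else:  # phase == "stop"
        pauses.append(time.perf_counter_ns() - start_ns[0])

gc.callbacks.append(on_gc)

class Node:
    def __init__(self):
        self.self_ref = self  # cycle, so the cyclic collector has work to do

for _ in range(500_000):
    Node()

gc.callbacks.remove(on_gc)
print(f"{len(pauses)} collections; worst pause: {max(pauses) / 1e6:.3f} ms")
```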
Even unmanaged languages have variable overheads when allocating or de-allocating from the shared heap.
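A sketch of that, too, though with a big caveat: it times allocations from inside CPython, which puts its own allocator on top of malloc, and the numbers include the cost of reading the clock, so treat them as illustrative of the long tail rather than of malloc itself:

```python
# Time a large number of individual heap allocations and compare the
# typical cost to the worst case. Each measurement includes clock overhead.
import time

samples = []
keep = []  # keep the blocks alive so the allocator can't just recycle one slot
for _ in range(100_000):
    t0 = time.perf_counter_ns()
    block = bytearray(1024)
    samples.append(time.perf_counter_ns() - t0)
    keep.append(block)

samples.sort()
median = samples[len(samples) // 2]
p999 = samples[int(len(samples) * 0.999)]
print(f"median: {median} ns, 99.9th percentile: {p999} ns, worst: {samples[-1]} ns")
```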
The desktop window manager helps each application pretend that it has a rectangular surface from (0,0) to (w,h), when in reality that surface is transformed and overlaid with everything else on screen. That can introduce a lot of variation, particularly because the DWM has its own threads and its own garbage collection and heap traffic.
The video card in turn is no longer just a block of memory mapped into the address space of the program doing the drawing, but is its own little computer with cores, threads, schedulers, locks, clocks, delays, and so forth.
The display in turn might further delay things because it has complex overdrive or scaling logic, so it needs to buffer frames.

It's turtles all the way down.
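All of that slop is measurable from user space before the compositor, the GPU or the display even get involved. Here's a sketch that tries to tick at a nominal 60 Hz with sleep-until-deadline and records how late each wake-up actually is; the layers further down add their own delays on top of whatever this reports:

```python
# Try to wake up every 1/60 s and record how far each wake-up misses the
# deadline. This only exercises the timer/scheduler layer.
import time

FRAME_NS = int(1e9 / 60)   # nominal frame period
lateness = []
deadline = time.perf_counter_ns() + FRAME_NS
for _ in range(300):       # roughly five seconds' worth of "frames"
    now = time.perf_counter_ns()
    if deadline > now:
        time.sleep((deadline - now) / 1e9)
    lateness.append(time.perf_counter_ns() - deadline)
    deadline += FRAME_NS

lateness.sort()
median_ms = lateness[len(lateness) // 2] / 1e6
worst_ms = lateness[-1] / 1e6
print(f"median lateness: {median_ms:.3f} ms, worst: {worst_ms:.3f} ms")
```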
> Everyone knows that operating systems run short time slices of processes on physical cores
The Amiga 1200 I mentioned earlier does pre-emptive multitasking on a 14 MHz CPU with 256 bytes of cache. I can switch between my IDE, my paint program, the OS desktop, a file manager and a simple text editor without any noticeable delay. It's as fast as flipping between desktops in FVWM on my PC (an operation which, incidentally, never seems to suffer from latency).
> The video card in turn is no longer just a block of memory mapped into the address space of the program doing the drawing, but is its own little computer with cores, threads, schedulers, locks, clocks, delays, and so forth.
The graphics architecture of my Amiga consists of several different chips, all timed to the PAL signal, and since they share memory with the CPU and other I/O they are also affected by constant interrupts.
> The desktop window manager
There's a DWM on my Amiga as well, called Intuition, providing several abstractions for programs to open screens and windows and render graphics and text in them. Plus, of course, GadTools, the system library for drawing UI widgets.
> The display in turn might further delay things because it has complex overdrive or scaling logic
The cheap, modern flatscreen connected to one of my Amigas upscales and upsamples _and_ does A-to-D conversion on the analog RGB signal, and yet it manages to show my double-buffered displays scrolling in one-pixel increments, with 50 Hz vsync, without stuttering or tearing.
Yes, the layers of abstraction have increased in number and complexity, but so has the speed of the surrounding architecture. My PC's clock speed is more than 100 times that of the Amiga, it has 4000 times more RAM (in fact the caches in my cheap CPU exceed the amount of RAM on the Amiga), displays are now connected to the GPU with a wide-bandwidth digital interface, and so on.
All of this could perhaps be a valid excuse if it were consistent. Yet typing in a Firefox <textarea> feels faster than typing in, for example, FocusWriter, and typing in an xterm feels faster still. I can paint smooth freehand curves in Gimp with instant feedback (something the Amiga is not always capable of, depending on how much bandwidth the selected resolution requires). The computer is capable of full-screen, full-frame, fully vsynced full-HD movie playback without stuttering or dropping frames.
The most interesting aspect is of course that a computer that might feel laggy in certain applications is fully capable of emulating an Amiga, complete with the perceived snappiness of the UI, despite all the overhead of emulation _and_ the supposed delays of the surrounding architecture.