I was one month into my first full-time job when, unaware of his rank, I challenged the CTO in a technical discussion - in a public email exchange. Regardless of the outcome, I was treated like an equal. That one short exchange has influenced not only the rest of my career, but my entire worldview.
Apple: m68k -> PowerPC (32), OS 9 -> OS X, PowerPC (32, 64) -> x86 (32, 64) -> Arm. They dragged giants like Adobe (kicking and screaming) through most of those stages.
Windows NT has always been portable, but it didn't provide any serious compatibility with Windows 4.x until 5.0 (Windows 2000). Around that time, AMD released their 64-bit extension to x86. Intel wanted to build their own; Microsoft went "haha no". By that point, Microsoft was effectively dictating the CPU architecture.
I guess at that point there was very little reason to switch. Intel's Core happened; Apple even went to Intel to ask for a CPU for what would become the iPhone - but Intel wasn't interested.
Perhaps I'm oversimplifying, but I think it's complacency. Apple remained agile.
Yep, but only because of Apple's terrible design. Take those same chips, put them in a machine with proper cooling, and they fly. It's frustrating when Apple fans blame that situation on Intel, when in reality Apple badly botched the design. It's almost like they purposely designed the last generation of Intel Macs to run hot and throttle, just so people had bad memories of them after upgrading to Apple Silicon.
I really like the principles behind AMD's chiplet design. Of course they had different design goals (easier diversification of their product portfolio), but the fact remains that you can slap a not-so-terrible GPU right next to a CPU core.
There's probably a lot still missing: Apple integrated the memory onto the same package, and built Metal so software could take direct advantage of that design. That's the competitive advantage of vertical integration.
> It's a fairly common issue on Linux to be missing hardware acceleration, especially for video decoding.

I had to enable GPU video decoding on my FW16, and I haven't noticed the fans on YouTube since.
I've worked in video delivery for quite a while.
If I were to write the law, decision-makers wilfully forcing software video decoding where hardware is available would be made to sit on these CPUs with their bare buttocks. If that sounds inhumane, then yes, this is the harm they're bringing upon their users, and maybe it's time to stop turning the other cheek.
I run Linux Mint Mate on a 10 year old laptop. Everything works fine, but watching YouTube makes my wireless USB dongle mouse stutter a LOT. Basically if CPU usage goes up, mouse goes to hell.
Are you telling me that for some reason it's not using any hardware acceleration available while watching YouTube? How do I fix it?
It's probably the 2.4GHz WiFi transmitter interfering with the 2.4GHz mouse transmitter. You probably notice it during YouTube because it's constantly downloading. Try a wired mouse.
Interesting theory. The wired mouse is trouble-free, but I figured that was because of a better sampling rate and less overhead overall.
Maybe I'll try a Bluetooth mouse or some other frequency, or put the laptop on wired Ethernet, to see if the theory pans out.
The easiest way is to use Chrome or a Chromium-based browser, since they bundle codecs with the browser. If you're using Firefox, you need to make sure you have the codecs installed. I know nothing about Mint specifically, though, so I can't say whether it installs codecs automatically.
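For VA-API-based setups (Intel/AMD GPUs; NVIDIA uses NVDEC/VDPAU instead), a rough way to check whether a working decode driver is present is to look for decode entrypoints in `vainfo` output. A best-effort sketch, assuming `vainfo` comes from your distro's `libva-utils` (or `vainfo`) package:

```python
import shutil
import subprocess

def hardware_decode_hint() -> str:
    """Best-effort check for a working VA-API decode driver."""
    if shutil.which("vainfo") is None:
        return "vainfo not installed (try the libva-utils package)"
    result = subprocess.run(["vainfo"], capture_output=True, text=True)
    # VAEntrypointVLD entries are the hardware video decode entrypoints.
    if result.returncode == 0 and "VAEntrypointVLD" in result.stdout:
        return "hardware decode looks available"
    return "VA-API present but no decode entrypoints; check your driver"

print(hardware_decode_hint())
```

Even with a working driver, the browser has to actually use it - in Firefox, the Media section of about:support shows whether hardware decoding is in effect.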
> [...] someone with serious name recognition like Linus Torvalds starts to lead that kind of effort [...]
Linus is a kernel hacker, and already busy tending to his own project.
"GNU/Linux" is effectively a committee of communities, with sometimes conflicting goals. It took Canonical and Valve to put things into shape on the desktop, and that's mostly because desktop was becoming less relevant.
I see two ways for things to change here:
- A massive, for-profit corporation, someone willing and able to challenge Google and Apple on even ground, becomes hell-bent on making a Linux-based phone (Microsoft failed even after acquiring Nokia);
- Another platform shift happens, making smartphones irrelevant in comparison (think: when smartphones displaced desktops).
Microsoft was stupid. In the EU they were slowly approaching 10% market share when they decided to kill Windows Phone, just as it was gaining momentum as the alternative for those who didn't want Android and weren't going to spend Apple money on a phone.
And the development experience was actually much better than Android's is, even to this day.
But that isn't coming back, especially after they killed all developer goodwill on Windows for everyone who invested in WinRT as a platform.
How much of that 10% was them basically paying OEMs and consumers to use Windows, which is what the Nokia deal amounted to? It wasn't sustainable.
Whatever benefit we'd get from a Windows Phone today, it's laughable to think Microsoft wouldn't be doubling down on exactly the sort of locked-down devices Apple has (and Google is now moving towards).
Their only vaguely "open" platform (Windows) is like that only because of legacy compatibility and customers; for anything new, Microsoft has always wanted to sell you an Xbox that could make phone calls. Try writing and deploying an app on that without a developer account.
> I really would like to have been payed to use Windows phones
I meant paid in the indirect sense of being the beneficiary of a loss leader for Microsoft.
I.e. I'm poking holes in your (somewhat unstated) premise that they'd already reached around 10% market share and could have just grown organically from there. As reporting at the time shows[1], the average selling price of these phones was €72.40.
So Microsoft (Nokia, but we all know who was really running and paying for the show) was spending a lot of money to buy its way into the market, barely holding on to double-digit market share for a while by subsidizing entry-level phones.
I went through 3 generations of Windows Phone devices for work. The only thing phenomenal about them was the Zune-style UI. They were buggy and unreliable, even for the few apps they had.
Considering that Google and Apple can use them too, it's unclear to me whether you think AI coding assistants will make it easier or harder for a third competitor to enter the field.
Apple is quite happy to patch & extend stuff: their ssh-add(1) accepts --apple-use-keychain and --apple-load-keychain, while vanilla OpenSSH doesn't even know what long flags are.
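For the record, Apple's patched OpenSSH exposes the same behavior as ssh_config options too (macOS only; `UseKeychain` is unknown to vanilla OpenSSH):

```
# ~/.ssh/config on macOS (Apple's patched OpenSSH)
Host *
  AddKeysToAgent yes
  UseKeychain yes
```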
I think it's entirely OK for long-established programs to adhere to their conventions; it's less surprising for the users. If you're going to change how things work, do so with minimum impact on the UI.
> [...] Linus has not bothered to write them down himself [...]
He's a kernel hacker and a technical leader; he doesn't write specs for userspace, and that's the least of his concerns. Linux has very strong guarantees on syscall backwards compatibility - on Linux, Go doesn't even bother using libc. This is all by design.
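As an illustration of how frozen that ABI is: the raw syscall numbers themselves are stable, so you can invoke the kernel by number (here via libc's syscall(2) trampoline, purely for convenience). A minimal sketch, assuming x86-64 Linux - the numbers differ on other architectures:

```python
import ctypes
import os

libc = ctypes.CDLL(None, use_errno=True)

# getpid(2) has been syscall number 39 on x86-64 since the ABI was frozen.
SYS_getpid = 39

# Invoke the kernel directly by number, sidestepping the libc getpid() wrapper.
pid = libc.syscall(SYS_getpid)
print(pid == os.getpid())
```

This stability is exactly what lets Go's runtime make syscalls itself instead of linking against libc.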
Even the name "GNU/Linux" was something the FSF tried to impose on distributions. The distributions, being distributions, were free to brand themselves as they wished, which is 100% fair under the license terms.
This ecosystem has always been a bazaar; if that's not for you, use a BSD (e.g. macOS).
This is not a judgment; it's simply a fact. Because of it, Linux relies on cross-distribution standards, and these should not be mistaken for cross-platform standards - which I believe TFA was doing - since that's not the purpose they serve.
I don't mind the ads as much as all the mandatory meta-baiting. Not the MB itself, but the mechanisms behind it.
Even if you produce interesting videos, you still must MB to get the likes, to stay relevant to the algorithm, to capture a bigger share of the limited resource that is human attention.
The creators are fighting each other for land, our eyeballs are the crops, meanwhile the landlord takes most of the profits.
Right, that's the issue. I really doubt creators love having to spam the same "Don't forget to like/subscribe/comment!" message in every single video they produce, but YouTube forces them to.
As a viewer I certainly hate that crap and wish Google didn't intentionally make it this way.