rollcat's comments | Hacker News

I was one month into my first full-time job when I (not knowing his rank) challenged the CTO in a technical discussion, in a public email exchange. Regardless of the outcome, I was treated as an equal. That one short exchange influenced not only the rest of my career, but my entire worldview.

Apple: m68k -> PowerPC (32), OS 9 -> OS X, PowerPC (32, 64) -> x86 (32, 64) -> Arm. They've dragged giants like Adobe (kicking and screaming) through most of those stages.

Windows NT has always been portable, but it didn't provide any serious compat with Windows 4.x until 5.0. Around that time, AMD released their 64-bit extension to x86. Intel wanted to build their own, and Microsoft went "haha, no". By then, Microsoft was the one dictating the CPU architecture.

I guess at that point there was very little reason to switch. Intel's Core happened; Apple even went to Intel to ask for a CPU for what would become the iPhone - but Intel wasn't interested.

Perhaps I'm oversimplifying, but I think it's complacency. Apple remained agile.


Agree. My 2017 MBP cooked its own battery (spicy pillow) by 2021.

My 2019 Thinkpad T495 (Ryzen 3600) does get hot under load, but it's still fine to type on.


Yep, but only because of Apple's terrible design. Take those same chips, put them in a machine with proper cooling, and they fly. It's frustrating when Apple fans blame that situation on Intel, when in reality Apple badly messed up the design. It's almost as if they purposely designed the last generation of Intel Macs to run hot and throttle, just so people would have bad memories of them after upgrading to Apple Silicon.

I really like the principles behind AMD's chiplet design. Of course, they had different design goals behind it (easier diversification of their product portfolio), but it remains a fact that you can slap a not-so-terrible GPU right next to a CPU core.

There's probably a lot still missing: Apple integrated the memory on the same die, and built Metal for software to directly take advantage of that design. That's the competitive advantage of vertical integration.


> Apple integrated the memory on the same die

It's on the same package, but not the same die.


> It's a fairly common issue on Linux to be missing hardware acceleration, especially for video decoding. I've had to enable gpu video decoding on my fw16 and haven't noticed the fans on youtube.

I've worked in video delivery for quite a while.

If I were to write the law, decision-makers wilfully forcing software video decoding where hardware is available would be made to sit on these CPUs with their bare buttocks. If that sounds inhumane, then yes, this is the harm they're bringing upon their users, and maybe it's time to stop turning the other cheek.


I run Linux Mint Mate on a 10 year old laptop. Everything works fine, but watching YouTube makes my wireless USB dongle mouse stutter a LOT. Basically if CPU usage goes up, mouse goes to hell.

Are you telling me that for some reason it's not using any hardware acceleration available while watching YouTube? How do I fix it?


It's probably the 2.4GHz WiFi transmitter interfering with the 2.4GHz mouse transmitter. You probably notice it during YouTube because it's constantly downloading. Try a wired mouse.

Interesting theory. The wired mouse is trouble-free, but I figured that's because of a better sampling rate and less overhead overall. Maybe I'll try a Bluetooth mouse or some other frequency, or put the laptop on wired Ethernet, to see if the theory pans out.

> Maybe I'll try a Bluetooth mouse

Bluetooth is also 2.4 GHz.


Or just switch to the 5 GHz or 6 GHz range.

The easiest way is to use Chrome or a Chrome-based browser, since they bundle codecs with the browser. If you're using Firefox, you need to make sure you have the codecs. I know nothing about Mint specifically, though, so I can't say whether it installs codecs automatically.

You specifically don't want to use the bundled codecs since those would be CPU decode only.

Straight up false. I have both Chrome and Vivaldi installed on Linux, and both have hardware video decoding on out of the box...

You can check by putting chrome://gpu in the address bar.


Interesting. I'll look into that more.

I'm using Brave, and it seems the "enable hardware acceleration" box is checked.

> [...] someone with serious name recognition like Linus Torvalds starts to lead that kind of effort [...]

Linus is a kernel hacker, and already busy tending to his own project.

"GNU/Linux" is effectively a committee of communities, with sometimes conflicting goals. It took Canonical and Valve to put things into shape on the desktop, and that's mostly because desktop was becoming less relevant.

I see two ways for things to change here:

- A massive, for-profit corporation, someone willing and able to challenge Google and Apple on an even ground, is hell-bent on making a Linux-based phone (Microsoft failed even after acquiring Nokia);

- Another platform shift happens, making smartphones irrelevant in comparison (think: when smartphones displaced desktops).


Microsoft was stupid. In the EU they were slowly approaching 10% when they decided to kill WP; it was gaining momentum as the alternative for those who didn't want Android and weren't going to spend Apple money on a phone.

And the development experience was actually much better than Android's is to this day.

But that isn't coming back, especially after they killed all developer goodwill on Windows OS for everyone who invested in WinRT as a platform.


How much of that 10% was them basically paying OEMs and consumers to use Windows, which is what the Nokia deal amounted to? It wasn't sustainable.

Whatever benefit we'd get from a Windows Phone today, it's laughable to think that Microsoft wouldn't be doubling down on exactly the sort of locked-down devices Apple has, and Google is now moving towards.

Their only vaguely "open" platform (Windows) is like that because of legacy compatibility and customers, but for anything new, Microsoft always wanted to sell you an Xbox that could make phone calls. Try writing and deploying an app on that without a developer account.


I really would like to have been paid to use Windows phones, especially as a former Nokia employee.

I was in Espoo, the week following the burning platforms memo.

However, it represented a third option, with a market share no Linux phone distribution has ever achieved since Openmoko.

Maybe Maemo could have been it, had it not been for the Nokia board's decision to bring in Elop.


    > I really would like to have been paid
    > to use Windows phones
I meant paid in the indirect sense of being the beneficiary of a loss leader for Microsoft.

I.e. I'm poking holes in your (somewhat unstated) premise that they'd already reached around 10% market share and could have just grown organically from there. As reporting at the time shows [1], the average selling price of these phones was €72.4.

So Microsoft (Nokia, but we all know who was really running/paying for the show) were spending a lot of money to buy themselves into the market, and just barely holding on to double digit market share for a bit there by subsidizing entry level phones.

1. https://www.theguardian.com/technology/2013/oct/01/microsoft...


> Maybe it's time for a third large phone OS [...].

Apple and Google conspired to never allow that to happen. They've pushed Microsoft out of that sector. Microsoft! Name a bigger challenger.


Microsoft pushed itself out of the sector by having a lousy mobile platform.

Microsoft had a phenomenal mobile platform. The only problem they had was that they failed to convince anyone to build apps for it.

> Microsoft had a phenomenal mobile platform.

I went through 3 generations of Windows Phone devices for work. The only thing phenomenal about them was the Zune-style UI. They were buggy and unreliable, even for the few apps they had.


The minor issue of not having any developers, developers, developers

GNU/Linux phones already exist, although they're indeed being harmed by the duopoly.

That was before AI coding assistants.

A language model will create the market force to displace an oligopoly in the most influential sector of our society?

Hedge your bets.


Considering that Google and Apple can use them too, it's unclear to me whether you think AI coding assistants will make it easier or harder for a third competitor to enter the field.

Lmao

Apple is quite happy to patch & extend stuff. Their ssh-add(1) accepts --apple-use-keychain, --apple-load-keychain; vanilla OpenSSH doesn't even know what long flags are.
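For illustration, Apple also exposes that keychain integration declaratively, via their documented ssh_config options (a sketch; the identity file path here is just an example):

```
# ~/.ssh/config on macOS (Apple's patched OpenSSH)
Host *
    AddKeysToAgent yes
    UseKeychain yes
    IdentityFile ~/.ssh/id_ed25519
```

With this in place, `ssh` loads the passphrase from the Keychain automatically, so you rarely need to invoke `ssh-add --apple-load-keychain` by hand. Vanilla OpenSSH rejects `UseKeychain` unless it's guarded with `IgnoreUnknown`.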

I think it's entirely OK for long-established programs to adhere to their conventions; it's less surprising for the users. If you're going to change how things work, do so with minimum impact on the UI.

(I wish their GUI teams understood that.)


> [...] Linus has not bothered to write them down himself [...]

He's a kernel hacker and a technical leader. He doesn't write specs for the userspace, that's the least of his concerns. Linux has very strong guarantees on syscall backwards compat - Go doesn't even use libc. This is all by design.
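As a minimal sketch of what that guarantee enables (my example, not from the comment): a Go program can issue a Linux system call directly, with no libc in between, precisely because the kernel's syscall ABI is stable:

```go
package main

import (
	"fmt"
	"syscall"
)

// getpidRaw invokes the getpid system call directly through Go's
// syscall package; no libc is involved. This is safe to rely on only
// because Linux treats its syscall ABI as a backwards-compat contract.
func getpidRaw() int {
	pid, _, _ := syscall.Syscall(syscall.SYS_GETPID, 0, 0, 0)
	return int(pid)
}

func main() {
	fmt.Println("pid:", getpidRaw())
}
```

On a BSD or macOS, by contrast, the supported contract is libc, which is part of why Go ends up calling into the system libraries there.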

Even the name "GNU/Linux" was something the FSF tried to impose on distributions. The distributions, being distributions, were free to brand themselves as they wished, which is 100% fair under the license terms.

This ecosystem has always been a bazaar, if that's not for you - use a BSD. (e.g. macOS.)


This is not a judgment. It is simply a fact. Linux uses cross-distribution standards because of this fact; these standards should not be misidentified as cross-platform standards - which I believe TFA was doing - as that's not the purpose they exist for.

> has not bothered

> This is not a judgment

"did not" is not judgemental but "did not bother" strongly implies judgement.

I do think it's true, however, that had Linus expressed an opinion on this it would have been hard to ignore.

Edit to add: Bear in mind that XDG is a relatively new thing compared to Linux (more so compared to Unix).


As a “BSD”, even macOS seems to be the odd one out.

Who writes .gitconfig?

I don't mind the ads as much as all the mandatory meta-baiting. Not the MB itself, but the mechanisms behind it.

Even if you produce interesting videos, you still must MB to get the likes, to stay relevant to the algorithm, to capture a bigger share of the limited resource that is human attention.

The creators are fighting each other for land, our eyeballs are the crops, meanwhile the landlord takes most of the profits.


There is plenty of data supporting the claim that asking for likes and subs actually increases likes and subs.

Right, that's the issue. I really doubt that creators love having to spam the same "Don't forget to like/subscribe/comment!" message in every single video they produce, but YouTube forces them to.

As a viewer I certainly hate that crap and wish Google didn't intentionally make it this way.


That is my entire point. The creators fight each other in a pit, for the Emperor's amusement.
