It _can be_ useful. It can also _not_ be useful to others. It sounds like it's not a choice in this case, but a forced feature, and that's fine for some and not for others.
No one used the word “must” until you did just now. The OP comment was pointing out a valid thing that Windows has that Linux does not. It’s fine if Linux doesn’t have it, but I don’t understand where you’re coming from in presenting this as though someone said Linux must have it.
That implies Linux must or should have an equivalent to those features found in Windows -- you can choose any word you like, friend. There is no other reason to make that statement but to challenge the fact Linux doesn't have those options.
Fun fact: I switched to Kubuntu recently and I didn't even have to install a graphics driver. It was just there and just worked, and my AMD 7700 XTX is running fine, playing Windows-"only" games via Proton just as well as Linux-native ones.
I'm simply trying to get people to think about design choices and questioning or stating why one thing is better than another.
That literally does not imply a need for those features. It points out a thing that Linux lacks, which is true. And that's where it stops. You are projecting an implication that "Linux does need x, y, or z because Windows has X, Y, or Z."
We're not sitting in a thread about what makes Linux or Windows better than the other; we're in a thread about factual differences between the two. You can talk about two things, compare them, and even state your own preference for one or the other without claiming that each should do everything the other can.
E.g. snowmobiles are easier to handle while driving in the snow than a Boeing 737. I like driving my snowmobile in the snow more than I like taxiing a Boeing 737 in the snow.
We can talk about things without implying changes that need to happen.
This line of thought is precisely why Linux continues to falter in mainstream acceptance.
Windows exists to enable the user to do whatever he wants. If the user wants to play a game or watch a video, Direct3D is there to let him do that. If he doesn't, Direct3D doesn't get in the way.
This is far better than Linux's (neckbeards'?) philosophy of Thou Shalt Not Divert From The One True Path which will inevitably inconvenience many people and lead to, you guessed it, failure in the mainstream market.
Contrast Android, which took Linux and re-packaged it in a more Windows-like way so people could actually use the bloody thing.
Not to contradict, but it seems to me that *nixes have always split user interaction and 'compute'. To them, running a headless toaster is probably more valuable than a desktop UI.
> Windows exists to enable the user to do whatever he wants
It's very bad at that, then, considering it insists on getting in my way any time I want to do something (_especially_ something off of the beaten path).
> If the user wants to play a game or watch a video, Direct3D is there to let him do that. If he doesn't, Direct3D doesn't get in the way.
I don't see the point you're trying to make; this is no different on Linux. What does D3D being in the kernel have to do with _anything_? You can have a software rasterizer on Linux too. You can play games and watch videos. Your message is incoherent.
>I don't see what the point you are trying to make is
Parent commenter said Linux shouldn't have <X> if it's not useful for everyone, though more likely he means for himself. Either way, he is arguing Linux shouldn't have a feature for dogmatic reasons. Violating the Unix ethos of doing only one thing, or something.
Meanwhile, Windows (and Android) have features so people can actually get some bloody work done rather than proselytize about their glorious beardcode.
You said "why must Linux have" a feature that can be useful to some and not useful to others. Taking that to its strongest conclusion[1], you're saying Linux shouldn't have something if it's not useful to "everyone" and asking for counter arguments; this is not unlike the "Do one thing and do it well." Unix ethos.
Clearly, as demonstrated by history, most people prefer that their computers can and will do the many things they need or want with minimal finagling. That is what having DirectX inside Windows means, and it's why Linux, which makes that a finagling option at best and flatly refuses it as heresy at worst, flounders.
> ... you're saying Linux shouldn't have something...
I said no such thing. You're taking a question and converting it into a statement in your own head.
Why must any operating system be designed with a 3D rendering engine compiled into it? It's just a question. I'm trying to learn. I've never once said it should or should not have the thing, I'm asking why would it need it? Why should it have an equivalent to Windows' implementation of such a thing? What do I gain? Is that always a good design choice? Is that true of Windows Server, and if so, why do I need 3D rendering baked into my Windows Server? What about Windows Server Core... does the NT kernel have it baked in there?
>It _can be_ useful. It can also _not_ be useful to others. It sounds like it's not a choice in this case, but a forced feature, and that's fine for some and not for others.
>So again, why _must_ Linux have an equivalent?
That is very different from simply asking why Linux should have a "Direct3D" built in the way Windows does.
>What do I gain?
To answer this again, in more depth this time: a central, powerful subsystem that can be assumed to exist. We can assume Direct3D is and always will be available in Windows.
One of Linux's biggest problems is that you can't safely assume anything will exist, in some cases not even the kernel. This is why containers were invented: you need to bring your own entire operating environment, because it's impossible to assume anything. The cost of this workaround is performance and complexity, the latter of which most users abhor.
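As a sketch of that workaround: a container image bundles its own userland precisely because nothing on the host can be assumed beyond a compatible kernel. The base image, package, and binary names below are purely illustrative.

```dockerfile
# Hypothetical example: ship the entire runtime environment with the app,
# assuming nothing about the host beyond a compatible kernel.
FROM debian:bookworm-slim

# Install the exact libraries the app needs, rather than hoping the host has them.
RUN apt-get update && apt-get install -y --no-install-recommends libvulkan1 \
    && rm -rf /var/lib/apt/lists/*

# 'myapp' is a placeholder for whatever binary you're shipping.
COPY myapp /usr/local/bin/myapp
CMD ["/usr/local/bin/myapp"]
```

The image carries its own libc, graphics libraries, and everything else, which is exactly the performance and complexity cost described above.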
>Is that always a good design choice?
Yes, it enables users thereof.
> Is that true of Windows Server, and if so, why do I need 3D rendering baked into my Windows Server? What about Windows Server Core... does the NT kernel have it baked in there?
If the server is a media server, say, having DirectX means the server can do encoding and decoding itself and that's something many people want.
Windows itself also needs Direct3D for rendering the desktop, which Server also obviously has.
I'm using Linux right now, and sadly I only have access to an 80x30 black-and-white terminal. I'm writing this comment as a raw HTTP request to this site. Send help. I just need colour and at least 1024x768... please help! I wish I could watch videos!
> If the user wants to play a game or watch a video, Direct3D is there to let him do that. If he doesn't, Direct3D doesn't get in the way.
I _just_ moved from Windows 11 to Kubuntu. None of that stuff is missing. In fact, unlike Windows 10/11, I didn't even have to install a graphics driver. My AMD 7700 XTX literally just worked right out of the box. Instantly. Ironically that’s not the case for Windows 10/11. This isn’t a “My OS is better than your OS” debate — we’re talking about why D3D being integrated into the kernel is a good idea. I’m playing devil’s advocate.
And thus, you missed my point: "Why should Linux have an equivalent to Direct3D" isn't me arguing that Windows having it is bad, it's me asking people to think about design choices and consider whether they're good or bad.
> This is far better than Linux's (neckbeards'?) philosophy of Thou Shalt Not Divert From The One True Path which will inevitably inconvenience many people and lead to, you guessed it, failure in the mainstream market.
If you think Windows having Direct3D "built in" is why it has mainstream dominance, then you potentially have a very narrow view of history, market timing, economics, politics, and a whole range of other topics that actually led to the dominance of Windows.
>I _just_ moved from Windows 11 to Kubuntu. None of that stuff is missing. In fact, unlike Windows 10/11, I didn't even have to install a graphics driver. My AMD 7700 XTX literally just worked right out of the box. Instantly. Ironically that’s not the case for Windows 10/11.
How did you install a driver on Windows if your GPU didn't work out of the box?
No. That's not true. It does not do that. I've reinstalled Windows 11 several times to resolve issues or try these kinds of things out. It has never offered to download an AMD driver for me. This is false.
Windows 10 can 100% download and install an nVidia proprietary driver for hardware it finds.
Indeed I inadvertently trapped it in a boot loop by fitting 2 dissimilar nVidia Fire cards with different GPU generations. This works on Linux if you use Nouveau but not with nVidia drivers.
Win10 lacks an equivalent of nouveau. It booted, detected card #1, downloaded and installed a driver, rebooted, and the card came up; then it detected card #2, which wasn't working, downloaded and installed a driver, and rebooted.
Now card #2 initialised but #1 didn't work. You can only have 1 nVidia driver installed at a time.
So, Windows downloads and installs the driver for card #1... reboots... #1 works, #2 doesn't... download driver, install, reboot...
The only way to interrupt this is to power off and remove one card.
When I replaced both cards with a single AMD card, it downloaded the driver and everything worked.
You are wrong. Source: my own personal direct experience.
Windows Update can and will grab most third-party drivers for your hardware if you let it; this includes video card drivers from Intel, Nvidia, and AMD.