Hacker News
Open Source on its own is no alternative to Big Tech (berthub.eu)
166 points by lhoff 77 days ago | 192 comments



I think the key point is this:

>> (By the way, all new software without accompanying support & guidance is doomed to fail. And if that software comes from a dominant player, you’ll just have to deal with that by the way.)

There's a temptation to conflate the software license with the software business. This is natural, but places software as the primary value in the chain.

From a business perspective, though, the software is a cheap part of the chain. And the least interesting part.

I don't pick, say, accounting software based on price. Or access to the source code. I base it on effectiveness. And a big part of that effectiveness is that staff can run it. And when it all goes wrong there's someone to call. I'm buying a -relationship-, not software.

That's why RedHat is a business. They're not selling Linux; they're selling reliability, longevity, services, support, etc.

In truth the license doesn't matter. My accounting software might be open or closed. My supplier doesn't sell me based on the license. They sell me by convincing me that everything just works, and when it doesn't they'll be there to fix it.


>That's why RedHat is a business. They're not selling Linux; they're selling reliability, longevity, services, support, etc.

>In truth the license doesn't matter.

It's funny to bring that up in the context of Red Hat who have started to circumvent the GPL by terminating their relationship with anyone who tries to actually make use of the rights granted by it. "The license doesn't matter" because they've found a loophole in it, but it clearly does matter in that they had to do so in the first place and weren't able to adhere to its spirit due to business concerns.

[1]: https://sfconservancy.org/blog/2023/jun/23/rhel-gpl-analysis...

[2]: https://opencoreventures.com/blog/2023-08-redhat-gets-around...


> From a business perspective, though, the software is a cheap part of the chain. And the least interesting part.

This is only true because, most of the time, businesses use a lot of publicly funded work without paying for it. If software development were entirely private, I'm sure businesses would find excuses for why, actually, it has to cost 100x what it would cost otherwise.

Everything you say about maintainability and stability is true. But writing software that can be operated as a service in the first place is substantially harder. It's just not as easy for a company to capture.


> They sell me by convincing me that everything just works, and when it doesn't they'll be there to fix it.

And they'd tell you to pay up 10x, or lose that stability in the future.

If it were open source software, you would have the option to go to a competing vendor.


Is there a single example of multiple vendors selling support for the same open-source piece of software, where I can just hire a different vendor if I no longer like my current one, without changing anything major in my operations?

You could say that Canonical and IBM RedHat compete on offering Linux support, but the reality is that it's not that much harder to switch from RHEL to Ubuntu than switching to any other OS, so I don't think this counts.


It works, but just for _some_ solutions. For instance, there are multiple providers of support for PostgreSQL. And there are many companies offering support/consulting for WordPress.

IBM RedHat is the steward of RHEL, and Canonical is of Ubuntu, so if you want support from them there are no real other options, but they do work with multiple different ISVs.

If you want to stay 'independent' and have the most leverage, you can pick a Linux that is not from a bigco, such as Debian.

I'm really worried about cloud lock-in for bigger companies. My previous company moved large amounts of product to AWS; when I asked how this was feasible, after doing back-of-the-envelope calculations, they said: well, you should not consider list price, nobody is paying list price, we get discounts.

This reminded me of an ISP I worked for in 2006 that invested in a large number of Solaris machines because they got big discounts, instead of going the much more obvious Linux route. Then after two years or so the new (Oracle at that time? I'm not sure) sales rep paid a visit, and when they were not able to sell MORE new servers, they said: OK, screw the discounts, from now on you're paying list price. So that got them stuck in a really bad place. I'm afraid the same might happen to companies who move to cloud providers as well.

And then I've not even touched on issues such as privacy, security, business continuity, and losing the skill to actually run your own hardware.
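The discount risk in stories like this is easy to put numbers on. A minimal back-of-the-envelope sketch (all prices and discount levels are hypothetical, purely to illustrate the mechanism):

```python
# Back-of-the-envelope: what a withdrawn vendor discount does to an annual bill.
# All figures below are hypothetical, chosen only to illustrate the mechanism.

def annual_cost(list_price_per_unit: float, units: int, discount: float) -> float:
    """Annual spend given a per-unit list price, a unit count, and a fractional discount."""
    return list_price_per_unit * units * (1.0 - discount)

list_price = 10_000.0  # hypothetical list price per server per year
units = 50

negotiated = annual_cost(list_price, units, discount=0.60)  # with a 60% discount
at_list = annual_cost(list_price, units, discount=0.0)      # discount withdrawn

print(f"With discount: ${negotiated:,.0f}/year")
print(f"At list price: ${at_list:,.0f}/year")
print(f"Increase: {at_list / negotiated:.1f}x")
```

The mechanism is the same whether the vendor is Sun, Oracle, or a cloud provider: the deeper the negotiated discount, the larger the multiplier the vendor holds over you once you're locked in.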


> losing the skill to actually run your own hardware

I see a trend in recent years of losing the skill of making proper software too: something that works, is usable, and is not about changing colors and pixels and protocols every day for the sake of it. Maybe the opposite of the everybody-can-program trend since 1995? Or maybe not; I'm not sure how ML-generated stuff will impact all this.

btw, there was a ~2021 talk/article by the same guy about how outsourcing everything leads to total deskilling:

https://berthub.eu/articles/posts/how-tech-loses-out/


Actually yes: SuSE sells support [1] for RHEL, Ubuntu LTS, and SuSE itself. Quotes: "SUSE Liberty Linux consolidates support for your entire Linux environment, including CentOS, RHEL and SUSE Linux Enterprise Server (SLES) distributions" and "Our world-class support team is trained to assist your entire mixed Linux estate — not just SUSE solutions."

[1]:https://www.suse.com/c/embrace-linux-diversity-simplified-mu...


Postgresql has many companies offering support for example.


Wordpress


Webhosting


>> If it were open source software, you would have the option to go to a competing vendor.

You miss the point. Enterprises don't go looking for another vendor. Vendors come to them with a sales offering.

If I'm running SQL Server then I pretty much know where I stand with Microsoft, and there are endless MS-approved support people.

With PostgreSQL some vendor has to come to me and convince me to switch. PostgreSQL is really well supported, and it's at least an option. 99% of Open Source though has 1 or 0 support entities, and 0 sales people.

Sure, with PostgreSQL I can do my own research. I might even have skills to do it myself. But now I have to explain my choices all the way up the ladder.

Am I going to use an OSS accounting system with no sales people? With no support people? Or am I gonna pay $99 a year or whatever for QuickBooks?


That's the main point. No one buys software, they buy solutions. Accounting is a good example. I use a SaaS solution, but it doesn't matter because I could also take all my invoices to an accountant, and the effect would be the same. Also, mixing open licenses into business doesn't usually make sense. I also think that mass source scraping for ML/AI training will make businesses less likely to participate in open source.


> I also think that mass source scraping for ML/AI training will make businesses less likely to participate in open source.

I totally agree with this. And not just businesses, individuals too.


Large corps aren't exactly well known for handling the Explore part of the Explore-Exploit tradeoff.

On the flip side, a lot of open source devs are going to get 100x more productive in the Exploit part than the average code monkey at a large corp.

Nothing is obvious or predictable about where that story goes in an ever-growing, ever-changing system.

Large corps will keep funding whoever gets the job done, while AI might replace a lot of large-corp activity, which is basically on the Exploit side of the tradeoff.


Years ago I tried to build a certification/service team out of independent software vendors and open source systems, i.e. you could buy 1 year of support for Apache httpd from any certified vendor (one who knew enough about httpd).

It's hard, but I still think that's the way to support OSS.


Software licenses may not consciously matter to end users, but they do have a huge impact on everything else. That is, the end user would not have the software, or would have vastly different software, if the licenses were different. Users just don't know and don't care about the licensing details and their effects, as with so many other technical aspects.


License does matter. Without OSS, computing as we know it doesn't exist. A better analogy would be if roads and utility cables were built as open source, everyone used them for free, then they were acquired by giant companies who charge for their use.


It would exist as I knew it until the 2000s, and hence why I see a parallel with where current non-copyleft adoption has taken us back.


That's the point. It wouldn't exist. It's not possible without research and OSS. You can't write off the entire foundation of CS before a date. Well you can, but it's ignoring history.


There was plenty of research at my university without OSS.


The whole internet is running mostly on OSS, so not really.


The Internet used plenty of closed source UNIX back then, and did just fine.


Even Microsoft used PDP-10s back in the day.


And, since when was Xenix open source?


> "Without OSS, computing as we know it doesn't exist."

The rise of the Internet and the dot-com boom happened largely without OSS, on proprietary UNIXes, proprietary web server engines, and proprietary database engines.

FAANG and other high tech businesses can easily afford very expensive servers and datacenters to house them thanks to the very very fat profit margins. They can also easily afford the cost of an OS license and other software tools.


This is nonsense. Consumer workstations were proprietary. The internet was made by government grants and us.


Not sure who “us” is meant to be, but the first Internet boom (1995-2000) used a whole lot of Solaris, Windows, and Cisco. Of course there was plenty of OSS too, but Linux servers, or Intel servers, weren’t the standard.

I remember visiting early Hotmail and their sharded “capital unit” was a great big Sun storage server and a bunch of Intel desktop towers running BSD. The latter was considered rather wild at the time.

Last I checked, some eBay URLs still had “ebayISAPI.dll” in them, which is a remnant of that period.


How is software a utility?


analogy: A similarity in some respects between things that are otherwise dissimilar.


What is the working analogy here? Where is the similarity?


The rest of us understood it.


Red Hat also does the verification against standards which allows them to be used: https://access.redhat.com/articles/compliance_activities_and... Not every distribution does/can.


So if Adobe open-sources all of their software tomorrow, that would not impact their business?

> In truth the license doesn't matter.

Come on. What matters is the way the business extracts value from you, and the license is part of that. Especially when the software you produce is so great that nobody needs to be called, because it just works.


I think the OP's framing is about the enterprise / government context, so Adobe maybe isn't the best example.

Still, "the licence doesn't matter", while probably a bit of an overstatement, is somewhat true. If my enterprise relies on an Adobe service, it's primarily about my relationship with them, not the product license.

... But of course, product price and therefore revenue will decline if competitors can sell my product too or customers can download and use it for free.


> If my enterprise relies on an Adobe service, it's primarily about my relationship with them, not the product license.

That is forgetting why your enterprise relies on an Adobe service. It is because nobody else has software that does their job as well as Adobe does.

This discussion is nonsensical to me. Of course the license matters.


>I don't pick, say, accounting software based on price. Or access to the source code. I base it on effectiveness.

So you would pick software costing 1 million over software that is 90% as effective but costs 1 thousand?
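To make the rhetorical question concrete, here is a crude effectiveness-per-dollar comparison (the function and numbers are hypothetical, just restating the question's figures):

```python
# Crude effectiveness-per-dollar ratio; numbers restate the hypothetical above.

def effectiveness_per_dollar(effectiveness: float, cost: float) -> float:
    """Effectiveness (0.0-1.0) delivered per dollar spent."""
    return effectiveness / cost

expensive = effectiveness_per_dollar(1.0, 1_000_000.0)  # 100% effective at $1M
cheap = effectiveness_per_dollar(0.9, 1_000.0)          # 90% effective at $1k

print(f"The cheap option delivers {cheap / expensive:.0f}x the effectiveness per dollar")
```

Of course, as the replies point out, "effectiveness" is rarely measurable in practice, which is exactly why this kind of ratio seldom decides real purchases.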


Weird question. There is no 100% effective software, there is no way to measure effectiveness (what sells in the enterprise is marketing and a good sales team, not "effectiveness"), and it all depends on the budget.

If it fits your budget and a commercial product has a good sales team (vs. a cheaper open source one with zero marketing), the commercial product is going to get chosen even if it costs infinitely more. That's basically IBM's and Oracle's playbook.


I haven't made any claim about 100% effective software. However, one piece of software can be less effective than another. Saying that one is 90% as effective as the other is just to help visualise things.

> what sells in enterprise is marketing and a good sales team, not "effectiveness"

I work for a SaaS company. Our marketing and product teams work hard to convince potential customers that our software fulfills their needs at a reasonable cost.

When we buy services and software we don't look at any of the marketing materials; very thorough cost/benefit analyses are made, and dollars and even cents are counted. Every cost that can be cut while we can still deliver something to our customers will be cut.


I run a SaaS company as well. We are outliers, especially if you are selling to other tech people, who are technically proficient and don't value their time very much; they'd rather spend hours writing their own half-arsed solution than paying someone else $5/month. I know how hard it is to sell to someone who believes your product is nothing more than "a shell script written over a weekend" (a paraphrased comment from my Show HN).

Outside of tech, that's not how it works. They don't have in-house developers. They don't read HN. When they need a database, they ring up a large vendor and spend hundreds of thousands a year on Oracle, when a PostgreSQL container would do just fine. They often don't even care that they could pay zero, simply because PostgreSQL doesn't come with a phone number to call when the DB crashes.


> they'd rather spend hours writing their own half-arsed solution than paying someone else $5/month

Paying someone else to solve your problems can also carry huge risks of all sorts. Sometimes investing 1000 dollars in a risk-free solution is better than 5 dollars a month with gigantic strings attached.


> In truth the license doesn't matter. My accounting software might be open or closed. My supplier doesn't sell me based on the license. They sell me by convincing me that everything just works, and when it doesn't they'll be there to fix it.

The license matters indirectly: if it's open source, you know that as a fall-back other suppliers might be able to step up and take over, if your original guys fail or get too insufferable.


Moreover, if you have an issue with OSS software and have competent people in your own IT team, they could attempt to fix the problem and get results faster than going through the whole incident-report-blame-the-victim-finally-have-them-confirm-your-repro-wait-for-next-version-release-which-hopefully-includes-the-fix ordeal. Then if you contribute the fix back to the project, the community of users benefits as well, with possible free publicity for your organization to boot.


That's the theory, but in practice that's not how it works.

Firstly, of course, most customers don't have any programming skills at all, so the point is moot. But let's limit ourselves to customers that do have IT departments.

It's worth noting that IT departments are already busy. Deploying resources because you found a bug in PostgreSQL is unlikely.

But ok, you found a bug. And we happen to have a highly paid C programmer on staff. Let's ask him to take a look.

He's not familiar with PostgreSQL's architecture, so it'll take a few days to download the source, make a build, hope the build is close to our version, and deploy to production. (This has already cost the business more than an enterprise support contract from Postgres, but whatever...)

He then spends a month working through various subsystems to determine the exact flow that leads to your bug. (I'm gonna ignore all the cases where the "bug" isn't a bug.) He makes a tweak, merges it into the current build, and deploys.

He even submits the PR, which may or may not be accepted. Until it is, he's regularly pulled from his tasks to update the PostgreSQL source and rebuild.

Everyone else plagues him with PostgreSQL questions twice a week now. His manager gives him a 'poor' rating at the next review because the thing he was actually hired to do is not getting done.

Unless the company is selling to other PostgreSQL developers, there's no "free publicity". That's not how publicity works.

So yes, your scenario is possible, but it's simply not how things happen. You complain to the IT department, and they add PostgreSQL enterprise support to the next budget. They're not looking to take on extra load unnecessarily.

Yes, the major contributors to big OSS projects are tech companies contributing full-time programmers. And that's a good thing. But customers do not have the time or resources (or inclination) to go down this road.

Frankly, it's too hard (in most cases) to just build the software, much less understand the code to make changes.


Yes, that's why I was carefully formulating this as a fall-back insurance for the worst case, not something most customers would do lightly.


It might seem like "fall back insurance". And I guess it does happen sometimes. But it's of negligible value in the _purchasing_ decision.

If we're worried about the supplier now, then we don't buy from them. If they suddenly change down the road (and most established ones don't), then the fallback is either "we don't need support anymore" or (more likely) we start looking for another system.

I've been on the other side of this. We sell a product into an established mature domain. We're the "newbie" on the block. Most of the sales we get in this space are from customers who are unhappy with their supplier. We offer better sales and service, and obviously a smooth transition from their existing data (which we import.)

Neither product is OSS - but even if it was that would be irrelevant to the user. (It would however have made our integration code a lot easier to write, so there is that...) A lot of the users we convert have "bought" their software. It's costing them nothing to use it. But they switch to us anyway (we have a subscription model) because our model can afford to fund full time support staff, whereas the sales model cannot.

So, I think this "choose another servicer" is more of a theoretical than practical feature for most OSS systems. Obviously there's really good support for the really big projects, but basically nothing from 99% of them...


> If we're worried about the supplier now then we don't buy from them.

This sounds just like a general argument against prenups and specifying any kind of contract penalties?

In any case: one way a supplier can make me less worried about them now is by open sourcing.

I mostly agree with most of your points.


That's true as long as you don't take _risk_ into account.

RedHat providing OSS-licensed software is _less_ risk than RedHat providing a proprietary closed-source operating system.


> In truth the license doesn't matter.

It only doesn't matter if you don't care at all about software supply chain risks.

This is not a sane position to hold in 2024.


I think the author is not well versed in technology economics.

IBM, to save its business, had to merge with Red Hat, almost 50/50, in 2018.

Microsoft, for the sake of its security and cloud offering, had to open source its .NET framework, acquire GitHub, and ditch Visual Studio for Visual Studio Code.

ARM is eating the world; it overhauled the x86_64 architecture and became the de facto architecture.

We can go on and on and on: the open source business model became necessary to survive in tech, not just to exist.

If you don't open it, they will eat you up.


Nice explanation, except IBM has been one of the largest Linux contributors since forever; they saw it as a means to reduce AIX development costs.

Linux only took off during the dotcom days as IBM, Oracle and Compaq started adopting it into commercial workloads, back in 2000.

Visual Studio Code isn't in the same ballpark as Visual Studio. It was already an Azure project, as the Monaco editor, and it was a way to kill Atom.

ARM is only successful on mobile devices and Apple hardware.

If you mean ARM on server, the most successful company, Ampere, is largely owned by Oracle, and there are some ongoing discussions about a full acquisition.


> ARM is only successful on mobile devices and Apple hardware.

Your "only" is funny. That is by far the biggest computing market worldwide.


Until phones get to replace laptops and desktops, it doesn't matter much.


We're literally watching this happen.


For some niche segments, yes.


If by “niche segments” you mean everyday computing tasks, sure.


Linux took off when PCs were finally able to run operating systems with virtual memory. All of a sudden devs did not need to pay for licences for C/C++ compilers and other dev tools, but most importantly they no longer had to pay tens of thousands of dollars for Unix workstations or servers. It coincided with the commercialisation of the Internet (which started as a non-commercial project funded by DARPA).


Linux took off AFTER PCs were finally able to run operating systems with virtual memory.

I was using VM systems running on PCs from 1989 (OS/2). Linux only started in 1991 and did not take off for, say, 10 years; by then Windows NT existed.

So VM was necessary for Linux but was not the reason it took off.

In my experience Linux came in for servers replacing other Unix servers. Windows NT servers continued for some time.

As for the desktop, you still need Excel and, to a lesser extent, Word, and these are still best on Windows.


Linux definitely replaced Unix servers. I remember calling a Digital Equipment Corporation rep in the UK in 1994 for a quote for a server and was told I'd need to pay a minimum of 100,000 GBP for a minimum running config. That's 200,000+ GBP in today's money for something that had less power and storage than a Raspberry Pi with a 32GB SD card. Yes, the price included the license for the operating system and the http server.


Linux was nowhere around when PCs were already running OS/2 and Windows NT; it was something to toy around with at home, for doing university homework, and only because Windows NT's POSIX support wasn't good enough.

Had Microsoft known better, Linux would never have taken off on the PC.

Paying for software was never an issue back then; piracy was quite common, and you could get whatever you wanted in the countries where street bazaars are a common thing.

Check the list, make your order, come around the following week.


AWS Graviton servers are ARM. These tend to be cheaper and more reliable than their Intel/AMD counterparts.


Don't confuse "giving software away for free because people have been conditioned to expect software that costs nothing" with "open source". And I have no idea why ARM is on that list: sure, they broke the Intel monoculture, but they certainly aren't free or open in any sense of the word.


> ditch Visual Studio for Visual Studio Code

How has Microsoft ditched VS for VSCode? VS is light years ahead in features and performance.

The two are not even remotely comparable. VSCode is a text editor that wants to be an IDE, but if you work with C++ or .NET you're shooting yourself in the foot if you use VSCode.

VSCode is not a serious alternative to VS or other IDEs like JetBrains Rider.


Exactly. Even if you do C++ development using VSCode on Windows, you are likely still relying on the MSVC compiler for IntelliSense (and of course compiling). And people who mainly write JavaScript/Java/Python/Go etc. have never used Visual Studio for development and never will. VSCode didn't replace VS; it replaced Notepad++/Sublime Text/Atom/Eclipse etc., plus IntelliJ-based IDEs for some people.


ARM is not open source.


ARM broke WIntel.


Smartphones broke Wintel, their ISA didn't need to be ARM as long as it was power-efficient.


And who cares? Functionally there is Apple ARM with its extensions and Qualcomm ARM with its extensions.

If anything, the x86 world was more open and more compatible. We have enterprise distros running on both Intel and AMD, supported by their hardware makers. Who in their right mind runs 3rd party Linux distributions on smartphones in production environments (i.e. the CEO's smartphone)?


Yeah, but how exactly is that relevant here?


Visual Studio Code is not open source.

GitHub is not open source.


I mean, technically true, but from context we can infer they were referring to vscode, which is open source. Visual Studio Code is vscode + MS stuff, but at its core the project is MIT-licensed, and it has recently been forked by a lot of teams (Cursor, Void, that fruit scandal, etc.).


Yet what "sells" VSCode is the closed-source features, not the open source core. Everybody uses VSCode today because of the collaboration features and Copilot integration. VSCodium is very niche.


Citation needed? I use VSCodium explicitly for its license, but my friends and coworkers who use VSCode don't use the collaboration features and Copilot anyway and could use VSCodium as well if they cared. What sells VSCode is that it's a nice extensible cross-platform IDE filling a niche between "just use vim" and "full-blown JetBrains IDE for every language that you use".


https://ghuntley.com/fracture/ - "Visual Studio Code is designed to fracture"

> Whilst Visual Studio Code is "open-source" (as per the OSD) the value-add which transforms the editor into anything of value ("what people actually refer to when they talk about using VSCode") is far from open and full of intentionally designed minefields that often makes using Visual Studio Code in any other way than what Microsoft desires legally risky...


Do you by any chance work with Python? If I recall correctly, the default Python extension for VSCode doesn't work out of the box in VSCodium, as it's pulled from MS servers, which VSCodium does not allow. I think you need to enable this connectivity on your own. So my understanding is that by "crippling" the market this way, it's more convenient for people to just use VSCode.


I mean, I used Pyright (the free one) in Emacs for years, and it was fine. Pip-installable et al. Maybe this all happens automagically in VSCode?


ARM isn't open source. Companies like MS and Google use OSS that complements them, but the core money makers are closed source and closely guarded.


Open source won those battles, but the war doesn't end. The next fight is AI, and thanks to a leak we have open source (weights and inference) models now.

Without that leak we would not have the ecosystem evolving around Llama.


Meta was always planning to release LLaMA to the public. They were literally sending LLaMA 1 to anyone with a ".edu" email.


Bell Labs did the same with Unix, but Unix was still not open source. This is why we run GNU/Linux today, not Unix™.


There is still enough Unix™ around, including from a well-known fruit company.


No, macOS is FreeBSD, not the Unix from Bell Labs.


macOS is many things, a bit of NeXTSTEP, a pinch of Mach, q.b. BSD, and a UNIX™ certification from OpenGroup.


A UNIX™ certification is not the same as being actual code from Unix™.

NeXTSTEP was 4.3BSD plus Mach, using Display PostScript as its windowing system and TIFF as its image format, supporting transparency for icons. macOS is FreeBSD plus Mach, using Display PDF as its windowing system and PNG as its image format, supporting transparency for icons. Basically NeXTSTEP, but with every component upgraded to its then-modern equivalent. (Except Objective-C; they kept that.)


Where did you say that Unix™ meant the original source code, and not UNIX™ as defined by the Open Group, the owners of UNIX™?

> Bell Labs did the same with Unix, but Unix was still not open source. This is why we run GNU/Linux today, not Unix™.

I know pretty well how NeXTSTEP used to be, my graduation project was to port a visualization framework from NeXTSTEP/Objective-C to Windows/C++.


The issue at hand was me making an analogy about how Meta releasing LLaMA to many .edu addresses still would not mean that LLaMA would be actually used widely, since the Unix™ source code was similarly released by Bell Labs, but the actual Unix™ source code did not end up being the code which we now use.

The fact that UNIX™ later went on to become a compatibility specification, not a specific implementation, is irrelevant to the analogy.


I work in the health sector at a company with nearly 1,000 employees. In our IT department, we rely on a wide range of proprietary software and spend substantial amounts on Oracle, MS SQL, and other licenses. I’ve been trying to convince management that PostgreSQL could be a solid alternative for many of our use cases, but it’s consistently dismissed as “not an option.”

Meanwhile, we continue to pour money into Oracle licenses, not just for basic access but for additional features—like enabling data reading and analysis on the Oracle-embedded database in our main app. And, if we need to allocate more CPU cores on our VMs, we face yet another round of licensing fees.

Sometimes you don’t need much support. Yet pay tons of money.
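Per-core licensing is exactly why the VM-sizing decision above turns into a budget decision. A minimal sketch of the mechanism (the unit price, the 0.5 core factor, and the rounding rule are illustrative assumptions, not actual Oracle terms):

```python
import math

# Sketch of per-core database licensing: license count scales with allocated cores.
# The unit price, core factor, and rounding rule here are illustrative assumptions.

def license_cost(cores: int, price_per_license: float, core_factor: float = 0.5) -> float:
    """Licenses required = cores * core_factor, rounded up, times the unit price."""
    licenses = math.ceil(cores * core_factor)
    return licenses * price_per_license

price = 47_500.0  # hypothetical per-processor license price

for cores in (8, 16, 32):
    print(f"{cores} cores -> ${license_cost(cores, price):,.0f}")
```

Under this model, doubling the cores on a VM doubles the license bill, which is why "just allocate more CPU" is never a free operation.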


Every time I hear a story like this - "management says 'no'" - I wonder if anyone cared/dared to ask follow up questions.

Why was PostgreSQL not an option according to management? I would not take their dismissal at face value. I'd want to know why not. But that might be Dutch culture.


Same, but not necessarily due to nationality. It very much depends on the company culture and the interpersonal relationships in question.


Insurance. The piece of paper that states who the buck stops with when there's a claim or lawsuit.


Let's say that is the reason, then that seems a very valid reason, right? Accountability is important, especially in large organizations.

I guess what ticks me off a bit is the trope of dissing on management for saying 'no' but leaving out all the relevant context that might show management to be right.

Don't get me wrong, I've seen my own fair share of bad management. But it's not so black and white when it comes to grand sweeping decisions like "should we invest in PostgreSQL or keep paying Oracle big license fees?"


> Accountability is important, especially in large organizations.

Why? So we'll have someone to blame when things inevitably go wrong? In my experience, the people who like to say the buck stops with them are nowhere to be found when that happens.

I much prefer organizations focussed on actually getting things right, instead of worrying about who takes the blame.


I'd say it's also to stop people from doing dumb things (i.e. proactive defense).

Say, if the org runs Postgres in-house, there's a mighty chance that an intern somewhere might decide to ...test things out in a creative way.

Perhaps the idea of outsourcing that to Oracle is that Oracle has the processes/controls to rein in such interns. As opposed to e.g. a hospital having to create such processes/controls.

(Oracle is still a bad idea IMHO, just slightly less so comparatively)


I once had a few drinks with an Oracle salesperson many moons ago, and he did a pretty good job of convincing me that they sold risk minimisation over personnel changes to my uber boss, which honestly makes more and more sense to me as I continue in my career.


I doubt that starting a lawsuit against Oracle because their software didn't work as it was supposed to would ever end well.

Do you know of any examples of that happening?


Crowdstrike.


It's a US approach; it's not seen the same way in Europe/internationally.

A good example is the GIS industry where ESRI (ArcGIS) dominates. In Europe the open source qGIS is generally an acceptable alternative despite less 'support'. In America it's hard to find anyone using qGIS, and ESRI is basically a monopoly.


Is the issue only support though? Oracle has a lot of features that PostgreSQL doesn't.


> The regular IT environment in the European Parliament is managed by whole teams of professionals, it comes with training, and is supported by Microsoft partners and ultimately by Microsoft itself. There are also large amounts of computing power available to make things work well.

> An Open Source experiment meanwhile is typically operated by an enthusiastic hobbyist with borrowed equipment. Rolled out without training and without professional support, by someone who likely did this for the first time, it’s no wonder things often don’t work out well.

> After the experiment, the faction was disappointed and concluded that Nextcloud was no good. And that was also their lived experience. “Let’s not do that again!”

This is a rhetorical trick known as implication or insinuation. By presenting information indirectly, the author prompts readers to make a connection themselves without explicitly stating it.

The author implies that the European Parliament's failed experiment with Nextcloud was due to a lack of professional resources and expertise, suggesting it was handled similarly to typical open-source projects led by hobbyists without proper support. However, he doesn’t provide any factual evidence that the Parliament’s Nextcloud experiment actually lacked professional resources, training, or adequate equipment. Instead, he hints at this by describing common issues with open-source setups, leaving readers to assume the experiment suffered from similar shortcomings.

I would have appreciated some facts, or even sources for his claims, but there are none. And I couldn't find any information about the Nextcloud deployment having failed.


I always hear people say things like there needs to be support for the thing I'm using or, it costs time to implement open source.

I hate to break it to you but it takes time to implement closed source solutions as well. They also always have terrible documentation, because they make money on support.

Purely open source stuff lives and dies on how easy it is to start up.

Closed source paid stuff doesn't need to be easy. Often a decision has been made before implementation, and there are people to help you through it.

It's also easier to get approval for open source most of the time because there isn't a new bill, just my time.

I usually reach for open source first.


You mention that an experiment with Nextcloud failed? I cannot find any evidence of that; moreover, I see it widely used among governments and municipalities in the EU.


Tangentially, although there have been sporadic setbacks, such as LiMux[1] in 2017, there are new commitments to Linux[2] that I hope will lead the way, at least in Europe.

[1]https://en.wikipedia.org/wiki/LiMux

[2]https://arstechnica.com/information-technology/2024/04/germa...


I can tell you that a few years ago a couple of NRW libraries used a SuSE variant in kiosk mode. Apparently not everyone found that great, as some of the ones I regularly visit now run Windows in kiosk mode, with the usual set of Office, Adobe and other packages.


I’m aware of a party in Germany which, at some levels, uses Nextcloud to great success, so I could imagine them pushing for it in their faction. No idea why that wouldn’t work though, given that they have tons of experience


I’m also aware of one Drupal failed experiment somewhere in the same organisation…


Another elephant in the room is that many of the popular open source projects are funded by big tech.

Hard to be an alternative when you serve the same master.


It's more of a symbiotic relationship. The open source community depends on commercial support. Essentially all of the bigger projects indeed get a lot of their contributions from the companies that use, build, and depend on these projects. It's how the software world can collaborate with their competitors on the things they don't compete directly on.

This isn't charity; they use far more OSS software than they produce themselves, by several orders of magnitude in most cases. Companies like Google have many millions of lines of proprietary in-house code. But they depend on an even larger amount of code in OSS form.

E.g. Android and Chrome OS are based on Linux. Those products are built on many thousands of open source packages. And of course Google is contributing to lots of them and created a few themselves. Chrome is open source because webkit was open source because Apple forked KHTML from the KDE project.

Open source without commercial companies contributing would be much more of a fringe thing.

VC funded OSS companies are a bit more challenging. These companies are perpetually confused about their licensing and need to keep things proprietary and open at the same time. These projects get a lot of attention because of the VC money but technically they only represent a tiny fraction of the OSS community.


> Companies like Google have many millions of lines of code in proprietary in house code. But they depend on an even larger amount of code in OSS form.

I don't think this is actually true:

1. The Google codebase is on the order of billions of lines of code, not millions.

2. It's basically all written in house, from the threading libraries and core standard libraries up. The parts that are open source (e.g. Linux, OpenJDK) are very small compared to the code they've written themselves.

ChromeOS and Android are open source, but they aren't even close to being the bulk of their codebase.

If Linux had never existed they'd have found some alternative, probably either a bulk licensing deal with a proprietary UNIX vendor or they'd have used Windows as the closest cheap Intel based alternative. Then they'd have put funding into developing their own in-house serving OS a lot earlier.

Source: I worked there.

> Chrome is open source because webkit was open source because Apple forked KHTML from the KDE project.

Chrome is open source for strategic reasons and because the executives in charge wanted it to be. There's no particular reason it has to be open. Safari and Edge aren't.


WebKit is still open source. Apple never owned the full copyright to it and they would have to do a complete rewrite to get out of that license, which is GPL and always has been. Safari is indeed closed source. But it uses WebKit. For the same reason, Google is stuck with the same license and copyright situation. Whether they like it or not, webkit/khtml/chromium are forever GPL. Nothing short of a complete rewrite can fix that.

I think you are grossly underestimating how dependent both companies are on various open source projects. It's definitely true that they also do a lot of in house code of course; and they also contribute a lot of their own projects. And of course especially Google is a repeat offender when it comes to creating a lot of dead projects, reinventing the wheel, etc. It looks like their attempt to create their own operating system kernel is slowly dying now. Fuchsia is all but dead at this point. So they are back to Linux being their only future. There's the whole Kotlin ecosystem, which they helped create, which is starting to compete with flutter. And so on.


WebKit isn't GPL, it's LGPL, which is why Safari is closed source.

I'm not really underestimating anything. I worked on the Google codebase for years. It has very little dependence on open source code relative to its overall size.


> Open source without commercial companies contributing would be much more of a fringe thing.

My conjecture is that open source is polished enough for most customers to use when there are commercial interests implied. Linux on the server is a resounding success, Linux desktop not so much.


Even on the server you're better off with the custom cloud offerings than using an off-the-shelf image.

That is the thing: what really won on server and embedded was UNIX/POSIX, and while GNU/Linux is the cheapest and most flexible way to achieve that, it isn't the only one, and the best experience is in any case with vendor custom distributions with their special sauce, not the pure FOSS one.


What benefits do the vendor distros provide? The downsides include at least increased vendor lock-in.

I think one can attribute many things to the success of Linux (incl. POSIX) but it's not about one single thing and the whole shouldn't be discounted.


Integrations with the hyperscalers' infrastructure that aren't available out of the box in regular distros.

It is no different than using managed Kubernetes, or doing everything from scratch instead.


Even on the server, the commercial distributions dominate. Probably because of things like support.

And I wouldn't agree Linux desktop is unsuccessful. It's actually growing quite a bit in the last few years. And of course ChromeOS is also Linux based and capable of running Linux software. Likewise, MS bundles Linux with Windows and it is widely used by developers using Windows that way.

But even without that, Linux market share is now 4.5%. Quite a few gamers are discovering lately that Linux works pretty well. With ChromeOS included it's closer to 6-7%, and Linux on the desktop is bigger than ChromeOS. I'd say it is getting better.


I disagree about the desktop not being a success but I guess it depends on how you define success. It's true that it's not as popular as some other OSes.


That's the „commodify your complements“ strategy


I cite that essay always when "but this should be open source" comes up in a corporate context...

https://gwern.net/complement



And it's also not great when companies positioned as open source on the implementation side and closed on the cloud side (read: AWS/Azure/GCP resellers) also construct strange licenses that are inherently against traditional OSS values.


> Another elephant in the room is that many of the popular open source projects are funded by big tech.

People have to put food on their table and can't work for free. Someone has to pay for that work. Nobody will pay for it if they can't extract some benefit from doing so.


It was a pipe dream, because at the end of the day not everything can be a side job competing against those that spend at least 8h a day producing code.

Then the whole issue with non-copyleft licenses, that are nothing other than the old Whateverware or Public Domain licenses from the 16 bit home computer days.

We already had access to source code back then.

And for a large crowd this is already good enough, they aren't into it for religious definitions.


IDK about 'open source', but 'libre software' is what fueled tons of proprietary software from the 80's until today. Without that software, tons of proprietary software (even console games) wouldn't even exist.

I remind you all that Emacs powered some German airline's ATC in the early 90's, and it used to be used at Amazon for tons of stuff thanks to its easy widget UI for achieving tasks with very little Elisp.


The problem is big tech can offer free-as-in-beer hosted services.

You can use Google docs for free so it takes some dedication to self host that and pay for the server.

Now if big tech charged for everything things would be more like the old days where you might use small tech, such as a local hosting provider that does open source installs.


Been ranting about this for years. I did a keynote about it, two in fact, at several venues, including an open culture festival, and all I got was a silent dismissal.

Every now and then open source is suggested as superior because it's free, with zero comment on code quality, who wrote it, or why it came to be in the first place.

Even the argument that a host running open source makes delivery more trustworthy is super biased; the major cognitive dissonance is that services based on open tech are very often neither open nor auditable.

There’s a lot of open source being controlled by same large corporations and the part that is not, does not constitute a service on its own.

Then we must admit it takes a lot of effort to care for services nobody else cares about (by means of support).

While open source is important for academia, I think open results are more important for government. Like I don’t care what somebody used to cater to this geospatial data, or that image. I care about the data that went in and went out. Open data is much more important in the era of open weights and closed sources training sets.

The general public is often misled to equate open source with free beer. Well, that is also not entirely correct given the plethora of not-so-free licenses. Also not correct because costs are greater when you put the personnel running the service into the equation. I can see how this argument does not fly well with socialist ideologies, but that's a problem of ideology, not of costs or technology.

Even if we consider only those open projects which are also free - these come with fewer guarantees than a pair of second-hand shoes bought from a random store.

Don’t get me wrong - open source is great and we use it daily, but comparing means of distribution with quality of service is really like comparing ябълки и круши (apples and pears in Bulgarian). So it’s indeed time to stop blindly waving the open source flag, but actually try to understand the benefits and challenges it comes with.


> Experimenting is useful, but know that Open Source is the underdog, and there are many people waiting for an opportunity to enthusiastically declare that it has failed.

almost the entire world and industry is literally running on open source.


Countries have run nationalised infrastructure before, and successfully. The problem is if they did not view it as nationalised infrastructure and instead viewed it as some sort of manna that would fall from open source heaven.

Open source software is the building blocks used by large rent (service fee) seeking corporations. They will extract large profits from any of these contracts and that is a demonstrable fact, they are also nearly all from the USA and so those profits will flow in one particular direction. It is also a historical fact that governments have run successful large scale infrastructure. Make your choice.


The point of going to big businesses for software services and support is that most customers don’t have needs that are large enough to justify the full-time staff needed for top-notch support. So companies that provide services amortise this over many customers and can employ n dozen full-time staff for a particular subsystem when the average customer might only need them a few times a year. So the tradeoff makes sense – even with a big profit margin, the customers still save money compared with DIY.

This logic doesn’t really hold when it comes to large governments. Their needs are large enough that they can justify employing specialists. At that point, the profit margin the service business is capturing is just inefficiency. Internal services should be more common in large governments.


Theoretically the advantage of outsourcing to a business instead of running it inhouse is that you can put it out to tender, picking the most competent of the entire industry, whereas your inhouse team is what it is.

In practice, Microsoft isn't going anywhere. You're just paying for an external inhouse.


If you're a government you absolutely have the ability to compete with big tech, it's "just" a matter of political will. If you decide that it's important enough you can hire competitively from the same talent pool. Strategically it makes little sense to depend on another nation's companies to run your critical infrastructure. You have to own your dependencies.


It depends? Sometimes it is the feature set, but for example with future VMware pricing it would be cheaper for us to hire two full-time staff and run Proxmox, which is currently being evaluated.


This depends on the level of support you need. Two members of staff might be fine for non-critical systems but it’s not enough to support anything that needs to be up 24/7. There’s not enough coverage and less than zero slack. If my alternative is to hire two people, I would rather spend the money with a company that is large enough to employ more people, in different timezones. But if most support can be handled by a larger body of existing staff and you only need specialists occasionally, then it might make sense.


But it does run it in a particular way that isn't necessarily as profitable, which in this example is a good thing.

imho the question should be if the country continues to function if the project goes bankrupt. If it is so essential that it needs to be saved by the government (even in theory) then it lives outside the domain of capitalism.


The problem is that docker compose starts 20 containers and the fans go full bore just because you wanted to try a new wiki or notes app. The complexity of relatively simple software is getting insane.


I am one of the 10 users[1] in the world that uses Docker Swarm (container orchestrator like Kubernetes or Nomad), and I disagree with that statement. I have over 25 containers running (including Jellyfin, Nextcloud, Gitea, TeamSpeak, ...) and it barely uses 10GB of RAM (Jellyfin is eating up more than half of this).

Most of my Compose files contain 2 services (1 for app + 1 for database), but some contain 3 and some contain 1. It's incredibly easy to install new software and incredibly easy to shut it down temporarily if you don't need to use it all day.

I'd even argue that some companies would benefit more from using Swarm than Kubernetes. There are a lot of things to take into account when using Kubernetes (let alone setting it up for GitOps), but Docker Swarm can be managed by 1 person.

[1]: A joke, obviously, but it really isn't popular at all
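For readers unfamiliar with the two-service pattern described above (one app service plus one database service per Compose file), a minimal Swarm stack file might look like the following sketch; the image names and environment variables are illustrative, not taken from the commenter's actual setup:

```yaml
# stack.yml, a hypothetical two-service stack; deploy with:
#   docker stack deploy -c stack.yml wiki
version: "3.8"
services:
  wiki:
    image: requarks/wiki:2          # example app container
    ports:
      - "8080:3000"                 # published through Swarm's ingress routing mesh
    environment:
      DB_TYPE: postgres
      DB_HOST: db                   # resolves to the db service on the overlay network
  db:
    image: postgres:16
    environment:
      POSTGRES_PASSWORD: example    # use Docker secrets in real deployments
    volumes:
      - wiki-data:/var/lib/postgresql/data
volumes:
  wiki-data:
```

Note that `docker stack deploy` ignores Compose's `depends_on`, so the app container is expected to retry its database connection on startup.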


I think k8s can be managed by one person too, if you are only using it like docker swarm. Especially if you use something like k3s with SQL database.

I found setting up gitops via flux quite easy, apart from order of operations, like installing controllers and custom resource definitions before resources that need those CRDs etc.

What were you thinking of things to take into account for k8s over swarm?

The main difference for me is k8s needs a hell of a lot more boilerplate yaml for doing basically anything.
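To illustrate the boilerplate point: a rough Kubernetes equivalent of a two-line Compose service needs at least a Deployment plus a Service (and typically an Ingress on top) before anything is reachable. A sketch with made-up names, not a production manifest:

```yaml
# Roughly what a single Compose service expands to in Kubernetes.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: wiki
spec:
  replicas: 1
  selector:
    matchLabels:
      app: wiki
  template:
    metadata:
      labels:
        app: wiki
    spec:
      containers:
        - name: wiki
          image: requarks/wiki:2   # illustrative image
          ports:
            - containerPort: 3000
---
# A Service is still needed just to give the pod a stable address.
apiVersion: v1
kind: Service
metadata:
  name: wiki
spec:
  selector:
    app: wiki
  ports:
    - port: 80
      targetPort: 3000
```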


> Especially if you use something like k3s with SQL database

I'll admit I've never used K8s outside of work very much, so I can't really argue on that!

> What were you thinking of things to take into account for k8s over swarm? The main difference for me is k8s needs a hell of a lot more boilerplate yaml for doing basically anything.

I think that's a big one, yes. Stateful services (i.e. volumes) are also much easier to set up and understand with Docker Swarm - which is the same as Compose. The routing mesh[1] is also lovely. I didn't use the Kubernetes routing mesh at work because the infrastructure department didn't allow us to, which is one reason I was arguing against it; we used a very powerful and complex system without profiting from one of its most powerful features.

[1]: https://docs.docker.com/engine/swarm/ingress/
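The routing mesh linked in [1] is what lets any node in the swarm accept traffic for a published port. The long-form port syntax makes the two publishing modes explicit; values here are illustrative:

```yaml
services:
  web:
    image: nginx:alpine
    deploy:
      replicas: 2
    ports:
      - target: 80        # container port
        published: 8080   # port opened on the swarm
        protocol: tcp
        mode: ingress     # default: every node forwards to some task (routing mesh)
        # mode: host      # alternative: bind only on nodes actually running a task
```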


We are talking about normal users.


Starting one wiki/note-taking software is not going to make your fans go crazy more than running your favorite JetBrains IDE. I had a non-Arm MacBook Pro and sometimes the fans would go crazy for no reason, so all my Arm-based colleagues who had an M1 were laughing.


I do agree we are exposing way too many low level details to users these days. Probably because we expect an expert to be setting up these network services. The dream would be to have some low power appliance people can just plug in to provide a data persistence service for applications. Then applications just use that (discovered via zeroconf/avahi) for their "sharing" needs. Everything else should be bundled into the app and invisible to users just like it used to be.


I was sincerely wondering what the EU institutions use as a productivity suite, but it seems they are on Microsoft 365! [0]

I would be very curious to know if the data are stored on their own data center or Microsoft's.

- [0] https://www.edps.europa.eu/press-publications/press-news/pre...


“Open Source” is an alternative to Big Tech in the same way that “open standards” is a preferable alternative to proprietary technology. In fact, it is largely the same issue.


Let's simplify: the FLOSS domain is the internet domain, where anyone owns a desktop, a homeserver, a company machine room, etc. The big tech model is the old mainframe model, or the modern web where only a few own anything.

Trying to mimic them is a waste of time and can't work; pushing society toward ownership and freedom might work, because one way or another we will end up there, it being technically the sole solution.


FLOSS is the personal computer model, where you own the computer and have the final say on what data is processed on your machine. If you can't even make your software or computer lie on your behalf, it is not FOSS.

The big tech model puts trust in the company, not the person. Businesses love the big tech model because it's easier to let a few credit card companies deal with the trust issue than to establish a trust relationship with everyone directly (or deal with cash), because surveillance capitalism is more profitable, and because it's more profitable to rent than to sell.

The big tech model can profit first on that cost difference, and later on switching costs which would otherwise inhibit abuse.

It has essentially nothing to do with the internet, as mainframes were networked long before personal computers. Even back in the 1980s, POS terminals used dial-up to verify credit card transactions.


I mean the "mainframe model" of a single large system and many dumb terminals; now the dumb terminals are called endpoints and the mainframe is someone else's computer across the world.

The trust problem is easy to solve, with an open society: as long as payments get processed with open APIs and the government takes care of fraud, there is no trust problem. I do not need to trust a third party with eCash, I only need to trust my State's protections.

The idea has already been attempted: see not only the historic eCash and its modern descendant GNU Taler, chosen (it seems) by the EU for the digital Euro https://www.ngi.eu/ngi-projects/ngi-taler/ and https://social.network.europa.eu/@EC_NGI/111499172838284606 but also https://openfisca.org and https://github.com/CatalaLang/catala or a few others alike.

That's still embryonic, but in FLOSS terms we already have more than enough; we just miss the laws enforcing it and the schools teaching it to the masses.


I'm pretty sure I know where you are coming from, but I disagree with your interpretation.

The centralized trust model does not require mainframes connected-to by dumb terminals. We need only look at how Visa in its first few decades used carbon copy devices and signatures, along with eventual consistency across a network of mainframes, to gain market power.

"The trust problem is easy to solve" is laughable, as you well understand by the need for "the law enforcing it and the schools teaching it to the masses."


Well, easy to solve means the solution is simple, not that reaching/implementing the solution is simple. Enforcing the laws and teaching it in schools are very complex, but the solution is damn simple. It's the road to it that's the complex part.


So like how single-payer universal healthcare in the US is also a simple solution.


For health care the solution is making it totally public: all over the world, the more health care is run as a business, the worse the results and the higher the costs.

There are aspects of a functioning State that MUST be ONLY fully public. Again, it's simple in conceptual terms, hard to get applied in reality.


A sales position I was working in 2017 was the first time I'd used Windows 10. I had a very urgent issue with a customer who needed our small business to confirm a change they were requesting. I needed to go through the technical details of the customer's request by reviewing their documents over the phone on my computer.

As I was on the phone and going through their documents, Windows 10 decided to install updates. I'd experienced this before and had done everything I could to try and configure Windows 10 to require my permission to run updates, but it doesn't work that way at least when you are a small business without an I.T. team.

After a few minutes I told the customer I would call them back when my computer completed its updates. The update ended up taking over 40 minutes to complete. What really bothered me the most is that Microsoft is setting the priorities of our organization - software update instead of resolving a critical customer issue.

I've never had a Linux update require so much time, and I've definitely never been spontaneously locked out of my computer, without my permission, so Linux could run an update.

"Big Tech", as discussed in the article, appears to me to be no longer concerned with small customers and operating in such a way as to assume we are all just their guaranteed customers so they are free to do with us as they please.


If you did have a proper IT department, they would have forced you to keep your computer up to date with security and other patches anyway. All that posts like this do is document people's irresponsibility in keeping their business-owned computer secured.


Isn't this post describing exactly the situation of a forced update interrupting work, though? Doesn't matter who does it, the effect is the same. The difference is if your IT department controls the software you at least have the option to make it less intrusive.


This rates among the most arrogant pompous asshole responses I've ever encountered on HN. I ran updates every single morning to avoid exactly this problem.


You "ran updates every single morning" but somehow did not see the popups Windows puts up warning about required patching and did not take the "pause updates for up to 7 days" option that Windows provides? Sorry but that doesn't sound very believable.


IDGAF about anything from someone as arrogant as yourself.


80% of Linux code contributions come from US big tech.


If you look at well maintained OSS projects (apache, php, etc). It's the same. Companies with the cash to hire developers are the reason they are successful.


So this means the system is working pretty ok, tech companies are sharing the common burden of things that affect them but don't confer competitive advantage? It seems more active participation by governments (if they can figure out how to do it) would only make things better?


Big tech relies on open source and, often enough, vice versa


Open source and big tech are orthogonal concepts.

Open source is about licensing, big tech is about scale.


They are missing something major here and getting bogged down in technicalities. Open source is no alternative to big tech because big tech commoditised stuff that's useful, whereas open source commoditised stuff that is interesting to the developers.

When I sit down at my mac, I have a working and very polished calendar, mail client, todo list, contacts, note taking app, music player, browser, photo editing and library management tools, video call and conferencing software etc. And all of it syncs with my phone and my tablet out of the box.

When I sit down at a Linux machine, I have a calendar that breaks every 5 minutes and I can't share anything with anyone without futzing with iCal feeds and hiring another provider, a mail client that is ugly as sin and doesn't integrate with the calendaring or contact management stuff at all, a job and a half to find a note taking app that actually works properly, a todo list app that syncs with nothing, a spreadsheet package that crashes whenever I try and print something and oh hell I give up by then. And the answer to this? Roll out nextcloud on a VPS. Kill me, with a spoon. This is not freedom, it's just slavery of another kind.

I just want to get shit done. Big tech covers that. Please take this as a recommendation to tidy up all this hell and just help people to get shit done and then it will be an alternative to big tech.


Very much agree. I made a cross-platform and open-source note-taking app[1] in Qt C++, but never really talked to common end users about their needs; I just built it for myself.

Now that I'm working on a proprietary version[2] (with a block editor I rewrote from scratch), I'm talking to these end users and understand their frustration in using my product. For example, many users had issues discovering the different features of the app, so I created a toolbar, which helped a lot. This is just one example.

[1] https://notes-foss.com/

[2] https://get-notes.com/


As much as it pains me to say it, it's true. I use predominantly open source software on all my computers with some small exceptions. I used to rely on some cloud services because of the convenience and nothing else. But leaks started becoming way too common and what I can say about all places that I've worked at, data is handled really badly. If you pair that with some OSINT skills, you can learn pretty much anything about anyone from a single leak. So over the past few years I've been slowly cutting down my dependency on cloud services. Nextcloud was the first big step, a zfs pool for backups, a few custom protocols for alerting and kill switches and that's it.

On paper this sounds really good, but there's a lot of overhead when it comes to maintenance. "Yeah, it's just one more docker-compose.yml, big whoop" (yes, Kubernetes is pointless overkill if you are the sole user). I've said that too many times and it's not true, because it only takes one small thing that you overlooked and you have to spend a day or two putting everything back together.

Another thing worth mentioning is that open source can be a good alternative, but open source does not mean free or cheap. For instance, I've gotten really into drones and radio communications lately. Take the HackRF and its baby brother, the Flipper Zero - they are both completely open source but neither of them is cheap. In fact, they are really expensive - they are effectively open source ASICs. I'm willing to bet that north of 80% of the cost is down to the software and not the hardware, because polishing a piece of software to the point where you can pick up a product and use it without effort or a steep learning curve involves a ton of work on the part of developers and UX/UI people.

And you can't really cut off all big tech - open source phones are BAD, you don't really have a good alternative to google maps and waze, you still heavily rely on search engines and a few dozen services if you start digging deeper. There are also a number of services which do not have an even half-decent open source alternative. Also not everyone has the skills to set up and run these things.

I think the big case in favor of self-hosting whatever you can is that while open source is far from immune to leaks, if it resides in your private network(which it should) without access to the rest of the world, those holes will eventually be patched and you can take action in the meantime - stop the service, block a few ports, etc. The odds of you personally getting affected are pretty low. Now if a leak happens in big tech, there's nothing you can do about it and by the time you learn about it, it's often too late. Honestly, this is the number one reason I'm doing this to myself.


I think your experience is interesting, because it seems like approximately the same small fraction of individuals as of companies commits to "vertically integrated software" - that is, owning everything top to bottom. Which is to say, almost nobody does it. For an individual that's excusable - it's a mountain of expertise and effort, although I imagine it pays off rather well in skills learned. For a company, not so much. Companies (and governments) have the ability to acquire the expertise they need to vertically integrate. Some do so when they see value in it, but it's rare. Why do so few make the choice?


> Why do so few make the choice?

Because it is risky. The more esoteric the knowledge gets, and the further it moves from your core business, the more in-demand the skills are. As an example, take maintaining your own metrics and timeseries storage. It takes quite a few skilled hands to do this in house, and it's probably only feasible for larger companies anyway. Or you can simply hand the problem over to DataDog. While they are pricey, it is potentially pricier to build your own internal DataDog-like system, especially if you consider the opportunity cost of pulling your most skilled engineers off the product your customers are actually paying for. Companies are perfectly willing to pay a premium to not worry about something, and that includes not worrying about your very skilled engineers leaving and then needing to scramble because no one else understands what has been built.


There are three big risks I see in depending on vendors:

1. You aren't average. Market forces might not align with your use case. The pricing model might change, or you might find out later that it doesn't scale well for your business.

2. They might leave you suddenly. For example, all the google "products" (quotes because there is actually only one--ads) that have disappeared over the years. Even when an open source dependency suffers a cataclysmic licensing event, you can still fork it and carry on, provided you've both chosen your dependencies wisely and hired the people capable of maintaining them.

3. By choosing a vendor you're committing to ossifying a part of your stack. The observability example is a good one here. At the companies I've worked for that do all their logging, metrics, alerting, etc. in-house, developers aren't afraid to use the tools. The tools adapt to the requirements, whether it's cost efficiency, features, whatever. At the companies where we've used vendors, everyone's perpetually afraid of increasing the monthly bill, and nobody has a say in what goes on the product roadmap. To be clear, this might be the right trade.


Not necessarily. In my experience large corporations have a ton of internal tooling that no one knows about, even internally. In a company with 6,000 employees scattered around the globe, tough luck knowing whether some developer made a one-off tool to debug something, used the company's LDAP to authenticate, and then connected the app to some db full of stuff. Years back I stumbled upon such a tool that had a web shell that allowed you to do pretty much anything using some dev's account - anyone with a company email could access it. This is far more frequent than you might think in large corporations.

In small companies, typically everyone has access to everything, and if that isn't the case, more often than not you will ask someone to go and fetch something for you, and the third time around they will get fed up with it and give you full access so you don't bother them anymore. Six months down the line, everyone has access to everything. Or if a small to mid-sized company sees people who clearly know what they are doing, it will commonly give them full access to everything so they can get on with it. At my current job, I've had root access to everything, everywhere, since day 1.


>If you pair that with some OSINT skills, you can learn pretty much anything about anyone from a single leak.

If you need more anxiety, just think about the hottest technology right now that is capable of relating massive amounts of data instantly :-)

SQL isn't ready for AI.


Wanna guess what I'm working on in my spare time? :D


The alternative to "big tech" is not "open source". The alternative to big tech is a healthy "small and medium" tech economy, or at least a more sane distribution of market power.

Imagine having to compete producing widgets in a market landscape where some hyper-conglomerate sourced and distributed all power, defined and installed all plug standards and, in addition, produced and rented out any widget that saw consumer traction. For decades this is what has passed for normal in this domain.

Openness (of varying degrees), standards-adherence, interoperability and competitive markets are connected attributes. In this context open source is an extreme productivity multiplier. Maybe the most potent such development in modern human history. Entities that adopt open source would collectively out-compete in innovation and usefulness any proprietary offering. But for this mechanism of sharing knowledge to thrive and reach its full potential there has to be a real market for digital technology.


I don't buy this argument. It's simplistic and, IMO, wrong on multiple counts.

Big tech can totally sell "FOSS services" and provide the groundwork for them, like some of them do - you don't have to lock people in with proprietary stuff.

Even more, big tech couldn't exist in its current shape and form without FOSS, while the opposite is not true.


Do we need big tech? The only thing I need is a search engine.

I prefer privately hosted web and mail servers. Before "the cloud", the economy worked just fine and companies had enough money for in-house IT.

By using the nextcloud example, the author of the article is asking the wrong question.


But European businesses should utilise open source to compete more easily with big tech. Big tech is definitely using it to kill European businesses.

Yet they don't.

The problem is not big tech. Not open source. It's that the European tech economy crippled itself and cries wolf about it all day.


EU is a digital colony and that's entirely down to the lack of political will.


I don't think it's a lack of political will. I think it's an advisory layer which is completely incompetent and will always steer decisions toward proprietary software solutions, because its members "job hop" between the public and private sectors. I do not mean this as corruption or anything shady, but simply that they have powerful roles because they have a lot of private sector jobs behind them. Business leaders who employ thousands of people, and the decision makers and advisors in their vicinity, are always going to have a big role in political decisions. Couple that with many of the FOSS advocates and NGOs being far too "all or nothing" in their approach, when what you would actually need to be successful is not to swap everything at once but to take small incremental steps so that you can build clear successes. It simply leads to a landscape where the political layer makes bad decisions despite a decade-long will and commitment to advance both EU tech and open source solutions.


Yes it is about a lack of political will. US companies have been illegal in the EU for a decade now (Schrems II), but the enforcement of this is hardly anywhere to be seen.

It's the same kind of political denial as being "concerned" about climate change while still trading with China (made even worse when this allows for fake self-congratulation about decreasing greenhouse gas emissions, when the decrease mostly comes from having exported most of the industry).


I think it will take way more than political will.

The USA is enjoying the wealth it gained in both world wars, and it also kept its de facto colonies in South America and the Pacific.

Europeans destroyed their wealth in the world wars, and they lost their colonies. The end of colonialism ended some human suffering, of course, but it had a cold-hearted economic impact.

The American venture capitalists all come from the industries that got stronger during and after the world wars. They invested in silicon and then in the tech industries that grew that wealth exponentially. The Europeans had to rebuild their countries until the 70s, and the investments they made were smaller. Similarly, the US spent its government money on nuclear and space programs that further strengthened the economy. The EU spent its surplus on improving post-Soviet countries, which may or may not pay dividends in the future.

It may require a significant reallocation of resources toward tech. It may require diverting the resources spent on pensioners, who are the biggest voting bloc. It is not a simple lack of political will. It requires reshaping a century of decisions.


My litmus test for whether an open source application is reliable is whether it has its own website or just links to its GitHub page (SourceForge is OK).


The key part is "on its own".

A necessary, but not sufficient, piece of the alternative.

We need to work on our organisational structures.


Yep, we don't just need open source anymore but lean open source - and that includes the SDK.


I have the feeling this blog post is a response to something that nobody ever said.


So, almost all the servers running Linux around the world are just ignored by the author?


The entire article is about office IT: productivity software, etc. Basically, people emailing Excel sheets to each other. I don't see how the OS running on servers is related to that.


I think the article should have started off differently, considering that it actually concludes with a semi-positive stance on how open source "is" an alternative to big tech. It's an area we take rather seriously here in Denmark. Now, I won't get into the irony of everyone wanting to replace Chromebooks in our school systems because Google is evil, when the replacement is very likely to be Microsoft, which is as much of a snoop these days, since Google actually sells similar forms of privacy to our education sector. What we do have as a real working alternative to both is locally developed education solutions, which work as well as, if not better than, Google's educational tools on Chromebooks. What we lack is political leadership that will commit to this. Part of this is because we've only recently gotten a digitalisation minister, even though people spend far more time on computers than they do on their daily commute, and we've had a transportation minister since basically forever. Another part is that many of the top advisors in public service tend to "job hop" between our leading industry companies and public service, steering too many contracts toward closed software.

What our educational alternatives show - and they have been implemented in some places, and in Greenland, I believe - is very much in line with what the article recommends at the end: small incremental useful changes with clear-cut goals. What would you achieve with Nextcloud? Replacing everything you have in Azure or AWS in one big step? Obviously that is going to go horribly. That's not even how we migrated into Azure from on-prem. What you can do is start by slowly moving your applications and services into movable parts, by containerising them, writing your run-books in Python rather than PowerShell, and so on.
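The "run-books in Python rather than PowerShell" step can be sketched very simply: each step becomes a small named function, so the runbook is portable across OSes and testable, instead of a pile of shell one-liners. A minimal sketch - the step names and commands here are hypothetical examples, not anything specific from the comment:

```python
import subprocess

def run(cmd: list[str]) -> str:
    """Run a command and return its stdout, raising if it fails."""
    result = subprocess.run(cmd, capture_output=True, text=True, check=True)
    return result.stdout

def runbook(steps) -> list[str]:
    """Execute (name, callable) steps in order; report the failing step."""
    completed = []
    for name, fn in steps:
        try:
            fn()
        except Exception as exc:
            print(f"runbook failed at step '{name}': {exc}")
            raise
        completed.append(name)
    return completed

# Example: a tiny two-step maintenance runbook (illustrative only).
maintenance = [
    ("show disk usage", lambda: print(run(["df", "-h"]))),
    ("echo sanity check", lambda: print(run(["echo", "ok"]))),
]
```

The point is less the code than the habit: once a migration step lives in a function, it runs the same against on-prem, Hetzner or a cloud VM.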

Then there is the change management, which the article touches on, and which is always forgotten by decision makers - partly because decision makers don't know what IT is. Well... I guess that is it, really. Where in the past (and I've written about this a lot) SysAdmins and supporters were unlikely to want to leave their Microsoft training behind, I think we're at a point in IT history where that is less the case, because so much is now done on Linux even if you're deep into the Microsoft ecosystem. Similarly, the Office 365 platform no longer has the same hold on your employee base, because many people under 30 will not have "grown up" with it. Where it would have been inconceivable not to use Word, Excel, PowerPoint or Outlook 5-10 years ago, we've entered a world where we actively have to train employees in Office products, because they are used to iOS, Android and macOS, not "PCs".

Again, you should start by doing things in small steps. Our Libraries have switched to Ubuntu on every public PC, and it has been a non-issue because many library users are equally unfamiliar with Ubuntu and Windows, and since most things happen in a browser anyway, the underlying OS isn’t an issue.

That is how you do it: slowly, with small steps, and yes, some of those steps don't need to be open source. If you want to replace Azure or AWS, it's much better to head to Hetzner (or similar) than to try to do it with Nextcloud or the like, because then your SysAdmins won't really need much retraining - it's not very different from what they already do, in the many cases where moving into the cloud has really just meant moving a bunch of VMs.


Just migrated from GitHub to GitLab, and also finally made the move away from macOS to Ubuntu as my primary machine. I've been using Macs for almost 15 years and enjoyed it all the way, but FOSS comms and tech are the future.


This is excellent. It's also worth noting we don't need to fund services by supporting private enterprise, either - many services should arguably be operated by zero-profit entities. For the most part, after paying infrastructure bills and salaries, the profit motive is contrary to providing quality service over time (see: enshittification).


Not open source, but Free/Libre software (GPL), can become a genuine alternative, provided it's implemented widely and followed honestly.


This just in: if you like Hollywood movies, free speech is no alternative to movie-producing Hollywood studios.

More news at 11.


[flagged]


How about the first result for "museum cataloging software" in DuckDuckGo, CollectiveAccess? This was without adding an "open source" qualifier. It seems to be in use in the industry too.

https://www.collectiveaccess.org/ https://www.dublincore.org/groups/tools/dc2008/dc2008_seth_k...

Another one apparently in use in the UK is CollectionSpace:

https://collectionstrust.org.uk/software/collectionspace/ http://www.collectionspace.org/

Also:

https://omeka.org/s/ https://www.museumplanner.org/free-museum-collection-softwar...


Yes, there are plenty of them. That is not the point I was making.


Hmm, then your point wasn't clear.


[flagged]


Can you please stop? We're getting tons of complaints about these posts.



