This is a very strange distribution of projects. There are projects like VLC, Filezilla, and 7-zip, next to often mission-critical pieces of software, like Kafka, Tomcat, and GlibC. I wonder what went into the decision process to include each of these libraries.
I also dislike the 'bug bounty platforms'. Why can't I simply report it upstream, and if accepted, claim my prize? Each of the projects should have CVE protocols and procedures. The idea probably is to curb the zero-day vulnerability leaks, but I assume that if you're able to find a CVE, you're capable of finding a CVE procedure.
It seems like a rather logical distribution of projects if you consider the ratio of (installed base/developer interest). The projects on this list all have massive user bases, but few of them would garner much excitement on HN and they have relatively small developer communities.
Filezilla, Notepad++ and 7-zip aren't in themselves mission-critical, but they're hugely popular products. If you can pwn an office computer or a developer workstation, you've made a crucial step towards pwning something properly sensitive. Think about the IT guy in a typical medium-sized business or a government department - what are the first things he's going to install on his own work computer? After Microsoft Office and his browser, what programs will he most often use to open untrusted files from the internet? What happens to the department if a trojan on his machine starts feeding his passwords to the FSB or the PLA?
Of these, I'm pretty sure VLC is the most common software on end-user systems - and there are enough security advisories where a well-crafted video file can execute code with user privileges (like https://www.videolan.org/security/sa1801.html ). If you can automate that, you have access to many personal computers in the EU.
There's work going on (since 2016) to port the parsers to Rust[1]. I believe a few are already written in Rust, and it'd be great if some Rust folks would help out with the effort.
GP isn't saying that VLC is unsafe, but rather that C (which VLC is written in) tends to be unsafe. Seeing as MPV is also written in C, it's absolutely the same in that regard.
> I wonder what went into the decision process to include each of these libraries.
The decision making process was a survey [0]. The two criteria used were (1) usage of software inside and outside the EU and (2) critical nature of the software for institutions and users.
Most probably these are tools commonly used by EU institutions which have a record of bugs causing them problems. The solution is to help fix those bugs by offering money. You are right though, I can't see how VLC can be as mission-critical as Kafka.
Most police forces use VLC to view CCTV recordings and other multimedia evidence. It's an entirely logical choice of software, but it presents an obvious risk in the current climate. I would imagine that many intelligence services use VLC for similar purposes.
A nation-state adversary with a VLC RCE 0day could do some serious damage; if they also have an 0day for a popular model of CCTV DVR, they've got the keys to the kingdom. Those DVRs will never get patched and a nation-state adversary could dream up all sorts of ways to induce a police officer or an intelligence agent to play a media file, but at least we can harden VLC.
A friend of mine spent last Christmas debugging an issue in memcpy in glibc (on Intel 32-bit CPUs). Glibc is less well tested than I expected, and has ASM implementations of many functions for many CPUs, some of which are (obviously) less well supported than others.
IA32 is probably not getting much focus from devs and users these days; still surprising however... Do you have a link to the issue, out of curiosity?
There was a bit of a scare around a 7-zip vulnerability earlier this year. [0] Turns out 7-zip is embedded inside a lot of other programs making those vulnerable too.
There's a distinction between your examples: the former are user tools, the latter are backend applications or libraries.
My guess is that the main objective is to address user-visible bugs. While a glibc bug is certainly impactful, it is usually solvable before it gets too widespread.
(And as much as it's "not the right way", higher-level apps work around it before it is fixed.)
>This is a very strange distribution of projects. There are projects like VLC, Filezilla, and 7-zip, next to often mission-critical pieces of software, like Kafka, Tomcat, and GlibC. I wonder what went into the decision process to include each of these libraries.
The EU (Brussels offices, etc) actually using them?
Sure, but there's a difference between "yea, we like 7-zip, let's put some money into it" and "yea, we use Tomcat to actually run our apps connected to the DB, might be nice if it got a bit of patching" (and funnily enough, some of the user-centric apps have more funding than some of the backend, mission-critical SW).
My evaluation of the benefit is completely opposite to yours. An exploitable bug in 7-zip has a much higher impact than a bug in Tomcat. Tomcat is running somewhere in the backend so an exploitable bug is not usually usable as a direct attack. A bug in 7-zip can suddenly create a bunch of ransomware attacks just by distributing malicious files.
We have a mountain of C code running in the wild parsing binary formats that's in real need of some fuzzing or ideally replacement by safer languages.
The thing is, "somewhere in the backend" is generally accessible from the internet, and vulnerable to attackers (so you need only a maliciously crafted packet, or something similar); whereas for 7-zip vulnerability, there must be: a) a maliciously crafted zip file, b) a user who wilfully opens it.
What's more, getting into one's backend servers/gaining some kind of access to DB, config files of the machine, etc. is, in my mind, just infinitely worse than gaining access to a computer of a person/uploading some ransomware/something similar.
We're just probably working with different SW, so we both see the thing that touches us the most as the problem... :))
> The thing is, "somewhere in the backend" is generally accessible from the internet
If this is the case you have much bigger problems that a bug bounty won't fix.
> in my mind, just infinitely worse than gaining access to a computer of a person/uploading some ransomware/something similar
That depends heavily on what the backend server is. There are plenty of databases where a hack is irrelevant because the data is public and there are backups. Meanwhile most people have poor backups and a hack can be incredibly damaging.
>we both see the thing that touches us the most as the problem
I think you're heavily discounting the risk that all these code bases in general usage pose. I've fuzzed C++ binary parsing code on just a laptop and was amazed at how many crashing bugs I was able to find in a short amount of time. Many of those were probably easily exploitable.
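If anyone wants to try this, a minimal sketch of such a harness with libFuzzer and AddressSanitizer could look like the following (parse_header is a hypothetical stand-in for whatever binary parser you're targeting):

    // Minimal libFuzzer harness; parse_header() is a hypothetical stand-in
    // for the parser under test.
    #include <stddef.h>
    #include <stdint.h>

    int parse_header(const uint8_t *buf, size_t len);  // code under test

    // libFuzzer calls this repeatedly with mutated inputs; with ASan enabled,
    // out-of-bounds reads and writes become immediate, reproducible crashes.
    int LLVMFuzzerTestOneInput(const uint8_t *data, size_t size) {
        parse_header(data, size);
        return 0;
    }

    // Build and run (clang):
    //   clang -g -O1 -fsanitize=fuzzer,address harness.c parser.c
    //   ./a.out corpus/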
Gov agencies do accept zip files from the general population. Send them something, they'll have to open it to respond to your request... Bam, you've broken into a PC with sensitive systems inside a gov network.
> We have a mountain of C code running in the wild parsing binary formats that's in real need of some fuzzing or ideally replacement by safer languages.
In an ideal world that's what would happen, but even if there were the will and the money, it would take decades to replace all of this stuff in practice.
Sometimes when I'm feeling pessimistic I don't think we can ever truly secure (to a reasonable standard) anything.
>In an ideal world that's what would happen, but even if there were the will and the money, it would take decades to replace all of this stuff in practice.
In a previous discussion here someone pointed out you could actually compile C with hardening for out of bounds accesses for example. So maybe we need to isolate those input paths in programs and harden them.
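Roughly, that hardening looks like this - the buggy copy_name function below is made up for illustration, and the flags are the usual GCC/Clang options, so exact behaviour depends on your toolchain and libc:

    // Toy out-of-bounds write of the kind hardened builds can catch.
    #include <string.h>

    char copy_name(const char *src) {
        char buf[8];
        strcpy(buf, src);   // overflows buf if src is 8+ characters long
        return buf[0];
    }

    // Typical hardened builds:
    //   cc -O2 -D_FORTIFY_SOURCE=2 -fstack-protector-strong file.c   (cheap enough to ship)
    //   cc -g -O1 -fsanitize=address,undefined file.c                (heavier; for tests/fuzzing)
    // _FORTIFY_SOURCE swaps strcpy for a bounds-checked variant when the destination
    // size is known at compile time; ASan catches the rest at runtime.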
>Sometimes when I'm feeling pessimistic I don't think we can ever truly secure (to a reasonable standard) anything.
I don't think we can either. In part it's just economics, the cost/value of the exploits is just too high for low-value targets. But it's yet another of the reasons I don't see how cryptocurrency ecosystems can really work. The security of the end-points is just way too low for me to trust that kind of thing.
For me, the biggest advantage of big bug bounty programs is the ease of reporting something. Not every piece of software has a direct security reporting procedure documented.
For those who wish to get credit for their findings, those bug bounty sites help too.
What if they took the entire office suite licensing budget and invested it in an open source office suite project like LibreOffice, Calligra Suite (formerly KOffice) or Gnome Office?
The city of Munich tried to develop a Linux distribution, "LiMux", that was used for some time, but political considerations ultimately reversed the decision.
Or just employees got fed up with the "almost but not quite there" compatibility, and unpolished functionality, and wanted to return to MS Office.
I've got a CS degree, have used Linux since 1998, used and developed for several commercial unices, and have used Open Office since it was Sun's. I still prefer MS Office.
Not in this case.
People coming from Office 2003 have a hard time with 2010 or higher.
It had a lot to do with smartphones, which are problematic in the Windows environment as well.
Also, Germany builds more and more web interfaces for administrative work.
Is it? I try to use open source software as much as possible, but Office 365 is just significantly better than the alternatives. I wish it wasn't true, but it is.
And it's not even that expensive. £8/month/person. Slack is £5/month/person and that's just for chat.
Considering an average employee probably costs at least £3000/month it's a bit silly to worry about these small expenses.
Does MS Office have real competitors? Google Docs is very casual compared to MS Office. LibreOffice needs a lot of work.
My money was on Corel Office for Linux (with solid products like Wordperfect and Quattro Pro), but Microsoft bought a large share of Corel and it was mysteriously discontinued.
Slack has competitors but none of them are as good. At least none of the free ones - I've tried them all. Zulip is closest but it has a weird threading model (it's more like Usenet).
It's quite surprising given how relatively simple it is.
I disagree wrt Office 365 (except for the collaborative editing in the browser feature). I don't recognize the price either - more like £22 (unless it's Exchange only).
That's true, there was cronyism. For example, they used the worse, slower KDE instead of better options like MATE because KDE has a considerable European (and even German) legacy. I imagine they took many decisions like that.
Take a computer that used to run Windows 2000 and install KDE on it, it's no wonder people got pissed and they had to revert their decision.
I still use KDE (4.x) on CentOS 7, with older hardware.
It's not quite as snappy as XFCE, or probably MATE, but it's still easily good enough to get the job done.
On CentOS, being a stable platform, KDE itself is also really stable. (Important to me, as I have better things to do than screwing around just keeping a desktop updated.)
For strict accuracy, MATE is not a fork of GTK2, but of GNOME 2. MATE did originally use GTK2, though without forking it, and it has since switched to GTK3 (while still keeping the GNOME 2 "look").
MATE is a fork of GNOME 2, which is a full featured desktop environment as well. MATE runs orders of magnitude faster and is much more stable than KDE. Especially on the old workstations where they installed LiMux.
I'm not going to say that the project failed entirely because of technical reasons, but at first glance it really looks like they took bad decisions. It's hard to defend a move where you end up with worse software and a worse experience for users, no matter how much money you save.
First, being abandoned for new development is a great indication of the quality of a project.
Second, you haven't given us any arguments for your "indications of quality" regarding MATE and KDE.
Third, like GNOME, KDE has a huge legacy in FOSS, and is a great project in itself based on a top-notch GUI backend. Some of its code even went on to become the basis of the modern web (KHTML -> WebKit -> Blink -> now also Edge), and other tools like KDevelop, Krita, etc. are among the best in class at what they do.
What are you, some teenage Linux nerd, with a "favorite" desktop to promote in flame wars?
MATE is not a GNOME project.
When GTK3 was made and the decision to build Gnome Shell was made, the MATE project was started to fork GTK2 and the old shell.
It's obvious you're being provocative on purpose but I don't see this leading to any fruitful discussion. Maybe try a more constructive approach next time.
The LiMux project started in 2005, so the KDE version used back then would be KDE3. That was not really worse in performance than GNOME2 (or today MATE), although comparing it to Windows 2000 would not be easy.
Then put Windows 10 on it, or any other fully featured modern GUI OS. And we have proved what? Computers from 18 years ago don't run modern OS's very well, but can run ok with an OS specifically pitched as 'lightweight'.
On the other hand I have a PC right here on my lap that is 10 years old that is running Kubuntu problem free.
For some of the problems that is the case. Some of the problems were that no other municipality joined them in the effort, which left them a sole fighter in a world where they have to use software provided by other authorities. One example: ordering a passport was initially set up by running the software provided by the federal administration via terminal server; however, then fingerprints were added to passports, so the setup didn't work anymore, and even being "the largest municipality in Germany" (the cities of Berlin and Hamburg are larger, but they delegate more municipal tasks to the districts and are also states) doesn't give them much leverage.
The way the decision came about is crazy, however. Just around the elections, Microsoft moved their German headquarters from outside the city into the city, and the newly elected deputy mayor from the conservative party was disappointed that it took so much time to get an official mobile phone from the IT department and that setting up mail on that device was complicated ... no idea how that's related to desktops, but that triggered the debate ...
Yup. Going open source requires a shift in mindset when handling support issues. Coming from a paralyzed, complaining-to-vendor position, into taking responsibility and fixing the problems yourself.
> political considerations ultimately reversed the decision
That's a very one-sided portrayal of the situation. There were problems regarding usability, the resulting low user acceptance, and issues with external MS Office files due to compatibility bugs.
If anything, funding the project for so long was a symptom of putting ideological and political considerations before user needs.
I find it very weird how even most software developers prefer MBPs with MS Office on them, but some poor souls elsewhere are supposed to do their daily work on a sub-standard platform. I mean we're still joking about the "Year of the Linux Desktop".
The EU asked for the standardization of Microsoft Office file formats, and the "Office Open XML" format (OOXML for short) was created (ECMA-376).
The resulting standard was 6,546 pages long (in comparison, ODF is just 867 pages long), and Microsoft Office was not fully compliant with it, making the entire process a waste of time. Ten years later, the situation is the same.
It is reasonable to believe that interoperability and standardization were not in Microsoft's best interest. They have the largest market share and not much to gain by giving opportunities to competitors.
I think it's a stretch to say that the EU is "encouraging" such scummy practices. It's likely that they just collated a list of all software used widely by government departments within the EU -- and thus FileZilla is on the list. Ultimately, a potential 0day causing RCE within a government department is more of a concern to the EU than the optional malware you get during FileZilla's installation.
I haven't used Filezilla in a _long_ time -- is the malware optional? I imagine that most EU governments image their machines, so their IT departments likely aren't installing the malware.
And there is also the consideration that governments will continue to use Filezilla even if there isn't EU funding to make it more secure -- malware and all.
The European Commission also has additional calls out for intermediaries to re-distribute funding to open source projects (ICT24), and some of the intermediaries have their respective calls open for projects (from 5k€ to 200k€):
It isn't about promoting open source products. It's about defending against open source products that are already being used. Likely heavily used within the EU institution itself.
Is it worth spending money on Drupal, considering all the alternatives we have nowadays?
The answer is yes. The value of these bug bounty programs is directly tied to the amount of use the software gets (and most of these get used a ton, including Putty, regardless of alternatives).
Oh indeed, I'm not trying to say they should not fund these programs, this is awesome and welcome. I'm just warning about a possible pitfall for them to keep an eye on :)
Most places are not running the latest version of Windows. And even if their SOE is Windows 10 I guarantee you that their VDI isn't. And since Putty doesn't need an installer it's easy for users to install themselves even when the machines are locked down.
So given that pretty much every government/company probably has some incidental security exposure to Putty it is a smart investment to make sure it's bug free.
This is a good step, and it’s great glibc is included. In the future, I think it would be great if more critical, widely distributed libraries/software could be included like that!
DSS is an EU-owned library, I believe. I used it instead of reimplementing digital signature verification for XMLs, PDFs, handling certificate revocation, etc. It makes sense that they want to secure their own library that secures many other applications.
WSO2 is an Enterprise Service Bus that I used at another company (owned by the government, BTW) instead of one from whoever-makes-commercial-ESBs.
WSO2 is more than just an Enterprise Service Bus. There are a bunch of products and solutions belonging to WSO2[1]. Enterprise Service Bus, API Gateway, Identity and Access Manager, Analytics and Stream Processing Server are the core products.
As others have said, they are most likely being used by the EU in some parts of their infrastructure. Then the question becomes "Why were they using this software in their infrastructure?" The answer to that is probably along the lines of "a guy that was assigned to the project used it because it came up in Google," if I were to hazard a guess.
PHP is awesome. The whole "let's start from scratch on every request"-deal makes debugging really simple.
PHP isn't "complete", it could still need better support for multithreading and async operations, but that is also a part of what makes PHP so nice to work with.
It's fantastic for the live request-response situations.
It is called the long tail. PHP is the only language with okay CMSs. The reason for it is the hosting model. It's an easy way to give someone a package of files; they can buy their own managed hosting, put the files there, and they have a website.
It's all the tiny businesses, restaurants, schools, fan sites. Sites with 5 visitors a day. Probably like 50% of the web is this.
You can hate PHP all you want, but there is no tech that does this better (maybe serverless one day). And PHP CMSes are in a completely different league compared to anything else, unfortunately. Nothing compares to things like Craft, Kirby, Bolt or even WordPress.
Of course people here are gonna hate on it, and I hate PHP as much as anyone, but when you are living from making websites for 500eur and you need to make 3 a month, then it is hard to beat.
I've programmed PHP professionally for the last 10 years (among others), modern PHP written by professionals is markedly better than it used to be and the PHP devs should get credit for that.
What's all this emotionalization of the situation?
In the end PHP is a tool. And despite the constant whining, it has created a great many projects people want to use and do use (by choice).
When you create an equally good alternative for people's actual needs (and not what you consider people's needs to be, like "elegant code"), we can see if we can get them to switch to that.
Professionals use tools. Whiners and amateurs complain about ideals.
>I cannot think of PHP companies as something else than a no-innovation, no-research, no-interest & no change since 2000's.
Is this supposed to be satire?
>Now please tell me what I can do with PHP, I want to see newcomers read what options you have by learning PHP.
You get access to a turn-key, widely supported, language and ecosystem, that powers close to 80% of the web. Including extremely popular CMS options. Plus, access to some of the cheapest hosting you can find. You can also write all kinds of backend and cli stuff in it if you want.
And you can always use ANOTHER TOOL for a different job, if PHP is not suitable. What a concept huh? Who would have thought.
I wouldn't voluntarily start any new project in PHP, but that doesn't change the massive amounts of existing PHP code that can benefit from an investment in security.
PHP is (sadly) not going anywhere for a long time.
Also, as others mentioned, it's just so easy to host PHP based solutions like WP etc that you can't reasonably recommend anything else to a non-technical user.
Even someone clueless can figure out how to host something with PHP on cheap shared hosting with a little internet research.
Good luck doing the same thing with Python or Ruby or even Rust.
As the sibling comment also says, the strength of PHP is the hosting offerings. I only use PHP because it's trivial to create smaller sites with it on cheap shared hosting, as PHP hosting is offered practically universally everywhere.
For bigger companies which host their own sites, other languages may be better choices.
I have no idea how the development of the PHP language works, but most languages are supported by their community, no? The number of morons contributing to the language tends to scale linearly with the number of morons using it.
Also, I'm sure most of this has been fixed, but it sure sounds like a terrible language...
The community contributing to the PHP runtime and the community developing projects in PHP are two separate things. While the language itself has been evolving, there are still a lot of bad-quality PHP-based projects, and this has more to do with the low barrier of entry than with PHP being a bad language on its own or the people contributing to it being bad developers. Over the last few years there's been a lot of good effort put into the language itself, and it shows in the latest releases.
Most of the stuff in the fractal of bad design is still there, which is mostly caused by PHP 7.2 still being more or less compatible with PHP 4 - which is basically the strength of PHP.
There are better solutions to almost all of those issues now, though.
Seems like it would be quite easy to game the system: make contributions with known vulnerabilities and then submit an anonymous bug report once the contribution is approved.
And most open source projects run a fairly transparent dev process - almost by necessity. Doing something like that as an individual dev might be possible, but hard and likely impossible to do structurally (no guarantee to get it in, no guarantee for the project to be picked next year(s), no guarantee for nobody else to find it first, and upon discovery, risk that your scam becomes apparent).
But as a team, the only way to really pull this off involves inserting such vulnerabilities intentionally and out of sight, which means a closed dev process. Even if you orchestrate via some other medium - assuming you're using a VCS, the vulnerability will be publicly traceable to a core contributor - and if you do that regularly, you'll at least become known as a project that's a security nightmare; that might kill the project in the long run. And you might even raise suspicions purely based on the frequency and nature of vulnerabilities.
All in all: abusing this sounds like a fairly risky fraud.
Lots of fraud is risky. And often very worth it if you are very poor and live in a country where laws against fraud aren't enforced. I think the potential for abuse deserves a closer look.
If you look at the question, you can see that some people don't make the connection that the motivation behind code signing is figuring out who committed suspicious code. Code that has security bugs is one interest, another is code which the committer didn't have permission to commit, for example proprietary code.
Thanks for the link. So I submit some code that is signed and it's a good contribution that closes an issue. I intentionally include a subtle bug. My friend who lives in a different country uses the bug bounty program to fix the bug and he collects the money. How do you detect that scenario with code signing?
Follow up: This situation is similar to when England wanted Delhi to be rid of cobras so they started offering rewards for dead cobras. The citizens of Delhi responded to this incentive by farming cobras.
What's the difference? It's a systemic flaw.
If there exists an incentive for finding vulnerabilities, there exists an incentive for introducing vulnerabilities. Bug bounties work great for closed source companies because there doesn't exist a misalignment of incentives. If Johnny keeps writing buggy code, he gets fired. If anonymous234 gets his buggy pull request approved, confederate anonymous456 gets to make a few bucks.
Follow up #2: For the skeptical downvoters, I'll put my money where my mouth is and attempt to capture the bounties using the method described above.
Even as a lifelong EU skeptic, I'm okay with this. I'm not sure about the choice of projects, but I'm sure someone had their reasons for picking the projects that I don't find particularly relevant for a publicly funded bug bounty.
E.g. I'm sure that Notepad++ has its share of bugs, but I doubt many are critical or security-related.
Great idea. I hope they also put money towards helping orgs get off PHP, Drupal and other dead/terrible software. If not, this is a bit depressing and short-sighted.
PHP has matured greatly since version 5, especially in terms of performance, and Drupal 8 is a top choice for creating enterprise-level APIs. Neither is dead and Drupal is only "terrible" for beginners.
By most measurements, PHP is used by a majority of sites on the internet. The worst part is how many of those still use PHP 5, which reaches end of life tomorrow...
My disillusion and cynicism know few bounds in any matter involving the EU (or any political body, for that matter).
Over the years, something like this could easily morph into 'this open source software certified and legal to use within the EU'.
No, but you’ve got to have an argument for why this particular round of funding is somehow different from the others which haven’t led to what you suggest so far - what is it?
Or is there some general argument for why the EU funding open source software will lead to our demise? Does this round mark a specific point in the journey to free software being made illegal?
“Governments funding open source software is bad because reasons” isn’t really a useful post on Hacker News.
Overall, though, this is great of course.