It's a lot worse than the standard configure, make and make install.
For one, it only requires one simple script to be compromised and replaced in order to compromise your entire machine. At least when I download the tarball, unpack it, etc., I've got several steps before I even get to the make install that I would run as root. And I might not even do that, depending on my install target. If I do, I have ample opportunity to verify the package in question and examine the script I'll be running via sudo before I do so.
With the instructions provided by this site, not only is there no opportunity to verify that the script I've received is the one I wanted (HTTPS alone is not enough for this; the site or domain may have been compromised), but they're instructing me to run it immediately as root without any examination.
This is a terrible practice. And while yes, it has been around for a long time, in the past it was something only done and encouraged by idiots; now it seems to be gaining popularity.
This is a bad idea. Stop doing this, people.
Provide an install tarball and provide a checksum of it that we can verify.
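For illustration, that verify-first workflow looks something like this (the URL and filenames are placeholders):

    # Download the tarball and its published checksum
    wget https://example.com/tool-1.0.tar.gz
    wget https://example.com/tool-1.0.tar.gz.sha256

    # Verify the download before doing anything with it
    sha256sum -c tool-1.0.tar.gz.sha256

    # Unpack, inspect, build; none of this needs root
    tar xzf tool-1.0.tar.gz
    cd tool-1.0
    ./configure && make

    # Root only for the final install, after you've had a look
    sudo make install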
I'm a Windows user. Virtually every piece of software I install comes in the form of an executable that I download and run directly from the browser. I often give this executable administrator privileges. The executable can then do whatever it wants on my computer. Usually it will install said piece of software (except when it was made by Oracle, in which case it'll install said software and the Ask.com toolbar).
This has been standard practice for decades. It's very comparable to that wget one-liner. Why is it bad? Or, why is the wget one-liner bad but not Windows installers? Do Windows sysadmins compile tarballs and verify checksums? Or are Windows sysadmins simply dumber/less paranoid than Linux sysadmins?
You're giving those executables full control over your computer. Yeah, you won't get burned very often (depending on where you're getting these executables). But when you do get burned, it's going to be catastrophic. That's the problem.
How would you then recommend installing non open source software on a production system? E.g. a freeware (but proprietary) FTP server or a database engine.
1. Don't install software from untrusted sources.
2. The trusted sources should provide checksums so you can verify the package before installing it.
This protects you from a compromised package so long as the page you accessed the package from has not been compromised (if that page is compromised, the checksum could be changed to match the malicious package).
I have no idea how you would verify closed source software if the checksum and package are both compromised...
"2. The trusted sources should provide checksums so you can verify the package before installing it."
If the checksum is published to another trusted location, in addition to where you download the files, that would help in the case that an attacker compromises both the checksum and the package on the download site.
The developer could post the checksum to Twitter at the same time as making the package available for download. An attacker would need to compromise both Twitter and the download site.
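A rough sketch of checking against an out-of-band checksum (all URLs here are hypothetical):

    # Checksum published somewhere other than the download host
    EXPECTED=$(curl -s https://example.org/announce/tool-1.0.sha256)

    # Checksum of what we actually downloaded
    ACTUAL=$(sha256sum tool-1.0.tar.gz | awk '{print $1}')

    # Refuse to proceed on a mismatch
    if [ "$EXPECTED" != "$ACTUAL" ]; then
        echo "Checksum mismatch, do not install" >&2
        exit 1
    fi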
If you are taking recommendations, I wouldn't recommend installing them at all. You could ask the authors to support a platform which has a better security model.
(Given that there are open-source and more-robust-than-Windows platforms freely available.)
That was a terrible recommendation. Proposing a solution when you know nothing about the situation nearly always results in a pointless solution.
Suggesting that the best way to securely install Windows packages is to use a different operating system smacks of immaturity, and is obviously not a practical solution in Windows environments.
A better suggestion might be: If you do not trust it, run it in a VM/sandbox.
Blindly installing Windows executables and giving them admin privileges is at least as stupid. This being common practice is probably the primary reason botnets are able to exist so easily. This being common practice is why Windows sysadmins disable users from having admin privileges: because apparently you're too dumb to be trusted with them.
My point was the people who are going to blindly run curl/sh commands are also going to blindly download the source tarball and run configure/make/make install, and the people who want to do more verification are certainly free to do so.
Yes, people are going to do dumb things. But that doesn't mean you need to encourage and facilitate it. That's my point. If you know better, you should be facilitating the proper way to do it, not the worst way.
True. Though for a bit of benefit, if the download was hosted separately from the site, that'd be two things to hack. The downside, of course, is that script updates would require new SHA1s. It would be best, then, to insert version numbers into the installer scripts, along with a version check. That way, if a new version comes out, the installer could warn about it (y/n) while still verifying the SHA1.
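Roughly, such a wrapper could look like this (everything here is hypothetical, including the pinned hash):

    #!/bin/sh
    VERSION="1.0"
    KNOWN_SHA1="da39a3ee5e6b4b0d3255bfef95601890afd80709"  # placeholder

    # Script hosted separately from the main site
    wget -q "https://downloads.example.com/install-$VERSION.sh"

    # Refuse to run anything whose SHA1 doesn't match the pin
    echo "$KNOWN_SHA1  install-$VERSION.sh" | sha1sum -c - || exit 1

    # Warn (y/n) if the main site advertises a newer version
    LATEST=$(curl -s https://www.example.com/latest-version.txt)
    if [ "$LATEST" != "$VERSION" ]; then
        printf 'Newer version %s exists; run %s anyway? [y/N] ' "$LATEST" "$VERSION"
        read answer
        [ "$answer" = "y" ] || exit 1
    fi

    sh "install-$VERSION.sh"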
No, they're not depriving you of any opportunities to verify. They're offering you a shortcut if you decide that you trust them enough to forgo examining thousands of lines of code. If you decide you want to review the code, just wget it down and then install it after you review it. And if someone doesn't know enough command line to do that, are they really going to be in a position to audit the code?
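In other words (URL hypothetical):

    # Fetch the script without executing it
    wget https://example.com/install.sh

    # Actually read the thing
    less install.sh

    # Run it only once you're satisfied
    sudo sh install.sh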
>>>>Provide an install tarball and provide a checksum of it that we can verify.
This 100%.
When I first started experimenting with Linux after being a lifelong Windows user, this was the same advice I got from a seasoned Linux developer: ALWAYS COMPILE FROM SOURCE - NO EXCEPTIONS.
He told me that even though Linux is secure and not as big a target as MS, you want to establish good habits from day one and not get lulled into a false sense of security.
What? You don't need a tarball to be able to verify the package via a checksum. In fact, a good packaging system will do checksums automatically, plus offer public key signature verification. I know deb and rpm files do that, and I believe many Windows packaging systems also provide checksums and public key signing. So do apt, aptitude, yum etc. On most of these systems, you even have to manually disable this option if you don't have the repo's public key.
In fact, of all the install methods I use regularly as a Linux user, the only one that does not offer automatic checksums and public key signing is make. You can certainly verify checksums and public key signatures, but you have to do it manually.
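Assuming the project publishes a detached signature and a checksum file (the filenames and key ID here are hypothetical), the manual steps are roughly:

    # Import the developer's key; verify the fingerprint out of band
    gpg --recv-keys 0xDEADBEEF

    # Verify the detached signature over the tarball
    gpg --verify tool-1.0.tar.gz.asc tool-1.0.tar.gz

    # And/or check the published checksum
    sha256sum -c tool-1.0.tar.gz.sha256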
Do you look through all the code you build from source? Highly unlikely. If not, what's the point?
Security is a lot more social screening than many people in the tech world would like to think. Know and trust the people you are running code from. If you don't know and trust them and still need their code, that's when you want to spend a couple of hours reviewing the source.
Why? If you don't trust your Linux distributor then you need to find a different one, but I don't think the answer is compiling everything from source; especially if the upstream project doesn't support your target distribution 100%.
I think he told me this so I would get into the habit of doing it. I was so used to just clicking on an executable file and just installing the program without ever thinking it may contain malicious code.
I can tell you I don't do this every time. Now if I'm installing a 3rd party extension or plugin, I usually do it in a sandboxed environment, run some pen tests on it to check how secure it is and see if there's any malicious code before I install it on my machine or server.
And yes, even then, it is still possible to install malware on your machine.