I've been using Windows pretty much since version 3.1 (though with some time on OS/2 as well) as my primary OS. I have, of course, used Linux, though mostly for servers and not as a full-time primary OS. I also have a Chromebook running Chrome OS. Recently, I switched jobs, and my team uses Macs, so over the last few months my primary work OS has been Mac OS X, while I still use Windows at home.
One thing I have discovered in this transition is that I find the fanboyism illogical and pretty much pointless. I've found that I can, for the most part, do about the same sorts of things in one OS as in another, though the process is different. Yes, there are some applications available on one but not the other, but I have not found anything that forces me to believe that one OS is superior in all ways to another.
I can appreciate the differences and pros and cons of various operating systems, but I honestly get annoyed by people who attack or bash another OS with broad statements like "Well what do you expect from Microsoft?" or "Well I can do that on Mac, why can't I do it on Windows?" Worse, it's never appropriate to associate an OS choice with a personal character flaw of an individual, as often happens in flame wars. Windows, Mac OS X, and the Linux varieties are all tools and should just be seen as such.
I would just like to add that there is a valid reason why Linux or the *BSDs are superior to Windows or OS X - they are Free. As in Speech. And I'm sad this doesn't even warrant a mention.
Maybe I'm the only one... But as a professional developer working on everything from device drivers and embedded systems up to your average LAMP stack and other web frameworks, the "Freedom" of the operating system is worth exactly bupkis to me: zilch, zero, nada.
There are a whole host of reasons to use one OS over another, but this "Free as in Speech" is simply not a very good one. I'm far, FAR, more concerned about "Getting Shit Done" than I am about the free-ness of my OS. For some tasks the right tool for GSD is Windows, for others it's OS X, and for still others it's some Linux / *BSD (I'm a NetBSD fan myself).
It seems short sighted to say that some arbitrary and subjective personal belief system about technology makes one system inherently "better" / "superior" to another. I guess it's a matter of what sort of things you grade on. For my personal OS grading metric there is only one item: "Helps me Get Shit Done." Nothing else matters; I just want to be productive.
> It seems short sighted to say that some arbitrary and subjective personal belief system about technology makes one system inherently "better" / "superior" to another.
It's true that the FSF and those who support them are more concerned with the long-term future of computing availability than about whether they can GSD in the next few days or weeks. I'm assuming that that's what you mean by "short sighted"? Well, that's a bit snarky.
I chose OS X above (though I'm moving more and more in the direction of Linux on the desktop again). I guess it just struck me as odd to use "short sighted" to describe the people who, for idealistic reasons, are willing to take a short-term hit to productivity to ensure long-term capability.
> It's true that the FSF and those who support them are more concerned with the long-term future of computing availability than about whether they can GSD in the next few days or weeks.
But what's the point then? If it's just an academic process then that's fine, but for the purposes of determining a useful day-to-day primary system, it seems like 'It's free software' is a terrible metric on its own. All else being equal, of course, I'd rather free than closed, but all else is not equal.
It feels sometimes like this sort of 'free therefore better' ideology blinds the people doing the actual development to the fact that if they made the system more productive and intuitive for people to use, then more people would use it, and that's a net benefit to everyone. Perhaps if they DID focus on 'whether they can GSD in the next few days or weeks', then their project would reach more people, and through that more people would be exposed to the (legitimate) benefits of free software.
Isn't that equal to always optimizing locally, as in a greedy algorithm, and therefore ending up at a suboptimal destination? Why isn't a short-term productivity loss acceptable if it improves things in the long term?
I think it boils down to the difference between idealists and realists. Realists are always optimizing locally whereas long-term change is always brought by the persisting efforts of the idealists.
I also sense that you speak of FOSS in "us vs. them" terms. You want to wait until their products become better than anything else, while they are looking for comrades in their campaign to get there.
Sorry, I didn't communicate well. The FSF movement is surely thinking "long term" and these people aren't necessarily "short sighted". What I do think is short sighted is using that as the primary metric for the quality of an operating system; that leaves an awful lot of pretty important bits and pieces out. Admittedly, some people just prefer Linux / BSD... and that's fine. However, the parent comment I was replying to gave me the impression that, for them, the Free Software aspect was the "end-all, be-all" of metrics for measuring the quality of the OS.
Use whatever metric is most important to you when deciding which OS to run. Some value "free as in speech", some focus on "does this help me get shit done a lot faster"... some, well, who knows. Who cares, for that matter.
I pick a Mac laptop because I can run OS X, Windows, and any necessary flavors of Linux (Metric #1). Plus, it has a nice physical shell, and for me that's worth paying some extra cash (Metric #2). But that is me. I assume no carry-over from what works for me to other people.
Laptops are as diverse and affordable as shoes (or any other commodity consumer product) and people pick these things based on whatever suits their fancy at that time.
In regards to getting shit done, I can pretty much get shit done on any distro for the most part once past any learning curve (if there is one).
Just out of curiosity, what is making you move towards Linux?
The huge appeal of OS X to me is, as someone said on here recently, it's basically *BSD with a fancy window manager and 1-to-1 hardware/software integration.
The way OS X is being locked down. In 10.8, signing is optional, and even if you tell your system not to allow unsigned apps, you can bypass that on a case-by-case basis. In the next OS X, we'll probably move a bit farther in that direction, and at one release per year, the OS X of 2016 will probably refuse to run unsigned applications at all, for excellent reasons involving security and malware, etc. I'm sure there will be a way to turn it off, but when it's basically impossible to get more than a few users unless you submit to signing, applications like BitTorrent will either submit or find themselves reduced to developers-only status. If they submit, Apple will be able to remove them at any time (for the vast majority of users, who won't know how, or want, to "root" their Macs). Even if Apple never chooses to use this maliciously -- and I don't think they would for a long time, if ever -- it's only a matter of time before some group sues them to use it to stop infringement or something.
I don't want to run a system where, by design, the manufacturer can remove my ability to run applications I've installed after the fact. So, even though I'm happier in general with the polish and fit of OS X than current desktop linuxes, that's where I'm headed.
I did mention those benefits in passing: "for excellent reasons involving security and malware". I don't disagree that this can provide those benefits; I just don't feel that they're worth the cost to me, personally. Also, I don't have any notion of how they could provide what I and other technically savvy people want and what the main base of computer users need at the same time.
The only thing you seem to be saying you want is to be free from the fear of unintended negative consequences of code signing.
It's tautological that nobody can simultaneously provide something and freedom from the fear of that same thing's unintended consequences.
On the other hand, since the only way to judge the trustworthiness of code is to know its provenance, I don't see how we can avoid some kind of signing becoming ubiquitous.
To me that means that free software must eventually develop a decentralized code signing scheme.
Correct me if I'm wrong, but the package managers of all popular Linux distros (except for Arch Linux, which is moving to bring it in soon) have had code signing for years now. It's not decentralised, but then the package management system isn't either.
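To make the mechanism concrete, here's a toy sketch of the verify-before-install step a package manager performs. Everything here is illustrative: real systems (apt, rpm, pacman) use public-key GPG signatures, while this stand-in uses a symmetric HMAC purely to keep the sketch stdlib-only, and the key and payload are made up.

```python
import hashlib
import hmac

# Hypothetical shared secret standing in for the repository's signing key.
# Real package managers use asymmetric (GPG) keys so that users never hold
# the private half; an HMAC is used here only to illustrate the flow.
REPO_KEY = b"hypothetical-repo-signing-key"

def sign(package_bytes: bytes) -> str:
    """What the repository does when it publishes a package."""
    return hmac.new(REPO_KEY, package_bytes, hashlib.sha256).hexdigest()

def verify(package_bytes: bytes, signature: str) -> bool:
    """What the package manager does before it installs anything."""
    expected = hmac.new(REPO_KEY, package_bytes, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)

package = b"\x7fELF...pretend this is a package payload"
sig = sign(package)

print(verify(package, sig))             # True: payload untampered
print(verify(package + b"extra", sig))  # False: modified after signing
```

The decentralised variant the earlier comment asks for would replace the single repository key with many independent keys, but the verify step keeps the same shape.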
I don't really expect them to take it away completely. So, a simpler version is that I expect the capability to silently disable applications on user machines will be misused (through third-party insistence if nothing else; see the _1984_ debacle at Amazon). A similar system which could only throw up scary warnings when starting an unsigned application wouldn't bother me much, since it wouldn't be such attractive lawsuit bait. But I agree that I'm only speculating that Apple will carry this through to its natural end of being iOS-like. I can't imagine any reason for them not to do it, and I think that for the majority of their users, it will actually improve the experience.
Right now I'm using XMonad, but there are a number of other tiling WMs under X that actually manage windows for you instead of placing that burden on the user.
I'm decrepit enough that I remember the bad old days when proprietary OSes had a lock on things. I remember the number of times I ended up fucked because I couldn't look at the source of some crucial component, couldn't change it, and could only plead with a vendor to unbreak their stuff. And the standard response to my pleas was, "How many more units will you buy if we change that?"
I'm way more productive now thanks to the open source movement. My minor problems often turn out to be somebody else's big deal, so they end up fixed as if by magic. And if not, I can dig in myself, or hire someone instead.
No thanks to people like you, of course. But I owe a great deal to people who made the "irrational" choice to adopt an open platform and make it the most productive choice.
> Maybe I'm the only one... But as a professional developer working on everything from device drivers, embedded systems, up to your average LAMP stack and other web frameworks the "Freedom" of the operating system is worth exactly bupkis to me, ziltch, zero, nada.
And Linux is the top system for ease of acquiring development tools. Compilers, debuggers, IDEs, version-control systems, scripting languages... you get it all on Linux.
Actually, in my experience, it's quite difficult to get Unix-style development tools running on OS X. There's a whole basis of Unix software and development headers you have to install first, whereas Linux has a package manager for that.
I primarily use Linux, so I'm most productive there, but I understand your sentiment. I would use a nonfree OS if it increased my productivity.
That said, I think Linux and all the other FOSS OSes and software keep costs down. Without Linux, could you imagine what a Windows license would cost? I can't picture that we would have the same price points.
Do you think a similar argument could be applied to Android or any other free/commodity software/platform?
Perhaps the proximate cause of low Windows license cost is that there are alternatives that are free as in beer, but it's obvious to me at least that the only reason such "free as in beer" alternatives exist is because a lot of people work very hard to make sure that there are "free as in speech" implementations of fully fledged operating systems.
I am a GSD-metric guy as well. I have two laptops: a Mac for Apple development (iPhone, etc.) and a Windows machine for BlackBerry development.
When I'm doing multimedia stuff, say using programs like Xtranormal, I have to use Windows because, again, no Mac port is available. If I'm coding in an xterm connected to my Amazon instances, I use my MacBook, because it's just a lot nicer and basically all the tools I need come installed with the Mac.
The thing I never understood is: if you are a Mac fanboy who would never use Windows, or vice versa, how do you manage to develop properly? How can an Apple fanboy who refuses to use Windows do BlackBerry development, and how does a Windows fanboy do any iPhone development? Do they just restrict themselves to programming in Objective-C and .NET respectively?
Yeah, you could use a VM, but I just found it easier to have two separate laptops with whatever OS I needed to run to get the job done.
I guess you can have a preference. Me, I prefer my MacBook for general stuff because, again, all the tools are included, and if I need to fire up an xterm, run some network tool, quickly whip up a shell script, or try some random thing, it's just easier on a Mac. But being a fanboy who refuses to use the other OS makes no sense to me.
What you just explained is simply your arbitrary and subjective personal belief system about technology that makes one system inherently "better" / "superior" to another.
Proprietary software - i.e., software you cannot modify to suit your needs - can actively inhibit your ability to "GSD". That's kind of the point.
I'm not one who has produced many exploits compared to the real pros, but if I had to find a hole in an OS or browser these days, I'd much rather go after Linux or Firefox than Windows.
3rd party apps on Windows are a different matter however.
Well said. I agree. I work in a 99% Windows IT shop, therefore Windows is the OS that will help me get the most work done in the shortest amount of time. I run Fedora and FreeBSD virtual machines in my spare time, but for most of the day, I need to get things done and Windows 7 helps me accomplish the tasks set before me.
I guess your value system is just different than mine and a lot of other people who feel like decisions should not just be made on practical expediency.
By default, the next version of OS X will only allow you to install software from their app store. In 3 years (or whatever their release cycle is), they might decide to remove the option to disable this restriction altogether. Obviously, this will be broken very quickly. However, for corporate users or people who don't want to void their warranties, "Free as in Speech" might very well become synonymous with "Getting Stuff Done".
Freedom also carries a cost. At the end of the day, the best OS for any given individual is the one that gives them the largest net increase in utility, for their personal utility function.
My experience suggests that a Free OS isn't worth it to me, but it's been about six years since I gave it a go full-time. The last time I gave Linux a good go round (as in using it as my primary OS for everything I did), I still had to invest time in getting my touchpad to work correctly with X or getting decent sub-pixel anti-aliasing. All of the freedom to tinker in the world is worthless to me if I can't get the basics sorted without a significant investment of my time.
You would hope that on a site like Hacker News, there would be a good percentage of users who would get a lot of utility out of being able to tinker with their operating system in ways not sanctioned by the creator of said OS.
In my experience this hasn't been true of (some distributions of) Linux in several years. I've talked to a lot of people who hold this opinion and they generally haven't tried Linux in 5+ years.
I think for the general case yes, but there are still some rough spots on certain hardware (e.g. power management on macbook pros and so on).
Some other, smaller things: attaching an Apple keyboard to Ubuntu ends up with pretty weird mappings for a couple of keys, and configuring the keyboard under those conditions is pretty confusing.
Yes, if it's your hobby. If you need to get something done (time = money) but have to stop to fix something in your free operating system, then it is no longer "free" (as in money). I'm not talking about the openness of the system in relation to code, etc.
If you have to change something in a closed system, it will either be impossible, or cost zillions of dollars (whatever it takes to convince Microsoft or Apple to do it for you).
A number of years ago, working on a semi-embedded system, I hacked the way Linux boots from USB devices to speed it up, which made the people I was working for quite happy. That's the kind of thing I'm talking about - I would expect to find a lot of people with similar stories on a site like this.
Also, configuring Linux machines is not really that onerous in this day and age. I suspect more time is wasted complaining about it than in actually doing it. There are certainly still some exceptions, and if you want the ultimate in "mom friendly", Mac OS is probably the way to go. But for a 'hacker', the ability to explore and modify might trump that need for ease of use.
What cost? You're stating this as a fact, when it certainly isn't. As I've pointed out in another comment, the claim that usability is an orthogonal concept to Freedom is a false dichotomy.
Yes, there are quite a few issues at the moment; I'm not going to deny that. The thing is, a lot of those, like missing hardware support, are not the fault of the Free Software community. They're the fault of the very things we're trying to stop: closed/locked hardware, proprietary drivers, non-open file formats, Freedom-denying practices like DRM or tivoization, and finally plain and simple FUD (named as the single most critical problem GNU/Linux faces in an infographic released by the Linux Foundation last year[1]).
Before someone points out that shifting the blame doesn't make the problems go away, let me add that I'm perfectly aware of this. I just want to make it clear that it's unfair to blame Free Software for a lot of the problems it faces, when in fact we fought and tried to prevent a lot of them for more than two decades.
This IS clear to everyone; it's simply irrelevant. Life's not fair. We all have to deal with unfair problems. Emphasizing the unfairness of your problems only makes the FSF look less "idealistic as in principled" and more "idealistic as in naive."
Relatedly, your most critical problem is not external, it's that the FSF is now associated more closely with non-issues like "GNU/Linux" nomenclature than solving serious problems.
Suck it up, grit your teeth, and press on. That's what it takes to be successful in just about any field of life. Anything else ends with you sitting on the sidelines, impotently complaining about how unfair it all is and how bad all the successful organizations are.
It's us hopeless dreamers that have built the world you say has no place for us. And we will continue to dream, and make our dreams reality, no matter how much you tell us we should accept defeat and just face the "reality of life", whatever that is.
So no, we're not going to "suck it up", but we're going to press on, to make a better world for future generations.
Don't misread me as a visionless hack. I would never discourage dreaming. My point was not that you are thinking too big, but rather exactly the opposite: that the FSF so often gets caught up in trivial distractions.
"Suck it up" does not mean "accept defeat," it means to accept the blows that come with contesting the arena of ideas, and get serious about effectively pursuing your mission. You don't get to just pretend the arena of ideas doesn't exist.
Free software is an ideology, and as an ideology it badly needs a prophet, someone who can inspire reverence for the cause the way that Carl Sagan or Neil deGrasse Tyson can inspire reverence for science.
rms is wasting his influence. Just look at the primary GNU Project About page; it's a disaster of anxious dissembling, fussing over terminology, scolding people who only 95% agree with him, and taking pages and paaaaages to do it. That is not how you lead.
"If you want to build a ship, don't drum up the men to gather wood, divide the work and give orders. Instead, teach them to yearn for the vast and endless sea."
When the FSF has a message other than "the waves are choppy and all the boats are unsound so you better stay close to shore," maybe they'll find that people start listening.
One more thing: A terminal as a first class citizen. Definitely not on Windows, and while the OSX one is orders of magnitude better, I seem to recall it's still a bit lacking. What ruins OSX for me isn't the terminal, though, it's that X apps aren't first class.
Regarding the Terminal - I never really got what people meant when they said the terminal was lacking. Maybe it's lacking some obscure features, but it absolutely nails the core features. The OSX terminal is, in my opinion, the single best terminal I've ever used (and I split my time fairly equally between OSX and Linux as my main desktop OSes and have tried pretty much every Linux terminal app I can get my hands on).
Unlimited scrollback, the ability to copy and paste using the keyboard, key bindings to clear scrollback, change font size, etc., separate keybindings to scroll the buffer vs. send scrolling commands to the terminal, double-click on parens/etc. selects a range from there to the corresponding other one, and I could probably go on listing more-or-less unmatched _usability_ features for quite a long time. On the other hand, if it's functionality people are referring to then I guess I just don't know what I'm missing because I've never found it lacking in that department either.
Well, the Mac terminal used to suck (especially before Leopard), so maybe people are talking about it pre-(Leopard|Snow Leopard|Lion)? Each of the last three releases has greatly improved Terminal.app, so it is in many ways superior to the Linux terminals I have used.
Not since Snow Leopard, at least, as Terminal.app has been perfectly fine (with the Terminal Colors SIMBL plugin, and MouseTerm if you really want) and is even better in Lion. For utmost configurability you could lean towards iTerm2, which you can tweak to no end (and arguably more than most X terminals).
Platform-specific tools from pb{copy,paste} to tmutil are truly useful and improved at each new release. Tools like brew give you all the package management love you could want, and xorg support is as first class as it could get without OSX being X-based itself.
I guess that if you want to go further than what OS X currently supports, then you're looking for getty VTs, tiling window managers, or Compiz customizability. In that case one should probably not try to shoehorn Linux into OS X and rather use Linux (or some *BSD) directly.
Unfortunately, the terminal on OS X is incredibly buggy. It keeps incorrectly coloring commands, randomly failing... Perhaps it's because of the outdated bash they use?
Ehm, I have been using Terminal.app on OS X for years now, and probably use it > 50% of my time daily (yay for REPLs), and I never really encountered any problem.
The last thing I remember was that in OS X 10.5 I had to change some locale option to have UTF-8 working properly. But that's it.
That's more of a toolkit problem. For most X apps, there's no way (for OS X) to figure out what part of the app is the main menu, and there's no interface for what functions to call when a drag/drop happens. Under X, every toolkit has its own solutions. The only solution is to embed support for compiling to native Mac apps into the toolkit.
There were projects to pull that off for KDE and GNOME, but somehow nobody continued working on them. I had KOffice running first class on my Mac in 2004, and also Inkscape and GIMP a couple of years ago, but those native interfaces were buggy, and somehow there wasn't enough leverage to really finish them. Sad.
Have you tried PowerShell? It provides many more features, and has been a default install since Windows 7. Microsoft has been pushing it by making it possible to use PowerShell to configure a lot of the server products - and sometimes requiring the use of PowerShell for some configuration tasks.
I've tried PowerShell a few times. The thing that puts me off every time is not being able to bind keys. I find it so painful not being able to use emacs shortcuts at the prompt.
Because it operates at the object level rather than on the files and strings of *nix shells, Windows PowerShell is more or less "one ring to rule them all."
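A rough illustration of the difference, sketched in Python since that runs anywhere: a text pipeline forces every stage to re-parse human-oriented output, while an object pipeline (PowerShell's model) passes structured records with typed fields. The file names and sizes below are made up for the example.

```python
import os
import tempfile

# Build a throwaway directory with a few files of known sizes.
with tempfile.TemporaryDirectory() as d:
    for name, size in [("a.txt", 10), ("b.txt", 300), ("c.txt", 42)]:
        with open(os.path.join(d, name), "wb") as f:
            f.write(b"x" * size)

    # Object-style pipeline: each directory entry is a record with typed
    # fields, roughly what `Get-ChildItem | Sort-Object Length` pipes
    # between cmdlets -- no column counting, no locale-dependent parsing.
    entries = [(e.name, e.stat().st_size) for e in os.scandir(d)]
    largest = max(entries, key=lambda t: t[1])

print(largest)  # ('b.txt', 300)
```

The text-shell equivalent would scrape the size column out of `ls -l` output, which breaks on spaces in file names or a different locale; that fragility is exactly what the object model avoids.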
I use the terminal heavily on OS X and Linux and I don't see a difference. It can sometimes be perceived as inferior on the Mac because of the lack of package management and issues with dependencies, especially when working with Ruby tools. But as far as capabilities go, my experience has been that the OS X and Linux terminals are equally equipped.
I agree with you. That's certainly a 'pro' on the side of Linux. I stated that every OS has its pros and cons, but my goal wasn't to list what those are.
To say unilaterally that it's better for all people at all times in all use cases is, at the very least, an overstatement. While being free is a top concern for you and many others, for some (perhaps many) usability might be their top concern and they may be willing to sacrifice freedom for usability, and that's a choice we should all be free to make.
> usability might be their top concern and they may be willing to sacrifice freedom for usability
While some argue that this is the current situation, I want to point out that's a false dichotomy. Freedom and usability aren't orthogonal concepts like some make them out to be. I fundamentally disagree with the idea that we need to lock everything down to make it "usable".
Also, Freedom of software (and hardware) ultimately has greater implications on society as a whole, which is why I raised the issue. It's not merely a feature, a pro or con, it's something which at least people like here on HN should be greatly concerned about. Computers play an incredibly important role today, and will only become even more important in the future. It's important to retain control over them on a fundamental level, and that requires Open Standards, Free Software and Open Hardware.
> While some argue that this is the current situation, I want to point out that's a false dichotomy. Freedom and usability aren't orthogonal concepts like some make them out to be. I fundamentally disagree with the idea that we need to lock everything down to make it "usable".
Nobody is making this claim, so why are you trying to refute it?
For the virtues of locking everything down, I would switch from free-vs-not to Mac vs. Windows. These are the two dominant operating systems, however much we may dislike it. Apple locks down everything. Apple hardware, on the iPhone Apple software, Apple plugins and add-ons -- everything, to the greatest extent possible, is controlled by Apple. This yields benefits. Driver installs giving you problems? Not with Apple. Screen display not working properly? Not with Apple. They control everything, and they make it all perfect. Windows takes a "something for everyone and everything" strategy. It gives you more options, but it also gives you more headaches. I don't think it is possible to increase options endlessly without costs.
OS X is hardly customizable. The user interface, for example, is pretty hard to customize, while on Windows numerous patches exist to completely modify the user experience (it might be because it is less needed on OS X...). Also, SIMBL allows you to patch applications and add features, but it is far from trivial. I do think Windows is more customizable. Concerning Linux, you can just do what you want with it.
I don't feel like these are major issues today, or that they have been for some time, but back in the day I know a lot of hatred toward Windows/Microsoft stemmed from Linux users feeling slighted by the lack of support from ISVs and hardware manufacturers, the relatively immature state of Wine, and above all Microsoft's constant FUD / aggressive (and often illegal[1]) politics / lawsuit threats / etc.
Throw debacles like Silverlight, OOXML, IE, and Vista into the mix, and it's not hard to justify a lot of the resentment toward the "Evil Empire" from back then.
Today (though Microsoft had to be dragged here kicking and screaming), Microsoft actively supports open source (CodePlex), open standards (ODF is officially supported by Office), and the open Web (they're clearly genuinely trying with IE9+); on top of that, they're rolling out some pretty solid products (Kinect, Win8, MSE, WP7, Azure/AppFabric, all the random stuff MS Research does, etc.). Mind you, I still wouldn't be caught dead using Windows or any other Microsoft product, but it's nice to see such an influential tech company changing for the better.
"To say unilaterally that it's better for all people at all times in all use cases is, at the very least, an overstatement."
Yes. Just better for hackers who want to alter anything about the code they are working with.
"While being free is a top concern for you and many others, for some (perhaps many) usability might be their top concern and they may be willing to sacrifice freedom for usability"
False dichotomy.
>> "While being free is a top concern for you and many others, for some (perhaps many) usability might be their top concern and they may be willing to sacrifice freedom for usability"
> False dichotomy.
I disagree, he's not stating that there's a correlation between free-ness and usability. Just that what he (and maybe most "normal" people?) consider usable is not currently available in free form. As a result they end up with a choice between "better usability" and "worse usability" and it just so happens that the entire former category happens to be non-free.
Now I know a lot of linux fans disagree and say that linux is just as usable as OSX/windows, but I personally disagree and the markets appear to be backing me up on this. Unless you are supposing there is some other important reason for people not to switch?
Theoretically linux and the important applications can be made just as usable, but it hasn't happened yet...
The only thing the market backs up is that decades of lock-in, billions of dollars in advertising, pre-installation of the OS, and lobbying make the difference when the competition has no such resources.
The reason people don't switch may very well be that no one except tech-aware people has ever heard of Linux (I'm pretty sure my parents haven't). And as this poll suggests, for people who know Linux, it is a very, very good alternative to Windows, even better than that ;)
Do you honestly think so many people here would use Linux instead of Windows if it weren't "usable"? You know, I'd like to get my work done, so I'm using what makes me most productive. Certainly not Windows.
Sure, for programming linux (any unix, really) is superior (personally I use OSX). But, for just desktop usage (browsing, e-mail, video, music) I much prefer Windows to linux. OSX vs Windows is a bit of a toss-up, I prefer OSX because of the unix environment for programming, but I still keep windows around for games.
Since XP, the instability issues of Windows have been a thing of the past, and the Windows 7 user experience has been pretty solid in my experience. Granted, I don't do anything complicated under Windows (like I said, Unix wins for programming), but neither do most "normal" people.
Edit: Also, note I'm not saying that Linux isn't usable, it is just that most trivial desktop tasks take me significantly more effort to get done nicely under Linux. Whether the benefit of other tasks (scripting your system/whatever) taking less effort outweighs this extra effort depends on the frequency of the different types of tasks.
I'd argue that specifically for the standard tasks Internet, Mail, Video, Music, Linux is just as fine from a usability point of view. Of course the user doesn't have a benefit of using Linux if his Laptop comes preinstalled with Windows/MacOSX. But that was my point ;)
Wine for games is still a massive crapshoot.
Firstly, you will get compatibility problems with many DRM systems, which is a killer if you plan on running your legally licensed games without cracking them.
Assuming you can then get the game to work, expect graphical glitches aplenty, occasional (or not so occasional) crashes, slower performance, and issues with the audio and visuals being out of sync.
There are a bunch of older games (GTA3 etc) that work fine in wine, and some indie games that have been built for it to begin with but I couldn't recommend it to anyone as a gaming experience that is close to that available in Windows 7.
Completely agree with you. If you are a gamer, the only solution is Windows, there is no alternative.
Steam on OSX brought some hope, but the performance is just not there yet, and in any case, as long as developers stick to DirectX, there won't be any choice other than having a Windows installation. I can't blame them; DirectX seems to be well ahead of OpenGL, notably in terms of accessibility.
Besides that, while I'm not sure it's still a valid concern:
When I switched from Windows to Linux in the mid 90's it was a transition from a box with an illegal copy of MASM and a not-really-working copy of a C++ compiler to a box with a plethora of PL's, databases, code-examples and whatnot with an enormous educational value for me. On the winbox I learned Assembler, on the linbox I learned to program, that made the big difference for me.
Is it still that way for others or is the current situation on windows better? (not considering OSX as you can install pretty much anything there nowadays)
There are various ports of the GNU tool system and Microsoft has been making recent compilers available for download (and use, the free editions include redistribution rights).
Lots of other programming systems also enjoy at least decent Windows support.
Before I start, what follows is my personal opinion, based on my own experience. I'm not saying my way is right. I'm saying it's right for me.
Superior (i.e. free) will not make me use it (Linux, or anything else that's free).
Every January 1st I refresh all of my personal PCs and laptops. Every January 1st I kick off Windows installs on three of them while I give a Linux distribution a go on the fourth - a stock Sony Vaio P-Series.
Never, ever has the install run smoothly. And I have to admit that a single bump in the road is enough for me to delete the partition and put Windows back on. Because I need stuff to work. I'm productive when I program, not while I debug installations.
The kind of freedom you mention is not a reason why they are superior in some absolute way.
It's a factor which, in the real world today, makes them better for some purposes and worse for others.
There are reasonable but unproven conjectures that this kind of freedom could become the most important factor over time, but the existence of conjectures doesn't demonstrate any inherent superiority.
Every time I've tried to switch from Windows to any usable Linux distro, I've ended up frustrated by some driver issue, laptop hibernation issue, or some other problem that forced me back to Windows and kept Linux confined to servers or to a guest OS running in a VM on my PC.
I think ReactOS is what you are waiting for. It's supposed to offer you the freedom of any other free and open source OS while keeping the benefits of Windows. But it also awaits contributions. Spread the word.
ReactOS is poorly supported, incomplete, and user-hostile. Very little actually works on it--the software support is mostly limited to fairly old software, the only development happening for it is OS development (which is fine, but that's not a lot of use for normal people, or even software developers who don't have a burning desire to reinvent the wheel), and its UX decisions are a noted step backwards from Windows itself.
I don't intend to be rude, but recommending ReactOS as a legitimate Windows alternative is reality-averse.
"ReactOS is incomplete" = still needs development attention, which is what I've said - "it also awaits contributions". "user-hostile" as much as Windows has ever been. "Very little actually works on it" is actually more than what works on Linux + Wine, which at it's turn (like many would claim) do cover enough needs already, and that's only the "alpha stage" we're talking about here. ReactOS is getting better and better at the fastest pace relative to the rest of the OSS. You see, even if the ReactOS project had to start right now, it would still have a guaranteed chance of succeeding thanks to it's principles: it is free and open source like any other "free-as-in-speech" option out there, plus the one thing that makes the difference - it avoids alienating the users by breaking the backward binary compatibility for the sake of whatever reason. It offers the Windows simplicity.
"Free - as in Beer" has always been the more important point to me. This is followed closely by the ability to screw around with anything I want which, I suppose, is close to the "Free - as in Speech" argument, but for me it has always been about pragmatism rather than ideology.
On the contrary, it couldn't be more relevant. It's a fundamental issue of technology, and becoming increasingly important as we face the war on general purpose computing, a possible new advent of trusted computing with UEFI, and the list goes on.
I would go as far as to say Free Software is one of the most important things of the 21st century. We're moving towards an information society, and if we don't free the means to create, process, and transmit information (as well as information itself), we will lose control to a few big corporations. I'm not planning to let that happen.
>When was the last time you hacked at your OS's kernel?
I haven't done that yet, but I occasionally poke around in the sources for things like the coreutils, read them, try to understand them and learn from them. I plan to do that with the Linux kernel at some point, too. You simply can't do that on unfree operating systems.
You're thinking free as in beer. Free as in speech means open source. As a developer you gain more from that as you can tweak/maintain your own software if needs be.
Wow. I'm getting quite a few downvotes. I'm not opposed to Open Source software; actually, I'm contributing to one such project. What I meant is that when the price tag is so small compared to your earnings (since you are going to use the software for work), why would you make it a decision factor when picking an OS?
I disagree. You are entirely correct that fanboism is detrimental to any healthy discussion. But sweeping any discussion under the rug as fanboism isn't much better.
I use all three operating systems, and one major argument for me when doing development is the terminal. I know there are emulators on Windows, but I have had only moderate success with them. Ubuntu and OS X give me much more power in that area.
At work where I'm forced to use Windows, I have Cygwin installed. It's great to pop open a terminal and be able to pipe things around as you would on Linux. It makes Windows a little bit more comfortable, although Cygwin still feels like a second-class citizen (especially when it comes to the file system).
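To give a concrete (and purely illustrative) flavor of the kind of piping this refers to, here is a classic word-frequency one-liner that runs the same way in a Cygwin terminal as on Linux:

```shell
# Split text into words, lowercase them, and count the most frequent -
# a typical Unix-style pipeline that works unchanged under Cygwin.
printf 'Foo bar foo baz FOO bar\n' \
  | tr -cs '[:alpha:]' '\n' \
  | tr '[:upper:]' '[:lower:]' \
  | sort | uniq -c | sort -rn | head -3
```

Each stage is a standard tool (`tr`, `sort`, `uniq`, `head`) that Cygwin ships, which is exactly what makes the environment feel comfortable despite the second-class file-system integration.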
One thing that bugs me about putty is its cramped GUI, which isn't even resizeable. (The terminal screen is, but not the windows in which you configure the connection parameters).
Ah, gotcha. Yes, this is a huge gripe of mine as well. I don't want to have to launch RDC just so that I can then launch a cmd.exe instance and pscp something to a remote machine.
In my experience I've found OSX somewhat better suited for development and Windows somewhat better suited for gaming but otherwise the main differentiator is hardware polish. The unibody Macbook Pros/Airs is just levels above anything I've experienced in the Windows world.
I've just switched back to Windows and its the hardware I miss more than the software. I bought a fairly expensive Dell but it isn't a patch on the old Macbook Pro.
Windows globally runs great on MBPs. During the early days of SL, I was spending more time under my bootcamp partition than SL as it was buggy and unresponsive.
Agreed on the hardware. This is why I bought a MacBook and installed Windows on it. I still hate the damned keyboard, but I hate all non Model M keyboards, so I've learned to live with it.
You'll never eliminate flame wars or any of their derivatives.
You simply can't, for the sole reason that the people who participate in (or are offended enough to start) those silly subjective "debates" are probably suffering from a lack of identity.
Therefore they lash out at anyone who dislikes or doesn't share their exact preferences; it makes them feel personally responsible to "defend" their choices in life.
Flame wars are inevitable; the internet was made for venting, and complaining that people are complaining is a big waste of your valuable time.
I've honestly found developing on Windows a much bigger pain in the arse unless you spend a lot of time setting it up. It's not that you can't do things on Windows - anyone with skill can do anything on anything - it's just significantly less easy; it feels like an OS set up for consumption as opposed to creation.
I'm not sure that's actually the case, but if it were then deliberately using one that was locked down / charged for / oligopolistic would be a bad choice, not a neutral one.
Every day, I start up a computer running OS X. Then, I proceed to fire up virtual machines for two Linux distributions, Windows XP, and Windows 7.
I occasionally play around with a headless system that I have running FreeBSD. Then I may power up my iPod Touch to test recent changes to my mobile site. I also do a significant quantity of work while I am logged into a remote debian server via ssh.
I doubt that I am the only developer with this type of daily routine. I actually do think that this poll is pretty nifty (and I checked off OS X), but I think that it is still worth observing that the term "primary operating system" just doesn't quite carry the same weight as it used to.
Depending on how you use it, OS X can feel a lot like Linux. If you're spending most of your time in Terminal and Emacs, usually ssh'ed into a server, then your technical answer will diverge pretty far from your "real" or metaphysical answer.
I work for multiple clients. After a few years of messing with "I need a postgres for this one, a mysql for that one, and sometimes things just break on the live server because Mac OS X is only almost Linux-like", I switched to a fully virtualized development stack and haven't looked back since. So the answer is not metaphysical, but true.
I can't comment on that. I don't develop anything that requires a GUI on the machine. I just ssh into the machine and do my thing there. All editing/browser viewing is done on the host.
This is my case - whether it's a Linux VM or on the cloud, the server OS is Linux.
For my desktop, OSX provides the best combination of spatial window management (you can drag-drop nearly anything - esp. text without wiping the clipboard buffer), terminal friendliness, and MS Office (yes, Excel is still better than alternatives). Lots of hidden gems and a sustainable indie dev market.
I don't play much more than the occasional Nethack, so my need for Windows is pretty limited.
OT question for you: I'm a developer that does a lot of web development, from Perl to RoR. When I end up starting more than 1 VM with 4 GB RAM, my computer literally dies with Mac OS X. Now this is all with VirtualBox, which I suspect isn't the best all around VM application.
For some reason I feel that it is the crappy memory management with OS X that is killing me.
Other than getting more RAM, do you do anything special? What VM software and how much RAM are you running?
First, even if it's the answer you ruled out: get a machine with lots of RAM. Don't worry about the rest; even an SSD is not that important, just cram as much RAM into the machine as possible. 8GB would be good, 16 is better. Make sure the OS has enough room to shuffle memory.
Use tiny VMs. Most development stacks actually fit in 512MB, as long as there is nothing else running. Pay attention to which parts of your dev stack actually consume the memory. In my case, it's mostly in-memory databases. Sample those down to smaller datasets; it's good practice anyway. If it's still not enough, use odd values like 700MB. Rule out memory leaks (this is one of the big advantages of small VMs: memory leaks are easy to find).
Also, use one VM per project. Unless projects are tiny, putting 2 in one VM only replicates the problems of your host-system.
Finally, I also use VirtualBox with Vagrant and am quite okay with it.
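To make the "tiny VM, one per project" advice concrete, a minimal Vagrantfile along those lines might look like this - a sketch only, with the box name and memory figures as illustrative placeholders rather than anything taken from the thread:

```ruby
# Vagrantfile - one small VM per project, with a deliberately tight
# memory cap so leaks and bloat surface early.
Vagrant.configure("2") do |config|
  config.vm.box = "ubuntu/precise64"   # hypothetical box name

  config.vm.provider "virtualbox" do |vb|
    # Most dev stacks fit in 512MB with nothing else running;
    # bump to an odd value like 700 only if that proves too tight.
    vb.memory = 512
  end
end
```

Keeping the cap explicit per project, rather than letting every VM default to a generous allocation, is what makes several VMs coexist on a 4GB host.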
This is what I'm thinking; I'm kinda spoiled at home, with both my main box and my VM server each having 8 GB.
You do make a good point about the extra cruft that isn't needed for a VM. I should actually know this as I have several LEBs on the web and optimize them highly for low memory usage. I guess personal time < work time.
Going from 4GB to 8GB on my mbp made a huge difference running VMs. Also, get Fusion. For working in VMs all day Fusion has worked better for me over VirtualBox.
You should try out Parallels then, as I find it much snappier than Fusion. After having tried Virtual Box, Fusion and Parallels, I kept the last one as it provides the best experience of the three imo.
> my computer literally dies with Mac OS X. Now this is all with VirtualBox, which I suspect isn't the best all around VM application.
Asked and answered. I like VirtualBox - the price is right for when I just need to run an app or two on rare occasion - but it certainly isn't the most stable or least OS-crashing VM I've ever used.
What is this "crappy" memory management of OS X that you speak of? When it comes to memory management, OS X is definitely among the best from my 14 years of experience with modern operating systems. For what it's worth, I regularly run two VMs in VirtualBox totalling just over 2gb of guest RAM allowance, on a 4gb machine running 10.6.8, and I don't suffer problems with this. Users of 10.7 claim that it's a wee bit hungrier than 10.6, though I still can't recall the last time I saw anything else than "Swap used: 0 byte" in the Activity Monitor.
I can't seem to find the blog post about the memory architecture in Mac OS X, but it was rather recent; some guy blogged, under 8 months ago, about the crappy memory management used in OS X and why he ended up switching platforms.
I have yet to try 10.7 - I'm stuck on 10.6 until the boss allows us to upgrade. This will probably be my next big upgrade before anything else.
I use OS X for browsing the web, web-development, graphic design, video editing, and writing.
I use Windows 7 for music production and playing video games. At some point I want to start dabbling in game development, in which case I'll probably go with Windows for that, too.
I ssh into various Linux servers privately and at work. In my spare time I sometimes play around with Linux distros on my desktop just to learn about the current state of affairs. I usually can't see any advantages in it over OS X other than the fact that it's free software and that it runs on cheap hardware. I keep being curious though.
I don't use any virtual machines because I dislike the sluggishness and I don't really need them.
Sometimes I wish I could get by with only one OS without feeling crippled in some respect. My dream setup would probably be an OSS system that's great with multimedia stuff and has about a 99% adoption so hardware would be supported really well.
But that thought depresses me because it reminds me of the state reality is in, so I try not to have it.
I am fascinated by this claim. The last time I looked at Windows for music production was a long time ago, but back then CoreAudio beat the pants off of ASIO for real-time work. Is that not still the case?
What software do you use? Two packages that I use heavily, Logic and DP7 are mac-only.
There are a few freeware plug-ins I use that are Windows only. If I could afford to buy Altiverb 7 right now, which is OS X only, I'd probably switch to OS X (at least till Altiverb came out for Windows).
How is the Mac a better platform for audio? This is not a rhetorical question. I keep hearing this, and some say it's because of CoreAudio, but so far I haven't been able to find a thorough explanation that's not based on biased assumptions.
The Variety Of Sound stuff is what I'm missing on OS X. I use those a lot.
Many musicians say this because Windows systems can, for a variety of reasons from hardware drivers (certain Firewire chipsets and motherboards) to bloatware, become very glitchy and finicky when it comes to low-latency recording. It's very difficult to predict if new hardware will work or not and it can be very time intensive to troubleshoot when the problems arise. Unless you buy from a music PC specialist, you're unlikely to encounter sympathy from support desks.
By contrast, every Mac comes with GarageBand and is built from the ground up for reliable recording - if you buy a system and you hear glitching in recordings (which I've never heard of happening), you can take it to the Apple store and have a technician troubleshoot the problem.
I don't think most PCs face this problem (although prevalent hardware like HDMI ports is often problematic for smooth audio recording), but musicians tend to recommend Macs because the certainty that it will work out of the box has a lot of value.
Although I think this has a lot more weight when you talk exclusively about laptops, with which I indeed have had so many problems in the past that I would recommend a MacBook to any fellow musician asking me for advice, especially if he is going to go on tour with it.
The two DAWs I've assembled myself in the last 12 years were both super stable and performed really well. I don't think I would have gained anything by using a Mac.
In fact I'm on a machine that I recently built which dual boots into Windows 7 and Snow Leopard. Maybe I will benchmark Reaper in both of them and come to a surprising conclusion. If I do, I'll post it on Hacker News.
Out of curiosity, since you sound not like a fanboy of any particular os and just want to use one: What keeps you using osx over windows for browsing the web, webdev and graphic design? Are there any special advantages or is it just software which is not available for win?
Graphic design: not many. Just little things, like the great desktop zoom, the nifty screenshot shortcuts, simple access to special characters. Nothing that Windows couldn't do without some modifications, just maybe not as nicely. Also, I spend most of my time in OS X anyway, and I don't want to boot into Windows just for quickly creating or editing a file in Illustrator or Photoshop.
Browsing the web: font rendering. I don't like Windows' aggressive hinting and the one dimensional anti-aliasing. Some non-standard fonts I find not only ugly, but unreadable on Windows.
Webdev: Unix underpinnings, Rails. Text rendering, again. And I feel like I'm a lot faster at switching apps and searching for stuff on OS X. But that's probably just habits I've built over time.
Overall, OS X gets in my way the least. That's why I would choose it if I had to choose just one OS. But I don't, so …
Same here, as a developer of a cross-platform library, this is the only configuration that works for us. By restricting virtualization of OSX, Apple has effectively forced most developers to use this configuration.
I've built departments and platforms on the old faithful OS - I even blog about it - but my contracts demand Skype or even Oracle, and whilst you can sort of get those things working, I can no longer afford the extra time it takes.
Ubuntu just managed to auto-install X on a chipset I could not get working manually, so I am going to Linux land for my workstations. The contracts will require Linux and Windows servers, so FreeBSD will run my in-office DNS for now.
The network effects of things like Dropbox, Skype, and so on are having their effect. I have gone from feeling my OS choice is a secret weapon to feeling it's a drain on time and effectiveness.
You might try PC-BSD for a Ubuntu-like experience with a FreeBSD base. I know it autodetects graphics/sound during its installer and can setup stuff like Skype/Flash for you pretty easily.
That said, I did the same thing and now use Ubuntu on my laptop and workstation. The only apps I really want are a terminal and Skype since I can run Konsole+tmux/Firefox/Chromium/Okular/etc remotely displaying locally via X.
For desktop use I'm migrating back to Windows after having used Ubuntu almost exclusively for the past five years.
It's a shame, because Ubuntu was soooo close to surpassing Windows from a "I just want to get stuff done without mucking with the machine" standpoint. But then Windows got much better with version 7 and Ubuntu took a nosedive into crazy UI fantasy land.
The final straw was when text started randomly disappearing and reappearing in emacs and xterms in Unity. I don't know why it happened, just that it was too much trouble to diagnose some weird bug for the millionth time. If I can't trust my desktop to run emacs(!) without glitching, see ya.
I'm getting older, and after about 15 years playing around with config files, I just want something that's stable and consistent, and OS X was never to my taste. Windows is leaps and bounds better than when I heavily used it last.
And Linux runs fine using Xfce in a VM where you can count on the host to have all the hardware working.
Interesting that you say "crazy UI fantasy land". It wasn't until Ubuntu 11.10, and now 12.04 that I thought desktop Linux had a shot. Unity in 12.04 is quite awesome and makes me super productive. It gets out of your way and that is how I like it. Sure, I make some modifications (smaller icons, autohide the launcher, change icon set to Faenza), but that is about it.
And, since being back in Linux land, I have never, ever had to maintain a system config file or touch it at all. I do have my own zshrc and .vim stuff, but I would have that on a Mac as well. Overall I find Ubuntu to be more productive than OSX. And since I virtualize my dev instances, I want something lighter and Ubuntu is WAAAAAAY lighter on the same hardware than OSX.
It makes me happy to know that someone here likes Unity, because I think it is horrible. But then again, my preferred GUI for Linux is Ratpoison and some Xterms.
My preferred UI is the Awesome WM so I know how you feel but I think Unity is going down the right road for several reasons:
+ Composited desktop means an X11-artifact-free experience
+ Keyboard shortcuts for most tasks
+ Unified experience for different apps/programs
+ The windows key is no longer a dead key
- Custom key bindings are hard
Back in 11.04 I absolutely hated unity, but now Stockholm syndrome is setting in. When I go back to 'regular' desktops I find that I have to make a conscious effort to manage my windows. I now get annoyed if I have to move windows around.
Unity appears to have altered my behaviour and I think the change is pretty nice.
Our experiences must have been quite different, since I finally ran away from Windows after four years of using Ubuntu in parallel. Windows 7 is nice, but I still find Ubuntu easier to use and much less resource-intensive. True, I was at first put off by Unity too, but I have adopted Gnome Shell and have had no need to look back since. Each time I reboot into Windows I feel like I'm in one of those nightmares where you're running as fast as you can but are barely moving.
"Ubuntu took a nosedive into crazy UI fantasy land"
Is that really the reason you ditched Linux?
It is so easy to install a different UI. Nothing is as clean and consistent as Window Maker or any of the other extremely lightweight UIs Linux lets you use.
Compared to that: whenever I start up Windows (XP) in vmware to get my tax stuff done, I am horrified by all the messages that pop up (repeatedly), antivirus practically paralysing my computer for minutes (or longer), Genuine Advantage stuff, etc. For my taste, Windows still needs so much work invested in configuring away these annoyances before it is usable.
First off, occasional forays into a Windows VM are always fraught with Windows and AV updates - you can delay them if you're only in for a few minutes, but...
That eventually got me to dust off an old netbook and use it as my Windows "server" that faithfully wakes on LAN; I just keep it in the cupboard and access it through Remote Desktop (Microsoft makes a great OS X app for this).
Every week the machine wakes up, backs up (to an image), fires off a bunch of updates, potentially reboots, then goes back to its quiescent sleep state.
The RDC is generally faster and has the upside that crazy activity on Windows stays on its own hardware, and I don't have a crazy 60GB file clogging my backups - the netbook has its own backup.
> I'm getting older, and after about 15 years playing around with config files, I just want something that's stable and consistent [...] Windows is leaps and bounds better than when I heavily used it last.
This is why we use Windows. When we buy a new computer, it's a former corporate business class laptop sold on eBay, it comes with some version of Windows preinstalled, and we don't mess with it. As with the recent Gmail discussion, the reason we use it is that it simply works and we need it to do work.
Here follows nostalgic reminiscence and some ranting.
I used to run OpenSUSE exclusively. That was fine when the world was young and my job didn't demand too much of my time. I didn't need any graphics support, WiFi or even ACPI.
10 years later, I don't have the patience and the time to sculpt a unique snowflake that will work with all the hardware in my laptop, and have it work in any reasonable network I might have to plug in to. Windows 7 works well enough. This is also a pity, since I've come to strongly dislike the post-Vista Windows UI - the new start menu, the new task bar, the new Windows Explorer windows. Everything was Just Fine with the Windows 2000 UI.
At work it's all Red Hat Enterprise Linux, a tiny bit of AIX (ewww). And I run NetBSD on Prgmr just for the heck of it.
I hate where we've gotten to. Opera is the only browser that still has a g*ddamned proper status bar.
My main machine is Ubuntu host and I run virtualized Ubuntu dev instances. Frankly, nothing can beat it. Unity in 12.04 has been a god-send to me using Linux and I don't think I'll be going to anything else for quite some time. Seriously, if you haven't tried Ubuntu 12.04, you owe it to yourself.
BTW. I used Slackware for years but had to use OSX at a job for like 4 years. When I left that job I went and bought a MacBook Pro out of habit and I realized I just didn't feel natural using it, even after 4 years. That is when I put Ubuntu on the system and felt the "ahhhhhh" feeling. Nice to be comfortable again. Unity really is an awesome interface. First real innovative interface I've used in years.
This is the first time I've seen someone praising Unity. As soon as I tried unity I shifted to Kubuntu, but now that you mentioned, I will try 12.04 with unity.
Couldn't agree more. The whole mentality in the OS X developer world glows of such a strong focus on usability and UI design, while, for some reason I cannot understand, the Windows and Linux world is mostly the diametrical opposite. This was the first thing that I noticed when I moved over to OS X; the difference was incredibly clear, and it struck with great impact. I think people in general gravely underestimate how much in good software lies in a good interface connecting the user with said software.
I am not from the valley and it is heart-warming to see Linux beat out OSX. Especially considering the number of mac books I have seen engineers carry about, the couple of times I have been there.
I'm kind of indifferent to OS X but the Mac laptop hardware is so nice, particularly the trackpad, that I'm likely to stay an Apple customer for the foreseeable future.
<rant>
Am I the only one who especially hates the trackpad? I feel like it has a much lower relative spatial resolution, i.e. even if you set the maximum cursor speed you still have to make long motions, and the precision for some reason is far from pixel-perfect.
And pixel-perfect pointing is something to be desired with those ubiquitous small controls in OSX.
In discussions like this, where serious hackers praise OSX over Windows, I feel like I have some strange disease that makes me prefer my old Toshiba U500 over the new MacBook Pro sitting on my shelf. (I really want the MacBook Air but feel like I'd hate it too.)
</rant>
Joking aside, I've never found any trackpad, Apple or otherwise, that I could use as efficiently for pointing and clicking as even a cheap-o mouse. But gestures and scrolling are so smooth and so useful on OS X that I keep one around (even when I'm at my desk - I now keep a mouse to the right and a trackpad to the left).
Aren't MacBooks overpriced? Personally I'd rather buy a laptop and a tablet (something like the Transformer) than get something that costs twice as much, or offers less hardware for the same price.
Disclaimer: I am from a parallel programming background, so I am biased towards performance rather than form factor or battery life.
The last one I bought (a 13" about 18 months ago) wasn't. Yes, you could buy a Windows laptop from Dell or HP or someone for half the price, but it would have less than half the battery life, it would be larger (albeit probably not heavier; Macbooks are the densest laptops I've found) and it would have a plastic shell instead of aluminium which feels worse when new and won't age well. It would also come laden with crapware, which isn't an issue for me since I've put Linux on it anyway, but it is a big deal for most people - the majority of machines I see running badly these days are doing it because of software, not hardware.
I guess if you're coming from the perspective of maximum CPU performance per $ being the key metric, then they probably aren't a great deal, but for most people that's not really the most important thing any more.
I like to consider the total cost of ownership when purchasing a piece of hardware, which is hard to quantify and sometimes a touchy and subjective topic but I have found Apple Computers/Laptops to substantially outlast anything from Dell or HP. I would lay down 2K for an Apple and never quibble (that much) but I would never do that for any other brand (I have just had lots of very bad and costly experiences with them....Dell in particular)
Fair shout. Like I said, I bought my last laptop 18 months ago, and back then I didn't think Dell had any good options; admittedly their website is a maze of twisty laptop models, brands, and price points, in which I may well have missed something good. Amusingly, in this case I can't see any picture of their laptop, and when I leave the tab via ctrl+shift+tab it pops something up saying '9'. Very professional.
I would also observe that I'm wasting money on an operating system I don't really want on the Dell as well, although it would likely be somewhat more useful to me than OSX.
I guess that it does look like a really nice machine. I think it says something though about Dell's image as a company that I find it hard to believe they've genuinely created something that good at this point (and yes, I have owned Dell machines before; that's part of why). I guess I should probably cut them some slack...
Well, I did mention two 1080p laptops (1920 x 1080, 24-bit color depth). Both were bought for around $1200, with substantially better hardware than what Apple offers.
What metric are you using when you say equal display? I have two laptops that display at 1080p (no, not an external monitor, just the laptop displays themselves).
I recently purchased a <$1000 HP laptop because I couldn't bring myself to pay the Apple tax (basically double) for similar hardware. I'm currently having a hard time with the feel of the machine. The worst part is the trackpad: multi-touch scrolling is flaky, and I either get unintended clicks or have to enable a "palm check" feature that turns off the trackpad while typing.
I'm deciding laptop ergonomics are really important (but $1000 important? cringe)
The three laptops I had before my MBP were all from different companies (ASUS, Gateway, Dell), and I hated the hardware on all of them. There was always something going on: the case didn't close right, the battery life sucked, the trackpads were awful and itty bitty, the wifi cut in and out, the power cord needed to be jiggled just right to charge, or wouldn't charge at all unless held in place with tape just so... between the three laptops and six years, there was always SOMETHING going wrong. So you could say that I originally purchased my MBP because of the hardware. I expected to install Windows on it (I didn't, but that's a different story).
I swallowed the extra cost when I did the math. I use my laptop about 3000 hours a year. Day in, day out, I'm on it in some way. They usually last me 2 years or so before I get the itch to upgrade or they fail. I did the math and found that the ~$1000 laptops I bought cost me $0.003 per minute over the course of those two years. A $1300 MBP cost me $0.004 per minute. I decided that I would rather pay $0.001 more per minute to enjoy the hardware I used. I'd rather be in a rage over the fact that my code isn't doing what I want than if my battery isn't taking a charge again.
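The arithmetic above can be sketched quickly, using the commenter's own figures (roughly 3000 hours of use per year, a two-year lifespan, and $1000 vs. $1300 price tags):

```python
# Rough cost-per-minute comparison using the figures from the comment:
# ~3000 hours of use per year over a two-year lifespan.
HOURS_PER_YEAR = 3000
YEARS = 2
minutes_of_use = HOURS_PER_YEAR * YEARS * 60  # 360,000 minutes

for price in (1000, 1300):
    per_minute = price / minutes_of_use
    print(f"${price} laptop: ${per_minute:.4f}/minute")
```

Rounded to a single significant digit, these come out to the $0.003 and $0.004 per minute quoted above; the whole premium amounts to about a tenth of a cent per minute of use.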
An added side benefit has been how much tougher the MBP is. I've had it for two years now and it shows no signs of needing replacement. I probably won't upgrade until I see how Apple's Retina display transitions to their laptop line, and the upgrade then will probably be out of lust rather than necessity.
We're talking about the primary tool of our trade. I make my living using this device. If I was repairing cars again I'd be buying Snap-On. Yeah, it's expensive. But goddamnit if I break a wrench they replace it for free. The guy drives to my house and gives me demos. I don't need my tools giving me grief, I need to get work done with them.
> An added side benefit has been how much tougher the MBP is. I've had it for two years now and it shows no signs of needing replacement.
MBPs are not that tough, though. I have a colleague who really shouldn't be left alone with machines; his MBA (which he brings everywhere and takes little care of) has suffered quite a bit: it's horribly dirty and has quite a few dents. A Dell at half the price would probably be dead already, though. He'd probably benefit from a Toughbook C1; that's a step up in ruggedness. The MacBook's aluminum is nice, but not that solid.
FWIW, I've had a (2006) MacBook 1,1 since it came out (cost ~$1200 without AppleCare). In the same time, I bought an HP (consumer-targeted, ~$800), resold it out of frustration with the trackpad, and got a Dell Precision M4500 (a refurb _and_ a 20%-off coupon brought it down to ~$900 with a 3-year service contract). The Dell was awesome with a mouse, but the trackpad was still basically useless, and the IntelliPoint wasn't nearly as comfortable as the MacBook's trackpad, so here I am, back on the MacBook for the foreseeable future.
All that to say: the trackpad, keyboard, keyboard controls for display brightness, and (in my experience, obviously anecdotal) battery longevity do seem to be worth the money. There are a ton of weird little pieces that Apple really gets right that don't show up on the spec sheets.
I have been using Linux as my only operating system since Yggdrasil Fall 1994. Before that I was a happy Amiga user.
After Yggdrasil I took a short turn to Slackware (downloaded a lot of floppies), and then switched to Debian which I had heard nice things about - and I have been running Debian ever since.
I always get surprised at how complicated other systems (Windows, Mac OS X) seem to me, when I briefly encounter them. I guess it is the power of habit.
I've been a Linux user since kernel 2.0.30 was new.
I have used Windows at work for 6 days total. I used it at a Java shop where the build system was Windows only. After 6 days I needed to get my job done and I installed Linux to get better dev and network diagnostics tools, hacked the build system to not include .bat-scripts, got my job done and never looked back. That was my first 6 days as a professional software engineer and my last 6 days of using Windows for non-gaming purposes.
Since 1998, my primary private system has been Mac OS. In 2000 I gave my last Windows notebook to my brother (a $9000 Fujitsu Siemens Lifebook E6560; he used it until a few months ago and I'll get it back soon).
At work, I used Linux exclusively until 5 years ago. Since then, my notebook has run Mac OS, but large processing jobs still go to our grid, which consists of several Linux machines with 172 GB of RAM each. I am an astronomer, and I process quite a bit of data.
I fought for Mac OS at work, and since I don't have to use Linux on my notebook any more, life is much easier. But Lion and Apple's cloud-integration BS make me fear that Linux will have to come back someday.
Do you encounter many issues performing your day to day work?
I've been eyeing it for a while and I had it running on my home server for a time but I didn't think it was quite ready to be my primary OS. I generally use Arch Linux at the moment, mainly because I like the configurability and frequent package updates.
There are a few glitches -- I don't have the latest Intel graphics drivers, and as a result video doesn't work when I resume from S3, for example -- but aside from those, it's really quite straightforward. For a long time there were problems with Flash, but with Linux emulation enabled it "just works" under Chromium; and it's been years since I had OpenOffice fail to open a Microsoft Office document. Java is a bit of a nuisance to build, but for licensing reasons, not technical ones.
Obviously you need to be comfortable at a command line -- the most common "maintenance" you'll be doing is updating ports, and every couple of months you'll find that /usr/ports/UPDATING says "the foo port has [done something weird] and you need to run the following commands before/after updating" (often it's a matter of deleting a port because its contents got merged into something else) -- but if you've got Arch Linux experience I can't imagine that you'll find much difficulty using FreeBSD.
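For anyone curious, the maintenance routine described above looks roughly like this on a FreeBSD box of that era (a sketch assuming the portsnap and portmaster tools; portupgrade is a common alternative, and exact steps vary by release):

```shell
# Fetch and apply the latest ports tree snapshot.
portsnap fetch update

# Check UPDATING for ports that need manual steps before/after
# upgrading (the "the foo port has done something weird" notes).
less /usr/ports/UPDATING

# List installed ports that are older than the ports tree...
pkg_version -vL=

# ...then rebuild everything that's out of date.
portmaster -a
```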
Depends on what you expect, really. Setting up isn't much harder than, say, Arch.
Hardware support is generally OK, though if you want 3D acceleration, Nvidia is about the only safe option (ATI support is absent, and Intel can be tricky or unsupported, depending on the chip).
Keeping 3rd party programs up to date can be a real pain, though. I haven't found a way to avoid long compilations, occasional breakage, and weird library issues. Compiling Firefox can be annoying.
(I run FreeBSD as my home server where basically all my data and important programs live, and a variety of other "throwaway" OSes in my day to day use)
I tried out PC-BSD, which I think is based on FreeBSD. The only issue I had was everything being built from source. Considering that userspace is pretty much the same for BSD and Linux, I am guessing there won't be many issues; coming from an Arch Linux background certainly helps, I think.
The drawbacks have to be the lack of certain binary driver blobs that are available on Linux. Some may call that a feature, though.
A choice between the 'big three' is rapidly becoming an uninteresting decision. No concessions for the rare people who use Haiku, wrote their own OSes, or use their phones/tablets as their primary device?
I've been using Debian/Ubuntu as my work OS since I started studying 11 years ago, with a dual boot for gaming.
All my (IT) colleagues are using Mac OS X.
My pro/contra list:
Contra:
- gaming
- want a proper desktop search engine/crawler (one that doesn't eat up my CPU every day when indexing, creating far too much noise)
Pro:
- everything else; never looked back
edit:
When I think back, so much has improved in regards to Linux; it's awesome. No more kernel compiles, no more X11 configuration editing, no driver hassles. I'm curious what the future brings :)
OSX on laptops, OSX or Linux on desktops, Linux or FreeBSD on servers, iOS on mobile/tablet.
I don't know how you define "primary"; outside of a web browser, I spend more time at a command prompt on a Linux machine than at the command prompt on OSX.
I'm sitting at a Windows 7 machine, and quite frankly I don't get the hate about it. It's beautiful, it runs programs, and with those programs you can do work and view media. Heck, it even plays games (WOO, who knew, right?). While there are parts that aren't really thought through when it comes to UI design, it's still the best thing I can afford. Some day, when I win the lottery, I'll try out a Mac, but till then it's all Windows. And to those lovely Linux freaks: the day you can run Adobe Fireworks / Illustrator / Photoshop without the hassle of Wine, do let me know.
The trouble with digital photography apps is the lock-in - I've got tens of thousands of photos processed with LR, so I'm basically stuck with it. Glad that I didn't go with Aperture - at least I have two platforms to choose from rather than one.
Really? Come on, most stuff you copy from somewhere (you can find tons on GitHub, for example). For the rest, you only have to know that the list syntax is like Python's: [1, 2, 3]. I don't know Haskell either.
I've been running various flavours of GNU/Linux on my laptop for a couple of years now, Arch being the most recent. I find it perfectly adequate for IDE-less, Vim-centric coding, which takes up a significant majority of my time in front of a keyboard.
I wonder what are experiences of members of the HN community who run Linux on their MacBooks, especially in case of a development-oriented environment. Could you share your opinions about such a setup? Is it worth the price of the machine?
I use OSX and love it. For personal use it's awesome and feels a lot like Linux when I need it to; and when I actually need Linux for work, I can just SSH in through the terminal.
I love how easy it is to develop on OSX because it comes with a lot of dev tools preinstalled. Not that it takes much effort to get your own environment set up; it's just nice how they package it all together.
Different OSes for different modalities. For desktop environments, I usually prefer Windows 7. I think Win 7 has really good window management, and it works especially well across the multiple monitors often found in a desktop setup.
Apple hardware is top notch and robust. I don't need that in a desktop - my hardware isn't going anywhere (hell I don't even use a case for my desktops). For a laptop though, portability and build quality matters. So for those I'll buy Apple hardware, and thus use OSX as it integrates the best with the hardware provided.
For servers, I like a light and lean platform, so I'll use linux. Requires very little maintenance, runs solid, and eats up very little resources. Right now I'm using Ubuntu on my two linode servers. I'm only hosting websites on them and seeding the occasional torrent, I don't need a custom setup. The out-of-the-box functionality that ubuntu provides works great for that.
This is by no means an end-all/be-all rule, but for the most part I find that this is how I'm using my devices and OSes.
I'm using Windows 7 for CAD. I now use Solid Edge, I used to use AutoCAD, and I think I'm headed for Pro/E.
I'd like to use anything but Windows, but guess how plentiful are my options to use CAD on other systems? Nonexistent.
The economic niche of CAD systems would have room for a good CAD package running on Linux; the bounty is good. At the moment it's practically impossible to get a serious CAD package legally for less than $2000 per copy.
I'm guessing that the 3D printing scene is held back by modeling prices. It's often more expensive to model an object than to get that model printed.
I feel like I'm saying "Carthago delenda est", but I strongly feel that this should be addressed.
OMG, I was just 3 when you began using Linux. I would like to know: is using Linux some sort of addiction? The reason I ask is that I have been using Linux for 6 years, and I think I am getting addicted to it.
The awesome thing about Linux is that I got a Unix-like system for free (Windows 3.1 was mainstream at that time), and that I can do in Linux whatever I want to do, because it is open source. Linux hasn't lost its fascination, although I am a bit disappointed by the current, overly tablet-oriented developments (GNOME 3, Unity).
Developers will never give up a real operating system. You just can't develop on consumer devices like the iPad; these are designed for consumers, they come with many limitations to make the consumer's experience easy and smooth, but at the cost of hackability.
Right, but whatever directions things go towards, do you really think that developers will use "browsers" as the operating system?
Look at photographers. Despite all the consumer level digital cameras that have flooded the market, professional photographers (and even hobbyists) still buy the more expensive (and more complicated) cameras that a consumer would probably never buy.
Assembly to C is not the same thing as Unix to Chrome.
Your argument would work if, say, people start writing operating systems in Python.
Another important thing to realize: there's a big difference between the OS and the GUI. The GUI might very well be implemented on top of "web" technologies, but that's not enough to say that the OS is the browser.
Well, in my view the portability improvement from assembly to C is mirrored in browser applications compared to native ones. The browser today abstracts a lot of things for you, including network communications, device interaction, file system interactions, graphics... the list goes on.
These aren't all GUI-related things. I strongly believe you could now have an OS that basically boots you into a browser, with all OS interaction going through web APIs, and still build most of the applications we use today. Obviously there will always be cases where you need to drop down a level.
The developers of Chrome OS and Boot to Gecko seem to agree somewhat.
Windows 8 supports HTML/JS for creating apps using WinRT, but it's hardly usable in any web browser the way Enyo apps are. Not to mention that creating apps using .NET or C++ is still very much a possibility, perhaps even the preferred route.
It has not really changed in the last 40 years or so, though. I would be surprised by a radical change in 5. I don't see any replace-the-OS companies at YC yet, either.
"The day is coming when Chrome, Firefox, Safari will be the poll options."
I am astonished. I wouldn't trust anything from Google any more. They aim at control over our private data. Absolutely unacceptable for everyone, but especially for developers, if you ask me.
OSX also has both of those, although I'm not sure whether you have VB support on OSX; I'm guessing that if you're using Calc you don't need it anyway. If this setup works for you, that's great, of course!
VBA on OSX is fine, if a little slower than on Windows (yes, it can get slower!). There are a couple of funnies around file system access, but well-written code will work with no changes.
Ubuntu desktop, Ubuntu laptop, iPad; 95.5% of our servers are Linux or BSD. In our office, we have one old Windows XP box on P4 hardware to remote-desktop into for the various stuff that can't be done on Linux or is significantly easier on WinXP. It's rare.
I tried to use OSX as my primary OS for a year but I couldn't be as productive as I am on Windows. However I would really love to be able to use Linux.
MS Office is really a lot better on Windows. It is miles ahead on features and stability in, for example, Word, Outlook, and Excel. It is so much more powerful, and if you are an advanced user, the OSX version is just frustrating.
The file management and the window manager of Windows are also, to me, a lot more convenient than on OSX.
On Windows you also have the Adobe CS and there is no substitute for it on Linux. Running it under Wine doesn't give the same user experience.
I see more and more web development tools that are OSX-only. I hope that doesn't mean it will become necessary to use OSX.
I run windows, but always run a Linux VM in which I do all of my dev work, and have an OS X laptop which would be a linux laptop, but Logic/Final Cut Pro keep it that way.
Any particular reason you have a Windows host + Linux VM instead of the other way around? In my personal experience, a Linux host + Windows VM performs better than the reverse.
Windows handles media much better than Linux. I have a Linux media server as well, and while I like it for some things, getting Blu-ray/DVD/MP3/CD/Netflix/etc. working on it is not one of them. Windows just works in that sense. Also games, although that's more of a theoretical bonus, as I rarely play anything.
I virtualize Linux under Windows too. My primary reason is gaming. I can use Linux without any hardware acceleration. My dev environment is entirely virtual and there are advantages to being able to throw it on a thumb drive and carry it around. Or I can be staying with a friend for a week and setup my ideal working environment on their computer in a few commands. Another advantage is it's much smaller than a Windows VM so I can store more Linux VM's/snapshots on my puny SSD.
The two apps keeping me from running Linux full time are Ableton Live and Adobe Fireworks. Unfortunately, it doesn't seem likely that the high-end proprietary content-creation tools, or anything comparable from the OSS world, will come to Linux any time soon.
Never used Ableton, but I've been dabbling in Reaper[0]. I can't make a full switch yet, but am hoping to be able to at some point in time. From the little I've seen it isn't missing anything major.
Final Cut Pro is another story, but I really don't use it very often anymore, and I'm well-versed enough in Adobe Premiere that I could probably drop it.
I'm not holding my breath either, but I do dream of a world where most software is OS-independent. I guess we're closer now than we have been for a while.
EDIT: I just remembered that Reaper is Win/Mac. I hear it runs under Wine, but I'd be worried about latency there. Alas, we are captive to our tools.
Personal laptop: Linux Mint 12 (was running Ubuntu since 8.04 or so until recently... since Unity I've been wanting to move to another distribution)
Work desktop: 21" iMac running Debian (dual-boots to OS X but I rarely need it, and I'm much more at home in Linux)
Synergy works to share the mouse / keyboard across the desktop to the laptop to create a three-screen environment at work.
I used OS X for a good month or two when work got the new iMacs in; it just wasn't my cup of tea. My coworkers have a mix of Windows and OS X installs mostly.
I waffle between OS X and Linux. They're both running at my house, more Linux than OS X, but when I got a new Verizon iPad with 4G, I finally set down my Cr-48 running Ubuntu (love Unity on that machine, btw), and picked up my old 2008 MacBook again. Hacking: I find linux more pleasant. If I bought a new computer today, it may well be a Lenovo tablet, in the old, swivel-screen definition of the word, because I really miss the high input resolution of resistive touch.
Serious CAD could be, for example: AutoCAD, Solid Edge, Pro/E, CATIA, or MicroStation, just to name a few. ArchiCAD doesn't fit my needs, as it's practically only for visualizing buildings. The last time I tried VariCAD it was unusable due to bugs. And Blender isn't a CAD package.
Being a programmer, I want an OS that gets out of my way and lets me focus. I find the "busyness" in some OSes distracting and a drain on productivity.
Availability of tools and one's aesthetic preferences are also important; for me, once I switched to OSX there was no looking back.
That being said Windows or Linux may be better choices for people with other needs or preferences.
Linux warrants a distribution choice, I think. Not nearly every distro out there is worth my time. While under the hood, they are all pretty much identical, the way a given distribution sets itself apart is what makes or breaks the operating system. I've been an avid fan of Arch Linux for a few years now and I can't imagine wanting to switch to Debian or any other setup any time soon.
I used Windows for over 10 years and moved to a Mac in 2009. There is no way I am going back anytime soon. Amazingly, I only realized the problems with Windows after using the Mac for a while and then switching back to Windows for a day to fix an urgent issue. The usability issues lit up like lights on a Christmas tree.
Here is what happened. I tried to install a tool on Windows, which required an updated version of the .NET Framework. So instead of doing what I wanted to do, I started doing what that little app wanted me to do. When I tried installing the .NET Framework, it wanted an updated version of the MSI (Microsoft Installer). So now, instead of doing what the app wanted me to do, I started doing what the .NET installer wanted me to do. You get my drift?
In the two years I have been using my Mac, I have not once come across words like virus, anti-virus, or drivers, or seen any avoidable alert boxes pop up. I am no Apple fanboi. Really, OSX is my preferred OS simply because it does not get in the way and lets me do what I want to do.
Yes it's the same, no it's not easier in any way. I get all kinds of alert boxes and inane dialogs in OSX. I develop all day on a mac and it's constantly annoying, as much or more so than windows7.
I was running on linux for a couple of years. Then I got tired of my laptop battery being destroyed, the fan on my laptop being in overdrive at all times, and also having everything run slow even though I had paid for top hardware. So I switched back to Windows.
I WISH I could use linux as my primary, but I really can't justify all the flaws just for a bit more security.
For pure development (unless you're developing iOS or OS X apps), Linux is definitely the top choice. For office work (Microsoft Office, Visio, Outlook, Lync, etc.), Windows unfortunately still wins (OmniGraffle and Office for Mac are decent, but still no match). For everything else, OS X is definitely the best choice.
I am a Linux user and I feel more secure when using it.
Desktop: Linux Mint (was using Ubuntu until 10.04).
Laptop: Linux Mint and Windows 7 dual boot. The reason I have Windows is that the laptop battery lasts 4 hours on Windows but only 1.5 hours on Linux when not plugged in. Also, I use Windows when I need dual screens.
At the moment, I'm posting this on 64-bit Windows 7 (my main software development machine), with 64-bit Debian for Linux development and 32-bit SliTaz for Linux testing running in VMware Workstation at all times and actively used (along with an XP image that I'll spin up once every couple of days for testing).
My MacBook Pro is on the other "side" of my body (my desk is an L), so by simply re-swiveling my chair, I'll be on OS X where I can do more coding, check my email, etc.
I think the OS wars are over. Web apps address the needs of many people (I actually don't use too many), but by and large, all three platforms have seen enough usage and statistics and have been around long enough that there's (GOOD) software to do X (whatever X is) on any of the 3.
I answered OS X in the spirit of honesty. However, I would much rather use Fedora on a laptop at least equivalent to my MacBook Pro, if I could buy such a thing. Installing it myself works at home, but not at work.
I am familiar with System 76, but I personally prefer Fedora. Suggestions welcome.
For my needs OSX is the perfect mix of a user friendly / powerful desktop OS. It's just a very elegant operating system. I feel like just about anything I need to do is very close to the surface. Anytime I am forced to use Windows or Linux, even though I am quite familiar with both, it seems incredibly convoluted to me. Not really difficult just unnecessarily complex and chaotic. OSX knows how to get out of the way and let me work. Since Lion came out I've been using full-screen apps + multi-tasking gestures extensively. I can go days without seeing anything besides my applications. The OS just disappears and leaves me alone when I don't need it.
This is pretty hard to define for me. At work, I primarily use Windows 7. My away-from-work computing is pretty evenly split between my laptop (MBP running Lion) and my desktop (custom box running Ubuntu 11.10).
I'd say my time is pretty evenly split between all of these operating systems. I think I prefer Ubuntu overall, though the ease of use of OS X and Windows 7 is pretty tempting a lot of the time (try installing Spotify on Ubuntu 11.10 and getting it to play local files). Of those two I'd probably prefer OS X, since it's much better for development. For the most part, though, it really doesn't matter what operating system I'm using.
I run Fedora Linux on the laptop I use for all my day to day stuff. All of the servers I maintain are running some RH related distro; either Fedora, CentOS or - in the case of one really old server - Red Hat 9.
OSX is the only operating system that can run iTunes, Office, Photoshop/Fireworks, Ruby, Python, bash scripts, Apache, and MySQL without virtualization or any weird and unfun hacks (yeah, Cygwin, I'm looking at you!).
It's not Cygwin: "MinGW, being Minimalist, does not, and never will, attempt to provide a POSIX runtime environment for POSIX application deployment on MS-Windows. If you want POSIX application deployment on this platform, please consider Cygwin instead." There's no emulation layer at all.
Pretty much the entire games industry runs (for historical and practical reasons) on Windows. I used a Linux desktop at home for ages but eventually the rebooting and impedance mismatch got too strong.
Windows (7, mostly). I also run Ubuntu on servers and in a Virtualbox for development.
I've been meaning to move over to a MacBook for a long time now, since I love the iPhone, love the look-and-feel of the MacBook Air, and because so many programmers are moving to them.
The problem is, I've been so reliant on Windows for so long, and have so many modifications and workflows that I set up, that I honestly think it would take me months of work to get to the same comfort level in OSX, assuming that I even could get all the features working there. Is it really worth it?
I don't know anything about how heavily you've modified Windows, but I just switched to OS X from Win7 a month ago. This is after only ever using Windows. I was surprised by how easily I made the switch. I know Lion gets mixed reviews from long time Mac users, but I'm a huge fan.
Well, I have autohotkey modifications which give me a vim-like keyboard everywhere, add all sorts of shortcuts to switch to my most-used windows (browser, editor, etc.). Plus I have programs for everything I need to do, etc.
Honestly, the two biggest problems I have with switching are my AutoHotKey customizations, and Total Commander, which afaik doesn't exist for OSX.
I use OSX at work with a Virtual Machine running Windows 7 for corporate tools (such as mail, instant messaging, etc.). I also use that OS on my laptop, mostly because I bought a macbook pro a couple years ago and still want to get the best of its hardware (battery life).
At home, I use Windows 7, simply because I don't want to reboot to switch back and forth between operating systems. I exclusively play StarCraft 2, and if it were possible to play it under Linux with excellent performance, I would switch and never go back to Windows.
MacOS on the primary notebook and desktop, a little x86 box on Mint Linux with LXDE for a lightweight and complete Linux, Windows 7 on Parallels, on those occasions when I feel life's too smooth.
Now I have multiple machines running Linux, and I use Linux at work too. I am an embedded developer and use Linux in most of my projects. Using a Linux machine makes life a lot easier.
At home: OSX / Arch in a VM (the awesome window manager is very simple and nice). I want to go Arch all the time, but Photoshop / Illustrator hold me back.
I clicked Linux despite using Windows more of the time, because I only use Windows for non-essential stuff (gaming mostly, and also studying chess, since good software only exists on that platform, sadly...). For research/programming/reverse engineering/security/important work I use Linux, especially Debian GNU/Linux or a derived distro (except Ubuntu), but I guess any open-source OS will do fine as well (FreeBSD and OpenSolaris did very well last time I tried them).
Depends. For servers, FreeBSD and OpenBSD hands down.
Desktops are two different systems--Windows and Ubuntu Linux. I need direct access to the hardware for development (CUDA) on both, so I can't VM. Windows is needed simply because Linux has always failed to provide me with the business apps I need to successfully work with the outside world. Not trying to start a flame war; it's just my experience.
Laptop is Windows, with Linux as a VirtualBox VM. Simply because I want to GSD.
At work:
1. 64-bit Windows 7 displaying a 32-bit Linux VM on one of the two screens
2. 32-bit Linux running native on an HP TouchSmart (for doing embedded Linux cross-development)
At home:
1. 64-bit Windows 7 running usually at least 2 or 3 FreeBSD and Linux VMs
2. Macbook for reading mail and developing while watching TV. Some programming done natively, some done SSHed into a FreeBSD VM. Also sometimes runs 32-bit Windows 7 in a VM.
Windows at work, Linux at home. I voted Windows but if I weren't working on .NET I wouldn't be running Windows at all. I wouldn't have said that a month ago, I really like what's been done with Ubuntu 12.04. Windows is basically just a Visual Studio host for me. Once I figure out how to get paid to work with node.js Linux will be my primary OS and Windows will just be a virtual machine running VS. Unless I start working on iPhone apps.
it doesn't matter. i use a web browser and a terminal. as long as i have those, i'm good.
i use mac os on my laptop, because mac laptops are the nicest. i use ubuntu on my desktop because it's the cheapest option. i honestly don't notice a difference switching between the two. all the linux kerfuffle about window managers and whether unity is better than gnome2, it doesn't matter. you can switch between windows. you can get work done.
Use OSX, Debian Linux and Windows 7. OSX for work, debian for my testing server/some programming and windows 7 for everyday use but that is slowly changing as I use my mb pro more and more.
In the past I have had several Windows laptops, and none of them hold a candle to the MB Pro. Size, weight, hardware, and battery life are all far superior to anything I have ever experienced on a Windows-based laptop. The price, in my eyes, is justified.
Probably should have allowed an option for *BSD. I suppose "other Unix variant" is sort of approximately the same thing, but the former would be more clear.
I've got a macbook air and a thinkpad. I use the air much of the time but quite often I just need to get my FreeBSD fix on in spectrwm. (Yes, you can run spectrwm on OS X but it's a pita and XQuartz makes it much more difficult). Most of what I do on the thinkpad is C hacking.
I tend to approach operating systems with a BSD mindset: use what works for you, and if you have a complaint - put up some code.
Using linux since Caldera Desktop Preview 2. Things really picked up with RedHat 5.1 and 5.2. I currently am running the latest Ubuntu on my dev box and my laptop. My job is mostly C++ development of finance models, statistical analysis, and creating documents in LaTeX. I couldn't think of a better fit than Linux for this.
I was a MacOS user from 1992 (version 6.8!), and started using Windows along with it in 1995. In 2001 I switched to Windows completely, and in 2009 I started using desktop Linux (Ubuntu) in parallel with it, switching completely to it as my only desktop OS in December 2011. Wondering what will be next... :)
Long time Windows user and developer, switched to OSX a few years back, now can't imagine working without it. Love the terminal access to Unixy stuff. Running Windows in Parallels for the remaining Windows only software. As I tell people, my MacBook Pro is the best Windows laptop I've ever owned.
I've also been a Windows user for more than 10 years but just recently switched to OSX (May 2011) because I was migrating from Flash Development to HTML5/JavaScript. It seems that most of the tools for web development in general are more conveniently available in OSX.
I can't hope to answer this question. My home desktop is a Windows 7 box, my laptop is a Macbook Air running Lion, I own both a Xoom and an iPad and use both regularly and my work machine is a Linux box...there needs to be another option on this Poll...YES!
Linux is my OS of choice, but I have nothing against Windows 7. Unlike XP and earlier it can normally be used from a non-root account, and a nice POSIX-like programming environment can be set up on it. Maybe one day, but first I need to get more RAM for my machine at home. xD
OS X for regular day-to-day, Windows for CAD and structures. I used to use gentoo occasionally, so a year ago I started browsing other systems and found Arch Linux. As soon as I get some extra hardware for a desktop workstation I'd love to give it a try.
I don't have the time to dork around in Linux as I once did (never was that great at it) and while I'd like to take OSX for a spin I'm a penny pincher so I stick with Windows at home. I'm happy with it.
I'd love to have a Macbook Pro for iOS development, but I'm sitting pretty with the Arch Linux install on my current laptop. Nothing matches the customization of Linux in my opinion and I could never leave XMonad.
I spend blocks of hours every day in front of a dev machine, but aggregating all the hours I spend in private or in between work, including weekends, all using my phone or the iPad, iOS is my actual primary OS.
My OS of choice is OSX, not because I find it the best OS ever, much more because I don't want to use Windows! I used to have a laptop with ubuntu on it but in the more recent versions the UI just isn't it.
I use OSX, Windows, Linux and Android daily. Some more than others depending on where I am and what I'm doing, but I'm at work using Ubuntu for 40 hours a week, so I should probably vote that.
I use Win7 at work (servers are FreeBSD). I also had a personal WinXP laptop until 4 months ago when I switched to air and I'm now using OSX at home for fun and sometimes also for work.
Not sure what to put down, since my general server runs Ubuntu, my games machine runs Windows 7, and my work machine runs Lion (for iOS dev). I use all three almost daily.
Understandability by common users is a big thing and Linux lacks it. A common person who doesn't know anything about computers would be much more comfortable with Windows or Mac than with Linux. Linux is a programmer's operating system. You learn a lot more using Linux than Windows. Programmers have different opinions, and so do different distros (that is my view!). I personally use Ubuntu for my programming and Windows for gaming/Netflix/other entertainment.
There are other breeds of nerds here. Me for one. I do mechanical nerd-stuff and I need Windows for CAD. But I guess there are not enough of us here to explain that number.
It's probably correct, actually. In the past there have been a few links from HN to my own site, and I was surprised to see that a LOT of people were using OSX.
At work I use Windows / linux, but about a week ago I switched from using a MBP as my primary computer at home to a Transformer Prime.
Impressions so far: it's awesome. Consuming is much better, which is most of what I do at home. Most producing I've done lately has been of the pure text nature (emails / blogs etc). The onscreen keyboard is mostly fine for this, and if I need to bang a lot of text out I can attach the keyboard.
I'm a bit overwhelmed with work work atm, but when that calms down I'm keen to try AIDE[0], as well as look for a more lightweight editor for python / html hackery that can connect via ftp/ssh or similar, and play with developing web apps that way.
I'm surprised they're on HN but I have a (geeky) friend who does most of his work on an Android tablet so that his "computer" is always on and ready. That makes a difference when you work in more than one place and you spend a lot of time with your customers too.
Depends on the type of dev, but anything short of iOS is probably easier on Ubuntu. And, just an FYI, I know people that do iOS on Ubuntu as well (virtualized OSX instance).
That's the major stumbling block for me; slowly I'm moving towards Linux but I still need photoshop. I look forward to the day when design teams produce their work in something other than that monolithic software. Don't get me wrong, it's an amazing tool, but there's no need for anything that complex in web design.
I used Linux for 6 years, but switched to OS X 2 years ago. It's much more comfortable, I have access to the command line tools, and the UX is much better. I just get things done now and don't have to care why pulseaudio doesn't work anymore after an update etc. I'm really satisfied with my MacBook and OS X Lion :)
I was forced to choose "other" since my operating system (GNU) was not listed.
What you're referring to as Linux, is in fact, GNU/Linux, or as I've recently taken to calling it, GNU plus Linux. Linux is not an operating system unto itself, but rather another free component of a fully functioning GNU system made useful by the GNU corelibs, shell utilities and vital system components comprising a full OS as defined by POSIX. Many computer users run a modified version of the GNU system every day, without realizing it.
Through a peculiar turn of events, the version of GNU which is widely used today is often called "Linux", and many of its users are not aware that it is basically the GNU system, developed by the GNU Project. There really is a Linux, and these people are using it, but it is just a part of the system they use. Linux is the kernel: the program in the system that allocates the machine's resources to the other programs that you run.
The kernel is an essential part of an operating system, but useless by itself; it can only function in the context of a complete operating system. Linux is normally used in combination with the GNU operating system: the whole system is basically GNU with Linux added, or GNU/Linux. All the so-called "Linux" distributions are really distributions of GNU/Linux.
My car engine is a Volkswagen and the entire vehicle consists of parts made by various other companies, but do you see anyone going around saying "I have a Brembo/Volkswagen" just because Volkswagen used some Brembo parts to build the car?
Look, GNU is super important and they have done amazing things for Free software. I heavily support them and the work they have done. However, I don't force people that use my Open Source software in their products to put the name of my organization at the front of their product name or label just because they use it to develop their product and neither do most other manufacturers of other products in the world.
Using Linux as a generic label to represent all the Linux-based distros out there is no less reasonable than using "Volkswagen". There is no need to cram GNU down people's throats and force them to write and/or say "GNU/Linux" anytime they refer generically to Linux. If you feel that way, at least be consistent with everything you own and start referring to it by both the components it's built from as well as the product itself.
This argument is, and has always been, ridiculous. The GNU user land is no more an operating system than the Linux kernel or the K desktop environment. It's just one component of a larger software system which most refer to by the convenient moniker of "Linux".
It is disappointing to see "GNU/Linux" comments consistently harassed. Consider the history.
The GNU project was conceived in 1983 to create a full free (as in freedom) operating system. This operating system combined many different projects (for example, X), in addition to creating its own components. Before Linux, it had one troubled component - the GNU Hurd, which was its kernel. Torvalds developed Linux (the kernel) in 1990 and released it under the GNU GPLv2 in 1991, which completed the GNU operating system by providing a working, stable kernel. The entire operating system was, and still is (except when GNU is not present, such as Android), GNU. The term GNU/Linux was used to give credit to both Torvalds and the GNU project. The term just as easily could have been "GNU/Linux/X" (and in fact that was used by the Yggdrasil distribution).
Linux, as it was conceived, is a kernel. GNU, as it was conceived, is an operating system. There is an important distinction there.
Fast forward a couple decades. Many people, in an effort to shorten the name "GNU/Linux", simply dropped the GNU portion and referred to the entire Operating System as "Linux". Also combine this with projects such as Android, which use Linux without GNU, and therefore are not GNU/Linux. Everyone has come to know any operating system that uses the Linux kernel simply by the name of "Linux".
Is that correct? Well, that depends on how the term is used. "Linux distribution" is certainly correct --- if your operating system uses the Linux kernel, then yes, it distributes Linux. But to consider Linux an operating system is incorrect from both a project and a historical standpoint, because it is not one --- it is the kernel of a Linux-based operating system. To call an entire operating system "Linux" is to say that the entire system you are using is part of the Linux project. On the other hand, calling your operating system "GNU", as long as it uses GNU, is correct, because GNU was always developed to be an operating system - a collection of components.
So given the history, why is the term "GNU/Linux" ridiculous? Why is the term "Linux" to refer to an entire operating system not ridiculous? Because that is the most popular term to refer to a Linux-based operating system? And given your statement
> The GNU user land is no more an operating system than the Linux kernel or the K desktop environment.
it would seem GNU is just as fitting to be used as Linux. So again, why is such a notion ridiculous?
Mike, what type of computer do you own? Does it use Intel hardware? Maybe it was created by Dell or Lenovo or somebody else? When someone asks you what type of computer you own do you say "It's an Intel/Dell" or "It's an Intel/Lenovo"? Or would you just say "It's a Dell"?
The term "computer" encompasses a wide variety of components. Had I purchased a PC from some specific company, such as Dell, stating that I have a "Dell computer" would indeed be accurate (stating "I have a Dell" is not technically correct; Dell is a company, not a computer). In a similar sense, saying I use "Gentoo", "Trisquel", "Ubuntu", "Arch", etc is accurate.
I built my PC using hardware I purchased separately. I do see your argument - if someone asks me "what type of computer do you own", I would not list each hardware component individually. The problem is - that question is terribly vague. What type of computer do I own? Well, it's classified as a PC. Generally, when someone asks that question, they are looking for a specific company name. When I respond that I built my own, that answers their question.
If someone asks "what type of processor", I would then respond "I use an AMD-based system". In that sense, if someone asked what kernel I use, I would respond "I use a Linux-based system" --- they were specific enough to inquire about a specific component, so I would respond in such a way that answers their question.
"What operating system do you use?" Technically, I use GNU, and to those who understand what GNU is, I respond just like that --- "GNU". For those who may be unfamiliar with GNU, I will state "GNU with Linux" --- the GNU operating system with the Linux kernel. If I used GNU with another kernel, it is still GNU. Linux is one component of my operating system.
That said my display server, window manager, text editor, etc are also all useful components of my operating system. I would not say I use "GNU/X/Xmonad/Vim", simply because that is not a distinction commonly requested. Perhaps one day, if Wayland becomes popular, "GNU/Linux/X" would be useful/necessary.
It is what it is, history. Calling it GNU/Linux might have been appropriate when GNU was such a big part of the system. It isn’t any more. Linux, on the other hand, is the biggest open source project in existence. Saying those two are equal is ridiculous.
The argument is not about equality; Linux is by far much larger in nearly every regard than GNU is. The argument is toward correctness. No matter how small GNU is, "GNU" describes an entire operating system, which is comprised of many components, some maintained by GNU/FSF, others not.
Let's say that I released an operating system called "Mike OS", which used Linux, portions of GNU, etc. My only contribution to the operating system, aside from packaging, was a simple script to handle package management/configuration. Well, it's still "Mike OS".
I think much of the confusion comes from people thinking of GNU in terms of projects that the FSF personally maintains. As stated by http://www.gnu.org/: "GNU is a Unix-like operating system that is free software—it respects your freedom. You can install Linux-based versions of GNU which are entirely free software."
No. GNU + Linux does not make an operating system. That’s a lie. In any modern distribution GNU packages are just optional components. They fit in a larger framework that makes an operating system, just like everything else. Stallman’s threshold for GNU/* is apparently just linking against glibc.
When GNU makes an actual distro, they can call it GNU/Linux.
gcc would be the principal counterargument. At a more fundamental level, the GNU project provided the philosophical foundations for the Linux kernel to develop. GNU is among the principal reasons (the AT&T lawsuit being another) that we're living in a Linux rather than BSD centric world. I suspect that the free software model of the GPL also mattered -- BSD/MIT licensing have their place, but they're not a match for the OS/kernel as a whole, at least not at this stage of the game (in an earlier period they did help establish UNIX as an industry standard, and spreading standards and reference implementations is a key element of these licenses, hence: X11, Apache, BIND).
As I find the GNU userland superior to other tools, I also find that it's worth consideration.
And in all cases, I find it's sufficient to acknowledge the FSF's contributions to the environment I use and prefer. It's GNU/Linux.
I took a leap of faith from Windows to OS X as my main desktop OS back in late 2004, before it was "trendy" to use products by Apple - and long before users of Apple products began being referred to as "Applefags". I haven't looked back once. Besides OS X and Windows, I've been using the *BSDs (mainly OpenBSD) since around 1999.
I assert that you can't be a programmer (read: anything beyond official MS tooling) with Windows. If you try, that just tells me you don't care about your tools. Which tells me you need to level up.
I don't use crappy unportable tools, which means everything I care about also runs fine on Windows. I'm not writing Win32 apps right now, but when I was I certainly had no good reason to add work just to avoid Windows on my dev box. Win32 isn't a very tasteful API, but it at least has the virtue of being intended for what people want to do with it, unlike the World-Wide Web (which is being destroyed so that we can have an astonishingly ham-fisted and inefficient application platform instead).
Saying "I use GNU/Linux": OK. Saying "I don't use Linux and am going to distort your poll results because you didn't use the terminology I prefer": not OK.
And, honestly, if getting a few downvotes for some random thing is "setting a new low bar" then something's wrong with your calibration.
> "I don't use Linux and am going to distort your poll results because you didn't use the terminology I prefer"
Well, first of all, I didn't "distort your poll" because I didn't actually choose other. In addition to hacker news users hating the FSF, we apparently are also immune to sarcasm and humor.
Second, I'm sorry, I didn't know that polls on Hacker News were so serious and scientific. Do you police rules like "pick only one" with words like, "Come on guys! I'm trying to run a scientific poll here and you're ruining it!"?
"Since when did it become not OK to side with the FSF?" I'm sorry, but when did it become obligatory to agree with your world view? You were down-voted presumably because your comment was fatuous. Being a 'hacker' is not synonymous with exclusively using F/LOSS. Ideological fallacies are just the same as all the other fallacies...