I get so frustrated reading blog posts by user interface designers. They almost invariably consist of reasoning I disagree with and conclusions I disagree with, in support of user interface decisions that make my experience as a user worse. Ubuntu Unity is pretty much an anthology of user interface decisions that are exactly wrong by my perception, and "let's get rid of quit" is just another one. Of all the actions in any menu ever, "quit" is one of the few that I thought confused no one. Then I read something like: "in Ubuntu, we have some elements waiting to help out: the messaging menu, the me menu, and the sound menu" and discover the concept is stuffing functionality into a maze of predetermined slots based on a set of usage assumptions. I don't think any of this is responding to the needs of users; I think it's the equivalent of rearranging the furniture in the middle of the night so people trip over the couch when they are walking to the bathroom.
I don't know that I'd necessarily attribute it to the Canonical team, but the best UI designers make decisions that don't sound right when said out loud, but actually work quite well in the product.
That's often the very point of UI design: studying behaviors and responding to them in ways that aren't obvious, and that often sound wrong on their face. Apple, in particular, is considered a paragon of UI excellence, yet many of the decisions they make SOUND downright terrible.
Of course, using their products is generally easy and convenient.
For the record, I am not a UI/UX guru, so all I can offer are anecdotes on this, but I think that the important takeaway is that good UI/UX should not be obviously right. If it were, everybody would have already been doing it, and I think that we can all agree that most UI of yore is downright bad.
Recently, meaning in OS X, I've felt like the Apple UI has degenerated through the use of "easy-looking" UI elements that remove every "confusing" element, to the point of preventing the user from doing anything.
Not only can I not find files when Finder doesn't use folders, but my parents and my friends who actually have Macs can't find files either.
It seems to me the Mac doesn't win by real usability but by looking beautiful at first glance, having a reputation for usability, and having a strong following of folks who will teach people who otherwise struggle with the interface.
I agree, generally. I find the Mac the most irritating platform to use -- the shortcuts are all different because Apple has a silly insistence on using Super where they should use Ctrl, and common shortcuts are supplanted by the OS, leaving apps to have to find new ones. The Dock is a slow method of switching, and the Sup+~ shortcut to switch to a window is more annoying than a straight Alt+Tab. Sup+H and your window goes away; this has confused a lot of people I know, and it annoys me a lot when I try to look at history.
I really see no serious UI benefit in Apple's products. It seems roughly equivalent to me, and it uses the same paradigms as other conventional systems, it just does it in a shinier way, and it seems cool just because Apple does it, just like everything MTV does seems cool to a teenager.
Huh. I find that shortcut consistency is actually a strength of the platform. Are you talking about inconsistencies between OS X's choice and the other OSes you use? Because otherwise, I don't see it. Only on the Mac do I know that I can use the same key combo (apple-,) to get to the preferences of an application, regardless of where it came from. And I'm not sure how apple-` is "more annoying" than alt-tab; apple-tab still exists for switching between applications. apple-` just switches between the windows of an application. I actually like that ability, and again, it's consistent across applications.
OS X may not be your cup of tea, and that's just fine. You just happened to attack what I perceive as one of its strengths, so I'm surprised by your choice of targets.
I agree with this. The fact that you can cycle through both applications and a specific application's individual windows (in a consistent way) is definitely a perk of the way that OS X handles windows.
I do mean that it's inconsistent between OS X and other OSes. I mean that Apple+Tab is more annoying, and I would prefer it all to just switch windows instead of applications. Most apps on other platforms have a consistent shortcut to the preferences and/or menu bar (Alt).
Actually, Apple's use of the Command key for GUI shortcuts both precedes the use of Ctrl for the same task and pioneered many of the conventions, such as Ctrl/Cmd x, v, c & s, that are universal today.
Although I was a bit skeptical at first, once I got used to the concept it made a lot of sense. On a Mac, Ctrl+key combos are for contextual menus in the GUI and navigation/commands in terminals and text areas. For instance, you can navigate to the beginning and end of nearly any text element in OS X by using Ctrl-a and Ctrl-e, the standard Emacs shortcuts. And the Cmd+key combos are for GUI and application-level shortcuts, such as copy, paste, print, close, etc. There are two different contexts being handled, so it makes sense to use a different key. The combo to copy a line from a terminal window should not interfere with the combo to cancel the running process within that window, as an example, and on OS X it doesn't.
Most importantly, those shortcuts are implemented consistently across applications. No matter what app I'm working in, Cmd+, will always bring up the Preferences window, Cmd+w will close the window, Cmd+q will quit the application, Cmd+h will hide it, Cmd+p will invoke the print dialog, Cmd+s will save, Cmd+Shift+s will save as, Cmd+z will undo, etc. I never have to stop and think about it, or get jarred out of my train of thought by an application that uses some non-standard shortcut to save, or quit, or similar. I use Ubuntu at the office, and while many apps are making good progress on presenting a consistent interface across the OS, I still get frustrated regularly by apps that do things differently.
Finally, Cmd+Tab provides the same functionality as Alt+Tab on Windows and Linux. Cmd+` lets you switch between the windows of a particular application, which I've found to be quite useful on many occasions.
Point being, don't knock it till you understand why others like it. While I don't agree with all of Apple's UI decisions, there are plenty of excellent usability features in OS X and I now prefer it to both Windows and Linux because those features line up nicely with the way I work.
Aside from a few crappy lowest-common-denominator Java desktop apps that haven't been optimised for OS X, but thankfully those are disappearing fast.
My apologies for the formatting being messed up, I forgot that the * character invoked italic formatting. That last sentence was supposed to be a footnote.
I really don't understand how the current Finder works, having used it a dozen times. Whatever it does, it doesn't seem to involve separate folders. That seems prima facie evidence of a lack of usability, since I have been able to reasonably easily understand Windows, Linux, and the older Macintosh finder-thing.
And yes, iPhoto seems to be a problem. I watched a friend spend three hours trying to find some photos they'd previously loaded in iPhoto.
I'm really confused about what you mean. OSX's Finder hasn't changed much in years. This is actually a big complaint of the OSX user community. They want it to change more. Finder in 10.6, even 10.7, is really not much different than Finder from OSX 10.1 though quite a lot different than Finder from Classic.
A right click on a "file" in most programs will reveal an option along the lines of reveal "filename" in Finder. iPhoto does this.
I don't know if I understand what Finder is doing differently than, say, Windows Explorer or Konqueror in Linux. When you click on Finder it opens a new file browser that has the typical list and icon views, as well as some quicklook view that I don't like much and a nested list view where each column is a different folder. It seems pretty similar to Windows in my opinion. (Not that it's all peaches; I would really like to have a file cut-and-paste option instead of only copy, but I digress.)
edit: Looking at your comment higher up, I guess you are talking about the i-suite of programs that aggregate all your files and handle them internally? Even that is a folder on your hard drive (~/usrname/Pictures/iPhoto Library is still a folder; just right click on it and select Show Package Contents and you can get to all your files, or just open iPhoto, right click on a picture you want to locate, and select Reveal in Finder). And it's less a function of Finder and more a function of iPhoto, iTunes, etc.
- In the case of iPhoto, I remember the program automatically copying the files from a camera to some folder deep within the hard drive and then giving not a clue later as to which folder that was and certainly not allowing any exploration between folders...
- Don't you find "a right click" with a one-button mouse a bit tricky? I did try it but somehow it wasn't happening... maybe that's where things went South.
- Plus, being a geek, I can always do what I need to do. I'm not really looking for solutions here. My main argument is that despite a reputation for usability, OS X seems to me to lack practical usability and to instead mostly coast on good looks and reputation.
> I remember the program automatically copying the files from a camera to some folder deep within the hard drive and then giving not a clue later as to which folder that was and certainly not allowing any exploration between folders...
Well the fundamental idea behind iPhoto is: why would you care where it put your files at all?
* If you want to see or edit your pictures, you can do so via iPhoto (or iPhoto-compatible image manipulation)
* If you want to mail or export pictures, iPhoto has "Share" and "Export" options
* If you want to open one of your pictures in a third-party software, OSX's standard file picker has a "media" section which gives you a special iPhoto file picker (which works extremely well, it can even search through all your tags and faces)
All the iLife apps (iMovie and GarageBand, mostly) work this way. So does iTunes.
> Don't you find "right click" with one-button mouse a bit tricky?
Control-click. And every Mac that has shipped in the last 3 or 4 years can right click.
> Well the fundamental idea behind iPhoto is: why would you care where it put your files at all?
As you say, it's not just iPhoto - it's a big part of the iVillage. Certainly some people - not just geeks - like this approach, but I find it patronizing ("Don't worry your pretty little head about where we store the files.") and authoritarian ("You may not delete the Pictures folder from your home folder or give it an alternative name.") all at the same time.
Long story short, I shouldn't have to open an application to grab one file. I should be able to use the file explorer. Apple seems to go out of its way to make that harder for me.
> Certainly some people - not just geeks - like this approach
I would argue that non-geeks are, in fact, far more likely to enjoy this approach than geeks.
> I find it patronizing ("Don't worry your pretty little head about where we store the files.")
Well except it's easy to know where they're stored.
> and authoritarian ("You may not delete the Pictures folder from your home folder or give it an alternative name.") all at the same time.
And this one is utterly bonkers: it's an open not-secret-at-all that iTunes and iPhoto let you move, create, or switch libraries by pressing Option as you start them.
> Long story short, I shouldn't have to open an application to grab one file. I should be able to use the file explorer.
You are. It's not like the file is stored in a secret binary database.
> Apple seems to go out of its way to make that harder for me.
> Well except it's easy to know where they're stored.
Sure, I can control-click the folder, select "Show Package Contents" (?!? an obvious way to say "open this folder"), open the directory named "Originals" and then drill down into the year-stamped folders until I find my albums. But clearly Apple doesn't want me doing that. Otherwise, the photo library would work like all the normal folders. I could simply double click to open it and view its contents.
This is all I meant by "Apple goes out of its way to make this harder."
Why are you using iPhoto again? It’s not for you. I see no reason whatsoever why I would ever want to touch my photos in the filesystem (and there is nothing wrong with that) but you seem to need that functionality (and there is nothing wrong with that).
Apple has always provided an alternative way of importing photos in Mac OS X (it is, in fact, the way of importing photos that predates iPhoto). The application is called “Image Capture”, it's in your Applications folder, and you can make it your default for whenever you connect a camera. It puts photos in folders.
So, "drag and drop" out of iPhoto isn't what you want, and you don't want to just use the "open file" dialog from another OSX app (since that lets you access all parts of the iPhoto library), and then you don't like how you have to click down a few folders once you do use "show package contents"?
The only person going out of their way to make this difficult is you.
FFS, Apple is far from perfect, but the things you're complaining about have pretty simple answers.
Let's imagine I use a Mac. I open Gmail and I want to send my father a cute picture of his niece. There's no direct way to do this without leaving the application I'm in (Safari) and going somewhere else to get at the picture.
By contrast, if this were a Pages document, I could simply click on "Attach a file" and then browse to the folder where the document lives. In the case of iPhoto and iTunes, this feature - file browsing - is simply not as straightforward. I don't think that's up for debate. You are free to suggest ways that I can get at the photo (through iPhoto, through another OSX app), but I can't browse to the item the way that I can browse to other files on my filesystem.
> The only person going out of their way to make this difficult is you.
I really don't see how. I want something perfectly normal: I want "Attach a file" to work. In many cases, with many Apple programs, it does. But in a number of other cases (iPhoto, iTunes), it does not. This is not my fault. I am not doing anything special or nerdy or geeky here.
Ah, but you can access anything in iPhoto in the "open file" dialog that opens up when you go to attach a file in gmail. It's in the left sidebar under "media", from there you can get to anything in iPhoto organized in the same way they are in iPhoto (here is a screenshot: http://civicit.com/~tvon/files/osx-open-file.png).
Obviously though, you didn't find this, and I can see how that could be a bit obtuse if you expect to browse "Pictures" and find a bunch of image files.
> Let's imagine I use a Mac. I open Gmail and I want to send my father a cute picture of his niece. There's no direct way to do this without leaving the application I'm in (Safari) and going somewhere else to get at the picture.
I hope you are not serious, because it's not only trivial, I explained how it works in the comment you first replied to:
> * If you want to open one of your pictures in a third-party software, OSX's standard file picker has a "media" section which gives you a special iPhoto file picker (which works extremely well, it can even search through all your tags and faces)
OSX's standard image picker has a direct access to iPhoto libraries, and gives you direct access to iPhoto's search engine as well. Likewise for iTunes.
> I really don't see how. I want something perfectly normal: I want "Attach a file" to work. In many cases, with many Apple programs, it does. But in a number of other cases (iPhoto, iTunes), it does not. This is not my fault. I am not doing anything special or nerdy or geeky here.
Out of the 6 sentences in this comment, only 2 are correct. And one of them only barely.
I was serious, and I was wrong. Thanks and thanks to tvon for explaining how this works. I didn't read your initial comment that far down since I thought I knew how it worked. I was busy being annoyed at Apple and in a rush to write my response.
"Don't worry your pretty little head about where we store the files."
It's just a different type/style of abstraction that may not be suited to you!
All file systems essentially hide the nasty details of where and how files are stored (which inode does it start at, is it fragmented across multiple sectors, which physical device?).
From a UX point of view there is a lot to be said for not treating every single file in exactly the same way.
I'm sure it's an approach that will get more and more prevalent in the future (context-relevant functionality for the types of files you're currently interacting with/managing).
Needing two hands to get a context menu is the kind of thing that makes me question the 'minimalism' of OS X. Same as needing two keys (/hands) to do 'delete to the right'.
So... when I help someone out and need to right click, it's not an annoyance to either alter their settings or have to use two hands? The 'single button interface' demand is silly - all it does is force people to use modifier keys to get the functionality they need. How this is different to just enabling right-click I'll never know.
I can't verify this at the moment, but I'm pretty sure that you can assign the right and middle click actions on a three button mouse to do Cmd- and ctrl- clicks, so this is a nonissue for those of us with a standard mouse.
I'm thinking more about the MacBook Pros that I see lots of people with. Another pet peeve with Apple laptops is that the physical click is only at the bottom of the trackpad - why not the top? I have large hands and it's really not anywhere near the resting place for my fingers, especially if I'm typing.
Are you sure iPhoto still has a 'reveal in finder' option? I'm not seeing it... Though I don't know why you'd need it with iPhoto, since you can just DnD photos out to the desktop (or wherever) if you want to work on them.
> Whatever it does, doesn't seem to involve separate folders.
Sure does, it overlays a standard Unix filesystem with some OSX-specific features (hides "system" folders such as /bin or /tmp, displays "bundles" as files).
Why "modern, up-to-date design decisions" tend to actually make computers lousier despite their being conceived of by people whose very job it is to make things better?
And especially, why do they tend to make open source GUIs especially shitty?
Well...
First, what's good and desirable about these "modern, up-to-date design decisions" UIs is that they are organized around tightly integrated, sort-of-intuitive metaphors that, in theory and even in UI laboratories, let the average person accomplish more things, and accomplish those things more quickly. The Office ribbon is a fine example. In order to let a person accomplish more while confusing them less (giving fewer choices), it must anticipate what the user does...
Anticipation... stop there, there's the first problem - anticipating the user's intentions. It's not just that the anticipation can be wrong from the get-go (though that happens too). It's that anticipation, in its present incarnations, is an approach that scales very badly. As soon as you want to do something hard, the system will stop anticipating you, and so the hard thing is twice as hard.
And problem two of tightly integrated metaphors is that, since they inherently have to abstract away how a computer's software and hardware actually work, any task that's hard for the machine will suddenly have to be done in a fashion that takes the machine's limitations into account, and so violates the metaphors - if your app takes up 10GB, it needs a quit menu item, damn - but giving it that menu now puts it in conflict with the "UI guidelines".
And finally, the reason tightly integrated metaphors are especially bad in open source is that they require more consistency so the user can count on them (even across otherwise separate applications). And you'll have a hard time getting that with a horde of volunteers - i.e., why KDE 4 was terrible.
It's funny, because from my perspective as a casual office user, I actually thought the menus made things a little easier to use, even if I needed to "re-learn" where things were.
My wife, however, much more of a "power" Office user, has completely sworn off of them. This is a person who knew pretty much where every menu item in Word and Excel was by memory. Her experience has been destroyed, so much so that I needed to re-install 2003 on all the machines that she would use.
This is the engineering equivalent of offering discounts to new users while ignoring loyal customers; it leads to churn, but not much progress for day-to-day users (i.e. the biggest users of the software). The applications that I like the most have "levels" of user interface; you can switch to a more complete, complex user interface once you get more used to the program.
It would seem that existing users' knowledge of menu/button layout was sacrificed for new users' organizational benefits.
From a "never having used Office before" point of view, it is more logical now. From my power user's point of view, the first 30 hours of usage had an additional 5-10 hours (wag) of "okay, where would they have moved THAT feature to?"
Getting rid of quit/close requires the user to learn entirely new metaphors every time they use a program in order to know how to get a program to stop taking up resources. In a system with effectively limitless resources this is fine, but on lower-specced systems, not so much.
Also "effectively limitless" means equivalent to the specs of the worst system the designers use, which will likely skew things. Where I'm sitting we share 384K down / 128K up between about 40 people. Updaters that refuse to stop downloading when you close them really bog down the network. Now, you could try and put in some custom menu to control that sort of thing, but that's going to be a mess of configuration and hard choices, and developers will have to do that for every single app.
Quit/Close are two well-defined commands that give the user a well defined way to say when they want something to go away vs. when they want something to go away and quit doing anything. Re-implementing "go away and quit working" on a case-by-case basis is going to end up with every app either having a nonstandard interface for doing so, or no way to do so at all (like most update managers.)
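To make that distinction concrete, here's a minimal Swing sketch (my own toy example, not anything from the article): "close" hides a window while the process keeps running, while "quit" stops everything.

    import javax.swing.*;

    public class CloseVsQuit {
        public static void main(String[] args) {
            SwingUtilities.invokeLater(() -> {
                JFrame frame = new JFrame("Document 1");
                // "Close": the window goes away, but the process keeps
                // running (and could keep downloading, playing music, etc.)
                frame.setDefaultCloseOperation(WindowConstants.HIDE_ON_CLOSE);

                JMenuBar bar = new JMenuBar();
                JMenu file = new JMenu("File");

                JMenuItem close = new JMenuItem("Close");
                close.addActionListener(e -> frame.setVisible(false));

                // "Quit": go away AND stop doing anything, full stop.
                JMenuItem quit = new JMenuItem("Quit");
                quit.addActionListener(e -> System.exit(0));

                file.add(close);
                file.add(quit);
                bar.add(file);
                frame.setJMenuBar(bar);
                frame.setSize(400, 300);
                frame.setVisible(true);
            });
        }
    }

Both commands cost the developer one line each when the toolkit exposes them; remove the second and every app has to invent its own way (or none) to say "stop working".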
Personally, I think "some apps don't quit properly" is a description of a problem with the apps, not with the "quit" command. Comparing to mobile is a red herring, because you can in fact quit background services on most mobile platforms, and it's a necessary thing to do. It's only programs that by design, always gracefully suspend, full stop, and resume later that make sense to not have a quit command.
As someone who has used Matlab, and someone who has used Eclipse, you can take my "get your fat ass out of RAM already" command when you pry it from my cold, dead fingers.
Actually, on my Android phone I'm constantly using a third party task killer to stop services and apps running in the background that I thought I 'quit' by backing out of them. I also use the quit option on apps that have it.
I don't entirely trust Android to manage battery life as efficiently as possible yet, especially when iPhone still does it so much better, and when every time I check the services running on the phone, there seem to be tons I had never even started in the first place.
I consider that at least partly a UI/UX issue.
(On a side note, that's my only complaint about Android so far, using 2.3, otherwise it's great.)
Your task killer might be hurting your battery life rather than helping it. In any case, in recent versions (2.1+ I think), Android is better at managing tasks than a third-party app, and overzealous background task killing doesn't help anything.
I heard this as well, so when I got my Android I didn't install one. But the battery life was horrible. Often, if I used the phone regularly during the day, the battery would be dead by 9-10pm. I installed a task killer, killed everything when it booted up, and killed again a couple of times a day. After that I suddenly had ~1.5 days' worth of battery available. I'm not sure which app was being such a hog (I suspect it was an app that was refusing to let the phone go into sleep mode), but the task killer was most definitely necessary.
You're killing a mosquito with a howitzer, and you may get better battery life by letting Android do what it's supposed to and finding the offending application.
Download "Spare Parts", go into its battery history section, and select "partial wake usage".
You'll find out what (if any) app is keeping your phone in partial wake lock status.
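For context, a "partial wake lock" is an app holding the CPU awake even while the screen is off. A rough sketch of what the offending app is doing internally (a hypothetical service of my own; the PowerManager calls are the real Android API):

    import android.app.Service;
    import android.content.Intent;
    import android.os.IBinder;
    import android.os.PowerManager;

    public class NavigationService extends Service {
        private PowerManager.WakeLock wakeLock;

        @Override
        public void onCreate() {
            super.onCreate();
            PowerManager pm = (PowerManager) getSystemService(POWER_SERVICE);
            // PARTIAL_WAKE_LOCK: the CPU keeps running even when the screen sleeps.
            wakeLock = pm.newWakeLock(PowerManager.PARTIAL_WAKE_LOCK, "Example");
            wakeLock.acquire();
        }

        @Override
        public void onDestroy() {
            // Forgetting this release() is the classic bug that shows up
            // as "partial wake usage" in the battery history.
            if (wakeLock != null && wakeLock.isHeld()) wakeLock.release();
            super.onDestroy();
        }

        @Override
        public IBinder onBind(Intent intent) {
            return null;
        }
    }

An app that acquires a lock like this and never releases it will drain the battery no matter how idle the phone looks, which is why finding the one offender beats blanket task killing.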
Thanks for that article, very informative, but not necessarily conclusive. There are also a ton of comments that resonate with my concerns, about how random apps I haven't touched seem to constantly start themselves and run in the background doing who knows what - using CPU (battery), bandwidth, whatever.
At least one comment mentioned a significant battery life improvement after installing Advanced Task Killer and turning on its autokill feature, which I've been using a few months. I think I'll experiment a little and turn it off, and see if there's any noticeable difference in battery life.
I agree with you. It was one of the hardest things for me to wrap my mind around. I'm constantly looking for the quit button on apps, and I always have the urge to use a task killer to kill off tasks. I have seen a visible degradation of the user experience when I leave tasks running in the background. I also suspect app developers are not managing tasks correctly, leading to inconsistent behavior.
... to its detriment, at times. I have a very good app on my iPhone that plots out bike paths around London for me. Sadly, when I have reached my destination and "close" the window, it continues to use the GPS and other battery-sucking things to the point where I have to manually go and kill it via the kill menu (which is all I ever use the fast-switch menu for).
I'm not saying this is Apple's fault, but that it's very hard to work out when someone has "finished" with an app that could continue to, say, provide information about what the user is doing to a third party, or sucking battery life, or making annoying noises, for example. And apps that do this after I'm "finished" with them annoy me to no end. I'd consider that "broken" :)
Maybe it's confirmation bias, but I had to learn the metaphor of "no quit option", and I welcome the quit button/option on apps that have them. I have seen many, many users, particularly new Android users, who wonder where the quit button is, and what happens when they just "leave" the current application.
So, no, it may not have "broken" anything, but it has caused significantly wide, if not deep, cognitive dissonance.
I'm certainly impressed that the Canonical team are thinking about UI issues in this much depth. It is admittedly FAR more thought than I've ever given to the 'Quit' command.
And while my initial thought is that there are much more important UI/UX issues to be solved in Ubuntu before this, I think that thinking about things like this is key to fixing the global UI altogether.
I'm not entirely certain of whether or not this is the best suggestion though. I LIKE to quit applications. I like knowing that they aren't running in the background, that they aren't consuming resources, that they aren't going to pop up a notification, that they aren't cluttering up my taskbar. I'm the same way with browser tabs -- I don't like having any more open than I need.
If I'm in the middle of something, and need to have a lot of tabs open, I'll generally open a new browser for my casual browsing that I can close when I'm finished. Conversely, I almost always keep Photoshop open, even if it's idle, because launching it takes more than a couple of seconds. I don't know if getting rid of quit is exactly what I'm looking for.
I feel the same way as you do - I like to feel my system is clean and uncluttered. However, I still agree with their thoughts.
We may like quitting applications, but realistically, memory/CPU management is something that can and should be handled by the OS. It's an overhead on the user's mind that can be dealt with perfectly well by clever automation. It should keep itself clean without needing our help.
A kid growing up in a world without the 'quit' button would not feel anything is missing; we've just been conditioned by the creaky operating systems of our own youth.
I don't necessarily disagree. Most of my complaints are symptoms of the existing broken paradigm. That my taskbar is cluttered up could easily be fixed with their suggestions, and a part of their remedy might even be to get rid of the taskbar altogether.
The proof is in the pudding, and with conceptual changes like this, I'm leery of judging before I see the actual results. For the most part, backgrounded applications don't take up resources unless they're supposed to (like music playing, or an uptime monitor, etc.), so I don't have any qualms about leaving that to the OS (unless they screw it up), but all the other visual aspects need to be dealt with at the same time for this to be effective.
I quit applications because open applications leave UI remnants lying around. If they can figure out a way to do the Right Thing with the UI, then it wouldn't bother me.
As an Ubuntu user, these articles scare the hell out of me. The quit or exit command is not confusing at all. And if you want to get rid of it, you'd better make sure that none of your programs have any memory leaks ever (good luck with that).
You know what is much more confusing than the quit command? Having to go into the CLI or the task manager and kill stuff manually because it is eating half of your memory and processor time doing nothing. Or having to restart your PC every day because it just gets slow after a while.
Here is the most important thing about GUI design - you should confirm people's expectations. People expect to be able to quit stuff.
It is extremely annoying that this starry-eyed experimentation is going on in the most popular Linux distro. They are basically risking the one foothold Linux has been able to make in the desktop world. If you want to experiment, you should start an experimental distro, not risk yours and Linux's one single solid success.
> It is extremely annoying that this starry-eyed experimentation is going on in the most popular Linux distro. They are basically risking the one foothold Linux has been able to make in the desktop world. If you want to experiment, you should start an experimental distro, not risk yours and Linux's one single solid success.
That is an excellent point, and one that worries me, too.
On the other hand, I'm glad that some major player in the Linux world has an interest in UI experimentation.
How to reconcile these two thoughts? Maybe what we need is another Ubuntu variant (like Kubuntu, Xubuntu, etc.) that is stuffed full of experimental ideas. And then the best ones get into the more mainline releases.
Less Linux-savvy Ubuntu users are less likely to try out a non-mainstream distro/variant. It's hard to find the "best ones" without testing these new features on these users.
Sometimes your users are your guinea pigs. I think Ubuntu will be fine as long as they remain responsive to user feedback and maintain "get me back to what I'm used to" options for experimental features.
Ubuntu/Canonical might have better luck releasing a rolling-release distro to test these changes. The daily build is fine and all, but it reflects the status of the trunk. Having a few branches that can go on wild adventures, killing off 'quit' buttons and rearranging the UI, might benefit them more than the cost of implementing it.
I'm specifically referencing the way Fedora tends to be the frontline for RedHat and CentOS updates, and isn't afraid to roll back when things go bad.
A document you 'close'; an application you 'exit' or 'quit'. The document-centric people have been trying to do away with applications since the beginning of GUIs.
In reality, the document metaphor works great for a few things that resemble documents. But documents are as much an artifact of the paper-handling world as processes and login sessions are of the electronic world. Are they really more fundamental than, say, a stapler? Is a stapler worthy of being a fundamental, pervasive metaphor for interaction?
Games? Videos? Terminal windows? Phone calls? Desktop sharing? Text and IM? Many of the things we do with computers have little to do with documents.
Efforts to implement the stapler model include "Cut and Paste" and Object Linking and Embedding (OLE). Microsoft thought you ought to be able to staple anything to anything, like a terminal in the middle of your word document or a clock on your spreadsheet, for no reason other than just because you can. It didn't work because app developers needed to do too much work to support it, so few programs other than Microsoft's own office software were built to link items together like that.
"A few behemoth applications, such as LibreOffice and Gimp, still keep “Quit” separate from “Close” for the original reason — to save you from having to wait for the application to relaunch after closing its only document. But that is fixable, and all other applications have become fast enough that they don’t need it any more. After all, they’re running on hardware that is hundreds of times faster than it was in 1984."
"all other applications have become fast enough that they don’t need it any more". I'm glad this person has decided what I should consider 'fast enough'. So kind of him.
As much as I didn't really like the Mac 'quit' model (as distinct from closing the last document window), most apps still take a long time - meaning I can notice it - when opening from a cold start. Perhaps when everyone has SSDs this will be less noticeable, but it's still noticeable/measurable to me.
I remember working with a guy in 1997 who was fawning over how fast the next Windows was going to be. "It'll boot up in, like, 4 seconds!". Right... however fast our hardware becomes, our apps fill up the hardware with more 'stuff'.
While CPUs have gotten much faster (60%/year), memory hasn't kept up (barely 10%/year). Most programs spend much of their time moving data around in memory, because common programming styles lead to data that is extremely fragmented (poor locality) - memory is the bottleneck, not processor speeds.
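A toy illustration of that bottleneck (plain Java; exact numbers will vary by machine): the two loops below do identical arithmetic on the same array, but the strided traversal is typically several times slower because it defeats the cache.

    public class Locality {
        public static void main(String[] args) {
            final int n = 4096;
            int[][] a = new int[n][n];
            long sum = 0;

            long t0 = System.nanoTime();
            for (int i = 0; i < n; i++)        // sequential addresses:
                for (int j = 0; j < n; j++)    // cache-friendly
                    sum += a[i][j];
            long t1 = System.nanoTime();

            for (int j = 0; j < n; j++)        // strided addresses:
                for (int i = 0; i < n; i++)    // cache-hostile
                    sum += a[i][j];
            long t2 = System.nanoTime();

            System.out.printf("row order: %d ms, column order: %d ms (sum=%d)%n",
                    (t1 - t0) / 1_000_000, (t2 - t1) / 1_000_000, sum);
        }
    }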
That's a bit depressing to hear. The SSD perf I've seen on laptops seemed great, but I suspect that in conjunction with slower memory, we'll still see issues. :/
Yes, I agree. I'll caveat my original statement by saying this: I use Windows (work machine), and I'm using the SSDs they gave me, so they may not even be good ones.
That said, I can come nowhere near the performance of a video where some guy wired up a bunch of SSDs, then opened every Microsoft Office application at once and they just popped up on his screen. Some crude timings below (in seconds; time to open from the "Start" menu, till the hourglass morphs back to an arrow):
Now, you may be saying "DUDE! That's awesome!" and yes, it is, compared to platters. But it's not instantaneous, and as soon as your context readjusts, you still notice the startup time.
> Now, you may be saying "DUDE! That's awesome!" and yes, it is, compared to platters
???
I'm saying "which co-worker swiped your SSDs while you weren't looking?". Quick test here, I rebooted Vista and opened Office apps, with a stopwatch timer in the other hand, rounded to closest half a second the results are:
Word: 3s, Excel: 2s, Outlook: 3.5s (no mail in it), OneNote: not present, PowerPoint: 2.5s
That's from a 5400rpm WD Blue laptop drive, on a Core 2 Duo laptop.
So if I understand it correctly: first they threw away the Rhythmbox tray icon and made it quit on "X" in order "not to clutter the tray". Then they reverted to their own meta-trayicon and made even more applications fold to the tray on "X". Now they want to remove "quit" completely and make "X" a "maybe quit, maybe close, you don't need to know the answer"?
I'd really like it if they started publishing their research / references / discussions. Right now it seems like they just do what they want and wait to see if it works or not.
This argument is pointless. It makes no sense without putting it in the context of content creation vs. content consumption.
Content creation is very resource-intensive. If you don't ever quit any apps, you run out of resources. Simple as that.
Content consumption, on the other hand, is easy and very optimized. Always have your browser running with 10 pages loaded while your music player is running and a movie is paused. No problem. Never quit anything.
It's not about never quitting, though. It's about quitting when you close the last document, or something similar, and breaking the distinction between closing and quitting.
Heavy-duty apps should quit in the background after you close what you were working on, but this shouldn't necessarily matter to the user.
I'm happy that Canonical is thinking about usability, but has any of this actually been backed up by any kind of experiment? I see one link on Google to a study done on Thunderbird (sadly down, as it's the same blog), and a bunch of blogspam related to it, but very little evidence that the recent radical changes they are making have undergone any usability testing. If you're trying to help users, shouldn't you study their actual behavior?
I admit though that this is not my field, and I could be using poor search terms.
I've been subconsciously watching this minimization trend go by. The latest instance I've seen is in FF4, where the forward/backward button has lost its little dropdown indicator. Now you have to know it's there, and use a right click to get the list of sites.
Not being a UI/UX/UXB guy, I have to wonder if there's some magazine they or their managers all read that had an article advocating minimalism, 'cause there sure seems to be a bandwagon rolling through town.
“Say good-bye to manual saving. Auto Save in Mac OS X Lion automatically saves your work — while you work — so you don’t have to. Lion saves changes in the working document instead of creating additional copies, making the best use of available disk space. The lock feature prevents inadvertent changes from being saved and automatically locks documents after two weeks. And the revert feature returns you to the state the document was in when you last opened it, so you can feel free to experiment with confidence.”
So instead of "save" which seemed rather simple to me, you have to learn two new features such as "lock" and "revert", which do not seem so simple to me.
Oh and also, now you have no idea what data is saved in your documents. Your documents can carry in them hidden past versions without you being aware of it. Which can prove very embarrassing in many business situations. Which means that anyone that handles sensitive data will have to create or buy complex new software that scrubs all documents of old data. (this is already an issue with Microsoft Word and its proclivity to save all kinds of dangerous metadata).
So when before you merely had to save stuff, now you have to worry about reverting, scrubbing, etc. Thank you GUI elves.
The thing is, you tend to hit save every other second, while you probably only need to revert/lock like once or twice a day.
Think of lock just like committing to your personal source control. Revert then maps perfectly to SVN revert. You work just like you always do. You commit whenever you've finished some task and want to start a new one. You revert whenever you find out that you did some wrong stuff since your last commit. I do these actions every day today.
But I almost never discard all changes since the last save. Hence, I'd be glad to have the OS save my work implicitly for me and take care of my personal source control system. The proposed system would actually save me a lot of work!
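A toy model of that workflow (plain Java, names invented for illustration; this is not Apple's actual API): every edit is saved implicitly, "lock" marks an explicit commit point, and "revert" rolls back to it.

    import java.util.ArrayDeque;
    import java.util.Deque;

    public class AutoSavingDocument {
        private final Deque<String> history = new ArrayDeque<>(); // implicit saves
        private String lastLocked = "";
        private String text = "";

        public void edit(String newText) {
            text = newText;
            history.push(text);  // the "save" happens on every change, silently
        }

        public void lock() {     // like committing to personal source control
            lastLocked = text;
        }

        public void revert() {   // like `svn revert`: back to the last commit
            text = lastLocked;
            history.push(text);
        }

        public static void main(String[] args) {
            AutoSavingDocument doc = new AutoSavingDocument();
            doc.edit("Draft one");
            doc.lock();                              // once or twice a day
            doc.edit("Draft one, plus a bad idea");  // saved automatically
            doc.revert();                            // discard changes since the lock
            System.out.println(doc.text);            // prints "Draft one"
        }
    }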
I think that losing unsaved work is a bigger and more common problem than those downsides you mention. Locking and reverting is not something mandatory, users can simply ignore those features until they need them. On the whole I see automatic saving as a nice step towards computers as an appliance that can be used with less and less regard to implementation details. (The security side is an interesting point, I wonder how they solve it.)
The remarkable thing is that for thirty years we've been using tools that default to destroying your work. I can't think of a better example of our collective tolerance for janky technology than the persistence of such a glaring regression from the typewriters (etc) that personal computers replaced.
Makes a lot of sense, if you kept transparent source control of every change made to something and constantly updated it on each change. The issue, though, is that everyone is used to saving.
I guess a regular save can be the same idea as a tag in version control, with the automated stuff too. I think the next OS X is doing something like this?
> Makes a lot of sense, if you kept transparent source control of every change made to something and constantly updated it on each change. The issue, though, is that everyone is used to saving.
True enough, but they're also getting used to transparent saving. You don't 'save' your state in Gmail; if Firefox or your computer crashes, you didn't 'save' the current tab state. And so on. The state is just there and always up-to-date.
If you look, you can see how they are saving in the background every X seconds (such as Gmail's composition warning), but I imagine most people simply gloss over that.
I think your overall point is solid, that the application (or os) should be making sure it always has a saved state of my document.
But...there would still be a need for a way to specify where the document should be stored on the filesystem. For instance when writing software, files have to go in specific places. The file can't just be saved anywhere, or in some application specific bucket, I need to be able to specify the location.
Can anyone recommend a good novice-friendly Linux distro with a clean, consistent UI that isn't being progressively replaced with presumptuous avant-garde nonsense?
I guess the next one in line should be getting rid of the file as a concept. Not much point knowing where exactly the song you want to listen to or the movie you want to play is stored.
Depends what you mean by "where". With respect to finding it amongst other files, making it available on portable devices, sharing it (or NOT sharing it) with others, "where" is very important to the user.
What we really want is the user's concept of a file decoupled from the system's concept, and then cleansed of various baggage from the prior coupling. That's very different than making it a system concept exclusively.
I meant from a user's perspective. You shouldn't care if the song is stored in /foo/bar or in a remote database somewhere in the cloud. You access it through some obvious interface e.g. iTunes.
Having 'file' as a central concept of accessing data doesn't have much connection to the actual way it is used. My vim config file doesn't have much in common with a song I've just downloaded. It is stored the same way, but that's about it.
Most aspects of the file concept are indispensable to the user. A generic data container is what allows us to have standardized document formats not coupled to applications or platforms and generalized organizing, packaging, sharing, storage, and search. These are user concepts that can't be abstracted away without sacrificing enormous amounts of utility.
The aspects that the user can do without are related to sharing the file system with software that uses it as a backend store.
I'm sorry, I just don't see where the problem is. To take two applications that I run almost continuously, an editor and a browser: both have multiple instances of tasks, both shown to me as windows as it happens. No problem in either opening or closing them. The mechanism is as old as windowed operating systems; find the menu item that says close, or click in the corner (or its analogue). This action scales: do the same thing if I wish to close the app (although, as was pointed out in passing in the article, this action is decreasing). Is there anyone reading this who is confused by these actions? I really don't think so. Instead of the disparaging remark about cargo-cult interface design, maybe it was a case of the designers understanding that there are some wheels that need to be shared and not re-invented. Given the time these methods have been in place and in use, it doesn't seem like a good thing to change without a much more compelling argument.
Many applications have both a "close" and a "quit" option. The "close" option does what you describe - they want to get rid of the "quit" option, which is used to quit an application.
(Try opening two Firefox windows and you can easily see the difference.)
This guy may be happy to know that the new Mac OS that's coming out doesn't have running lights on the dock by default. So you can't really tell if an application has quit or not. In fact, apps now resume when launched.
The trouble is, not all apps support resuming when launched. So it's like a big mishmash and you're not quite sure which are running and which are not. The transition will not be pretty! I'm gonna turn my running lights back on as the first thing I do on the new Mac OS.
Look, I actually LIKED the Mac's paradigm of being application-centric, and not document-centric. The latter was Windows' way: MDI, SDI, and all that. When you closed the last window, the app closed. Or did it?
On the Mac, I had control over when I wanted the app to close. Even when I did not do development, I understood what was a program and what was a document. It ain't that hard to do. As a developer, I really love to know what app is running and be able to shut it down, without having to force-quit it.
This looked like an interesting read. But then they mentioned Rhythmbox and I remembered the one thing I hate about that player: that stupid icon that stubbornly stays there in the notification area even if I don't want to listen to music any longer.
I think you could easily make the opposite point: With slower media (floppy disk, slower hard disks etc.) starting an app was costly. This is why people tried to avoid quitting an app just to select a file. Today hard disks (especially SSDs) are fast. Quitting and starting an app is rather cheap. (With the exception of those commercial apps that try hard to make you feel that they were worth the money.) I thus would like to propose to get rid of all those "Open file" dialogs. In the age of multitasking, they are an obvious case of cargo cult.
With all the acknowledgement of Apple's role on the history of this, I have to wonder whether they are aware that Apple has been laying the groundwork for doing the same thing since Snow Leopard, and that Lion can already automatically quit and even pre-start applications without the user's involvement.
That said, I suspect Apple will keep the "quit" menu item, even if it's not necessary, for at least one version, so that people have a chance to learn that it isn't necessary without being freaked out by its sudden disappearance.
Also, although app developers have been encouraged to adopt the new conventions, there will be plenty of legacy apps that don't for some time to come.
The recent UI changes in Ubuntu (getting rid of X11, now this?) seem just crazy. I use Ubuntu on my machine, because I want a developer friendly OS. Recently, however, it just has not been the case: for example, OCaml 3.12 is still unavailable in apt repos (even on "unstable" branches), despite it being available in Fedora/CentOS yum repos or in macports. Same for Scala 2.8.x, Erlang R14, etc...
Has the Ubuntu team completely forgotten its original constituency in pursuit of "Linux that your grandparents can use on their netbook"? Looks like back to debian-unstable (or Fedora rawhide?) for me.
I remember the first days of GNOME, a Windows-clone in looks, with an ambitious acronym "GNU Network Object Model Environment". Are they anywhere near that acronym in vision today?
For me, as a power user, "quit" means "stop doing anything!!". For an IM app, it means: stop showing me as available, stop using my Yahoo login, because I want this other program to use it! Just stop everything! Don't assume "oh, you just want to be offline". NO! I want to stop you in your tracks and prevent you from doing anything whatsoever.
I quit an application when I feel it's not doing what I want. I quit an application when I feel the application is being presumptuous and making false assumptions about what I want to do.
I hardly ever quit an application because I need the memory... it's not about process/memory management. I often close applications to reduce clutter on my desktop, and clutter can be reduced without actually quitting applications, so they have a point there, but I'd still hate it if applications assumed that I don't really want to quit.
It really annoys me that closing Banshee doesn't stop it from playing music.
It's about setting rules and drawing lines; it's about having control over one's own computer.
I quit a movie/music player to stop it from emitting sounds. No, the sound menu is not enough of a replacement. It might be a good alternative, but not good enough to warrant "never quitting the media application".
I open a browser in private mode then quit it, because .. well it's private mode; if you can't quit it it kinda defeats the point.
I quit a download application (e.g. a torrent client) to stop it from downloading/uploading (to free up bandwidth).
You could try to rethink every use case, and you can provide other ways to achieve the same goals. But, in the end, this is not a good reason to make applications non-quit-able.
This brings to mind the X session management protocol (http://en.wikipedia.org/wiki/X_session_manager), which never seemed to be used in modern Linux applications. Issues that others brought up regarding memory consumption could be addressed with more consistent use of this or a similar protocol. While intended for saving state on logout, there's no reason the window manager/session manager/something else couldn't just direct an application in the background to save state and then kill it when memory gets low. That said, it's probably a bit too late in the game to establish such a policy. Users would be more likely to blame the desktop environment for killing the program and losing all of their work instead of blaming the application for not saving the state to begin with.
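A rough sketch of that save-then-kill policy (plain Java with interfaces I've invented for illustration; the real XSMP API looks nothing like this), just to show how small the contract would be:

    import java.util.ArrayList;
    import java.util.List;

    interface SessionClient {
        void saveYourself();    // persist enough state to resume later
        void die();             // terminate; state is already on disk
        boolean inBackground();
    }

    public class SessionManager {
        private final List<SessionClient> clients = new ArrayList<>();

        public void register(SessionClient c) {
            clients.add(c);
        }

        // Called when the system runs low on memory: background apps are told
        // to checkpoint, then killed; they can be restarted with their saved
        // state when the user comes back to them.
        public void onLowMemory() {
            for (SessionClient c : new ArrayList<>(clients)) {
                if (c.inBackground()) {
                    c.saveYourself();  // an app that skips this loses its own data
                    c.die();
                    clients.remove(c);
                }
            }
        }
    }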
But really, what you want is for every document-editing app to open a single document, with maybe some tools on the side. The problems he mentions here go away and the "quit" menu option remains perfectly logical.
It's made a little tougher by browser tabs being the bastard son of MDI. But seriously, multiple windows should be handled by the taskbar or some other thing.
Music is a tougher question, but considering that every Linux jukebox app I know of is a complete pig on resources, some way to quit is pretty necessary.
I'm significantly confused. The article seems to be talking about the distinction that OS X has between closing a window and quitting an application, and talking about how this behavior exists on Windows and Ubuntu as well. I must be missing something somewhere.
a) Does this mean Ubuntu will be patching each and every application in Synaptic to follow this new paradigm?
b) Isn't an explicit action more favorable than an implicit one? I.e., a direct mapping between what the user is doing and what happens with system resources.
c) How do you close emacs with 25 buffers or browsers with 20 tabs in this new world?