It gives me a big smile when developers take performance seriously. Performance is user experience.
I am passionate about performance in web, native, and mobile applications. It irks me tremendously when web applications are slow to respond to my actions. There are some otherwise very useful sites that routinely frustrate me with their high latency.
But I think my greatest performance peeve is embedded devices such as commercial DVRs that are not simply slow but agonizingly slow. I've seen DVR UIs that routinely take several seconds to react to user input. I find it remarkable that engineers can be so beaten down by their budget-tightening peers that they allow such abysmal user experiences.
Machines need to act like machines: fast if not clever. They process input literally, so they had better do so quickly. That's their lot in life, so to speak. When I speak with a human, pauses—a few seconds of thinking—are expected because we are thoughtful creatures. My DVR should not need to think about what it means that I pressed the "Guide" button. The channel guide should just plain appear. In fact, a DVR does not think about anything. As the user, I perceive the DVR software squandering hundreds of millions, perhaps billions of processor cycles doing unimaginably wasteful operations instead of simply displaying the channel guide.
Bravo to the OP for looking well beyond his own use case and considering what his users may be doing with his application!
The DVR is establishing a connection with the cable provider so that it can communicate the details of your activity back to them. This can take several seconds due to load or inadequate resources on the server side.
I don't know if you're being serious or not. If that's true, it's horrible on many levels, perhaps the most bothersome being holding up the UI while waiting for a network service. (Though tracking DVR usage at that granularity is annoying in its own right.)
I'll give another example of why this is important. Ubuntu decided to have a 'Mac style' global menu bar for all applications. Turns out that when Unity tries to display a menu bar for Firefox and encounters a lot of bookmarks, something goes wrong and the whole desktop freezes. Guess the designers/developers never considered someone with thousands of bookmarks?
I couldn't agree with Brent more. Performance is absolutely critical to the user experience of an app. It is not premature optimization to ensure an app remains responsive to the user at all times and can handle real-world sized data sets. It is good engineering.
In his beginner lesson on programming a poker game with Python, Peter Norvig asks the student to write a test for a 10,000-card deal.
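Roughly in that spirit, a sketch (this deal() is my own stand-in, not Norvig's actual code): asking for a deal far larger than one deck forces the hidden single-deck assumption into the open.

    import random

    def deal(numhands, n=5):
        """Deal numhands hands of n cards, using as many 52-card
        decks as needed so a large deal doesn't silently truncate."""
        one_deck = [r + s for r in '23456789TJQKA' for s in 'SHDC']
        decks_needed = -(-numhands * n // 52)   # ceiling division
        deck = one_deck * decks_needed
        random.shuffle(deck)
        return [deck[n * i : n * (i + 1)] for i in range(numhands)]

    def test_big_deal():
        # 100 hands of 100 cards is a 10,000-card deal; a naive
        # single-deck version would quietly hand back short hands.
        hands = deal(100, n=100)
        assert len(hands) == 100
        assert all(len(h) == 100 for h in hands)

    test_big_deal()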
That's the same thing. "It just works" is the user's resulting feeling about a product that works well under any conditions, known and unknown, i.e., a well-engineered product. (Which often means going over the project manager's head.)
This means having few to no implicit assumptions in the code. Vim is a great example: developed when a 10 KB file was a huge file, it will kindly open and edit a 2 GB log file for you (if you don't have all those silly third-party plugins). Vim may burp a bit on very, very long lines, but it still shows that the code behind it is not assuming on your behalf that you are editing a couple of haikus.
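The same principle in a quick Python sketch (my example, not Vim's code): iterating a file lazily keeps memory flat no matter how big the log is, while a read() of the whole thing quietly assumes it fits in RAM.

    # Processes a 2 GB log in constant memory by streaming it.
    def count_errors(path):
        errors = 0
        with open(path, encoding='utf-8', errors='replace') as f:
            for line in f:              # reads one line at a time
                if 'ERROR' in line:
                    errors += 1
        return errors

    # A version built on f.read().splitlines() would bake in the
    # assumption that the whole file fits in memory.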
This is one of the major intersections between CS theory (big-O notation, etc.) and usability that many self-taught programmers miss or don't take into account.
Being conscious of the structural decisions you make at a low level, and abstracting them out so that they can be refactored, goes a long way toward dealing with performance issues as they come up.
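A toy illustration of what that buys you (my own example): the same membership check is O(n) against a list and O(1) against a set, and hiding the choice behind one function means the structure can be swapped later without touching any callers.

    # 100,000 membership checks: a list scans linearly on every
    # check (~10^10 comparisons total); a set hashes in constant time.
    items_list = list(range(100_000))
    items_set = set(items_list)

    def is_known(x, known=items_set):
        # The structure is hidden behind this one function, so
        # swapping list for set never touches the call sites.
        return x in known

    hits = sum(is_known(x) for x in range(100_000))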
Is this strictly an issue with self-taught programmers? I mean, there have been a ton of articles on the front page here about why coding tests at interviews are a bad thing, and most of the argument seems to be that you never use that material on the job; that attitude appears prevalent among web coders.
As a self-taught programmer, I take speed and big-O considerations very seriously, but I've gotten the impression that my opinions on this topic are considered too extreme. I hate nothing more than a slow program or website, and I especially hate it when a site runs so slowly it makes my fan whir. It is nice to see that there is at least one other person who probably wouldn't think I am insane.
Well, I am not sure how often big-O applies to websites. Sure, you can tune your backend to handle processing in 0.1 s instead of 0.2, but the user can still wait 10-12 seconds until all the assets are downloaded and rendered.
Speed considerations, though not precisely big-O, certainly apply to the database design and how you query it.
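For instance, a minimal sketch with Python's built-in sqlite3 (the schema and names are invented): an index turns a lookup that scans every row into a B-tree seek.

    import sqlite3

    # In-memory toy database, just to show the shape of it.
    db = sqlite3.connect(':memory:')
    db.execute('CREATE TABLE notes (id INTEGER PRIMARY KEY, body TEXT)')
    db.executemany('INSERT INTO notes (body) VALUES (?)',
                   ((f'note {i}',) for i in range(30_000)))

    # Without this index, the query below scans all 30,000 rows on
    # every call; with it, the lookup is logarithmic.
    db.execute('CREATE INDEX idx_notes_body ON notes (body)')
    rows = db.execute('SELECT id FROM notes WHERE body = ?',
                      ('note 12345',)).fetchall()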
And it must apply in many other areas. How many sites have you visited lately that take more than 10 seconds to load? How many "cloud-based" programs have you used that take far longer than they should to do basic operations? I know I can name quite a few.
It's worth noting that one can still be conscious of performance without knowing about big-O or other academic nomenclature. In my experience as a self-taught programmer, it quickly became evident that doing something in fewer cycles, or while consuming less memory, takes less time and gives the user a nicer experience. I didn't need big-O to figure that out. That said, eventually learning about it was very useful.
I sometimes sigh at poorly performing apps. I've been sighing a lot at the feedly app lately. But that doesn't stop me using it, because the functionality is compelling enough.
Putting effort into performance is probably something you'll want to do at some stage if your app gets big enough. But a lot of the time you have more important things to do right now. Users will put up with an awful lot if you've got something they really want; get that part down first.
But if you do not take care, the "more important things" will have to be refactored or even rewritten all over again, because you built them on a poorly performing base.
Finding the bottlenecks should always be a priority. Fixing them can be done at a later point.
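In Python, a first pass at finding them can be as cheap as the standard library's profiler (slow_function here is just a placeholder for your real code):

    import cProfile

    def slow_function():
        # Stand-in for whatever your app actually spends time on.
        return sum(i * i for i in range(1_000_000))

    # Sorted by cumulative time, the top entries are the bottlenecks
    # worth writing down, even if you fix them later.
    cProfile.run('slow_function()', sort='cumulative')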
Why not just soft limit it? If you think a vanishingly small percentage of people are going to run into the limits, just put a note "You've reached the end of the internet! Please click here to email me and let me know you've found it" and wait until you get a response, and do the cost-benefit analysis then.
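A sketch of what such a soft limit might look like (the threshold and the wording are invented):

    SOFT_LIMIT = 10_000   # the "vanishingly small percentage" point

    def add_note(notes, note):
        notes.append(note)              # never refuse the action
        if len(notes) == SOFT_LIMIT:
            # Not an error: everything keeps working. We just ask the
            # rare user who gets here to tell us, and run the
            # cost-benefit analysis when that email arrives.
            print("You've reached the end of the internet! "
                  "Please email us and let us know you found it.")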
Putting an artificial limit and forced human interaction into software is problematic.
I encountered a database once where there was a hidden limit on the number of categories for customers. Specifically, if you added more categories than some very low number (16 or 20), the program would fail to open; it would hang on a loading screen.
We didn't have the source, so we ended up having to find the developer and pull him out of retirement (it was a 16-bit Windows Crystal Reports program), and he found the error after looking at it for a few weeks. It was a one-line change that bumped the limit up to something like 1000.
Most of the time such limits aren't even realized, and you're assuming the programmer was smart enough to notice every possible limit and communicate it to the user.
Don't cripple your software. It's better for it to run poorly than not to run at all.
Poor performance with large numbers of items is just such a soft limit, except instead of artificially disallowing you from doing what you want, it degrades gracefully.
Taking that into account, your proposed approach would be: don't worry about performance on what you consider to be abnormally large data sets, and instead just wait for people to e-mail you about it, and you can decide if it's worth fixing then.
And there's nothing wrong with that approach, I think! A lot of developers rely on it too much, but it's better than putting in an artificial roadblock.
Well, my point is that you add an explicit call to action for it; my experience has been that under 1% of people will email or otherwise contact you about a problem. Most people just abandon the product right there, so a big "this is a problem" button helps.
Hard not to agree here. It's one thing to architect your side project to handle millions of requests from the outset, but I think designing for 30,000 notes will give you a better design at 30 notes too.
One example where this approach obviously wasn't taken was early Android builds (at least on the HTC Hero). I emailed support because it had started to take 15+ seconds for a sent or received message to appear in the message list, and the support engineer said it was because the whole list had to be loaded into memory, the message appended, and then the list written back out. I've never seen a single iPhone with that problem, so I assume they thought of it from the start.
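The general shape of that bug, as a sketch (not the actual Android code; the flat-file storage here is invented): rewriting the whole store per message is O(n), while appending is O(1).

    # O(n) per message: read everything, append, write everything back.
    def save_message_slow(path, msg):
        with open(path, encoding='utf-8') as f:
            messages = f.read().splitlines()
        messages.append(msg)
        with open(path, 'w', encoding='utf-8') as f:
            f.write('\n'.join(messages) + '\n')

    # O(1) per message: just append the new line.
    def save_message_fast(path, msg):
        with open(path, 'a', encoding='utf-8') as f:
            f.write(msg + '\n')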
I don’t know who would type in 30,000 notes. That’s a lot of typing.
Maybe on a touchscreen keyboard, or in an app limited to that; but it would behoove the OP to consider cross-platform possibilities and very long timeframes, or even to look at use cases in similar systems. In org-mode (which I've got on my phone and desktop), I've got tens of thousands of lines and could easily see keeping 30k notes. Granted, it's different, since there's more than purely human-entered data, but as the OP sagely observes, there may be automated systems inputting to Vesper someday.
You really should have kept reading. OP made precisely that point, that there may be syncing from a desktop or web app, or submitting from other apps, or some other form of automation.