
> Most things Just Work now.

I would definitely agree this is true compared to "before". I remember fooling with Linux in the late 90s, when getting graphics to work at my monitor's native resolution, with sound, was a big accomplishment. Compare that to now: out of the box on my slightly aged ThinkPad I can use my touchstick, touchpad (with two-finger scroll!), volume/brightness/media keys, etc.

However, I think "just works" is not necessarily the right goal, at least not for Linux. "Just works" is a two-way street: the computer won't throw you any curveballs as long as you don't throw it any, either. Toying with Linux on the desktop for the last few months, I've seen a lot of "it just works" solutions that become a complete disaster when things break -- and things _always_ break -- because figuring out how it works, what went wrong, and how to fix it is not intuitive. Trying out KDE Neon, I had my screen go black on login. Booting into the terminal and clearing the cache fixed it, but then it happened again later. I don't like the culture of copy-and-paste witchcraft: if something breaks, I want to know why it broke and what steps I can take to deal with it.

Trying to make things automatic often means making things complex -- you have to include the baggage to cover every possibility and the logic to identify what's happening. Making things easy often means making things inflexible -- you remove what's not necessary from the main use case. Instead, I'd like to see a focus on goals like simple, direct, and transparent, so it's attainable for someone to observe and learn to understand how their system works and to work with it to accomplish their goals.


I believe the problem is that you can either get something that works, or you get something that is very open to tweaking. Systems that are very open to tweaking and changing things around just don't work as smoothly and easily as systems that "just work". And I think that's the reason why so many people are put off using desktops that aren't Windows or macOS.

My computer is a tool, which I happen to use to develop software and do some schoolwork on. I didn't get my computer for the sake of examining how the operating system works or for the sake of changing around things in config files and changing my DE, I got it for the sake of actually using it as a tool to do other things.

So I got this laptop and installed Fedora on it, almost a year ago now. Not once has anything in the OS spontaneously broken, even after two OS upgrades, never have I been forced to go into config files and muck around, no tweaking necessary, using this computer is a breeze. I've never been happier using a computer, because at last I'm using a setup which is stable, never breaks on its own, and which I also know is safe, FOSS, and not full of NSA backdoors or whatever.


The thing is, Windows and macOS each worked to expose a consistent model/metaphor of how your computer works for users to learn and understand. This model is an abstraction (and thus leaks), but they had a lot of time, money, and smart people to take a crack at it, and it works pretty OK. Also, the model they've built is visual, which has benefits like discoverability -- for example, I learned about computers growing up essentially by clicking every button and seeing what happened.

It's hard to really do that in Linux. It's functional because so many people can contribute all those little pieces, but that comes at a cost of consistency and often complexity. There are a lot of opaque moving parts in a Linux system, and I don't think that's a good thing. When I decided to use Linux on a personal computer, I tried a lot of different ones until I decided to use Void Linux. runit was a big part of that -- all the init systems I've touched over the years, nothing was as immediately simple and obvious to use or observe as runit.


1. Search "from:John Smith newer_than:30d", note the number of emails as x

2. Search "from:John Smith newer_than:30d sustainability", note the number of emails as y

3. pctg = (y / x) * 100
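Step 3's arithmetic, as a minimal sketch (the counts here are made up; x and y come from the two searches above):

```python
def sustainability_pct(total: int, matching: int) -> float:
    """Percentage of a sender's recent emails that match the keyword."""
    if total == 0:
        raise ValueError("no emails from this sender in the window")
    return (matching / total) * 100

# e.g. x = 40 emails in the last 30 days, y = 12 mentioning "sustainability"
print(sustainability_pct(40, 12))  # → 30.0
```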

Guess my career as a product manager is secure.


In what client? Doesn't work in Outlook.


A few years back I was in the same boat, rocking a Sansa Clip and no smartphone. I'm a little fuzzy on the details, but the way I consumed Audible content was:

- Burning to CD, which their software did

- Ripping the CD to .wav using other software

- (optional) Chopping up the .wav files into 5 minute chunks

- Compressing everything to MP3 (V8 VBR)
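The split-and-compress steps could be scripted roughly like this -- a sketch only, assuming ffmpeg and LAME are available (the filenames are placeholders, and the original poster may well have used different tools):

```python
from pathlib import Path

def rip_commands(wav: Path):
    """Build the (optional) 5-minute split and the MP3 encode commands
    for one ripped .wav file."""
    stem = wav.with_suffix("")
    # split into 5-minute (300 s) chunks with ffmpeg's segment muxer
    split = ["ffmpeg", "-i", str(wav), "-f", "segment",
             "-segment_time", "300", "-c", "copy", f"{stem}_%03d.wav"]
    # LAME VBR quality 8, matching the "V8 VBR" setting mentioned above
    encode = ["lame", "-V", "8", str(wav), f"{stem}.mp3"]
    return split, encode

split, encode = rip_commands(Path("book_track01.wav"))
print(" ".join(encode))
```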

Not sure if they've since yanked this feature, but it worked then!


I think this is being attempted, though every attempt I've seen is horrendous.

- Graphics designers are not necessarily qualified UI (or "UX") designers

- UI designers are not necessarily qualified UX designers

- Qualified UX designers are not necessarily qualified designers of an OS UI. Most UX work you can get is essentially applying a few simple principles to yet another implementation of a CRUD app.

I've only seen Linux distros make "design" progress on these fronts:

- Eye candy: making things shake, glow, blink, blah, whatever. This is often pointless, and can exacerbate issues with hardware.

- Terrible "modern design" copycatting: making fonts and buttons really huge and interfaces empty, with no real clue what someone is going to use them for. KDE Neon's package manager sticks out as a bad offender I saw recently.

- Out-of-box setup: releasing something that has all the batteries included. Unfortunately, this also pretty much guarantees: it has software on it you don't need; it has software on it you don't want; it has software on it that will break at some point; and understanding and customizing it will be treacherous and nigh impossible.

You can go two ways: make a single/narrow purpose system, or make a general purpose system. Linux is great for single/narrow purpose systems, since it's all pieces you can pick and choose at your will. Of course, that won't work as a general computer.

For a general purpose system, I think we need to stop trying to pretend that it's easy or can work out of the box and start accepting the fact that it's difficult and provide people the tools and documentation to handle it.


I've been dealing with the same issue. I haven't figured out any real trick to it; as best I can tell, a working Linux system is composed of so many different utilities worked on by so many different people that there's a lot less standardization than you might expect.

For example, if I do something and it doesn't work, I want to see some indication of what went wrong. That could show up:

1. In stdout/stderr

2. In a logfile specific to that software, somewhere on my computer

3. In an aggregated logfile somewhere on my computer

4. Nowhere, but it will do 1, 2, or 3 after you enable something in a config file or CLI parameter
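Case 1 above is the only one you can check generically from the outside; a minimal sketch (the failing command is just a stand-in for "something that doesn't work"):

```python
import subprocess

# Run a command and surface everything it tells us directly (case 1 above).
# `ls /nonexistent` is just an example of a command that fails.
result = subprocess.run(
    ["ls", "/nonexistent"],
    capture_output=True, text=True,
)
print("exit code:", result.returncode)   # nonzero on failure
print("stderr:", result.stderr.strip())  # the error message, if the tool prints one
```

If stderr is empty and the exit code is nonzero, you're in cases 2-4, and the hunt through logfiles and config options begins.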

This all assumes that if there's something you have to look up (like where it logs files to), you can figure it out based on the man page or website or inspecting your filesystem, and then it actually logs some actionable information on the problem.

I've just come to terms with the fact that everything, no matter how simple it should be, is an ordeal the first time I do it. I just set aside time to research and make notes.


Out of habit, I tend to stay logged out of everything, and if I need a particular service (including Google), I use it in an Incognito window.

However, I don't use a VPN or spoof my User-Agent, so Google 100% knows who I am without any fancy tricks. I've wondered -- wouldn't I be better off logging in everywhere, with my Google privacy settings flipped to maximum? While it's certainly possible for them to map my fingerprint back to my account(s) and opt-out preferences, I think that's a bit much to expect of them. I might be shooting myself in the foot by always being logged out!

