
I agree with the article that there is a lot of low-quality "journalism" out there, designed to outrage or entertain rather than inform.

However, that does not mean that all journalism should be disregarded. I read the Washington Post and listen to NPR (regardless of how you feel about their cultural programming, their news organization is excellent). Citizens in a free society have a duty to be informed about the issues facing that society. I reject the idea that there aren't readily available high-quality sources of information about the world. Are any of them perfect? Clearly, no. But ignoring them because they're not perfect strikes me as nihilistic.

There's one quote from the article that alarmed me:

> The news is overwhelmingly about things you cannot possibly influence

In democracies, we do have elections...


I tried proper left hand mice for a while (naturally-right-handed, switched to left-hand mousing for RSI reasons similar to above), and eventually I just embraced the idea of using a right-hand mouse with my left hand. It feels completely natural now. As an added benefit, I can use any "normal" mouse by just moving it to the left of the keyboard.


Now that Raspberry Pis are hard to find (in mid-2023), the Dell/Wyse 3030 makes a very nice print server for a reasonable price.

https://www.parkytowers.me.uk/thin/wyse/3030/


That was certainly the case, but there's been plenty of Pi 4 stock for the past few weeks, and availability of other models is getting better too. https://rpilocator.com

Although an old thin client might still be cheaper.


I've had good experiences using Upduino 3.0 and 3.1 [0] with the IceStorm tools via apio [1]. I wrote a blog post [2] with some info on getting things set up via Linux. All you need is the Upduino board, which interfaces to your host system via USB (so no special programmer is needed).

[0] https://tinyvision.ai/products/upduino-v3-1

[1] https://github.com/FPGAwars/apio

[2] https://daveho.github.io/2021/02/07/upduino3-getting-started...
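For anyone curious, the basic apio workflow is very short. From memory it looks roughly like this (treat the exact board name as an assumption and confirm it against the output of apio boards; toolchain packaging has also changed between apio versions):

    # fetch the open-source FPGA toolchain packages that apio drives
    apio install --all
    # create a project for the board (verify the board id with 'apio boards')
    apio init --board upduino3
    # synthesize and place-and-route the design in the current directory
    apio build
    # flash the bitstream to the board over USB
    apio upload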


> And if you want a portable computer (which by the way are demonstrably more secure in the face of physical tampering) you basically relegate yourself to terrible battery life, poor display support, dicey sleep support, and the fixes for these often compromise performance.

This depends a lot on how well supported your hardware is. I run Linux Mint (MATE) on a Thinkpad T430 and have had 0 issues with displays, sleep, battery life, you name it.


Running Linux Mint MATE 17.3 on my ThinkPad T530 for a couple years now with no problems. I even game on it with Steam (XCOM runs OK on the integrated Intel graphics). Been very happy with it.


I have a T520, and last year I tried to run Ubuntu, but my battery life was around 20% worse than under Windows 10 (and the Windows 10 battery life wasn't great either). You say you haven't had any battery life issues. Does that mean it isn't worse than Windows?


Have you plugged in an external monitor and not had an issue? I think that's not the norm even for that setup.


What do you mean by issue? After a couple of xrandr invocations to configure them, which could be automated by some GUI app if I were so inclined, I've never seen external monitors have issues on either of my Linux laptops.
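For the curious, "a couple of xrandr invocations" means something on this order (the output names here are just examples from my machines and will differ on yours; a plain xrandr lists what's actually connected):

    # see which outputs exist and which modes they support
    xrandr
    # extend the desktop: external monitor at its preferred mode, to the right of the panel
    xrandr --output HDMI-1 --auto --right-of eDP-1
    # or mirror the laptop panel onto the external monitor instead
    xrandr --output HDMI-1 --auto --same-as eDP-1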


Requiring an xrandr invocation equals having issues. Perhaps it's a small problem, perhaps it's a very easy problem, but it definitely is a problem.

Not having issues means that you arrive at the hardware (which you may have never seen, e.g. at a customer's site), plug the wires, and it works immediately.

More importantly, not having issues means that you can rely on being able to just plug the wires and have it work, and that you don't have a risk of being unable to make it work immediately even if you forget the right invocations and are offline and can't look them up.


> plug the wires, and it works immediately

How does it work immediately? How does it know if I want to clone or extend the display? If I extend, do I want the same resolution on both screens, or different? You'll have to set that somehow, and whether it's a GUI or a CLI tool doesn't matter.

Forgetting the invocations isn't really an issue anymore either; my shell (zsh) has autocompletion of xrandr outputs, modes and resolutions.


With OSX/Windows, it tends to mirror by default and/or present a window asking you what you want to do.

Requiring a CLI to connect a monitor/projector is a UX fail.


There are plenty of tools to automate that. arandr is a very simple and powerful tool to arrange monitors with a GUI. Gnome has its own, much simpler, monitor configuration dialog. My system defaults to extending to the right, by the way, because that's generally what I want when connecting a projector.

None of this is new, and the whole point of this discussion is that Linux desktops are much better than they were ten years ago. It is that old state that many folks have in mind when criticising Linux distros' usability. It's just not a very interesting discussion to have.


And the thing I think you're hearing is that "progress" should not equate to "good enough." Doing a sensible and even helpful thing by default is part of user friendliness.


On my Thinkpads, an X200 (VGA) and an X220 (VGA, DisplayPort), plugging in external monitors has always Just Worked. I've been running Xubuntu on them for about 5 years.

It even worked on the first try from the X220's DisplayPort through a DP-to-HDMI-cable onto a TV.

-----

I've never had to use xrandr or any command line tool to select displays – on Xubuntu, there's this little graphical dialogue: http://netupd8.com/w8img2/xfce-mini-displays.png that pops up (or you can force it to show by hitting that key on the Thinkpad keyboard with the picture of an external display).

OTOH, I did just a month ago for the first time actually use xrandr, but this time it was because I wanted to write a script that set my windows and monitors up "just right" for how I like it when I'm at the office. I love how easy Linux makes it to do that stuff when I find I do want something automated.
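In case it's useful, the monitor half of that script boils down to a single xrandr call, something like the following (the output names and geometry are specific to my setup, so adjust to whatever plain xrandr reports on yours):

    #!/bin/sh
    # "at the office" layout: external monitor to the left, laptop panel stays primary
    xrandr --output DP-1 --mode 1920x1080 --left-of LVDS-1 \
           --output LVDS-1 --primary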


> Jekyll should work on Windows because Ruby works on Windows.

Jekyll is a bit of a nightmare on Windows at the moment, at least if any significant part of your workflow is based on Cygwin. (I helped a colleague get it running a couple weeks ago.)

Cygwin ruby + gem install jekyll -> fail.

rbenv to build ruby from source under Cygwin, gem install jekyll -> fail.

The one solution I found that works is using Chocolatey [1] to install Ruby and then gem install jekyll. Even so, I wasn't able to get Pygments working for syntax highlighting, and the Chocolatey Ruby does not play nicely with the Cygwin terminal for interactive programs.

[1] https://chocolatey.org/
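For reference, the sequence that finally worked was roughly this, from an elevated PowerShell prompt (the Chocolatey package is simply called "ruby" as far as I recall; versions and post-install PATH behavior may differ on your machine):

    choco install ruby
    # open a fresh shell so the updated PATH is picked up, then:
    gem install jekyll
    # quick smoke test against an existing site directory
    jekyll build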


> jOOQ is SQL-centric. Your database comes "first".

This approach has always seemed completely backwards to me. Isn't the database simply a mechanism for persisting records/objects whose structure is determined by domain modeling?

To me, it would make about as much sense to say of a GUI framework, "foobarWidgets is GUI-centric. Your UI comes 'first'," as though the application itself were just an afterthought.

Don't get me wrong - I like SQL, and I happily use relational databases to store objects. I just see the database as a means, not an end.


Your data will live for the next 30 years. Your UIs are replaced with every new fad. Do you think it is more important to thoroughly design your database or your client domain model?

Of course, projects are different, and some are more user-centric while others are more data-centric. But chances are that if you're successful, you'll regret working with a horrible database schema that you didn't properly design 5 years ago, because all you cared about were your fancy foobarWidgets, implemented in a tech that no longer exists...


> Your data will live for the next 30 years. Your UIs are replaced with every new fad. Do you think it is more important to thoroughly design your database or your client domain model?

This cannot be repeated enough. It is an argument I have over and over with people who want to treat the database as a dumb store, usually because they do not want to understand databases. Any successful piece of software that stores or generates data will see that data live on and be used well beyond the original program. The database will absolutely be the foundation that multiple different pieces of software are built on.


10x this! I always get a shiver when I hear people saying "... and it just generates the DB for you...".

The DB is what I always start with, and it shall stay so.

BTW, is there some jOOQ equivalent for .NET? EntityFramework 7 is all about code-first, which I really don't like.


"Show me your flowcharts and conceal your tables, and I shall continue to be mystified. Show me your tables, and I won’t usually need your flowcharts; they’ll be obvious." -- Fred Brooks

At least in the realm of business applications (i.e. not scientific apps, or games, etc.) the most common approach is to keep data (including data dictionaries and db structures) at the center.


In many enterprise contexts, shared databases of very important data are more valuable, more long-lived and more managed than easily replaced applications; applications, often strange and ad hoc ones, are the means applied to the end of keeping databases current and useful for business. Altering old tables in backwards-incompatible ways isn't normally an option.

If you treat a database as "a mechanism for persisting records/objects whose structure is determined by domain modeling" your database will be messy and redundant.


> Isn't the database simply a mechanism for persisting records/objects whose structure is determined by domain modeling?

Not at all. You're thinking of a file system.


It's a fair question whether the visual/blocks model is appropriate for expert programmers. However, there was some research presented at ICER 2015 showing that visual/blocks languages have benefits for students learning to program. Mark Guzdial had an interesting blog post[1] about it.

[1] https://computinged.wordpress.com/2015/08/17/icer-2015-block...


He also invented dataflow analysis, which is the basis for a huge number of compiler optimization and program analysis techniques: http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.102....


I have a couple of T61s from eBay that I got for my kids. I've been using one of them for some development work, and I've been surprised at how well they work for that purpose --- as in, I could easily use one for my day-to-day work. Put an old SSD in one, install Linux, and you have a very nice machine.


I bought a Thinkpad x220 tablet off ebay as a "poor man's Cintiq" and put an old SSD in it. The thing is like a developer Swiss Army Knife. I can use the very nice keyboard for real programming work. I can use the Wacom stylus for pixel art and graphics. Makes me think that Apple is aiming at dominating the common-person's "workstation" form factor, while Lenovo is aiming more squarely at guys like us.

