
I wish they would finally include some mechanism to map running WSL instances to a hostname / static IP. Also, a big difference from running a VMware instance is that you never know when the WSL instance will be shut down, and you can't save its "state", that is, hibernate or suspend it.

Honestly, I just want a replacement for VMware that uses fewer resources, so I can run a completely isolated desktop for each project I'm working on: suspend it, reboot my machine, then load it up and continue where I left off, all windows and terminal sessions intact. The last bit assumes one can run a (non-ephemeral) graphical desktop, of course, which pretty much rules out Docker.
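
The closest workaround I know of for the hostname part is a small script that asks a running instance for its address and pins it in the Windows hosts file. A rough sketch only - the distro name "Ubuntu" and the hostname "wsl.local" are placeholders, and it needs to run as administrator:

    import subprocess

    HOSTS = r"C:\Windows\System32\drivers\etc\hosts"
    DISTRO, NAME = "Ubuntu", "wsl.local"   # placeholders, adjust to taste

    # Ask the running distro for its current address.
    ip = subprocess.check_output(
        ["wsl.exe", "-d", DISTRO, "--", "hostname", "-I"], text=True
    ).split()[0]

    # Drop any previous mapping for NAME and append the fresh one.
    with open(HOSTS) as f:
        lines = [l for l in f.read().splitlines() if NAME not in l]
    lines.append(f"{ip} {NAME}")
    with open(HOSTS, "w") as f:
        f.write("\n".join(lines) + "\n")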



Wow, I didn't know about that project. Thanks for sharing!


Check out Ujjayi, or "ocean's breath", if you're interested in breathwork. The latter name refers to the sound you make as you breathe in and out this way. All you need to do is constrict the back of your throat as you exhale, which lengthens the exhale in a very organic and natural fashion. Because of the slower exhalation it is said to stimulate the vagus nerve in a very positive manner. More people need to learn about the link between a healthy body (and mind) and the vagus nerve, IMHO.


> guess it would be best to create a mini app (reading a list of things from a backend and render it and maybe delete something from the list) in two or three frontend frameworks and then choose the one that felt the nicest for you.

This is really good advice. I did that when evaluating React, Vue, and Angular once upon a time. It was pretty clear to me afterwards which framework I would like to work with, despite what people said about each one at the time.


I'd like to add monk fruit to that list. Usually you want to combine (monk fruit OR stevia) + (erythritol OR xylitol) to get a very nice flavor. Stevia and monk fruit on their own are "flat" in sweetness but very strong; the xylitol or erythritol adds some fullness, or body.


I knew planned obsolescence was a thing, but this is just insane to me! At best, it's really disingenuous (which is bad enough). No wonder we have so many conspiracy theorists. They've somehow socially engineered this situation to be acceptable (and relatively unknown).


The subscription model is older than most people realize.


That is super interesting, thanks for sharing! Sometimes, less is more.


Which sucks if you, like me, want to keep virtual spaces (or desktops) separated. I only want to see windows (or apps) running on the current desktop in the Dock, which kind of breaks that (Apple app) model, I suppose. This is doable in Windows (and a few DEs on Linux) but unfortunately not in macOS.


Wow, I didn't know that Windows 11 forces grouping of windows. This is pretty much my biggest problem with macOS, which is why I always stayed with Windows. No reason to do so anymore, I guess.


> why can't I filter out apps that aren't open source?

This is laughable. It's 2021 and we still can't do basic filtering in our app stores. Maybe Google (and Apple) engineers only know SELECT, but haven't learned about WHERE yet? Yeah, I know it's a little much to demand. I mean, we only went to the moon like 52 years ago.


I basically gave up on the Play Store since I could not find any way to search for 'ad-free', 'tracking-free', or 'open source' apps.

The very few apps I use from the Play Store are either ad-free or offer a premium version that removes ads completely - that's the only kind of thing I'll put on my phone.

Seems there is too big a conflict of interest for even basic phone functionality - apps with ads keep the money raining.

One day my (paid-for premium version) "timed / scheduled silence" app pushed ads out to all users in an update, and I had quite the back and forth with them - they fixed it - but they did not seem to care one iota that their full-screen overlay ads would prevent a mom from answering her phone right away, or block other basic functions. It was all about the money, and they were going to maximize it.

All the while there is no incentive for Google to make basic functions available in Android or findable in the Play Store. So little competition for the abusive, ad-laden, add-a-basic-function-to-your-phone parasites.

Glad to have found F-Droid at least, but I've given up on using my phone for as many things as it's capable of, because search and discovery is so hostile to ad-free apps.


With currently two exceptions, I only use apps from F-Droid. I think I'll buy a PinePhone one of these days to throw some dev time at it; we really need an alternative to iOS and Android.


I have often wondered whether I could build some sort of general computing machine if we were pushed back to the dark ages or something. I guess you have to define exactly what level of technological achievement we were pushed back to. But would it be possible with the knowledge we have today, without ICs (or advanced manufacturing facilities), using only "simple electronics" (whatever that would be)? Fun stuff to think about!


First-gen transistor computers often used standard functional units - gates, flip-flops, and such - packaged into small modules with edge connectors and wired together with wire wrap on a backplane, like this DEC PDP-8:

http://www.oldcomputers.arcula.co.uk/files/images/pdp8104.jp...

It's fairly easy to design a computer like this.

Later TTL/CMOS designs replaced the packaged modules with much smaller 74xx/40xx ICs.

You can make basic logic gates with just diodes and resistors, but you need transistors for inversion, buffering, and a usable flip flop.

That's probably the minimum level for useful computing/calculating. If civilisation has ended and you have no transistors you probably don't have the resources to make glass valves either, so that's going to be a problem.

Of course there's always clockwork...
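
To see why inversion is the sticking point, here's a toy Python sketch of the classic flip-flop built from two cross-coupled NOR gates - exactly the kind of structure diode-resistor logic can't give you on its own:

    def nor(a, b):
        # NOR needs an inverting element (a transistor or a tube);
        # diode-resistor logic on its own can only do AND/OR.
        return 0 if (a or b) else 1

    def sr_latch(s, r, q):
        # One settle of two cross-coupled NOR gates; q is the stored bit.
        for _ in range(4):      # iterate until the feedback loop settles
            q_bar = nor(s, q)
            q = nor(r, q_bar)
        return q

    q = 0
    q = sr_latch(1, 0, q)   # set   -> q == 1
    q = sr_latch(0, 0, q)   # hold  -> still 1: it remembers
    q = sr_latch(0, 1, q)   # reset -> q == 0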


A famous example of a modular design is IBM's Solid Logic Technology that was used in their System/360:

https://en.wikipedia.org/wiki/IBM_Solid_Logic_Technology


These modules seem to be the primary influence on sci-fi movie computer design, starting with HAL in "2001".

When sci-fi writers need to create some plot tension around getting a computer either up and running or down and disarmed, the characters will inevitably be plugging/unplugging colorful modules at some point.


We could use vacuum tubes instead of transistors.

(I googled this to make sure I wasn't misremembering what I read 40 years ago in an already outdated book at the library and I was suddenly filled with a sense memory of the smell of the interiors of old electric appliances loaded with tubes and dust.)


Yes...there were a couple of generations of what we would recognize as vaguely 'modern' computers (say...roughly ENIAC to the IBM 704/709) built completely out of stuff that looked like this:

http://www.righto.com/2018/01/examining-1954-ibm-mainframes-...


Yes, that was the first all-electronic generation after the very earliest relay designs.

They were shockingly unreliable and incredibly expensive. Tubes have a very low mean time between failure, so any design that uses tubes exclusively can't work for more than short periods without breaking down - possibly minutes, maybe hours, probably not days, and absolutely not months or years.

And each failure means a cycle of fault finding, which can take hours or days in turn.

As a technology it sort of works in a prototype way - you can get some work done until you can't. But the unreliability means it's qualitatively different to a modern laptop or server farm.

The wonderful thing about integration on silicon is that it's the opposite - it's incredibly reliable, as long as you keep the thermals reasonable.


Well...certainly true of the original tube computers (ENIAC was famously temperamental), but that module comes from an IBM 700-series, which was a production product. Tube machines from IBM, Burroughs, Univac, Bendix, Ferranti, and many others were in no way mere prototypes; hundreds were built. The tube-based AN/FSQ-7 was for years the basis of the USAF SAGE air defense network.

Tube reliability improved radically over the 15-20 years tube computers were a thing; it had to. And just like you point out about silicon, reasonable thermal management became recognized as important to tube reliability, and designs changed accordingly. MTBF was lower than a modern computer's, but they certainly ran for days or weeks and more. And debugging was usually fairly quick: you ran some diags that pinpointed the module (not the single tube) that failed and replaced the whole thing.

I have an acquaintance with a Bendix G15 that still runs. Admittedly, the G15 is much simpler than an IBM 700, but it's a nearly 65 year old tube machine.


We could use telegraph relays instead of vacuum tubes - they might offer better reliability and repairability.


Except that mechanical contacts are the bane of all things electrical. Vacuum tubes are lightyears ahead of relays in this regard.


Repairability, yes. Reliability, no.


I forget which book it was (maybe "The Three-Body Problem"?), but there was a science fiction story where a Chinese king makes his soldiers act as logic gates and his army becomes a computer. I was like, wow, I hadn't thought about that, but it totally makes sense!!


That is The Three-Body Problem, and, while avoiding spoilers, it's not exactly a Chinese king.


There's an XKCD for everything :-) https://xkcd.com/505/


In that case, if you want a somewhat entertaining, very-high-level overview of what would need to be done, there's a manga that showed this off a few chapters ago: it's called Dr. Stone. What stuck with me the most was that the purity needed for the silicon used in processors was absurdly high, so much so that they couldn't quite manage it just yet, so they made a processor out of parametrons and used magnetic core memory. I knew semiconductors had to be very pure, but it was a bit discouraging to realize just how much effort it would take if you started from zero.


Dr. Stone is great, but I also found it a bit too hand-wavy. In real life you can't just build steam engines with a small village's worth of labor plus a "master craftsman". Mining, transporting, and refining iron ore alone is a huge task that could easily consume every drop of the village's labor and still not produce much iron. Fuel is also a huge task: unless you have a high-quality coal mine nearby, you have to make charcoal, which is also very labor-intensive (see: https://www.youtube.com/watch?v=GzLvqCTvOQY). I just can't fathom how Senku realistically makes processors unless he has a nation-state's worth of labor at his disposal.

But yeah, it is a fun "what if".

"What if a super genius with the entirety of wikipedia in his brain were sent back to the stone age? Could he rebuild modern society?"


IIRC a key reason steam engines were not used earlier, despite the concept being known for at least a millennium, was the requirement for quite advanced metallurgy: you can make a nifty proof of concept from copper or iron, but a useful steam engine needs to be (a) relatively high pressure and (b) large, so you can only do it if you can reliably and cheaply make large quantities of decent steel. If you can't make large quantities of steel, your steam engine doesn't work; if your steel-making process has unpredictable results, then your boiler blows up at a weak spot; and if that steel is expensive, you're better off having the same people work a literal treadmill instead of building a steam engine.


At least with iron, you'd have the benefit of already-refined metal lying all around you in a post-apocalyptic setting. There's little need to actually mine iron ore anymore if your population has been reduced by 99% or more; you can walk down any abandoned street and find sources of iron and other metals. Now, there's still the refining process (though it would be shorter starting from something already processed) and fuel to contend with.


Also, making glass is not just combining sand and seashells and fire.

I don't doubt that they could have made glass plates or something, but they start turning out vacuum tubes and borosilicate beakers next to each other like it was all a matter of knowing the recipe.


>the purity needed for the silicon used in processors was absurdly high

Yes. Silicon wafers are cut from a monocrystalline boule, a single flawless silicon crystal with no defects or inclusions: a big chunk of silicon atoms and nothing else at all. (Doping happens later.) To the extent any physical object can be called "perfect", a semiconductor wafer is perfect.

(Of course after manufacturing it will start picking up embedded hydrogen and helium atoms from cosmic rays and alpha particle background radiation.)


Wow, I had no idea Dr. Stone was that hardcore. That sounds watchable!

edit: I didn't notice that you were talking about the manga. That makes more sense. Sounds highly readable!


The first computing machines used relays, which are electromechanical switches. Current flows into an electromagnet, which pulls a switch closed and completes a circuit, thereby switching something "on". By placing these switches together in different configurations you can form the equivalent of logic gates.

Sometimes insects or moths would get stuck in the relays, which would screw up the system. This is popularly said to be the origin of the word "bug."

Prior to incorporating logic into electronics, computing machines were hand-cranked or motor-driven gear machines. See: https://www.youtube.com/watch?v=fhUfRIeRSZE. The YouTube video is literally a hand-cranked portable calculator.

The world you envision has already existed.
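
As a toy Python model of the idea (contacts in series act like AND, contacts in parallel like OR, and a normally closed contact gives NOT), here's how a rack of relays becomes a half adder:

    # Each boolean stands for a relay coil being energised; the contact
    # wiring gives the logic.
    def AND(a, b): return a and b   # two normally open contacts in series
    def OR(a, b):  return a or b    # two normally open contacts in parallel
    def NOT(a):    return not a     # a normally closed contact

    def half_adder(a, b):
        # XOR plus AND, wired the way a relay rack would do it
        total = OR(AND(a, NOT(b)), AND(NOT(a), b))
        carry = AND(a, b)
        return total, carry

    print(half_adder(True, True))   # (False, True): 1 + 1 = binary 10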


The use of the term "bug" in engineering predates automatic computers by nearly a century; the Wiki article [1] on the topic gives a pretty good summary of its history.

[1] https://en.wikipedia.org/wiki/Bug_(engineering)#History


> The YouTube video literally is a hand cranked portable calculator.

It can even do square roots?! That's amazing. And it fits in the palm of your hand!

Now we're down to specks of sand calculating so fast they melt without cooling. Seriously wtf.


Then the next question is: what would you do with it?

You need a source of problems to solve, and until you've bootstrapped the rest of society at least to the point where things like high-resolution trigonometric tables, desktop publishing, or high-speed accounting (for example) are needed, the effort isn't going to keep you fed...


Calculate ballistics, like some of the original computers were created for? It's never too early post-apocalypse to start thinking about the next war.


I agree it wouldn't be high on the list, but I also imagine there would be practical needs, like command and control. So, voice-only radios first, but some sort of messaging that doesn't need a live listener on the radio would be a nice next step. And that could be done with a simple computer.


No need for electronics whatsoever - mechanical computing is a sufficiently advanced engineering discipline, as, incidentally, is fluidics!

https://en.wikipedia.org/wiki/Fluidics


Looking at it historically, you have a bunch of options for a pre-IC computer; there were lots of pre-IC computers. Transistors, of course, or vacuum tubes give you a useful computer. You can build a computer from relays, but the performance is pretty bad. Memory is also very important; magnetic core memory is the way to go if you don't have ICs. None of this is going to help you if you were sent back to the dark ages, though.

As far as mechanical devices, mechanical calculating machines didn't arise until the late 1600s and weren't reliable for many years. It's unlikely that you'd be capable of building a mechanical computer until the industrial revolution. Note that Babbage was unsuccessful in building his machines even in the late 1800s.

If your goal is to build a Turing-complete machine of some sort, even if totally impractical, you could push the date back a lot. But that would be more of a curiosity than a useful computer.


For arithmetic, the pinwheel calculator (aka "Odhner arithmometer") [0] is a pretty decent and reliable mechanical device. You can even give it an electric motor to do the rotations for you, and a numerical keyboard.

[0] https://en.wikipedia.org/wiki/Pinwheel_calculator
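
The whole trick of those machines is repeated addition plus carriage shifts. A tiny Python sketch of what the operator is doing when multiplying (purely illustrative, no attempt to model the carry mechanism):

    def pinwheel_multiply(setting, multiplier):
        # Set the pinwheels to `setting`; for each digit of the multiplier,
        # crank that many times, then shift the carriage one decimal place.
        accumulator, shift = 0, 0
        for digit in reversed(str(multiplier)):
            for _ in range(int(digit)):         # one crank turn per count
                accumulator += setting * 10 ** shift
            shift += 1                          # shift the carriage
        return accumulator

    assert pinwheel_multiply(358, 214) == 358 * 214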


You can even build a binary machine without electronics, have a look here: https://en.wikipedia.org/wiki/Z1_(computer)


CollapseOS is a Z80-based Forth that is targeted at bootstrapping computing from scavengeable components in old electronics.

https://collapseos.org/


Relay computers are relatively simple to make, and require just electromechanical relays.

Some semi-random examples:

https://web.cecs.pdx.edu/~harry/Relay/

https://relaycomputer.co.uk/

The main issue is memory. It takes a lot of space to make any usable amount of memory out of relays.
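
A back-of-the-envelope sketch of why (the relays-per-bit figure below is just an assumption for illustration; real designs varied):

    # One stored bit needs at least one self-holding relay (its own contact
    # keeps the coil energised) plus relays to gate reads and writes.
    class RelayBit:
        def __init__(self):
            self.energised = False
        def set(self):
            self.energised = True    # holding contact closes, the bit sticks
        def reset(self):
            self.energised = False   # break the holding circuit

    RELAYS_PER_BIT = 4               # assumed figure: storage + read/write gating
    words, word_size = 1024, 16      # a tiny 2 KiB memory
    print(words * word_size * RELAYS_PER_BIT, "relays")   # 65536 relays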


On that note, I have wondered on several occasions whether it would have been technologically possible to build neon-lamp logic circuits in Babbage's time. Aside from the problem of building an air liquefier a few decades early, I don't see any really major technological hurdles there. That would have nicely solved his problems with mechanical manufacturing...


Good point!

I used to play that same thought experiment with more basic utilities, like my toaster with its various settings and electronic controllers. Then I was given a Dualit. No more philosophical dilemmas!

Kidding aside, it's always staggering how far removed we really are from operating on (humanly) first principles. Humbling.


There are many people on the internet researching how basic things can be made in a low-tech fashion. I particularly enjoy https://simplifier.neocities.org/ for example.

But if you read those blogs you still notice the mind-boggling height of the giants whose shoulders the bloggers stand on. Having access to simple chemicals like acids or various salts, for example, is huge. I wouldn't even know where to start if I had to bootstrap a high-school chemistry kit with nothing but my hands and my knowledge.


The Dr. Stone manga provides an interesting perspective on how you'd bootstrap that chemistry kit.


219 chapters and ongoing. Whew. This confirms my suspicion that bootstrapping is really hard.


Whoa this is great! Thanks for the link!


I mean, you can apparently trick swarms of crabs into being logic gates. https://phys.org/news/2012-04-scientists-crab-powered.html


Babbage's Analytical Engine comes close, and doesn't even use electricity.


In the dark ages you can build gears. Gears can do arithmetic and calculus.
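
To be concrete about the arithmetic part: a difference engine tabulates a polynomial with nothing but repeated addition, via the method of finite differences, which is exactly the kind of thing gears are good at. A quick Python sketch for p(x) = x^2 + x + 41:

    def difference_table(start, count):
        # Tabulate p(x) = x*x + x + 41: after priming the first couple of
        # differences, every further entry needs only two additions.
        def p(x):
            return x * x + x + 41
        value = p(start)
        d1 = p(start + 1) - p(start)                     # first difference
        d2 = p(start + 2) - 2 * p(start + 1) + p(start)  # second difference (constant)
        table = [value]
        for _ in range(count - 1):
            value += d1        # addition only from here on
            d1 += d2
            table.append(value)
        return table

    assert difference_table(0, 5) == [41, 43, 47, 53, 61]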


The catch is that there are tolerance issues. Doron Swade's account of building the two existing Difference Engine #2 models (http://www.amazon.com/exec/obidos/ASIN/0670910201/donhosek) is a good example of where the challenges lie. It was just barely possible to do with 19th century technology. Physical mechanisms deviate from theory by quite a bit.


The Antikythera mechanism was built before the Dark Ages, and he only wanted to go back to the Dark Ages, so precision work of that kind would already be within reach; he could probably even build a battleship fire-control system by then.

edit: https://www.youtube.com/watch?v=gwf5mAlI7Ug


Now he just needs to build a battleship to go with it!

