
I love retrocomputing. It completely flips the idea of "outdated consoles" on its head. Instead of being less valuable because there are better options, each one is instead a forever-immutable computer for developers to target.

The consoles already produced will last many more years (I have an N64 still kicking, running new ROM hacks on an Everdrive). And even if every bit of hardware becomes defunct, the emulators will live on, preserving that architecture in an immutable state forever.

I used to think "what's the point of creating new software for old consoles?", but once I reframed them this way, I find them as exciting as, or more exciting than, writing software for modern hardware.






Completely agree and I'll go a bit further. I see old consoles and computers as the only VMs that will last. In 500 years, assuming we all survive, I believe it's more likely that code targeting the NES will be runnable than code targeting today's browsers, .NET or the JVM. The reason is that while these competing VMs are well documented, they are extremely complex, and code targeting them tends to rely on idiosyncrasies of current operating systems, browsers and even hardware.

Also, the retro hardware itself is the ultimate documentation. You can look at every chip using (nowadays) accessible equipment and create an absolutely perfect replica in software or FPGA. VM documentation, however, can contain inaccuracies.

My speculative fiction statement of the day is that only software targeting relatively simple architectures will stand the test of time.


Another aspect of this is the relative simplicity of the toolchain.

On my 10-year-old PC (Core i5 something or other) I can build a cross assembler and C compiler for the Amiga in around 21 seconds (vbcc, Vasm, Vlink).

I can build the same on the Amiga itself in minutes (admittedly quite a lot of minutes!) rather than hours.

Meanwhile, I read recently of a build of LLVM on a RISC-V SoC taking well over 12 hours.


Not to mention that so much modern software will fail because it requires network connectivity to non-existent services.

Isn't this general idea what the 100 Rabbits people theorize? i.e., with uxn and all that.

I had never heard of it, so I went to find out more. Yes, I think it's the same philosophy. Very cool project.

Unfortunately, it seems that they went against the project's very principle by inventing a new language, new VM and toolchain instead of simply targeting one of the existing platforms. I can see why they wanted to build an abstraction layer, so their code is portable across different classic (and modern) platforms, but this is one more case of https://xkcd.com/927/.

It would have been totally okay if they had said "The future of software is creating applications targeting the Amiga 1000", or anything else from the 80s-90s, which I'll arbitrarily define as "simple enough to accurately emulate forever".


> Unfortunately, it seems that they went against the project's very principle by inventing a new language, new VM and toolchain instead of simply targeting one of the existing platforms.

The intentions of Uxn are not directly in line with using, say, a Commodore 64 for preservation and as a portability layer, which is a monumental project to implement for each new system. The project's core principle is to design something perfectly tailored to hosting a handful of specific programs, document it in a way that, if needed, others could create their own systems based on their own vision, and not centralize all preservation efforts around a handful of retrocomputing emulators.

https://wiki.xxiivv.com/site/devlog

https://100r.co/site/weathering_software_winter.html

It's more akin to using brainfuck or subleq, or Another World's VM, or even Alan Kay's Chifir, where the goal is to target a virtual machine that is so small (< 100 LOC) that it can be easily ported, as opposed to a system so complex that it might take someone months to implement a passable C64, Amiga, or ST80 emulator.

https://tinlizzie.org/VPRIPapers/tr2015004_cuneiform.pdf
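To make the scale concrete, here's a toy sketch (in Python, purely illustrative; it is not Uxn, Chifir, or any real spec) of the kind of fully specified stack-machine interpreter that fits well under 100 lines and could be re-implemented from its documentation in an afternoon:

    # Toy stack-machine interpreter, illustrative only. Every behavior
    # (wrap-around arithmetic, exhausted input, undefined opcodes) is
    # pinned down so any reimplementation behaves identically.
    def run(program, input_bytes=b""):
        stack, memory, out = [], bytearray(256), bytearray()
        pc, inp = 0, list(input_bytes)
        while pc < len(program):
            op = program[pc]; pc += 1
            if op == 0x00:                          # HALT
                break
            elif op == 0x01:                        # PUSH <literal byte>
                stack.append(program[pc]); pc += 1
            elif op == 0x02:                        # ADD, modulo 256
                b, a = stack.pop(), stack.pop()
                stack.append((a + b) & 0xFF)
            elif op == 0x03:                        # LOAD: pop addr, push memory[addr]
                stack.append(memory[stack.pop()])
            elif op == 0x04:                        # STORE: pop addr, then value
                addr, val = stack.pop(), stack.pop()
                memory[addr] = val
            elif op == 0x05:                        # OUT: pop value, append to output
                out.append(stack.pop())
            elif op == 0x06:                        # IN: next input byte, or 0 when exhausted
                stack.append(inp.pop(0) if inp else 0)
            elif op == 0x07:                        # JNZ <addr>: pop value, jump if nonzero
                target = program[pc]; pc += 1
                if stack.pop() != 0:
                    pc = target
            else:
                raise ValueError(f"undefined opcode {op:#04x}")
        return bytes(out)

    # push 40, push 2, add, output -> the single byte 42
    print(run(bytes([0x01, 40, 0x01, 2, 0x02, 0x05, 0x00])))   # b'*'

The instruction set itself doesn't matter; the point is that a spec this small leaves nowhere for undefined behavior to hide.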

Other related projects:

https://dercuano.github.io/notes/uvc-archiving.html


Thank you for the great links!

It can be really hard to get accurate emulation with a somewhat loosely defined, high-level VM such as some of your examples. If it's that small and simple, programmers might accidentally create a dependency on implementation idiosyncrasies. Just look at what happens in the retrocomputing scene when emulators aren't perfectly cycle- and hardware-accurate: applications that rely on very low-level details of the architecture (demoscene productions being a good example) don't work on certain emulators.

If we want to create a new "forever VM", that VM would have to very strictly define behaviors across I/O, graphics, audio and other areas. I don't want the application to stutter or run too fast on future emulators. I want the emulation to be perfectly cycle accurate.
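For what it's worth, here's a hedged sketch (hypothetical constants and opcodes, not any real spec) of what that kind of strictness could mean for timing: the spec fixes the cycle cost of every instruction and the exact number of cycles per frame, so every conforming emulator advances the machine identically, no matter how fast the host is:

    # Hypothetical fixed timing model: cycle costs and cycles-per-frame
    # come from the spec, never from the host machine's speed.
    CYCLES_PER_FRAME = 10_000                      # fixed by the (imaginary) spec
    CYCLE_COST = {"NOP": 2, "ADD": 3, "OUT": 4}    # per-opcode costs, also fixed

    def run_frame(program, pc, carry):
        """Run exactly one frame's worth of cycles; return new pc and leftover cycles."""
        budget = CYCLES_PER_FRAME + carry
        while budget > 0:
            op = program[pc % len(program)]
            pc += 1
            budget -= CYCLE_COST[op]
        return pc, budget                          # negative remainder owed to the next frame

    # Any two implementations agree exactly on how far execution has
    # progressed at the end of every frame.
    pc, carry = 0, 0
    for frame in range(3):
        pc, carry = run_frame(["NOP", "ADD", "OUT"], pc, carry)
        print(frame, pc, carry)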


Exactly, you get it. That's the goal of the project: no undefined behavior, no hazy specifications. :) I've dabbled in this space for quite a while now, and I can assure you that dependencies on implementation idiosyncrasies get increasingly worse with complex VMs.

> code targeting the NES

Well, the 6502 in general. It's an extremely important processor in history. People want to emulate NES, Commodore, Atari 8-bit, Apple II - so an accurate 6502 emulator has a HUGE base of nostalgic geeks to improve it.


Yes, and we have freely available, cycle-accurate 6502 emulation code bases along with full-coverage tests that include both the official opcodes and many illegal ones, mostly the ones people actually used, such as LAX, which loads the same value into both the accumulator and the X register on a single read.
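As a concrete illustration (a hedged sketch, not lifted from any particular emulator), that behavior is trivial to model in software once you know the byte read from memory simply lands in both registers:

    # Sketch of how an emulator core might model the unofficial LAX opcode:
    # one memory read, and the value is latched into both A and X, with the
    # N/Z flags set from it -- as if LDA and LDX had run simultaneously.
    class CPU6502:
        def __init__(self, memory):
            self.memory = memory
            self.a = 0      # accumulator
            self.x = 0      # X index register
            self.p = 0      # status flags (bit 7 = N, bit 1 = Z)

        def set_nz(self, value):
            self.p = (self.p & ~0x82) | (value & 0x80) | (0x02 if value == 0 else 0)

        def lax_absolute(self, addr):
            value = self.memory[addr]   # a single read drives the data bus...
            self.a = value              # ...and both registers latch the same value
            self.x = value
            self.set_nz(value)

    mem = bytearray(0x10000)
    mem[0x1234] = 0x80
    cpu = CPU6502(mem)
    cpu.lax_absolute(0x1234)
    print(hex(cpu.a), hex(cpu.x), hex(cpu.p))   # 0x80 0x80 0x80 (negative flag set)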

I just had a fleeting thought right now related to that behavior:

Basically, that opcode works because of how simple the design is. Electrically, wiring both registers up to the bus does the trick. And many parts of the chip can work together like that even though none of it was intended.

I wonder what a revisit, that takes these now well understood behaviors into consideration, would look like!

Zeroing all or multiple registers, same with bit ops, maybe inc, dec...

Many new, efficient instructions are possible. It would be a fun programming exercise, and a fun design exercise too.


I think it has less to do with the architecture or medium and more to do with the type of media itself.

We are more likely to actively preserve art in usable forms than something like JIRA. We play Chess more than we use whatever contemporary accounting tool. We read old novels more than we read transcripts of business meetings.

So we'll be more motivated to keep these architectures and games continually preserved.


Hardware does fail over time, apparently due to stuff like leaky capacitors. Enthusiasts and repair shops can keep a few alive for a while, and shops like PCBWay may produce replacement parts.

That said, I think your point about emulators is very on point because they facilitate experiencing these long after hardware becomes impractical. And folks can try a huge variety of games without a lot of travel or shipping. I'm also curious to try some FPGA solutions, especially if they can support save states.


In my experience, relatively few of the components on old computers and game consoles are particularly failure-prone, and most of them can have a future contingency plan:

- Electrolytic capacitors can be replaced relatively easily. Some people are replacing them with solid state capacitors to try to improve reliability and avoid corrosion from leaked electrolyte.

- Batteries likewise can be replaced easily, and you can usually fit a socket in there if there isn't already one.

- Damaged traces on PCBs (usually caused by leaky capacitors or batteries) can often be patched. It is definitely not the easiest work, but if patching a few traces is all that's needed to get something back into working order, then it's probably worth it.

- Some of the old AC-DC transformer blocks are dying or were horribly inefficient anyway. Most of them are outside of the machine and can be replaced with readily available modern equivalents, so this one is extremely easy. For old computers, ATX supplies are easy to adapt to pre-ATX standards, and even to some different machines entirely, since they provide most of the commonly needed voltage rails (some new supplies lack -12V, but it will be listed in the PSU specs either way). There are even very small form factor ATX supplies using GaN transistors that can fit pretty much anywhere.

- CD-ROM lasers are definitely starting to wear down, but there's quite a lot of optical drive emulators available nowadays for a variety of machines, with more showing up every year. As long as small-order PCB manufacturers remain around, it will probably remain viable to make more of these ODEs.

- Likewise, floppy disks and their drives can fail for a variety of reasons, but floppy drive emulators are at the point of reasonable maturity and can support a lot of machines, too. I'm sure there are some weird standards where emulation may be spotty (thinking of NEC), but for typical Macs, PCs, and Commodore computers I imagine most of the ground is covered already.

It is true that a lot of hardware is failing and some of it is not so easily replaced, but honestly, I think if we wanted to, we could keep a good amount of the retro hardware working for possibly hundreds of years longer. The real question is if enough people will deem it worth their time and money to do so. But then again, I suppose it's not much different in that regard from vintage automobiles.

There will always be a place for emulation, probably a much larger one at that. Not only does emulation give a very nice long-term solution for keeping software libraries accessible, it also offers plenty of advantages over actually using old hardware, and it's obviously a lot more accessible.

P.S.: to whoever next comes into possession of the machines I worked on, I apologize for my soldering. In fairness, some of these old boards are stubborn even with a ton of flux.


The optical drive emulators are great but sad: being able to play games with original discs is half the fun, IMO.

>- Some of the old AC-DC transformer blocks are dying or horribly inefficient anyways. Most of them are outside of the machine and can be replaced with readily available modern equivalents, so this one is extremely easy.

Except the Commodore ones, which fry the computer when they stop working.


I believe I ordered the Commodore 64 power supply I'm using from c64psu.com. I did not evaluate the quality in any way, but years down the line it hasn't failed me. So at least working replacements are available, and while they're not necessarily cheap, it's probably worth it considering you're definitely right that the Commodore ones tend to fail in pretty ugly ways.

re: FPGA, the Analogue Pocket with an Everdrive also works great. I use it to play GB/GBC/GBA games on my TV via its dock. Sadly, those don't support sleep and such, but it's still worth it.

Why bother with an Everdrive on an Analogue Pocket? Mine has only seen a cartridge once, and only because my partner happened to have an old GB game she wanted to see if her save had survived on; otherwise it runs entirely off SD.

I guess it's mostly because I already had it? That's a good point though.

Not a bad reason, especially if you have saves on it. I'm sure there's a way to transfer them and make it work but I wouldn't be surprised if it wasn't perfectly straightforward so I can totally see not finding it worth the effort.

I never had a Game Boy; back in the day I was the kid tethered to the power cord with his Game Gear. I almost bought the adapter, but then I realized none of my games had saves so I had no reason to want to use the real carts.


The save transfer is very simple. You just copy them to the correct folder.

I already had a GBA Everdrive but I picked a GB/C one up because for about a year after release, the FPGA cores didn't support the display emulation features of the Pocket, which is a good chunk of the reason I bought mine. I believe they are now all fully supported though. The Pocket's sleep doesn't really work with flash carts either, so arguably the core + SD card route is now the better option, though I still own my AGS-101 model SP and GB micro, and it's cool to play on those still sometimes.

Real-time clock support for GBA games that need it. The openFPGA core doesn't support it.

There are also companies like Analogue who are producing high quality clones which will keep gamers (mostly) happy for at least another generation.

Analogue makes FPGA-based emulators. These are pretty cool because they can eliminate a number of downsides of software game console emulation while still retaining some of the upsides, and versus pure hardware clones, they can be updated and patched, either to fix bugs or to add new targets. Another bonus of FPGAs is that they're accessible to hobbyists. I don't know what the current preferred solution is, but a while ago people were buying DE10-Nano boards for running MiSTer, which could support a large variety of cores.

MiSTer is going strong, and the recent release of the excellent-but-confusingly-named MiSTer Pi [1] board has brought the cost of entry down considerably. After I do some more SD card shuffling and verify game compatibility, I plan on selling my N64 and Saturn and using OEM controllers with my MiSTer Pi. Unfortunately the Altera Cyclone FPGA in it doesn't have the horsepower to run anything beyond fifth-gen systems, so a new platform would be required for the Dreamcast, GameCube, and PS2. The common refrain from FPGA enthusiasts regarding these systems is that because the hardware has many more layers of abstraction, they're less dependent on cycle-accurate timing than older consoles, so the juice might not be worth the squeeze as far as building HDL cores for them is concerned.

[1] https://retroremake.co/pages/store


> they're less dependent on cycle-accurate timing than older consoles so the juice might not be worth the squeeze as far as building HDL cores for them is concerned.

There's also the question of the huge amount of engineering effort required to recreate the more advanced platforms.

The Replay2 board should provide both a much more capable FPGA and loads of RAM bandwidth to go with it. (Finishing touches are apparently being made to the prototype board layout, and production is slated for the spring.)

And for anyone who thought FPGA gaming was a new phenomenon, work on the original Minimig FPGA recreation of the Amiga started in January 2005 - 20 years ago!


It's worth noting that while they do eliminate many of the drawbacks of emulation, FPGA solutions are still not 1:1 recreations of consoles.

Yeah, totally fair. I was actually debating whether "clone" was appropriate. I've also struggled with how to explain how my Pocket works to friends who don't know what FPGA means. "It's like emulation but ..."

The DC will be around for a long time, but games won't be played from optical discs much longer. It used a format that was double the density of the Yellow Book standard, and when my drive failed, it was much cheaper to replace it with a flash-based option than to buy a replacement drive.

Yes, optical drives tend to fail. A couple of days ago I tried to play with my old GameCube that had been in storage for at least 10 years, and surprise, it doesn't read discs anymore.

On the other hand, a few months ago I bought a Saturn and despite being much older, it works flawlessly. That thing was built like a tank...


I've got one of those; I bought and installed it a few years ago. Unfortunately it doesn't work with all games. Maybe that has been fixed, but it was very annoying to find that Hydro Thunder didn't work!

If you're using an HDMI converter, then that's most likely your reason for Hydro Thunder not working. Hydro Thunder notoriously doesn't work well with VGA output (HDMI converters for the Dreamcast usually convert the VGA signal).

I figured out how to fix Hydro Thunder's compatibility issues last year (it would very briefly change the pixel clock). You can either use a Codebreaker code to fix the game if you're running from a real disc, or if you use an ODE, you can patch the disc image.

https://www.dreamcast-talk.com/forum/viewtopic.php?f=5&t=165...


This reminds me, I remember back in the early 2000s hex editing games to force 60 Hz on my PAL Dreamcast. This looks like a similar idea. Thanks.

To be honest I now pretty much just play Dreamcast games via the redream emulator on my Steam Deck and the PC under my TV, though obviously there are drawbacks with emulating. Every so often I bother to set up my Dreamcast with my OSSC and play on real hardware.


VGA to my OSSC is usually how I connect to my TV. But yes, I believe you are right, it's a VGA issue too.

From my limited understanding, the Dreamcast also has one of the simplest architectures of that generation (since Sega learned from the monstrosity that was the Saturn), which I would think would make it a good target for homebrew. The GPU was also an early PowerVR design, so maybe contemporary mobile GPU expertise can be leveraged instead of trying to target the GameCube or PS2's proprietary architectures.

Of course the OG Xbox is probably simpler to port a PC game to, since it basically _was_ a PC, so it doesn't really count in this discussion.


The people who mod retro machines into more “modern” formats are pretty wild on the hardware side as well.

This nut job decided to make a portable PS4: https://youtu.be/bJSLscnFd_M


Excellent, did you try SM64 - Through the Ages? Great custom level design.

It's on my list :) I kinda wanna replay SM64 first for a comparison.

I replayed Majora recently but insisted on replaying OoT first for similar reasons.

EDIT: idk now thinking about it again, I think I'll just play it. I want something pure and fun to play. Sadly it doesn't play on original N64 but alas. I can always use my Steam Deck :D


Also reminds us that "line goes up" is often more perceptual FOMO than a real technical milestone.


