> Yes - some things go out of fashion for a while, but trends almost always cycle back.
Exactly, this is even supported by Nintendo's own services offering emulation of their older systems. There is clearly demand for the ability to play older games.
Capitulation to an "inevitable" fate of download only games is just taking the easy way out by not sticking to your own core values. I have personally pre-ordered a Switch 2, but I will not be purchasing any online only cartridges or download only software.
We haven't had the watershed moment that brings it into focus for gamers at large yet; The Crew was close. But Nintendo has kept the download servers going for all of their systems, which has provided a false sense of security. Once those start being shut down maybe we'll see some actual response. Though with the introduction of GameCube emulation on the Switch 2, they are only a small step away from emulating the Wii and giving people another scapegoat for their lazy acceptance of lack of ownership.
I have a project that uses a proprietary SDK for decoding raw video. I output the decoded data as pure RGBA in a way FFmpeg can read through a pipe to re-encode the video to a standard codec. FFmpeg can't include the non-free SDK in their source, and it would be wildly impractical to store the pure RGBA in a file. So pipes are the only way to do it; there are valid reasons to use high-throughput pipes.
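In case it's useful to anyone, the plumbing on my side is roughly this (the frame size is made up and sdk_decode_next_frame() is a stand-in for the proprietary SDK call, which I obviously can't share):

    /*
     * Write raw RGBA frames to stdout; FFmpeg reads them from the other
     * end of the pipe, e.g.:
     *
     *   ./decoder | ffmpeg -f rawvideo -pixel_format rgba \
     *       -video_size 1920x1080 -framerate 30 -i - -c:v libx264 out.mkv
     */
    #include <stdio.h>
    #include <stdlib.h>

    #define WIDTH  1920   /* placeholder dimensions */
    #define HEIGHT 1080

    /* Stand-in prototype for the proprietary decoder; returns 0 when done. */
    extern int sdk_decode_next_frame(unsigned char *rgba_out);

    int main(void) {
        const size_t frame_bytes = (size_t)WIDTH * HEIGHT * 4;  /* RGBA */
        unsigned char *frame = malloc(frame_bytes);
        if (!frame) return 1;

        while (sdk_decode_next_frame(frame)) {
            /* One full frame per fwrite; FFmpeg re-slices by frame size. */
            if (fwrite(frame, 1, frame_bytes, stdout) != frame_bytes) {
                perror("fwrite");
                break;
            }
        }

        free(frame);
        return 0;
    }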
What percentage of CPU time is used by the pipe in this scenario? If pipes were 10x faster, would you really notice any difference in wall-clock time or overall CPU usage while this decoding SDK is generating the raw data and ffmpeg is processing it? Are these video processing steps anywhere near memory copy speeds?
You have the dependency either way, but if you use the library you can have one big executable with no external dependencies and it can actually be fast.
If there wasn't a problem to solve they wouldn't have said anything. If you want something different you have to do something different.
It looks like FFmpeg does support reading from sockets natively[1]; I didn't know that. That might be a better solution in this case. I'll have to look into some C code for writing my output to a socket to try that some time.
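Something like this is what I have in mind for the socket side (untested sketch; the port and frame size are arbitrary), with FFmpeg listening on its tcp:// input:

    /*
     * Connect to a TCP port and write the raw RGBA stream there instead of
     * stdout. FFmpeg would listen on the other end with something like:
     *
     *   ffmpeg -f rawvideo -pixel_format rgba -video_size 1920x1080 \
     *       -framerate 30 -i "tcp://127.0.0.1:9000?listen=1" \
     *       -c:v libx264 out.mkv
     */
    #include <arpa/inet.h>
    #include <stdio.h>
    #include <string.h>
    #include <sys/socket.h>
    #include <unistd.h>

    /* Returns a connected socket fd, or -1 on failure; write() frames to it. */
    int open_ffmpeg_socket(const char *ip, int port) {
        int fd = socket(AF_INET, SOCK_STREAM, 0);
        if (fd < 0) { perror("socket"); return -1; }

        struct sockaddr_in addr;
        memset(&addr, 0, sizeof(addr));
        addr.sin_family = AF_INET;
        addr.sin_port = htons((unsigned short)port);
        if (inet_pton(AF_INET, ip, &addr.sin_addr) != 1) { close(fd); return -1; }

        if (connect(fd, (struct sockaddr *)&addr, sizeof(addr)) < 0) {
            perror("connect");
            close(fd);
            return -1;
        }
        return fd;
    }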
At some point, I had a similar issue (though not related to licensing), and it turned out it was faster to do a high-bitrate H.264-encode of the stream before sending it over the FFmpeg socket than sending the raw RGBA data, even over localhost… (There was some minimal quality loss, of course, but it was completely irrelevant in the big picture.)
No, because I had hardware H.264 encoder support. :-) (The decoding in FFmpeg on the other side was still software. But it was seemingly much cheaper to do an H.264 software decode.)
Look into the Domesday Duplicator project for LaserDiscs as an example of how what ssl-3 is talking about can be done using a high-sample-rate capture. That exact process is possible, and with enough storage and processing power it can be used to get the most "low level" access to the data. It is not for the faint of heart though, and can take around 1TB of storage and hours of CPU time to process full movies this way; I know because I've done it.
I believe I've seen work being done to attempt this on CDs, but it was still in the exploratory phase and not yet ready to start archiving with. It might seem like overkill to do this to something meant to be digitally addressed, but I've experienced enough quirks with discs and drives when ripping that I would 100% be willing to switch over to a known complete capture system to not have to worry about it anymore. Post-process decoding also allows for re-decoding data later if better methods are found.
Even BIN/CUE is not enough. It cannot store subchannel data like CD+G and is only able to hold a single session, which breaks Blue Book CDs with audio and data.
We do not currently have a widely supported image format that can properly hold everything on a CD. Aaru [0] is close, but it still has to output back to other formats like BIN/CUE to use the contents of the disc.
I'm curious if you have a specific example of an album with the crowd noise between tracks like that? I collect and rip hundreds of CDs and am always on the lookout for edge-case discs to further hone my tools.
On your pregap + 99 indexes remark: the "pregap" is the space between index 00 and index 01, and the indexes themselves continue on up to 99. Players seek to index 01 as the start point of the track; there is no separate pregap designation. I've paid special attention to this because it is a difficult problem to solve: many discs have space between tracks stored in index 00-01, but rarely is there anything audible in there after the first track. The only example I have of this is a specialty music sample disc, Rarefaction's A Poke In The Ear With A Sharp Stick, which has over 500 samples on the disc accessed by track + index positions.
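For anyone unfamiliar, here's a made-up cuesheet fragment showing how that looks; index 00 marks the gap before the track, players seek to index 01, and anything above that is a sub-position within the track:

    FILE "disc.bin" BINARY
      TRACK 01 AUDIO
        INDEX 01 00:00:00
      TRACK 02 AUDIO
        REM 00 -> 01 is the gap; players seek to INDEX 01
        INDEX 00 04:10:00
        INDEX 01 04:12:32
        REM higher indexes are addressable sub-positions (up to 99)
        INDEX 02 05:02:17
        INDEX 03 05:48:05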
As a sidebar based on the later comments in the thread, I've made it a habit to rip and store every audio CD as BIN/CUE+TOC using cdrdao. This allows me to go back and re-process discs I may have missed something on. But even that is imperfect, because it usually breaks Blue Book discs that use multiple sessions to store data, due to absolute LBA addressing. Also, the way different CD/DVD drives handle reading data between index 00 and 01 on track 1 is maddening. Some will read it, some will error, and the worst are those that output fake blank data.
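For anyone wanting to do the same, the kind of invocation I mean is roughly this (your device path will differ, and it's worth double-checking the options against your cdrdao version):

    cdrdao read-cd --read-raw --read-subchan rw_raw \
        --datafile disc.bin --device /dev/sr0 disc.toc
    toc2cue disc.toc disc.cue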
>I'm curious if you have a specific example of an album with the crowd noise between tracks like that? I collect and rip hundreds of CDs and am always on the lookout for edge-case discs to further hone my tools.
E.g. the Japanese version of Flying Lotus' album "Until The Quiet Comes" has a pregap of 5 seconds before the 19th track, to separate it from the rest of the album, as it's a Japanese-exclusive bonus track.
Not sure why I didn't think to mention this before: one tool you might consider is redumper. It's designed in particular to handle multisession CDs, and it attempts to over-read into the disc lead-in and lead-out to catch data outside of the range covered by the TOC (particularly common in older CDs). It only outputs a final split bin+cue, but everything read, including scrambled data, TOC, and subchannel/subcode, is saved for future processing. The bin+cue can be used with IsoBuster (and probably other tools) to access Enhanced CD filesystems. Feel free to reach out if you need some tips; this is what I use for my collection.
Caveat: It is mostly intended for use with the low-level features of Plextor drives, so CD support on other drives is relatively limited; in particular it doesn't have any overlapping read paranoia-style features. The recommendation is to dump twice to confirm; it's running straight through without seeking so that's usually still quite fast.
Seven minute pregap on disc 1 track 4 of https://vgmdb.net/album/5549 , it's a whole long discussion between songs, with some audience cheering. VGMdb follows the "append pregap to previous track" convention, that's why track 3 looks so long. There's similar but shorter gaps with banter on tracks 2 and 7.
Cuesheet looks like:
    TRACK 04 AUDIO
      INDEX 00 00:00:00
      INDEX 01 07:34:43
I became "pro-net neutrality" back in the 2010s when Verizon was trying to charge an extra $20/mo for hotspot functionality on my provider-locked Android phone.
After some rooting and sideloading I was gleefully working around that until the FCC came down on them for it [1]. Net Neutrality was passed after that and seemed like nothing more than a logical response as a means of consumer protection.
It has always been a user-facing issue; it's just that not many people seem to want to expend the energy to think about how it impacts them. Netflix isn't using that bandwidth, the users are. Without users, Netflix would use low/no bandwidth, just as it did when it was renting DVDs. The users are paying for their own access and speeds to be able to watch Netflix over the internet instead, and in turn Netflix is paying their ISP to be able to provide that data.

Punishing either the users or the web hosts for finding a more effective use case for the internet than just serving static pages is the ISPs trying to find a way to blame someone else for having oversubscribed their network, or trying to strong-arm web hosts into paying more because they have regional monopolies and can get away with it.

As a consumer, if I had a choice between two ISPs and one of them was throttling Netflix to try and extort them for more money, even for self-centered reasons I would pick the other just to have better service. But there are a lot of areas where that isn't the case and there is a single major broadband provider who has free rein.
Not OP, but to provide some historical perspective: RTX hardware raytracing is very firmly a gimmick, and it isn't AI nonsense that's going to be the end of it. It's going to go the way of PhysX, 3D Vision, and EAX audio. Cool, but complicated and not worth the effort for game devs. Designers have to make all the lighting twice to fully implement RT, and it's just not worth it to them.
Nvidia's own site[1] lists a total of 8 Full RT compatible games, half of which they themselves helped port. There are far more games that "use" it, but only in addition to traditional lighting at the same time, to minimize dev costs. Based on that and past trends, I would personally predict it to be dropped after a generation or two unless they can reuse the RT cores for something else and keep it around as a vestigial feature.
"Full RT" means the game uses 100% raytracing for rendering (in some mode), which currently needs still far too much power to be a mainstream thing and is only added in a few games to show the prowess of the engine (IIRC a review of the Cyberpunk 2077 Full RT mode only a 4090 is really able to provide the power needed). The important entry is "yes", which shows far more entries and means there's Raytracing enhancements in addition to rasterization.
So, no, it's quite the opposite of what you stated: RT gets more important all the time, is not a gimmick and there's zero reason to assume it will be dropped in the future.
It is a gimmick in that you have to sacrifice far too much performance. An RTX 4080 will need to run at 1080p + upscaling + framegen to get above 60 (!) FPS with ray tracing.
No thank you, I’ll take buttery smooth real 120 FPS at 4K. Especially because games have gotten so good at faking good lighting.
It does look fabulous though. I have a 4090 and absolutely turn RT on for cyberpunk. Even with a 4090 I use upscaling for a good frame rate. But the resulting image quality is just spectacular. That game is really beautiful.
You could argue 4K is a gimmick if you’re sitting at TV distances, but the difference between 60 and 120 FPS is extremely jarring. Try playing at 120 and then mid-session capping it at 60.
I would hardly say it's a gimmick, now that frameworks like Epic's Unreal Engine and others have implemented it for the developer. I don't see these technologies going away. One can hope that Nvidia's dominance lessens over time.
I believe the next big thing is generative AI for NPCs, as soon as the models are optimized for the hardware in the average GPU. Let's see what the next generations from Intel, AMD, and Arm produce. Windows' branding of AI is going to help make this possible. It's going to take years, though, for the market to be saturated with capable enough hardware for developers to pay attention.
You do realize that RT greatly simplifies the job artists and engineers have to do to make a scene look well lit? The only reason it's done twice currently is because GPUs aren't powerful enough yet. RT will simplify game production.
I am deeply steeped in the history of computers and the biggest three things I can point to as the reason (MS-)DOS won are:
- Licensing: Most computers either had custom operating systems that were not shared with other hardware vendors, or, frequently in the case of BASIC, were themselves licensed.
- IBM letting the genie out: The BIOS on the IBM PC 5150 was cloned, quickly and legally, and other companies started making compatibles. This caused an explosion of computer variety in a few short years for a single platform.
- Microsoft: DOS usually means "Microsoft DOS", and Microsoft was also responsible for many of the BASIC environments of early systems. The ability to buy your OS from someone else lowered the pressure on hardware makers. IBM also favoured Microsoft's DOS over CP/M-86 and quickly stopped supporting the latter.
All this meant the PC compatible ecosystem with Microsoft DOS became easy to build from a hardware side, and lacked a single point of failure like Apple, Radio Shack, Commodore, Atari, etc. There were other MS-DOS compatible DOSes out there, but MS-DOS was usually the one shipped with computers to be as "IBM compatible" as they possibly could, and it gained dominance through that.
EDIT: To those who may not be aware, BASIC did become more OS-like before going away. HP BASIC was extremely feature-packed before HP-UX replaced it, and was more capable than MS-DOS in many ways. It evolved far beyond just a programming language.
> This caused an explosion of computer variety in a few short years for a single platform.
The impact of this point cannot be overstated. 99% of businesses make a much larger investment in software (and people!) than hardware. The idea that compatible hardware systems existed was a great hedge on their investment in software and training. For most businesses, this would be a no-brainer!
Over a short time, other proprietary/non-compatible systems were relegated to home use, education, and gaming.