Part of the problem is the reluctance of AAA studios to take big risks. They keep iterating on old ideas that do not take advantage of powerful hardware. The exception to this is photorealistic graphics, but that obviously limits the type of games that can be made. Only games like GTA 6 will be leveraging hardware in this way; for most other games the graphical ceiling has been reached. Most games have more than enough resources to render whatever art style they want, barring photorealistic detail.
The innovation that needs to happen is along the lines of leveraging lots of physics and entities in real time. I'm talking about zombie games with thousands of zombies on-screen, or RPGs with towns that have a realistic amount of NPCs, all of which can be interacted with at some capacity. The problem here is that this currently isn't leveraging GPU as much as CPU resources, but it doesn't have to be that way, especially with the new AI pipelines.
That said, consoles are now glorified gaming PCs. The advantage back in the day was that it provided a platform for developers that was already optimized and designed for games, but software tools have matured enough that this can be achieved in any hardware with engines like Unreal.
You lost me at '... that do not take advantage of powerful hardware.' The risks they should be taking are in making new kinds of experiences ... for existing hardware. There is plenty of room to innovate. They are scared to innovate so you propose that they innovate _and_ require costly hardware upgrades?
I meant in other ways than just pretty graphics. Powerful hardware can be used for game mechanics, adding more interactivity, physics, etc., but AAA mostly just upscales graphics.
Powerful hardware can help innovate in areas other than pretty graphics, but they're not taking those risks. My point is that powerful hardware is mostly on the GPU side today, and it doesn't have to be that way. Games could take advantage of powerful CPUs, but at the moment you mostly just need a powerful GPU. CPUs would be needed for game logic and mechanics, and since there isn't much innovation in that area, they haven't been a requirement. At the same time, GPUs could be leveraged for things other than graphics, like AI-based animation that allows for generative animations.
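To make that concrete, here's a toy sketch (Python/NumPy, made-up numbers, not from any engine) of the kind of data-parallel per-entity update I mean, simulating a big crowd chasing the player:

    import numpy as np

    N = 100_000                                # e.g. a horde of zombies
    rng = np.random.default_rng(0)
    pos = rng.uniform(-500, 500, size=(N, 2))  # 2D positions in metres
    player = np.array([0.0, 0.0])
    dt, speed = 1.0 / 60.0, 2.0                # 60 Hz tick, 2 m/s shamble

    def tick(pos):
        to_player = player - pos                                        # (N, 2) vectors toward the player
        dist = np.linalg.norm(to_player, axis=1, keepdims=True) + 1e-6  # avoid divide-by-zero
        vel = speed * to_player / dist                                  # every entity steers at the player
        return pos + vel * dt

    pos = tick(pos)  # one 60 Hz step for all 100k entities at once

The steering logic is deliberately trivial; the point is that per-entity state laid out as flat arrays is exactly the kind of data-parallel work a GPU compute pass could take over from the CPU.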
Maybe an unpopular opinion, but graphical improvements for me have diminishing returns.
From PS1 to PS2 era the leap in graphics was massive. Suddenly those blocky games with a ridiculously short draw distance looked fairly pretty, and ran much better.
From PS2 to PS3 era it was still quite a leap. Games looked much smoother and better detailed. It still felt like a different generation.
From PS3 to PS4 era it was a decent improvement, but a lot less impactful than before. I mean, games got more photorealistic when they went for that style, environments were maybe more immersive, movement more fluid. Still meaningful, I'd say.
From PS4 to PS5 I feel that the improvement was marginal. Games have better lighting thanks to ray tracing, I think? Overall a little prettier? To this day I question why exactly I bought a PS5, as I got mostly the same experience from the previous generation.
On the other hand, I think most - if not all - AAA devs have kept improving graphics, whereas the gameplay has been very stagnant and uninteresting.
Perhaps that is why I love my Nintendo Switch a lot more than either PS4 or PS5.
Big hardware change for the PS5 was memory and SSD. You can stream in a lot of assets and forego the Mass Effect elevator load screens. I can't think of a GPU innovation that isn't iterative from the PS4 to the PS5
- It seems like Sony chose to showcase "improvements" in games that already look pretty good, but there are other games for which the improvement will be pretty dramatic (I've heard FFVII Remake cited as one). There may be an element there of Sony not wanting to put the worst offenders on blast by showing a side-by-side of how shitty the game currently looks versus how nice it'll look on the Pro, which is interesting.
- Fully agree that unless you own a 4k TV, you probably won't give a damn. I don't and don't.
- I somewhat disagree about graphics plateauing, but I do agree that progression isn't 1:1 with hardware. I recently finished Alien: Isolation, which is an excellent game but showed its age in some areas (crappy smoke effects, uncanny facial movement even in pre-rendered cutscenes, etc). On the other hand, Metal Gear Solid V still looks amazing to me despite being just as old—and both were games that came out simultaneously on PS3 and PS4, so they're equal in that regard too. But PS5 games blow me away in areas that even the best PS4 games didn't: Horizon Forbidden West has mind-blowing water effects, parts of The Quarry (granted, many of them pre-rendered cutscenes) legitimately looked like a real movie, and Returnal is just all-around gorgeous (plus smooth as butter, to boot). Little things like non-shitty textures, objects interacting correctly, and halfway realistic hair really add up.
I think there is something like consistency to the look of a game. A game can be 20 years old and clearly less detailed in every measurable way than a modern title, but still look good, because there is nothing that makes you think "well, they tried here, but it just doesn't look good". Or, on the contrary, like when programmable shaders were new and fancy, some games had super awesome effects in some spots that looked out of place because they seemed to have been added last minute.
Some games really made the best out of what was possible when they were made, the designers seemed to have gone back and forth until everything came together nicely. These titles are timeless.
MGSV is just a good looking game. It's one of those Yoshi's Island titles that will look good whether it's the day of release or 50 years into the future. Characters are well-designed, setpieces are pulpy and full of cold war personality, and the cinematics are all handled with that loving Kojima madness players have craved since MGS1. It really feels like a game that cost 80 million dollars to make, even though the version we all played is technically not even finished!
The real question (to me, at least) is how willing the games industry is to support total direction. I don't usually encourage one-man-band style development, but amidst all the paint-by-numbers Ubisoft titles I genuinely fear the design-by-committee releases. For games to be interesting again, we need to branch out from the current homogeneous identity of quicktime events and cinematics the length of a four course dinner. People gotta make games, again.
Here's a fun MGSV fact: the amount of time that's passed since MGSV:TPP's release is the same amount of time that separates the events of GZ and TPP :)
PC seems to be where more gameplay innovation occurs because the barrier to entry is much lower. I'd love to see Playstation make console development more accessible for a wider range of developers.
We are well into diminishing-returns territory with high-resolution textures and 3D models, but most games are still using comparatively crude lighting techniques. Ray-traced lighting, when combined with the above high-fidelity game objects, can produce startlingly realistic games today. Here is an example from a recent game called Bodycam:
My point is, lighting is a force multiplier that we don't yet have in most mainstream games. But now, with many modern graphics cards providing support for it and technologies like Unreal's Lumen making it easier to experiment with on the developer side, I expect this will be the primary area where games distinguish themselves in the "good graphics" category.
Sure, and I've seen comparisons before, but the truth is that I don't care. It doesn't change the game. Visual improvements between PS1 and PS2 made the characters look less bizarre and helped me recognize things. At this point, whether it looks photorealistic is just... irrelevant?
Maybe some people care more, I can accept that. But there is no chance I'm going to buy a PS5 Pro for it, and I don't think many fans will.
Graphics card companies have done a lot to make me less and less interested in better graphics anyway. What I really want are better games.
An example just off the top of my head is a game where you have to fight a medusa by relying on shadows/reflections only because if you look at it directly, you're literally petrified. Attempting to do that without proper ray tracing simply wouldn't be good enough.
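For what it's worth, the mirror-bounce part of that idea is just one line of vector math; a rough sketch (Python, purely illustrative), where the real cost in a ray tracer is tracing what the reflected ray goes on to hit:

    import numpy as np

    def reflect(d, n):
        # Mirror-reflect direction d about unit surface normal n: r = d - 2(d.n)n
        return d - 2.0 * np.dot(d, n) * n

    d = np.array([0.0, -1.0, 0.0])   # ray heading straight down at a floor
    n = np.array([0.0, 1.0, 0.0])    # floor normal pointing up
    print(reflect(d, n))             # [0. 1. 0.] -- the ray bounces back up

The hard part the GPU has to do is follow that bounced ray through the scene so the reflection actually shows the medusa, which is where hardware ray tracing earns its keep.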
I really wish developers on consoles would just accept what kind of performance consoles really offer. There isn't a single reason a game should be 30fps these days if that game tests your motor skills in any way. If you can't hit a locked 60fps target, you need to reassess your visuals.
Hearing that 2/3 of PS users are switching to 60fps, away from a default of 30fps, is really interesting. You can have the best screenshots in the world, but if your game doesn't run well, people can tell.
IMO low-latency, rock-solid 30FPS is fine but hard to pull off in a 3D engine. The issue is that average FPS means almost nothing when the lag happens right after dramatic inputs like rapidly turning around. Worse, there's significant latency on everything from wireless controllers to the screen.
I just wish the industry standardized on reporting 0.1% low FPS and latency; it would have done wonders for the entire gaming industry.
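Part of the problem is that there is no single convention for "lows". One common approach is to average the slowest slice of frame times; a rough sketch (Python, synthetic frame times just for illustration):

    import numpy as np

    def low_fps(frame_times_ms, fraction):
        # Average FPS over the slowest `fraction` of frames (one common convention;
        # others report a frame-time percentile instead).
        worst = np.sort(frame_times_ms)[::-1]
        k = max(1, int(len(worst) * fraction))
        return 1000.0 / worst[:k].mean()

    # Synthetic capture: roughly 16 ms frames with occasional spikes.
    times = np.random.default_rng(1).gamma(shape=20.0, scale=0.8, size=10_000)
    print(f"average FPS: {1000.0 / times.mean():.0f}")
    print(f"1% low FPS:  {low_fps(times, 0.01):.0f}")
    print(f"0.1% low:    {low_fps(times, 0.001):.0f}")

The average can look great while the 0.1% lows, the frames you actually feel, tell a very different story.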
Some rendering engines do smear time in various ways, like motion blur. Ray tracing can pull it off more cheaply than rasterization, but you still need to compute geometry at multiple times within a single frame.
Depends on the specifics, but you can burn basically unlimited processing power shooting more rays and there’s just diminishing returns. We’re not there for games today but yesterday’s render farm is tomorrow’s GPU.
As to why it’s used, higher FPS creates artifacts on things like spinning tires. Persistence of vision means flashing something for 1/240th of a second can be surprisingly obvious. We perceive fast objects as actual blurs not just sequences of still images.
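Concretely, that "compute geometry at multiple times" bit is distributed sampling over the shutter interval: each sample gets a random time, the moving object is evaluated at that time, and the results are averaged. A toy sketch (Python, a 1D "scene" just to show the mechanism, not how any real renderer is written):

    import numpy as np

    rng = np.random.default_rng(2)

    def shade(x):
        # Stand-in for "trace a ray and shade whatever it hits": a stripe pattern.
        return 1.0 if int(x * 10) % 2 == 0 else 0.0

    def pixel(x0, velocity, shutter=1.0 / 48.0, samples=16):
        # Jitter each sample to a random time inside the shutter interval, then average.
        t = rng.uniform(0.0, shutter, samples)
        return np.mean([shade(x0 + velocity * ti) for ti in t])

    print(pixel(x0=0.05, velocity=20.0))   # fast object: value near 0.5, i.e. a smear
    print(pixel(x0=0.05, velocity=0.0))    # static object: crisp 1.0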
As long as people keep buying garbage, people will keep selling garbage. Can't really blame them.
Framerate aside, a good percentage of modern TVs have abysmal latency. I recently played the original Super Monkey Ball 2 on a gamecube hooked up to a CRT and was shook; that sort of precision would never fly on modern hardware.
Game mode helps, but only somewhat, it does not generally solve the issue.
For this reason, as far as tight gameplay goes, my money's on devices like the Switch.
Not all that glitters is gold. I had a horrible experience with Hollow Knight on the Switch and can't understand how anyone would enjoy that game on that console. I did a video recording and the lag playing on the console screen was easily in the 100ms range (iirc; can't find it).
From what I read on Reddit, there are some people who don't notice any lag (even in the aforementioned HK), others notice it in most games. For me, even the Zelda titles have some lag (though these I play on the TV, so it's a bit unfair).
For my part, I'll stick to PC, even if it means crying a little on each upgrade. And that's coming from the guy who used to often game at 25fps as a kid, and who still enjoys playing the odd couch game (with friends) via wired Steam in-home streaming.
If you (or others) are happy with their experience, that's fine of course ;)
The switch (especially the pre-OLED) does not have better input latency than a kinda modern OLED TV.
The problem is that we're compounding latencies that would be OK if it were just one. Controllers? Bluetooth instead of a cable. Monitor? OLED instead of CRT. Audio? Maybe also wireless.
All together creates a sluggishness you can feel but can't pinpoint.
You have to have the best screenshots for the press, but most players want to play in 60fps or better. Games have for some time now been offering two graphics modes so that the press gets pretty images and players get fps.
I personally believe it is hard to see better graphics right now, but! After you start playing with better graphics, it is pretty noticeable when you go back.
Examples:
1. Switching to retina displays. For a lot of people that was not noticeable, but after you start using only retina displays everywhere, going back really sucks.
2. Going to ProMotion displays, also hard to notice, but now I cannot use my iPad Mini, because it is noticeable. And I definitely see it on my spare iPhone mini.
3. Gaming in 60/120Hz, it is impossible now to play 30Hz games.
So yes, if your mind is not spoiled, you can play low-resolution games on a Steam Deck at 30Hz with some drops. But if you have spoiled your mind with 120Hz on a 4K display/TV, you will see a big difference going back.
Regarding 3... I disagree with this one. I like games with stories and cutscenes, and 30Hz feels a lot more cinematic to me. I don't mind 60Hz/120Hz but I actually prefer 30Hz for most games I play. Also, whenever there is a choice between 30Hz+Better Quality and 60Hz, I always pick 30Hz.
- If you just run the previous generation of games on the newer graphics, it's only a tiny bit better. But games designed for this generation will have more of a difference. Ultimately, it's all about giving the creators more freedom in making the game. Amazing games have been made with much less capability than today.
- You're going to get another big jump in graphics and immersiveness once the current neural rendering techniques are productionized. (though PS5 Pro probably isn't going to be important for that.)
Maybe so. But the part of the article I resonated with is that it just doesn't matter. I don't care. I buy games for gameplay, and better graphics -- which once opened up new kinds of gameplay -- just don't matter anymore.
And this has been a thing for a good long while. One could even argue that some older titles that played well with art style within their limitations look better than some later ones. And that holds even for "realistic" games. Games have looked good enough for many years now.
I really feel that we are still often lacking in other areas, or not enough care is put there, or a lot of work and effort is being misdirected for various reasons...
It's a hardware event, and they talked about hardware with screenshots from games you're familiar with.
The alternatives are to dead-reckon the fidelity of new games you've never seen, fake some downgraded comparisons that haven't shipped for those games or talk about gameplay that should be identical on the existing PS5.
I guess they could have thrown in some irrelevant sizzle but wouldn't it just overshadow the hardware?
Games are being held back by cultural stigma. At its core it's a new artistic medium, just like moving pictures (film) was back in the early 20th century. Imagination is the only limit here, but society at large still thinks "video games" are for kids. Perhaps if it was called interactive entertainment or something like that, people would be more likely to get into it.
That's one data point. If you compare it to movies and shows, the audience skews a lot younger. If you ask the average person their perception is that they are too busy to play games, yet they probably spend many hours watching movies and shows.
It's not as much as it was 10 years ago, but the perception of wasting time is still there. I'm mostly talking about the US though.
Agree with other comments regarding play vs. visuals.
Don't get me wrong, I've been floored with some of the visuals the PS5 and Xbox have produced, and every generation there's a few titles that just blow my mind with how good they look.
But Nintendo has really been the winner here - they march to the beat of their own drum, and while their games don't look jaw-dropping, the enjoyment I get out of playing them more than makes up for it. Games, at their core, for me at least, are for playing and enjoyment. And I'll take a game that can put a smile on my face and give me and my friends hours of enjoyment together over a game whose visuals are effectively real life and running at 60fps.
I could bring out an old GameCube or N64 at a family gathering and we'd all have fun playing with and against each other in the same room. Can't say that for many Xbox or PS games.
I mean, that's a 4 controller console vs 2 controller consoles... Games were made with that in mind. Can't have everyone on a PC but there are obviously beloved multiplayer PC games.
The thing I loved about the PS5, above all else, is that basically every game has a “performance mode” aka 60fps. I appreciate there are lots of people out there who don’t care about 30fps, but the ps5 generation gave us the choice.
Furthermore, almost all the games I played on PS5 were PS4 games that now ran butter smooth.
I'm all for user choice, but at the same time, the reason I opt for a console is to not think about that kind of thing. I expect the developers made the game with the hardware in mind and optimized it to run well with the graphics provided.
The idea of a graphics vs performance mode feels like they needed to push the PS5 too hard to make the graphics good enough to justify a new console, but realistically, the hardware can't handle it.
The sales pitch of the PS5 Pro is that the choice no longer has to be made, which is how it should always be with a console, imo. If people want to tweak and tune to make their personal tradeoffs between graphics and speed, that always seemed like what the PC was for.
I was looking at getting a PS5 Slim, but heard the Pro rumors, so I waited. After seeing the graphics vs performance toggle, I don't want the Slim, but looking at the comparisons, I don't see enough of a difference in graphics to want to pay more for the Pro... so I think I'm likely going to skip it altogether, at least for now. Things might change when GTA6 comes out. That could make the choice clearer.
If games only shipped with one mode, it would be graphics mode. The performance modes are there to cater to users who prefer a different tradeoff (i.e. to get PS4-level graphics in exchange for a higher framerate), not because the graphics modes aren't good enough.
Personally I was disappointed to find this generation of consoles still asking us to choose between performance and visuals. It felt like a necessary compromise last generation because those consoles were seriously underpowered; I'd hoped this time around we'd be back to just plug-and-play. It's easy to forget that prior to the PS3 and 360 generation, 60fps was very common.
I know I'm in the minority and a lot of people prefer the choice, but when a game is offering three or four different fidelity/performance modes (not uncommon now raytracing is a factor), I get decision paralysis and worry about whether I'm getting the optimal experience. I'd like a "developer recommended" option that they feel best matches their creative vision.
The fidelity vs. performance decision is precisely what Sony markets the PS5 Pro as eliminating. I’d also just prefer developers make 60fps the baseline, then use any additional horsepower for more frames or more real resolution.
The biggest problem with realistic graphics is the ever growing amount of busywork required to keep it all together that seems to overwhelm game developers. It makes everything feel very "rigid", like walking through a museum full of stunning sculptures but you are not allowed to touch anything or the illusion falls apart.
One example: In the old 2D versions of Anno you could shoot animals on the islands with cannons but you can't do it anymore in the newer 3D games. It's kinda stupid but was hilarious and was probably cut because adding those small "random" interactions is so much harder at increased realism.
I recently played God of War on PS5 when visiting a friend, and what I noticed was that it felt really slow/sluggish: you press a button to punch someone and the whole character animation feels like it takes 2-3 seconds. I also think that in most of the games I've seen, the running animation still looks really stupid, especially in games that try to look realistic; it doesn't bother me that much in games like Zelda. But these days I mostly play round-based JRPGs on my Switch, so I really don't care that much about graphics or framerates. :)
After trying the Vision Pro at an Apple Store I’m more convinced than ever that immersive VR entertainment is the future. I have no doubt we will need high end graphics arms races for a while more.
The guy in the video is sick. It's in San Francisco. He's "no, actually"-ing and tugging at his ears. It's contagious here. I have no idea what it is. What is his name? Is he still alive?
Edit - his name is Raph Koster and he's a game designer. This sort of sick is everywhere in San Francisco. He may be a first patient or early patient vector. The ear tugging, the linguistic and verbal crutches. He's sick. It causes me to be in extreme pain. I've emailed him. He mentions early in the video about a "game he hated". So this may be a "virus as a game". These headaches are the worst I've ever had and they never go away - they're suicidal ideation inducing. I'm spending all of my time warning people away from San Francisco so if this is a joke then I don't find torture by a lynch mob funny.
He further mentions a telepathy joke which refers to gaslighting done to Daniel Cohen's 1968 book Myths of the Space Age. I don't give a shit if Astro Teller is my cousin and Elon Musk put a big X on the side of a building, you don't poison people.
San Francisco is a broken hellscape of shitty people. I can't believe how fucking ignorant everyone is.
And if someone is going to gaslight me to the next "head" and fuck with me in a circle of idiots I'm just going to keep breaking it. It's sick and fucked up in the extreme.
This may be epidemiology that's being spread through the ABDL community if it hasn't already run its course. See the "game over" onesies. Meaning that there are other epidemiologies that are spread in association with other forms of viral meme-based gaslighting and sickness. Holy shit dude.
So that now means I'll be emailing all of academia and anyone else I can reach and pointing out that this viral vector meme is showing up in this branch of pornography and then asking what the other patient zeros are so we can backtrack and do root cause analysis. Goddamn.
And here is the list of people I just emailed. https://imgur.com/a/VMD37QA. I email a list like this every day. Thanks for the perspective. Perhaps the epidemiologists at every ivy league university in the country will be interested as well.
I don't like being poisoned. Perhaps you've noticed.
I recently got the rulebook for a Wizard of Oz TTRPG. It reuses the D20 system, but everything else about it (including all the spells, races, monsters, etc.) is new.
What most game studios have forgotten is to put the "game" in video (or my preference, computer) games. They forget the fun.
The first Unreal was an amazing spectacle, but it was also a blast to play. There are games that jam pack everything they can onto the screen, ultra hi-res this, and ultra-realistic that... but they're missing the play.
Minecraft is not fun because of its amazing fidelity. It's fun because of the things it allows you to do, and how much it leaves to your imagination.
Funny, I was playing Cyberpunk 2077 for the first time very recently. Previously, I'd played Cloudpunk, a "cyberpunk" voxel game with significantly less overhead.
But most of the "vibe" is impressively similar, right down to the first time I visited my Cyberpunk apartment: haven't I been here before? It's so similar to the Cloudpunk one, despite the "graphics" differences.
I thought the entire eighth and ninth console generations were unnecessary, with the possible exception of the Switch. I eventually knuckled under and bought a PS4 so I could play Final Fantasy and Rez Infinite, but I haven't gotten into the ninth-gen stuff yet, several years into the console cycle.
Sony claimed that their user base compromised on "graphics" for the smoother experience of the "Performance" mode, so I don't see how they missed the mark when they decided to launch a Pro model that helps tackle this user need. Even if it was an excuse and they would have released it anyway.
The whole "improved graphics" thing isn't just about "how nice a game looks compared to the previous generation", and I think people understand that. It's a common marketing expression for improving the game's visual experience, not only resolution and textures but also the number of things happening on screen, smoothness (FPS), draw distance, etc - which they addressed in their presentation video.
Now maybe we're reaching the point where marketing videos cannot display accurate improvements due to the limits of streaming, because a game experience in 30FPS vs 60FPS or 120FPS is vastly different. Even in a desktop environment, you can see it.
The thing is that we were used to getting a new-look hardware revision as well, with a different design or a smaller form factor, consuming less power, which didn't happen this time around, but that's OK, I guess.
But the thing of "you should stop launching improved products because the games looked fine" is a bit silly. I think Sony serves their clients well, just like Nintendo does, they cater to different people.
The reality is that games are taking longer to develop, as they are more complex - let's not forget that Zelda BotW was a Wii U game, and Zelda TotK was released at the end of the Switch lifecycle.
Anyway, good for them to release the pro model. I won't be buying it because it's not a product for me.
Some of it (judging from myself) might be that companies have been pouring resources into graphics for 20 years and the benefit to me, as a gamer, has been modest. Sure, escaping PS1 graphics was good. Better graphics make it possible to recognize smaller features on the screen and interact with them. And, up to a point, better graphics help you imagine the game world.
But it's gone way overboard. I've looked at what raytracing can do. In an intellectual way it's impressive. But it doesn't really improve the experience of playing most games, which is driven by writing and gameplay.
And the latest video cards from NVIDIA just reinforced that feeling, as does the PS5 Pro price.
I think they've lost the plot, at least in terms of my interests.
> But it's gone way overboard. I've looked at what raytracing can do. In an intellectual way it's impressive. But it doesn't really improve the experience of playing most games, which is driven by writing and gameplay.
I understand that sentiment, but the implementation of ray tracing as part of an improved game experience is bound also to game devs. Art direction and game design are other key aspects of this.
That's why I don't see how Sony can be to blame for this. As for the $700 price tag, I don't think it's cheap, and I'm not buying it, but I don't think $300 for a Nintendo Switch is cheap either.
I welcome the age where my steam deck is beaten by a recent iPhone in benchmarks, yet can smoothly play a lot of triple A titles and they still look good.
I’m less interested in a jump in graphics than performance, stability, and good gameplay. That wasn’t the case until about 10 years ago.
Say what you will, but this is why Cloud Gaming will make sense eventually. Upgrade cycles have to make you want new games - you can't advertise the same experiences with a "now in 4K/8K" label on it. On top of that, console gamers already pay for online service access and many subscribe to PlayStation+/Xbox Game Pass. They're effectively getting the worst of both worlds - paying SaaS prices for an experience they don't control.
So... get rid of the console, and the equation balances again. Gamers just sign up for the service they like and play things from the devices they prefer. Console upgrades happen server-side, and the cost is amortized against subscription revenue for everyone involved.
I personally prefer playing games locally, on my PC. But I can't guarantee the majority of console owners feel the same way. For all of Microsoft's short-sightedness, hopping on the Cloud Gaming bandwagon seems more justified with each passing day.
I tried Microsoft's cloud gaming service to play some games on my Mac without having to buy a gaming PC. I had some lag issues, which I assume will get better with time, and also could have been due to some local network issues that my ISP fixed several months after I dropped the service. I know it can be a decent experience, as I had OnLive back when that was a thing and it worked pretty well a decade earlier.
Ultimately my issue was the lack of ownership, and much like with video streaming services, the seemingly random licensing agreements. Due to the lag, GTA V was one of the only games that I could play decently well. I had it years ago for PS4, but never finished it, so figured I'd jump back in. One week into playing, sorry, the game is gone, it might come back later. I find this unacceptable. For long games, or games I love and want to come back to again and again, I'm not willing to put up with that nonsense. I cancelled the service the same day that happened.
I've mostly been playing the Switch over the last 2 years due to Breath of the Wild and Tears of the Kingdom, but have about a dozen games. All of them were bought with physical media. I can trust the physical media will stick around, because I control it. When Nintendo shuts down an old eShop, people's favorite games could die with the hardware they happen to have it installed on. Those 2 Zelda games ended up ranking #1 and #2 as my favorite games of all time. I figure I will be able to buy a Switch in some form for a long time, but who knows how long they'll keep the servers up to download a new copy for a new (or new to me) console. If I was playing Zelda on a streaming service and had the rug pulled out from under me 200 hours in... nothing about that is good. The possibility of that would make me not want to start playing in the first place.
I don't like the trend away from physical media. As the article suggests, using game size, driven by ever-increasing realism, as the pitch for why physical media needs to go away doesn't seem like a great one. When we can fit 1TB on a microSD card, it seems like physical media can and should still exist, maybe just not as optical media. I also typically like playing single-player games, so there is no practical reason to connect to the internet. With portable systems, like the Switch, internet requirements become a problem.
Fundamentally, I agree with you. I pay a premium to game on PC because I care about digital persistence and refuse to accept a world where Sony/Microsoft/Nintendo ultimately draw the line on my usage of their software. That being said, I would comfortably wager that the majority of console owners aren't concerned with whether they will be able to play Star Wars: Outlaws and Far Cry 6 in 2035. Most of them aren't even thinking as far ahead as next month, if Game Pass and Playstation+ subscriptions are anything to go off of. Hell, the PS5 Pro just got announced with no disc drive - the Xbox Series S has shipped without physical media for years now. Millions of console gamers have accepted their fate as digital-only households, for better or worse.
Betting on the frugality and apathy of console owners is an incredibly smart bet on Microsoft's behalf, and I say that as someone that considers the Xbox a failed product. Microsoft spent the past 10 years trying to promote a vertical integration of Windows and Xbox services, something which almost nobody cares about. Betting against next-gen demand for new console hardware is a move that only seems smarter as the economy gets worse.
I’d be curious to know what percentage of PS5 Pro buyers end up installing an optical drive. I would assume the Pro buyers are less price sensitive and more demanding customers, who would be more likely to opt for the drive, even if they need to install it themselves.
Frankly, I think it’s insane they can release something called the “Pro” with fewer features than a version of the normal console.
It's clear the console makers are pushing to remove physical media (at least Microsoft and Sony). But I don't think we should pretend for a second it's anything more than a way to increase profits. A digital-only console effectively kills the second-hand market. Like you said, I don't think these people are thinking ahead; they just take what they can afford. Sony can price the optical-drive version at a premium and put up an additional barrier, then report how customers prefer the digital-only version, when the lower price is all they actually prefer.
If by “eventually” you mean in a few decades then maybe I’m with you. The state of internet connections worldwide is still way behind what you need in order to make that kind of streaming an option. And for some players the input lag will always be an issue.
I don't see gaming hardware going away anytime soon. Especially because, with the current setup, Sony gets to earn twice, so I don't see why they'd stop doing that.
I'm fine with cloud gaming being an option for those that don't care about the compromises, but I hope it never becomes the only option.
Ownership and preservation has already effectively vanished for gaming given the steady move away from physical media (and the fact that what often ships on the disc is severely compromised without a day-one patch); streaming only would be the final nail in that coffin.
All the Stadia users who I spoke with were generally happy with the service while it lasted. Their main concern - latency - turned out to be a non-issue.
To me there's a great niche for it to be explored: cheat prevention in e-sports. Of course aimhacks will always be a thing, but the top players were always indistinguishable from cheaters in this regard.
> Their main concern - latency - turned out to be a non-issue.
I'm not fortunate enough to live near a Google (or Microsoft) datacenter. So latency in cloud gaming is a very real issue. I'm glad for those who could use it lag free. I never could (and probably never can short of moving).
The middle of Montana isn't exactly a hotspot for datacenters. For good reason.
Yes, Stadia basically worked well and latency wasn't an issue. But if you care about image quality (artifacts, colors, etc.) it was only OK. I found it pretty distracting, more so for faster-paced games (lots of motion on the screen).
The issue is that handhelds are limited by their battery capacity. Running a Steam Deck at full power will exhaust the battery in ~2 hours. The ROG Ally X outperforms the Steam Deck, but at the expense of nearly 2x the power consumption (25 watts vs 15 watts). You can't even run the ROG Ally at full power unless it's plugged in. Valve is probably waiting for efficient processors to make a significant leap in computing performance at the same TDP.
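The arithmetic is brutal and simple. A rough sketch (Python); the battery capacities and the ~5 W of screen/system overhead are my own ballpark assumptions, not figures from above -- only the 15 W / 25 W APU numbers and the ~2 hour Deck runtime come from the comment:

    def runtime_hours(battery_wh, avg_draw_w):
        # Runtime is just usable capacity divided by average system draw.
        return battery_wh / avg_draw_w

    # Assumed: ~40 Wh (Steam Deck) and ~80 Wh (ROG Ally X) batteries, ~5 W non-APU overhead.
    print(runtime_hours(40, 15 + 5))   # Deck at full APU power: ~2.0 h
    print(runtime_hours(80, 25 + 5))   # Ally X at full APU power: ~2.7 h

Until perf-per-watt improves, any "more powerful handheld" mostly just drains a bigger battery faster.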