The tech arms race in AAA games and why I'm abandoning it (andreaspapathanasis.blogspot.com)
189 points by jsnell on June 3, 2015 | 191 comments



Tech is to games what the medium is to an artist. Crappy tools and materials can ruin the experience (e.g. if Michelangelo's David had been made out of, say, mashed potatoes, it would not have survived and it would have looked silly after a while), but good materials do not make an artist (e.g. give a non-visual person marble and give Michelangelo mashed potatoes, and what Michelangelo produces will probably still be better than what the other bloke produces).


I agree. The article uses the term "tech" but fails to mention anything that isn't graphics related. Even if you decide as a game maker to forgo 3D for whatever reason, you still have REAL technical limitations when it comes to the game mechanics. Physics, AI, and Feature X can cost just as much processing power as some fancy 3D graphics, regardless of whether it is realtime or not. Major game makers already know this, and that is why optimization of the tech isn't optional: not doing so means there is a lot less you can actually do.


The author tells a story about being a graphics developer, so that's probably the reason for focusing on graphics.

Telling the AI people that their efforts are wasted on an arms race with little effect on the gameplay is a much weaker argument than 'hey, I need to revisit what _I_ am working on'.


Re AI: I was just recently thinking about the Half-Life 1 marine AI (low tech, requiring hint nodes), against which I definitely had memorable experiences, compared to the slew of titles that followed, with "improved" AI that looks like a game of whack-a-mole alternating with scripted sequences.

Guess what? I booted up HL1, and that wasn't just nostalgia-tinted memory. Smartly applied low tech produces fantastic results in the proper hands.


Something that kills me today is that many modern games still have worse facial animations than Half-Life 2.


I don't think the article makes that mistake; there are plenty of non-graphics examples: AI / simulation of thousands of RTS units, motion controls, companion apps.


Two lines out of the entire article mention those. I would not exactly call that "plenty of non-graphics examples". There are many more CPU-intensive processes besides just those that result in real technical challenges, as well. My point is that since the article doesn't do them justice, it should be phrased not as the "tech" arms race but as the "graphics" arms race.


It was definitely more than 2 lines.


Yes, there are many technical aspects in game development.

Game Engine Architecture (not affiliated)

http://www.amazon.com/Game-Engine-Architecture-Jason-Gregory...


I use the Louis Armstrong example. Give him a mediocre trumpet and he'll still outplay me on the best one. But the added tools will let him transcend into something magical that can't be reproduced.


Not if that 'best trumpet' takes more effort to play. Every new generation of games hardware brings extra demands on the artists making the games (through the expectation of an increased level of detail). Their creativity isn't being unleashed; it's being forced into a narrow specialism.


Better trumpets are actually often more difficult to play because of extra weight and wider bore. Depends on the model of course, and in other ways they'll be easier to play, like for intonation, smoother valves, etc. A player might also choose to use a mouthpiece with a deeper cup, which gives a warmer / bigger sound but makes it more tiring to play, especially in the upper range.

That said, I'm sure Pops could handle just about any trumpet you threw at him!

It's said that Charlie Parker played a plastic saxophone at times because he would pawn anything he owned of value to buy heroin. Bet he still sounded great!

I'm not sure how this applies to AAA game dev, though... maybe something like: every new performance venue built is larger than the last, which results in bands getting more members, longer arrangements, lighting and set design, singers, dancers, etc.


> "Better trumpets are actually often more difficult to play because of extra weight and wider bore. Depends on the model of course, and in other ways they'll be easier to play, like for intonation, smoother valves, etc. A player might also choose to use a mouthpiece with a deeper cup, which gives a warmer / bigger sound but makes it more tiring to play, especially in the upper range."

That's interesting. I know next to nothing about trumpets, but the situation appears to be different than for guitars. Generalising, guitars are set up differently depending on the style of music and the technical demands, which has an impact on how easy they are to play. For example, tonally you can get a richer sound with slightly heavier than usual strings and higher than average playing action (more space for the strings to vibrate); however, from a speed point of view you generally want lighter strings and lower action. These are just general rules of thumb: it's possible for heavy strings to sound 'muddy' and possible to get great tones out of lighter strings (guitarists like Page and Iommi are reported to have used light-gauge banjo strings on their guitars, and I don't hear too many complaints about their tone).

> "I'm not sure how this applies to AAA game dev, though... maybe something like, every new performance venue built is larger than the last, which results in bands getting more members, longer arrangements, lighting and set design, signers, dancers, etc."

It tends to be that the larger the team, the harder it is to build something with an interesting personality. Plus, to use your concert venue analogy, more creative effort is spent on working out how to fill the venue with sound rather than on exploring what that sound could be.


But will he outplay you using more notes, or the right notes?


Playing the same notes more right. :-)


I find myself reminded of the late micro era, when people came up with all manner of crazy tricks to produce 3D effects and such on the C64 and the like.

It may well be that PC gaming producers have gotten "lazy", as they are working with an ever-improving platform. Thus they can target "future" hardware rather than attempt to optimize and push what they already have.

This is because we see something similar to the micro era in gaming consoles: there, games towards the end of a platform generation are more refined and elaborate than those released at platform launch, because the producers have gotten to know the ins and outs of the hardware.


Refined technically, but not necessarily refined from a gameplay point of view.

Look, I get it, it's cool to see hardware get pushed beyond what people previously thought was possible, it demonstrates ingenuity and can produce a captivating spectacle. But it doesn't make the games better (beyond the wishy washy 'immersion' stuff).

To be honest, this sort of 'pushing the hardware' focus makes me look forward to the end of Moore's Law. When games can no longer meaningfully compete just by looking prettier, when there are no new console generations, I hope there'll be a renaissance of games as a form of expression and interesting experiences.


Depends. While I perhaps gave a poor example initially, consider the difference between SMB1 and SMB3.

The latter has all kinds of one-off and crazy mechanics (walking behind the apparent background, a sun that attacks, wind-up shoes one can enter and exit) that keep the various worlds fresh, in part because they show off the greater understanding the developers had of how to work the available hardware.

Then again, it may be that working in two dimensions provided the ability to put in more elaborate game mechanics, because the player had better situational awareness.

3D games seem to have a severe issue with this because enemies can walk behind the camera. Thus many games with free camera movement and/or zoom end up being played in a kind of overhead position at the furthest zoom possible to improve situational awareness.

Something that ends up defeating all the work put into elaborate detailing on both characters and enemies, unless there is heavy use of cinematic pans etc.


> "Depends. While i perhaps gave a poor example initially, consider the difference between SMB1 and SMB3.

The latter have all kinds of one off and crazy mechanics (walking behind the apparent background, a sun that attacks, windup shoes one and enter and exit) that keep the various worlds fresh, in part because they show off the greater understanding the developers have of how to work the hardware available."

For the most part, the evolution of game mechanics doesn't rely on improving hardware usage.

Using the SMB1 vs. SMB3 example, the ideas from SMB3 could've existed in SMB1; they might not have looked as good, but they could've been done.

If the 'hardware enabling game mechanics' hypothesis were correct, every new console generation would be able to add new mechanics just by being faster. Taking 2D gaming as a whole, I'd suggest the horsepower needed for 2D gaming peaked around the Dreamcast/PS2 era; I don't envision any 2D game type being built that couldn't exist on those consoles.

If VR is to take off, gaming hardware will need to evolve, but beyond that we won't need new hardware to enable new game types anymore. VR/AR is the end of the road.


The original assertion of "developers working the hardware better", and your rebuttal, are both incorrect. There was a literal constraint enforcing SMB1's design: ROM space. The developers of the first game had already had several years of experience on the platform, and they were straining to squeeze in the amount of content they had. It fits in 40kB. SMB3 uses 384kB, in comparison: 9.6 times bigger. Essentially, each world in SMB3 can contain 1.2 copies of SMB1.
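
(The arithmetic behind those figures: 384 kB / 40 kB ≈ 9.6, and with SMB3's eight worlds, 384 kB / 8 = 48 kB per world ≈ 1.2 × SMB1's 40 kB.)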

What changed was not a matter of experience, but that cartridge game technology grew massively more powerful and changed the whole nature of the platform within the span of a few years. This was very much a case of "more is more." You could get a few of the ideas in SMB3 in the space of SMB1, but not all of them at once. And development costs rose sharply in the latter half of the 1980's for precisely this reason - you could feasibly make a "big" game that evoked a whole world with a variety of interactions, and doing so was a huge spectacle which necessitated a team dedicated to content creation, rather than a solo dev, yet was also largely independent of the graphics fidelity. Sierra Online built its name in this period by capitalizing on exactly this quality - presenting a world just a little more fully realized than the text adventures preceding it. It was a golden era for role-playing games, too - Ultima, Wizardry, Dragon Quest, and Final Fantasy all had big entries during this period.

The picture changes as you get into the 1990's. Most of the advances are pretty firmly on the side of "graphics treadmill." Games do keep getting more complex, but by the end of the decade, with stuff like Baldur's Gate, Jagged Alliance 2, Starcraft, Counter-Strike, or Deus Ex, you can find pretty much all of the design templates that AAA leans on. Design has mostly trended towards efficiency and simplification since then.


With the recent rise of indie games and the platforms to distribute them, has the swing back toward solo devs actually changed the industry?

Or will the AAA games just continue their graphics treadmill?


We've only seen the tip of the iceberg, in my opinion. AAA is in the process of a slow-motion collapse, despite its best efforts, because the nature of retail has changed to favor digital distribution, triggering an erosion of price points and upending the balance sheet. (no shelf space, no distributor lock-in, etc.) This was set in motion in the 2000's, but only became a life-or-death issue more recently. Console gaming hasn't had it this bad since 1984.

And indies have always had it tough, but they've experienced a sudden shift from a private market with the most prized audiences available only to the well-positioned and connected (e.g. get on XBLA in 2008, or Steam in 2011), to a market that is saturated everywhere, with few opportunities to stand out of the crowd. In theory, this means people should be more creative and adventurous. Anyone who intends to make a business of it is obligated to shrink scope, derisk and focus on getting a great promotional story together, though, and so it's hard to remain optimistic if your approach is more technically-inclined and favors design risks. I got to be in the thick of all of it as it happened and gradually concluded in the past year that I couldn't make it work full-time, not within the context of the projects I genuinely wanted to work on.

There is still room for AAA where it feeds into the F2P business. So MMOs will definitely stick around, and competitive games like League and DOTA. But that model doesn't favor the typical single-player campaign experience. All the paths to sustainable, growing success in gaming right now lead to something that involves F2P marketing tactics, community-building, platform ownership, or preferably a combination. Anything less than that faces a huge uphill battle to not simply get lost in the shuffle.

All of that said, the market is not dead by any means. The graphics race is on its way out - it'll continue to be advanced by dedicated engine devs, but the ability to render things is entirely a commodity now, as borne out by the frenzied price wars between Unreal and Unity. That, along with the shift to digital, has shaken the industry's dynamic into something way more unpredictable. It's great if you're enterprising enough, but most devs aren't, when it comes down to it.


One thing you couldn't do in SMB1 was go backwards. SMB1 was a game that was going to be included with almost every console that was sold, so the cost of the cartridge really mattered to the corporate bottom line. Not only were the ROM chips for the SMB1 cartridge a fraction of the size of SMB3's (and a fraction of the price), but SMB1's cartridge contained no expensive SRAM chip.

The tiny SMB1 board: http://blog.grunick.com/wp-content/uploads/2014/06/IMG_9484-...

In SMB3 you will find the level data stored at $6000 and up, which is inside the SRAM. This level data is what allowed you to go back to areas of the level you had already played and find consistency. In SMB1 you couldn't go back, because it didn't store that data, and so the bad guys you killed and the blocks you destroyed would have had to appear again.
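
To illustrate the tradeoff (this is just a sketch in C, not the actual SMB code, and the buffer sizes are made-up figures): a forward-only scroller can decode compressed level objects into a small sliding window of tiles as the camera advances, while supporting backtracking means keeping the whole decoded, mutable tile map somewhere persistent, such as cartridge SRAM.

    #include <stdint.h>

    #define ROWS        13
    #define SCREEN_COLS 32    /* columns visible at once (hypothetical figure) */
    #define LEVEL_COLS  512   /* total columns in the level (hypothetical figure) */
    #define TILE_EMPTY  0

    /* Stand-in for "expand compressed level objects into one column of tiles". */
    static void decode_column(int col, uint8_t out[ROWS]) {
        for (int r = 0; r < ROWS; r++)
            out[r] = (uint8_t)((col + r) & 0xFF);   /* dummy data */
    }

    /* Forward-only (SMB1-style): only a sliding window of decoded tiles is kept.
     * Once a column scrolls off the left edge it is gone, so killed enemies and
     * smashed blocks could not be remembered even if the game let you go back. */
    static uint8_t window_tiles[ROWS][SCREEN_COLS];

    static void scroll_forward(int new_col) {
        uint8_t column[ROWS];
        decode_column(new_col, column);
        for (int r = 0; r < ROWS; r++)
            window_tiles[r][new_col % SCREEN_COLS] = column[r];
    }

    /* Backtracking (SMB3-style): the whole decoded level lives in a persistent
     * buffer (e.g. cartridge work RAM / SRAM) and is mutated in place, so
     * scrolling left shows exactly the state you left behind. */
    static uint8_t level_tiles[ROWS][LEVEL_COLS];

    static void smash_block(int row, int col) {
        level_tiles[row][col] = TILE_EMPTY;         /* the change persists */
    }

    int main(void) {
        scroll_forward(0);
        smash_block(5, 10);
        return 0;
    }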


There is already a huge renaissance of games as expression. The Indie game scene has been and will continue to be huge and a lot of those games approach the artform as something more than a tech demo.


I'm grateful for the growth of the indie game scene for that reason, perhaps I just need to be patient as gaming takes time to evolve.


"It may well be that PC gaming producers have gotten «lazy», as they are working with a ever improving platform. Thus they can target «future» hardware rather than attempt to optimize and push what they already have."

It's not that. There is a lot of optimization going on there, and often the reference target is actually the lowest common denominator of the mainstream game consoles (lower than what each of them, taken individually, offers). Only on top of that are there optionally-added layers (activated by default, for marketing reasons) that attempt to use more resources, which is what gives you that impression. And yes, what you say about "the end of a platform generation" is true, because the optimization efforts (including platform-specific ones) are stacked up through time.


it would have been delicious initially :p


One of the sadder things I read today:

"What I was blindly ignoring back in my teen years was games like Elite, Ultima IV, Zork, MUD - games that fit just fine on the tiny possibility space my original PC enabled."

I had a (slap back of hand) cracked copy of Ultima IV. No User Manual. No Map. No Potions Guide. I had to create my own maps of the world, and permute ingredients to figure out all the potions. I spent three of the most glorious weeks of my life completing that game, and, to this day (close to 30 years later), I recall the level (you couldn't restore health in the midst of a dungeon) in which I ran into the mirror image of my party and had to battle myself. And riding the balloon over pirate bay. Oh my, ...


What I remember most about Ultima IV was the blind shopkeeper who determined whether you had paid her the correct amount by hearing the clink of coins. So you could buy anything for 2 coins, since they'd still clink against each other. I thought I was so smart, but my 9-year-old self didn't realize that the game was about building virtue...


What's particularly fun, if you read some of the modern guides now that the game is thoroughly understood, is how to game the system. Buy stacks of reagents from the blind woman, crashing your honesty to the bottom, then boost it back up in the most expedient way possible later. Steal everything not nailed down in the early part of your run-through (if you play honestly the whole way through, resources are actually sorta hard to come by), then use the stolen money to give to the poor for Sacrifice, then later on fix your honesty (or whatever stealing counts against, I forget) in some expedient manner.

I find something quite hilarious about the idea of a MinMaxing Avatar of Virtue. Minmaxing often breaks the mechanics of a game, but it's rare to see it break the morality of a game the way it breaks Ultima IV.

Brilliant game.


Ultima IV was and continues to be the primary reason I do anything with games development.


For anyone who is an Ultima fan, you might want to check out Mortal Online by Starvault (based in Malmo, Sweden). It's the closest 3D MMO in the same vein I have found so far (full-loot FFA PvP, guild wars, taxes, generally hardcore). I was very active in the alpha and first years of the game as part of one of the main guilds; those years were quite rough due to bugs (and still are), but they have fixed many of their pipeline issues, fixed tons of bugs, and just released an expansion last week. There were no maps, so I helped make the main map that everyone uses for it today: http://4.bp.blogspot.com/-Twm2yJ-ndrg/TymPaFeZbEI/AAAAAAAABh...


Ultima and Ultima Online are very different games, and the fans of each are different. I don't think an MMO can do what Ultima did, by its very nature.


Just so you know, not every fan of UO is an anti-social PK asshat.


No, but the game lends itself to it. =) But it's different from that, anyway. Ultima, I feel, is a game about you-the-player, whereas UO (and all MMOs, really) are games about you-the-collection-of-numbers.


Oh wow! Thank you for the large-sized hit of nostalgia. What a wonderful game that is. Like you, I had to draw my own map. I never did finish it, because it was fun just wandering around.


At some point it comes down to what gamers want, and that's always going to be "different things to different people." Papathanasis illustrates it himself when he shares the reasons his wife prefers Sims Social over Sims 3.

Also, a great game with less-than-cutting-edge graphics will probably always win over an average game with amazing graphics; think of Minecraft as the most obvious example. I see games like The Long Dark or Don't Starve or even something like Fez (all coming from smaller studios and all about 180 degrees away from an AAA game) being more embraced by gamers in the long run than whatever the latest Call of Duty is. Of course, if we break it down that far then we start getting into arguments between "serious gamers" and "casuals" and "console vs PC" and on and on and on...

Again, circling us back around to "different people like different things and that's OK".


> Again, circling us back around to "different people like different things and that's OK".

Sure, there's always that. However, just commenting on game graphics for a moment, many people have a clear bias towards graphics that are strong aesthetically vs. those which are only technically impressive.

Consider, Sonic the Hedgehog was released in 1991. In my eyes, that game still looks good today, and I don't think that's purely nostalgia, I believe it has a striking, cohesive aesthetic style. Compare that with something like Doom 3, which was released about 10 years ago. Graphically, it's not awful, but it doesn't quite have the same timeless appeal.

Gameplay matters above everything else, but in terms of presentation the aesthetics are IMO more important than the strength of the graphics engine.


Just look at Hotline Miami or Gunpoint for other examples. I'll note that it's partly a question of genre, IMHO. While Eldritch got some success, I think FPS are a genre where it's a lot more difficult to go for a minimalistic look.


Thanks for the examples. I agree Hotline Miami and Gunpoint have strong visual styles; I never knew about them before today. Gunpoint in particular looks like a great game too, I would happily play that. Hotline Miami visually looks like a trippy neon Alien Breed to me; I like trippy visuals, and Rez is a good example of that IMO.

I agree FPSes traditionally have been hard to make stylish, but that's not to say it can't be done. Whilst it's not a shooter, Mirror's Edge is a decent example. Eldritch isn't to my taste, but I agree it's an attempt to craft something memorable. I also think the makers of TimeSplitters were onto something; perhaps it could do with some cleaner lines (easily achievable now), but it could have become something iconic. There was also an FPS game (I forget the name) where bullets only moved fast when you moved; you can probably still find it somewhere, and I thought that was stylish too.


The last example you cite is called Superhot, fyi


Yeah, that's the one.


I agree with the author that quality gameplay should be more of a focus for teams, and that their choices in graphics technologies should enable their gameplay choices, not the other way around.

But, gameplay isn't the only way to enjoy a game. Sometimes I enjoy something just because it's the visual and auditory equivalent of a summer blockbuster where I'm in control. And perversely, I'm willing to pay a lot of money to have the hardware necessary for that. Sometimes the gameplay is objectively worse than other, less graphically accomplished games, but that doesn't make it less aesthetically pleasing.

I think it's perfectly valid to enjoy a game for aesthetic vs. gameplay reasons, although it can sometimes be sad when gamers have no concept of the latter.


Even though people are giving "The Order: 1886" a hard time for its linear, QTE-heavy gameplay, I really want to play through it simply because it is so technically impressive. (It might help that I make 3D engines for a living.)

http://c0de517e.blogspot.com/2015/02/why-rendering-in-order-...

https://www.youtube.com/watch?v=EOHTJiRQB9o

http://blog.selfshadow.com/publications/s2013-shading-course...


The Order: 1886 is what I thought of while reading this. I read all of the negative reviews and whatnot, and I still enjoyed the hell out of that game. Every couple of minutes I kept stopping and just looking around and marveling at how ridiculously good the game looked, and all the cool effects I was seeing, including such mundane things as looking at my reflection in the many-paned window of a shop (where each pane had its own individual distortion).


It looks very cool, I just give The Order a hard time because I need a console to play it ;), I often like linear gameplay.


Wow, are those cut scenes rendered in real time? It looks as though when he's actually playing the graphics are slightly worse and that they use a lot of blur to cover it up; still very good, but not nearly as good as the cut scenes.


"Since The Order: 1886 is completely rendered in real-time with no pre-rendered frames..." http://blog.eu.playstation.com/2015/04/13/order-1886-gets-po...


Just watched the YouTube video (second link) and WOW. That's... crazy amazing cool.


aesthetically pleasing != technically advanced

It's all about how you use the tools at hand: https://www.youtube.com/watch?v=Tp6DjATqEzg (preferably enjoyed with surround speakers at high volume). Today the graphics look like a joke, and even for its time the graphics were considered mediocre; the gameplay is actually quite boring and the artwork isn't the greatest either. But even today, and by just watching, the immersion is so intense and the tension is still there. I remember when we first played this as kids my heart rate was at max for hours afterwards; everybody at school just kept talking about how incredible this game was and that it was just like playing in the Saving Private Ryan movie. Today you could surely improve the immersion in this scene tenfold, but do you really need to? What makes this scene is the intense ambient sounds, the camera-shake effects, and the overall feeling of pressure, none of which requires any new graphics technology.

The article also brings this up in its discussion of the 2D Sims artwork looking better than the 3D equivalent.


>Some AAA studios subscribe to the idea that games can deliver the maximum emotional impact in a similar way Hollywood does: By using actors in heavily scripted sequences to tell the story of someone else that the player/viewer relates to. Instead of playing to their medium's strengths, these studios go through great pains to emulate what Hollywood gets naturally: emotive characters, good looking lighting, spectacular locations. It's a very literal attempt to imitate another established, successful medium, and because it gets some results, it's popular, despite the fact it's very expensive and brushes aside many of the benefits that games get naturally.

Games can be much more immersive than movies can. You can much more easily pretend to be the protagonist. By going down this route games lose some of the advantages of, say, arcade games, but they gain others and are generally not at a disadvantage compared to movies.


It's not immersive the moment your character does or says something you don't want to do.

It's the antithesis of immersive. Seconds ago I could strut where I wanted, chop who I wanted and tea bag who I wanted.

And then suddenly I am forced to watch a cut scene with some super slow speaking voice over artist, because no-one gets to the fucking point in computer games.

Sometimes it works. More often, it does not.

That is the point he is trying to make.

I do like getting into the role of the good guy, and there are great cut scenes, but it's ultimately a cheap gimmick, with the game author stealing agency from the player. Every now and then something happens and you're all like, you what?


That's the biggest difference between WRPG and JRPG.

JRPGs don't tend to try to put you into the shoes of any character; instead you just learn to care about the characters the way you would reading a book or watching a movie.


You don't need to be AAA to be interactive. Plenty of indie games have compelling characters and immersion in spite of (or because of?) their lower fidelity.

Hell, many AAA games stray away from interactivity in favour of awe-inspiring, non-interactive movie sequences, railshooter level designs, et cetera.

http://hitboxteam.com/designing-game-narrative


>heavily scripted sequences to tell the story of someone else that the player/viewer relates to.

I think one of the strengths of videogames is that you can make your own story. Heavily scripted voiced-acted videogames go against that because they force you to go a certain path.


Most games guide you neatly through a bunch of hoops; you create your own experience, but not your own story or anything like it, in most cases. This isn't necessarily a bad thing; the experience of playing a game like Journey is very rewarding, even emotionally engaging. But you discover the story rather than create it nonetheless, it's just not as linear as heavily scripted games.

Games are an interesting storytelling medium, but stories are much more than variations on the order in which plot points are hit. The biggest issue is motivation - why characters (maybe including you) act as they do. In most free-range videogames the characters are very transactional; they want something and enlist the player to get/do something. In turn, players collect items or set game flags that allow them to defeat antagonists in contests of skill or force. But it's not really practical to interact with characters by appealing to love, greed, loyalty, arrogance etc. in order to motivate them to behave differently, so videogame stories can't do dramatic comedy or tragedy at present, although you can of course have slapstick or situational comedy, and you can have sadness or gloom. But the general lack of permadeath and the reflexive nature of AI are still a limitation.


>But it's not really practical to interact with characters by appealing to love, greed, loyalty, arrogance etc. in order to motivate them to behave differently

You can do something like that in Deus Ex: Human Revolution. You can have a cybernetic augmentation that lets you read microexpressions and do voice analysis. It then tells you whether the character you are talking to is more likely to be persuaded by flattery, assertiveness, bribery, intimidation, and so on.

It ends up being little more than a gimmick, because in a modern high-budget game with impressive actors (they do good voices, and their facial expressions are so realistic that you don't need the cybernetic augmentation if you know how to read expressions in real life), it would be too expensive to have the kind of huge conversation trees that old-style text-based computer games have.


Quite so. I have some ideas about how to do it, but they're pretty abstract and I'm not sure you could offer the sort of consistent gameplay experience that AAA titles are offering to players, so the first games to do this successfully will likely be commercial failures.


"In the video game industry, AAA (pronounced "triple A") is a classification term used for games with the highest development budgets and levels of promotion.[1][2][3][4] A title considered to be AAA is therefore expected to be a high quality game and to be among the year's bestsellers."

http://en.wikipedia.org/wiki/AAA_(video_game_industry)


I KINDA knew what AAA meant, but not really. Now I do! Thanks for posting this.


I still don't know what it means because the term is massively overloaded. Does it mean:

* a game with a huge development and production budget

* a good game

* a challenging game

* a game intended to be played competitively online

* a game that features adult themes like sex and violence

My guess is that it actually means the first, but game industry marketroids speak as if all these criteria were coterminous with one another.


It means only the first. I've never seen it used to mean the other ones.


The term AAA implies a sort of rating of quality, which is why it seems duplicitous to me.

It's a bit like the term "blockbuster" in film. Originally it meant a movie so good, lines to see it would go around the block. These days what it actually means is a huge-budget film filled with CG and spectacle, and studios labor under the belief that throwing that much money at a production will lead to blockbuster-scale ticket sales. Then they wonder why their "blockbusters" flop catastrophically at the actual box office.


It only implies a rating of quality because a large budget tends to imply high quality.

To give an example, no one calls Pillars of Eternity a AAA game despite it being raved over by gamers. The reason is simple: it didn't have an overly large budget; it had just a single artist.

People expect high quality to come from a big budget and that's completely reasonable (not duplicitous in any way). But the term still means a large budget. No one with much familiarity with the industry would argue otherwise.

I think the misunderstanding comes from people who don't pay much attention to the gaming industry as a whole: they guess at the meaning via context and get it wrong. There is absolutely an implication of quality, but it is not part of the definition of the term.


AAA means a certain budget, which should at least correlate* with high production value. Not necessarily "good," but "expensive looking."

*This might not be true in cases of outright corruption or incompetence.

You're spot on about the term blockbuster. Same problem there. "Blockbuster" now means high budget and mass market.


And exactly that point is the reason why I hate triple A games.

They are produced like movies. Throw insane amounts of money at an idea. More money, more insane things are possible.

But games don't actually _need_ lots of those things: so much complexity, or licenses for soccer teams and car brands. For me AAA games make no sense, and their economics make even less sense.


I think there's a distinction to be made between complexity/expense of production and complexity/depth of gameplay. AAA games seem to be maximizing the former and minimizing the latter. The average single-player AAA experience is a two-to-ten-hour long series of cutscenes and linear gameplay areas, with the multiplayer more than likely some reskin of Counter-Strike. Or look at iterations of the same/similar games - Firaxis' AAA Civilization: Beyond Earth felt shallow and lifeless next to their inaugural almost-indie Alpha Centauri, despite BE's fully voice-acted and 3D-animated leaders and AC's mere static portraits.

Or for the ultimate in non-AAA gaming, Dwarf Fortress. Looks like Nethack, plays like AutoCAD, devours time like nothing else.


I don't think most recent AAA games count as "challenging"; the tendency has been going in the other direction for a while, minus accidents like Dark Souls. I'd say it's mostly about the budget, and as Hollywood demonstrates regularly, it's extremely easy to create a flashy but ultimately mediocre product with a huge budget.

Not to say that all AAA games are like that, of course (eg, Alien Isolation looked pretty decent).


It always means the first. Some people may use it in a way that also implies the second. The rest are entirely unrelated.


I think sometimes people conflate mid-career personal attitude changes with some sort of profound pan-industry insight. There's obviously money to be made from high tech AAA games pushing hardware limits, and money to be made from prioritising gameplay over polish and visuals.


Indie vs AAA. I thoroughly enjoy both.


I hope the GPU arms race continues for as long as possible. Without it there would be no GPGPU and modern deep learning wouldn't be feasible.


It's true. Arms races often drive innovation in other areas, the Russian/American space race being the poster boy.


Won't that arms race continue anyway? I guess it wouldn't be nearly as affordable if people weren't laying down big chunks of change for eyecandy though.

I dunno, you think the Xeon Phi stuff is ever going to show up in consumer grade?


The development costs of the hardware used in today's supercomputing has been heavily subsidized by PC gamers for well over a decade. If PC gamers all switched to low-tech games for the next decade, the supercomputing industry would find that maintaining the momentum of their tech advances to be much more expensive than they are accustomed to.


Last I checked, the arms race seemed to be almost over. GPUs are only 20% faster per year, and the progress seems to be getting even slower.

Gone are the days when performance doubled every year.

I think we're soon at the point where upgrades only make sense every 5-10 years. The same is happening to consoles: 7 years from PS3 to PS4, or Xbox 360 to Xbox One. I guess the PS5 and the next Xbox are at least 10 years away. Maybe they'll even be the last consoles.
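
(For a rough sanity check on that rate: at a steady 20% per year, performance doubles only about every log(2) / log(1.2) ≈ 3.8 years, versus every single year in the old days.)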


> Last I checked, the arms race seemed to be almost over. GPUs are only 20% faster per year, and the progress seems to be getting even slower.

This is what happens when you try to infer too much from a short stretch of a trend that is really a step function.

The primary ingredient that fuels continued performance growth in semiconductors, including GPUs, is process shrinks. They are fundamentally what gives AMD and nVidia the additional transistors that they can use to increase performance. The foundry 20nm processes were a bust, meaning that GPUs effectively skipped a shrink. This means that for the last 3 years, after 28nm became available, the only ways they have had to increase performance were slight efficiency increases and growing the chips bigger.

GPUs will move to the next-gen foundry processes, Samsung/GlobalFoundries 14nm and TSMC 16nm, late this year or early next year. This will provide the most dramatic single change in the underlying manufacturing process in the history of making GPUs. Assuming no major architectural advances, expect twice the performance.


Process shrinks are giving diminishing returns as well, due to dark silicon [1]. We can fit more transistors in the same area which gives the possibility for more functionality but if we power them all at the same time we'll just melt the chip.

[1] http://en.wikipedia.org/wiki/Dark_silicon


Despite this general trend, the next process shrink will be the largest single step in GPU history.


4K and VR are going to drive performance demands up quite a lot very soon.


Heh, I've got the opposite story: a very fast machine for its day, a 386DX at 40 MHz, but with a Hercules (mono graphics) chip. What I did was write a resident application that had an internal 320x200x256-color buffer, and I would hack each game (looking for 0xA0000, or debugging a bit) to use my buffer rather than the graphics card directly. Then every few milliseconds I would transfer the buffer as monochrome; there was no palette transformation, just use the highest bit or something like that.

It worked fine for Star Control 2, Trolls, and a few other games. Yes, it was very weird, slow, etc., but I got my excitement just from doing it. And some games were not playable: I was taking 10% of the available memory back then, and others were just too damn hard to crack (taking over interrupts, etc.).
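
A rough sketch of that conversion step in C (just an illustration of the trick described above; the buffer dimensions come from the description, and everything else, such as the bit packing and the final copy to the Hercules card, is simplified or assumed):

    #include <stdint.h>
    #include <string.h>

    #define VGA_W 320
    #define VGA_H 200

    /* Shadow buffer the hacked game draws into instead of VGA memory at 0xA0000. */
    static uint8_t shadow[VGA_W * VGA_H];

    /* 1 bit per pixel output for the monochrome card (packed, MSB first). */
    static uint8_t mono[(VGA_W / 8) * VGA_H];

    /* Convert the 256-color buffer to monochrome by testing the high bit of each
     * palette index -- no real palette transformation, just the "use the highest
     * bit" shortcut described above. */
    static void convert_to_mono(void)
    {
        memset(mono, 0, sizeof mono);
        for (int y = 0; y < VGA_H; y++) {
            for (int x = 0; x < VGA_W; x++) {
                if (shadow[y * VGA_W + x] & 0x80)
                    mono[y * (VGA_W / 8) + x / 8] |= (uint8_t)(0x80 >> (x % 8));
            }
        }
        /* A real TSR would now copy `mono` into the Hercules framebuffer
         * (segment 0xB000), respecting its interleaved row layout. */
    }

    int main(void)
    {
        shadow[0] = 0xFF;      /* pretend the game drew a bright pixel */
        convert_to_mono();
        return mono[0] >> 7;   /* 1: the pixel survived the conversion */
    }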


Well, it's rather surprising and terrifying to see a game developer even entertaining the idea that "graphics make games" for even a minute... Ask anyone who is passionate about games what their favorites are, and I can bet they will list games that didn't have 3D graphics...


You mean like Mass Effect, Thief, Tomb Raider, GTA, Witcher, and Dragon Age: Origins? Or perhaps StarCraft II, Diablo III, Everquest, WoW, or Final Fantasy VII?

Don't get me wrong, there are lots of fun games out there which do not have 3D graphics, but there are a lot of great games out there which do. The added realism that high quality 3D offers can add quite a bit to the immersion. It won't make a bad game, or break a great game, but it will certainly enhance even an OK game.


The problem is that 3D often doesn't tend to age well, as polygon count, texture mapping, terrain sculpting, particle effects and other graphics fidelity has been consistently and rapidly improving over the past 20 years or so.

That doesn't mean that it makes the games less enjoyable, necessarily, but certain art styles are just better at conveying timelessness.

At some point we'll probably reach a certain peak level of graphics that can simply be considered "good enough" for most purposes henceforth (have we already?), but in general, investing too much in graphics for production purposes, as opposed to tech demos and research, is often shaky.


2D doesn't always age that well either; try to play Diablo II or StarCraft 1. I promise it will not look the way you remember it.

3D can age well too. Warcraft III still looks better than most modern games; it might not have dynamic lighting, shadows, etc., but the overall impression is just stunning.

Warcraft III is such a great example of how to make sure the 3D isn't working against you, from both a gameplay and a visual perspective. The camera angle is always fixed, which means they could optimize the way characters look from that particular angle, unlike Sims 3D, where you can zoom in and around and end up looking at an untextured ceiling. This also makes it much easier to design the gameplay, such as selection and control of characters, because there is only one scenario to cater for: instead of twisting the camera to select a character behind a building or mountain, you simply design the game so that characters can't be hidden behind mountains. Other 3D RTS games from that era (probably today as well) made the mistake of thinking "we haz 3d!!!1, now the player should be able to move around everywhere", which can only end with infinite bad camera angles instead of one good one.


> 2D doesn't always age that well either; try to play Diablo II or StarCraft 1. I promise it will not look the way you remember it.

One of the main issues with 2D is the fixed resolution. If you can live with that, it's not bad. I actually played D2 again a few years ago and really enjoyed it. Same for Planescape Torment.


Pixel art has problems with the fixed resolutions of modern displays (which CRTs lacked), yeah, but vector art and/or high-resolution but (relatively) sparsely-detailed/stylized 2D art can be effectively scaled to a wide range of resolutions with minimal artifacts.


> The problem is that 3D often doesn't tend to age well

We are well past the point where it explicitly ages badly, though. HL2 still looks pretty good IMO, and that is over a decade old. A few years ago you couldn't say that about many decade-old games.

No 3D game released now that doesn't look terrible to start with is going to look hideous against whatever is modern in a decade's time. It won't look as good, of course; there will be various bits of flair/realism/detail/whatever missing somewhere, and the uncanny valley and similar effects will be more noticeable, but it'll be perfectly playable, and if the game is good, the lesser graphics won't make it any less good.

Going back to HL2, if you compare it to BioShock Infinite you can see the improvements in technology over the years, but it still just looks less refined, not bad, and the differences don't affect immersion much (I would argue that BSI is more immersive, but that is mainly due to the design improvements (the whole lot: art, music, story, ...) and not really due to the graphical improvements on their own).


> The problem is that 3D often doesn't tend to age well, as polygon count, texture mapping, terrain sculpting, particle effects and other graphics fidelity has been consistently and rapidly improving over the past 20 years or so.

This hasn't stopped Mario 3D, Zelda: Ocarina of Time, or Final Fantasy VII from becoming classics, despite being some of the first games in the 3D space.

> That doesn't mean that it makes the games less enjoyable, necessarily, but certain art styles are just better at conveying timelessness.

However, that art style is not pixel art. I've gone back and tried to play some of the pixelated stars from the past - they don't age well on modern display devices. From their 40pt fonts, to their pixels which actually look worse on LCD monitors than they did on our old CRT TVs.

They are still classics, and frequently enjoyable, but in many ways they require a bit of context to understand why they were such classics.


>However, that art style is not pixel art. I've gone back and tried to play some of the pixelated stars from the past - they don't age well on modern display devices. From their 40pt fonts, to their pixels which actually look worse on LCD monitors than they did on our old CRT TVs.

I have a stash of bulky CRT monitors and TVs for this reason. They are almost free nowadays.


Yeah, I get this. Obviously it's game-dependent (and there are exceptional examples that have aged well, as pointed out by others), but I generally find this game-breakingly noticeable when I've tried to play older 3D games. I was at a games exhibition recently which had a huge collection of playable historic games, and the one that really stood out was Tomb Raider, which was almost impossible to play; the graphics are just horrific. But more recently, I tried to play Fallout on the PS3 and gave up, mainly because of the graphics. I felt the same about Skyrim. Compared to something like L.A. Noire on the PS3, the gulf in graphics was huge. The more I think about it, it's not so much the graphics per se that age things terribly, it's how populated the world is, and how reactive it is to player actions. GTA2, for example, is still quite a lot of fun, and I found it much more playable than GTA3.


Look at DooM; it is perhaps the 3D game that has aged better than any other.


Actually, DOOM isn't pure 3D at all. The graphics are 2D sprites and your field of view is strictly limited to a 2D plane, but isomorphic projection and billboarding techniques are used to create depth.

That's why it's timeless.


Everybody uses billboarding techniques and 2d sprites and stuff. We're just a lot better at it now than before.


Again, read this: http://doom.wikia.com/wiki/Doom_rendering_engine

The original DooM engine isn't based on billboarding.


Why did you attach it to me instead of the OP who made the original claim?


This applies to every proper 3D game though (excepting VR)

> "...your field of view is strictly limited to a 2D plane, but the use of isomorphic projection and billboarding techniques are used to create depth"



> Actually, DOOM isn't pure 3D at all.

I played this awesome "pure 3D" FPS the other day.

It was called Laser Tag and it rocked.


Look, these games at best can be __good__...

But let's talk about legendary ones: the Might & Magic series, Planescape: Torment, Baldur's Gate, Day of the Tentacle, Neverwinter Nights, the Monkey Island series, Ultima Online, Warcraft 1, StarCraft 1, Diablo 1, Heroes of Might and Magic, Ultima 1/2/3, etc...

The games you are listing can be highly rated, but they will not be remembered...


I think this is your nostalgia talking. There are plenty of 3D games which have been and will be remembered. (You even list Doom as a 2D game! I am astonished.)


The original DooM engine is "false 3D", or 2.5D. Really it's a 2D game displayed in 3D. All the game mechanics and the monster AI work on 2D data. Even the maps are based on 2D BSP data. Z data is only used for rendering.

Source ports add more 3D stuff to the game engine, like collision using cylinders with height (i.e. a flying Cacodemon would not collide with you if it is flying above you), OpenGL rendering, a 3D camera, modern controls, slopes, etc. But at its core, it remains a 2D game.
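
A minimal sketch of that idea in C (not the actual Doom structures, just an illustration of why heights live on the map regions rather than on positions):

    #include <stdint.h>

    /* A map vertex is purely 2D -- the level geometry has no Z coordinate. */
    struct vertex { int16_t x, y; };

    /* Height is a property of a sector (a closed 2D region), not of a point in
     * space: one floor and one ceiling per (x, y) location, which is why rooms
     * can never be stacked on top of each other. */
    struct sector {
        int16_t floor_height;
        int16_t ceiling_height;
        /* light level, floor/ceiling textures, ... */
    };

    /* Walls are 2D line segments; their upper and lower edges come from the
     * sectors on either side. */
    struct linedef {
        struct vertex v1, v2;
        struct sector *front, *back;   /* back == NULL for a solid wall */
    };

    /* Movement and monster AI only reason about (x, y); a thing's Z is derived
     * from whatever sector it currently stands in. */
    struct thing {
        int32_t x, y;
        struct sector *sec;            /* z = sec->floor_height */
    };

    int main(void) {
        struct sector s = { .floor_height = 0, .ceiling_height = 128 };
        struct thing player = { .x = 0, .y = 0, .sec = &s };
        return player.sec->ceiling_height > player.sec->floor_height ? 0 : 1;
    }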


Ultimately the environments are 3D images being generated from a 2D dataset. How is it not 3D? A cube is a 2D square that's been extruded into another dimension. It's not 2D.


>How is it not 3D?

It's the difference between looking 3D and being 3D (well, being represented 3D in the game engine before rendering to your monitor).

Doom isn't 3D because the level maps can be squashed flat onto a 2D grid. For every (x, y) position there is only one z for you and the mobs. There aren't any situations where a bridge or ramp takes you/mobs over a lower spot you/mobs could also be at; rooms aren't stacked, etc.

That was fine, Doom pulled off its sleight-of-hand extremely well and the gameplay it offered was captivating!


Lots of 3D shapes can be squashed flat to a 2D grid. If you printed out a Doom level on a 3D printer and put it on your desk, you wouldn't claim it wasn't 3D just because you couldn't find any stacked rooms. The final product is 3D regardless of the constraints on which 3D geometries are possible.

You could argue there are no 3D graphics. Everything is transformed into a 2D screen representation.


I'm not talking about shapes, I'm talking about the LEVELS themselves. There are no DOOM levels where part of the level exists "above" or "below" another part of the level; they are all 2D maps that, through various (ingenious) engine tricks, such as elevators that teleport you to fake entering one spot and exiting another spot "above" it, visually appear 3D.

Hence my squashing analogy: DOOM levels don't overlap and can be drawn ("squashed flat") on a 2D surface.

But you can't actually occupy a different height for a given location. This is NOT real 3D. That's why DOOM is sometimes described as a 2.5D game. Quake was the ID Software game that was 3D.

If you think DOOM is a 3D game, try creating a level that contains a player accessible location above/below another player accessible location, and see how the game renders it. Or, read the DOOM rendering engine links that are posted and pay close attention to the "not a true 3D engine" paragraph.


That is a pretty arbitrary qualifier for 3D. Doom is filled with spatial obstacles that make use of the 3D geometry. For example, a missile can pass over the player. Facing a wall on top of which a monster stands, I can't necessarily pass it or see or fire at the monster. Flattening the levels would change the game and remove many of its challenges.

The format of the levels or the fact that there are no rooms over other rooms don't really factor into it.


I think we're just arguing at different levels of abstraction here. I'm arguing that a 3D environment is being generated from 2D map data, and I consider it to be meaningfully 3D.

I suppose at the extreme end, I'd have to explain why Wizardry on the Apple II is or isn't a 3D game, which would be more of a strained philosophical argument.


I'm referring more to the game logic in the original game engine. For example, there can't be bridges or rooms over other rooms, and a flying monster can't be above another monster or a player.

On the other hand, the original game engine uses 2D fixed-point math for rendering on a 386 CPU. It doesn't use an MVP matrix to render like a real 3D game does.

Take a look at: http://doom.wikia.com/wiki/Doom_rendering_engine


I guess yes, there is some nostalgia involved. BTW, I removed it; it's just that I feel it's disrespectful whenever a list like that doesn't include Doom.


I think there are fewer no-doubt classics among fully 3D games (Doom is 3D graphics built with 2D sprites, so I am actually walking back my amazement on that -- it's a tweener, not like Quake, which was rendered polygons), but that's just because they've had less time to establish themselves as classics. Final Fantasy VII, Bioshock, Resident Evil 4... I think all of those are going to be talked about as long after their release as those games you mentioned.


> Look these games at best can be __good__..

As usual, tastes are subjective. Many people consider Moby Dick to be a classic - but when compared to more modern allegorical works, it's rather dry and boring. This doesn't mean Moby Dick is not a classic, but it also doesn't mean that Moby Dick is the be-all-end-all example of an allegorical tale.

Those games are classics, but being a classic does not prevent a game from being surpassed, nor does it prevent newer games from becoming classics as well.

> Neverwinter Nights

Was in 3D. The StarCraft and Diablo games had 3D models which were rendered down to 2D sprites (and considered by many to be dramatically inferior to hand-drawn 2D sprites).

Many of these series continue to exist today, in 3 dimensions.

And despite having arguably the worst 3d graphics in the lot, Final Fantasy VII is still a favorite for many folks.


FFVII has really crappy 3D, and sometimes awkward transitions between presentation modes. Still, some of what Square pioneered in that game remains: fixed cameras, character-oriented controls, etc. (where the control "left" does different things depending on where the character is facing).

They have gone back and forth on open world vs rails or corridors, and I find the open world, like FFXII, the most enjoyable and replayable of the titles.

When FFVII was released, it was one of the first my wife really wanted to play with me. We played lots of games, but the vivid nature of the story, lots of activities, puzzles, mini-games (Oh, how we spent time on Chocobo breeding...), really made for an awesome experience!

But, an attempt to play it today is tough. The really poor visuals often get in the way, and it just reeks of old.

This is true of many 2D titles too.

One difference is how abstract the game is. An old 2D game, such as TEMPEST, or DEFENDER, actually looks and plays well today. It is what it is, and that isn't much, but the challenge is there, presentation sufficient, and it's sort of timeless.

Those games that went for some realism, or that made more identifiable characters seem to age a lot more quickly, though somebody up thread mentioned SONIC. Yeah, that one is still relevant, and is a stellar example of a timeless presentation involving simple, easily understood mechanics.


I'm surprised there's been no mention of Crysis. This was a game whose entire popularity was largely based around "unnaturally pretty, will burn a hole in your computer"; to the extent that it became a rather mainstream GPU benchmarking metric.

I'm honestly surprised there's as much of a debate over this; the breakdown always seemed rather clear to me: while there are "great games" that "serious gamers"(tm) love (I'm personally a massive SS2 fan and play it to this day), we are in the vast minority, and the Next Gen race has been almost entirely about throwing increasingly flashy presentation-oriented features at core game mechanics that are known to be successful.


In defense of Crysis, in its first half it offered an interesting open-world approach to completing objectives. Contrast that with the later installments of the series, which became increasingly corridor-like in their level design.


Legendary games with 3d graphics? Mario 64 and Ocarina of Time.


For me it was GLQuake. I was a huge fan of Quake, and it ran (barely) on my 486. My cousin had a Pentium 166 with an expensive graphics card (S3, I think). Quake rendered using OpenGL was a thing of beauty. That was my "I must upgrade my computer now" moment. I rarely play first-person perspective games nowadays, but back in the 90's Quake was my benchmark. I never really got into Half-Life; I loved Unreal, but Quake was the real classic.


People will remember whatever they connected with the most at the right time in their life.


It's just a matter of generation: I haven't played any of the games you listed, while I know every 3D AAA game mentioned before.


I disagree. Graphics add nothing to the immersion, unless perhaps on the most shallow level possible, where the entire premise of the game depends purely on the graphics.

In that case I wouldn't call it a game, but more of a tech demo or an interactive experience.


Graphics add more than nothing. At a minimum, they are an important part of UI, just like on a web page. Colors, affordances, etc. are all important to games. Games like ASCII roguelikes exchange obvious visible indicators for speed/flexibility/pixel size/complexity. Graphics set the tone of the game and have the ability to visually influence the player.

Cramming more graphic textures or perfecting water algorithms might be secondary (or tertiary) to the game, but they can still influence the gameplay directly. Take Team Fortress 2, where a lot of care was put into making each character class have a distinctive silhouette, so at a glance you can see who everyone is from a distance or from different angles - this isn't what most people talk about when they discuss "graphics," but it is the element of graphics that matters most.


The entire premise of Portal is based on the ability to use 3D rendering technology to create the effect of opening up portals between surfaces. I don't think anyone would argue that Portal is a shallow tech demo. Narbacular Drop maybe was a shallow tech demo, but that same tech-based premise, when used by a masterful game design team, created one of the best video games of all time.

Likewise, the entire premise of Minecraft depends purely on the simplicity of a voxel-based world - a technique for defining a procedurally-generatable, deformable landscape. Maybe Minecraft is only a tech demo or an interactive experience, but... it's a damned successful one.


> "Narbacular Drop maybe was a shallow tech demo, but that same tech-based premise, when used by a masterful game design team, created one of the best video games of all time."

What does the story and design of Portal have to do with graphics? Nothing, because story and design are independent of it. Would Portal be an equally great game if done with the engine used in Narbacular Drop, but with the same design and story? Yes; again, graphics have nothing to do with story and design.

So to reiterate: what made Portal great has nothing to do with how many pixels you fit on the screen or how many shadows you can render. What made Portal great was story, design, and the tech (portals).


Your parent's point is that Portal's tech (portals) is most exciting in a 3D space, which is enabled by advanced graphics.

Sure, the number of pixels you fit on screen or the lighting effects are not key to Portal being a good game, but being 3D certainly is.


You think only one of them is in 3D. That is not correct.


What? Only one of what is in 3D?


StarCraft II, like WarCraft III, is a 3D game. But the original StarCraft and WarCraft I/II are pure 2D games.


Well, not quite. StarCraft and WarCraft II used 2D sprites, but those sprites were generated from 3D models - it's what gave them so much depth for their time (and also made them look muddy compared to hand-drawn sprites).


If I recall correctly, in WC2, the entities were modeled in clay/plasticine, photographed from several (8?) distinct fixed angles, and then artists hand-painted cels over the photos. The cels were then scanned and digitized into 2d sprites. The maps were flat images with hidden metadata layers to define land and water boundaries.

I also recall that you could replace the peons' wood-chopping sounds with fart noises, which was hilarious for almost 30 whole minutes.


I think it is a great selling point. No matter how crappy the final product is, claiming new or better graphics automatically sells more units.

It is easy to critique a game after you've played it. But before you do, the only thing you can do is look. Since you can't interact with the game in any way, what is left is the visuals. And that impression sells the game. Therefore, graphics matter.


There's a thing I call Amiga Game Disease. It's a thing on many platforms, especially now, but it was prevalent on the Amiga way back when. It's when a game is pretty much a tech demo with a thin game wrapped around it. Take something like Shadow of the Beast. Beautiful to look at, and really showcased the Amiga's power. Nothing looked or sounded quite like the Amiga version of Shadow of the Beast, and the various ports to lesser systems certainly couldn't keep up.

But it was pure, frustrating, junk from a gameplay perspective. The controls were clunky and shit was always popping out of nowhere and killing you. I kept dying at the first "boss", a skeleton thing on a throne that looked like it was made of some bigger creature's jaw. What I didn't know at the time was that there was a gryphon I had to defeat by punching the crystal orb it was bouncing; defeating the gryphon would temporarily grant me the power to shoot hadoukens, and the skeleton thing was only vulnerable to the hadoukens. (If you used up your hadoukens before defeating the skeleton thing, well, sucks to be you.)

And it's just full of stuff like this. The game doesn't tell you ANYTHING about how to play it or what its goals are. Plus it was a pioneer in unskippable cutscenes: moving from one place to another -- like going in a door or something -- entailed staring at a still image while adventure-game-style flavor text scrolled by and no key or button press could dismiss it. And once you die, that's it. You have to start all the way from the beginning. The game could be cleared in half an hour -- IF you knew where everything was. I guess it got replay value by surprising you with deadly enemies and obstacles you couldn't see coming and making you start over each time a new thing bit you in the ass.

But we all remember Shadow of the Beast -- indeed Amiga users look back fondly at it -- SIMPLY BECAUSE OF ITS GRAPHICS AND SOUND.

Sword of Sodan was the same thing: clunky and repetitive, but WOW LOOK AT THOSE HUGE SPRITES.

So yes, "good graphics = good game" is a thing, and it's because of the market, not because of the execs.


Strange. I played a lot of games on the Amiga and don't count SotB among the great ones. My list of great Amiga games is:

  - Perihelion
  - P.P. Hammer
  - Blues Brothers
  - Sensible Soccer
  - Lemmings
  - Alien Breed
  - Another World
Most of those had average graphics.


I was an Amiga gamer and played Shadow of the Beast 2. I remember the graphics and audio fondly, but I don't ever remember thinking it was a great game. The situation was similar for Shadow of the Beast: an interesting setting, but the gameplay never appealed.

The Amiga had a wide variety of games, and certainly didn't cater only to people who wanted graphics over gameplay, but yes, good graphics were celebrated. I think part of the reason for that was the whole 'ahead of its time' idea: that a computer released in 1985, with a handful of underwhelming updates, could still stand its ground into the 90s. The Amiga was certainly not alone in this; it applied just as well to the Neo Geo, and the Japanese got lucky with the Sharp X68000 too.

Another part of this was the game magazine culture: it's hard to describe how a game plays, but easy to show off how a game looks. Game demos were a far better medium than magazines for showing off a game's strengths (though I had a sizeable collection of game magazines at one point, so I'm not completely against them).


Whenever I start to think graphics are important to games, I think about the differences between the Activision graphical sequels to the text-based Infocom Zork games: Return to Zork, Zork Nemesis and Zork: Grand Inquisitor.

Of these, Zork: Nemesis is by far the worst game, even though it put a great deal of effort into its graphics. It was another victim of the Myst effect, but five years later, when everyone should have known better. The plot seemed shoehorned into the set, rather than the locations built to support the gameplay.

Return and Inquisitor felt like Zorks, but Nemesis felt more like a glorified fetch quest. I don't care if the graphics hold up or not after a few years, because the game itself was not particularly fun to play. The only factor preventing me from re-playing the other two is that I can still remember the solutions to their puzzles after 18-22 years, but all I remember from Nemesis is "bring me more red pages... er... I mean pure elements".

Therefore, if graphics matter, they cannot matter more than gameplay.


Nowadays at least you can watch videos of a game on YouTube. In the old days, with the magazines, you could only see screenshots (taken with a real camera in front of the monitor/TV) and trust the author of the review.


I see your point and it makes sense, but won't you agree that a storyline or a rich description of gameplay can make the same, or perhaps an even better, impact on the potential buyer?

Of course I am assuming there is a storyline or nice gameplay...

To give an example: my favorite game of all time, Ultima Online, has at __best__ __meh__ graphics. I remember I got hooked before I even installed it, not because of the screenshots but because of the way the game magazine editor recounted his own experience of getting murdered while quietly fishing in a small town...


A good description will impact the small percentage that is already familiar with the genre or that franchise. What will the mainstream audience do: believe your words or your pictures? Games convey information through the latter. Game magazines are obsolete and no one reads long articles anymore. You have one minute to impress your audience; less if it isn't in the form of a YouTube video.


> won't you agree that a storyline or a rich description of gameplay can make the same, or perhaps an even better, impact on the potential buyer?

Does it matter if you or I agree?

http://www.vgchartz.com/yearly/2015/Global/ http://www.vgchartz.com/yearly/2014/Global/ and http://www.vgchartz.com/yearly/2013/Global/

all seem to indicate that, unless you happen to be Nintendo, pretty 3D graphics are very much necessary (but far from sufficient) if you want to sell a lot of games.

edit: Mobile and social games like Clash of Clans, Candy Crush, etc. are of course an exception. If you want to make a lot of money in games and don't want to deal with high-end 3D graphics, then that is the market you have to get into.


If there are so many exceptions (plus entire classes), is your observation that meaningful?

Many games are marketed and sold based on gimmicks and features, of which great graphics is one, and it seems to work well. Unsurprisingly, much of the video game industry has optimized for it. That doesn't mean that pretty 3D is necessary.


Ask anyone who is passionate about games about their favorites, and I can bet they will list games that didn't have 3D graphics...

Probably because they grew up with games before 3D, and our first forays into gaming are generally more memorable than later ones. Plus... nostalgia.


> Ask anyone who is passionate about games about their favorites, and I can bet they will list games that didn't have 3D graphics...

And I bet most of those won't be high-revenue games. Flashy, shallow AAA titles make stupid amounts of money.


A question for those in the know - as a programmer, I've seen a lot of democratization of the game engine side of game programming over the last few years. The price of entry to AAA-level technology has fallen so massively that now anyone can download UDK and start having a go.

I'm interested to hear of what's happened, democratization-wise, on the art and music side of game development. This is the bit that I find people don't immediately grasp. While engine technology has gotten cheaper, making assets for those engines is only getting more expensive.

So I'm interested to hear from those who are in the art field: have asset tools become more 'democratic'? Or to put it another way, are art tools moving to make art asset creation less expensive in the same way that engine tools have made the programming side less expensive?

One area I can think of off the top of my head is texturing and shaders - texture-painting tools and physically-based shaders that allow less-skilled artists (and programmer-artists) to still get aesthetically good looking results, without having to become experts in UV mapping or writing shaders.

Is there something similar for model and animation creation, etc? Is this even possible to do without making games that look 'cookie-cutter'? (My gut feeling on that last question is that it should be).


I live on the outskirts of the 3D art world and there are definitely a lot of tools to make asset creation easier. You can buy texturing/shading packs, which are usually pretty customizable. There are tools like MakeHuman for humanoids and Marvelous Designer for easy clothing/cloth sim. Maya comes with built-in hair/cloth sim now, which is supposed to be quite easy to use, as well as autorigging tools. You will probably want to know a little about 3D to make something decent, but then again you'll want to know a little about programming to make something decent in UDK.

I'm really curious about the music. I have a really hard time making music without having a composing background. I've tried a few music generators but the results have been pretty bad so far.


I'd heard of MakeHuman but not Marvelous Designer (it looks great). Think I'm going to submit an Ask HN, would love a list of these kinds of tools....


For the author's comparison of The Sims to the Facebook Flash game of The Sims -- one thing he doesn't discuss at all is that by using 3D models The Sims allows you a crazy amount of customization in creating Sims and their surroundings, compared to what's possible with the hand-drawn vector graphics of The Sims Social.

Which kind of undermines his point. The Sims Social isn't a game that made different choices in graphics, it's a different kind of game with the Sims brand thrown on it. It's not just about MOAR GRAPHICS, what you're capable of doing with graphics informs what kind of gameplay experiences you can create for players.


AAA games are like F1 cars: they are the extreme that technology will hopefully trickle down from, so that 'consumer' indie games can benefit. Without the huge AAA games pushing the limits of tech we wouldn't have as many game engines for indies to use. I hope it continues, because I love TIS-100 and The Witcher 3.


OMFG the premise of TIS-100 sounds amazing! Thanks for making me aware of it.

http://www.zachtronics.com/tis-100/


> What are the games that people play for years that only have pretty environments or another form of impressive tech to offer? I can't think of any.

Well, Final Fantasy VII frequently tops "greatest game of all time" polls, and its visuals were driven by 3D tech (and were arguably less artistically interesting than those of VI).


I think FFVII is much deeper than its graphics and storyline. It was (not quite, but almost) an early Cookie Clicker. Laughably easy, and yet you had fun anyway. It was fun to max out your materia and level your characters up even though you had long, long passed the point where the final boss was a pushover. The graphics/story/popularity/etc. allowed you to enjoy the "Cookie Clicker" without admitting it to anyone else or even to yourself.

That's just my subjective opinion of course.


The thing that defines Cookie Clicker is that it only has improve-stat-get-dopamine mechanics. All games with character progression are partly Cookie Clicker. The interesting thing about CC is its distillation of a particular mechanic.
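For anyone who hasn't played it, the distilled mechanic fits in a few lines. This is a toy sketch of the improve-stat-get-dopamine loop (the numbers are made up, not any real game's tuning):

  # The only verbs: accumulate a stat, and spend it to accumulate faster.
  cookies, rate, upgrade_cost = 0.0, 1.0, 10.0

  for tick in range(1, 101):
      cookies += rate                     # the stat improves every tick
      if cookies >= upgrade_cost:         # spending it only buys a faster rate
          cookies -= upgrade_cost
          rate *= 1.5
          upgrade_cost *= 2.0
          print(f"tick {tick}: upgraded, now earning {rate:.1f}/tick")

Everything else in an idle game is tuning those multipliers so the curve neither plateaus nor explodes.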


It's not just improve-stat-get-dopamine. It's a very fine-tuned, well-balanced implementation of improve-stat-get-dopamine. That same fine-tuned balance was evident in FFVII, in my opinion. It's subtle (1), but just think what a fine line it is between "grinding hits a plateau" (2) on one side and "grinding makes you grow out of control" (3) on the other.

(1) "Good design is obvious; great design is transparent"

(2) E.g., FFIV

(3) E.g., FFVI


"Because of the ongoing pursuit of Hollywood"

A better analogy would be: obviously the best Hollywood movie is the one with the most and fanciest special effects.

Unfortunately, 95% of the population doesn't want to watch a special effects demo reel, not even for free. And the analogy carries over to the same problem with graphics: true, the demoscene subculture is fascinated with graphics and it's very technologically impressive, but 95% of the population responds with "meh".

(Edited to add: the truly unfortunate part is that the 5% of the population who want special effects reels are financially successful enough to completely prevent all advancement of the art beyond the local maximum of special effects, so the 95% of the population who can't stand it are stuck, unserved by the monopoly/oligopoly. The situation with games isn't as bad, and the "casuals" are making huge piles of cash for non-3D developers; the goal of the AAA studios and their journalist hangers-on is to create and insert enough blocks in the marketplace to eliminate casual games from competition and keep "gaming" a stereotypically pure 3D WW2 FPS sequel experience. The problem is that the technology has pretty much topped out, eliminating "better graphics" as a marketing weapon.)


I think you're dramatically overestimating the number of people who don't want to see demo reels in the cinema. In our tight technical circles, sure, I agree that most people don't. But our tech circles are such a small percentage of the audience for Hollywood films. There are plenty of movies that aren't drowning in VFX released every year, yet the numbers don't lie. Avengers, Harry Potter, Avatar, Pirates of the Caribbean (see http://en.wikipedia.org/wiki/List_of_highest-grossing_films for more) style films blow the others out of the water. It's extremely clear that people want films that are knee-deep in VFX. If they didn't, they'd stop spending 1.5 billion dollars on going to see the Avengers and see something else instead.


I see truth in what you write.

However, given that I can't see a movie for less than $15, that $1.5B represents only 100M views. If you theorize that only 1B of the world's population is wealthy enough to occasionally blow $15 to watch a movie, that implies 90% of the population is uninterested in VFX demo reels, which isn't all that dramatically different from my engineering estimate, especially given that we could cherry-pick examples to "prove" either your claim or mine, or play games with $ per unit time to reach either conclusion.
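Spelled out, the back-of-the-envelope math looks like this (every number here is the assumption from the paragraph above, not data):

  # Back-of-the-envelope estimate using the figures assumed above.
  gross = 1.5e9         # ~$1.5B worldwide gross for a tentpole VFX film
  ticket = 15.0         # assumed average ticket price, in dollars
  addressable = 1.0e9   # assumed population able to spend $15 on a movie

  views = gross / ticket           # 100 million tickets
  share = views / addressable      # 0.10, i.e. roughly 10% of that population
  print(f"{views/1e6:.0f}M views, about {share:.0%} of the addressable audience")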

I completely agree that $1.5B is important business, yet it makes my point that appealing to a narrowcast local maximum of VFX fans also simultaneously means perhaps 90% of the population will be uninterested. And the point I'm trying to make isn't that 10% of the population is irrelevant, but that if someone could crack the code and disrupt the industry and, instead of narrowcasting, make something of broad appeal that perhaps half the population would be interested in, that mysterious idea would be worth about $7.5B, which is actually pretty good revenue for a startup. But it'll never happen if the entrenched oligopoly isn't disrupted.

I honestly have no idea how to exploit the market, but "obviously" there's a huge underserved market with possibly staggering revenue for some future startup that can figure out how to make movies that appeal to more than about 10% of the population. Piles of money are sitting out there, waiting to be harvested...

There are similar analogies and financials in pop music. Entrenched industries have pretty much figured out how to sell simple, mass-produced, formulaic music to each generation's (each year's?) teens. There seems no biological or psychological reason a better industry competitor couldn't sell 3 to 7 times as much if they could appeal to more than young kids. Eventually someone will crack that startup opportunity and make mid nine figures.

Finally, ditto, kind of, with sports. "Everybody is a baseball fan", but only about 5% of the population actually watches the World Series. If you take existing advertising revenue and multiply it by a fraction representing some magic startup pixie dust that gets maybe 50% of the population to watch, that's some serious ad revenue money; in fact it's "football" type money, LOL, which is a whole 'nother topic.


"The perfect balance of number of units with the amount of things a player's human brain could possibly track at any given moment was completely lost on me."

The human brain can actually abstract things away and start thinking in groups rather than in individual units, regardless of whether the game mechanics allow the player to treat them as such or not. That unit limit was something I did not particularly enjoy in those games (and I appreciated games like Command & Conquer in this regard instead). I agree that there is more than technical features that make or break a game, but the unit limit in RT tactics games is a flaw, and it's not a technical one.

For those who argue that the unit limit encouraged unit-quality over unit-quantity thinking, I concede. It's a valid point that perhaps contributed to the success of the *Craft series. That feature could, however, have been left as an overridable game parameter (set on by default, if you will).


I recently started playing Skyrim - I know, I'm late to the party - and I'm really taken by some of the graphics and gameplay. There aren't too many games I can really stick with (Just Cause, Bioshock, and Portal are it), so it's nice to find a game I can ACTUALLY play. I tried Deus Ex; good game, but it got too complicated, and then I tried picking it up months later and simply couldn't remember anything.

But I've also been thinking of making a game. I know next to nothing about games, but bunches about the web. So then it became a question of what I could build with the tools I know.

Which then led me to realize: you don't need good graphics for a great game. Some games you want the graphics because they're part of the experience; others you don't need any. And still others just need some. A game just needs to be fun in the end, and there are a surprising number of ways to get there.


There's a lot of opportunity to use your web skills to create games. Consider things like Kingdom of Loathing or something... definitely not "great graphics" but has a lot of fans.

Monetization opportunities seem more challenging, if you're inclined in that direction. If not, then, well, don't worry about it.


"A Dark Room" on iOS is another one worth looking at for inspiration - compelling game, and not at all about the graphics..


It's actually available in the browser, not just iOS.

It's one of the best games I've ever played, and I fully consider myself a games connoisseur.

http://adarkroom.doublespeakgames.com/


Not even talking about that. I'm talking: click a square and it changes colors, or a number counter increases; something I could conceivably build. But nothing actually cool has come up yet, just gimmicks.

But I wouldn't be opposed to tinkering with something in Unity either; that's obviously a WHOLE other ball of wax.

and for anyone else who's curious: https://www.kingdomofloathing.com


I thought this was obvious... I've played plenty of great-looking but terrible games in my life, it's not that hard to put it together.

On another note, it's nice that indie games are starting to break down barriers by NOT pushing their games to extreme technical limits and focusing on gameplay. AAA games could learn from them.


Heh... I'll take this opportunity to post about my beloved MUME (Multi-Users in Middle Earth) [0], a MUD that's been running for 25 years, and still the best place in the universe for intense PvP.

[0] http://www.mume.org


Lots of interesting stuff in this article.

As a tech guy who went to a game programming school, my personal theory about why technical people get hung up on engine development (the old "I'm making my own game and have spent the last year writing my own engine") is quite simple - it's just an easier problem space.

Writing an engine has a defined, knowable set of steps that you can point to if asked, "What do I need to do to write a game engine?"

On the other hand, how do you answer the question "How do I make a good game?".

So people get stuck writing engines because it's interesting and because they can feel like they're making progress on something.


Atari.

Atari pretty much spawned the video game industry, and the average game wasn't more than squares on a screen. Squares. Your game character is one giant pixel.

It was all about the gameplay, not the graphics, and sold millions.


True, but at the time those squares were the cutting edge, and cost the equivalent of about $450 today for A game. http://game-consoles.specout.com/q/9/2566/How-much-does-a-At... http://www.dollartimes.com/inflation/inflation.php?amount=98...


For Atari, most of the games were written by one person, which leads to lots of potential creativity. Also, there weren't as many established games to copy.

For an AAA game, it's a big budget. To minimize risks, you have to make a game that's similar to one that already was a bestseller.

For a while, when games were distributed in stores, the only games WERE the AAA games.

Now, with Internet distribution and better tools, it's possible for one person to make a game again.

If you work on an AAA game, you're probably working on some tiny piece of the game, and it's going to be boring like any CRUD software webdev job. However, game developers tend to work longer hours for a lower rate per hour, because games are "cool".

If you work on an indie game, you have more opportunity for creativity. But, if the game doesn't sell, you risk having no income.

If you're looking for games like the old Atari games, it's better to look at small indie games, rather than big budget AAA games.


At the time, those squares were an impressive technical achievement. Many cartridges were sold because the graphics and sound capabilities produced a new experience for many people. (I was, like, five years old at the time, so pretty much everything was a new experience for me.)


If what you care about is games for their own sake, that's as valid a perspective as any other.

My perspective is very different, because I care about games not for their own sake, but for their spin-off benefits: chips whose R&D was paid for by gamers are being used to design aircraft, discover life-saving medicines, and unravel the secrets of the universe.

From that perspective, abandoning the tech arms race in games abandons that which was valuable about the enterprise in the first place.


"For years the AAA argument for pushing the tech envelope has been that the bigger, better looking, better sounding, more detailed our worlds are, the more believable they are. But is this true?"

It is for me! I've enjoyed Far Cry more than the barren, speed-oriented environments of Quake III or Unreal (I know, I know, this one never aimed to be "real", but it's a good example as an antithesis). Now seriously, when I read this:

"In politics-heavy environments, there is nothing more effective in pursuing a personal agenda than using "facts" and numbers - even when those facts and numbers are hand-picked to support a certain story."

I cannot help but think of the cherry-picking that the article I'm reading does.


I don't think that young people today agree that this is a bad thing. I can see where someone who grew up with the older systems and older games would be nostalgic towards them (I'm turning 50 this year) and find the new to be unnecessary and gratuitous.


I'm not sure how reasonable it is for the author to mix demographics to make his point. I don't see the type of person playing The Sims Social wanting to buy a AAA game to begin with. I think the tech arms race does matter to people. It also acts as a signal of a minimum floor of quality, but I'm not sure of the specifics myself. I know I would definitely prefer HD in movies, TV, and video games. I definitely prefer movie special effects to cheap television special effects.

edit: maybe it also acts as a way to signal to people in a market full of noise (competitors?)


What else in his life is he seeing in the wrong way?


I wonder why the downvotes; it's actually quite a thought-provoking question.


No it's not, it's an insult.


Disagree! It's a brilliant comment. Should really be at the top, in my view :) - though more as food for thought than discussion.


I will try to explain, though I think it's obvious - at least, once you've read that comment (which is why I like it so much - maybe to others it was obvious beforehand, in which case I apologise for being dense).

Here is an article about this guy discarding beliefs that he's held for a long time. These are beliefs that I suppose you might call his doctrine, in that he held them without really questioning them. Well: great! Unquestioned beliefs, questioned and revisited, changed in the face of the evidence, or whatever. And he seems happy with the result.

How many similar beliefs might he hold in total?

If changing one set of beliefs for another is worth doing, what about some other set?

Would it be worth his adopting this as more active process, rather than simply waiting for events to demonstrate that some arbitrary set of beliefs are worth changing?

Is there a wider lesson here?

(Etc., etc.)


That's exactly how I interpreted the 'insulting' comment. We tend to compartmentalize ourselves at a rational level, but sometimes fail to see the 'bigger picture' of our mental patterns and the impact they have.


My thought too.

When I read the comment, I thought, "perhaps reevaluating long held beliefs leads to some new creativity or insight."


This arms race probably exists because what I want isn't a new game; I just want the same game as the last one, but looking twice as good.


One example where tech enables a more engaging (fun) gameplay is Gran Turismo. It simulates real cars on real tracks and the accuracy of driving physics of actual cars is a prime source of pleasure. Driving a different car, or different settings actually makes a difference in the way it feels. I'm still waiting for GT7 to have a reason to get a PS4.


The author points to Sims Social and League of Legends as points of reducing the technical prowess of games.

While it's true that these games are less technically complex than a hyper-realistic 3D game, they are extremely complicated in their client-server architecture.

The requirements for technical complexity have changed focus, not disappeared.


That story about trying to encourage his wife to try the PC version reminds me very much of most experiences I've had with console gamers moving to the PC, or watching them do so.

The complexity at times isn't worth it, because it loses sight of what the purpose was: to have fun.


My thought is that it's more difficult than ever to make high-tech games. There are very few games coming out now that push the limits of what's possible. It's almost like complaining there is too much water on the moon.


Good games that push the envelope are entertainment and art. They may also be genre-creating.

Katamari Damacy for PS, Dynasty Warriors for PS2, and Quake come to mind.


This debate goes back to the late 1990's.


I'm still excited for more power for games. For a game that uses the CPU correctly, see Dwarf Fortress.


You mean incorrectly? Isn't it still single-core?


You mean that game that only runs on one core?


I mean the game that uses the CPU for simulating the world, not for pushing ever more triangles to the GPU.

Didn't know it's single-core only; that's a shame.


I have a stupid question that I felt the author should have addressed. What's AAA?


Marketroid term for a game that has a huge dev team and production/marketing budget. Marketroids who bandy this term about are often under the delusion that this makes for a good game.



