The cost of games

Posted in Game talk, Jan 17, 2018

Yesterday I was in Anaheim giving a talk called “Industry Lifecycles.” It was intended to be a brief summary of the blog post of the same title, with a dash of material from my recent post on game economics.

Now, that latter post resonated quite a lot. There was lengthy discussion on more Internet forums than I can count, but it came accompanied by skepticism regarding the data and conclusions. If you recall, the post was originally replies to various comment threads on different sites, glued together into a sort of Q&A format. It wasn’t based on solid research.

As many pointed out, getting hard data on game costs is difficult. When I did my talk “Moore’s Wall” in 2005, I did some basic research using mostly publicly available data on costs, and extrapolated out an exponential curve for game costs, and warned that the trendlines looked somewhat inescapable to me. But much has changed, not least of which is the advent of at least two whole new business models in the intervening time.

So the Casual Connect talk ended up being an updated Moore’s Wall. Using industry contacts and a bunch of web research, I assembled a data set of over 250 games covering the last several decades. This post is going to show you what I found, and in rather more detail than the talk since the talk was only 25 minutes. (You can follow this link to see the full slides, but this post is really a deeper dive on the same data.)

Each game has a reported development cost which, importantly, excludes marketing spend. So this is mostly the cost of salaries and various forms of overhead such as tools. When costs were reported in currencies other than the dollar (Euro, yen, even zlotys) I went back to the year of release, and converted the cost to a dollar value using the exchange rate prevailing in December of that year. I then took all dollar values and adjusted them for inflation so that we are comparing actual cost in today’s money.
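As a rough sketch, that normalization pipeline looks something like the following. The exchange rates and CPI figures here are illustrative placeholders, not the actual tables used for the chart:

```python
# Normalize a reported dev cost to inflation-adjusted US dollars.
# All rate/index values below are illustrative assumptions.
DEC_FX_TO_USD = {            # December exchange rate for the release year
    ("EUR", 2012): 1.31,
    ("JPY", 2006): 1 / 117.0,
    ("PLN", 2015): 1 / 3.94,
}
CPI = {2006: 201.6, 2012: 229.6, 2015: 237.0, 2017: 245.1}  # approx. US CPI-U

def adjusted_cost_usd(amount, currency, release_year, target_year=2017):
    """Convert a reported cost to target-year US dollars."""
    usd = amount if currency == "USD" else amount * DEC_FX_TO_USD[(currency, release_year)]
    return usd * CPI[target_year] / CPI[release_year]

# e.g. a hypothetical 50m-euro budget reported for a 2012 release
print(round(adjusted_cost_usd(50_000_000, "EUR", 2012)))
```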

The result:

As should be immediately apparent, it’s pretty hard to read the costs, because the vast majority of games cost under $50 million US dollars to make. Outliers are AAA console and PC titles that have enormous budgets, and you have probably heard about them because costs like that tend to make the news.

The chart gets a lot easier to read if you plot it on a log scale; in this chart, each vertical box implies costs going up by a factor of ten.

The trajectory line for AAA games is very clear. You can just eyeball that the slope of the line for console and PC releases goes up 10x every ten years and has since at least 1995 or so, and possibly earlier (data points start getting sparse back there). Remember, this is already adjusted for inflation.
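That eyeball estimate is easy to sanity-check: fit a line to log10(cost) against release year and see whether the slope comes out near 0.1 per year, i.e. 10x per decade. The data points below are invented stand-ins, not the real data set:

```python
import math

# Hypothetical AAA budget data points: (year, inflation-adjusted cost in USD)
points = [(1995, 2e6), (2000, 8e6), (2005, 25e6), (2010, 70e6), (2015, 200e6)]
xs = [year for year, _ in points]
ys = [math.log10(cost) for _, cost in points]

# Ordinary least-squares slope of log10(cost) vs. year
n = len(points)
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)

print(f"growth: ~{10 ** (slope * 10):.1f}x per decade")  # ~9.7x for these stand-ins
```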

We can also clearly see the appearance of indie games and mobile games on the chart. I have far fewer data points for these, as you can see, and a truly staggering number of them are released with basically no budget whatsoever. But the vast majority of those are also done at a loss; most of the mobile figures come from games that were at least nominally successful.

I took an average of the data per year, but it only tells us so much given that the data is weighted towards AAA games, and they pull up the average so dramatically. So I wouldn’t read too much into this graph except to say that even with the sparseness of both the most recent and the oldest data points, the line is shockingly straight. I will say that a couple of the recent top-of-the-line mobile games have budgets ranging from $5m to $20m — the bottom end is not as low as people think, when doing “AAA mobile.” Even PC indie games with high polish hit multiple millions.

All in all, given reporting bias (crazy expenses are more fun to talk about), and given that exponential cost differences mean the median or “typical” game is certainly not climbing at the same rate, and given the lack of enough mobile and indie titles in the data set, this average line is certainly over-reporting for games as a whole. You may find that somewhat reassuring, especially if you’re working on a $50m AAA game right now.

On the other hand, this picture is actually far too rosy in another way: it doesn’t include any marketing costs. As a rule of thumb, you can say that an AAA game’s marketing budget is approximately equal to 75-100% of its development cost. So costs of getting an AAA game to a consumer’s hands are actually more like double. In mobile, it’s not uncommon to hear savvy shops set aside three to ten times the development budget for marketing, because the market is that crowded.

Looking closely at the data points, there is rather an upward trend to the mobile titles and the indie titles as well. This isn’t surprising, given that as markets mature production costs tend to go up. But it raises the question as to whether there is some way we can compare apples to apples and see if there are global trends. After all, costs rising is fine if revenue and audience rise to match, right? It all comes out in the wash.

So I went looking for something that would correlate. I expected something like hardware power and capability to introduce “steps” in the graph, for example, and I wasn’t seeing that. Finally, I settled on one simple thing as a proxy: bytes. I went back and, for each game, located the actual install size: the space taken up on disk (or on device) after a full install, with all sideloaded, streamed, or first-day patches applied.

Needless to say, this also had to be plotted on a log scale, because the earliest games on the chart were only a few K in size, and the latest were many gigabytes. The result was this.

So, needless to say, bytes go up. Surprisingly, they don’t tend to go up in stepwise fashion as platforms are released, even back in the midst of console wars. Early on, carts with extra memory were slipped into production midway through the lifecycles of consoles, and later on, new run-time decompression techniques enabled disks to literally just have more bytes on them. For example, the NXE update to the 360 reduced install sizes using compression techniques by up to 30%. Add in the various forms of streamed content that aren’t cached, for the many types of games that require a connection, and it’s likely that the byte count here is, unlike costs, rather under-reported.

Either way, we now have a simple way to baseline. How many dollars does it cost a developer to create a byte? We know what we want to see: costs falling. In my earlier Moore’s Wall talk, I had looked at costs and costs per byte for the window of 1985 to 2005, and had arrived at a simple conclusion (one which I repeated in several later talks such as “Age of the Dinosaurs”): game size went up by 122 times, costs rose by 22x, and therefore we got six times more efficient at creating content.

So here is the simple division of dollars and bytes, on a log scale.
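In code, the metric is just a ratio per game; the log scale only matters for plotting. The games and figures below are invented for illustration:

```python
import math

# (name, dev cost in USD, install size in bytes), all invented examples
games = [
    ("1985 cartridge", 120_000, 64 * 1024),          # 64 KB cart
    ("2005 AAA",       25_000_000, 7 * 1024**3),     # 7 GB install
    ("2017 AAA",       100_000_000, 60 * 1024**3),   # 60 GB install
]

cost_per_mb = {}
for name, cost, size_bytes in games:
    cost_per_mb[name] = cost / (size_bytes / 1024**2)
    print(f"{name}: ${cost_per_mb[name]:,.2f}/MB (log10 = {math.log10(cost_per_mb[name]):.2f})")
```

Even these made-up numbers show the shape of the story: dollars per megabyte fell by orders of magnitude from the cartridge era, then the drop slows.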

Suddenly what becomes apparent is that there’s about 10x variability in costs within a given year, most of the time. Looking at the specific data points, I can tell you that most of this can be chalked up to whether the game is content-driven or system-driven. A story-based game, an RPG, something with tons of assets, will just naturally have a higher cost. There are also some famously troubled productions in the data set; no surprise that they tend to sit towards the upper end of the range for their respective years.

The real eye-opener is that the $5m indie mobile title and the giant $100 million AAA cross-platform extravaganza cost the same to make in terms of megabytes. (They were actually off by only 3/1000ths of a penny.) That’s likely because salaries are salaries, and don’t move that much when you change segments within the industry.

More troubling to me was that eyeballing the average cost per byte, it looks like we have plateaued.

Unreal Engine 3 and Unity both launched in the 2004-5 window. I would have expected these two amazing toolchains to have hugely helped the cost per byte. Instead, it kind of looks like it went flat.

It raises the disturbing possibility that maybe standardizing on these two engines has actually blocked faster innovation on techniques that reduce cost. I don’t know what else might be contributing to the flattening of the curve. Maybe the fact that Unity and Unreal are designed around static content pipelines, and don’t do much with procedural content, affects this? Maybe this is actually the good result, and costs were going to boomerang back up? There’s no way to know. I even unrolled the yearly average and simply sorted the games by release year to see if I was seeing things, and if anything it looks flatter because it reduces the impact of those outliers.

Data complexity in games is a real thing, and it is something that players, I think, routinely hugely underestimate. This post by Steve Theodore on Quora is illustrative. In it he shows a 1997 character that took ten working days, then one from ten years later that took 35. His estimate for a character today is a hundred days. What used to be one 256×256 texture is now authored as many 4096×4096 textures, for specular maps, bump maps, displacement maps, and so on.

If we take the step back, though, the real issue here is whether we can, as developers, cover that cost. So I went back through the data set and where I could, plugged in the retail MSRP in inflation-adjusted dollars. For mobile games that were pay-once titles, I used the price; for older MMOs, I ballparked it at box cost plus six months subscription on average, and where I had actual LTV for users, I plugged that in. The result told me how much players have paid for a megabyte of game over the years. Spoiler: they’re getting a deal.
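A minimal sketch of that per-game revenue estimate, using the three rules from the text (a measured LTV wins when available; MMOs get box price plus six months of subscription; otherwise the sticker price). The function name and all figures are illustrative assumptions:

```python
def player_spend(model, msrp=0.0, sub_monthly=0.0, ltv=None):
    """Estimated lifetime payment per player, per the rules in the post."""
    if ltv is not None:           # measured lifetime value wins when available
        return ltv
    if model == "mmo":            # box cost plus six months of subscription
        return msrp + 6 * sub_monthly
    return msrp                   # pay-once title

# e.g. a hypothetical $50 MMO box with a $15/month sub and a 30 GB install
revenue_per_mb = player_spend("mmo", msrp=50, sub_monthly=15) / (30 * 1024)
print(f"players paid ~${revenue_per_mb:.4f} per MB")
```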

“Wait!” you might say. “We don’t pay for megabytes! We pay for fun! We pay for gameplay! Not raw install! We pay for value!!!” Yeah, yeah. But in practice, development costs are correlated with bytes, not Metacritic, I think (no graph for that, but it was an easy eyeball test, plus it makes obvious sense — a bad big game still costs).

Lots of people have made the observation that in terms of raw purchasing power, players pay around half of what they used to in the 80s. You can thank our old friend inflation; I particularly like the chart here showing the effect. Well, in terms of bytes, it’s way more than half.

What are the games that poke out at having high revenues per byte? They are “evergreen” games that rely strongly on

  • community
  • user-created content
  • player skill (sports-like)

Unsurprisingly, most of the high data points are MMOs and service-based online games. They’re probably not as high as they look, since these are also the games most likely to rely on streaming content — but for MMOs I did try to compensate by using the total space on disk after play, so any streaming caches are included.

The kicker on this is that this hugely underreports the fall in game prices because whole segments of the industry give away the games now. Those free to play games are still delivering that many bytes to users, who just don’t pay. And yes, some whales then pay enough to cover the free players. But for the resultant data point to be equivalent to the cost per byte of an AAA game of the same size, you would need every player to have a $60 lifetime spend in the game. On average. Needless to say, free to play games do not tend to hit a $60 average for every player who enters the game (some do, in Asia especially, believe it or not).

That’s not even mentioning other aspects of downward price pressure, such as discounts over time, bundles, or Steam sales.

Now, I’ll be totally upfront here — I don’t have nearly enough data points on costs, install sizes, and typical revenues for mobile games. So this is all sort of speculative at this point. But I don’t like the shape of this curve, especially when I compare it to the other curve, on developer costs.

These two lines are separating, as you can see. Worse, this is a log scale, so they are separating faster every year. This is a classic “make it up in volume” scenario, you see. We can afford, as an industry, for players to pay less and less as long as we can sell to more and more players.

But… at least in developed countries, we are actually close to market saturation. There is a term, “total addressable market,” which means “everyone you can actually sell to.” We crossed the “50% of people are gamers” line almost eight years ago. It’s also a well-known basic rule of marketing that users who are farther away from your core audience cost more to acquire — in other words, the farther into the world’s population we go, the more marketing money we have to spend. And remember, marketing money isn’t in these charts.

On current trendlines, here are some naive forecasts generated by the simple expedient of overlaying a ruler on my monitor:

The first forecast is that at this rate, the average game will be free in about ten more years. And given that the dataset tilts towards AAA, yeah, I mean the average AAA game. Some games will be paying you to play them. Lest this seem crazy, that’s actually already the case for any free to play game we currently consider a flop that doesn’t make back its money; we paid dev and marketing cost, you played, and we didn’t cover the costs.

The second forecast is that the way we’re going, top end AAAA productions will drag the average cost of AAA into the stratosphere. We’re talking one terabyte games that cost $250m to develop, by the early 2020s.
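The ruler-on-the-monitor forecast can be written down as a trivial exponential projection. The trend parameters here are assumptions chosen to mirror the eyeballed lines, not values fitted to the real data:

```python
def project(value_at_2017, tenfold_years, year):
    """Project a quantity that grows 10x every `tenfold_years` years."""
    return value_at_2017 * 10 ** ((year - 2017) / tenfold_years)

# Assume top-end AAA dev cost is ~$100m in 2017 and grows 10x per decade.
print(f"2022 top-end cost: ${project(100e6, 10, 2022):,.0f}")  # ~ $316m
```

Extending the same line far enough gives the terabyte-scale, multi-hundred-million-dollar productions described above; the point is how unforgiving a straight line on a log chart is, not the precision of any one number.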

We need to remember that a lot of this is simply the price of advancing technology. As long as technology advances exponentially, so will costs, especially if we keep using it naively.

And by naively, I mean, focusing on pixels.

Because there are some things that may ameliorate this curve. None of them are easy. In fact, most of them have not been executed consistently and effectively over the history of gaming, and we’re outright not actually that good at them. But specific outlier games have proven that these things can work and break these curves. The thing they all have in common is that they de-emphasize bytes in favor of other types of content.

Strong community drives retention, and retention drives revenue. Community is probably the most accessible of these for developers to aggressively pursue, and even so it’s not cheap and it’s not at all easy. I estimate the typical studio learning curve on doing this to be around 3-5 years of culture change.

Designing for systemic content rather than static content. This is bad news for a lot of games that I love. My absolute favorite game of last year was What Remains of Edith Finch. I thought it was a 10/10 masterpiece marrying the narrative and systemic arts. And with my business hat on, I wonder if in ten years we will see static content games like it as viable.

Focus more on multiplayer, since players are effectively content for one another. See the “community” bucket for the difficulties here.

Shift our F2P emphasis, which currently depends on trickling content and upselling it. That content load is exactly what may kill us.

We could also embrace users generating those bytes in various fashions. UGC, using player models, customization, whatever.

Algorithmic and procedural approaches need to become dramatically more widespread. Fortunately, the academic community is way ahead of you on this one, and there are already academic papers out there on generating entire games with code. Yeah, over the long haul, that may render you the developer obsolete, but at least publishers will live on and raise a glass to your memory as they feed your brains into the training data set for their neural net designer AIs.

Speaking of which, your servers are horrendously inefficient. Speaking as an old MMO guy, you are probably vastly underutilizing CPU simply because of libraries, containers, VMs, virtualization, and hugely inefficient web stack stuff. Try pretending that you need to host 5,000 instances of your online match-3 RPG on a Pentium box from 1999. It can be done. It might bend the curve.

Raising prices is the most obvious. Nobody wants to do this. It will probably happen anyway.

The other, similar, thing to do is to make fewer games.

To the players out there: I know none of the above is stuff you necessarily want to hear. Trust me, a lot of it is not stuff developers want to hear either. If you want to preserve the games you love, you can help by not pirating, by supporting developers, by not tearing them down on social media and calling them inept greedy bastards, and most of all by just understanding the landscape.

And if you are a developer, the best advice I can give you is this… this world isn’t fully here now, but the trends are pretty dramatic in my opinion. So you should do some skill-building while you can.

  • Think of the whole industry as a mature market. We’re running out of platform shifts that reset costs.
  • Get good at systemic design, design for retention, design for community. Basically, think like an MMO developer. Yeah, that means designing everything as games as a service.
  • Embrace procedurality.
  • But also embrace brand-building and marketing, because you ain’t gonna survive without it. This market is going to keep getting more crowded.

And frankly, I think individual contributors need to start finding ways to get on-going revenue from older games. Because that world is also one where individual contributors become more and more interchangeable.

Now — it may well be that this data set is utterly inadequate. I invite more data points (submitted anonymously), especially from indie, free to play, and mobile. I’d need game name or unique identifier (so I can de-dupe), total development cost excluding marketing, year of release, and platform. I’d like total size of install or data generated and delivered to player as well.

Of course, this would be better if some web wizard built a website that supported anonymous submission of these data points as an industry service, and generated these graphs on the fly. Because this is not an issue that should pit developer against publisher, publisher against player. This is about the sustainability of the hobby we all love and that pays bills, keeps us sane, and sometimes drives us a little crazy.

To be clear: I would love it if these graphs were wrong.

  32 Responses to “The cost of games”

  1. It’s ridiculous to talk about costs without talking about growth of the market.

    There are no mentions of or comparisons with profits here.
    There are no mentions of or comparisons with the size of the market and how many people are playing games.

    Just one example of rising profits:

    This would be like talking about hollywood and the rising costs and budgets of films, and not talking about how much money these films are bringing in.
    Really not all that relevant to the current conversation about microtransactions/pricing etc. without looking at that part of the picture.

  2. The problem I find with your article is that it ignores a huge factor and an absolutely essential data point, glaringly missing from your diagnosis: the population size of the potential gaming market.

    The cost of games didn’t spiral out of control because of any reason other than developers, studios and investors seeing a viable route to generating profits. Their dogged pursuit of the almighty dollar has created an escalating arms race in the industry. I see studios pouring money into marketing, when they should have poured it into the games themselves. It’s why we get multi-million dollar flops (Wildstar, Dead Space 3, and The Sims Online are examples of a bridge too far).

    I don’t understand why AAA games even exist anymore. Indie games are able to produce incredibly creative, inventive and entertaining titles without 50 million dollar budgets. No one’s forcing these studios to make one big game instead of 10 smaller ones. In fact, the approach of 10 smaller ones would probably usher in a whole new generation of interesting IP and not a culture of Call of Duty 14 vs Final Fantasy 27. Yet, no, the industry stagnates and blames superstars like World of Warcraft for destroying the minds, the genre, and most importantly the wallets of millions of gamers.

    I’m not even sure what your section on “byte” size is even proving. Modern computing power has increased by magnitudes in the time span you covered, so it stands to reason that yea… more data would follow that curve. Storage is cheap, I fail to see how this is relevant. is a good reference for why this argument falls flat to me. The cost of a gigabyte of storage went from ~$250,000 in 1985 to $.03 in 2014. Relevance? If anything developers can better visualize (don’t forget AUDIO and ACTUAL music, not bleeps and boops!) their vision for a title with modern technology.

    I’ve seen this argument coming for some time. The idea that games keep getting more expensive to make (unsustainably so) has been circulating, as you noted. Everyone uses these articles as evidence that a.) their favorite game’s monetization scheme is appropriate and not manipulative and devious (insert your favorite loot boxes/gambling argument here) or b.) developers need to raise the price of their games, or worse they write off indie titles as small-time and mock them for their tiny budgets.

    The cost of games has gone up because studios see an opportunity.
    The cost of games has gone up, because the audience for their product is factors of 100? 1000? 10000? more today than it was in 1985.
    The cost of games has gone up, because corporations demand success and market domination. Investors on Wall Street must be satiated.

    I don’t buy any of it. No one has a gun to anyone’s head telling them to spend $1 billion on a game. Your data points on games are probably missing out on how many indie titles you can buy on Steam right now. I’m guessing Cuphead didn’t eke across the finish line with $20 million toward it.

    I find myself totally skeptical of an industry that is setting the groundwork right now for justifying the increase of costs, manipulative monetization schemes, and any other form of deceptive or flat-out manipulation of facts to increase their bottom line. If you don’t think this article will be used to justify all manner of shitty decisions, let me tell you about Battlefront 2’s lootbox system.

  3. Excellent post, Raph. Unfortunately, I’m not readily able to poke any holes in your data or reasoning. I’ve been seeing the writing on the wall regarding content-driven games, systemic design, and on-going revenue for years. The “content treadmill” is a very bad place to be, and it’s where a whole lot of games have been heading — in part, I suspect because systemic design seems more difficult and less certain than linear, narrative-driven design (just as F2P once seemed more difficult and less certain than retail pricing).

    I know you’ve long been an advocate of systemic, procedural design as well. With games ranging from FTL to Unexplored (among many others) showing how well this kind of design can perform in the real world, I have to think there’s going to be a big move, maybe a big split, between AAAA games with hand-crafted normal maps and linear content-driven gameplay, and those with more procedural, systemic design. My bet is clearly on the latter as the only thing that’s going to be viable for the vast majority of developers.

  4. Excellent analysis! I love the “pay people to play your games” bit. Damn straight that’s happening in mobile.

  5. Incredible article! I just wanted to ask: if I were looking to apply an economics degree to a career in the video game industry, what steps would you recommend, and how would you go about translating such skills into an industry position? It’s a fairly broad question, I know, but this analysis really inspired me. Thanks!

  6. Interesting analysis – always good to get into the weeds of this topic. However, the things that get (conveniently) lost in these discussions are a) the fact that wage growth has not kept up with inflation, so the ‘games are cheaper than 20 years ago’ argument doesn’t hold up as well, and b) the cost of games across different territories: a lot of emphasis is placed upon the $60 USD price point while in many other countries that price is significantly higher relative to the USD price.

    It would be interesting to see an in-depth analysis that takes those factors in to account.

  7. “If you want to preserve the games you love, you can help by not pirating,”

    Translation: If you want to hurt a publisher’s bottom line, rent a seedbox instead of buying the newest AAA garbage and push those upload ratios to the moon. If we’re lucky, they could flounder and leave the market and then someone who isn’t as far up their own ass might have a chance at the IP previously being dragged through the mud.


  8. Translation: If you want to hurt a publisher’s bottom line, rent a seedbox instead of buying the newest AAA garbage and push those upload ratios to the moon. If we’re lucky, they could flounder and leave the market and then someone who isn’t as far up their own ass might have a chance at the IP previously being dragged through the mud.

    What’s wrong with instead supporting the games you like and not the ones you don’t? 😛 This sort of thing helps no one.


  9. However, the things that get (conveniently) lost in these discussions are a) the fact that wage growth has not kept up with inflation, so the ‘games are cheaper than 20 years ago’ argument doesn’t hold up as well, and b) the cost of games across different territories: a lot of emphasis is placed upon the $60 USD price point while in many other countries that price is significantly higher relative to the USD price.

    Yeah, wage growth (and inflation applied to it) is definitely true. But “an individual at median wage can afford fewer games” and “players pay less for games” can both be true at the same time, of course; one’s based on percentage of wages, the other isn’t.

    I didn’t try tackling the price across territories, that was way too big a project to try! I agree it’d be very interesting.

  10. if I were looking to apply an Economics degree to a career in the video game industry; what steps would you recommend and how would you go about translating such skills into an industry position?

    Look for “data analyst” jobs, that’s probably the entry point.

  11. I think I replied to some of the specific folks talking about market growth on twitter, but just to recap very briefly:

    – Yes, audience growth has made a huge difference. Likely all that has made these curves possible at all!

    – I am not at all saying that the industry isn’t profitable. It sure is.

    – the questions longer term are around how long we can sustain the same pace of audience growth — these curves demand exponential growth.

    – there is also market competition to think about — more games coming means each one gets a smaller slice of the pie unless the pie grows at the same pace as game releases do. This, I am positive, is not happening, but we’d need to get more data.

  12. Great post, Raph. The conjecture about Unity and Unreal forcing that plateauing of game costs is interesting. Would it be possible for you to segment your data by engine usage to try to characterize that further? It’d be fascinating to see if there’s any cost or cost-per-byte correlation to the usage of a specific engine.

  13. Excellent article! Thank you for presenting this.

  14. When you mention price per megabyte or the cost per player, you have to take into account the increase in graphical quality. Whether it’s the quality (pixel density) of in-game objects, terrain, or environments, those things will inevitably go up — you can compress a bitmap only so much.

    For the majority of the 1980s, we viewed displays in a QVGA resolution (320×240). Nowadays, 1080p is the universal standard with graphic quality reaching up to 4K visuals. Let’s do a quick calculation: (2048 pixels / 160 pixels) × (1536 pixels / 120 pixels) equals almost 164x the resolution. It is obvious, then, why the size of games increases at such a progressive scale. I am unsure of the future of the display industry and future standards.

    In my honest opinion, instead of comparing the games alone, you should plot the growth of games alongside the growth of typical PC storage capacity and the prices of that storage. Perhaps the data may reveal something.

    We should be more worried about the future performance of games as we see the expansion of AAA titles that concentrate on graphical realism and quality. Nowadays, games truly are stunning when run at maximum graphical capability. Be optimistic about the future of gaming! Everyone is able to play genres they enjoy; we have stunning 3D shooters and open-world titles as well as rich gameplay in modern-day side-scrollers.

    Thanks for reading my reply.

  15. First, having done some graphs myself, I know how much work is behind making these. I sincerely thank you for putting in the effort to make such an analysis for free. What can I add that doesn’t count under “random opinion I made in 2 minutes”?

    First, “Some games will be paying you to play them” is in the wrong tense. The correct tense is “Some games are paying you to play them.” I bought Playerunknown’s Battlegrounds for $30 and sold a bunch of cosmetic lootboxes for $60. Not RMT; on the official Steam marketplace. Some people got rich by playing CS:GO (without streaming or RMT or anything, just by playing).

    Secondly, the byte explosion, and even more the marketing explosion, shows that developers aren’t making NEW games; they make “more of the same.” Diablo 3 wasn’t a different game from Diablo 2: it offered the same gameplay, just with nicer graphics (and a flopped marketplace). PUBG is the first genuinely new thing I’ve seen in the FPS market; everything else offered the very same gameplay as Duke Nukem 3D (which was new in vertical movement over Doom), just with nicer graphics.

    Because of that, I’m unable to comply with your request to not call devs inept greedy bastards, because if someone sets out to make a game with no more vision than “let’s remake Duke, Command & Conquer, Diablo or Everquest with nicer graphics and sell it with an extensive marketing campaign,” I don’t know what else I can call him. To be more than a vulture feasting on this once-great industry, one should design NEW GAMEPLAY. Something that would be new even if it had to be made with graphics assets stolen from pre-2005 games. PUBG would be a new game if it used nothing but Duke Nukem 3D assets. EVE Online would be a unique game if it used Wing Commander assets. But Halo using Quake assets would be just another Quake map.

  16. I would argue the cost of a game does not increase with MB. The opposite is often true because, as any developer would agree, the fewer MB for the same functionality, the higher the price… It is easier to just throw a lot of code in to get something built than to simplify and reduce code size (and thus risk). I also agree that with today’s graphics resolutions, game graphic files tend to be much larger than years/decades ago, so that must be compensated for.

  17. “To be more than a vulture feasting on this once great industry, one should design NEW GAMEPLAY. “

    May I refer you to my post “every genre is only one game”? Actual creation of new mechanics is very rare.

    New gameplay doesn’t need a completely new genre, just significantly different GAME RULES (not different content). PUBG is still fundamentally an FPS (the minute-to-minute gameplay is the same as in Quake or Call of Duty), but the half-an-hour ruleset is different (everyone vs everyone instead of team A vs team B). Similarly, EVE Online isn’t different from WoW in the minute-to-minute gameplay (you mine resources or grind mobs with tab-targeting and spells on your castbar), but it’s different in the ability to engage in nonconsensual PvP and loot player gear. Any EVE player would testify that the “EVE feeling” is fundamentally different from the WoW feeling, and EVE has shown surprising (even to me) persistence because of that.

    A game is unique by its ruleset and not by its graphics. One could make a fundamentally new game using nothing but WoW assets and WoW minute-to-minute gameplay. But they aren’t trying. They throw $100M on making the SAME DAMN GAME with more bytes.

    I’m sorry, but I have zero respect for anyone who is doing that.

  19. I agree a game is unique by its ruleset… my point was just that it’s a sliding scale. Pure, outright clones are also fairly rare. Halo plays vastly differently from Quake, but you feel it’s too similar based on your earlier post. WoW plays vastly differently from EQ, yet to me it was too similar to grab me (and actually, EQ was too similar to DikuMUDs to grab me).

  20. Great post, Raph – thanks for taking the time!
    Especially the ‘design systems, not content’ is a sentiment I’d fully subscribe to. I don’t think it negates the problem stated, considering the base costs of a AAA title, but it certainly helps longevity in any GaaS model.
    However you are elaborating on one of the most important points of our industry, which unfortunately most people (both industry & consumers) try to ignore.

    There is a strong mismatching trend in what players are willing to pay for a game and what they expect from it. Unfortunately lots of the typical comments here or on gaming websites are proving my point.

    People cite indie games or the Witcher series as examples of games that deliver great value on a low project budget, while being completely oblivious to the fact that a majority of these games are made under inhuman conditions: way below a fair wage or no wage at all, at the risk of one’s own life savings (or the whole family’s), and with ungodly amounts of overtime. Even worse, they only look at the successful ones, forgetting the thousands of indies and small studios that did not succeed and lost everything.

    People forget that a game is more than just its design and that every project has only a limited budget for innovation. There is little design innovation across the Uncharted series, and still I can enjoy every second of the experience. There are so many amazing people working on these projects, pouring in their passion and experience, working 80- to 100-hour weeks for a result that is so much bigger than the sum of its parts.
    “Everyone on the project is a vulture if not everything is new?” That’s not how art works. How many downright new paintings or statues do you see every year? Art is inspired by what already exists. The more of something that already exists, the harder it is to come up with something new. Plus, innovation is risky, and people are not used to it. There are enough examples of games that utterly failed when pioneering a mechanic that was later adopted by the industry. “Who would ever want to play a shooter with two thumbsticks,” right?

    So yes, we have rising costs of living, rising project costs, and rising marketing costs, but a constantly dropping price anchor for the average game. Acquiring more customers just drives marketing costs even higher. It’s hard not to acknowledge the delta here.

    As long as consumers spend more in F2P, and as long as retail becomes more and more of a price-dumping market, where even a 70% Steam sale is still considered too much, more and more companies will naturally shift to F2P. Raph’s prediction that a majority of games will be free within a decade should not surprise anyone.

    People who believe the EA backlash will have a positive outcome for the retail market are overlooking that there are two lessons EA’s leadership could draw from negative feedback on a retail title with IAPs, and going full F2P next time is one of them. When you run a company employing thousands of people, that is not a difficult decision.

  21. If the industry can survive only by using manipulative, near-fraudulent monetization strategies, maybe it shouldn’t exist.

  22. @Raph: no, it’s not a sliding scale; whether two games are different is a yes-or-no question. For two games not to be clones, you have to be able to cite a game RULE (not content) that differs between them. “Rule” is not trivial to define, but we can settle on something like “it could be implemented in a day, but the devs choose not to,” while “content” is something the devs always want to include but may not have the budget to implement.

    For example, EVE and WoW differ in gear destruction and looting upon both PvE and PvP death. Changing this would be trivial in either game: one could implement ship respawn on death in EVE, or looting the gear from someone’s corpse in WoW, with an hour or two of coding. But the devs would never do that, since it would fundamentally change their game.

    I do agree with you that WoW is an EQ clone and that EQ is itself a clone of some early MUD, differing only in content (better graphics, better lore). But can you tell me how Halo is different from Quake (and Quake from Duke Nukem)?

  23. Gevlon,

    First of all, I wonder again whether you read the older articles I pointed at; one of them uses examples very much like what you describe. They are here:

    I ask because the question of how much a rule changes a game is something I’ve spent a decent amount of time on, from a formal point of view. “Implemented in a day but devs choose not to” strikes me as a terrible way to define it, I’m afraid.

    My second observation is that adding full loot to WoW or ship respawn to EVE would certainly take a minimum of many days, not just because of the code (which I agree is pretty small) but because of all the design considerations that go with it. I know, because I speak from experience here. “Trivial” is somewhat true for the basics of the rule change, but the impact is enormous.

    As far as original Halo vs. original Quake, obviously they are variants of the core FPS game. Halo added vehicles; that’s a basic and huge rule change. Halo also migrated to controllers, which fundamentally changed the haptics and control scheme, something that prior to Halo was widely believed impossible to do well, with the sole exception of Goldeneye 64 kinda pulling it off. Both, of course, owe much to predecessors such as Marathon and, of course, Doom.

  24. Artem, the things I suggested are neither manipulative, nor fraudulent, by nature.

  25. @Raph

    One data set I don’t see is what the initial cost was projected to be, under the assumption that developers still use “design documents” when pitching games for funding. If a game is given an initial budget based on a design document, the projected costs should be relatively accurate in terms of the economics in place at the time of the proposal. Have you seen any data that shows, or supports, the notion that industry-specific demons such as “feature creep” and other inept design decisions might be behind the rising costs of games? If a game starts out with a budget of $85 million based on 2015 data, and the development cycle is 3–4 years, surely the devs/pubs know that development and marketing costs will rise over those 3–4 years? But what are the industry norms for over-budget work? At some point you have to rob Peter to pay Paul, and money that could have gone to another game is instead eaten by a grue.

  26. I am talking about the game industry overall, not about your article directly. I should have said that in my first comment; my bad.

  27. NoGuff, developers do use design documents, but design docs are notoriously bad for estimating development costs. In fact, within a larger publisher it may well come in the other order — a team may be told they get to make a game with a particular budget and of a certain genre, and they work from there, rather than from the idea to the budget.

    There are many issues with design documents, not least their inability to capture things like “game feel,” which requires iteration on controls. Often a chunk of a game’s development is outright R&D, which as we know is most successful when it’s not tightly timebound. Feature creep is a well-known bugaboo, but hardly games-specific… it happens in other forms of software too.

  28. I agree with the methodology here: each game’s reported development cost excludes marketing spend, so it is mostly salaries and overhead such as tools, with non-dollar costs (Euro, yen, even zlotys) converted at the exchange rate prevailing in December of the release year, and all figures then adjusted for inflation so we are comparing costs in today’s money.
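    The normalization procedure being agreed with here can be sketched as a small routine. The exchange rates and CPI figures below are illustrative placeholders, not the actual values used in the article’s data set:

    ```python
    # Sketch of the cost-normalization pipeline: convert a reported budget
    # to US dollars at the December exchange rate of the release year,
    # then adjust to base-year dollars using CPI. All table values here
    # are hypothetical stand-ins for the real reference data.

    DEC_USD_RATE = {("EUR", 2015): 1.09, ("JPY", 2007): 1 / 113.0, ("PLN", 2015): 0.26}
    CPI = {2007: 207.3, 2015: 237.0, 2017: 245.1}  # approximate US CPI-U values
    BASE_YEAR = 2017

    def normalized_cost(amount, currency, year):
        """Convert a reported cost to inflation-adjusted base-year dollars."""
        usd = amount if currency == "USD" else amount * DEC_USD_RATE[(currency, year)]
        return usd * CPI[BASE_YEAR] / CPI[year]

    # e.g. a hypothetical 81M EUR budget reported in 2015:
    print(round(normalized_cost(81e6, "EUR", 2015)))
    ```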

  29. @Raph: I read your “every genre is just one game” long ago, then read it again now. And thanks to that, I’ve cracked the “what is a clone, what is original” question:

  30. This is good news to me.
    Things have settled into stagnation, and it’s past time for change and better gaming.
    I always expected this to become apparent and to happen sooner. But it didn’t, so now it may be forced down their damned throats.
    Or maybe they’ll be obstinate and simply choke themselves off, because I don’t have much faith in the leadership of the industry, who simply don’t seem to know how to make the advances you describe. Especially in multiplayer and its difficulties, which to my mind is the key ingredient.
    If all else fails, sooner or later some smaller group might show up and become the industry’s new golden child.

  31. Did you do any formal stat tests on this? The data looks awfully dirty based on the graphs posted here. It’s a surprising conclusion that cost per megabyte is the same regardless of budget, and I’m wondering what the error factor is.

  32. The data IS incredibly dirty, and the spread (even on cost per megabyte) is already an order of magnitude within a given year. So I didn’t bother doing formal stat tests on it.

    That said, it’s not surprising that cost per megabyte is pretty similar; the primary driver is salaries.
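    For what it’s worth, the “spread of an order of magnitude within a given year” observation is easy to check on any such data set. A minimal sketch, using made-up cost-per-megabyte figures rather than the actual data:

    ```python
    from collections import defaultdict

    # Hypothetical (year, cost-per-megabyte in dollars) pairs standing in
    # for the real data set of 250+ games.
    samples = [(2015, 120.0), (2015, 950.0), (2015, 1400.0),
               (2016, 200.0), (2016, 2100.0)]

    by_year = defaultdict(list)
    for year, cost_per_mb in samples:
        by_year[year].append(cost_per_mb)

    # Spread = max/min ratio within each year; a ratio near 10x or more
    # means the values vary by an order of magnitude in that year.
    for year in sorted(by_year):
        vals = by_year[year]
        print(year, round(max(vals) / min(vals), 1))
    ```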
