Three Ways to Help Validate Game Ideas
For much of my career I’ve needed to make calls on the commercial potential of games: picking winners in the mobile Java era, consulting (usually to help rescue a failing project), and now especially in my role as Chief Strategy Officer of a living games publisher.
In this article I’m going to try to share the way that I look at games now, including my own efforts in design, as well as when trying to identify and prepare projects for launch. For full disclosure, my approach to this is biased towards Living Experiences, meaning those games on mobile, PC or console which aim to immerse players in ongoing active play over months or years.
There are three different perspectives I like to take:
- Direction: Why, What and for Whom?
- Design: Mechanic/Context/Metagame
- Data: Forecasts, Testing and Performance
Direction (Why, What and for Whom?)
When looking at a game, the first thing I want to understand is what provides the direction that underpins the concept and drives its (usually commercial) purpose.
This starts with trying to understand why this game is being made. This is as much about appreciating what motivates the developer as about how the game is expected to attract, retain, and convert players (to spending). Are the developers trying to create a genre-busting, disruptive innovation that will blow my mind? Or are they trying to recapture the way Mario made them feel twenty years earlier? Some teams are motivated by peer recognition, some by cold hard cash, others by their Metacritic rating.
I also need to understand what they are trying to build. This isn’t about genre or platform, although those provide an easy shorthand for what to expect and will be useful later when looking at data. I’m more interested in whether the game will be a linear one-off with a start, middle and end, or something that evolves and develops over time through ongoing updates, configurations, and content releases. I want to understand whether it’s about short-term escape from routine or about a lifestyle choice and a commitment to a community. I find these distinctions fundamental when assessing how a game might perform, and they also shine a light on other factors, such as whether the team has the skills to realise the game’s potential. They also frame other questions, such as the costs, resource requirements, planning, development, and testing strategy required. If a developer is making a simple hypercasual twitch game, the criteria will be very different than for, say, an open sandbox experience which adds new missions daily, weekly or monthly.
However, in many cases the most important question of direction is who the game is being made for. The target audience should (almost) never be the game designer. If we are validating a game, we need to understand the potential audience in enough detail to appreciate why those players will care enough to part with their precious time and hard-earned cash. We need to understand, as far as possible, what they feel, think, say, and do around games like this, and importantly where you can legitimately reach them through social media and marketing. This means understanding their interests, their characteristics, and the channels where they already are, and building reasons for them to trust your team and brand.
Design (Mechanic/Context/Metagame)
Assuming we understand the direction of the game, we next need to understand the design and how it resolves the requirements set out for the team.
I start by isolating the core Mechanic (what the player does) into a loop, which it should usually be possible to express in four steps: 1. a Start Condition that sets up 2. a Challenge for the player, followed by 3. a Resolution and ultimately 4. a Reward.
Simple enough, but we’ve found that alone doesn’t really help identify what makes the game fun. Instead, we need to find a way to look at how these steps take the player through an emotional journey. There is no one way to do this, but we find it useful to look at how the start condition begins with the player in a calm or relieved state while at the same time creating a high level of anticipation.
As the game introduces the challenge, we might expect the player to move emotionally from relief to tension, i.e. an awareness of the potential for failure. However, if the game fails to sustain the level of anticipation set up in the player’s mind at the start, this can be a cause of churn. At the resolution stage we look for the emotional journey to go from anticipation to some kind of fear of missing out. We still want there to be tension, but now combined with an awareness of the consequences of failure, i.e. missing out on a future reward or experience. The final stage, once the action is complete, concludes the pattern by revealing the reward, but even here we need to see the player’s state move back to a sense of calm or relief. This is useful as it helps us judge the potential for players to enter the ‘flow state’ made famous by Csikszentmihalyi and sought by many a game designer. Combining this with thinking about the switch between anticipation and fear of missing out helps us understand whether the game can tease players into playing again (and again). If instead the reward stage resolves the player’s emotional experience completely, where is the motivation to replay?
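The four-step loop and the emotional journey it aims to create could be sketched, purely for illustration, as a simple table in code (the step and state names follow the article; the structure itself is my own framing, not a formal model):

```python
# A sketch of the core mechanic loop and the emotional shift each step
# should create, as described above. Names follow the article; the data
# structure is illustrative only.
MECHANIC_LOOP = [
    # (step, emotional state entering, emotional state leaving)
    ("Start Condition", "Relief",       "Anticipation"),
    ("Challenge",       "Relief",       "Tension"),
    ("Resolution",      "Anticipation", "Fear of Missing Out"),
    ("Reward",          "Tension",      "Relief"),
]

def describe_loop(loop):
    """Render each step alongside the emotional shift it should create."""
    return [f"{step}: {start} -> {end}" for step, start, end in loop]

for line in describe_loop(MECHANIC_LOOP):
    print(line)
```

Laying the loop out like this makes the key check above explicit: the final step returns the player to relief, but the Resolution step must leave some fear of missing out unresolved, or there is no pull to replay.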
A similar principle applies to the second layer, the Context Loop: the layer of the game design that gives the player a reason to repeat the mechanic and come back for more.
Here again we need to consider the emotional journey of the player as well as the key elements that drive the loop forward. I think about Purpose (‘why am I doing this?’) as the starting point, where the player can feel anticipation and rest/relief. As we move towards Progression (how we see the impact of our actions against that purpose), we move from relief to tension. The pattern continues as we look at Optimisation (how we play and the choices we make) and move from anticipation to fear of missing out. Finally, I look at the Narrative (even if it’s just our own playing story) providing a conclusion to the loop, moving the player from tension to relief while maintaining a sense of fear of missing out to drive the next play (or next episode).
That’s as far as many designers go, but I think that, especially for living games, this isn’t enough. We also need to consider what I call the Metagame: the social and lifestyle impact that the game has on the player. Only by understanding where the game fits into the player’s daily routine can we, in my opinion, have confidence in attaining the game’s real potential.
As with the other two layers, we are looking at key elements moving through these same emotional states. We start with the lifestyle fit (what the player is doing outside your game), which means considering how the playing mode is affected by the device being used and how this shapes the habits of the player. For example, using a phone in portrait mode means the gameplay can be done one-handed, interrupted regularly, and played discreetly (or not!) in otherwise inconvenient places (like on the toilet). That’s generally not a consideration for a console, where you may instead have to negotiate with the rest of the family to take over the central TV in the main living space. The combination of device and circumstances drives a lot of behaviour, but the initial state still needs to provide a sense of calm (relief) and anticipation in order to set up the conditions for long-term player engagement. Adding levels of collaboration, whether directly in the game through multiplayer or through social community platforms, helps sustain or even build on the excitement, but it also needs to build tension if you are to maximise engagement. Competition can, for many games, be a massive driver of the positive fear of missing out we need to build engagement; however, this doesn’t have to mean traditional, direct, one-on-one confrontations as in many sports. With games we have the power to look at competition in fresh, exciting ways. Finally, as we move from tension to relief (while staying in fear of missing out), we should be building up a sense of zeitgeist (‘something in the air’) around the game, creating hype and inspiring debate around the game and its community.
I use this model not only to work out new game ideas, but also to dissect existing games, and sadly it’s too often the Context and Metagame loops that aren’t fully considered. It’s not the only way to look at games, but unlike a design based on ‘pillars’, it makes us think about the player experience as a system.
You’ll notice that this doesn’t take into account a bunch of important questions. Is the art style compelling, and does it fit the IP or genre? How immersive are the game’s narrative and sound design? All of these really matter, but I find that dissecting the game into these abstract loops allows me to set aside my preconceptions.
Data (Forecasts, Testing and Performance)
As much as I like to think about the direction and design, in the end what really matters is the data. This means thinking about what “Success Looks Like”.
If the game has yet to ship, you need to make forecasts, which is another article in itself. However, if you can take data from previous games (directly, or via tools like SteamSpy, AppAnnie or Reflections.io), you can build a model which shows what success can look like. This is what we at Fundamentally Games call a ‘top-down model’, as it shows what revenue and downloads have been achieved, even if we don’t know the costs of UA or development.
For the more diligent amongst you, it can also be important to create a ‘bottom-up model’ to compare with that top-down model. This takes the assumptions you make about user acquisition costs, organic installs and revenues and builds up a picture of what your game could actually deliver (assuming there was enough demand). The gap between the two can be very illuminating, as it helps you understand the scale you can work with for that game.
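A bottom-up model of this kind can be sketched in a few lines. This is a minimal illustration, assuming a very simplified funnel; the function name, parameters and example numbers are all hypothetical placeholders, not real benchmarks or the exact model used at Fundamentally Games:

```python
# A minimal 'bottom-up model' sketch: build install and revenue estimates
# from assumptions about UA spend, organic uplift and monetisation.
# All names and numbers are illustrative, not benchmarks.

def bottom_up_forecast(ua_budget, cpi, organic_multiplier, conversion_rate, arppu):
    """Estimate installs, payers and revenue from UA assumptions.

    ua_budget          -- total user-acquisition spend
    cpi                -- blended cost per paid install
    organic_multiplier -- organic installs gained per paid install
    conversion_rate    -- share of installs that ever spend
    arppu              -- average revenue per paying user over the period
    """
    paid_installs = ua_budget / cpi
    total_installs = paid_installs * (1 + organic_multiplier)
    payers = total_installs * conversion_rate
    revenue = payers * arppu
    return {"installs": total_installs, "payers": payers, "revenue": revenue}

# Example assumptions: $50k budget, $2 CPI, 0.5 organic uplift,
# 3% conversion to spending, $40 ARPPU.
forecast = bottom_up_forecast(50_000, 2.0, 0.5, 0.03, 40.0)
print(forecast)
```

Comparing the `revenue` figure this produces against the top-down numbers from comparable titles is exactly where the illuminating gap shows up: if your assumptions can’t plausibly bridge it, the scale of the opportunity is smaller than the market data suggests.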
The second form of data, user testing, is where we collate insight and feedback from players. This is usually qualitative, e.g. understanding how the game makes the player feel using surveys and polling. We find it useful to use third-party platforms like Antidote (generally more PC) and PlayTestCloud (mobile), as this keeps a distance between us as the publisher and the players, avoiding introducing too much bias into the results. This gives the developer/designer a solid, independent insight into what’s happening for players. Some AAA studios will use biometrics and other advanced techniques, which can be amazingly useful but are generally out of the reach of most independent studios. Personally, I’ve not found the additional benefits to outweigh the costs, but then I’ve tended to do this on relatively low-budget projects. The key objective for us with user testing is to identify whether players would play again and whether they would recommend the game to a friend. Recommendation is a key factor, also known as Net Promoter Score, as it helps us understand the potential for organic user acquisition.
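Net Promoter Score has a standard calculation worth making explicit: on a 0–10 “would you recommend this?” scale, it is the percentage of promoters (9–10) minus the percentage of detractors (0–6). A minimal sketch:

```python
# Net Promoter Score: % promoters (scores 9-10) minus % detractors (0-6)
# from responses on a 0-10 "would you recommend?" survey question.
def net_promoter_score(scores):
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

# Example: ten playtest responses (5 promoters, 2 detractors)
print(net_promoter_score([10, 9, 9, 8, 7, 7, 6, 5, 9, 10]))  # -> 30.0
```

Scores can range from −100 (all detractors) to +100 (all promoters); the useful signal for organic-acquisition potential is the trend across successive playtests, not the absolute number.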
Finally, we look at the performance data. This is the gold standard for understanding how a game will perform, but it doesn’t always tell you exactly why (hence it’s useful to combine with testing data). The key variables I’m interested in are Cost Per Install (blending paid and organic installs); Day 1/3/7/14/30/60 retention; Return on Advertising Spend (usually the percentage return over 30 days); and the percentage of repeat spenders. Once more, there’s a whole article in looking at performance data, but the important thing is having a metric in mind before you run the tests and treating it as a hypothesis that can be proved or disproved.
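The metrics listed above each reduce to a simple ratio. The sketch below shows one plausible set of definitions; the function names and inputs are my own, so check them against however your analytics platform defines these terms before comparing numbers:

```python
# Sketches of the performance metrics mentioned above. Definitions vary
# between analytics platforms; these are illustrative, not canonical.

def blended_cpi(spend, paid_installs, organic_installs):
    """Cost per install, spreading UA spend across paid AND organic installs."""
    return spend / (paid_installs + organic_installs)

def roas_30d(revenue_30d, spend):
    """Return on advertising spend over 30 days, as a percentage."""
    return 100 * revenue_30d / spend

def repeat_spender_rate(payers, repeat_payers):
    """Share of paying users who have spent more than once, as a percentage."""
    return 100 * repeat_payers / payers

print(blended_cpi(10_000, 4_000, 1_000))  # -> 2.0
print(roas_30d(6_500, 10_000))            # -> 65.0
print(repeat_spender_rate(300, 120))      # -> 40.0
```

Treating each as a hypothesis means writing the threshold down first, e.g. “a 30-day ROAS above 65% at this CPI”, and letting the test prove or disprove it.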
In this article I’ve talked about three key approaches for understanding and validating any game: understanding the direction, the design and the data. However, what we use at any time usually depends on the stage of development. We aim to find problems rapidly and focus on the basic needs of the game first, so we have something to build on with later analysis. This means starting by understanding who the target audience is, then running a Facebook ad test using a 30-second gameplay video and seeing how many clicks we get. That gives us an immediate sense of how attractive the game might be. If that goes well, we might run the user testing process, and only after that start looking at retention tests, betas, and soft launches.
In short, we see validating a game as an ongoing process: treating the game as a hypothesis to be tested, agreeing KPIs in advance, and improving our understanding step by step.
If you want to know more about this approach to validate your game ideas, check out Oscar’s free webinar on Tuesday 24th May 2022 here: https://validategameideas.eventbrite.co.uk
About Oscar Clark
Oscar Clark literally wrote the book on Games as a service, and is Chief Strategy Officer at Fundamentally Games LTD, a publisher of living games committed to transparency and genuine partnership with developers.
About Fundamentally Games
Fundamentally Games Ltd was founded in 2019 to focus on bringing games live, helping game developers manage live operations and scale their games. Their engagement-led strategy aims to scale games faster by getting more players doing more things, more often and for longer. They now offer publishing and UA funding with a commitment to transparency and genuine partnership with developers.
In March 2021 Fundamentally Games announced their UA Testing Program where they can fund testing and support for games early enough to make a difference.