Gamestorming #11: Evaluating the Game

  • Caity Kelly
  • Apr 19, 2019
  • 2 min read

In my game design class this week, we have been developing playtesting plans for our games. These plans involve a variety of steps--self-testing, recruiting a range of playtesters, designing evaluation criteria and rating schemes, and collecting data--to name a few. This is an important process for catching any general errors you might have missed as the designer, as well as for testing whether your game actually suits its target audience.

During the self-testing phase, I think it is important to note any small errors you encounter--issues with the mechanics, mistakes on the materials, and so on. Following this phase, it can be helpful to recruit close friends or family members for feedback. This provides an outside perspective without spending resources on recruiting strangers as playtesters.

Once the time comes to recruit playtesters from your target audience, however, there are a number of factors to consider in order to use the playtest experience to its fullest potential. As I mentioned in the discussion board for my class this week, the designer must plan how they will collect data on their game. My preferred methods are simple observation (noting any areas where players struggle as they test the game); a follow-up discussion in which players give formal feedback WITHOUT any defense from the designer (if something is not covered in the game's instruction manual, it must be made clearer, since the designer will not be present to give every player a personal explanation); and an anonymous post-playtest survey.

As mentioned in my class discussion, I believe that the survey should focus on aspects of goals, feedback, and player interactivity. I would choose a Likert scale on which players rate each item from 1 to 10; however, I would also provide space for an optional rationale in case raters would like to explain their scoring decisions. A few of the items I would include are as follows:

  • The instructional booklet provided sufficient supplementary resources.

  • The time provided for each round was balanced between challenging and frustrating.

  • The color scheme of the materials was pleasant.

  • The educational discussion component was helpful for my learning of the material.

I would also include a few items that are reverse-coded, to catch playtesters who answer every survey item identically regardless of content. For example:

  • The gameplay did not hold my attention.

  • The material aesthetics were poorly designed.

  • The social interaction with other players was lacking.
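To make the reverse-coded items useful, their ratings need to be flipped before averaging, so that a high score always means a positive impression. Here is a minimal sketch of that scoring step in Python; the item names and ratings are purely illustrative, not from an actual playtest.

```python
def score_response(response, reverse_items, scale_max=10):
    """Flip reverse-coded items on a 1..scale_max Likert scale so that
    higher always means a more positive impression of the game."""
    return {
        item: (scale_max + 1 - rating) if item in reverse_items else rating
        for item, rating in response.items()
    }

# One playtester's raw ratings (1 = strongly disagree, 10 = strongly agree).
raw = {
    "instructions_sufficient": 8,
    "round_timing_balanced": 7,
    "gameplay_did_not_hold_attention": 2,   # reverse-coded item
    "aesthetics_poorly_designed": 3,        # reverse-coded item
}

reverse = {"gameplay_did_not_hold_attention", "aesthetics_poorly_designed"}
adjusted = score_response(raw, reverse)
# A 2 on "did not hold my attention" becomes a 9 after flipping (11 - 2).
```

A playtester who straight-lines the survey (say, all 8s) would now show high scores on the regular items but low adjusted scores on the reverse-coded ones, which makes that response pattern easy to spot.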

At the end of the day, it is impossible to invent a game that appeals to every type of player. There will always be a handful of people who dislike a feature that, in reality, serves the goals of your game quite well. However, the evaluation process is a great way to find the areas that a majority of playtesters agree need improvement.
