Android: Netrunner – Through Schell Lenses

Introduction

Android: Netrunner is a Living Card Game (LCG) created by Fantasy Flight Games (FFG). It won Best Card Game and Best Two-Player Game at the 2012 BoardGameGeek Golden Geek Awards. I will evaluate Android: Netrunner’s game design using Schell’s lenses of problem solving, the elemental tetrad, and surprise.

Jesse Schell teaches Game Design at Carnegie Mellon’s Entertainment Technology Center. He was also the Creative Director of the Disney Virtual Reality Studio, where he designed, programmed and managed several projects for Disney theme parks and Disney online. Schell’s book, The Art of Game Design: A Book of Lenses, provides 100 lenses that he uses to step back and evaluate designs.

Problem Solving

In the lens of problem solving, “Every game has problems to solve. To use this lens, think about the problems your players must solve to succeed at your game.” What problems does Android: Netrunner ask the players to solve? Are there hidden problems to solve that arise as part of gameplay? How can Android: Netrunner generate new problems so that players keep coming back?

The players are immersed in a battle of wits. The challenge for both players is to reach seven agenda points before the other. A sub-challenge for the corporation player is to flatline the runner: the corporation forces the runner to discard all cards in their hand, usually through activated ICE or a trap. A sub-challenge for the runner player is to make the corporation exhaust its entire research and development deck, also known as a pick-up pile.

The hidden challenges are managing the various resources within the game. The resources are cards in hand, money, agendas, advancements, viruses and traces. Each resource costs at least one of a turn’s three to four clicks to obtain, and possibly some credits as well. Just because the ideal card is in your hand does not mean you have the resources to use it. The runner has the opportunity to make the corporation spend resources when making a run. The corporation has the opportunity to make the runner expend resources by setting traps and bluffing. Bluffing is an integral part of the corporation’s resource game; without it, the game poses little challenge to the runner.

Since Android: Netrunner is an LCG, new expansions continually give players the added challenge of learning new cards and new combinations. For players interested in testing their skills at the tournament level, these expansions test their skill and offer prize money and fame.

Elemental Tetrad

The elemental tetrad in Schell’s book is the use of Aesthetics, Technology, Mechanics and Story to form a game. To use this lens, “take stock of what your game is truly made of. Consider each element separately, and then all of them together as a whole”. Does Android: Netrunner’s design use elements of all four types?

The aesthetic appeal of Android: Netrunner is apparent in the artwork on the box, the tokens, the cards, the card names and the ‘flavor’ text. The game is thoroughly steeped in the culture of a cyberpunk dystopian society. Most of the runner cards are organic and lively, while the corporation cards are mostly sterile and cold.

The technology of Android: Netrunner is defined by the format of an LCG and some high-quality cardboard pieces. Card games instantly provide the common notions of a hand, a pick-up pile, a discard pile and playing against another player or oneself. The cardboard pieces are quickly identifiable and easily referenced against a chart. The Living Card Game format signals that your set of cards is not complete and will be expanded in the future.

The more important mechanics of Android: Netrunner are asymmetric play, corporation bluffing, and multiple points of attack and defense. In the original Netrunner game, created in 1996, asymmetric gameplay was an innovation in card games. The asymmetry provides differing strategies for each player along with immersive gameplay. All of the corporation’s cards are played face down, giving the corporation the ability to bluff. Without the ability to bluff, the game becomes significantly easier for the runner. A mega-corporation would have multiple branches and therefore many places to defend. The runner can attack the corporation’s hand (HQ), its pick-up pile (R&D), its discard pile (Archives) and any remote server the corporation installs.

The story of Android: Netrunner provides the game’s unifying theme. The story is apparent in the terminology, the artwork, the identity cards, and the ‘flavor’ text of each card. The terms for a player’s hand, pick-up pile and discard pile differ to give the feel that a player is an oppressed member of society, or a mega-corporation. The gameplay mechanics further the story by pitting the runner against ICE installed on servers. The names of cards can come from other cyberpunk references. The rules of the cards continue the struggle of individuals on their quest to take down the mega-corporations. The card descriptions feed the fan-fiction base, and with each expansion, the story grows.
The Lens of Surprise

Surprise is so basic that we can easily forget about it. To use the lens of surprise, “remind yourself to fill your game with interesting surprises”. What will surprise players when they play Android: Netrunner? Does the story in Android: Netrunner have surprises? Do the game rules? Does the artwork? The technology? Do the rules give players ways to surprise each other? Do the rules give players ways to surprise themselves? Surprise is central to Android: Netrunner: without it, the corporation has minimal chance of scoring the seven agenda points required to win. The story within each card’s description provides a meta-story for fans to discover. There is also surprise hidden within deck building, when a player discovers a synergistic card combination across factions. As a Living Card Game, Android: Netrunner gives Fantasy Flight Games ample opportunity to keep surprising fans with more story, card combinations, and different rules.

Conclusion

Android: Netrunner is a well-designed game, holding up well under all three of Schell’s lenses examined here. Schell admits that his 100 lenses are neither perfect nor complete, but they are tools for examining design. Each lens gives a unique perspective on a game, and with a collection of varied perspectives we educate ourselves in the art form of design. I look forward to examining my favorite games through a varied collection of lenses to gain a better understanding of the art of game design.

Android: Netrunner – Heritage

Introduction

Android: Netrunner has a rich design lineage because it is based on an earlier game called Netrunner. Netrunner was created by Richard Garfield and published by Wizards of the Coast in 1996, and it was unique in that it introduced asymmetric gameplay to card games. Lukas Litzinger and Fantasy Flight Games re-awakened Netrunner and brought it into the Android universe. The significant differences between the two are the introduction of identity cards, the change of bad publicity from a game-ending condition to a status effect, and the requirement that purging virus tokens take three clicks in a single turn.

Differences

The identity cards in Android: Netrunner brought factions to each side. The runner has three factions: Anarch, Criminal and Shaper. The corporation has four: Jinteki, NBN, Haas-Bioroid and Weyland Consortium. Not only did the factions bring about different play combinations, they also added rules to deck-building, additional story elements and the possibility of new identity cards in future expansions.

Different play combinations are important to prevent degenerate decks. The differing combinations also increase the amount of skill needed to play the game at the tournament level. In deck building, the identity cards set the minimum deck size and how many cards from other factions can be used. The minimum deck size helps with deck consistency: the more cards in the pool, the less likely you are to draw the cards you need. Having a maximum of three copies of a single card reduces a deck’s consistency, but it allows a greater diversity of cards. Identity cards also specify how many cards of a different faction you can include in your deck, and splashing another faction adds an element of surprise for your opponent.

Bad publicity has been changed from a game-ending condition to a status effect, much like a virus token. Bad publicity gives the runner a temporary credit when making a run on the corporation. This equalizes the number of win conditions for both sides while continuing to penalize the corporation and benefit the runner.

Purging virus tokens now requires the corporation player to spend three clicks in the same turn, instead of being worded such that the cost could be spread across two turns. This forces a complete loss of a turn for the corporation rather than a slight delay. Perhaps this was done to balance an advantage the corporation gained by splitting the cost across two turns.

Uniqueness

Comparing what Android: Netrunner “brought to the table” against what Netrunner brought in terms of rules and mechanics misses the important fact that Netrunner was published in 1996, while Android: Netrunner was published in 2012. Netrunner brought asymmetric gameplay to card games. Netrunner also gave the corporation bluffing by letting it play all of its cards face down, while the runner plays everything face up. Bluffing is so important a strategy for the corporation that it makes the game impossible to play solo.

Influences

Android: Netrunner is a re-imagining of Netrunner, so it was heavily influenced by Netrunner, which was originally created by Richard Garfield. Garfield also created Magic: The Gathering (MTG), so Android: Netrunner is influenced by Magic and by the influences on Magic, which include Cosmic Encounter, Marbles and Strat-o-Matic Baseball.
Android: Netrunner can be played for an ante card, just as Magic: The Gathering used to be played for an ante. I haven’t seen ante play since the early days of Magic, and since Netrunner came only three years after Magic, I believe it shared the same element, because it was a Trading Card Game (TCG) competing for the same audience as Magic. Gambling games also play for an ante, or a greater prize, involve skill, and possibly some luck. The gambling element in Netrunner, and therefore Android: Netrunner, is the corporation playing the majority of its cards face down. This gives it the ability to bluff and therefore trap the runner.

The factions introduced in Android: Netrunner closely resemble the variety of player combinations encountered in Cosmic Encounter. Richard Garfield tried to simplify the combination possibilities with the five colors of magic in Magic: The Gathering.

Strat-o-Matic Baseball is another influence on Magic: The Gathering, Netrunner, and Android: Netrunner. The idea of the game is to collect cards with statistics on them and use them to improve your deck; the statistics indicate to the players which dice to roll. Garfield has strong views about luck versus strategy, which fully formed after creating MTG. I imagine his early thinking about luck led him to remove the dice-rolling luck element of Strat-o-Matic Baseball while keeping the notion of cards improving your overall play.

Conclusion

Android: Netrunner has a rich design lineage because it stems from Richard Garfield’s Netrunner. Netrunner wasn’t successful because it entered a flooded market of TCGs and competed directly for the audience of Garfield’s own Magic: The Gathering. Android: Netrunner is successful because it is a Living Card Game (LCG), and its modified rules streamline some of the more complex rules to reach a larger audience while also supporting the LCG format.

Direct Lighting

Exe Files

Directional lighting

Pix custom

High level Description

Directional lighting using DirectX9 is definitely different from CPU-side ray tracing. I am excited to see how the rendering equation is implemented GPU-side. I understand that more simplifications will be made in order to hit the sixty-frames-per-second benchmark that gamers are used to.

Since our systems may be starting to sprawl out of control, I decided to show the places that needed correction or addition to integrate lighting into our scene.

1. Import

In my mesh file, I had to add the surface normal to the vertex declaration and struct. The importer accounts for the addition of this information.

Mesh.h

cMeshBuilder.cpp

2. Exporter

cMayaExporter.cpp

Since we are accounting for having normals from the geometry, we need our exporter to write them.

3. Shaders

After visual testing is confirmed, and the project still compiles and runs, we move on to correcting our shaders. In the vertex shader, we transform the surface normal from model space to world space and pass it to the pixel shader. The pixel (fragment) shader is where we implement our Bidirectional Reflectance Distribution Function (BRDF). My main question about the difference between ray tracing and game graphics is: how do we pass all the lights into the shader, or how else do we account for all the lights in the scene?

4. Integrate

Once again, manual testing was employed to visually confirm expected results. Now it is time to integrate the new structure into the game engine so that values can be changed in game. Since I do not currently have a scene, a scene manager or a respectable transform controller, only three classes needed to be added and/or modified.

Light.h

IActor.h

DEngine.cpp

Technical Details

My lights use the ESDF keys to move. The layout is similar to WASD, just shifted right so the home-row bumps make the keys easy to find; it also opens up more key possibilities. I used JP’s suggestion of moving an offset on the xz plane and normalizing the result to update the light’s direction. I left the box’s movement on the ESDF keys as an indicator of where the light would be pointing.

For testing my lights, change lines 81-83 of DEngine.cpp in the Demeter project.

Mandatory pictures

Pix debug pixel

For debugging a pixel, I am required to include the picture below showing code rather than assembly. Since PIX currently doesn’t do that even when I am in debug mode, I will look into it further and update the site when it has been corrected. Honestly, I don’t mind seeing assembly. I understand the importance of determining why a pixel isn’t what it is supposed to be. I have had many black pixels in ray tracing, mostly due to a NaN being multiplied through. I expect that PIX pixel debugging in non-assembly code will provide a convenience ray tracing never afforded me: too many rays pass through any given pixel to debug one conveniently.

Pix debug pixel assembly

Time Estimate

  • Coding: 5 hrs
  • Write up: 1.5 hrs

Maya Exporter

Exe Files

Assignment 09

Pix custom

High level Description

FBX object loader. Fullscreen. Settings.

The Maya exporter code given to us is great for converting Maya 2013 objects to an engine-specific mesh format. The exporter makes it super easy to create more complicated geometry and put it in our game. I am still weighing the Maya plugin option against an obj parser. An obj parser lets artists use whatever program they are familiar with, as long as it can export an obj. The Maya exporter is still useful in this situation, but the artist would have to load their obj into Maya first and then export it again in the custom format. So what I propose possibly only saves a few seconds per object, and it isn’t tied to a particular program or version of a program.

This project is my first opportunity to use the built-in INI parsing from the Windows API. We were given lua code to parse the settings; by not using the lua code, I lose flexibility but gain conciseness. An argument against it is that I am tied to the Windows API, but since I am currently using DirectX with no plans to change, I am tied to the Windows API anyway.

Saving parsed values isn’t helpful unless the values are used later. Implementing a fullscreen mode along with a configurable window size is a feature that has been on the back burner for a long time.

Technical Details

Meshes to overwrite in Assets: mcube.jmsh, pyramid.jmsh

PIX is a valuable tool for debugging graphics. To keep the information it reports manageable, I added the groupings Set Material (per-view and per-material), Set Material (per-instance) and Draw Mesh.

Pix custom
PIX custom expanded

Steps taken from the commit log

  • Added MayaExporter to project
  • Implemented Settings
  • Implemented Fullscreen

The biggest problem I encountered for this assignment was a lack of motivation. I’ve never encountered this before, and I am perplexed at how to prevent it from occurring again in the future.

Time Estimate

  • Coding: 10 hrs
  • Write up: 1 hr

Textures

Exe Files

Assignment 08

binary mesh

High level Description

This assignment further develops our build pipeline and adds textures to our scene. As the system becomes more
complex, it gets harder to mentally map where all the necessary changes are. Good architecture helps with the mapping of
these changes. For the most part, I feel that my architecture is keeping up with this demand. Although I do have some
hackery, it is deliberately simple hackery, with the goal of making it easy to align with better architecture once I
understand how the pieces are used. An example of such hackery is my ActorPlane and ActorCube classes. These classes are
a hack to easily control the textures, materials and movement of each. Because they extend Actor, once I have time to
add a scene loader and an actor controller class, the hackery can be retired in those areas.

binary mesh

Technical Details

To change the texture of the plane or the cube, you must change the hardcoded values in ActorPlane.cpp or
ActorCube.cpp in the Demeter project.

Below are the critical steps taken to get from assignment 07 to assignment 08

  1. Refactor – Optimized mesh loading
  2. Added a texture builder to project
  3. Added UV to JSON mesh data
  4. Added texture class
  5. Integrated texture into material, render engine and game engine
  6. Refactor – Removed PrimitiveMesh by moving key methods into Mesh class

Time Estimate

  • Code: 15 hrs
  • Write up: 0.5 hr

Binary Mesh File

Assignment 07

Exe Files

High level Description

This assignment is to create a binary mesh format, change the engine to load that binary mesh format, and
create a builder that changes a “human” readable mesh file to the newly created binary mesh format.
Below is the “human” readable and binary format of my mesh
human readable mesh
binary mesh

Close inspection of my “human” readable format reveals that it is JSON. I know the requirement is to do
it in lua, but the parsing is much easier as JSON. It seems we are only using lua to parse lua tables,
which are equivalent to JSON in my opinion. The binary representation takes the cube.mesh.json
file from 1777 bytes to 284 bytes.

Notepad++ was having a hard time displaying hex. Sublime does a much better job of viewing binary data as hex
and was giving me the values I expected.
mesh write code
mesh read code

The major refactoring done between assignments was the addition of easier exporting libraries and a
Profile build. The minor refactoring was some cleanup of redundant code to prepare for this assignment.

Technical Details

The biggest issues for me were not having a proper hex viewer and the extraneous code lua requires to parse
a JSON-like file. So I used a JSON parser and reduced the code necessary in cMeshBuilder.cpp

  • Refactored Primitives to use inheritance and polymorphism
  • Refactored Actors to use inheritance and polymorphism
  • Added code to read and write binary mesh files
  • Added code to load mesh files in Primitive classes
  • Added code for MeshBuilder to convert mesh from JSON to binary

Time Estimate

  • Reading: 0.25 hrs
  • Coding: 16 hrs
  • Write up: 1 hr

Asset lists & Pre-compiled binary shaders

Assignment 06

Exe Files

High level Description

This assignment is to add a shader-building tool to our build pipeline. We will also be passing a lua
file of the assets to build instead of an explicit list. This list is called AssetsToBuild.lua and
looks like:

AssetsToBuild.lua

I chose to use JP’s example format because it seemed flexible enough for further additions. This
‘human’ readable format allows quick editing of the assets in our game. It also provides the ability to
compile assets offline.

The control scheme hasn’t changed from Assignment 5. For some reason I had the urge to use my
finger on the laptop trackpad to illustrate the keyboard control layout below.

Controls

  • Box Movement – Forward = E
  • Box Movement – Backward = D
  • Box Movement – Left = S
  • Box Movement – Right = F
  • Box Movement – Up = Q
  • Box Movement – Down = A
  • Camera Movement – Forward = I
  • Camera Movement – Backward = K
  • Camera Movement – Left = J
  • Camera Movement – Right = L
  • Camera Movement – Up = P
  • Camera Movement – Down = ;
  • Camera Yaw – Left = U
  • Camera Yaw – Right = O
  • Camera Pitch – Up = Y
  • Camera Pitch – Down = H

Technical Details

I started by implementing a simpler AssetsToBuild.lua file. The simpler file used only the generic builder,
which simply copies the files to their ‘built’ location. Most of the files were given,
so besides integration, only each project’s build parameters were changed.

The next step was to parse the more advanced AssetsToBuild.lua file. I used lua print statements to ensure
the items I wanted were concatenated and passed correctly.

Finally, the last step was moving from runtime shader compilation to build-time compilation. This hung me
up for a few days because I thought the compiled shader was an ID3DXBuffer instead of a void*. Once this issue
was cleared up, it was a simple switch within my shader class.

Time Estimate

  • Code: 7 hrs
  • Write up: 1 hr

Moving to 3D

Assignment 05

Exe Files

High level Description

Overview

This assignment is taking the lessons learned from the previous assignment and adding a 3D camera with movement.
It also introduces a new PIX tool for debugging shaders, the image can be found in the required images section.
Because of changing requirements, I had the option of taking Linear Algebra or Introduction to Video Games, and I am
so glad I picked Linear Algebra. The combination of Linear Algebra, Ray Tracing and my undergrad senior project made it
super easy to understand the code I needed to implement World/View/Projection space.

World space, to me, is where objects land after the matrix transformation that takes them from ‘object space’ into
the world. The analogy: it is easier for a modeller to build a model at their desk than to build it in the middle of an
elaborate set.

Camera/eye space is essentially where the eye or lens is. It has a position, a viewing direction, and some notion
of up. Just because the object is right-side up doesn’t mean the eye isn’t upside down or sideways.

Projection space is best thought of as a projection screen. Just as a mirror can make a person look shorter, taller
or closer than they appear, projection takes a 3D scene and ‘flattens’ it onto a 2D image. Early art classes
teach perspective, which is essentially a projection of 3D onto a 2D surface.

Controls

  • Box Movement – Forward = E
  • Box Movement – Backward = D
  • Box Movement – Left = S
  • Box Movement – Right = F
  • Box Movement – Up = Q
  • Box Movement – Down = A
  • Camera Movement – Forward = I
  • Camera Movement – Backward = K
  • Camera Movement – Left = J
  • Camera Movement – Right = L
  • Camera Movement – Up = P
  • Camera Movement – Down = ;
  • Camera Yaw – Left = U
  • Camera Yaw – Right = O
  • Camera Pitch – Up = Y
  • Camera Pitch – Down = H

Technical Details

Not much to report here; it was mostly smooth sailing. My only problem was doubting my original view matrix
implementation because of its original placement and rotation. I used a lookAt matrix until I realized my problem,
then reverted to my original implementation with better placement and rotation.

  • Added the z component to my existing square
  • Added the debug information
  • Made my square a cube
  • Created a PrimitiveCube and PrimitivePlane class
  • Integrated PrimitiveCube and PrimitivePlane
  • Updated Shader
  • Added Model/View/Projection matrices
  • Added and updated controller scheme for camera and controls

There is a lot of refactoring I need to do over break. Once this refactoring is complete, I will have integrated my
last semesters game engine into this render engine project.

Required Images

Vertex Shader Debug

Time Estimate

  • Code: 5 hrs
  • Write up: 1 hr

Constant Table & Generic Builder

Assignment 04

Exe Files

High level Description

Project 4 img
Project 4 img

There are three goals to this assignment

  • To get input into our game
  • To use shader constants
  • To utilize a generic builder that will be responsible for copying assets to their built location

Input

Control scheme

  • Up: W or Up Arrow
  • Left: A or Left Arrow
  • Down: S or Down Arrow
  • Right: D or Right Arrow

Shader constants

Shader constants are variables the shader uses in its calculations. The values can be set either
once per material or on every draw call. When the value is updated per draw call, it is referred
to as a per-instance constant. This affords me a quick way to change the color of the material for
all models, or for just one model. It is also how we move the gradient square.

Generic Builder

With the addition of the generic builder, our asset-conditioning framework is complete. To the best
of my knowledge, further additions in this area would implement asset-specific conditioning, where
conditioning means optimizing an asset for a game.

Technical Details

The majority of my time was spent refactoring code. I created a shader class containing a compiled-shader
pointer, the constant-table pointer, and the DirectX shader. It was designed to make supporting either
OpenGL or DirectX easier. The material class now holds pointers to shaders. The lua code responsible for
parsing my custom matl file format has been moved to its own class, and the DirectX code responsible for
loading and compiling shaders has been moved into the shader class.

After spending a lot of time refactoring shader/material code, I decided to put off proper isolation of input.
The code is exactly where I do not want it to be, however, to put it where I would like it to be, I would need to
create at least 7 other classes.

For determining shader constants, I am just doing the bare minimum:

 constants
 {
   g_colorModifier = {1.0, 0.5, 0.0},
 },

Because of this, my code suffers from inflexibility: whenever a constant is added, the code looking for it
also needs to be added. It also creates a dependency on all three values being present. What I would like to
move toward is something that tells me how many values to expect, what the default is if a value is missing,
and what the constant is used for. I’ve refactored some of the lua code to make tables and the values within
them easier to get. It may even become part of a bigger library.

Time Estimate

  • Refactoring: 4 hours
  • Coding: 3 hours
  • Write up: 2 hours

The Quad Colored Box

Assignment 03 – The Quad-Colored Box

Exe Files

Colorful Box

High Level Description

The goal of this assignment is to have ‘human readable’ asset files
and to introduce them into our pipeline. A ‘human readable’ file is one that does not require familiarity with code,
XML or another specialized language. Spreadsheets can be considered ‘human readable’. Lua data tables are very similar
to JavaScript Object Notation (JSON), and they share the advantage a spreadsheet has: users can see the relationship
between values and objects fairly quickly and without any prompting.

Lua Tables as Material Files

We were given the liberty of creating our own naming convention and file format as long as it returned a
valid, well formed lua table. Below is the format I settled on.

Lua Material Format

If somebody else wanted to add another shader, my hope is that this format is ‘human readable’ enough that
they would know where it should go, and that it should have a name.

Material Files

I’ve put ‘human readable’ in quotes thus far because, most of the time, I am quite literal. I can read Wavefront’s
material format with minimal problems; I also had to implement its symbols for my ray-tracing assignments.

Wavefront Material Format

Wavefront file format

Questions

  • Where do shaders fit in a material definition? They are not in Wavefront’s material definition.
  • While I understand having easily changed and grouped data, editing a complex mesh by hand would be
    more than tedious. How will this readability transfer to object files, or other complex assets?

Required Images

For a more detailed description, see Assignment 2’s writeup.

Vertex details

Vertex count reduced from 6 to 4

DrawIndexedPrimitive

Technical Details

  • I needed a lua data table converter and where do I store the values? – A Material class that would
    store a vertex shader and a fragment shader. In the same file was the lua code to convert the lua table
    to a Material class.
  • Where does the loader go? – I put this before the LoadVertexShader and LoadFragmentShader sections
    in graphics.cpp, and modified their parameters from void to taking the shader path. This should
    give me flexibility in the long run, and allows me to only load the material once.
  • Changed my project from Unicode to Multi-Byte – Besides having to write L before every quote and
    some std::wstring stuff, converting from lua const char* to const wchar_t* seemed like it was going
    to be an ongoing pain. So at the cost of my customers being able to write their names in their
    own language, I opted for easier coding.
  • I changed my Game’s dependencies per a discussion with the instructor. The workflow is definitely
    different and now I am having slight trouble fighting my build habits. Instead of just hitting F5 from
    a clean build, I must press F7 and then F5.

Time Estimate

  • Reading: 1 hr
  • Coding: 2 hrs
  • Write up: 3 hrs