Overview
Chris Campbell
<torgo@attbi.com>
Before you go to the grocery store, you probably make a list of the things you need so you won't forget anything. As a part of making the list, you check the pantry to see what items you have and which ones you need.
The manager in charge of the game development quality assurance process needs to plan in a similar manner.
At all times, the manager needs to know exactly how many bugs already exist in the "pantry," which bugs still need to be found, and how to find them. It's not an easy job, but every game's survival in the marketplace depends on the final quality of the product, so it is essential to do it right.
This article is designed to provide you with ideas on how to start a bug-tracking system and how to formulate ways to find the bugs using test plans and statistical analysis. You will be surprised to learn how easy a system can be to set up, but be ready for the never-ending task of finding bugs.
The First Step Isn't Finding, It's Tracking
Most people mistakenly believe that the primary job of the quality assurance manager is to find software bugs.
However, before a manager can find bugs, he must have a system in place to record them when found and track them until they are fixed. Unless you have a tracking system in place, a development team will spin its wheels trying to fix errors while missing new ones.
Why is tracking so important? Obviously, bug tracking allows you to log bugs as you find them so they are not forgotten and can be fixed. The real power of tracking, however, comes when you use the collected data to spot patterns and make simple predictions about where bugs might occur in the future. You can also identify programmers who are prone to making coding mistakes and those who ignore fix requests.
Charts and graphs showing combinations of this data can work to your advantage. Unless you have a way to collect and store the data, you will miss out on many time-saving clues. For example, what if you wanted to know how many bugs were being found each month in a given area (Figure 4.7.1)?
Figure 4.7.1: New bugs found each month in each game development area.
By using time as your x-axis, you can rapidly see areas that might raise alarms. In this example, the number of bugs found in the 3D engine keeps growing. The manager needs to determine why, and whether more
resources are needed. On the other hand, few bugs are found in the Design area toward the end of the chart.
Maybe the team has solved nearly all of the design issues, or perhaps more attention needs to be focused on that area so it does not become neglected.
Without using a report of some type, this data is hard to see. You can also track your personnel to make sure they are fulfilling their responsibilities to correct errors as they are found (Figure 4.7.2).
Figure 4.7.2: Open bugs at the end of each month by developer.
In June, Monica and Louis are having trouble closing out their bugs by the end of the month. The manager needs to know whether this is because of a lack of motivation on the developers' part or for some other reason. We can also see which developers consistently close out their bugs each month. Charts and reports make information like this readily available. Other reports can be generated to show the average length of time a bug stays open, developer productivity, and the success and failure rates of test cases. All this requires is a bug-tracking database.
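As an illustration of the kind of report shown in Figure 4.7.2, once bug records are in a database they can be aggregated with a few lines of code. This is a minimal sketch, assuming each record carries a developer name, the month the bug was opened, and an optional month closed (all field names here are hypothetical):

```python
from collections import defaultdict

def open_bugs_by_developer(bugs, month):
    """Count bugs still open for each developer at the end of a month.

    A bug counts as open if it was opened on or before `month` and
    either has no closing month or was closed after `month`.
    Months are "YYYY-MM" strings, which compare correctly as text.
    """
    counts = defaultdict(int)
    for bug in bugs:
        if bug["opened"] <= month and (bug["closed"] is None or bug["closed"] > month):
            counts[bug["developer"]] += 1
    return dict(counts)

# Hypothetical records pulled from the tracking database.
bugs = [
    {"developer": "Monica", "opened": "2003-05", "closed": None},
    {"developer": "Monica", "opened": "2003-06", "closed": None},
    {"developer": "Louis",  "opened": "2003-05", "closed": "2003-05"},
    {"developer": "Louis",  "opened": "2003-06", "closed": "2003-06"},
]

print(open_bugs_by_developer(bugs, "2003-06"))  # {'Monica': 2}
```

The same grouping idea, with area instead of developer, yields the Figure 4.7.1 report.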
Creating a Bug Tracker
What do you need to start a bug-tracking database? You can talk to several vendors who sell software packages that do the job. If you have the money and are short on time, this can be a good way to get started. In most cases, however, you can save money by doing it yourself with a simple database, using something like SQL Server or Access as the front end. It's even possible to build a rudimentary tracking system from a spreadsheet or an e-mail-based workflow, although such a system would be far less flexible.
Begin the database creation by outlining on paper what you want your bug entry form to look like. From there, you can determine what is required in the database fields. You would want to include areas for the date and time the bug was found and the version of the program in which the bug was found. Include areas for a
description of the bug, the steps to reproduce the error, and any other special notations about the specific bug.
It is highly recommended that you include an area for attaching supporting files such as logs or screenshots.
Anything that would help the programmer replicate and understand how the bug was created should be
included on this form. In fact, try to get feedback from the programmer about what is needed on the form during the creation process.
Case Study 4.7.1 contains a sample bug report form.
Case Study 4.7.1: How to Report Bugs
Your Company Name: ________________________ Bug Number: _________
Reported by: ________________________________ Assigned to: __________
Status: (Open/Closed/Hold) Priority to fix: (High/Low) Program: _______________ Version: _____________
Error Type: _____                      Severity: _____
    Programming error                  1. Fatal
    Design issue                       2. Serious
    Art                                3. Minor
    Database
    Hardware
    DirectX
    AI/Scripting
    Documentation

Attachments? (Y/N)

Description of problem:
_____________________________________________________
_____________________________________________________
_____________________________________________________
_____________________________________________________
Steps to reproduce:
_____________________________________________________
_____________________________________________________
_____________________________________________________
_____________________________________________________
Resolution: _____                      Fixed in Version: __________
    Fixed
    Hold
    Not able to reproduce
    Not fixed due to design
    Not fixed due to other restraint
    Withdrawn by tester
    Need more info

Comments:
_____________________________________________________
_____________________________________________________
_____________________________________________________
_____________________________________________________
Resolved by: ___________________ Fix Tested by: ________________
Once you have a rough draft on paper, you can quickly identify the database fields needed to store the
information. Just by using the Wizard interface in Access, you can easily turn this form into a powerful tracking tool. It is also important to remember that a tracking tool is only beneficial if it is used consistently. Unless your team members use the tool daily, it will quickly become worthless. It is the quality assurance manager's job to ensure that no one treats bug tracking lackadaisically.
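The fields identified from the form translate almost one-to-one into a table definition. Here is a minimal sketch, using SQLite purely for illustration rather than Access; every column name is an assumption drawn from the sample form:

```python
import sqlite3

# Each column mirrors a field on the sample bug report form above.
SCHEMA = """
CREATE TABLE bugs (
    bug_number    INTEGER PRIMARY KEY,
    reported_by   TEXT NOT NULL,
    assigned_to   TEXT,
    status        TEXT CHECK (status IN ('Open', 'Closed', 'Hold')),
    priority      TEXT CHECK (priority IN ('High', 'Low')),
    program       TEXT,
    version       TEXT,
    error_type    TEXT,
    severity      INTEGER CHECK (severity BETWEEN 1 AND 3),
    description   TEXT,
    steps         TEXT,
    resolution    TEXT,
    fixed_in      TEXT,
    resolved_by   TEXT,
    fix_tested_by TEXT
)
"""

conn = sqlite3.connect(":memory:")
conn.execute(SCHEMA)
conn.execute(
    "INSERT INTO bugs (reported_by, status, priority, severity, description) "
    "VALUES (?, ?, ?, ?, ?)",
    ("Chris", "Open", "High", 1, "Game crashes when loading a saved game"),
)
open_count = conn.execute("SELECT COUNT(*) FROM bugs WHERE status = 'Open'").fetchone()[0]
print(open_count)  # 1
```

The CHECK constraints enforce the form's fixed choices (status, priority, severity) so bad entries are rejected at entry time rather than discovered in reports later.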
Test Case Planning: The Headhunter Method
Quality assurance can be tiring, tedious work. Do what you can to make it fun, but also ensure that the job is getting done. One way to remove the tedium from tracking bugs down is to treat it as a challenge. The following four-step method can help you in this regard.
Step One: Focus on the Target
Test case design starts in the earliest design phases of a game. If the game design calls for lots of role-playing action, you can start gearing your test cases toward heavy database testing. If it calls for a first-person shooter, you can expect the initial phases of game testing to involve more API and game engine testing.
You also want to orient your testing toward the current milestone's requirements and to quantify your results. By using metrics, you can quickly establish baselines for each milestone and see where the project stands from a quality standpoint at any time.
For example, if the milestone requirement states that the game engine needs to be in a "playable state," you quickly need to define "playable." It could mean that the game runs at 20 frames per second or more. It could also mean that 30 out of 40 planned features are functional. Work with the designers and programmers to quantify the expected results of your test.
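Once "playable" has been quantified this way, checking the milestone becomes mechanical. A minimal sketch, using the hypothetical thresholds from the example above (20 frames per second, 30 of 40 planned features):

```python
def milestone_is_playable(measured_fps, features_working, features_planned,
                          min_fps=20, min_feature_ratio=30 / 40):
    """Return True if the build meets the agreed 'playable' criteria.

    The default thresholds are the hypothetical numbers from the text;
    in practice they come out of the milestone agreement.
    """
    return (measured_fps >= min_fps
            and features_working / features_planned >= min_feature_ratio)

print(milestone_is_playable(22, 31, 40))  # True
print(milestone_is_playable(22, 25, 40))  # False: too few features working
```

The point is not the code but the discipline: every milestone criterion should be expressible as a check like this one.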
Step Two: Lay Out the Bait
Now that you have selected a goal, plan the test's execution accordingly. The basic rule is: if something can be measured, it can be tested. Thus, it is often best to work backward when preparing a test case. Ask yourself: if the game were to fail in this area, how would I catch it?
In addition, don't be afraid to combine multiple test cases into one, but be careful not to make your test scenarios so long or complicated that you can't analyze the results.
For example, consider boundary testing. Here you are testing the input and output limits of the program. How many commands can you queue up before the interface breaks? Exactly how many items can you carry? What if your text input box contains more than 255 characters?
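Boundary cases like these lend themselves to small automated probes. A minimal sketch, assuming a hypothetical inventory with a 40-item design limit:

```python
MAX_ITEMS = 40  # hypothetical design limit from the game spec

class Inventory:
    def __init__(self):
        self.items = []

    def add(self, item):
        """Reject additions past the design limit instead of overflowing."""
        if len(self.items) >= MAX_ITEMS:
            return False
        self.items.append(item)
        return True

# Boundary probe: push one past the limit and see what happens.
inv = Inventory()
results = [inv.add(f"item-{i}") for i in range(MAX_ITEMS + 1)]
print(results.count(True), results.count(False))  # 40 1
```

The interesting values in boundary testing are always at the limit, one below it, and one above it; a bug at 41 items is far more likely than one at 17.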
Focus on visible state transitions. Every time you perform an action that changes the range of choices available to the player or modifies the display, you have made a state transition. The category includes menu systems, chat windows, display screens, and GUI changes. Test each option in every menu.
Testing the input/output interfaces can also reveal bugs. When the game is accessing the hard drive or CD- ROM, send a heavy load of commands. Can the I/O interface handle the load? What about networking? If too many packets are dropped or a connection is lost, how does it affect the game? Check for malformed data instructions or packets. Seeing how your game can handle extreme cases can quickly show areas of weakness in your programming.
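A minimal sketch of a malformed-data check of this kind, assuming a hypothetical packet format (a 4-byte length header followed by a payload); a robust parser should reject bad packets cleanly rather than crash:

```python
import struct

def parse_packet(data: bytes):
    """Parse a hypothetical packet: 4-byte big-endian length, then payload.

    Raises ValueError on malformed input instead of crashing downstream.
    """
    if len(data) < 4:
        raise ValueError("truncated header")
    (length,) = struct.unpack(">I", data[:4])
    payload = data[4:]
    if length != len(payload):
        raise ValueError("length field does not match payload")
    return payload

# Feed deliberately malformed packets; confirm each is rejected cleanly.
malformed = [b"", b"\x00\x00", struct.pack(">I", 10) + b"short"]
survived = 0
for packet in malformed:
    try:
        parse_packet(packet)
    except ValueError:
        survived += 1  # parser rejected the bad packet without crashing
print(survived)  # 3
```

The same pattern scales up: generate many random corruptions of valid packets and assert the game code raises a controlled error for every one.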
Step Three: Set the Trap
You've planned your test case and now you are ready to execute. The first consideration is your test environment. The machine you are testing on should be as generic and pristine as possible to avoid contamination from previous tests and exotic hardware. In a test lab, it is therefore advisable to keep a "ghost" hard-drive image that can be copied onto machines whenever a new test case needs a clean start.
Make sure you have the tools necessary to perform the test and to properly record a log of it. Plenty of freely available tools on the Internet can capture screens, record network packets, and chart running threads.
Set your test run's objectives ahead of time. If testing in a multiplayer environment, make sure everyone involved is trained in proper testing procedures and has received precise instructions concerning their roles.
Nothing is more frustrating than one person forgetting what to do at a critical moment in the test.
Step Four: Capture the Bug
When a bug is found, make sure that you have enough information to document it. Were screen captures made? Are the logs complete? The more information you have, the better. If time permits, try to replicate the bug as many times as possible. Vary the conditions to see if you can narrow the variables that cause the bug.
If no bug was found, the game might be working, or your plan might be flawed. Remember, you test to find bugs, not to see if everything in the game is working correctly. Go back to the drawing board. Alter the conditions somewhat in the area that you are testing. Throw everything you have at that area of the game. If your tests continue to reveal no bugs, save your test cases for another day for regression testing.
Retracing Your Steps
The fact that a bug wasn't found one day doesn't mean that it can't be found the next—or in the next version of the game. This is where regression testing comes into play.
Regression testing means retracing your steps and rerunning all your test cases. Perform it at successive milestones, and especially when code is locked down, at which point you should be able to run all of your test cases without a single failure. If one does occur, a bug has been inserted since the last round of regression testing. Fix the code and try again until you are able to achieve a perfect success rate.
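Rerunning saved test cases can be automated with a simple harness. A minimal sketch, where both the saved cases and the function under test are hypothetical stand-ins:

```python
def run_regression(cases, func):
    """Run every saved test case; return the list of failures.

    Each case is (case_id, input, expected). At code lockdown this
    list should come back empty; any entry means a bug crept in
    since the last round of regression testing.
    """
    failures = []
    for case_id, arg, expected in cases:
        actual = func(arg)
        if actual != expected:
            failures.append((case_id, expected, actual))
    return failures

# Stand-in for a real game routine under test.
def damage_after_armor(raw):
    return max(0, raw - 5)

saved_cases = [
    ("DMG-001", 10, 5),
    ("DMG-002", 5, 0),
    ("DMG-003", 3, 0),   # must not go negative
]

print(run_regression(saved_cases, damage_after_armor))  # [] -> perfect pass
```

Keeping cases as data rather than ad hoc manual steps is what makes "rerun everything at each milestone" cheap enough to actually do.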
Don't Forget to Test the Installation Routine
Once you have built your gold master, don't forget to design a few test cases for the installation and removal routines. For console titles, the installation routine is rarely a concern, but on personal computers it can cause major headaches. More than one software recall has been caused by an uninstaller that erased the entire hard drive!
If any extra material is being placed on the master, make sure it conforms to company standards and formats. While static art assets included for the enjoyment of the user don't require much formalized testing, don't forget to test dynamic content such as screensavers, videos, and even music to ensure it works with most platforms and players.
Other Resources and Methods to Consider
Once the method is in place, it becomes a matter of finding the resources to execute the plan. A common question is: how many quality assurance staff should a company hire? Microsoft is often cited as having a 1:1 tester-to-developer ratio. Game companies rarely have the financial resources for that; ratios between 1:3 and 1:7 are more typical [Rice02]. The key lies in finding the balance for your company: hire too many testers and they won't have enough work to do; hire too few and there will be gaps in your testing strategy.
You should also use quantitative methods to monitor the progress of your testing. Defect seeding consists of purposefully inserting bugs into the code to see how many of them are found, and from that, estimating how many actual bugs remain in the program. The formula [McConnell01] is:

    Total Defects = (Defects Planted / Planted Defects Found) × New Defects Found

For example, if 30 defects are planted throughout the code and your testers find 14 of those along with 40 new ones, the code likely contains about (30 / 14) × 40 ≈ 85 defects, of which roughly 45 remain to be found.
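The defect-seeding estimate is easy to compute directly. A minimal sketch of the formula, reproducing the worked numbers:

```python
def estimate_total_defects(planted, planted_found, new_found):
    """McConnell-style defect-seeding estimate:

    total ≈ (planted / planted_found) × new_found
    """
    if planted_found == 0:
        raise ValueError("no planted defects found; estimate undefined")
    return planted / planted_found * new_found

total = estimate_total_defects(planted=30, planted_found=14, new_found=40)
remaining = total - 40  # estimated real defects not yet found
print(int(total), int(remaining))  # 85 45 (truncated, as in the worked example)
```

Remember to remove the planted defects before shipping; the estimate is only as good as the assumption that seeded bugs are about as hard to find as real ones.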
Conclusion
At some point, you need to release your product to the public. When does testing stop? For practical reasons, the criterion is usually a specific amount of time during which bugs of a certain priority level have not been found, or a mathematical equation like that previously mentioned estimating the number of bugs remaining in the code. Other companies stop testing when they feel adequate test coverage has been given to the game design. And sadly, some companies stop when they run up against the final milestone deadline. Plan ahead and choose a criterion—and do so when you design your test cases, not when the deadline catches up with you.
Careful planning and accurate testing will help you avoid the stigma of a post-release patch. By testing often and testing early with a tracking system in place, a quality assurance manager can avoid many pitfalls that can delay a game's path to the marketplace.
References
[McConnell01] McConnell, S., "Gauging Software Readiness with Defect Tracking," available online at www.stevemcconnell.com/bp09.htm
[Rice02] Rice, R., "The Elusive Tester to Developer Ratio," available online at www.riceconsulting.com/tester_to_developer_ratio.htm
Software Assurance Technology Center, NASA, available online at http://satc.gsfc.nasa.gov
International Organization for Standardization, "ISO 9000," available online at www.iso.ch/iso/en/iso9000-14000/iso9000/iso9000index.html