1. Trang chủ
  2. » Công Nghệ Thông Tin

Wiley effective methods for software testing

1K 372 0

Đang tải... (xem toàn văn)

Tài liệu hạn chế xem trước, để xem đầy đủ mời bạn chọn Tải xuống

THÔNG TIN TÀI LIỆU

Thông tin cơ bản

Định dạng
Số trang 1.004
Dung lượng 3,19 MB

Nội dung

Effective Methods for Software Testing Third Edition Effective Methods for Software Testing Third Edition William E Perry Effective Methods for Software Testing, Third Edition Published by Wiley Publishing, Inc 10475 Crosspoint Boulevard Indianapolis, IN 46256 www.wiley.com Copyright © 2006 by Wiley Publishing, Inc., Indianapolis, Indiana Published simultaneously in Canada ISBN-13: 978-0-7645-9837-1 ISBN-10: 0-7645-9837-6 Manufactured in the United States of America 10 3MA/QV/QU/QW/IN No part of this publication may be reproduced, stored in a retrieval system or transmitted in any form or by any means, electronic, mechanical, photocopying, recording, scanning or otherwise, except as permitted under Sections 107 or 108 of the 1976 United States Copyright Act, without either the prior written permission of the Publisher, or authorization through payment of the appropriate per-copy fee to the Copyright Clearance Center, 222 Rosewood Drive, Danvers, MA 01923, (978) 750-8400, fax (978) 646-8600 Requests to the Publisher for permission should be addressed to the Legal Department, Wiley Publishing, Inc., 10475 Crosspoint Blvd., Indianapolis, IN 46256, (317) 572-3447, fax (317) 572-4355, or online at http://www.wiley.com/go/permissions Limit of Liability/Disclaimer of Warranty: The publisher and the author make no representations or warranties with respect to the accuracy or completeness of the contents of this work and specifically disclaim all warranties, including without limitation warranties of fitness for a particular purpose No warranty may be created or extended by sales or promotional materials The advice and strategies contained herein may not be suitable for every situation This work is sold with the understanding that the publisher is not engaged in rendering legal, accounting, or other professional services If professional assistance is required, the services of a competent professional person should be sought Neither the publisher nor the author shall be liable for damages arising herefrom The fact that an organization or Website is referred to in this work as a citation and/or a potential source of further information does not mean that the author or the publisher endorses the information the organization or Website may provide or recommendations it may make Further, readers should be aware that Internet Websites listed in this work may have changed or disappeared between when this work was written and when it is read For general information on our other products and services or to obtain technical support, please contact our Customer Care Department within the U.S at (800) 762-2974, outside the U.S at (317) 572-3993 or fax (317) 572-4002 Library of Congress Control Number: 2005036216 Trademarks: Wiley and related trade dress are registered trademarks of Wiley Publishing, Inc., in the United States and other countries, and may not be used without written permission All other trademarks are the property of their respective owners Wiley Publishing, Inc., is not associated with any product or vendor mentioned in this book Wiley also publishes its books in a variety of electronic formats Some content that appears in print may not be available in electronic books This book is dedicated to my wife Cynthia, who for many years has been “testing” my ability to live in accordance with our marriage vows She taught me that testing is a lifelong process, that testing is necessary to ensure that you are meeting your objectives, and that testing can be fun if it is performed correctly Thank you, Cynthia What you have taught me is incorporated into many of the concepts in this book About the Author William E Perry holds degrees from Clarkson University, University of Rochester, and Rochester Institute of Technology Bill also holds the following professional certifications: CPA (Certified Public Accountant), CIA (Certified Internal Auditor), CISA (Certified Information Services Auditor), CSQA (Certified Software Quality Analyst), and CSTE (Certified Software Tester) He has been an examiner for the Malcolm Baldrige National Quality Award, and served on standards committees for NIST (National Institute of Standards and Technology), IEEE (Institute of Electrical and Electronics Engineers), AICPA (American Institute of Certified Public Accountants) and ISACA (Information Systems Audit and Control Association) In 1980, Bill founded the Quality Assurance Institute (QAI), a professional association for testers QAI offers professional certification for Quality Assurance, Software Testing, Software Project Leaders and Business Analyst Professional More than 27,000 individuals have been certified since the inception of the program Bill has authored more than 50 books, many published by John Wiley & Sons He recently founded the Internal Control Institute (ICI) ICI and St Petersburg College recently formed the Internal Control Center of Excellence to share best internal control practices, hold conferences on emerging internal control practices, and to offer e-learning courses and a professional certification in internal control vi Credits Executive Editor Robert Elliott Production Editor Felicia Robinson Editorial Manager Mary Beth Wakefield Production Manager Tim Tate Vice President and Executive Group Publisher Richard Swadley Vice President and Executive Publisher Joseph B Wikert Graphics and Production Specialists Carrie Foster Mary J Gillot Lauren Goddard Denny Hager Joyce Haughyey Stephanie D Jumper Rashell Smith Quality Control Technicians John Greenough Brian H Walls Proofreading and Indexing Techbooks Project Coordinator Michael Kruzil vii Contents Introduction xxv Part I Assessing Testing Capabilities and Competencies Chapter Assessing Capabilities, Staff Competency, and User Satisfaction The Three-Step Process to Becoming a World-Class Testing Organization Step 1: Define a World-Class Software Testing Model Customizing the World-Class Model for Your Organization Step 2: Develop Baselines for Your Organization Assessment 1: Assessing the Test Environment Implementation Procedures Verifying the Assessment Assessment 2: Assessing the Capabilities of Your Existing Test Processes Assessment 3: Assessing the Competency of Your Testers Implementation Procedures Verifying the Assessment 3 8 13 13 14 14 16 Step 3: Develop an Improvement Plan Summary 16 18 Part II Building a Software Testing Environment 35 Chapter Creating an Environment Supportive of Software Testing Minimizing Risks 37 38 Risk Appetite for Software Quality Risks Associated with Implementing Specifications Faulty Software Design Data Problems 38 39 39 39 ix Index software quality requirements, acceptance testing, 497 software security testing See also security accidental versus intentional losses, 738–739 adequacy evaluation, 761 baseline information accurate and precise information, 741 analysis, 750 availability, 741 baseline awareness training, 747 categories, 739 collection method training, 747 corporate language adjustments, 741 data collection methods, 744–747 forms completion training, 747–748 methods, 743 objectives, 751 one-time data collection procedures, 741 reasons for, 740 resources protected concerns, 743 security sources, 740 status, reporting, 749 support concerns, 744 team member selection, 742–743 training concerns, 743 what to collect, 740 central computer sites, 746 check procedures, 762 communication lines and networks, 746 examples of, 223 fraud studies, 761 guidelines, 762 hacking, 757 how to use, 228 input, 735 objectives, 228, 734 office equipment, 746 output, 762 overview, 733 penetration points business transaction control points, 755–756 characteristics of, 756 developing, 757–760 development activities, 753–754 first priority, 760 interface activities, 752 operations activities, 754–755 second priority, 760 staff activities, 751 third priority, 760 protection points, 746 remote computer sites, 746 results, 760–761 risk assessment, 752 storage areas, 746 vulnerabilities central processors, 738 computer operations, 738 computer programs, 736–737 data and report preparation facilities, 737 data handling areas, 738 digital storage facilities, 738 discussed, 735 impersonation, 737 input data, 736 IT operations, 736 media, 737 non-IT areas, 738 online terminal systems, 738 operating system access and integrity, 737 output data, 736 physical access, 736 programming offices, 738 test processes, 736 when to use, 228 work papers, 763 workbench concept, 734–735 software summary, document development, 172 959 960 Index software systems, V-concept testing, 76–77 software version changes, operational testing, 509–511 source code analysis, postimplementation analysis, 579 source-data, preventive controls, 658–659 space allocation, monitoring, 773 span of operation criteria, documentation, 174 special-value testing, 237 specifications implementing, risks associated with, 38–39 system test plan standards, 81–82 variance from, 65 spiral testing methodology, 588 RAD testing, 638 sponsor responsibilities, 590 SRS (System Requirements Specifications), 595 SSS (system security specialist), 591 staff administrative/organizational barriers, 865 client/server testing readiness, 615 competency barriers, 865 efficiency measures, 852–853 profile information, 214 experience evaluation, 600 people management through, 871 software security testing, 751 verification testing concerns, 294 stakeholder perspective, improvement barriers, 861–864 standards development activities, 753 documentation process, 770 policy criteria, 45 system test plan, 79–82 unit test plan, 83 start dates Project Status reports, 468 timelines, 838 start time delays, inspection process concerns, 255 starting early, test plan development, 262 startup failure, post-implementation analysis, 579 state machines, functional testing, 238 state of requirements, 592 state of the art technology, tool use, 108 statement testing of, 239 static analysis program verification, 55 programming testing, 324 verification testing, 293 statistics operations process, 773 profile information, 215 statistical testing, 241 status See project status calculation stop it from happening concept, stakeholder perspective, 862 storage areas, software security testing, 746 storage efficiency, best practices, 850 strategies business innovation, 878 converting to testing tactics, 83–85 risk assessment, 56–57 system development phases, identifying, 56 test factors, selecting and ranking, 56 V-concept testing, 74–75 strengths, building on, 857–860 stress testing examples of, 223 how to use, 224 objectives, 224 test data for, 430 tools associated with, 104 validation testing, 435 Web-based systems testing, 804, 808 when to use, 224 Index stress/performance scripting, 431 strong controls, internal controls testing, 677 structural desk debugging, 325 structural testing See also functional testing advantages/disadvantages, 69 branch testing, 239 conditional testing, 234 COTS software testing, 703–704 defined, 69 expression testing, 240 feasibility reviews, 70 multiplatform environment testing, 723–725 path testing, 240 reasons for, 69 requirements reviews, 70 statement testing, 239 stress testing examples of, 223 how to use, 224 objectives, 224 test data for, 430 tools associated with, 104 validation testing, 435 Web-based systems testing, 804, 808 when to use, 224 structural analysis, 238–239 types of validation techniques, 69–70 verification techniques, 69–70 work papers, 87–91 structure components, acceptance testing, 492 substantiation objectives, cycle control objectives, 675 success factors effectiveness measures, 852 people management, 871 verification testing, 293 summaries, report, 479–480 Summary Status reports, 466–468 symbolic execution tools, 107, 239 syntactical desk debugging, 325 system access, application risks, 304 system analyst perspective, software development methodologies, 593 system boundary diagrams, 500–501 system building concepts, workbench concept, 72 system chains, revised testing approach, 50 system component identification, test plan development, 221 system control objectives, internal controls testing, 674 system development life cycle See SDLC system development processes, data warehouse testing, 770–771, 779 system independence, best practices, 850 system log tools, 107 system maintenance, test tactics, 75 System Requirements Specifications (SRS), 595 system security specialist (SSS), 591 system skills, tool selection considerations, 111, 113–114 system testing as functional tests, 70 validation, 221 Web-based systems testing, 807 system/subsystem specifications, document development, 173 T talking the talk, management support, 51 target dates, project status calculation, 194, 468 teachable processes, 154 teams agile testing process, 820–821 appointing, 168–171 assessment baseline development, 10 verification testing, 317 961 962 Index teams (continued) baseline information gathering, 742–743 combination, 170–171 composition, 169 cultural barriers, 875 design reviews, 321 external, 170 internal, 169–170 non-IT, 170 report measurement, 465 risk team establishment, 302–303 tester agility, 842 walkthroughs, 313 technical interface activities, 752 technical reviews, configuration management, 603 technical risk assessment, work papers, 93–96 technical skills, tool selection considerations, 111, 114 technical status, Summary Status reports, 467 techniques versus tools, 103 technological developments, revised testing approach, 50 technology issues, COTS software testing challenges, 690 telephone and network switching equipment, test plan development, 222 tenure, tool managers, 120 termination analysis, postimplementation analysis, 579 test case generators, Web-based systems testing, 809 test data generator tools, 107 test factors programming testing, 326–328 selecting and ranking, 56 verification testing, 299–302 work papers, 62 test managers, 167–168 test matrix batch tests, 248 defined, 245 online system test, 248 risk assessment, 57 software methods, 246 structural attributes, 246 test matrix example, 246 verification tests, 248–249 test plan development administrative components, 250–251 automated test tools, 221 check procedures, 262 constraints, 214 contingency plans, 222 core business areas and processes, 221 cost/schedules, 214 customer and user involvement, lack of, 210 data exchange issues, 222 databases built/used, 214 deliverables, 213 facility requirements, 222 flexibility, 262 implementation, 214, 251 input, 212 inspection, 254–258 interfaces, to other systems, 214 lack of testing tools, 210 lack of training concerns, 210 legal/industry issues, 214 lose-lose situations, 211 objectives, 210, 213, 245 output, 262 over-reliance, 210 potential failures, assessing severity of, 221 projects, profiling, 212–215 RAD (rapid application development), 210 readability, 262 resources needed, 221 reviews, 262 Index risk assessment, 27, 215–218 schedule issues, 222 staff competency, 214 starting early, 262 statistics, 214 system component identification, 221 telephone and network switching equipment, 222 test matrix development, 245–248 testing concerns matrix, 219–220 testing technique selection, 222–223 unit testing and analysis, 235 us versus them mentality, 210 validation strategies, 221 V-concept testing, 157–158 work papers administrative checkpoint, 273–274 batch tests, 267 general information, 271 inspection report, 276–280 milestones, 272 moderator checklist, 275 objectives, 264 online system tests, 268 quality control, 283–287 software module, 265 structural attribute, 266 test matrix, 270 verification tests, 269 workbench concept, 211 written example, 252–254 test processes vulnerabilities, 736 work papers, 24–27 test room configurations, multiplatform environment testing, 723 test scripts developing, 430–432 discussed, 104 execution, 433 integration scripting, 431 maintaining, 434 pseudo-concurrency scripting, 431 regression scripting, 431 results analysis, 433 stress/performance scripting, 431 tools, 107 unit scripting, 431 test to business effectiveness, 578 test verification, SDLC, 53, 55 testability, software quality factors, 844 test-data approach, internal controls testing, 669–672 testers acceptance testing roles, 44 change recommendation roles, 44 competency baseline development, 14–16 CBOK (Common Body of Knowledge), 125–128 competent assessment, 128 CSTE (Certified Software Tester) certificate, 125, 127–128 fully competent assessment, 128 job performance roles, 126–127 negative attitudes, 130 not competent assessment, 128, 130 training curriculum development, 128–130 work papers, 28–32 developmental improvement roles, 44 documentation specification requirement roles, 44 motivation factors, 51 process improvement roles, 44 requirements measures, 594 risk appetite, 38 tester’s workbench See workbench concept Testing Action report, 477 testing guidelines See guidelines, testing testing methodology cube, 83, 85 testing metrics, post-implementation analysis, 578 testing strategies See strategies testing tactical dashboard indicators, third priority, probable penetration points, 760 963 964 Index throughput testing, Web-based systems testing, 804 time improvement ideas, variability measures, 841 time-compression efforts best practices, 825 calendar-day efficient, 823–824 challenges, 824 implementation procedures, 886–888 readiness criteria, 826 solutions to, 825 V-concept testing, 826–827 work papers, 828–830 workbench concept, 834 timelines change estimation, 837 critical path definition, 836 documentation, 180–181 end dates, 838 in control processes, 832 input, 837 no-rework days, 839 out of control processes, 832 output, 837 project name, 838 report generation, 837 start dates, 838 work papers, 896–905 workday, 839 timeouts, Web-based systems testing, 807 tool managers assistant, 119 mentors, 119 need for, 117 positions, prerequisites to creating, 118 responsibilities of, 117, 119–120 skill levels, 118–119 tenure, 120 tools automatic, 104 boundary value analysis, 105 capture, 105 cause-effect graphing, 105 checklist, 105 code comparison, 105 compiler-based analysis, 105 confirmation, 105 control flow analysis, 105 correctness proof, 105 costs, 114–116 data dictionary, 105, 770 data flow analysis, 105, 238–239 database, 105 design reviews, 105 design-based functional testing, 105 desk checking, 106 disaster testing, 106 documentation, 124 error guessing, 106, 721–722 examination, 105 executable specs, 106 fact finding, 106 flowchart, 106 inspection, 106 instrumentation, 106 integrated test facility, 106 lack of, 210 life cycle phase testing, 109–111 logging, 107 manual, 104 mapping, 106 matching to its use, 109 matching to skill levels, 111–114 modeling, 106 most used, 108 parallel operation, 106 parallel simulation, 107 peer review, 107 playback, 105 ratio, 107 relationship, 107 replacements, 120 risk matrix, 107 scoring, 107 selection considerations, 108–109, 121–123 Index snapshot, 107 specialized use, 108 state of the art technology, 108 stress testing, 104 symbolic execution, 107, 239 system log, 107 techniques versus, 103 test data, 107 test script, 107 tracing, 108 use case, 108 utility program, 108 walkthrough, 108 traceability best practices, 849 requirements measures, 594 tracking systems, project status calculation, 190 traditional system development, test tactics, 75 training barriers/obstacle solutions, 868 baseline awareness, 747 best practices, 850 collection methods, 747 concerns, baseline information, 743 curriculum, 128–130 development activities, 753 failures, operational testing, 524 forms completion, 747–748 lack of, 210 management support needs, 51 manual support testing, 229 material, operational testing, 519–522 tool usage, 116–117 transaction flow testing, 672–673 transaction processing events, 414, 724 transactions authorization, 755 destruction, 756 origination, 755 retrieval, 756 turnaround time, stress testing, 223 usage, 756 transferability, critical success factors, 696 treasury, cycle control objectives, 675 troubled projects, inadequate estimation, 181–182 turnaround documents, preventive controls, 659 U unauthorized access, 766 undetected defects, 66 unit scripting, 431 unit testing as functional tests, 70 quality control, 244 test plan standards, 83 validation, 221 Web-based systems testing, 807 unknown conditions COTS software testing challenges, 689 undetected defects, 66 update controls, enterprise-wide requirements, 767 upgrades, tool manager duties, 119–120 us versus them mentality, 210 usability requirements measures, 594 software quality factors, 844 Web-based systems testing, 800, 806, 808 usage enterprise-wide requirements, 768 training testers in, 116–117 transaction, 756 use cases acceptance testing, 500–503 tools, 108 validation testing, 412 user acceptance as functional tests, 70 implementation procedures, 885 Web-based systems testing, 808 user education, client/server testing readiness, 614 965 966 Index user manuals, document development, 174 users functional testing phases, 70 outside users, revised testing approach, 50 participation, post-implementation analysis, 578 profile information, 213 reaction evaluation, 576 roles and responsibilities, 6, 498, 590 skills, tool selection considerations, 111–112 utilities and commands, operations activities, 754 utility program tools, 108 V validation testing acceptance testing, 221 audit trails, 435 authorization, 434 check procedures, 439 compliance, 434 concerns, 410 correctness, 435 coupling, 434 design goals, defining, 414 disaster testing, 436 ease of operation, 436 ease of use, 435 execution, 434–436 file design, 413–414 file integrity, 435 functional testing, 69–70 guidelines, 439–440 input, 411 inspections, 436 integration testing, 221 maintainability, 436 objectives, 410 output, 439 overview, 408 payroll application example, 416–429 performance, 434 portability, 436 preventive controls, 659–661 recovery testing, 435 reliability, 434 results, documenting conditions, 436, 438–439 deviation, 437 effect, 436, 438 security, 435 service levels, 435 stress testing, 435 structural testing, 69–70 system testing, 221 test data creating, 415–416 entering, 414 sources, 412 for stress testing, 430 test plan development, 221 test scripts developing, 430–432 execution, 433 levels of, 430 maintaining, 434 results analysis, 433 transaction-processing programs, 414 unit testing, 221 V-concept testing, 158 work papers audit trails, 445 compliance, 443, 450 correctness, 451 coupling, 455 ease of operations, 456 ease of use, 452 file integrity, 444 functional testing, 442 maintainability, 453 performance, 448 portability, 454 problem documentation, 457 quality control, 458 recovery testing, 446 Index security, 449 stress testing, 447 test script development, 441 workbench concept, 410–411 value received, administrative/organizational barriers, 866 variability measures agile testing and, 820 check procedures, 834 competency measures, 835 discussed, 831 external/internal work processes, 835 quality control, 841 reduction, 835 rework factors, 835 root cause identification, 840 time improvement ideas, 841 timelines change estimation, 837 critical path definition, 836 end dates, 838 in control processes, 832 input, 837 no-rework days, 839 out of control processes, 832 output, 837 project name, 838 report generation, 837 start dates, 838 workday, 839 workbench concept, 833–834 variance from specifications, 65 V-concept testing acceptance testing, 158–159 analysis, 158 customization, 160–161 development project types, 75 discussed, 72–73 facts, managing, 162 importance of, 68 multiplatform environment testing, 725–726 objectives, 159–160 operational testing, 158–159 organization, 157 overview, 156 post-implementation analysis, 159 process management, 161–162 project phases, 79 project scope, 77 reporting, 158 results management, 162 risk identification, 77–79 software system types, 76–77 strategic objectives, determining, 74–75 strategies, converting to testing tactics, 83–85 test plan development, 157–158 test plan standards, 79–82 testing methodology cube, 83, 85 time-compression efforts, 826–827 unit test plan standards, 83 validation testing, 158 verification testing, 158 workbench concept with, 162–163 vendors COTS implementation risks, 689 interface activities, 752 reputation importance, 690–691 tool manager duties, 119 verification testing adequate control assessment, 310 application risks, 304–308 base case, 293 baseline development, 13 check procedures, 330 concerns, 294 control objectives, 303 debugging, 292 design deliverables, inspecting, 322 design phase, 296–297 design reviews, 320–322 discussed, 298 functional testing, 69–70 guidelines, 331–332 input, 296–297 967 968 Index verification testing (continued) inspections, 292 for large documents, 248–249 objectives, 293 output, 331 programming phase, 297, 323–324 programming testing acceptance testing, 324 complexity of, 323 desk debugging, 325–326 dynamic testing, 324 importance of, 323 peer reviews, 328–330 static analysis, 324 test factor analysis, 326–328 project leader assessment, 317 requirements phase, 296 requirements tracing, 292, 314–315 reviews, 292 risk matrix, 293, 302 scoring success factors, 316–318 static analysis, 293 structural testing, 69–70 success factors, 293 team assessment, 317 test factor analysis, 310–312, 318–319 test factors, 299–302 V-concept testing, 158 walkthroughs, 292, 312 work papers access defined, 347, 366 audit trails, 363, 394 authorization rules, 342, 359–360, 391–392 business system design review, 377–380 computer applications risk scoring form, 349–353 computer processing control procedure, 354 computer systems design review, 381–385 contingency planning, 364, 395 coupling, 404 data integrity controls, 357–358, 389–390 design compliance, 367–371 design criteria, 374–375 failure impact, 345 file integrity, 343, 361–362, 393 functional specifications, 334 interface design, 373 maintenance specifications, 336 needs communicated, 376 online processing controls, 355–356 operation procedure development, 405 operational needs, 340 output control, 355 performance criteria, 339 portability needs, 337 program compliance, 398–402 quality control, 348, 387–388, 405 reconstruction requirements, 344 requirements compliance with methodology, 333 security implementation, 397 service levels defined, 346, 365, 396 system interface factors, 338 tolerances, 341 usability specifications, 335 workbench concept, 294–295 verification, validation, and testing (VV&T), 589 version changes, operational testing, 509–511 video cards, Web-based systems testing, 805 viruses, Web-based systems testing, 803 vulnerabilities central processors, 738 computer operations, 738 computer programs, 736–737 data and report preparation facilities, 737 data handling areas, 738 digital storage facilities, 738 discussed, 735 Index impersonation, 737 input data, 736 IT operations, 736 media, 737 non-IT areas, 738 online terminal systems, 738 operating system access and integrity, 737 output data, 736 physical access, 736 programming offices, 738 test processes, 736 VV&T (verification, validation, and testing), 589 W walkthroughs of customer/user area, 212–213 final reports, 314 presentations, 313–314 questions/recommendations, responding to, 314 rules, establishing, 312–313 team selection, 313 tools, 108 verification testing, 292, 312 WAN (wide area network), 800 waterfall methodology, 587 WBS (Work Breakdown Structure), 603 weak controls, internal controls testing, 678 weaknesses, minimizing, 857–860 Web sites QAI (Quality Assurance Institute), 15, 125 Software Certifications, 125 Web-based systems testing access control, 803 authorization, 803 back-end processing, 800 bandwidth access, 805 browser compatibility concerns, 800, 805–806 caching, 805 calculation correctness, 804 check procedures, 809 code verification concerns, 800 compatibility concerns, 804–805, 808–809 component testing, 807 concurrency testing, 804 correctness concerns, 800, 804 crashes, 807 dropped lines, 807 dynamic page generation, 806 e-mail functions, 806 file downloads, 806 graphics filters, 805 guidelines, 810 hardware compatibility, 805 HTML tools, 809 input, 801–802 integration concerns, 800 integration testing, 807 load testing, 808 lost connections, 807 memory, 805 monitors, 805 multimedia support, 805 navigation correctness, 804 output, 810 overview, 799 performance concerns, 800, 803–804 performance testing, 808 print handling, 805 recoverability concerns, 807 regression testing, 808 reliability concerns, 806 reload pages, 805 risk assessment, 802–803 security concerns, 800, 803 site validation tools, 809 stress testing, 804, 808 system testing, 807 test case generators, 809 throughput testing, 804 timeouts, 807 unit testing, 807 969 970 Index Web-based systems testing (continued) usability concerns, 800, 806 usability testing, 808 user acceptance testing, 808 video cards, 805 viruses, 803 work papers quality control checklist, 815 risk assessment, 812 testing tool selection, 814 testing types, 813 workbench concept, 800–801 weighted criteria score, documentation, 175–177 wide area network (WAN), 800 Work Breakdown Structure (WBS), 603 work flow, COTS software testing, 698–699 work papers acceptance testing acceptance criteria, 527 acceptance plan form creation, 544–545 automated application criteria, 566–567 change control forms, 546 checklist, 550–551 data change forms, 547–548 deletion instructions, 538–539 installation phase, 531 inventory material, 552–553 production change instructions, 536–537 program change form completion, 540–541, 549 program change history, 534–535 quality control checklist, 560–563 recovery planning data, 532–533 system boundary diagram, 528 system problem form completion, 542–543 test cases, 530 training checklist form completion, 558–559 training failure notification, 568–569 training module form completion, 556–557 training plan form completion, 554–555 training quality control checklist, 564–565 use cases, 504–507, 529 agile testing process, 923–927 barriers communication barriers, 920–922 cultural barriers, 919 stakeholder analysis, 915–918 best practices, 906–908 CBOK (Common Body of Knowledge) individual competency evaluation, 149 new information technology, 148 project management, 135–138 security procedure assessment, 146–147 software controls, 146 test environment, building, 133–135 test planning, 138–143 test status and analysis, 143–144 test team competency evaluation, 150 testing principles and concepts, 132–133 user acceptance testing, 144–145 client/server systems testing client data, 627–628 footprint chart, 631 quality control checklist, 632 readiness results, 630 security, 626–627 standards, 628–629 system installation, 625 COTS (commercial off-the-shelf) software testing completeness tests, 707–708 functional testing, 710 quality control checklist, 712–715 structural testing, 711 Index data warehouse testing access control, 785 activity process, 797 audio trails, 784 concerns rating, 788, 795–796 continuity of processing, 792 data, placing in wrong calendar period, 787 documentation, 791 fraud, 789 inadequate responsibility assignment, 781 inadequate service levels, 786 incomplete data concerns, 782 management support concerns, 794 performance criteria, 793 quality control checklist, 798 reviews, 790 update concerns, 783 documentation completeness, 202 estimation, 203–205 quality control checklist, 207–209 weighted criteria calculation, 201 footprint chart, 23 internal controls testing documentation, 679 file control, 683 input controls, 679–681 output control, 682 program and processing controls, 681–682 methodologies, software development analysis footprint, 609 self-assessment, 607–608 multiplatform environment testing concerns, 728 configurations, 728 quality control checklist, 731–732 validity, 729–730 post-implementation analysis, 582 RAD (rapid application development) testing applicability checklist, 644 conceptual model development, 646–647 logical data model development, 647 production system development, 651–652 production system release, 652 quality control checklist, 653 scope and purpose of system definition, 645–646 specifications, revising, 650–651 system development, 648–649 test system release, 652 reports defect reporting, 484–485 quality control, 486–487 writing guidelines, 488–489 risk score analysis, 99 self-assessment, 909–914 size risk assessment, 97–98 software security testing, 763 software testing environment, 19–22 structural testing, 87–91 technical risk assessment, 93–96 test factor ranks, 58–59, 61 test plan development administrative checkpoint, 273–274 batch tests, 267 general information, 271 inspection report, 276–280 milestones, 272 moderator checklist, 275 objectives, 264 online system tests, 268 quality control, 283–287 software module, 265 structural attribute, 266 test matrix, 270 verification tests, 269 tester competency assessment, 28–32 time-compression efforts, 828–830 timelines, 896–905 tool use documentation, 124 selection considerations, 121–123 971 972 Index work papers (continued) validation testing audit trails, 445 compliance, 443, 450 correctness, 451 coupling, 455 ease of operations, 456 ease of use, 452 file integrity, 444 functional testing, 442 maintainability, 453 performance, 448 portability, 454 problem documentation, 457 quality control, 458 recovery testing, 446 security, 449 stress testing, 447 test script development, 441 verification testing access defined, 347, 366 audit trails, 363, 394 authorization rules, 342, 359–360, 391–392 business system design review, 377–380 computer applications risk scoring form, 349–353 computer processing control procedure, 354 computer systems design review, 381–385 contingency planning, 364, 395 coupling, 404 data integrity controls, 357–358, 389–390 design compliance, 367–371 design criteria, 374–375 failure impact, 345 file integrity, 343, 361–362, 393 functional specifications, 334 interface design, 373 maintenance specifications, 336 need communicated, 376 online processing controls, 355–356 operation procedure development, 405 operational needs, 340 output control, 355 performance criteria, 339 portability, 337, 372, 403 program compliance, 398–402 quality control, 348, 387–388, 405 reconstruction requirements, 344 requirements compliance with methodology, 333 security implementation, 397 service levels defined, 346, 365, 396 system interface factors, 338 tolerances, 341 usability specifications, 335 Web-based systems testing quality control checklist, 815 risk assessment, 812 testing tool selection, 814 testing types, 813 workbench concept acceptance testing, 494 client/server systems testing, 613 COTS software testing, 692 data warehouse testing, 766–767 input component, 71 internal controls testing, 667 multiplatform environment testing, 719–720 multiple workbenches, 72 organization, 166 output components, 71 post-implementation analysis, 572–573 procedures to check, 71 procedures to do, 71 process advantages, 163 RAD (rapid application development) testing, 635–636 results analysis, 460 software security testing, 734–735 system building concepts, 72 test plan development, 211 Index testing tools and, 71 time-compression efforts, 834 validation testing, 410–411 variability measures, 833–834 with V-concept testing, 162–163 verification testing, 294–295 Web-based systems testing, 800–801 workday timelines, 839 working documents, 178 world-class testing organization baseline development assessment teams, 10 capabilities assessment, 13–14 cause-effect diagram, drivers, environment assessment, footprint charts, 10 implementation procedures, results assessments, 11–12 verification, 13 discussed, improvement planning, 16–18 PDCA (plan-do-check-act) cycle, 4–5 software development process, software testing model definition discussed, organization, self assessment, wrong specifications, defects, 65, 471 973

Ngày đăng: 18/04/2017, 11:07

TỪ KHÓA LIÊN QUAN