Collegiate Sports Paging System Test Plan Version 1.0
 Revision History 
 Table of Contents
Introduction
| Document (and version / date) | Created or Available | Received or Reviewed | Author or Resource | Notes |
| Vision Document | Yes | Yes | Context Integration | |
| Supplemental Specification | Yes | Yes | Context Integration | |
| Use Case Reports | Yes | Yes | Context Integration | |
| Project Plan | Yes | Yes | Context Integration | |
| Design Specifications | No | No | | |
| Prototype | Yes | Yes | Context Integration | |
| Project / Business Risk Assessment | Yes | Yes | Context Integration | |

The listing below identifies those items (use cases, functional requirements, and non-functional requirements) that have been identified as targets for testing; it represents what will be tested.
Verify that subscriber information can be entered and retrieved.
Verify that content and categories can be inserted and displayed.
Verify that advertiser profiles and account information can be entered and displayed.
Verify that subscriber-specific usage information is tracked.
Verify that subscribers see the information for which they have requested paging.
Verify that pages go to subscribers when content arrives (see the sketch following this list).
Verify that automatic content insertion works.
Verify that editor approval causes non-automatic content to be inserted.
Verify that subscribers who have lapsed subscriptions do not receive pages.
Verify that content marked as archived is not re-displayed to subscribers.
Verify that obsolete content is deleted.
Verify that advertiser reports are accurate.
Verify that advertiser reports can be received in Word, Excel, or HTML.
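To make the intent of the subscriber-paging items concrete, the sketch below (in Python, with hypothetical names such as PagingService; the real dispatcher API is defined in the design specifications) checks that a subscriber is paged only for content in a category they requested:

```python
# Hypothetical sketch: PagingService and its methods are illustrative
# stand-ins, not the system's actual API.
import unittest

class PagingService:
    """Minimal in-memory stand-in for the paging dispatcher."""
    def __init__(self):
        self.subscriptions = {}   # subscriber_id -> set of category names
        self.sent_pages = []      # (subscriber_id, content) tuples

    def subscribe(self, subscriber_id, category):
        self.subscriptions.setdefault(subscriber_id, set()).add(category)

    def publish(self, category, content):
        # Page every subscriber who requested this category.
        for sub_id, cats in self.subscriptions.items():
            if category in cats:
                self.sent_pages.append((sub_id, content))

class TestSubscriberPaging(unittest.TestCase):
    def test_page_sent_only_for_requested_category(self):
        svc = PagingService()
        svc.subscribe("alice", "basketball")
        svc.publish("basketball", "Final score: 78-74")
        svc.publish("hockey", "Game postponed")
        self.assertEqual(svc.sent_pages, [("alice", "Final score: 78-74")])

if __name__ == "__main__":
    unittest.main()
```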
None.
Navigate through all use cases, verifying that each UI panel can be easily understood.
Verify all online Help functions.
Verify that all screens conform to the WebNewsOnLine standards.
Verify response time of interface to Pager Gateway system.
Verify response time of interface from existing WebNewsOnLine web server.
Verify response time when connected using 56Kbps modem.
Verify response time when connected locally (on the same LAN).
Verify system response with 200 concurrent subscribers.
Verify system response with 500 concurrent subscribers.
Verify system response with 1,000 concurrent subscribers.
Verify system response with 5,000 concurrent subscribers.
Verify system response with 10,000 concurrent subscribers.
Verify system response with 50,000 concurrent subscribers.
Verify system response with 100,000 concurrent subscribers.
Verify system response with 200,000 concurrent subscribers (a load-harness sketch follows this list).
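A minimal load-harness sketch for stepping through the subscriber levels above is shown below; the transaction body, the thread-based client model, and the percentile reporting are illustrative assumptions, not the project's actual tooling:

```python
# Load sketch (hypothetical): drives N concurrent simulated subscribers
# against a request function and reports response-time statistics.
import time
import statistics
from concurrent.futures import ThreadPoolExecutor

def one_subscriber_request():
    """Stand-in for a real subscriber transaction (e.g., an HTTP GET)."""
    start = time.perf_counter()
    time.sleep(0.01)  # replace with the actual request against the test server
    return time.perf_counter() - start

def run_load(concurrent_subscribers, requests_per_subscriber=10):
    """Drive the request function from many worker threads at once."""
    with ThreadPoolExecutor(max_workers=concurrent_subscribers) as pool:
        futures = [pool.submit(one_subscriber_request)
                   for _ in range(concurrent_subscribers * requests_per_subscriber)]
        timings = sorted(f.result() for f in futures)
    return {"n": len(timings),
            "median_s": statistics.median(timings),
            "p95_s": timings[int(0.95 * len(timings)) - 1]}

# Step through the lower subscriber levels listed above; the larger levels
# would require a distributed driver rather than threads on one machine.
for level in (200, 500, 1000):
    print(level, run_load(level))
```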
None.
Verify that pages are sent out within 5 minutes when a single content element arrives.
Verify that pages are sent out within 5 minutes when content arrives every 20 seconds (a timing sketch follows).
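The following sketch illustrates how the 5-minute delivery window could be checked against recorded arrival and send timestamps; the dispatcher itself is stubbed here with simulated timestamps:

```python
# Timing sketch (hypothetical dispatcher stub): content arrives every
# 20 seconds, and each resulting page must be sent within 5 minutes.
import time

MAX_LATENCY_S = 5 * 60

def check_latencies(arrival_and_send_times):
    """arrival_and_send_times: list of (arrived_at, sent_at) timestamps."""
    late = [(a, s) for a, s in arrival_and_send_times if s - a > MAX_LATENCY_S]
    assert not late, f"{len(late)} pages exceeded the 5-minute window: {late[:3]}"

# Simulated run: one content element every 20 s, each paged 90 s later.
now = time.time()
samples = [(now + i * 20, now + i * 20 + 90) for i in range(50)]
check_latencies(samples)
print("all pages within 5 minutes")
```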
Verify that non-subscribers cannot access subscriber-only information.
Verify that non-editors cannot approve content.
Verify that advertisers see only their own advertising content (an access-control sketch follows this list).
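A sketch of the kind of gate check these items imply appears below; the path names, roles, and the can_access function are hypothetical stand-ins for the application's real authorization logic:

```python
# Access-control sketch (hypothetical gate function; the real check
# lives in the web tier). Non-subscribers must be denied subscriber pages.
SUBSCRIBER_ONLY = {"/pages/inbox", "/profile/usage"}

def can_access(path, role):
    """role: 'subscriber', 'advertiser', 'editor', or 'anonymous'."""
    if path in SUBSCRIBER_ONLY:
        return role == "subscriber"
    return True

assert can_access("/pages/inbox", "subscriber")
assert not can_access("/pages/inbox", "anonymous")
assert not can_access("/profile/usage", "advertiser")
print("subscriber-only paths correctly gated")
```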
None.
Verify operation using the Netscape V4.x browser.
Verify operation using the Microsoft Internet Explorer V5.x browser.
None.

| Test Objective: | Ensure database access methods and processes function properly and without data corruption. |
| Technique: | |
| Completion Criteria: | All database access methods and processes function as designed and without any data corruption. |
| Special Considerations: | |
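As one way to exercise a database round trip, the sketch below uses an in-memory SQLite database as a stand-in for the production database; the subscriber schema shown is illustrative, not the system's actual schema:

```python
# Database round-trip sketch: write, read back, and compare, then confirm
# that a constraint rejects invalid data (schema names are illustrative).
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE subscriber (
    id INTEGER PRIMARY KEY,
    email TEXT NOT NULL UNIQUE,
    status TEXT NOT NULL CHECK (status IN ('active', 'lapsed')))""")

# Valid data: insert, then read back and compare field by field.
con.execute("INSERT INTO subscriber (email, status) VALUES (?, ?)",
            ("fan@example.edu", "active"))
row = con.execute("SELECT email, status FROM subscriber").fetchone()
assert row == ("fan@example.edu", "active"), "data corrupted on round trip"

# Invalid data: the CHECK constraint must reject an unknown status.
try:
    con.execute("INSERT INTO subscriber (email, status) VALUES (?, ?)",
                ("x@example.edu", "bogus"))
    raise AssertionError("invalid status was accepted")
except sqlite3.IntegrityError:
    pass  # expected rejection
print("round trip and constraint checks passed")
```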
| Test Objective: | Ensure proper target-of-test functionality, including navigation, data entry, processing, and retrieval. |
| Technique: | Execute each use case, use-case flow, or function, using valid and invalid data, to verify the following: |
| Completion Criteria: | |
| Special Considerations: | None. |
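A data-driven sketch of this technique follows, exercising one assumed entry routine (validate_subscriber, a hypothetical stand-in for the real data-entry logic) with both valid and invalid records:

```python
# Data-driven sketch: exercise one use-case step (entering subscriber
# information) with valid and invalid inputs and check the outcome.
import re

def validate_subscriber(record):
    """Hypothetical stand-in for the application's entry validation."""
    errors = []
    if not record.get("name", "").strip():
        errors.append("name required")
    if not re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", record.get("email", "")):
        errors.append("invalid email")
    return errors

cases = [
    ({"name": "Pat Lee", "email": "pat@example.edu"}, True),   # valid record
    ({"name": "", "email": "pat@example.edu"}, False),         # missing name
    ({"name": "Pat Lee", "email": "not-an-email"}, False),     # bad email
]
for record, should_pass in cases:
    ok = not validate_subscriber(record)
    assert ok == should_pass, f"unexpected result for {record}"
print(f"{len(cases)} data-driven cases passed")
```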
| Test Objective: | Verify the following: |
| Technique: | Create or modify tests for each window to verify proper navigation and object states for each application window and its objects. |
| Completion Criteria: | Each window is successfully verified to remain consistent with the benchmark version or within the acceptable standard. |
| Special Considerations: | Not all properties of custom and third-party objects can be accessed. |
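For illustration only, the sketch below uses Selenium as a modern stand-in for a GUI test tool; the URL, the field names, and the disabled-until-filled rule are all assumptions, not the application's documented behavior:

```python
# UI object-state sketch (requires Selenium and a running browser driver).
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Firefox()
try:
    driver.get("http://test-server/subscribe")          # placeholder URL
    email = driver.find_element(By.NAME, "email")       # assumed field name
    assert email.is_displayed() and email.is_enabled()  # object state check
    submit = driver.find_element(By.NAME, "submit")
    assert not submit.is_enabled()   # assumed rule: disabled until form filled
    email.send_keys("pat@example.edu")
finally:
    driver.quit()
```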
| Test Objective: | Verify performance behaviors for designated transactions or business functions under the following conditions: |
| Technique: | Use the test procedures developed for Function or Business Cycle Testing. Modify the data files (to increase the number of transactions) or the scripts (to increase the number of times each transaction occurs). Scripts should be run on one machine (best case to benchmark a single user and single transaction) and repeated with multiple clients (virtual or actual; see Special Considerations below). |
| Completion Criteria: | Single transaction / single user: successful completion of the test scripts without any failures and within the expected / required time allocation per transaction. Multiple transactions / multiple users: successful completion of the test scripts without any failures and within an acceptable time allocation. |
| Special Considerations: | Comprehensive performance testing includes having a "background" workload on the server; there are several methods that can be used to produce this. Performance testing should be performed on a dedicated machine or at a dedicated time, which permits full control and accurate measurement. The databases used for performance testing should be either actual size or scaled equally. |
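A sketch of the single-user / single-transaction baseline measurement follows; the transaction body and the 2-second budget are placeholders to be replaced with a recorded Function-test step and the specified time allocation:

```python
# Benchmark sketch for the single-user / single-transaction baseline:
# time one transaction repeatedly and compare against the target budget.
import time

BUDGET_S = 2.0        # assumed per-transaction budget; set from the spec
ITERATIONS = 100

def do_transaction():
    """Placeholder for a recorded Function-test transaction."""
    time.sleep(0.005)  # replace with the real transaction

elapsed = []
for _ in range(ITERATIONS):
    t0 = time.perf_counter()
    do_transaction()
    elapsed.append(time.perf_counter() - t0)

worst = max(elapsed)
mean = sum(elapsed) / len(elapsed)
print(f"mean={mean:.4f}s worst={worst:.4f}s budget={BUDGET_S}s")
assert worst <= BUDGET_S, "transaction exceeded its time budget"
```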
| Test Objective: | Verify performance behavior for designated transactions or business functions under varying workload conditions. |
| Technique: | Use the tests developed for Function or Business Cycle Testing. Modify the data files (to increase the number of transactions) or the tests (to increase the number of times each transaction occurs). |
| Completion Criteria: | Multiple transactions / multiple users: successful completion of the tests without any failures and within an acceptable time allocation. |
| Special Considerations: | Load testing should be performed on a dedicated machine or at a dedicated time, which permits full control and accurate measurement. The databases used for load testing should be either actual size or scaled equally. |
| Test Objective: | Verify that the target-of-test functions successfully under the following high-volume scenarios: |
| Technique: | Use the tests developed for Performance Profiling or Load Testing. Multiple clients should be used, running either the same tests or complementary tests, to produce the worst-case transaction volume / mix (see the stress test above) for an extended period. The maximum database size is created (actual, scaled, or filled with representative data), and multiple clients are used to run queries / report transactions simultaneously for extended periods. |
| Completion Criteria: | All planned tests have been executed, and the specified system limits are reached or exceeded without the software or hardware failing. |
| Special Considerations: | What period of time would be considered acceptable for sustained high-volume conditions (as noted above)? |
| Test Objective: | Application-level security: verify that an actor can access only those functions / data for which their user type has permissions. System-level security: verify that only those actors with access to the system and application(s) are permitted to access them. |
| Technique: | Application-level: identify and list each actor type and the functions / data for which each type has permissions. Create tests for each actor type, and verify each permission by creating transactions specific to each actor. Modify the user type and re-run the tests for the same users; in each case, verify that the additional functions / data are correctly available or denied. System-level access: see Special Considerations below. |
| Completion Criteria: | For each known actor type, the appropriate functions / data are available, and all transactions function as expected and as run in prior Function tests. |
| Special Considerations: | Access to the system must be reviewed / discussed with the appropriate network or systems administrator. This testing may not be required, as it may be a function of network or systems administration. |
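The application-level technique can be expressed as a permission matrix driven in a loop, as sketched below; the actor types and function names shown are illustrative, and is_allowed stands in for the application's real authorization check:

```python
# Permission-matrix sketch: for each actor type, verify that exactly the
# permitted functions are available. The matrix below is illustrative;
# the real one comes from the use-case reports.
PERMISSIONS = {
    "subscriber": {"view_pages", "edit_profile"},
    "editor":     {"view_pages", "approve_content"},
    "advertiser": {"view_own_reports"},
}
ALL_FUNCTIONS = set().union(*PERMISSIONS.values())

def is_allowed(actor_type, function):
    """Stand-in for the application's real authorization check."""
    return function in PERMISSIONS.get(actor_type, set())

for actor, allowed in PERMISSIONS.items():
    for fn in ALL_FUNCTIONS:
        expected = fn in allowed
        assert is_allowed(actor, fn) == expected, (actor, fn)
print("matrix verified; re-run after modifying a user's type")
```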
The following tools will be employed for this project:
| Tool | Product | Version |
| Defect Tracking | Project HomePage | |
| Project Management | Microsoft Project | |

This section presents the recommended resources for the Collegiate Sports Paging System test effort, their main responsibilities, and their knowledge or skill set.
This table shows the staffing assumptions for the project.
| Human Resources | | |
| Worker | Minimum Resources Recommended | Specific Responsibilities / Comments |
| Test Manager / Test Project Manager | 1 (Collegiate Sports Paging System project manager) | Provides management oversight. Responsibilities: |
| Test Designer | 1 | Identifies, prioritizes, and implements test cases. Responsibilities: |
| Tester | 4 (provided by WebNewsOnLine) | Executes the tests. Responsibilities: |
| Test System Administrator | 1 | Ensures the test environment and assets are managed and maintained. Responsibilities: |
| Database Administration / Database Manager | 1 (provided by WebNewsOnLine) | Ensures the test data (database) environment and assets are managed and maintained. Responsibilities: |
| Designer | 2 | Identifies and defines the operations, attributes, and associations of the test classes. Responsibilities: |
| Implementer | 4 | Implements and unit-tests the test classes and test packages. Responsibilities: |
The following table sets forth the system resources for the testing project.
The specific elements of the test system are not fully known at this time. It is recommended that the system simulate the production environment, scaling down the accesses and database sizes if / where appropriate.
| System Resources | |
| Resource | Name / Type | 
| Database Server | |
| —Network/Subnet | TBD | 
| —Server Name | TBD | 
| —Database Name | TBD | 
| Client Test PCs | |
| —Include special configuration requirements | TBD |
| Test Repository | |
| —Network/Subnet | TBD | 
| —Server Name | TBD | 
| Test Development PCs | TBD |

| Milestone Task | Effort | Start Date | End Date |
| Plan Test | | | |
| Design Test | | | |
| Implement Test | | | |
| Execute Test | | | |
| Evaluate Test | | | |

For each test executed, a test result form will be created. This shall include the name or ID of the test, the use case or supplemental specification to which the test relates, the date of the test, the ID of the tester, required pre-test conditions, and results of the test.
Microsoft Word will be used to record and report test results.
Defects will be recorded using the Project HomePage via the Web.
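For illustration, the result fields listed above map naturally onto a structured record; the sketch below (with hypothetical field names) shows one such shape, even though results will in practice be captured in the Word form:

```python
# Sketch of a structured test-result record carrying the fields listed
# above (test ID, traceability, date, tester, preconditions, results).
from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class TestResult:
    test_id: str                 # name or ID of the test
    traces_to: str               # use case or supplemental spec reference
    run_date: date
    tester_id: str
    preconditions: List[str] = field(default_factory=list)
    outcome: str = ""            # pass / fail plus observations

r = TestResult("FT-012", "UC-03 Subscribe to Category", date.today(),
               "tester4", ["test DB loaded", "pager gateway stub up"],
               "pass")
print(r)
```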

Below are the test-related tasks:
| Plan Test | 
| Identify Requirements for Test | 
| Assess Risk | 
| Develop Test Strategy | 
| Identify Test Resources | 
| Create Schedule | 
| Generate Test Plan | 
| Design Test | 
| Workload Analysis | 
| Identify and Describe Test Cases | 
| Identify and Structure Test Procedures | 
| Review and Assess Test Coverage | 
| Implement Test | 
| Record or Program Test Scripts | 
| Identify Test-Specific functionality in the design and implementation model | 
| Establish External Data sets | 
| Execute Test | 
| Execute Test Procedures | 
| Evaluate Execution of Test | 
| Recover from Halted Test | 
| Verify the results | 
| Investigate Unexpected Results | 
| Log Defects | 
| Evaluate Test | 
| Evaluate Test-Case Coverage | 
| Evaluate Code Coverage | 
| Analyze Defects | 
| Determine if Test Completion Criteria and Success Criteria have been achieved |