Basics of Functional Testing

Requirement Based Functional Testing

Function (Integration) test is usually the first test phase that a test organization is responsible for during any given release. Requirements-based Function Test is one approach to Function (Integration) test - it is a powerful and effective approach that significantly reduces the number of undetected defects (faults) released into production. The premise is that a well-formulated set of functional requirements gives the Test Designers (see "Testing and The Role of a Test Designer / Tester") a definitive basis for test case design.

What is Function Test?
"The objective of function test is to measure the quality of the functional (business) components of the system". Tests verify that the system behaves correctly from the user / business perspective and functions according to the requirements, models, storyboards, or any other design paradigm used to specify the application. The function test must determine if each component or business event: performs in accordance to the specifications, responds correctly to all conditions that may be presented by incoming events / data, moves data correctly from one business event to the next (including data stores), and that business events are initiated in the order required to meet the business objectives of the system.

What is a Requirement?
A requirement is a capability or function that must be delivered by a system component or components. A functional requirement is a specific business need or behavior as seen by an external user of the system.


Test Cycle for Requirements based Function Test

An effective test cycle must have a defined set of processes and deliverables. The primary processes / deliverables for requirements-based Function test are: Test Planning, Partitioning / Functional Decomposition, Requirements Definition / Verification, Test Case Design, Traceability (Traceability Matrix), Test Case Execution, Defect Management, and Coverage Analysis. Which processes and deliverables apply to any given testing situation depends on the available resources (people, source materials, time, etc.) and the mandate of the test organization.

Test Planning
During planning, the Test Lead, with assistance from the test team, defines the scope, schedule, and deliverables for the function test cycle. The Test Lead delivers a Test Plan (document) and a Test Schedule (work plan) - these often undergo several revisions during the testing cycle.

Partitioning - Functional Decomposition
Functional decomposition of a system (or partitioning) is the breakdown of a system into its functional components or functional areas. Another group in the organization may take responsibility for the functional decomposition (or model) of the system but the testing organization should still review this deliverable for completeness before accepting it into the test organization. If the functional decomposition or partitions have not been defined or are deemed insufficient then the testing organization will have to take responsibility for creating and maintaining the partitions. There are several commercial, shareware, and freeware products available that aid in the functional decomposition of a system and the formal delivery of the functional partitions.
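As a rough illustration, a partition model can be kept in a simple tree structure and flattened for reporting; the sketch below uses a hypothetical system and hypothetical partition names, not output from any particular tool.

```python
# A minimal sketch of a functional decomposition (partition tree).
# The system and partition names are hypothetical.
partitions = {
    "Order Management": {
        "Login": {},
        "Create Order": {
            "Add Line Item": {},
            "Apply Discount": {},
        },
        "Order Inquiry": {},
    },
}

def list_partitions(tree, prefix=""):
    """Flatten the partition tree into dotted partition paths."""
    for name, children in tree.items():
        path = prefix + name
        yield path
        yield from list_partitions(children, path + ".")

for p in list_partitions(partitions):
    print(p)   # e.g. "Order Management.Create Order.Apply Discount"
```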

Requirements Definition / Verification
Requirements definition is often the weakest deliverable in the software development process. Many development shops go directly from software concept to functional specification or, worse, from software concept to code without any preliminary software design deliverables. The Testing Organization needs these requirements to proceed with Function test, so if the development team is not going to deliver the requirements for verification by the testing team, then the Test Team must create its own set of testable requirements. These requirements need to be itemized under the appropriate functional partition.

Test Case Design
The Test Designer / Tester designs and implements test cases to validate that the product performs in accordance with the requirements (see "Testing and The Role of a Test Designer / Tester"). These Test Cases need to be itemized under the appropriate functional partition and mapped / traced to the requirements being tested.

Traceability (Traceability Matrix)
Test Cases need to be traced / mapped back to the appropriate requirement. Once all aspects of a requirement have been tested by one or more test cases, the test design activity for that requirement can be considered complete. A common misconception made during this process is that all test cases that exercise a particular requirement should be mapped to that requirement - only those test cases that are specifically created to test a requirement should be traced to that requirement. This approach gives a much more accurate picture of the application when coverage analysis is done - failure of a test case does not mean failure of all the requirements exercised by (as opposed to tested by) the test case.
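As a minimal sketch (the requirement and test case IDs are hypothetical), such a matrix can be kept as a mapping from each requirement to only those test cases designed specifically for it:

```python
# Traceability matrix sketch: requirement ID -> test cases designed for it.
# Only purpose-built test cases are mapped; all IDs are hypothetical.
traceability = {
    "REQ-LOGIN-01": ["TC-001", "TC-002"],
    "REQ-LOGIN-02": ["TC-003"],
    "REQ-LOGIN-03": [],   # test design for this requirement is not yet complete
}

def untraced_requirements(matrix):
    """Requirements with no purpose-built test case mapped to them."""
    return [req for req, cases in matrix.items() if not cases]

print(untraced_requirements(traceability))   # ['REQ-LOGIN-03']
```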

Test Case Execution
As in all phases of testing, the appropriate set of test cases needs to be executed and the results of those test cases recorded. Which test cases are to be executed should be defined within the context of the Test Plan and the current state of the application being tested. If the current state of the application does not support the testing of one or more requirements, then that testing should be deferred until it does - testing before that point does not justify the expenditure of testing resources.

Defect Management
As in all phases of testing, any defects (see "Testing and The Role of a Test Designer / Tester") detected during test execution need to be both recorded and managed by the testing organization. During Function test, each defect should be traced to the specific requirement or requirements that are not performing to specification.
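For illustration only (the field names and IDs are assumptions, not a prescribed defect-tracking schema), a defect record can carry the IDs of the requirements it is traced to:

```python
from dataclasses import dataclass, field

# A minimal sketch of a defect record traced to the requirement(s) that are
# not performing to specification; field names and IDs are hypothetical.
@dataclass
class Defect:
    defect_id: str
    summary: str
    severity: str
    requirement_ids: list = field(default_factory=list)
    status: str = "Open"

defect = Defect(
    defect_id="DEF-042",
    summary="Submit button enabled with a blank password",
    severity="High",
    requirement_ids=["REQ-LOGIN-02"],
)
print(defect)
```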

Coverage Analysis
During Function test, a periodic progress report should be delivered by the test organization to the project team. The basis for this report is a coverage analysis of the requirements against test cases and outstanding defects. The objective of this analysis is to determine the percentage of the requirements that are: deemed to be untested, performing to specification (executed successfully), and not performing to specification (defects).
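A minimal sketch of that calculation follows, assuming a traceability matrix and a set of recorded execution results like those sketched earlier; all IDs and outcomes are hypothetical.

```python
# Coverage analysis sketch: classify each requirement as untested,
# performing to specification, or not performing to specification.
traceability = {            # requirement ID -> test cases designed for it
    "REQ-01": ["TC-001", "TC-002"],
    "REQ-02": ["TC-003"],
    "REQ-03": [],           # no test case designed yet
}
results = {"TC-001": "pass", "TC-002": "pass", "TC-003": "fail"}

untested = to_spec = not_to_spec = 0
for req, cases in traceability.items():
    outcomes = [results.get(tc) for tc in cases]
    if not cases or None in outcomes:
        untested += 1       # no test case designed, or not yet executed
    elif "fail" in outcomes:
        not_to_spec += 1    # at least one failing test case (defect)
    else:
        to_spec += 1        # all purpose-built test cases passed

total = len(traceability)
print(f"Untested: {untested / total:.0%}, "
      f"To spec: {to_spec / total:.0%}, "
      f"Not to spec: {not_to_spec / total:.0%}")
```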

There are several commercial, shareware, and freeware products available that can be used to expedite the creation of all these deliverables while streamlining the testing process.


Managing Function Test

Function (Integration) testing can be an overwhelming task for an inexperienced testing organization. To assure success at the test organization and project level, the scope of the testing effort needs to be rigorously defined and followed. The definition of the scope needs to be understood by the test organization and the project team - if the scope of the testing effort needs to be redefined, then this must be communicated. A realistic work plan with clear deliverables and dependencies needs to be drafted and updated whenever any event occurs that impacts the work plan in a positive or negative fashion. The key to success is to manage the expectations of the testing team and the project team while clearly communicating the current status of the testing effort on an ongoing basis.

1. To identify positive or negative test cases, you first need to know the scenarios.
2. To know the negative scenarios, you need to know the validations on the various fields.
3. Depending on the validations, choose a test design technique to decide on the data that you are going to use while testing.
4. Form combinations of this data.
5. Rank the combinations.
6. Select the combinations for testing.
7. Write the test cases.
 Example: Login Screen.
1. It has two fields: ID & Password.
2. The ID has to be a valid employee ID (numeric) and the password has to match the one stored for that ID.
3. The user can click on Cancel to discontinue. Upon clicking Cancel, the application closes. (In the case of a web application, either the browser or the tab closes.)
4. Upon clicking the Forgot Password link, another form is displayed.
5. Upon successful login, the user gains access to the system.
6. Upon 3 consecutive unsuccessful attempts, the user gets blocked, an appropriate message is displayed, and upon clicking on that message the application closes.
7. The Submit button gets enabled only after both the ID & Password are entered.
Now, what are Positive and Negative scenarios?

Positive Scenarios:
1. ID & Password matches and user clicks on SUBMIT button.
2. Clicking on Cancel Button.
3. Clicking on Forgot Password.

Negative Scenarios:
1. ID is wrong, Password is correct.
2. ID is correct, Password is wrong.
3. ID is blank, Password is correct. (In this case SUBMIT Button should not be enabled)
4. ID is correct, Password is blank. (In this case SUBMIT Button should not be enabled)
5. ID & Password both are blank. (In this case SUBMIT Button should not be enabled)
6. ID & Password both are wrong.

Implied Scenario (Default):
1. When the Login screen is displayed, the cursor should be blinking in the ID field. (The ID field should be focused.)
Now, if you want to write test cases, you have to select the data, for which you have to apply test design techniques. I assume that this is functional (Black Box / Dynamic) testing, so you may choose the Equivalence Partitioning / Equivalence Class technique.

Classes for ID:
1. {Valid ID: Existing valid numeric ID in the database}
2. {Invalid ID: Non-existing numeric ID in the database, or an ID containing at least one character which is not a number}
Classes for Password:
1. {Valid Password: Existing valid password in the database}
2. {Invalid Password: Non-existing password in the database}
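For illustration only, the same classes can be expressed as predicates so that any candidate test datum can be checked against its class; the lookup tables below stand in for the database and are assumptions.

```python
# Equivalence classes for ID and Password expressed as predicates.
# EXISTING_IDS / STORED_PASSWORDS stand in for the database.
EXISTING_IDS = {"1234"}
STORED_PASSWORDS = {"1234": "Password"}

def is_valid_id(emp_id: str) -> bool:
    return emp_id.isdigit() and emp_id in EXISTING_IDS

def is_invalid_id(emp_id: str) -> bool:
    return not is_valid_id(emp_id)   # non-existing, or contains a non-digit

def is_valid_password(emp_id: str, password: str) -> bool:
    return STORED_PASSWORDS.get(emp_id) == password

print(is_valid_id("1234"), is_invalid_id("12A6"))   # True True
```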
Now you have to select data to write the test cases.

Valid Data for ID:
A. 1234 (Assume that this is a valid ID which exists in the database)
Invalid Data for ID:
A. 4321 (Assume that this is an invalid ID which does not exist in the database)
B. 12A6 (ID containing a non-numeric character)
Valid data for Password:
A. Password (Assume that this is the valid password for ID 1234, stored as a pair in the database)
Invalid data for Password:
A. password (Assume that this is an invalid password for ID 1234 - note the lowercase "p")
Now, you have to find out the combinations of ID and Password:
Sr. No. | ID   | Password | Valid / Invalid (V = Valid, I = Invalid) | Combination
1       | 1234 | Password | V | Correct ID, Correct Password
2       | 1234 | password | I | Correct ID, Wrong Password
3       | 4321 | Password | I | Wrong ID, Correct Password
4       | 4321 | password | I | Wrong ID, Wrong Password
5       | 12A6 | Password | I | Wrong ID, Correct Password
6       | 12A6 | password | I | Wrong ID, Wrong Password
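The six combinations translate directly into test cases. Below is a minimal sketch using pytest's parametrize, written against the hypothetical login() function sketched in the example above (assumed here to be saved as login.py); the expected outcomes follow the stated requirements.

```python
import pytest

from login import login   # hypothetical module holding the login() sketch above

# One test case per combination from the table above.
@pytest.mark.parametrize("emp_id, password, expected", [
    ("1234", "Password", "success"),   # 1. Correct ID, Correct Password
    ("1234", "password", "failure"),   # 2. Correct ID, Wrong Password
    ("4321", "Password", "failure"),   # 3. Wrong ID, Correct Password
    ("4321", "password", "failure"),   # 4. Wrong ID, Wrong Password
    ("12A6", "Password", "failure"),   # 5. Wrong ID, Correct Password
    ("12A6", "password", "failure"),   # 6. Wrong ID, Wrong Password
])
def test_login_combinations(emp_id, password, expected):
    assert login(emp_id, password) == expected
```

The three-attempt blocking rule would need its own test case, since it depends on consecutive failed attempts rather than on a single ID / Password combination.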
