Performance test engineers do their job well, but on a long-term project it is just as important to accomplish tasks with optimum effort. "Who cares about metrics? My job is scripting, execution and result analysis" is the thought many performance test engineers have in mind (including me). But is this justifiable? Definitely not. There is nothing wrong with taking on additional responsibilities and assisting team leads or managers with effort logging, preparing metrics for client presentations, and so on. The main issue is that only a few testers are really aware of the fundamentals of metrics.
Hence, in this blog post, I am going to list the performance testing metrics, aligned with each phase of the Performance Testing Life Cycle (PTLC), that need to be captured and analyzed during the project tenure.
What is the difference between Metrics and Measurement?
A measurement is a raw value that quantifies an attribute in terms of size, quantity, count, dimension, etc., e.g. the number of defects found in the system.
A metric is derived from one or more measurements, e.g. the number of defects found in the system per person per hour.
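To make the distinction concrete, here is a minimal sketch (with illustrative numbers, not data from any real project) that derives a metric from raw measurements:

```python
# Raw measurements: plain counts captured during testing
defects_found = 24   # number of defects found in the system
testers = 3          # people on the performance test team
hours_spent = 8      # hours spent testing

# Metric: derived from the measurements above
defects_per_person_per_hour = defects_found / (testers * hours_spent)
print(defects_per_person_per_hour)  # 1.0
```

The measurements on their own say little; dividing by team size and time turns them into a metric that can be compared across cycles or projects.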
Non-Functional Requirements Elicitation and Analysis
- Effort Spent on Non-Functional Requirements Analysis
- Review effort on Non-Functional Requirements Analysis
- Non-Functional Requirements document defect count
Performance Test Strategy
- Performance Test Strategy preparation effort
- Performance Test Strategy review effort
- Performance Test Strategy defect count
- Test Scripts count
- Performance parameters list
Performance Test Design
- Test Scripting effort
- Test Script Rework effort
- Test Script Review effort
- Test Scripts per day per person
- Test Data Set up effort
- Test Environment set up effort
- Test Script Defects Count
Performance Test Execution
- Test Script Execution Effort
- Test Script defects count during execution
- Test Script execution cycles count
Performance Test Result Analysis
- Number of Graphs and Charts generated
- Analysis and Tuning effort
- SLA percentage change log
- SLA met log
- Number of Cycles completed
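As an illustration of the "SLA met log" metric above, here is a minimal sketch (hypothetical response times, not output from any specific tool) that computes what percentage of sampled transactions met the agreed SLA:

```python
# Hypothetical response times (in seconds) recorded for one transaction
response_times = [1.8, 2.1, 1.5, 2.9, 1.7, 2.4, 1.9, 2.0]
sla_seconds = 2.0  # agreed SLA threshold for this transaction

# Count samples at or under the SLA, then express as a percentage
met = sum(1 for t in response_times if t <= sla_seconds)
sla_met_percentage = 100.0 * met / len(response_times)
print(f"SLA met: {sla_met_percentage:.1f}%")  # SLA met: 62.5%
```

Logging this figure per execution cycle makes the SLA percentage change log straightforward: it is simply the difference between consecutive cycles' values.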
Benchmarks and Recommendations
- Results / Benchmarks and Recommendations document preparation effort
- Results / Dashboard document review effort