Performance Testing

Performance Testing Terminologies



Business goals are the dominant driving force behind developing and managing application software. To a layman, two applications may look similar, yet they can differ greatly in their performance statistics. In the tech industry, performance testing is essential for keeping track of software performance, and it forms the basis of many service level agreements between development vendors and clients. It is a highly critical process that combines rigorous performance testing with load testing and stress testing to yield detailed insights into performance and scalability.

Performance testing evaluates a mobile network or an application against a rubric of metrics that measure time and efficiency. The software is run in an environment resembling actual deployment to confirm that the required performance metrics are achieved.

Here are the key terminologies you will need while executing performance testing.

Connect time

Connect time is the overall time taken to establish a TCP connection between the client and the server. An application built on a client-server architecture must set up this connection before the client can request operations or retrieve data.

No data can pass until the TCP handshake succeeds; all further requests and responses between the client and the server are exchanged only after this critical handshake completes. Once the connection is established, data is transferred over TCP/IP as small packets routed across the network, driven by the client's requests.
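As a rough sketch of how connect time can be measured (using Python's standard socket library and a throwaway local listener as a stand-in target, both our own choices rather than anything prescribed in this post):

```python
import socket
import time

# Stand-in server: a local listening socket (hypothetical test target).
server = socket.socket()
server.bind(("127.0.0.1", 0))   # port 0: let the OS pick a free port
server.listen(1)
addr = server.getsockname()

# Connect time: wall-clock duration of establishing the TCP connection.
start = time.perf_counter()
client = socket.create_connection(addr)
connect_time = time.perf_counter() - start

client.close()
server.close()
print(f"connect time: {connect_time * 1000:.3f} ms")
```

Against a real deployment you would connect to the server's actual host and port; a loopback connection is used here only so the sketch is self-contained.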

Latency Time

Latency time is the duration of a round trip: the time for a request to reach the target plus the time for the response to start coming back. Latency is one of the major factors that makes users perceive a particular application as slow or lacking.

Latency generally arises because data packets hop across several interlinked nodes on the route from the data center to the client fetching them. It is measured from the first moment a request is sent until the first byte of the response is received at the user's end. Measured this way, latency time is cumulative: it includes the connect time.
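The "first byte" definition above can be sketched as follows (again assuming Python sockets and a local stand-in server whose 50 ms delay simulates processing; these are illustrative choices, not part of the post):

```python
import socket
import threading
import time

# Stand-in server that waits 50 ms before answering (simulated processing).
srv = socket.socket()
srv.bind(("127.0.0.1", 0))
srv.listen(1)
addr = srv.getsockname()

def serve():
    conn, _ = srv.accept()
    conn.recv(1024)            # read the request
    time.sleep(0.05)           # simulated server-side work
    conn.sendall(b"response")
    conn.close()

threading.Thread(target=serve, daemon=True).start()

t0 = time.perf_counter()
c = socket.create_connection(addr)   # the handshake is part of the clock
c.sendall(b"request")
c.recv(1)                            # block until the FIRST response byte
latency = time.perf_counter() - t0   # latency includes connect time
c.close()
srv.close()
print(f"latency: {latency * 1000:.1f} ms")
```

Because the clock starts before the handshake and stops at the first response byte, the measured latency necessarily exceeds both the connect time and the simulated 50 ms of server work.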

Elapsed Time

Elapsed time is an overall measure of how fast or slow an application responds. It is measured from the moment the first byte of the request is sent until the last byte of the response is received at the client's machine. Server processing time can be calculated as the latency time minus the connect time taken by the TCP handshake, and the download time for any piece of data is the elapsed time minus the overall latency.
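The relationships between these measures can be checked with a small worked example (the figures are purely illustrative, not measurements):

```python
# All figures in milliseconds (hypothetical values for illustration).
connect_time = 20   # TCP handshake
latency = 100       # first request byte sent -> first response byte received
elapsed = 180       # first request byte sent -> last response byte received

# Per the definitions in this post:
server_processing = latency - connect_time   # time spent producing a reply
download_time = elapsed - latency            # time spent receiving the body

print(server_processing, download_time)
```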


Throughput is the efficiency measure of any application software. It is computed as the number of work requests that are handled by the software in unit time. It is an essential measure while carrying out a performance test on the application software under study. Generally, it comes in a certain number of a request being served within a second or for example number of calls in a KPO/BPO.
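The computation itself is simple: count completed requests over a fixed measurement window and divide. A minimal sketch, with a trivial placeholder standing in for real request handling:

```python
import time

def handle_request():
    sum(range(1000))   # placeholder for real request-serving work

completed = 0
start = time.perf_counter()
while time.perf_counter() - start < 0.2:   # 200 ms measurement window
    handle_request()
    completed += 1

throughput = completed / (time.perf_counter() - start)   # requests per second
print(f"throughput: {throughput:.0f} req/s")
```

In a real test the requests would be issued against the system under load, and the window would be long enough to smooth out warm-up effects.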

Performance thresholds

Performance thresholds are the permissible values of all the physical quantities measured in performance testing. These include the primary time measures discussed above as well as several others such as response time, throughput, resource utilization, disk I/O, and network I/O.
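A pass/fail check against such thresholds reduces to comparing each measured value with its limit. A minimal sketch, with entirely hypothetical metric names and numbers:

```python
# Hypothetical thresholds and measured values from one test run.
thresholds = {"response_time_ms": 500, "error_rate": 0.01, "cpu_utilization": 0.85}
measured   = {"response_time_ms": 320, "error_rate": 0.002, "cpu_utilization": 0.60}

# Any metric exceeding its threshold is a violation.
violations = {name: value
              for name, value in measured.items()
              if value > thresholds[name]}

passed = not violations
print("PASS" if passed else f"FAIL: {violations}")
```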

Saturation Limit

Saturation is the point during performance testing at which the server becomes incapable of handling any further requests per unit time. It is the performance peak the software is expected to reach and is considered a state of fully utilized resources; pushing load beyond it degrades response times rather than increasing throughput.
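In practice the saturation point is often located by stepping up load and watching throughput flatten. A sketch of that idea, using invented load-test numbers and an arbitrary 5% flattening rule (both assumptions for illustration):

```python
# Hypothetical results: (concurrent users, observed throughput in req/s).
curve = [(10, 95), (20, 190), (40, 360), (80, 410), (160, 415)]

# Treat the first load level whose throughput gain over the previous
# level falls below 5% as the saturation point.
saturation_users = None
for (u1, t1), (u2, t2) in zip(curve, curve[1:]):
    if (t2 - t1) / t1 < 0.05:
        saturation_users = u2
        break

print(f"saturation around {saturation_users} concurrent users")
```

With these numbers, throughput roughly doubles up to 40 users, gains only ~14% at 80, and barely ~1% at 160, so the rule flags 160 users as the saturation region.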

In Conclusion

Performance testing is highly critical and should be given due importance for all software. A complete grasp of the physical quantities and terminologies involved is essential for executing it well. It is also critical to keep in mind that extreme values of one metric, or a few, do not necessarily mean the software has achieved maximum resource utilization. Consistently optimized performance figures are necessary for quality performance on a long-term basis, which only a top-class software testing company can ensure – connect with us for this reason, and many more.

