2.5 Assignments Comparison Test
In December 2007, the University of Michigan performance testing team conducted a series of tests on the 2.4.x and 2.5 versions of the Assignments tool. The tests were designed to provide a point of comparison between the two versions. The 2.4 test version was based on Sakai 2.4.x v37145.
The following three scripts were included in the Assignments-specific testing:
- Instructors creating assignments
- Students submitting completed assignments
- Instructors grading assignments and submitting grades
Both builds were subjected to three types of tests:
- The University of Michigan standard load test scenario. This runs roughly 20 scripted transactions spanning many Sakai tools. The total load generated by this test is projected to be 125% of our peak activity.
- The portion of the previous load test related only to the Assignments tool.
- A stress test to find the limits of the Assignments tool, using the three scripts listed above.
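The stress test drives scripted transactions from concurrent virtual users as fast as possible and records per-transaction response times. The following Python sketch illustrates that pattern; the transaction body, user counts, and the reported metrics here are illustrative stand-ins, not the actual University of Michigan test scripts.

```python
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

def stress_test(transaction, virtual_users=10, iterations=20):
    """Run `transaction` repeatedly from concurrent virtual users,
    as fast as possible, and collect per-call response times."""
    def worker(_):
        times = []
        for _ in range(iterations):
            start = time.perf_counter()
            transaction()  # e.g. an HTTP GET that opens the Assignments tool
            times.append(time.perf_counter() - start)
        return times

    with ThreadPoolExecutor(max_workers=virtual_users) as pool:
        per_user = list(pool.map(worker, range(virtual_users)))

    samples = [t for times in per_user for t in times]
    return {
        "transactions": len(samples),
        "mean_s": statistics.mean(samples),
        "max_s": max(samples),
        # Approximate: total busy time / users estimates wall-clock time.
        "throughput_tps": len(samples) * virtual_users / sum(samples),
    }

# Stand-in transaction; a real run would fetch the tool URL instead,
# e.g. urllib.request.urlopen(ASSIGNMENTS_URL) (hypothetical constant).
stats = stress_test(lambda: time.sleep(0.01), virtual_users=4, iterations=5)
print(stats["transactions"])
```

Because each virtual user loops with no think time, any drop in per-transaction response time shows up directly as higher total throughput, which is why the response-time and throughput results below move together.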
Attached are the results from test #3:
Observed response times showed a significant improvement over the previous version of Assignments. During the stress test of the previous version, time to open the Assignments tool averaged between 1.5 and 9 seconds; the same transaction takes between 1 and 1.5 seconds with the 2.5 build. Since the virtual users execute as fast as possible, this translated into a 60% jump in total throughput over the previous version. CPU utilization on the DB server is also up about 10% versus the previous test, but this is well within expected values given the higher transaction rate. 'DB user transactions' have been cut by 50%, even at the higher transaction rate.
Attached are the graphs which display time to open the Assignments tool during the stress test for 2.4 and 2.5. Note: the three lines indicate time to open Assignments for the three different test scripts.
The following graphs show a 50% jump in throughput during the stress test for 2.4 and 2.5.