The purpose of this benchmark is to capture the latency for the most commonly used areas of the Jira Align SaaS application. This is NOT a load or scalability test. It is built for tracking performance improvement trends over time.
Metrics are gathered on a release basis to detect changes made during each sprint. These changes could directly or indirectly affect performance.
The performance tests were designed using the following usage data from our production SaaS environments.
NOTE! This distribution covers selected areas, not all application pages, and represents all customer test and production instances.
| Page | Usage Distribution | Totals on area |
|------|--------------------|----------------|
| Story - Grid | 9% | 20% |
| Story - Form | 8% | |
| Story - Create | 3% | |
| Backlog | 16% | 16% |
| Epic - Grid | 7% | 16% |
| Epic - Form | 6% | |
| Epic - Create | 3% | |
| Features - Grid | 7% | 16% |
| Features - Form | 6% | |
| Features - Create | 3% | |
| Capabilities - Grid | 4% | 9% |
| Capabilities - Form | 3% | |
| Capability - Create | 2% | |
| Team Room - Kanban Board | 5% | 8% |
| Team Room - List View | 3% | |
| Objectives - Grid | 2% | 4% |
| Objectives - Form | 1% | |
| Objectives - Create | 1% | |
| Program Board | 4% | 4% |
| Program Room | 3% | 3% |
| Teams - Grid | 1% | 2% |
| Teams - Form | 1% | |
| Teams - Create | 0% | |
| Work Tree | 1% | 1% |
| Portfolio Room | 1% | 1% |
These actions are taken from our site usage analysis tool (Pendo) and correspond to the performance (latency) tests that have been created. To get the number of visitors per page, we took actual page-load counts for all users in the SaaS environment over the last 90 days.
Usage counts for work object access and creation are partly estimated. We do not have statistics for every work item action: we know how many times grids and forms were opened and how often new work objects were created, but not both the form-open and create counts for every work item type.
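The post does not describe how these percentages feed the tests; as a minimal sketch (assuming a simple proportional split and an illustrative pool of 100 virtual users), the distribution could translate into a virtual-user mix like this:

```python
# Sketch: split a virtual-user pool across pages in proportion to the
# usage distribution above. The 100-user pool size is an illustrative
# assumption, not a documented test parameter.
USAGE_DISTRIBUTION = {
    "Story - Grid": 9, "Story - Form": 8, "Story - Create": 3,
    "Backlog": 16,
    "Epic - Grid": 7, "Epic - Form": 6, "Epic - Create": 3,
    "Features - Grid": 7, "Features - Form": 6, "Features - Create": 3,
    # ...remaining pages from the table above...
}

def allocate_users(total_users: int) -> dict:
    """Return how many virtual users to assign to each page."""
    return {page: round(total_users * pct / 100)
            for page, pct in USAGE_DISTRIBUTION.items()}

print(allocate_users(100))  # {'Story - Grid': 9, 'Story - Form': 8, ...}
```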
Virtual User Wait Time Profiles
Instead of performing actions quickly, one right after another, we load pages as a real user would. The user profiles, each with a time range between actions, break down as follows:
| Timing | Simulated Action | % of Users tied to this Timing |
|--------|------------------|--------------------------------|
| 3-7 seconds | Power user getting something done quickly | 8% |
| 8-15 seconds | Average user browsing the application | 80% |
| 16-30 seconds | Distracted user | 10% |
| 31-180 seconds | Interrupted user | 2% |
Once the user profile is selected, the wait time between each action varies within that profile's range. For example, a virtual user assigned the 8-15 second profile pauses for a random interval of 8 to 15 seconds before each action.
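As an illustration only (the post does not name the load tool), here is how these profiles could be expressed in Locust, a Python load-testing framework: each class weight mirrors the "% of Users" column, and `between(a, b)` draws a new random wait in that range before every action. The host and page path are hypothetical placeholders.

```python
from locust import HttpUser, between, task

class JiraAlignUser(HttpUser):
    """Shared base class; host and page are illustrative placeholders."""
    abstract = True
    host = "https://example.jiraalign.com"  # hypothetical instance URL

    @task
    def open_backlog(self):
        self.client.get("/backlog")

class PowerUser(JiraAlignUser):
    weight = 8                   # 8% of virtual users
    wait_time = between(3, 7)    # quick, purposeful actions

class AverageUser(JiraAlignUser):
    weight = 80                  # 80% of virtual users
    wait_time = between(8, 15)   # typical browsing pace

class DistractedUser(JiraAlignUser):
    weight = 10
    wait_time = between(16, 30)

class InterruptedUser(JiraAlignUser):
    weight = 2
    wait_time = between(31, 180)
```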
In our tests, we allocate virtual users to perform actions within the application. A virtual user imitates a live user's workflow through the application, following the user actions and wait-time profiles above to perform the workflows outlined below.
By running these tests, we expect to see dependencies between different areas and the impact of users' concurrent actions.
Test Scenarios
General grid tests:
General form tests:
General create tests:
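The individual scenario steps are not listed here; purely as an illustration of the grid/form/create pattern (endpoints, stats names, and task weights are hypothetical, in the same Locust style as above):

```python
from locust import HttpUser, between, task

class StoryWorkflowUser(HttpUser):
    host = "https://example.jiraalign.com"  # hypothetical instance URL
    wait_time = between(8, 15)              # "average user" profile

    @task(3)
    def grid_test(self):
        # General grid test: load the grid and record its latency.
        self.client.get("/stories/grid", name="Story - Grid")

    @task(2)
    def form_test(self):
        # General form test: open an existing work item's form.
        self.client.get("/stories/1/form", name="Story - Form")

    @task(1)
    def create_test(self):
        # General create test: submit a new work item.
        self.client.post("/stories", json={"title": "benchmark story"},
                         name="Story - Create")
```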
Page Load Performance results and trends will be published separately on a release basis with details about the data profile and test environment.
Iryna Dziadyk
Atlassian Team