With Goliath Performance Monitor 12.1.0, End User Experience Scorecard reports are available directly from the Reports page within Goliath Performance Monitor. You can use these scorecards to obtain objective data about the overall performance of your Citrix and Horizon environments.
How EUE Scores are Calculated
The Scorecard reports include individual scores for each user, covering all of that user's sessions over the report timeframe, as well as overall scores.
Individual User Scores
The individual scores for each user are calculated on a scale based on the defined metric thresholds, which can be adjusted to suit your environment. The default values are displayed both in the report wizard and in the generated report.
Overall Scores
The overall metric scores at the top of the report are calculated by taking all user session metric scores and averaging them to generate each overall metric score, based on the defined thresholds. The single overall EUE score is the average of all individual users' EUE scores.
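The averaging described above can be sketched as follows. This is an illustration only, with hypothetical function and variable names, not GPM's actual implementation:

```python
def overall_scores(user_metric_scores, user_eue_scores):
    """Sketch: overall scores as plain averages of per-user scores (0-100).

    user_metric_scores: dict mapping metric name -> list of per-user scores
    user_eue_scores: list of per-user EUE scores
    (Names are illustrative, not from GPM itself.)
    """
    # Each overall metric score is the average of that metric's user scores.
    overall_metrics = {
        metric: sum(scores) / len(scores)
        for metric, scores in user_metric_scores.items()
    }
    # The single overall EUE score is the average of all user EUE scores.
    overall_eue = sum(user_eue_scores) / len(user_eue_scores)
    return overall_metrics, overall_eue
```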
GPM 12.1.1 Update:
User session data now disregards 0 and null values when computing individual ICA Latency, ICA RTT, Connection Speed, and Network Latency averages. If a user's average works out to 0, it is displayed as "--" in the report table. If any of a user's averages are "--," that user's EUE score is not calculated and is shown as "--" because the data is only partial. The "Overall EUE Score" is now calculated from user EUE scores compared to overall metric scores, with "--" scores excluded from the calculation.
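The 12.1.1 behavior can be sketched like this. The function names are hypothetical and the code is illustrative only, not GPM's actual implementation:

```python
def metric_average(samples):
    """Average a user's metric samples, ignoring 0 and null (None) values.

    Returns the sentinel "--" when no valid samples remain, mirroring
    how the report displays averages that cannot be computed.
    """
    valid = [s for s in samples if s not in (0, None)]
    return sum(valid) / len(valid) if valid else "--"


def eue_score_or_dash(metric_avgs, metric_scores):
    """If any of the user's metric averages is "--", skip the EUE score
    entirely (partial data); otherwise average the metric scores."""
    if "--" in metric_avgs:
        return "--"
    return sum(metric_scores) / len(metric_scores)
```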
GPM 12.1.3 Update:
Additional column and filtering options were added to the Citrix Scorecard report with 12.1.3 to allow users to further pinpoint user experience issues in their environment. You can now display and filter by Client IP, Workspace Version, and Client Device name.
Also in 12.1.3, the Citrix scorecard now includes an option to display an Issue Totals table showing the total number of users with EUE issues as well as the total number of issues found in the report, broken down by metric type. Along with this option, the existing Overall Scores table and Users table are now also optional sections, so customers can build exactly the type of report they want to see. Any metric value that falls within the Poor or Fair thresholds is counted in the Issue Totals table.
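Conceptually, the Issue Totals table counts as sketched below. The names and the `is_issue` predicate are hypothetical stand-ins for GPM's internal Poor/Fair threshold checks:

```python
from collections import Counter


def issue_totals(users, is_issue):
    """Sketch of the 12.1.3 Issue Totals table: count issues by metric type
    and the number of users with at least one issue.

    users: list of dicts like {"metrics": {metric_name: average_value}}
    is_issue(metric, value): hypothetical predicate that is True when the
        value falls in the Poor or Fair band for that metric.
    """
    by_metric = Counter()
    users_affected = 0
    for user in users:
        hits = [m for m, v in user["metrics"].items() if is_issue(m, v)]
        by_metric.update(hits)          # total issues, per metric type
        users_affected += bool(hits)    # user counted once if any issue
    return users_affected, dict(by_metric)
```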
Scorecard Reports
- Citrix XenApp End User Experience Scorecard
- Citrix XenDesktop End User Experience Scorecard
- Citrix XenApp Logon Duration Scorecard
- Citrix XenDesktop Logon Duration Scorecard
- VMware Horizon BLAST End User Experience Scorecard
Based on industry best practices and displayed on a scale of 0-100 (where 100 is best), the overall score and individual end user experience metric scores can be reviewed quickly to understand which users may be experiencing issues over the selected time period.
Color-coded categories reveal bottlenecks in your environment, enabling you to quickly understand what's causing end user experience issues.
To Create a Scorecard Report
- Within the Goliath interface, go to the Reports page and click New.
- Depending on the scorecard you would like to run, click into the Citrix or Horizon section and select the report template from the list.
- For Citrix, select from:
  - Citrix XenApp End User Experience Scorecard
  - Citrix XenDesktop End User Experience Scorecard
  - Citrix XenApp Logon Duration Scorecard
  - Citrix XenDesktop Logon Duration Scorecard
- For Horizon, select from:
  - VMware Horizon BLAST End User Experience Scorecard
- On the next step, you'll be able to select which sections of the scorecard to show and the columns to display as well as adjust the thresholds.
The Citrix report sections you can display are:
- Overall Scores and Averages: displays a table with the overall end user experience score as well as a score for each average metric. Select only this option for an executive-summary version of the report.
- Issue Totals: displays a table with the total number of issues within the report, overall and by metric. Select this option together with the User Details Table (with Users with Issues Only checked) to create a report highlighting users with end user experience problems.
- User Details Table: displays a table of all users, their end user experience scores, and their average metrics. You can also select Users with Issues Only to display only users whose metrics fall within the Poor and Fair categories. This option is useful for larger environments with many users.
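The Users with Issues Only option described above amounts to a simple filter. A minimal sketch, with hypothetical names and an `is_issue` predicate standing in for GPM's Poor/Fair band checks:

```python
def users_with_issues(users, is_issue):
    """Keep only users that have at least one metric average in the
    Poor or Fair bands.

    users: list of dicts like {"name": ..., "metrics": {metric: value}}
    is_issue(value): hypothetical predicate, True when the value falls
        in the Poor or Fair band for its metric.
    """
    return [
        user for user in users
        if any(is_issue(value) for value in user["metrics"].values())
    ]
```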
For Citrix reports, you can select to filter the reports by:
- User Name
- Active Directory Organizational Units
- Client IP
- Client Device Name
- Workspace Version
With these options you can easily compare different areas of your company to see how they are performing and where you may need to focus on addressing issues.
You also have the option to display additional columns in the report. Scroll down in the wizard dialog to see these options.
In cases where users have many IP addresses or client device names associated with their sessions, make sure the Display Multiple... option is not checked, as this can alter the appearance of the PDF output.
- Next, select the related devices that represent the group of user sessions you would like to report on. For example, for Citrix published app sessions, select all of the relevant XenApp servers. You can also check Select all Applicable Devices and Goliath will automatically select all devices related to the report type.
- At the next step you'll provide a report name and description and optionally assign a report owner.
- Next, set up any sharing options. Enter the email addresses of the users who should receive the report each time it is run. You can also select how to export the report and whether to store it in a separate location.
- Finally, select how often to run the report. Typically, users generate scorecard reports weekly or monthly and distribute them to multiple levels within the organization to provide objective data about overall performance.
Scorecard Metrics
All scores are generated on a scale of 0-100 with scores in the Excellent range for each metric earning the highest value.
The Citrix End User Experience Score categories are displayed below. Each report threshold can be adjusted at report creation time. Please note that the averages listed at the top of the scorecard are the averages of all user sessions, not an average of the per-user averages listed in the table below the scoring section.
| Metric | Description |
| --- | --- |
| ICA Round Trip Time (ICA RTT) | Displayed in milliseconds; a round trip time between 0 ms and 200 ms generates an excellent score (100). |
| ICA Latency | Displayed in milliseconds; latency between 0 ms and 100 ms generates an excellent score (100). |
| Network Latency | Displayed in milliseconds; latency between 0 ms and 100 ms generates an excellent score (100). |
| Connection Speed | Displayed in Mbps; any connection speed over 50 Mbps generates an excellent score (100). |
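As an illustration of how a metric average maps onto the 0-100 scale, consider ICA RTT. Only the default Excellent boundary (200 ms) comes from the table above; the falloff below Excellent is a hypothetical linear band for illustration, not GPM's actual scoring curve:

```python
def ica_rtt_score(rtt_ms, excellent_max=200):
    """Sketch: score an ICA RTT average on the 0-100 scale.

    The 200 ms Excellent default matches the documented threshold;
    the linear penalty beyond it is an invented example.
    """
    if rtt_ms <= excellent_max:
        return 100  # anywhere in the Excellent band earns the top score
    # Hypothetical: lose one point per 10 ms beyond the Excellent threshold.
    return max(0, 100 - (rtt_ms - excellent_max) // 10)
```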
The Citrix Logon Duration categories are below.
| Metric | Description |
| --- | --- |
| Initial Logon (seconds) | Displayed in seconds; excellent logon scores are below 31 seconds by default. |
| Reconnect (seconds) | Displayed in seconds; excellent reconnect scores fall below 11 seconds by default. |
The Horizon BLAST EUE categories are below.
| Metric | Description |
| --- | --- |
| Bandwidth Received (Mbps) | Displayed in Mbps; excellent bandwidth received scores start at 7 Mbps. |
| Bandwidth Transmitted (Mbps) | Displayed in Mbps; excellent bandwidth transmitted scores start at 7 Mbps. |
| Estimated Incoming Bandwidth (Mbps) | Displayed in Mbps; excellent estimated bandwidth scores start at 7 Mbps. |
| Network Latency (ms) | Displayed in milliseconds; excellent network latency falls between 0 and 100 ms. |
| Packet Loss (ms) | Displayed in milliseconds; excellent packet loss falls between 0 and 5 ms. |
| Round Trip Time (ms) | Displayed in milliseconds; excellent RTT falls between 0 and 200 ms. |