Introduction
You can view the results of individual Application Tests and compare up to five Application Test instances. The Application Test Results page provides a comprehensive overview of measurements and detailed events for each Test. You can also download both single and multiple Test results as PDF reports for easy sharing and analysis. See the instructions below for more details.
Single Application Test
When an Application Test is completed, you can review its results. To do so, in the Login Enterprise sidebar menu, navigate to Results > Application testing.
The Overview page will display a list of previously executed Application Tests. The columns display the following information:
| Time | Name | App. Failures | App. Perf | Comment |
|---|---|---|---|---|
| Date and time of the Test. | Name and sequence number of the Test. | The number of failed applications relative to the total number of applications tested. | The number of out-of-bound measurements compared to the configured thresholds. The number on the right shows the total number of configured measurements. | The configured comment for the Test. |
To view a summary of the Test, click the arrow button next to each Test result to expand it.
You can view the details of the events that took place during the Test. Clicking on Viewing details opens a pop-up with more information.
For Application errors, you can click to expand the details, click the download button to download the log file, or click the camera icon to open the screenshot, if applicable.
Tip: You can also retrieve Application Testing results using the Public API.
Results Page
To view the Test results for a single Application Test, select a Test and click View. The Results overview page will display the Platform summary and Application summary for the selected Test.
You can also compare Application Tests by selecting checkboxes for multiple Tests. For details, see Comparing Multiple Application Tests.
Top Table
Shows the Test summary with the information about the Test described above.
Overview Page
Platform Summary
Logon performance

| Actual | Threshold | Execution | Performance |
|---|---|---|---|
| The actual time it took to log in. | The login time threshold set in the Test. | Green if the login was successful, regardless of the time taken. | |

Latency

| Actual | Threshold | Execution | Performance |
|---|---|---|---|
| The actual latency in milliseconds (ms). | The latency threshold set in the Test, in milliseconds (ms). | | |
Application Summary
Application(s)

| Execution | Performance | Screenshot |
|---|---|---|

Measurements (Timers)

| Actual | Threshold | Status |
|---|---|---|
| The actual time recorded for the measurement. | The threshold set in the Test for the specific measurement. | |
By default, the measurements (timers) include the app start time, which is automatically recorded when the START function in the workload scripts is executed. This function is used by default to launch target apps. Additionally, any custom timers defined in the workload scripts will also be displayed in the UI. To learn how to configure custom timers, see the Scripting Functions Overview.
Hiding, showing, and sorting columns
You can customize your Application Test results table by adding, hiding, or sorting columns. Here's how to manage your columns:
- Configuration icon: In the top right corner of the Application Test results table, you'll find a configuration icon. Click it to open a pop-up for column management.
- Adding columns: You can add additional columns to your table. The following columns are available:
  - App Failures
  - App Performance
  - Comment
  - Test duration
  - Connector
- Hiding columns: If there are columns you don’t want to see, you can hide them using the configuration options.
- Sorting columns: To sort columns, drag and drop them into your preferred order.
Note: The Time and Test Name columns are fixed and cannot be hidden or moved.
Your configuration settings are saved in your local storage. This means your column preferences will persist when you navigate away from the page or log out and back in.
The configuration may reset if you use a different browser or clear your browser's stored site data.
Application Test Report
Generated Automatically
An Application Test PDF report with the results of a single Test is automatically generated once the Test is finished. You can download the PDF report by clicking the Download button next to the Test.
Note: This is the PDF report for which you can set email notifications. For more information, see the Report Settings.
Generated Manually
You can also generate a new single Application Test PDF report manually or using the Public API. To generate a PDF report in the Login Enterprise UI:
1. In the Application Testing results, click “>” (right arrow) next to the Test you’re interested in.
2. Click Generate PDF report (a browser pop-up will open).
3. In the browser pop-up, click Save to save the PDF report.
Tip: Alternatively, select the checkbox next to the specific Test, and in the top menu toolbar, click Generate PDF report.
Note: Email notifications for a single Application Test report generated manually aren’t supported.
Multiple Application Tests
You can select up to 5 Application Tests to compare.
Options for Comparison
- Select a Baseline Test: Choose a baseline Test to which the other Application Tests will be compared.
- Add/Remove Tests from comparison: Use the checkboxes to add or remove Application Tests from the comparison.
Results Page
The results are divided into two parts, just like for a single Application Test:
- Platform Summary: Displays the login time and latency, along with the percentage difference between the compared Tests. For details, see the Platform Summary.
- Application Summary: Compares the results of the Application scripts. For details, see the Application Summary.
Note: The results of the baseline Test are pinned in the first column. The other Test columns are sorted from oldest to newest.
If the results do not fit the screen, scrollbars will appear in the table. The baseline Test column remains pinned and does not move with the scrollbar.
The results of the Baseline Test are the primary reference. All Applications and timers from the baseline are displayed. If a selected Test has additional Applications or timers not present in the Baseline Test, those will only be shown when that Test is designated as the baseline.
Note: A PDF report for a multiple Application Test comparison is not generated automatically; you can generate one manually as described in the next section.
Generating an Application Test Report
You can generate and download a PDF report with the results comparing up to 5 Application Tests. In the Login Enterprise UI, you can do this in one of the following ways:
a. Using the toolbar on the Application Test results page.
b. Using the Generate PDF report button on the Compare page.
Note: An Application Test report comparing multiple Tests is not generated automatically. You can generate the report manually using either the UI, as described earlier, or the Public API. Also, email notifications for this type of report aren’t supported.
Configuring an Application Test Report
You can choose what you would like your Application Test PDF report to include. Once you click Generate PDF report, you can configure the following:
| Setting | Description | Options / Details | Notes |
|---|---|---|---|
| Introduction text | Include general information about Login Enterprise at the beginning of the report. | | — |
| Custom text | Add your own text to the PDF report. Useful for describing what was tested, why, or summarizing conclusions. | | |
Downloading a PDF Report via the Public API
There are two types of PDF reports: old and new.
Old PDF report
- Endpoint: /publicApi/v8-preview/reports/{reportId}/pdf
- Automatically generated after each Test Run.
- Available for single Application Test Runs only. Comparison reports are not supported.
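As a sketch of how the old report endpoint might be called, the snippet below builds the download URL for a given report ID. The base URL, the sample report ID, and the bearer-token authentication mentioned in the comments are assumptions; only the endpoint path comes from the documentation above, so check your appliance's Public API reference for the exact authentication scheme.

```python
from urllib.parse import quote


def old_report_pdf_url(base_url: str, report_id: str) -> str:
    """Build the download URL for the auto-generated ('old') PDF report.

    The endpoint path is taken from the documentation above; the base
    URL and report ID are placeholders you supply.
    """
    return f"{base_url}/publicApi/v8-preview/reports/{quote(report_id)}/pdf"


# Hypothetical appliance URL and report ID -- replace with your own.
url = old_report_pdf_url("https://myle.example.com", "3f9c2d1e")
print(url)
# GET this URL with an "Authorization: Bearer <token>" header (an
# assumption -- verify the auth scheme for your version) and save the
# response body as a .pdf file.
```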
New PDF report
- Endpoint: /publicApi/v8-preview/reports/application-test-runs/pdf
- Not saved automatically. You must generate it via the Public API.
- Supports single and multiple Application Test Run IDs, allowing you to create comparison reports.
For more information, see Accessing the Public API.
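To illustrate how the new endpoint differs, here is a minimal sketch that assembles a request for a comparison report covering several Test Run IDs. Only the endpoint path is taken from the documentation above; the testRunIds query-parameter name, the base URL, the run IDs, and the bearer-token header are assumptions to verify against your appliance's Public API reference.

```python
from urllib.parse import urlencode


def comparison_report_request(base_url: str, token: str, run_ids: list[str]):
    """Build the URL and headers for the 'new' PDF report endpoint.

    The endpoint path is documented above; the 'testRunIds' query
    parameter name is an assumption -- confirm it in your appliance's
    Public API reference before use.
    """
    url = f"{base_url}/publicApi/v8-preview/reports/application-test-runs/pdf"
    # Repeat the parameter once per run ID to request a comparison.
    query = urlencode([("testRunIds", rid) for rid in run_ids])
    headers = {"Authorization": f"Bearer {token}"}
    return f"{url}?{query}", headers


# Compare two Test Runs (IDs and token are placeholders).
url, headers = comparison_report_request(
    "https://myle.example.com", "YOUR_TOKEN", ["run-1", "run-2"]
)
print(url)
```

GET-ing the resulting URL with the headers shown should return the PDF bytes, which you can write to a file.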
Application Test PDF Report Breakdown
The PDF report summarizes the findings obtained from the Application Testing process. It provides an overview of the Test specifics, results, and measurements. The report is divided into the following sections:
- Introduction: Provides an overview of Application testing and the metrics used during this type of testing.
- Test Specifics: Provides an overview of the key details of the Test setup:
  - Product version: The Virtual Appliance version.
  - Connector: The specific Connector used for the Test.
  - Launcher group(s): The Group(s) of Launchers involved in the Test.
  - Test duration: The total duration of the Test.
  - Date: The date on which the Test was conducted.
  - Workload: The type of Workload or Test scenario used.
- Test Summary: Highlights key metrics for the Test:
  - Application failure: Displays the number of failed Applications, for example, a script that failed to run.
  - Application performance: Shows the number of Applications where the actual time exceeded the performance threshold.
- Platform Summary: Provides insights into the platform's performance based on latency and login performance:
  - Login performance: Indicates the performance during login operations.
  - Latency: Shows any latency issues that occurred during the Test.
  - Login Performance and Latency include the following:
    - Test name: The name of the Test conducted.
    - Bar chart: A visual representation of the performance data.
    - Actual time: The time taken for the specific Test.
    - Threshold: The predefined threshold, if applicable.
    - Execution: The result of the Test execution.
    - Performance: The performance analysis based on the Test outcome.
- Application Results: Dives deeper into the results for each Application with the following subsections:
  - Summary: A high-level result of the Application's performance during the Test.
  - Measurements: Includes a table for each measurement or timer taken during the Test. Each table includes:
    - Test name: The name of the Test.
    - Bar chart: A visual representation of the performance data.
    - Actual time: The actual time recorded for the measurement.
    - Threshold: The threshold value (if applicable).
    - Status: The status of the Test.
  - Screenshots: Screenshots taken of the particular Application under Test.
Additional Resources
- For information on Login performance, see Configuring Logon Components.
- To learn about Latency, see Monitoring Latency.