
Run a test scenario

Once you have constructed the test scenario, you can run it to generate a test report.

Selecting a suitable running method#

Apidog provides multiple ways to run test scenarios, suitable for different testing requirements.
1. Local visual execution: This method is initiated from a local machine and is suitable for small-scale, quick testing. It is particularly effective when development and testing occur simultaneously, allowing for real-time monitoring and immediate adjustments.
2. Local Apidog CLI execution: Ideal for handling large-scale data or iterative scenarios, this method offers increased execution speed. It can run offline, which suits environments with restricted resources or those that do not require a graphical user interface.
3. CI/CD execution: This execution style is integrated into the CI/CD pipeline, making it a great option for automated integration and continuous deployment processes. It is especially useful in settings where tests run frequently to confirm the stability of APIs after each code update.
4. Self-hosted runner execution: Teams can set up the Apidog Runner on their own servers, leveraging more robust computing resources for testing. This method supports scheduled tasks, making it ideal for scenarios that need regular testing or have significant testing demands.
TIP
In test scenarios, if environment/global variables are used, the actual values of these variables may differ depending on the execution method chosen, which could lead to inconsistent test results. Learn more.
Let's start with the local visual execution.

Getting started#

1. Go to the desired test scenario and select the environment in which you want the requests to run.
2. Click on "Run".
3. You will see a test report displaying information such as pass rate, execution time, and other data for the current run. You can expand each request to view validations and assertions.
4. Click on "more" to inspect the actual request and response details.
Learn more about test reports.

Run options in test scenarios#

When running a test scenario in Apidog, several settings and options can be configured to tailor the test execution to specific requirements.
[Screenshot: run options for a test scenario]
Below, we explore the critical aspects of these settings:

Environment#

Specify the service (base URL) to which requests in the scenario should be directed and the variable set to be used. For more details, see environments & services.
Note that custom requests have their own full URL and will NOT be directed to the environment set in this context, unlike imported steps.

Test data#

Test scenarios support importing external test data sets. When a test scenario runs, the system loops through all data sets in the data file and assigns the data in each set to the corresponding variables; see data-driven testing for details.
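As a minimal sketch, a data file for data-driven testing might look like the CSV below. The column names (username, password, expectedStatus) are hypothetical and assume that each header matches a variable referenced in the test scenario; when the scenario runs, each row becomes one iteration with its values assigned to those variables.

```csv
username,password,expectedStatus
alice,Passw0rd!,200
bob,wrong-password,401
guest,,401
```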

Iterations#

Configure the number of times all steps within the scenario will be executed in a loop.
If there's a significant amount of data to process, it's recommended to use the Apidog CLI for execution instead of the Apidog client to optimize performance.

Threads#

Execute all steps in multiple threads; data is isolated between threads to prevent interference.
Note that this feature is in Beta and may require further performance optimization. For rigorous load testing, it is advised to use the Performance test functionality instead.
Threads are not supported in the CLI.

Runs on#

The machine that actually consumes hardware resources to run test scenarios. All requests initiated in the test scenarios will be sent from the machine specified here. Therefore, differences in the network environment of the requesting machine may lead to varying test results.
NOTE
This setting will not be saved as part of the test scenario's run configuration. Each time, the local machine will be used by default to run the test scenarios. Additionally, this setting will not take effect during batch runs or CLI executions. In these cases, requests will be initiated using the resources of the current machine.
When specifying a machine to run test scenarios, if the test scenario involves files (such as file uploads, database connections, external programs, SSL certificates, etc.), all required files must be stored locally on the specified machine for proper functionality.

Notifications#

Enabling this feature will send notifications to specified recipients once the manual test scenario is complete. The notification will include an overview of the test results and a link to the detailed report. You can configure whether to send the notification as soon as the test finishes or only when a failure occurs, helping to minimize unnecessary alerts. Refer to notification settings for more detailed information.

Shared#

By enabling the "Share" option on the right side of the Advanced Settings, the test report generated after each test scenario run will be automatically shared with other members of the project. You can view all test reports that have been shared within the team under the Shared tab in the Test Reports section. Refer to test reports for more details.
If the current test scenario includes steps with endpoints imported from other projects, you can refer to this guide: Manage the runtime environment of APIs from other projects

Advanced settings#

On error#

Configure how the test should handle errors, which can include assertion failures (see the sketch after this list), data format validation failures, server errors, and so on. The options are:
Ignore: Continue executing the next step when an error occurs (default setting).
Continue: End the current iteration and start the next one when an error occurs.
End execution: Stop the entire run when an error is encountered.
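An assertion failure in a post processor script is one example of an error that these settings govern. Below is a minimal sketch using the Postman-compatible pm scripting API that Apidog supports; the specific checks and the id field are illustrative assumptions, not part of any particular endpoint.

```javascript
// Post processor script: if either assertion fails, the step is treated as an error,
// and the "On error" setting decides whether the run ignores it, ends the current
// iteration, or stops the entire execution.
pm.test("Status code is 200", function () {
    pm.response.to.have.status(200);
});

pm.test("Response body contains an id field", function () {
    const body = pm.response.json(); // assumes a JSON response
    pm.expect(body).to.have.property("id");
});
```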

Delay#

Set a pause between sending each step to manage and control the execution speed.

Save request/responses#

By default, Apidog saves every request and response. In cases where requests or responses are significantly large (potentially several MBs), they might take up considerable disk space. You can enable this option to skip saving every request and response and keep only the assertion and validation results.
Alternatively, you can choose to save only failed requests and responses to conserve space.

Keep variable values#

This option is enabled by default: when global and environment variables are modified during the test, their current values retain the last modified result. If this option is disabled, global and environment variables will not change after the test run; they will keep the values they had before the run.
Local variables are not affected by this setting and will be cleared after each run.
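To illustrate what this setting affects, a post processor script might update an environment variable during the run. The sketch below uses the Postman-compatible pm API that Apidog supports; the variable name token and the accessToken response field are hypothetical.

```javascript
// Post processor script: store a value returned by a previous step so later steps can reuse it.
// With "Keep variable values" enabled, the current value of "token" keeps this last
// modification after the run; with it disabled, "token" reverts to its pre-run value.
const body = pm.response.json(); // assumes a JSON response containing accessToken
pm.environment.set("token", body.accessToken);

// pm.variables sets a temporary variable scoped to this run, analogous to the local
// variables mentioned above, which are cleared after each run regardless of this setting.
pm.variables.set("lastLoginAt", new Date().toISOString());
```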

Run with stored cookies#

In the lower right corner of Apidog, the Cookies icon leads to cookie management. Apidog automatically saves cookies when making API requests. If you want to use the saved cookies during test scenario execution, enable this option.

Save cookies after run#

Similar to above, if you wish to update the saved cookies after executing a test scenario, enable this option.
By configuring these advanced settings in Apidog, you can fine-tune your test executions to meet your specific testing needs, ensuring efficient and precise test runs.

Runtime settings under test scenario design mode#

If you are in test scenario design mode, the relevant runtime configurations are collapsed into a settings button to the right of the "Run" button. Hover over this button to see the detailed runtime configuration for the test scenario.

Running functional tests#

After running functional tests, you will be directed to the test scenario execution page. The pie chart in the image below provides an overview of the test results, updating in real-time as the test scenario runs. Below the pie chart, you’ll find the detailed test steps being executed, with the status of each step displayed during the run.
[Screenshot: a functional test run in progress]
Once the functional test run is complete, you can click on the relevant endpoint to view its metrics and status during the test. This includes the endpoint name, request method, request URL, response status code, response time, response content, data validation, and assertion results. For more details, please refer to test reports.

Rules for using environment/global variables across different running methods#

Environment and global variables are persistent, meaning they can be saved for long-term use across multiple runs or different test scenarios. However, the actual variable values used may vary depending on the selected running method. For example:
If a test scenario uses an environment variable Token, it might run successfully when executed within the client but fail when run using a self-hosted Runner because the value of Token is incorrect.
This discrepancy occurs because, when running in the client, the actual value of the environment variable Token is taken from the value stored locally in the client. A self-hosted Runner does not have the same locally stored Token, leading to a failure.
To address this issue, Apidog provides a set of rules for managing environment/global variable values across the different running methods:
| Running Method | Environment/Global Variable Usage | Variable Storage Location |
| --- | --- | --- |
| Local (Client, Web) | Uses the current values of the environment/global variables for execution. | Stored locally. Can be modified manually or updated through pre/post processors. Visible in Environment Management > Environment/Global Variables > Current Values. |
| CLI, CI/CD | For real-time online execution: uses the initial values of the environment/global variables by default, or the values stored in a file on the machine running the test scenario when specified via the --variables path option. For execution with exported data: uses the environment/global variable values included in the exported file. | Initial values are stored in Apidog Cloud and can only be modified manually within the client. File-based values are stored in the file specified by the --variables path option and can be modified manually or updated through pre/post processors. Exported values are stored in the exported file and can be modified manually or updated through pre/post processors. |
| Self-hosted Runner | Uses the initial values of the environment/global variables by default, or the values stored locally within the Runner. | Initial values are stored in Apidog Cloud and can only be modified manually within the client. Runner-stored values can be modified manually or updated through pre/post processors, and are visible through the product user interface or in the file at /opt/runner/variables within the Runner. |

Running test scenarios with endpoints from other projects#

When a test scenario includes endpoints imported from other projects, the request URLs for these endpoints during execution are determined by the configuration you set in the Environment associations.
For example:
If the "Develop Env" of the current project is associated with the "Prod Env" of another project, When running the test scenario using "Develop Env", the imported endpoint will be sent to the URL from the "Prod Env". All other endpoints will use the URL from the "Develop Env" of the current project.
[Screenshot: environment association settings]

Implementing various tests#

The mentioned steps represent the basic execution of a test scenario. When setting up test scenarios, you can incorporate various advanced settings to fulfill diverse testing requirements.
Data-driven testing: Conduct tests using diverse data sets to validate system behavior across various scenarios.
Performance testing: Evaluate system performance under varying load conditions to assess scalability and responsiveness.
Scheduled tasks: Schedule test scenarios to run automatically at set times or intervals.
CI/CD integration: Automate build, test, and deployment processes to ensure reliable and frequent software releases.