Create a test scenario

A test scenario is the fundamental unit of testing in Apidog, analogous to a Collection in Postman. When you need to send multiple requests consecutively, build continuous test flows, or repeat requests with different test data, you can create a test scenario and add the necessary requests to it.
Using test scenarios in Apidog, you can efficiently meet a range of API testing requirements:
1. Sequential Request Execution: Organize and execute multiple requests in a specified order to simulate user interactions or process flows.
2. Test Reporting: Automatically generate reports that provide detailed visualizations of assertions and individual request outcomes.
3. CI/CD Integration: Integrate test scenarios into CI/CD pipelines to ensure automatic testing during development cycles, facilitating early detection of issues.
4. Performance Testing: Evaluate API performance under varying loads and generate trend analyses to identify changes in response behavior over time.
5. Dynamic Parameter Testing: Execute requests multiple times using dynamically generated parameters to test how the API handles variable inputs.
6. Predefined Test Data: Use preset data for requests to simulate realistic operating conditions and verify API responses against expected outputs.
7. Data Passing Between Requests: Automatically pass data from the output of one request to another, which is crucial for testing APIs that require state persistence across calls (see the script sketch below).
8. Logical Request Relationships: Configure logical conditions such as if, for, and foreach to manage the execution flow based on the outcomes of prior requests or specific conditions.
These features allow you to create versatile and effective test environments that contribute to robust and high-quality API development.
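
As an illustration of assertions in test reports and data passing between requests (items 2 and 7 above), the minimal sketch below shows a post-processor script attached to a hypothetical "login" step. It assumes the Postman-style `pm` scripting API that Apidog supports; the response fields and the `authToken` variable name are illustrative, and the built-in Assertion and Extract variable processors can achieve the same result without code.

```javascript
// Post-processor script on a hypothetical "login" step.
const body = pm.response.json();

// 1) Assertions surface as pass/fail results in the test report.
pm.test("login returns 200 and a token", function () {
    pm.response.to.have.status(200);
    pm.expect(body.token).to.be.a("string").that.is.not.empty;
});

// 2) Store the token so later steps in the scenario can reference it
//    as {{authToken}} in their headers, parameters, or body.
pm.environment.set("authToken", body.token);
```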

Create a test scenario

Upon opening Apidog, navigate to the "Tests" module, then click the + next to the search bar to create a new test scenario. Select the appropriate directory for it and set the priority to complete the creation.

Add test steps

Once you have set up your test scenario in Apidog, you can start populating it with requests. There are several ways to add requests, each suited to different needs and levels of flexibility:
1. Requests linked to the endpoint spec (these can be updated as the endpoint specification changes):
a. Import from endpoint spec: This method imports requests straight from the structured definition in the API spec. It ensures your tests align with the API's documented interface, although you may need to manually adjust request parameters to meet specific testing conditions.
b. Import from endpoint case: Use this option to pull requests from predefined endpoint cases that already contain configured parameters. This is particularly useful for standardized tests that simulate real-world API usage, enabling consistency across test runs.
2. Independent requests not associated with the API spec (these do not update in response to changes in the API spec and allow for greater customization):
a. Add custom request: Craft requests from scratch to tailor test scenarios to specific requirements. This approach offers maximum flexibility to explore beyond the bounds of the existing API specification.
b. Add from cURL: Import requests from cURL command lines. It's a handy way to quickly create requests that mimic complex or unique API interactions without being constrained by the API spec.
3. Reference other test scenarios:
a. Include steps from other test scenarios: This method allows you to import specific steps from other test scenarios already defined in your project.
b. Reference other test scenarios: For comprehensive testing, you may need to run another test scenario in its entirety. This capability lets you integrate all related tests and configurations, avoiding duplicated setup effort and fostering unified testing strategies.

Import from endpoint spec

You can import endpoint specs from the current project as steps in the test scenario. There are two modes when importing endpoints: "Manual" and "Automated." For more detailed instructions, please refer to Sync data from endpoints/endpoint cases.
Manual
In "Manual" mode, modifications to the endpoint documentation within the project do not have an immediate impact on the endpoints in the test steps. Synchronization of test data only occurs when testers activate the "Manual" button. It is important to note that alterations made to the test step data will not update the endpoint documentation, even when clicking for "Manual Sync". Instead, clicking this button enables the test scenarios to retrieve information from the endpoint documentation for synchronization purposes.
Automated
In the "Automated" mode, any changes in the endpoint documentation within the project will be updated synchronously in the test steps.
If you need to test endpoints from other projects in one test scenario, please refer to Import Endpoints/Cases from Other Projects to Test Steps.

Import from endpoint case

You can choose to import endpoint cases from the current project or other projects. There are two modes when importing endpoint cases: "Copy" and "Reference".
Copy
When importing an endpoint case as "Copy", the parameters in the endpoint case will also be copied into the test steps. They will be independent of each other, and changes in each will not affect the others. Manual sync can be selected.
Reference
When importing an endpoint case as "Reference", it will directly use the endpoint case from the original project for the request.
If the test step is referenced from a case, you'll see a prompt indicating that modifying this step will also affect the original endpoint case and any other steps that use it.

Add custom request

In a real-world workflow, you may need to call an endpoint outside of the project, such as a third-party payment endpoint.
You can add a custom API request in the test steps. The custom request can be any HTTP request, including the common GET, POST, PUT, and DELETE methods.

Add from cURL

In real-world work, many endpoint requests are shared in the form of cURL command lines. You can import a cURL request into the test steps with a single click.
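
For example, a cURL command like the one below (a hypothetical endpoint, token, and payload) can be pasted as-is and converted into a test step:

```bash
# Hypothetical third-party payment endpoint; replace the URL, token, and body
# with the actual cURL command you were given.
curl -X POST "https://api.example.com/v1/payments" \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer YOUR_TOKEN" \
  -d '{"orderId": "12345", "amount": 99.9, "currency": "USD"}'
```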

Include steps from other test scenarios

You can clone test steps or flow control conditions by importing them from other test scenarios within the same project.

Reference other test scenarios

You can reference other test scenarios as a test step. There are two use cases:
1. If your business process includes common, reusable API test steps, you can compile these steps into a small test scenario and then reference it directly from other, larger test scenarios.
2. If you need to run regression tests on the product's main end-to-end flows, you can assemble the relevant sub-test scenarios by referencing them within a single test scenario, completing the regression of all main flows with one click.
To prevent infinite loops and situations where a test scenario cannot stop running normally, a test scenario cannot reference itself when referencing other test scenarios.

Orchestrate the test scenario

Clicking any test step opens orchestrate mode. This mode gives you a larger working area so you can fill in the details of each test step more efficiently. The left side of the page shows the overall flow of the test scenario, and the right side shows the details of the selected test step. Endpoint requests and flow control components have different detail panels.
You can adjust the order of the steps by dragging the ≡ handle in front of a step.
You can use the ⬆️ and ⬇️ keys to quickly switch between test steps in this mode.
In orchestrate mode, you can edit multiple steps and then click the "Save All" button in the top left corner to save all of your changes.
If any step has unsaved changes, it will be marked with a dot in the list on the left. Remember to save the changes you have made.