Apidog Docs
πŸ‡ΊπŸ‡Έ English
  • πŸ‡ΊπŸ‡Έ English
  • πŸ‡―πŸ‡΅ ζ—₯本θͺž
HomeLearning Center
Support CenterAPI ReferencesDownloadChangelog
HomeLearning Center
Support CenterAPI ReferencesDownloadChangelog
Discord Community
Slack Community
X / Twitter
πŸ‡ΊπŸ‡Έ English
  • πŸ‡ΊπŸ‡Έ English
  • πŸ‡―πŸ‡΅ ζ—₯本θͺž
πŸ‡ΊπŸ‡Έ English
  • πŸ‡ΊπŸ‡Έ English
  • πŸ‡―πŸ‡΅ ζ—₯本θͺž
  1. Run Test Scenarios

Shared Test Data

In automated testing, many test scenarios need the same test data, such as user login credentials, product data, or configuration parameters. Apidog supports creating Shared Test Data that can be reused across multiple scenarios. This reduces repetitive work, ensures data consistency, and makes test resources easier to manage.

Main Advantages of Shared Test Data#

1. Cross-Scenario Sharing: Maintain one set of test data at the project level for multiple scenarios to reference.
2. Unified Management: Test data is stored centrally; modifying it in one place updates every scenario that references it.
3. Consistency Guarantee: Different scenarios use the same base data, avoiding deviations in results.
4. Standardized Workflow: Establishes test data standards, improving team collaboration and the maintainability of the testing workflow.

Creating Shared Test Data#

Method 1: Static Test Data#

1. In your project, click Tests in the left menu, then select the Test Data tab.
2. Click the Create Test Data (Static) button and enter a name.
3. Import or manually edit the data. Apidog supports CSV and JSON imports, manual table editing, and bulk generation.
4. Click Save to create the shared test data.
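For example, a CSV import might look like the following, where each column header becomes a variable name and each row becomes one data record (the column names here are illustrative, not from the original):

```csv
username,password,expected_status
alice,secret123,200
bob,wrong-pass,401
```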

Method 2: Database Connection#

This method pulls data dynamically from a database, which is ideal for keeping test data consistent with real-world data.
1. In the Test Data tab, click Create Test Data (Database).
2. If no database connection is configured yet, click Set Data Source Config to add or select a database.
3. Write a SQL query to fetch the test data. You can also use variables in the SQL.
4. Click Save to create the shared test data.
Once the data is pulled, it remains static until you manually refresh it.
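As an illustrative sketch (the table and variable names below are assumptions, not from the original), a query feeding a shared dataset might use an Apidog variable like this:

```sql
-- Hypothetical example: the "test_users" table and the {{env_name}} variable
-- are assumed names. {{env_name}} is substituted with the variable's current
-- value before the query runs against the linked database.
SELECT username, password, role
FROM test_users
WHERE environment = '{{env_name}}'
LIMIT 100;
```

Each returned column (here username, password, and role) becomes a variable you can reference in test steps.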

Using Shared Test Data in Scenarios#

Once you create shared test data, you can reference it in any test scenario.
1. Open a test scenario and, in the run configuration panel, click the Test Data dropdown to select your shared data.
2. Once referenced, you can use the data in test steps with the {{variable_name}} syntax.
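For instance, if the shared data contains a column named username (an assumed column name), a request body in a test step could reference it like this:

```json
{
  "user": "{{username}}"
}
```

At run time, Apidog substitutes the placeholder with the value from the current data row.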

Managing Shared Test Data#

Editing Shared Test Data#

In the "Shared Test Data" list, click a data name to open the edit screen. You can:
• Change the data name
• Add, delete, or edit data rows
• Add or remove data columns (variables)
• Import new data to overwrite the existing data
• Export the current data as CSV or JSON
After you update shared test data, all scenarios that reference it automatically use the latest values; no manual sync is needed.
Editing limits for database-type shared test data:
• Data content is read-only; you cannot edit individual values directly
• Manual refresh is supported: the data is fetched again from the linked database and overwrites the current set
• You can change the test data name
• You can change the database connection settings or the SQL query

Configuring Data by Environment#

Shared test data can be configured per environment. You can maintain a separate dataset for development, testing, and other environments; when you switch environments, the data for that environment is used automatically.

Using Shared Test Data in Scripts#

You can access shared test data in pre-processor or post-processor scripts. The variable name must match the column name in the shared test data.

Shared Test Data vs Scenario Test Data#

| Comparison | Shared Test Data | Scenario Test Data |
| --- | --- | --- |
| Data scope | Project-level; can be used by multiple test scenarios | Only the current test scenario |
| Data sync | Changes automatically apply to all scenarios that reference it | Affects only the current scenario |
| Use case | Common base data, e.g. user info, product data | Data specific to one scenario |
| Maintenance | Low; managed in one place | Higher; maintained per scenario |

Best Practices#

1. Use the right data granularity
Use shared test data for highly reusable base data; keep scenario-specific data inside the scenario.
2. Use clear naming
Give shared test data descriptive names (e.g. "User login data", "Product list data") so the team can understand and choose them easily.
3. Clean up unused data regularly
Periodically check which shared test data is still referenced and remove data that is no longer used to keep the list tidy.
4. Use with environment variables
Keep environment-related settings (e.g. URLs, keys) in environment variables and business data in shared test data for clear separation of concerns.

FAQs#

What's the difference between shared test data and environment variables?
Environment variables are best for configuration (e.g. API base URL, keys), usually one value per variable. Shared test data is best for business data and supports multiple rows for data-driven tests. You can use both together.
Does changing shared test data affect tests that are already running?
No. A running test uses a snapshot of the data from when the run started. Your changes only affect new test runs you start after that.
How many records can shared test data hold?
Shared test data supports a large number of records; the exact limit depends on your team plan. Keeping each dataset to a reasonable size (e.g. under 1000 rows) is recommended for best performance.
Modified at 2026-02-26 10:34:51