Create Functional Test
There are multiple ways to create a functional test in ReadyAPI. This topic provides step-by-step instructions for each of them.
Using SmartBear AI [Beta]
ReadyAPI AI generates test cases from your API definition using natural language instructions. You import an API definition, describe the test scenario you want to test, choose a response profile, select the assertion types you need, and generate a complete test case in seconds. You can then review, edit, or run your generated test case.
Beta Notice
This feature is available in Beta and continues to evolve. We welcome your feedback and use cases, especially if you encounter errors or unexpected results.
Enable or disable SmartBear AI in ReadyAPI
ReadyAPI ships with SmartBear AI features disabled by default.
Control SmartBear AI from the ReadyAPI UI
To enable the SmartBear AI features, follow these steps:
Open ReadyAPI.
In the toolbar, select Preferences.
In the left pane, go to Integrations, and select SmartBear AI.
Select Enable SmartBear AI integration.
Select OK.
After you enable SmartBear AI, the Open SmartBear AI button becomes active in the navigation bar. Use this button to open the Generate Tests with SmartBear AI dialog. When SmartBear AI is disabled, the button remains inactive.
Control SmartBear AI in managed deployments
Organizations that deploy ReadyAPI as a managed image can also control SmartBear AI availability with a JVM option. This setting disables AI features at startup and removes AI-related options from the ReadyAPI UI.
Use the following JVM option:
-Dreadyapi.enable.ai.features=<value>

To turn off all AI features, set the value to:
-Dreadyapi.enable.ai.features=false
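In a managed image, this option is typically passed at JVM startup, for example by appending it to the ReadyAPI VM options file. The file name and location below are an assumption and depend on your installation; treat this as a sketch:

```
# ReadyAPI.vmoptions (assumed name/location; one JVM option per line)
-Dreadyapi.enable.ai.features=false
```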
At a glance: Generate a test case with SmartBear AI
To create a test case using SmartBear AI:
Import your API definition.
Describe your test scenario in natural language.
Select a response profile (Performance or Accuracy).
Select the assertions you want included in the generated test case.
Click Generate Test Case to create the test.
Review and resolve validation errors, if any.
The sections below describe each of these steps in detail, along with recommended follow-up actions to take after you generate a test case.
![]()
Import an API definition
Import your API definition from a local file or through your SwaggerHub integration. After import, ReadyAPI displays the operations and schemas available for test generation.
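As a rough illustration of what ReadyAPI extracts from an imported definition, the sketch below lists the operations declared in an OpenAPI 3.0 document. The Petstore-style definition is a hypothetical example, not a real import:

```python
import json

# A minimal OpenAPI 3.0 definition, inlined for illustration.
# In practice you would read this from the file you import into ReadyAPI.
definition = json.loads("""
{
  "openapi": "3.0.0",
  "info": {"title": "Petstore", "version": "1.0.0"},
  "paths": {
    "/pet": {
      "post": {"operationId": "addPet"},
      "put": {"operationId": "updatePet"}
    },
    "/pet/{petId}": {
      "get": {"operationId": "getPetById"},
      "delete": {"operationId": "deletePet"}
    }
  }
}
""")

HTTP_METHODS = {"get", "put", "post", "delete", "patch", "head", "options"}

def list_operations(spec):
    """Return (method, path, operationId) tuples for each operation in the spec."""
    ops = []
    for path, item in spec.get("paths", {}).items():
        for method, op in item.items():
            if method in HTTP_METHODS:
                ops.append((method.upper(), path, op.get("operationId")))
    return ops

for method, path, op_id in list_operations(definition):
    print(f"{method} {path} -> {op_id}")
```

Each operation listed here corresponds to a request ReadyAPI can turn into a test step.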
Describe your test scenario
Enter a natural-language description of the scenario you want to test. ReadyAPI uses this description to shape the request flow, data, and assertions included in the generated test case.
Examples of scenario descriptions:
Example 1: Create a test that creates a pet, updates it, and deletes it.
Example 2: After logging in, purchase a pet, verify the stock shows the pet is sold, and log out.
Example 3: Log in, create a pet named "Fido", sell "Fido", update the status to "sold", verify the pet status is sold, and log out.
Select a response profile
Choose how SmartBear AI balances speed and analysis:
Performance: Generates results quickly with lightweight processing. Use this profile for rapid iteration or most everyday test-generation scenarios.
Accuracy: Generates more detailed and deeply analyzed test scenarios. This profile takes slightly longer but provides more precise and comprehensive results.
Select assertion types
Add the assertion types you want SmartBear AI to include in your generated test case.
Validate HTTP Response Status: Checks whether the test step receives an HTTP status code defined in the API specification.
Validate Swagger/OpenAPI Compliance: Verifies whether the response matches the API definition.
The assertion supports the following specifications:
Swagger 2.0 (OpenAPI 2.0)
OpenAPI 3.0.0
OpenAPI 3.1.0
Validate Response SLA: Checks whether the response arrives within a defined time limit.
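As a mental model, the three assertion types can be sketched outside ReadyAPI roughly as follows. The response data, field names, and thresholds below are hypothetical, and the compliance check is deliberately simplified (ReadyAPI validates against the full schema, not just required keys):

```python
import json

# Hypothetical sample response data, standing in for a real API call.
response = {
    "status": 200,
    "elapsed_ms": 180,
    "body": json.dumps({"id": 1, "name": "Fido", "status": "available"}),
}

def assert_http_status(resp, allowed_codes):
    """Validate HTTP Response Status: the code must be one the spec defines."""
    return resp["status"] in allowed_codes

def assert_schema_compliance(resp, required_fields):
    """Simplified stand-in for Swagger/OpenAPI compliance: required keys present."""
    body = json.loads(resp["body"])
    return all(field in body for field in required_fields)

def assert_response_sla(resp, max_ms):
    """Validate Response SLA: the response must arrive within the time limit."""
    return resp["elapsed_ms"] <= max_ms

print(assert_http_status(response, {200, 404}))                      # True
print(assert_schema_compliance(response, ["id", "name", "status"]))  # True
print(assert_response_sla(response, 500))                            # True
```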
Generate test cases
After you complete your inputs, generate your test cases with a single action. SmartBear AI creates the test steps and assertions based on your description and the imported API definition.
The generated test case is automatically added to the Project Workspace under Functional Tests. ReadyAPI creates a new project named Project X, where X is the next available project number. Inside the project, ReadyAPI creates a test suite named AI-Generated Test Suite 1 that contains the generated test case.
![]()
You can then review and edit the generated test case in the ReadyAPI UI, or run it using the GUI, TestRunner, or TestEngine, depending on your testing workflow.
Review and resolve validation errors
SmartBear AI generates a test case from the imported API definition and your prompt. ReadyAPI parses the AI response and adds the generated test case to the Project Workspace Navigator for review.
In some cases, ReadyAPI displays validation or parsing error messages during this process. These messages usually indicate issues parsing the AI-generated response, such as formatting inconsistencies or incomplete data exchanged between SmartBear AI and ReadyAPI. The following image shows an example.
![]()
Click OK to dismiss these messages and review the generated test case. If the test case fails to execute or behaves unexpectedly, include the error details when you submit a SmartBear Support request.
Review the generated test case
To review the test case after generation and confirm it behaves as expected, do the following:
Run the test case and check whether it executes successfully.
Review assertions and verify whether they pass or fail as expected.
If a test fails, check the following:
The API definition is complete and valid.
Requests and responses in the test case contain valid data.
The assertion logs, to understand why the test failed and whether the failure reflects a real issue.
Disable SmartBear AI in SLM
By default, SmartBear AI features are enabled for ReadyAPI in SmartBear License Management (SLM) for your organization.
To disable SmartBear AI for all users in your SLM organization, submit a SmartBear Support request with the following information:
Issue Type: Other
Request: Disable SmartBear AI for your SLM organization
Details: Include the exact SLM organization name
Disabling AI at the SLM level blocks all users in that organization from accessing AI features in ReadyAPI.
From Endpoint Explorer
The Endpoint Explorer provides an easy way to check the functionality of a REST API endpoint. You can send a REST request, view the response, and create a functional test for the endpoint on the spot.
To open the Endpoint Explorer:
Select Send a REST Request on the start page.
– or –
Select Endpoint Explorer on the main navigation bar:
![]()
To create a test:
Select a request method from the Method dropdown.
Enter the URL of your API in the Endpoint field.
Click + Add header and specify headers, if needed:

Switch to the Body tab and enter a request body, if needed.
Click Send to make an API call, and then view the response:

Click Create test:

Select the project to which you want to add the test and click Next:

Select the assertions to be added to your test and click Next:

ReadyAPI will create a project and display a confirmation dialog. In the dialog, click Run to run the created functional test, or click Add Data to create a data-driven test that will use an Excel data source. If you don't want to perform any of the operations, just close the dialog.
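The Endpoint Explorer fields map directly onto an ordinary HTTP request. As a rough offline sketch of what clicking Send dispatches (the endpoint URL and payload below are hypothetical), built here without actually sending the request:

```python
import json
import urllib.request

# Each field in the Endpoint Explorer corresponds to a part of the request.
req = urllib.request.Request(
    url="https://petstore.example.com/v2/pet",    # Endpoint field (hypothetical)
    method="POST",                                # Method dropdown
    headers={"Content-Type": "application/json"}, # + Add header
    data=json.dumps({"name": "Fido", "status": "available"}).encode(),  # Body tab
)

# Clicking Send corresponds to dispatching the request:
#   with urllib.request.urlopen(req) as resp:
#       print(resp.status, resp.read())
print(req.method, req.full_url)
```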
From API definition
You can create a functional test based on a WSDL, OpenAPI, Swagger, WADL, or AsyncAPI definition or a GraphQL schema.
Select File > New Functional Test.

– or –
On the Dashboard, click Functional Test in the New Test tile:

In the New Functional Test wizard, select the API Definition option and click Start.

Select how to input your API definition. You have two options:
File
Enter the file path or URL of the API definition in the field provided.
Alternatively, click the Browse button to locate and select the file from your computer.

SwaggerHub Integration
Import an API definition from SwaggerHub. This option is available if you've integrated ReadyAPI with your SwaggerHub account. For more details on integration, see the SwaggerHub integration page.
Select one of the following options:
My API: Search for and select an API from your private SwaggerHub account.
Public API: Search for and select an API from publicly available APIs on SwaggerHub.

Tip
Use the Filters button to refine the API list:
Select specific API specifications: OAS2, OAS3, OAS3.1, AsyncAPI, or All.
Optionally, select Private or Public for My API.
Click Select All or Deselect All to manage your filter selections.
Click Next.
Specify the project to add the new test to and click Next.

Select the assertions to be added to new requests and click Next.

You can create a single test case for all requests in the API definition or a separate test case for each request:

Click Finish.
ReadyAPI will create a project and display the confirmation dialog. In this dialog, you can run the created functional test or create a data-driven test that uses the Excel data source. If you do not need either, close the dialog.
From Endpoint
You can create a functional test based on the URL of your REST service.
Select File > New Functional Test.
– or –
On the Dashboard, click Functional Test in the New Test tile.
In the New Functional Test wizard, select the Endpoint option and click Start.

Enter the URL of your REST service. You can specify multiple URLs to create separate requests.

Specify the project to add the new test to and click Next.

Select the assertions to be added to new requests and click Next.

You can create a single test case for all specified URLs or a separate test case for each request:

Click Finish.
ReadyAPI will create a project and display the confirmation dialog. In this dialog, you can run the created functional test or create a data-driven test that uses the Excel data source. If you do not need either, close the dialog.
From ReadyAPI project
If you already have an API in the workspace, you can create a test for it.
Select File > New Functional Test.
– or –
On the Dashboard, click Functional Test in the New Test tile.
In the New Functional Test wizard, select the ReadyAPI Project option and click Start.

Select requests to be added to your test suite. The requests you select must be in the same project.

You can create a single test case for all requests in the API or a separate test case for each request:

Click Finish.
ReadyAPI will create a project and display the confirmation dialog. In this dialog, you can run the created functional test or create a data-driven test that uses the Excel data source. If you do not need either, close the dialog.
Generate test suites
One possible way to create a functional test is to generate it from a service specification that you have in your project. This is an easy way to get a functional test with a test suite and test cases.
To generate a functional test from a service specification:
Right-click a service in the Navigator and select Generate Test Suite from the context menu.

– or –
Select a service in the Navigator, and then select API > Generate Test Suite.

In the subsequent Generate Test Suite dialog:

Select <create> from the Test Suite drop-down list to create a new test suite.
Select one of the following Style options:
| Option | Description |
| --- | --- |
| One Test Case for each Resource | Creates one test case per resource. |
| Single Test Case with one Request for each Method | Creates a single test case for all resources. |
Select the corresponding Request Content option either to use existing requests or to create empty requests.
Select resources to be included in the test.
Select the Generate LoadUI Test check box if you want to have a load test for each generated test case.
Click OK.
Enter a new test suite name. Click OK.

ReadyAPI will show the created functional test in the Navigator.



