Generate Tests with SmartBear AI [Beta]

ReadyAPI AI generates test cases from your API definition using natural language instructions. You import an API definition, describe the scenario you want to test, choose a response profile, select the assertion types you need, and generate a complete test case in seconds. You can then review, edit, or run the generated test case.

Beta Notice

This feature is available in Beta and continues to evolve. We welcome your feedback and use cases, especially if you encounter errors or unexpected results.

Note

This beta release requires a license hosted in the SLM cloud environment. Support for on-premise SLM licensing is planned for upcoming releases and for general availability (GA).

Enable or disable SmartBear AI in ReadyAPI

ReadyAPI ships with SmartBear AI features disabled by default.

Control SmartBear AI from the ReadyAPI UI

To enable the SmartBear AI features, follow these steps:

  1. Open ReadyAPI.

  2. In the toolbar, select Preferences.

  3. In the left pane, go to Integrations and select SmartBear AI.

  4. Select Enable SmartBear AI integration.

  5. Select OK.

    After you enable SmartBear AI, the Open SmartBear AI button becomes active in the navigation bar. Use this button to open the Generate Tests with SmartBear AI dialog. When SmartBear AI is disabled, the button remains inactive.

Control SmartBear AI in managed deployments

Organizations that deploy ReadyAPI as a managed image can also control SmartBear AI availability with a JVM option. When set to false, the option disables AI features at startup and removes AI-related options from the ReadyAPI UI.

  • Use the following JVM option: -Dreadyapi.enable.ai.features=<value>

  • To turn off all AI features, set the value to: -Dreadyapi.enable.ai.features=false
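As a sketch, one way to persist this option in a managed image is to append it to ReadyAPI's .vmoptions file. The file name and location below are assumptions; use the path from your own installation:

```shell
# Sketch: persist the JVM option by appending it to ReadyAPI's .vmoptions file.
# The file name below is an assumption; adjust it for your installation directory.
VMOPTIONS="ReadyAPI.vmoptions"

# Disable all AI features at startup.
echo "-Dreadyapi.enable.ai.features=false" >> "$VMOPTIONS"

# Show the resulting file for verification.
cat "$VMOPTIONS"
```

Options added this way are passed to the JVM every time ReadyAPI starts, so the setting survives restarts without any per-user UI configuration.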

At a glance: Generate a test case with SmartBear AI

To create a test case using SmartBear AI:

  1. Import an API definition.

  2. Describe your test scenario in natural language.

  3. Select a response profile (Performance or Accuracy).

  4. Select the assertions you want included in the generated test case.

  5. Click Generate Test Case to create the test.

  6. Review and resolve validation errors, if any.

  7. Review the generated test case.

  8. Share feedback or report issues.

Steps 6–8 outline recommended follow-up actions to take after generating a test case.

Generate Tests with SmartBear AI dialog showing fields for API Definition, Prompt, optimization options, and assertion settings.

Import an API definition

Import your API definition from a local file or through your Swagger Studio integration. After import, ReadyAPI displays the operations and schemas available for test generation.

Describe your test scenario

Enter a natural-language description of the scenario you want to test. ReadyAPI uses this description to shape the request flow, data, and assertions included in the generated test case.

Examples of scenario descriptions:

  • Example 1: Create a test that creates a pet, updates it, and deletes it.

  • Example 2: After logging in, purchase a pet, verify the stock shows the pet is sold, and log out.

  • Example 3: Log in, create a pet named "Fido", sell "Fido", update the status to "sold", verify the pet status is sold, and log out.

Select a response profile

Choose how SmartBear AI balances speed and analysis:

  • Performance: Generates results quickly with lightweight processing. Use this profile for rapid iteration and most everyday test-generation scenarios.

  • Accuracy: Generates more detailed and deeply analyzed test scenarios. This profile takes slightly longer but provides more precise and comprehensive results.

Select assertion types

Add the assertion types you want SmartBear AI to include in your generated test case.

  • Validate HTTP Response Status: Checks whether the test step receives an HTTP status code defined in the API specification.

  • Validate Swagger/OpenAPI Compliance: Verifies whether the response matches the API definition.

    The assertion supports the following specifications:

    • Swagger 2.0 (OpenAPI 2.0)

    • OpenAPI 3.0.0

    • OpenAPI 3.1.0

  • Validate Response SLA: Checks whether the response arrives within a defined time limit.

Generate test cases

After you complete your inputs, generate your test cases with a single action. SmartBear AI creates the test steps and assertions based on your description and the imported API definition.

The generated test case is automatically added to the Project Workspace under Functional Tests. ReadyAPI creates a new project named Project X, where X is the next available project number. Inside that project, ReadyAPI creates a test suite named AI-Generated Test Suite 1 that contains the generated test case.

Project Workspace showing the generated project, test suite, and test case.

You can then review and edit the generated test case in the ReadyAPI UI, or run it using the GUI, TestRunner, or TestEngine, depending on your testing workflow.
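For a headless run, a TestRunner invocation might look like the following sketch. The install path, project file name, and test case name are assumptions for illustration; the suite name reflects the default naming described above:

```shell
# Sketch: run the generated test case headlessly with ReadyAPI's TestRunner.
# READYAPI_HOME, the project file, and the suite/case names are assumptions;
# substitute the values from your own installation and project.
READYAPI_HOME="${READYAPI_HOME:-/opt/ReadyAPI}"
TESTRUNNER="$READYAPI_HOME/bin/testrunner.sh"

if [ -x "$TESTRUNNER" ]; then
  # -s selects the test suite, -c the test case within the project file.
  "$TESTRUNNER" -s "AI-Generated Test Suite 1" -c "TestCase 1" \
    "Project-1-readyapi-project.xml"
else
  echo "testrunner.sh not found at $TESTRUNNER; set READYAPI_HOME first"
fi
```

Running the generated test case from the command line like this lets you fold it into CI pipelines alongside tests created manually.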

Review and resolve validation errors

SmartBear AI generates a test case from the imported API definition and your prompt. ReadyAPI parses the AI response and adds the generated test case to the Project Workspace Navigator for review.

In some cases, ReadyAPI displays validation or parsing error messages during this process. These messages usually indicate issues parsing the AI-generated response, such as formatting inconsistencies or incomplete data exchanged between SmartBear AI and ReadyAPI. The following image shows an example.

Validation error message displayed while parsing the AI-generated response.

Click OK to dismiss these messages and review the generated test case. If the test case fails to execute or behaves unexpectedly, include the error details when you submit a SmartBear Support request.

Review the generated test case

To review the test case after generation and confirm it behaves as expected, do the following:

  • Run the test case and check whether it executes successfully.

  • Review assertions and verify whether they pass or fail as expected.

If a test fails, check that:

  • The API definition is complete and valid.

  • Requests and responses in the test case contain valid data.

Then review the assertion logs to understand why the test failed and whether the failure reflects a real issue.

Share feedback or report issues

You can submit feedback directly from the Generate Tests with SmartBear AI dialog in ReadyAPI.

If the issue persists or prevents successful execution, submit a SmartBear Support request and include any validation or parsing errors you encountered.

Disable SmartBear AI in SLM

By default, SmartBear AI features are enabled for ReadyAPI in SmartBear License Management (SLM) for your organization.

To disable SmartBear AI for all users in your SLM organization, submit a SmartBear Support request with the following information:

  • Issue Type: Other

  • Request: Disable SmartBear AI for your SLM organization

  • Details: Include the exact SLM organization name

Disabling AI at the SLM level blocks all users in that organization from accessing AI features in ReadyAPI.
