bitlin / ApiTestEngine

Best practice of API test, including automation test and performance test.

Home Page: http://debugtalk.com/post/ApiTestEngine-api-test-best-practice/


ApiTestEngine


Design Philosophy

Make full use of Python's existing powerful libraries: Requests, unittest and Locust, to achieve the goals of API automation testing, production environment monitoring, and API performance testing in a concise and elegant manner.

Key Features

  • Inherits all the powerful features of Requests; just have fun handling HTTP in a human-friendly way.
  • Define testcases in YAML or JSON format in a concise and elegant manner.
  • Supports function/variable/extract/validate mechanisms to create full test scenarios.
  • Testcases can be run in diverse ways: a single testset, multiple testsets, or an entire project folder.
  • Test reports are concise and clear, with detailed log records. See PyUnitReport.
  • Combines perfectly with Jenkins for continuous integration testing and production environment monitoring.
  • By reusing Locust, you can run performance tests without extra work.
  • Extensible, facilitating the implementation of a web platform with the Flask framework.

Background Introduction (Chinese) | Feature Descriptions (Chinese)

Installation/Upgrade

$ pip install git+https://github.com/debugtalk/ApiTestEngine.git#egg=ApiTestEngine

To upgrade to the newest available version, add the -U option.

$ pip install -U git+https://github.com/debugtalk/ApiTestEngine.git#egg=ApiTestEngine

If there is a problem with the installation or upgrade, you can check the FAQ.

To verify that the installation or upgrade succeeded, run ate -V and check that the correct version number is printed.

$ ate -V
0.3.0

Execute the command ate -h to view the command-line help.

$ ate -h
usage: main.py [-h] [-V] [--log-level LOG_LEVEL] [--report-name REPORT_NAME]
               [--failfast]
               [testset_paths [testset_paths ...]]

Api Test Engine.

positional arguments:
  testset_paths         testset file path

optional arguments:
  -h, --help            show this help message and exit
  -V, --version         show version
  --log-level LOG_LEVEL
                        Specify logging level, default is INFO.
  --report-name REPORT_NAME
                        Specify report name, default is generated time.
  --failfast            Stop the test run on the first error or failure.
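
These options can be combined with testset paths. For example, assuming a testset at filepath/testcase.yml (an illustrative path), something like the following would run it with verbose logging and stop at the first failure:

$ ate --log-level DEBUG --failfast filepath/testcase.yml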

Write testcases

It is recommended to write testcases in YAML format.

Here is a testset example of a typical scenario: get a token at the beginning, then carry the token in the headers of each subsequent request.

- config:
    name: "create user testsets."
    import_module_functions:
        - tests.data.custom_functions
    variable_binds:
        - user_agent: 'iOS/10.3'
        - device_sn: ${gen_random_string(15)}
        - os_platform: 'ios'
        - app_version: '2.8.6'
    request:
        base_url: http://127.0.0.1:5000
        headers:
            Content-Type: application/json
            device_sn: $device_sn

- test:
    name: get token
    request:
        url: /api/get-token
        method: POST
        headers:
            user_agent: $user_agent
            device_sn: $device_sn
            os_platform: $os_platform
            app_version: $app_version
        json:
            sign: ${get_sign($user_agent, $device_sn, $os_platform, $app_version)}
    extract_binds:
        - token: content.token
    validators:
        - {"check": "status_code", "comparator": "eq", "expected": 200}
        - {"check": "content.token", "comparator": "len_eq", "expected": 16}

- test:
    name: create user which does not exist
    request:
        url: /api/users/1000
        method: POST
        headers:
            token: $token
        json:
            name: "user1"
            password: "123456"
    validators:
        - {"check": "status_code", "comparator": "eq", "expected": 201}
        - {"check": "content.success", "comparator": "eq", "expected": true}

For detailed rules on writing testcases, you can read the specification.

Run testcases

ApiTestEngine can run testcases in diverse ways.

You can run a single testset by specifying its testset file path.

$ ate filepath/testcase.yml

You can also run several testsets by specifying multiple testset file paths.

$ ate filepath1/testcase1.yml filepath2/testcase2.yml

If you want to run all testsets of a whole project, you can do so by specifying the project folder path.

$ ate testcases_folder_path
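
The command-line options shown earlier apply to all of these run modes. For instance, to run a whole project folder and give the generated report a fixed name (the folder path and report name here are illustrative), you could run something like:

$ ate --report-name smoke_test testcases_folder_path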

Supported Python Versions

Python 2.7, 3.3, 3.4, 3.5, and 3.6.

To learn more ...
