Write an Integration Test Case
All the tests should be located under osc/tests/integration. Each directory under this path stands for a single test case, so you can start by creating an empty directory if you're writing a new test case.
mkdir -p osc/tests/integration/my_new_test
Each test case consists of some SQL files and one config file in JSON format. It looks like below:
drwxr-xr-x. 31 832 Jan 11 14:45 ..
-rw-r--r--. 1 359 Dec 20 10:33 before.sql
-rw-r--r--. 1 529 Dec 20 10:33 config.json
-rw-r--r--. 1 31 Dec 20 10:33 drop.sql
-rw-r--r--. 1 37 Dec 20 10:33 during.sql
-rw-r--r--. 1 250 Dec 20 10:33 new.sql
Each .sql file stands for a set of SQL operations to be executed at a certain point during OSC. The config.json file decides when and how those SQL files will be run during OSC, and which parameters will be passed into OSC. The content of this file looks like below:
{
"info": {
"desc": "simple table test"
},
"params": {
"force_cleanup": true,
"allow_new_pk": true,
"eliminate_dups": false,
"rm_partition": true,
"allow_drop_column": true,
"skip_pk_coverage_check": true,
"ddl_file_list": [
"new.sql"
]
},
"hooks": {
"before_init_connection": "before.sql",
"after_run_ddl": "drop.sql",
"before_replay_till_good2go": "during.sql"
},
"expect_result": {
}
}
We will go through each section of this file. Before we start, there's one rule to remember: the integration test framework will only pick up the sections it's expecting, which are params, hooks and expect_result at the moment. You can put as much descriptive information as you want in other sections of the same file, just like the info section in the example above.
IMPORTANT: Be sure to make this file strictly JSON compatible. Adding a trailing comma at the end of a list will cause a JSON parse error. This is not Python!
params
All the parameters and their values will be passed into CopyPayload through **kwargs, just like when we're executing OSC from the CLI.
hooks
This section is a dictionary mapping hook points to SQL files.
The key can be any supported hook point. A hook point is the name of a function in CopyPayload prefixed with before_ or after_. For example, "before_init_connection": "before.sql" stands for "run all the SQLs from before.sql before init_connection() is called in CopyPayload".
See also: Use Hooks.
The value should be the name of a SQL file located in the same directory, which will be executed at this hook point. When running integration tests, files with a .sql extension will be executed as a SQLHook from dba/osc/lib/hook.
expect_result
If your test case is meant to test an Exception being raised from OSC, then you can put something like "err_key": "NO_PK_EXIST" into this section. Otherwise you can leave this section blank, which means you're expecting the schema change to finish successfully. The string you put for err_key should be one of the error keys defined under osc/lib/errors.py.
There are two different types of SQL you can put inside .sql files.
NON-SELECT
If the queries in the file are DML, DDL or any other SQL that doesn't start with SELECT, they will be executed directly at the given hook execution point.
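For example, a hook file that prepares test data might look like the sketch below (a hypothetical before.sql; the table and column names are assumptions for illustration):
-- Hypothetical before.sql: create and seed the table the schema change will run against
CREATE TABLE `table1` (
    `id` INT NOT NULL,
    `data` VARCHAR(10),
    PRIMARY KEY (`id`)
) ENGINE=InnoDB;
INSERT INTO `table1` VALUES (1, 'b'), (2, 'c');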
SELECT
If the file starts with a SELECT query, then all the remaining content after the SELECT will be treated as a result set assertion. For example, the following content in an a.sql file means: we expect a single row with id=2, data=c to exist in the table when running the query SELECT id, data FROM table1 WHERE id = 2
SELECT `id`, `data` FROM `table1` WHERE `id` = 2;
2 c
NOTE: Be sure not to use tab auto-expand in your text editor. The column delimiter must be a literal tab ("\t").
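Assuming the one-row-per-line format above extends to larger result sets (an assumption; the example here shows a single row), a multi-row assertion against the hypothetical table1 would look like this, with columns separated by a literal tab:
SELECT `id`, `data` FROM `table1` ORDER BY `id`;
1	b
2	c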
Most of the time you will need the following files for a normal integration test:
drwxr-xr-x. 31 832 Jan 11 14:45 ..
-rw-r--r--. 1 359 Dec 20 10:33 before.sql
-rw-r--r--. 1 529 Dec 20 10:33 config.json
-rw-r--r--. 1 31 Dec 20 10:33 drop.sql
-rw-r--r--. 1 250 Dec 20 10:33 new.sql
- before.sql: Executed at "before_init_connection", to make sure the table exists before we run a schema change against it.
- drop.sql: Executed at "after_run_ddl", to make sure the test table we've created is dropped after running the test.
- new.sql: Contains the desired schema (see the sketch below).
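As an illustration, new.sql and drop.sql for the hypothetical table1 above might look like this (the added column is an assumption for illustration; new.sql is the file referenced from ddl_file_list):
-- Hypothetical new.sql: the desired schema after the change
CREATE TABLE `table1` (
    `id` INT NOT NULL,
    `data` VARCHAR(10),
    `extra` BIGINT,
    PRIMARY KEY (`id`)
) ENGINE=InnoDB;

-- Hypothetical drop.sql: clean up the test table after the run
DROP TABLE IF EXISTS `table1`;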