This module is for formatting and writing unit-tests in Python. The general format is as follows:

1. Use start() to start a test and give it, as an argument, the name of the test
2. Use whatever check functions are relevant to test the run
3. Use finish() to signal the end of the test
4. Repeat stages 1-3 for as many tests as you want to run in the file
5. Use print_results_and_exit() to print the number of tests and assertions that passed/failed in the
   correct format, before exiting with 0 if all tests passed or with 1 if there was a failed test

In addition, you may want to use the 'info' functions in this module to add more detailed
messages in case of a failed check. A minimal usage sketch follows.
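The sketch below shows the general format described above. The import path (from rspy import test) and the values being checked are assumptions for illustration; they are not specified in this excerpt.

# Minimal test-file sketch; import path and checked values are illustrative assumptions
from rspy import test

test.start( "arithmetic sanity" )
test.check_equal( 2 + 2, 4 )
test.finish()

test.start( "list comparison" )
test.check_equal_lists( [1, 2, 3], [1, 2, 3] )
test.finish()

test.print_results_and_exit()   # prints the summary and exits with 0 or 1

The module source excerpt continues below.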
import os, sys, subprocess, traceback, platform

n_failed_assertions = 0
test_in_progress = False
def set_env_vars( env_vars ):

We want certain environment variables set when we get here. We assume they're not set.
However, it is impossible to change the current running environment so that it sees them. Instead, we rerun
ourselves in a child process that inherits the environment we set.
To do this, we depend on a specific argument in sys.argv that tells us this is the rerun (meaning the child
process). When we see it, we assume the variables are set and don't do anything else.
For this to work well, the environment variable requirement (the set_env_vars call) should appear as one of the
first lines of the test.
:param env_vars: A dictionary where the keys are the names of the environment variables and the values are the
    wanted values, in string form (environment variables must be strings)

if sys.argv[-1] != 'rerun':
    log.d( 'environment variables needed:', env_vars )
    for env_var, val in env_vars.items():
        os.environ[env_var] = val
    cmd = [sys.executable]
    if 'site' not in sys.modules:
        ...   # (source lines elided in this excerpt)
    log.d( 'running:', cmd )
    p = subprocess.run( cmd, stderr=subprocess.PIPE, universal_newlines=True )
    sys.exit( p.returncode )
log.d( 'rerun detected' )
sys.argv = sys.argv[:-1]
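A usage sketch: because set_env_vars() may re-run the whole script in a child process, the call should come before any test starts. The environment variable name and value below are hypothetical, and the import path is the same assumption as in the first sketch.

# Hypothetical variable name and value; both must be strings
from rspy import test

test.set_env_vars( { 'MY_FEATURE_FLAG': '1' } )

# ... the rest of the test file runs with MY_FEATURE_FLAG set in its environment ...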
def find_first_device_or_exit():

:return: The first device that was found; if no device is found the test is skipped. That way we can still run
    the unit-tests when no device is connected and not fail the tests that check a connected device.

import pyrealsense2 as rs
...   # (source lines elided in this excerpt)
if not c.devices.size():
    print( "No device found, skipping test" )
def find_devices_by_product_line_or_exit( product_line ):

:param product_line: The product line of the wanted devices
:return: A list of devices of the specified product line that were found; if no device is found the test is
    skipped. That way we can still run the unit-tests when no device is connected and not fail the tests that
    check a connected device.

import pyrealsense2 as rs
...   # (source lines elided in this excerpt)
devices_list = c.query_devices( product_line )
if devices_list.size() == 0:
    print( "No device of the", product_line, "product line was found; skipping test" )
    ...
log.d( 'found', devices_list.size(), product_line, 'devices:', [dev for dev in devices_list] )
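A usage sketch; rs.product_line.D400 is an assumption about how pyrealsense2 names product lines, and the check itself is illustrative.

import pyrealsense2 as rs
from rspy import test

# Assumed product-line enum; adjust to whatever product line the test targets
devices = test.find_devices_by_product_line_or_exit( rs.product_line.D400 )

for dev in devices:
    test.start( "device has a non-empty name:", dev )
    test.check( dev.get_info( rs.camera_info.name ) != '' )
    test.finish()

test.print_results_and_exit()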
Function for printing the current call stack; used when an assertion fails:

print( 'Traceback (most recent call last):' )
stack = traceback.format_stack()
...   # (source lines elided in this excerpt)
for line in reversed( stack ):
    print( line, end = '' )
The following functions are for asserting test cases:
The check family of functions tests an expression and continues the test whether the assertion succeeded or
failed. The require family is equivalent, but execution is aborted if the assertion fails. In this module, the
require family is used by passing abort_if_failed=True to the check functions.

Function for when a check fails:

global n_failed_assertions, test_failed
n_failed_assertions += 1
...   # (source lines elided in this excerpt)
log.e( "Aborting test" )

def check( exp, abort_if_failed = False ):

Basic function for asserting expressions.
:param exp: An expression to be asserted; if false, the assertion failed
:param abort_if_failed: If True and the assertion failed, the test will be aborted
:return: True if the assertion passed, False otherwise

...
print( " check failed; received", exp )
def check_equal( result, expected, abort_if_failed = False ):

Used for asserting that a variable has the expected value.
:param result: The actual value of a variable
:param expected: The expected value of the variable
:param abort_if_failed: If True and the assertion failed, the test will be aborted
:return: True if the assertion passed, False otherwise

if type(expected) == list:
    print( "check_equal should not be used for lists. Use check_equal_lists instead" )
    ...
if result != expected:
    ...
    print( " result :", result )
    print( " expected:", expected )
def unreachable( abort_if_failed = False ):

Used to assert that a certain section of code (e.g. an if block) is not reached.
:param abort_if_failed: If True and this function is reached, the test will be aborted

check( False, abort_if_failed )
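A usage sketch (import path as before; the condition is illustrative):

from rspy import test

test.start( "error branch is never taken" )
x = abs( -5 )
if x < 0:
    test.unreachable()                   # fails the test if this branch is ever entered
test.finish()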
def unexpected_exception():

Used to assert that an except block is not reached. It's different from unreachable because it expects
to be in an except block, and prints the stack of the error rather than the call-stack of this function.

traceback.print_exc( file = sys.stdout )
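A usage sketch (import path as before):

from rspy import test

test.start( "parsing should not raise" )
try:
    test.check_equal( int( "42" ), 42 )
except Exception:
    test.unexpected_exception()          # records a failure and prints the exception's stack
test.finish()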
def check_equal_lists( result, expected, abort_if_failed = False ):

Used to assert that two lists are identical: same length, same elements, and the same ordering. Unlike a plain
== comparison, which only yields True or False, a failure here reports the size mismatch or which elements differ.
:param result: The actual list
:param expected: The expected list
:param abort_if_failed: If True and the assertion failed, the test will be aborted
:return: True if the assertion passed, False otherwise

if len(result) != len(expected):
    ...
    print( "Check equal lists failed due to lists of different sizes:" )
    print( "The result list has", len(result), "elements, but the expected list has", len(expected), "elements" )
    ...
for res, exp in zip(result, expected):
    ...   # (source lines elided; the index variable i is maintained there)
    print( "Check equal lists failed due to unequal elements:" )
    print( "The element at index", i, "in both lists was not equal" )
    ...
print( " result list :", result )
print( " expected list:", expected )
def check_exception( exception, expected_type, expected_msg = None, abort_if_failed = False ):

Used to assert that a certain type of exception was raised; to be placed inside the except block.
:param exception: The exception that was raised
:param expected_type: The expected type of the exception
:param expected_msg: The expected message in the exception
:param abort_if_failed: If True and the assertion failed, the test will be aborted
:return: True if the assertion passed, False otherwise

if type(exception) != expected_type:
    failed = [ " raised exception was of type", type(exception), "\n but expected type", expected_type ]
elif expected_msg and str(exception) != expected_msg:
    failed = [ " exception message:", str(exception), "\n but we expected:", expected_msg ]
def check_frame_drops( frame, previous_frame_number, allowed_drops = 1 ):

Used for checking frame drops while streaming.
:param frame: Current frame being checked
:param previous_frame_number: Number of the previous frame
:param allowed_drops: Maximum number of frame drops we accept
:return: False if too many frames were dropped or frames were out of order, True otherwise

global test_in_progress
if not test_in_progress:
    ...
frame_number = frame.get_frame_number()
...
if previous_frame_number > 0:
    dropped_frames = frame_number - (previous_frame_number + 1)
    if dropped_frames > allowed_drops:
        print( dropped_frames, "frame(s) starting from frame", previous_frame_number + 1, "were dropped" )
        ...
    elif dropped_frames < 0:
        print( "Frames repeated or out of order. Got frame", frame_number, "after frame",
               previous_frame_number )
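A streaming sketch; the pipeline setup is an assumption about the pyrealsense2 API and requires a connected, depth-capable device (otherwise the run is skipped by find_first_device_or_exit).

import pyrealsense2 as rs
from rspy import test

test.find_first_device_or_exit()

test.start( "no excessive frame drops over 60 frames" )
pipeline = rs.pipeline()                 # assumed default pipeline configuration
pipeline.start()
previous_frame_number = -1
for _ in range( 60 ):
    frames = pipeline.wait_for_frames()
    depth = frames.get_depth_frame()
    test.check_frame_drops( depth, previous_frame_number, allowed_drops = 3 )
    previous_frame_number = depth.get_frame_number()
pipeline.stop()
test.finish()

test.print_results_and_exit()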
Class Information: represents the information stored in the test_info dictionary.

def info( name, value, persistent = False ):

This function is used to store additional information to print in case of a failed test. This information is
erased after the next check. The information is stored in the test_info dictionary; keys are names (strings)
and the items are instances of the Information class.
If information with the given name is already stored, it will be replaced.
:param name: The name of the variable
:param value: The value this variable stores
:param persistent: If this parameter is True, the information stored will be kept after the following check
    and will only be erased at the end of the test (or when reset_info is called with True)

def reset_info( persistent = False ):

Erases the stored information.
:param persistent: If this parameter is True, even the persistent information will be erased

for name, information in test_info.items():
    if not information.persistent:
        ...

Printing the stored information:

print( "Printing information" )
for name, information in test_info.items():
    print( "Name:", name, " value:", information.value )
Function for manually failing a test, in case you want a specific test that does not fit any check function.

def check_test_in_progress( in_progress = True ):

global test_in_progress
if test_in_progress != in_progress:
    ...   # (the branching between these two errors is elided in this excerpt)
    raise RuntimeError( "test case is already running" )
    ...
    raise RuntimeError( "no test case is running" )
def start( *test_name ):

Used at the beginning of each test to reset the global variables.
:param test_name: Any number of arguments that, combined, give the name of this test

global n_tests, test_failed, test_in_progress
...
test_in_progress = True

def finish():

Used at the end of each test to check if it passed and print the answer.

global test_failed, n_failed_tests, test_in_progress
...
test_in_progress = False

Separator, for use only in-between test cases: this will separate them in some visual way so as
to be easier to differentiate.

def print_results_and_exit():

Used to print the results of the tests in the file. The format has to agree with the expected format in
check_log() in run-unit-tests and with the C++ format used by Catch.

global n_assertions, n_tests, n_failed_assertions, n_failed_tests
...
passed = n_assertions - n_failed_assertions
print( "test cases:", n_tests, "|", n_failed_tests, "failed" )
print( "assertions:", n_assertions, "|", passed, "passed |", n_failed_assertions, "failed" )
...
print( "All tests passed (" + str(n_assertions) + " assertions in " + str(n_tests) + " test cases)" )
Function index:

def check_frame_drops(frame, previous_frame_number, allowed_drops=1)
def find_devices_by_product_line_or_exit(product_line)
def check_exception(exception, expected_type, expected_msg=None, abort_if_failed=False)
def print_results_and_exit()
def find_first_device_or_exit()
def unexpected_exception()
def unreachable(abort_if_failed=False)
def info(name, value, persistent=False)
def reset_info(persistent=False)
def check(exp, abort_if_failed=False)
def check_equal_lists(result, expected, abort_if_failed=False)
def check_equal(result, expected, abort_if_failed=False)
def set_env_vars(env_vars)
def check_test_in_progress(in_progress=True)