#competitive-programming #test-cases #command-line #competitive #testing #command-line-tool #problem

app cp-tester

A command line tool to test competitive programming problems quickly and locally

5 stable releases

1.0.4 Oct 10, 2023
1.0.2 Sep 21, 2023

#1005 in Command line utilities

MIT license

78KB
1.5K SLoC

Competitive Programming Tester

CLI tool for easy local testing of competitive programming problems.
It lets you run tests with a terminal command instead of having to re-upload your solution to a website every time.
This is my first Rust project, so please send any feedback to swaminathanalok@gmail.com.

Supports C, C++, Java, and Python. Note that C, Java, and Python use the versions installed on your machine (this could change in the future if people want it).

At the moment it works with USACO problems and lets you download test cases given the problem link (the problem page, not the test-data download link) or the problem ID.
Other problems must be provided as a zip file that extracts directly to test cases in a single directory, where corresponding input and output files share a name and differ only in extension. You can also arrange test cases yourself to match this layout and add them from a folder.
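To illustrate the expected folder layout, here is a minimal sketch (not part of cp-tester) that pairs input and output files by shared file stem, assuming the default in/out extensions:

```python
from pathlib import Path

def pair_cases(folder, in_ext="in", out_ext="out"):
    """Pair test cases by file stem: 10.in matches 10.out.
    Returns the sorted stems that have both an input and an output file."""
    folder = Path(folder)
    inputs = {p.stem for p in folder.glob(f"*.{in_ext}")}
    outputs = {p.stem for p in folder.glob(f"*.{out_ext}")}
    return sorted(inputs & outputs)
```

A folder containing 1.in, 1.out, and an unmatched 2.in would yield only case "1".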

Installation

Installation (requires Rust):

cargo install cp-tester

Installation without Cargo (coming soon, once I set up a CI action for it):
 There should be a release with executables for Windows, Linux, and macOS, but it is up to you to place them in the right directory so they can be run as a command.

I might add an install script later.

Future Plans

 Ability to download sample cases from USACO, Codeforces, and AtCoder.
 Support for AtCoder cases.
 Support for submission of problems (Not during competitions).
 Ability to run a test once by downloading it in the run command then deleting it.

Features (most of this information can also be found by using --help):

cp-tester add - Adds tests

Adding tests:
 All test cases should be at the same directory level and use different extensions for input and output. For example, case 10 would be 10.in and 10.out.
--link takes a link to a zip file that must extract directly to test cases
--folder takes a path to a folder (if you get an error about writing test data when adding a test from a folder, try rerunning it; the error showed up for me once but I could never reproduce it, so I am not sure how to fix it)
--usaco-link takes a link to a USACO problem (the problem page, not the test-data link)
--usaco-id takes a USACO problem ID (the cpid=ID at the end of the link)
Extensions (do not include the leading dot):
--input-extension takes the input extension, used to find the test cases and for file IO if the test requires it (default: in)
--output-extension takes the output extension, used to find the test cases and for file IO if the test requires it (default: out)
Naming:
 Default name:
  For --link it is the name of the zip file that is downloaded
  For --folder it is the name of the folder
  For --usaco-link and --usaco-id the name is formatted <problem_name>_<division>_<competition><year>, such as find_and_replace_silver_jan23
--name takes a name that overrides the default name
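For illustration, the USACO default-name format described above could be produced like this. This is a sketch of my reading of the format, not cp-tester's actual code; the function name and the exact normalization are assumptions:

```python
def default_usaco_name(problem_name, division, competition, year):
    """Build <problem_name>_<division>_<competition><year>,
    e.g. find_and_replace_silver_jan23 (hypothetical helper)."""
    slug = problem_name.lower().replace(" ", "_")
    return f"{slug}_{division.lower()}_{competition.lower()}{year}"
```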
IO:
 A test stores two values, input_io and output_io, which can each be either STDIN/STDOUT respectively or a file name.
 The default values for these fields are STDIN and STDOUT, unless you are downloading a USACO problem using the specific flags, in which case they are inferred.
 This does unfortunately mean that if the test data uses different extensions than the configured input and output extensions, you will have to modify the test data first, but this isn't something I have seen often.

cp-tester config - Interaction with the config

This is the default config file (stored in dirs::config_local_dir()/cp-tester):

{
  "default_cpp_ver": 17,
  "unicode_output": false,
  "default_timeout": 5000,
  "gcc_flags": {
    "-lm": "",
    "-O2": ""
  },
  "gpp_flags": {
    "-lm": "",
    "-O2": ""
  },
  "java_flags": {},
  "javac_flags": {}
}

print Prints the config
print-default Prints the default config
reset Resets the config to default
unicode-output determines whether the test results shown after running a test on a file are "PASSED" and "FAILED" or "✅" and "❌".

There are subcommands to edit each value in the config, including adding and removing flags. If you pass two values to a set-flag subcommand, they are joined with an "="; if you pass one, only that value is used. So if you want to, for example, pass -Xss4m to increase the Java stack size, just use cp-tester config set-java-flag -Xss4m.
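The flag storage described above (a map from flag name to optional value, as in the JSON config) can be sketched as follows. This is my reading of the behavior, not cp-tester's source, and the function name is hypothetical:

```python
def render_flags(flags):
    """Turn the config's flag map into command-line arguments.
    An empty value emits the key alone; otherwise key and value
    are joined with '='."""
    args = []
    for key, value in flags.items():
        args.append(key if value == "" else f"{key}={value}")
    return args
```

With the default config, the gcc flag map {"-lm": "", "-O2": ""} would render as the arguments -lm -O2.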

cp-tester list - Lists tests

cp-tester list lists all test names in alphabetical order. --show-io shows IO data for the tests (default: false).
cp-tester list test <test> to list cases for a specific test.
--cases to list specific cases (comma-separated) (default: all)
--show-input and --show-output do what you expect; both are false by default, since for some tests the data can be very large.

cp-tester remove - Removes tests

cp-tester remove <test_name> removes the test with that name
--all to remove all cases (default: false)

cp-tester rename - Renames tests

cp-tester rename <old_name> <new_name> Renames test "old_name" to "new_name"

cp-tester run - Run test on a file

cp-tester run <name> --file <file> runs the test on the file. Valid file extensions are .c, .java, .py, and .cpp.
--cases to specify cases to run (comma-separated) (default: all cases)
--show-input to show input (default: false)
--compare-output to compare your program's output to the expected output (default: false)
--cpp-ver to specify the C++ version; defaults to the value in the config (default: 17)
--timeout timeout in milliseconds; defaults to the value in the config (default: 5000 ms)

Test storage

Tests are stored in dirs::data_local_dir(), and the names of the tests and their IO data are in test.json, while folders corresponding to the names are in the tests/ subdirectory. This allows for test names to be loaded without their data, so that a test is only loaded when necessary.
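The lazy-loading idea can be sketched as follows, based on the layout described above. The file layout is taken from the description; the function names are hypothetical, not cp-tester's actual API:

```python
import json
from pathlib import Path

def list_test_names(data_dir):
    """Read only test.json, so test names (and their IO settings)
    are available without touching the per-test folders."""
    meta = json.loads((Path(data_dir) / "test.json").read_text())
    return sorted(meta)

def case_dir(data_dir, name):
    """A test's case files live in tests/<name>/, read only on demand."""
    return Path(data_dir) / "tests" / name
```

Listing tests this way stays cheap even when individual tests hold megabytes of case data.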

Example usage:

You want to work on http://www.usaco.org/index.php?page=viewproblem2&cpid=991
Download the test cases:
cp-tester add --usaco-id 991
Run them on a file:
cp-tester run loan_repayment_silver_jan2020 --file path_to_solution.cpp
Result is:

Test Case 1: 1 milliseconds
PASSED
Test Case 2: 47 milliseconds
PASSED
Test Case 3: 39 milliseconds
PASSED
Test Case 4: 1 milliseconds
PASSED
Test Case 5: 1 milliseconds
PASSED
Test Case 6: 4 milliseconds
PASSED
Test Case 7: 2 milliseconds
PASSED
Test Case 8: 2 milliseconds
PASSED
Test Case 9: 2 milliseconds
PASSED
Test Case 10: 4 milliseconds
PASSED
Test Case 11: 13 milliseconds
PASSED

Passed all the cases, so you can now move on!

Dependencies

~13–27MB
~445K SLoC