Unit test rework #122
base: main
Conversation
This uses Ubuntu 22.04 for the GitHub Actions runner and the Docker image.
…sts. Note that it is important for the unit tests to pass the `got` argument first and the `expected` argument second in an `is` test. Test2::Tools::Compare converts the second (`expected`) argument into a Test2::Tools comparison object, which is important for strict checks. Also, the incomplete t/db/test_course_user.pl file has been removed.
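For example, the argument order matters in a Test2 `is` check. A minimal sketch (the hash contents here are hypothetical, not from the actual test suite):

```perl
use Test2::V0;

# `got` comes first; the second (`expected`) argument is converted by
# Test2::Tools::Compare into a comparison object, so swapping the two
# changes which side drives the strict comparison.
my $got = { username => 'bart', is_admin => 0 };
is($got, { username => 'bart', is_admin => 0 }, 'user hash matches exactly');

done_testing;
```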
The unit tests no longer use the conf/webwork3-test.dist.yml or conf/webwork3-test.yml configuration files. Instead, the configuration is hardcoded into the unit test runners defined in t/lib/DBSubtest.pm and bin/dev_scripts/test_pinia_stores.pl. The point is that unit tests should not depend on a modifiable configuration. The conf/webwork3-test.dist.yml file has been deleted. You can copy conf/webwork3.dist.yml to conf/webwork3-dev.yml for development use. For morbo or the `bin/dev_scripts/build_sample_db.pl` script (previously `t/db/build_db.pl`), the file conf/webwork3-dev.yml will be used if it exists.

The sample data for the database is now in JSON files instead of CSV files. This allows for structured data with valid types, which means that all of the contortions previously needed to convert the CSV data into structured data with valid types are no longer necessary. In Perl, decode_json does what is needed, and in JavaScript the JSON files can just be imported and TypeScript correctly infers the types. Note that eventually it would be nice to do what is done in t/db/001_courses.t for more of the tests. That test no longer uses the sample data files at all. Most of the tests really don't use much from the sample data, and the tests can be made much more precise and transparent this way.

The unit tests each deploy a temporary database instance and load the sample data automatically. In addition, the tests can be run with all three supported databases (SQLite, PostgreSQL, and MySQL). By default the tests are only run with an in-memory SQLite instance. If the environment variable WW3_TEST_ALL_DBS is set, then each test will instead be run three times, once with each of the supported databases. For PostgreSQL the Test::PostgreSQL package is used to create a temporary PostgreSQL instance. For MySQL a locally modified version of the Test::mysqld package (t/lib/TestMysqld.pm) is used.
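Loading the JSON sample data from Perl is now essentially a one-liner. A sketch using Mojolicious utilities (the file path here is hypothetical):

```perl
use Mojo::File qw(path);
use Mojo::JSON qw(decode_json);

# Read a JSON sample-data file into a Perl data structure. Numbers and
# booleans come through with their real types; no CSV munging required.
my $courses = decode_json(path('t/db/sample_data/courses.json')->slurp);
```

On the TypeScript side the same files can simply be imported, and the types are inferred from the JSON structure.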
The upstream Test::mysqld package has issues in that it must be run as root, while on the other hand the Test::PostgreSQL package cannot be run as root. Note that for these tests to work you must have PostgreSQL and MySQL (or MariaDB) installed. However, they do not need to be running on the system. Since the database instances are temporary and each test uses its own instance, there is no need to be concerned with putting the database back the way it was. It will be deleted when the test is completed anyway.

The bin/dev_scripts/test_pinia_stores bash script has been replaced with the bin/dev_scripts/test_pinia_stores.pl Perl script. That script runs a Mojolicious server daemon and creates a webwork3 api app instance with a provided test configuration (no longer in the lib/WeBWorK3.pm file), and then runs the jest pinia store tests. The server port can be configured either with a command line option or via the WW3_TEST_PORT environment variable. The test runs by default only with the in-memory SQLite database, but if the -ta (or --test-all-dbs) option is given or the WW3_TEST_ALL_DBS environment variable is set, then the test will run three times, once with each of the three databases. Note that the GitHub action runs all unit tests with all databases. Previously the pinia store tests were not run by the action, but now they are (and with all three databases).

Converting the CSV files to JSON files and working with that data has illuminated some big problems that need to be fixed. First, the data returned by the webwork3 api methods currently has an extremely poor structure. The data returned by the api methods should not be flattened in the way that it currently is. This is the biggest and most pressing issue. Both the api and the client side stores will need to be fixed. Second, the "get all" methods need to be removed. These are absolutely unnecessary, shouldn't exist, and should not be tested.
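The per-database dispatch driven by WW3_TEST_ALL_DBS can be pictured roughly like this. This is a hypothetical sketch, not the actual code in t/lib/DBSubtest.pm:

```perl
use Test2::V0;

# By default only the in-memory SQLite instance is exercised; setting
# WW3_TEST_ALL_DBS runs each test once per supported database.
my @dbs = $ENV{WW3_TEST_ALL_DBS} ? qw(sqlite postgres mysql) : qw(sqlite);

for my $db (@dbs) {
    subtest "courses ($db)" => sub {
        # ... deploy a fresh temporary $db instance, load the JSON
        # sample data, and run the actual checks against it ...
    };
}
```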
For example, there is no reason to get all problems in the database. Problems should only be obtained in a context in which they have meaning, that is, in the context of the set they belong to. It may be that these methods were added so that the unit tests could check that things were put back the way they were in the database. If so, they still shouldn't have been in the code. These methods were all essentially just the DBIx::Class search method, which could have been used directly. In addition, those checks are no longer needed with this change. The getAllProblemSets method was another of these and has been removed.

Note that the unit tests now run on all pushes. This is useful before making a pull request to see that the unit tests pass. The code coverage is only uploaded when a pull request is merged into main in any case, so there is no reason not to have these run.
Everything runs well (but slowly) with this. It seems like with every test run, the db is started up and loaded with data. It is nice that each time there is fresh data available and you know the state of the database; I guess with this method you don't need to check that the database is restored to its previous state. Is there any way of keeping the database up for multiple tests? I'm guessing that things would run a lot faster (if that matters). I think this could go in and we can rebase other branches off this--that'll take some work. Clearly all of the tests have been changed, but there are only some minor cosmetic changes to any of the core code. I tested the ww3 interface and it seems to work as before.
Yes, it will run slower than before. It is doing a lot more testing. Yes, every test starts with a fresh new database, so there is no need to restore the database to some previous state. There is a way to keep the database up for multiple tests, but that would revert back to the previous approach, which is not what is desired. The point of unit tests is not to be fast; it is to check that things are working correctly.
You have to realize that the old approach of trying to put the database back into its original state is unreliable and unmaintainable. That is the point of each test using a fresh new database built from scratch. It makes the tests take longer, but it is worth it.
This makes sense after thinking about it more. And the time slowdown shouldn't be too bad--one can run the individual tests fairly quickly on a single database and be patient when running all of them. It's also fantastic that all 3 databases can be tested easily. That should catch some of those errors that popped up before.
You might be interested in using fixtures for mocking database tests. In a dbic environment, we use Test::DBIx::Class and DBIx::Class::Fixtures, something like this:

```perl
use Test::DBIx::Class {
    schema_class => 'Something::Something::DB'
};

sub connect {
    ...
    DBIx::Class::Fixtures->new( { options => ... } )->populate( {...} );
}
```

I haven't been able to find an example of a non-dbic setup, but maybe something like Test::mysqld or even DBIx::Class::Migration might give you the database tests you're looking for without the overhead of creating the full-blown database every time. Just a thought.
I am just quickly putting this in for you to see. This commentary needs work. See `git log` for more details.