Inside the Sausage Factory: PART 19 (Coding the Data-dependent Unit Tests)

In our last post we leveraged NDbUnit to create Test Fixture Setup and Teardown methods that load our database with data and put it into a known state, so that we could develop unit tests for our Data Access Layer (DAL) against it.  Now we are (finally) ready to code the actual tests against the database in order to validate that our Data Access Layer is reliable before we proceed to build any other application layers atop an uncertain and unproven foundation.

Recall that as it exists right now, we have a pretty simple DAL made up largely of GetAllXXX()-style functions that return collections of our main domain objects for the sake of populating the search controls in our User Interface.  In fact, it’s so simple that some readers may be asking "why even bother to test this at all?".  The answer, dear reader, is simple: without unit tests, you have no way of actually knowing for certain that your code does what you intended.  "It compiles" only means your syntax is right; "it passes my unit tests" means your intent has been achieved.

Since most of these GetAllXXX()-style functions behave largely identically, we can assume that the tests for them will also be quite similar.  For each of these types of functions, we can effectively test for two criteria:

  1. Is the function returning all of the expected records correctly?
  2. Is the function returning the records in sorted order? (recall that these GetAllXXX()-style functions also ask for the records to be sorted when returned from the DAL)

Coding our Tests

Let’s start with our GetAllOffices() method.  Our first unit test will simply check that all of the expected records are returned from the call to this function by counting the number of Offices returned.  Since we are in complete charge of the data in the database (having pre-loaded known data into it using NDbUnit in our TestFixtureSetUp method), we know with certainty the number of Offices defined in the Office table of our database and thus what to expect back from the function under test:

        [Test]
        public void GetAllOfficesCountTest()
        {
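            //re-apply the NDbUnit test data so the database is back in its known state before this test runs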
            _sql.PerformDbOperation(NDbUnit.Core.DbOperationFlag.Refresh);

            DataProvider provider = new DataProvider();

            IList<Office> offices = provider.GetAllOffices();

            Assert.AreEqual(6, offices.Count);
        }

Since we know there are six offices in our test data, we can just Assert() that the returned collection has six objects.

Our next test ensures that the six offices returned are in fact in the correct sorted order.  This test is a bit more complex, but still quite straightforward: first we retrieve all Office records from the database, then we loop through them and test each one to ensure that it’s in the right sort order.  As we loop, we keep track of the number of items not in the correct sort order, and at the end we simply Assert() that the count of these errors (the mis-sorted items) is equal to zero.  Done this way, if the Assert() fails then the failure message can report the number of items that were incorrectly sorted:

        [Test]
        public void GetAllOfficesSortedTest()
        {
            _sql.PerformDbOperation(NDbUnit.Core.DbOperationFlag.Refresh);

            DataProvider provider = new DataProvider();

            IList<Office> offices = provider.GetAllOffices();

            string compareName = string.Empty;
            int errorCount = 0;

            foreach (Office o in offices)
            {
                string currentName = o.Name;

                if (currentName.CompareTo(compareName) < 0)
                {
                    //if we get here, the current name sorts before the previous one,
                    //so something didn't sort correctly; increment the error counter
                    errorCount += 1;
                }

                compareName = currentName;
            }

            //zero mis-sorted items means the collection came back in the correct order
            Assert.AreEqual(0, errorCount, "Number returned is count of incorrectly sorted items");
        }

Now we need to duplicate these same tests for the other GetAllXXX()-style methods, testing in each case for the expected number of records to be returned from the corresponding table (I won’t walk through each of them here).
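As an illustration, here is a rough sketch of what the count test for one of the other collections might look like; the GetAllSkills() method name, the Skill type, and the expected count of 10 are assumptions for the sake of the example, so substitute the actual DAL methods and row counts from your own pre-loaded test data:

        [Test]
        public void GetAllSkillsCountTest()
        {
            _sql.PerformDbOperation(NDbUnit.Core.DbOperationFlag.Refresh);

            DataProvider provider = new DataProvider();

            //GetAllSkills() and the count of 10 are hypothetical; use your own
            //DAL method and the known number of rows in the corresponding table
            IList<Skill> skills = provider.GetAllSkills();

            Assert.AreEqual(10, skills.Count);
        }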

Our next step is to consider a test for the GetPersonWithFullProfile() method.  This method takes the Id of the Person to retrieve and returns the corresponding Person object.  We can test this by passing in the Id of a record we know to be in the database and then inspecting the returned object to ensure it’s the correct one (in this case by checking the Lastname property of the object), like so:

        [Test]
        public void GetPersonWithFullProfileTest()
        {
            _sql.PerformDbOperation(NDbUnit.Core.DbOperationFlag.Refresh);

            DataProvider provider = new DataProvider();

            Person person = provider.GetPersonWithFullProfile(2);

            Assert.AreEqual("Bohlen", person.Lastname);
        }

Our last test is for the more complex function, GetDistinctPersonsBySearchCriteria(), that actually does the searching.  To test this we can code a unit test that passes in a populated SearchCriteria object and then inspects the returned collection of people to ensure that the proper records come back from the search.  Since there are many combinations of SearchCriteria values that can be used to search for a person, and they all need to be tested to be certain they behave as expected, it’s not possible to devise a single unit test that covers all of these possibilities.  Instead, we need multiple tests that will ferret out any logic problems in our searching process.  As an example, this is the test for ensuring that a query by office id is functioning correctly:

        [Test]
        public void GetDistinctPersonsBySearchCriteriaUsingOfficeIdTest()
        {
            _sql.PerformDbOperation(NDbUnit.Core.DbOperationFlag.Refresh);

            DataProvider provider = new DataProvider();

            SearchCriteria searchCriteria = new SearchCriteria();

            searchCriteria.OfficeId = 3;

            IList<PersonSkillSearchView> results = provider.GetDistinctPersonsBySearchCriteria(searchCriteria);

            Assert.AreEqual(1, results.Count);
            Assert.AreEqual("Bohlen", results[0].Lastname);
        }

In this test, we populate a new SearchCriteria object with the desired OfficeId, ask the DAL to do the search for us based on that criteria, and then inspect the returned collection to ensure that both the number of objects and the objects themselves are as expected.  This breaks the generally accepted guideline of having only a single Assert() in each unit test, but in this case it’s needed to be 100% certain that the method is functioning correctly.  The last steps are to code other tests for this same DAL method, covering all foreseeable combinations of SearchCriteria values and also remembering to test for edge cases with tests like…

GetDistinctPersonsBySearchCriteriaUsingEmptyCriteriaTest() <– should return ALL distinct person records in the system

GetDistinctPersonsBySearchCriteriaUsingMatchlessCriteriaTest() <– should return NO records (intentional no-match criteria passed)
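For illustration, a sketch of the matchless-criteria test might look something like the following; the OfficeId value of 999 is only an assumption, standing in for any criteria value that is guaranteed not to match anything in the pre-loaded test data:

        [Test]
        public void GetDistinctPersonsBySearchCriteriaUsingMatchlessCriteriaTest()
        {
            _sql.PerformDbOperation(NDbUnit.Core.DbOperationFlag.Refresh);

            DataProvider provider = new DataProvider();

            SearchCriteria searchCriteria = new SearchCriteria();

            //999 is assumed to be an OfficeId that does not exist in the test data
            searchCriteria.OfficeId = 999;

            IList<PersonSkillSearchView> results = provider.GetDistinctPersonsBySearchCriteria(searchCriteria);

            //an intentionally matchless criteria value should return no records at all
            Assert.AreEqual(0, results.Count);
        }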

Coverage Analysis

Once all this is done, we can use another extremely useful capability of our TestDriven.NET unit test-runner: the test-with-coverage option.  Developing unit tests is only useful if one has a way of measuring how much of one’s code the tests are actually exercising.  Otherwise, unit tests can easily lead to a false sense of security if, unbeknownst to the developer, they are only exercising a very small percentage of the code.  While 100% code coverage is certainly desirable, it’s often not realistic; even when it can be achieved, going from 80% coverage to 100% coverage can mean a doubling or even tripling of the complexity of one’s tests (and the related effort).

As such, unit test code coverage always represents a trade-off between the desired coverage metric and the effort required to increase it.  Personally, I feel that somewhere around 80-85% a point of diminishing returns is reached, so I tend to use that as a target for my own work.  Note (of course) that like any other metric, it’s always possible to ‘game the system’ and hit the 85% coverage target by testing trivial sections of your code while ignoring the more complex methods; like any other metric, you get back only what you put into it, so I recommend always taking this measurement with a grain of salt.

TestDriven.NET’s test-with-coverage option runs the desired tests, integrates the test run with the NCover utility that measures unit test execution coverage, and then displays the results in the NCoverExplorer interface for review and analysis.  Following is the NCoverExplorer screenshot after running the tests we have so far…

[Screenshot: NCoverExplorer coverage results for the test run]

Note that our assembly-under-test, SkillPortal.DataAccess, shows 79% coverage in the display: not bad for a first set of tests and pretty close to our target of 85%.  Notice, though, that the reason the entire assembly sits at 79% is that not all of the properties in the SearchCriteria class have been fully exercised; our actual DataProvider class (the one really doing the work we are interested in testing) is all the way up at 100% coverage, well beyond our target of 85%.

NCoverExplorer offers a wealth of other detailed information about our tests and their coverage, up to and including the lower-right pane of the display above that shows exactly which lines of code in the tested classes were visited and which were not (the blue vs. red highlighting in the source code), allowing us to get as granular as we want when inspecting the results.  Since NCover and NCoverExplorer are the tools that tell us how many more unit tests (and for what code!) still remain to be written, the integration of these tools into TestDriven.NET was a major factor in its selection as our unit test-runner.

Now that we have a working suite of unit tests for our Data Access Layer, we are free to move on to develop some of the higher layers in our application, confident in the fact that our DAL is functional and behaving as expected.

More on that next time~!