Inside the Sausage Factory: PART 25 (Arrested by an FxCop)

In my last post, I talked about the second part of the detour that derailed (albeit only temporarily) my progress on the SkillPortal project.  The first part of the detour took me on a journey that ended with my delivering an updated VAULT Package to the CIFactory project, and the second part ended with my extending the MbUnit NAnt task library to support filtering of unit tests by category, author, and namespace.  Part three of this rather long detour involved integrating FxCop into CIFactory.

Why We Adopt Microsoft-Recommended Coding Practices

At my company, we have adopted Microsoft-recommended coding practices for our .NET work for a few very good reasons:

  • Arguably, who’s in a better position to make recommendations about how to code in VB.NET or C# than Microsoft?  We never blindly follow Microsoft’s guidance on anything, and we deviate from its recommendations in some pretty significant areas at times, but it’s awfully hard to see how following rules like ‘camel-casing for variables’ or ‘Pascal-casing for method names’ could lead us down the path to ruin smile_teeth
  • There is tremendous guidance available publicly explaining these standards, and most developers are already more than casually familiar with them, which keeps the learning curve shallow and leads to more maintainable code over time (new developers don’t have to be handed a 200-page ‘coding practices’ manual when they start).
  • Why deviate from such a comprehensive body of publicly-available knowledge and practices unless there is a very good reason to do so?  Reference documentation, code samples, and, yes, tooling like FxCop are all readily available to support these conventions.
  • The architecture of our applications is where we want our code to help differentiate ourselves in the marketplace, not our style of casing our class members smile_tongue

To help us follow .NET best practices (admittedly as espoused by Microsoft), we leverage the free FxCop tool provided by Microsoft to analyze our MSIL assemblies and report on areas where we may be out of compliance with Microsoft’s own recommendations.  We encourage all our developers to run FxCop against their work to get rapid feedback on whether their code complies with naming conventions, security best practices, and so on, though we are not so fascist as to require a 100% clean FxCop report before anything is checked into the SCC system, as seems to be the case in some other shops I’ve heard of (I just don’t see that as being practical).

CruiseControl.NET (CCNET) has an FxCop task that makes it relatively trivial to run FxCop against your builds as part of the CI cycle.  As we switch over to CIFactory to assist us in popping out CI server instances at the drop of a hat smile_regular, it would be extremely useful to us if CIFactory could integrate FxCop into its builds as well.

Unfortunately, after a little research I discovered that there wasn’t a CIFactory ‘package’ already made for FxCop that I could just include in our CIFactory master templates.  Fortunately, it was relatively straightforward to build one after reviewing how some of the other available packages do their work.

Standing on the Shoulders of…someone!

Since I’m basically a lazy guy at heart (the term ‘lazy’ has a positive connotation here, in that it encourages me to build on the work of others rather than re-invent the wheel), my first thought when considering how to build a CIFactory package for FxCop was to puzzle out which of the existing CIFactory packages performs a function most like FxCop’s, and then inspect that package to see how it worked and whether there was anything worth building on.

I hit upon the NDepend package as my starting point, since (at least conceptually) it seemed to share much of the same function as my hypothetical planned FxCop package:

  • They both analyze aspects of your code
  • They both perform their analysis not on the source code, but instead on all the MSIL-compiled assemblies that are the output of the build (compile) process
  • They both produce a report in XML format that needs to find its way into the CIFactory summary report page
  • They both offer command-line-driven interaction

After closer inspection, I discovered that the NDepend package for CIFactory works more or less as follows:

  1. Copy all the build/compile output DLLs to a separate folder
  2. Dynamically build an NDepend project file that adds each of the binaries copied into the separate folder in step 1 to the analysis list
  3. Call NDepend from the command line, passing it the dynamically-created project file from step 2
  4. Capture the output from the NDepend analysis performed in step 3 into an XML file
  5. Merge that XML file into the main results XML file that forms the main CIFactory build summary page (a custom XSLT transform is subsequently applied to this XML file in order to render it as HTML for display in the browser)

From the above list of actions, one can see a direct correlation to the tasks that an FxCop package might need to perform:

  1. Copy all the build/compile output DLLs to a separate folder
  2. Call FxCop from the command line, passing it the folder path from step 1
  3. Capture the output from the FxCop analysis performed in step 2 into an XML file
  4. Merge that XML file into the main results XML file that forms the main CIFactory build summary page (a custom XSLT transform is subsequently applied to this XML file in order to render it as HTML for display in the browser)

The only step the NDepend package performs that FxCop doesn’t obviously need is the dynamic building of a ‘project’ file for the tool to run against.

It turns out that FxCop does offer the option of using a project file (unsurprisingly, with a .fxcop extension) to drive its behavior, so even that step could have been handled in the same manner as the NDepend package: code could be written to create the needed .fxcop file on the fly as part of the NAnt scripting process.  However, when I looked at the content of the .fxcop project file, it became clear that nearly everything in it could also be passed to the FxCop command-line utility as command-line arguments, so that’s the approach I chose to pursue for my new FxCop package for CIFactory.

Anatomy of a CIFactory Package

There is ample documentation provided on the CIFactory website that covers how to create your own packages, so I won’t bother to cover the nitty-gritty details common to all packages here but rather will just hit the highlights in my FxCop Package development process.
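For orientation, here is roughly how the FxCop package ends up laid out on disk.  I’m reconstructing this from the property values discussed below, so treat the exact placement of the two .xml files as an assumption rather than gospel:

Packages\FxCop\
    bin\                      (FxCopCmd.exe and its Rules subfolder live here)
    FxCop.Properties.xml      (the user-adjustable property values)
    FxCop.Target.xml          (the NAnt targets that do the actual work)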

NAnt makes heavy use of properties to support variable parameters, and CIFactory packages, being really nothing more than complex NAnt scripts themselves, do so as well.  By convention, the user-adjustable values in a CIFactory package belong in a <packagename>.Properties.xml file, and the one I developed for FxCop looks like this…

<?xml version="1.0" encoding="utf-8" ?>
<project xmlns = "http://nant.sf.net/schemas/nant.xsd" name = "FxCop.Properties">
    <property name = "FxCop.BinFolder" value = "${BuildDirectory}\Packages\FxCop\bin"/>
    <property name = "FxCop.Console" value = "${FxCop.BinFolder}\FxCopCmd.exe"/>
    <property name = "FxCop.BuildFolder" value = "${BuildDirectory}\FxCop"/>
    <property name = "FxCop.ReportFolder" value = "${FxCop.BuildFolder}\Report"/>
    <property name = "FxCop.AssembliesFolder" value = "${FxCop.BuildFolder}\Assemblies"/>
    <property name = "FxCop.RulesFolder" value = "${FxCop.BinFolder}\Rules"/>
    <property name = "FxCop.ReferenceAssembliesFolder" value = "${ThirdPartyDirectory}"/>
    <fileset id = "FxCop.Target.ProjectFiles">
        <include name = "${ProductionDirectory}\**\*.*proj"/>
    </fileset>
    <property name = "FxCop.FilesToDeleteAtSetup" value = "${FxCop.BuildFolder}\**\*.*"/>
</project>

Basically, the first few properties (FxCop.BinFolder, FxCop.Console, FxCop.BuildFolder) tell the script where to find FxCop within the CIFactory folder structure.  The next few tell FxCop where to output its results (FxCop.ReportFolder), where to find the MSIL assemblies to analyze (FxCop.AssembliesFolder), where to find the rules to run against those assemblies (FxCop.RulesFolder), and where to find its reference assemblies, meaning the files referenced by the assemblies under analysis but not themselves within the scope of the analysis (FxCop.ReferenceAssembliesFolder).  The last two entries (FxCop.Target.ProjectFiles, FxCop.FilesToDeleteAtSetup) tell the rest of the scripts how to find the csproj or vbproj files that will be parsed to locate all the output assemblies to be collected into the folder specified by the FxCop.AssembliesFolder property, and which folder gets its contents completely cleared out before each build is run.
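Because these are ordinary NAnt properties, they can be re-pointed without touching the package itself.  As a quick, hypothetical illustration (the ${CompanyReferenceDirectory} property below is invented for the example; it isn’t something CIFactory defines), a later <property> declaration simply overwrites the value set by the include:

<include buildfile = "FxCop.Properties.xml"/>
<!-- hypothetical override: point FxCop at a different reference-assemblies location -->
<property name = "FxCop.ReferenceAssembliesFolder" value = "${CompanyReferenceDirectory}\References"/>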

The other important file in a CIFactory package is where the actual meat of the action takes place; by convention it’s named <packagename>.Target.xml and contains the actual NAnt targets to execute.  Ours is therefore named FxCop.Target.xml and starts out like this…

<?xml version="1.0" encoding="utf-8" ?>
<project xmlns = "http://nant.sf.net/schemas/nant.xsd" name = "FxCop">
    <include buildfile = "FxCop.Properties.xml"/>
    <fileset id = "FxCop.AssembliesFileSet"/>
    <property name = "Private.FxCop.HaveAssemblies" value = "false"/>

After some housekeeping (the encoding declaration and the XML schema definition we are going to comply with), we include the buildfile with all the properties we just set so they are available in this script, declare a NAnt fileset to hold the paths of all the assemblies we need to analyze, and set a sort of boolean flag property (Private.FxCop.HaveAssemblies) to false so we have a way to test whether there are any assemblies worth analyzing.  Later on, if we do find some, we will set this same property to ‘true’ so we can tell we found something worth acting on.  Next we need to start defining our NAnt targets…

By convention, each CIFactory package needs a ‘setup’ and a ‘teardown’ target (very much like a unit-test-style paradigm).  In our setup, we need to either create a couple of directories if they don’t already exist or empty them out if they do exist and there’s something in them from the last build…

    <target name = "FxCop.SetUp">
        <mkdir if = "${directory::exists(FxCop.BuildFolder) == false}" dir = "${FxCop.BuildFolder}"/>
        <property name = "Common.FilesToDelete" value = "${FxCop.FilesToDeleteAtSetup}"/>
        <call target = "Common.DeleteFiles"/>
        <mkdir if = "${directory::exists(FxCop.ReportFolder) == false}" dir = "${FxCop.ReportFolder}"/>
        <mkdir if = "${directory::exists(FxCop.AssembliesFolder) == false}" dir = "${FxCop.AssembliesFolder}"/>
    </target>

We don’t actually need to do anything in our ‘teardown’ target, but just in case we do need to do something later on in life, we’ll declare an empty target anyhow (and the main build script will actually make a call to this once the package’s main targets have been executed)…

    <target name = "FxCop.TearDown"></target>

Our next target is a sort of ‘utility’ target that will be called by our main target.  Following a CIFactory package convention, its name (Private.FxCop.CollectAssemblies) is prefixed with ‘Private’ to remind anyone reviewing it that it’s not intended to be called from outside this particular build script.  This target’s job is pretty straightforward: examine each of the .vbproj and/or .csproj files in our solution to retrieve their output folder paths, then gather the paths to all the MSIL assemblies found in those build output folders, storing each one in the NAnt fileset defined earlier…

    <target name = "Private.FxCop.CollectAssemblies">
        <strings id = "FxCop.AssemblyList"/>
        <foreach item = "File" property = "FxCop.Target.ProjectFile.Path">
            <in>
                <items refid = "FxCop.Target.ProjectFiles"/>
            </in>
            <do>
                <property name = "AssemblyName" value = "${vsproject::get-assemblyname(FxCop.Target.ProjectFile.Path)}"/>
                <ifnot test = "${AssemblyName == '' or stringlist::contains('FxCop.AssemblyList', AssemblyName)}">
                    <property name = "Private.FxCop.HaveAssemblies" value = "true"/>
                    <property name = "TargetAssemblyPath" value = "${vsproject::get-output-directory(FxCop.Target.ProjectFile.Path, Compile.ConfigName)}\${AssemblyName}.dll"/>
                    <function execute = "${fileset::include-add('FxCop.AssembliesFileSet', TargetAssemblyPath)}"/>
                    <property name = "TargetAssemblyPath" value = "${vsproject::get-output-directory(FxCop.Target.ProjectFile.Path, Compile.ConfigName)}\${AssemblyName}.exe"/>
                    <function execute = "${fileset::include-add('FxCop.AssembliesFileSet', TargetAssemblyPath)}"/>
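                    <!-- the matching .pdb is collected too; where symbols are present, FxCop can use them to include source file/line details in its report -->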
                    <property name = "TargetPDBPath" value = "${vsproject::get-output-directory(FxCop.Target.ProjectFile.Path, Compile.ConfigName)}\${AssemblyName}.pdb"/>
                    <function execute = "${fileset::include-add('FxCop.AssembliesFileSet', TargetPDBPath)}"/>
                    <function execute = "${stringlist::add('FxCop.AssemblyList', AssemblyName)}"/>
                </ifnot>
            </do>
        </foreach>
    </target>

Note that it’s in this target that we set our ‘flag’ property (Private.FxCop.HaveAssemblies) to true if one or more assemblies are found, so that we can branch accordingly later in our script.

For our final target (the main one that orchestrates all the other pieces), we first call the above-defined ‘utility’ target to collect all the assemblies, then do a quick test to see if any MSIL assemblies were found (by checking the value of the Private.FxCop.HaveAssemblies property).  If not, we make a note of it in the build log and effectively skip the rest of the work, like so…

    <target name = "FxCop.Calculate">
        <call target = "Private.FxCop.CollectAssemblies"/>
        <ifnot test = "${Private.FxCop.HaveAssemblies}">
            <echo level = "Warning" message = "No assemblies found to analyze for FxCop!"/>
        </ifnot>

Our next step (if our ‘flag’ property tells us that there actually are some assemblies to analyze) is to copy each one of them to our pre-declared folder for analysis…

        <if test = "${Private.FxCop.HaveAssemblies}">
            <copy todir = "${FxCop.AssembliesFolder}" verbose = "true">
                <fileset refid = "FxCop.AssembliesFileSet"/>
            </copy>

…and then, in one big <exec…> task, we simply call the FxCop command-line utility, using most of the properties assigned in the FxCop.Properties.xml file as command-line arguments…

      <exec
          program = "${path::get-short-path(FxCop.Console)}"
          commandline = "/file:${path::get-short-path(FxCop.AssembliesFolder)} /out:${path::get-short-path(FxCop.ReportFolder)}\FxCopReport.xml /rule:${FxCop.RulesFolder} /directory:${path::get-short-path(FxCop.ReferenceAssembliesFolder)} /searchgac /fo"
          failonerror = "false"/>

Note that, as shown in the prior snippet, we need to set the failonerror attribute to false.  NAnt checks the exit code of any program run with the <exec…> task and treats anything other than zero as a failure of the build.  Since even a warning-level rule violation will cause the FxCop command-line analysis engine to return a non-zero exit code, it’s almost impossible to get a zero back from any FxCop run unless you’re being a complete fascist about your developers’ FxCop compliance smile_sniff, and we don’t want a non-zero return code from FxCop to ‘fail’ our build on us.
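If you did want to react to FxCop’s exit code without failing the build outright, NAnt’s <exec> task can capture it into a property via its resultproperty attribute.  Here is a minimal sketch of that alternative (not what the package ships with, and I’m not attempting to decode FxCop’s individual exit-code values here):

      <!-- alternative sketch: record the exit code instead of ignoring it entirely -->
      <exec
          program = "${path::get-short-path(FxCop.Console)}"
          commandline = "/file:${path::get-short-path(FxCop.AssembliesFolder)} /out:${path::get-short-path(FxCop.ReportFolder)}\FxCopReport.xml /rule:${FxCop.RulesFolder} /directory:${path::get-short-path(FxCop.ReferenceAssembliesFolder)} /searchgac /fo"
          resultproperty = "FxCop.ExitCode"
          failonerror = "false"/>
      <echo message = "FxCopCmd returned exit code ${FxCop.ExitCode} (non-zero usually just means rule violations were reported)"/>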

The last little bit of this main target for the package is pretty straightforward stuff: it merges the FxCop analysis results XML file into the build artifact directory and brings along any raster image files that may be needed to display the report when it’s passed through an XSLT transform to spit out the HTML for display in the build report…

      <property name = "Deployment.SourceFileName" value = "FxCopReport.xml"/>
      <property name = "Deployment.SourceDir" value = "${FxCop.BuildFolder}\Report"/>
      <property name = "Deployment.TargetDir" value = "${Common.ArtifactDirectoryPath}"/>
      <property name = "Deployment.TargetFile" value = "${Deployment.SourceFileName}"/>
      <call target = "Deployment.PublishFileSilently"/>
      <copy todir = "${Deployment.TargetDir}">
        <fileset basedir = "${FxCop.BuildFolder}">
          <include name = "*.png"/>
          <include name = "*.xml"/>
        </fileset>
      </copy>
    </if>
  </target>

…and then of course we close the <if> block and the whole <target> node, wrapping the whole thing up.
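For completeness, here is a rough sketch of how the package’s public targets end up being driven from the outside.  The exact wiring lives in CIFactory’s master build scripts, so treat the include path and the ordering below as assumptions on my part:

<!-- hypothetical sketch of the orchestration; CIFactory's real master scripts differ in detail -->
<include buildfile = "Packages\FxCop\FxCop.Target.xml"/>
<call target = "FxCop.SetUp"/>
<call target = "FxCop.Calculate"/>
<call target = "FxCop.TearDown"/>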

Summary

Throughout the whole process of developing this package, I have to admit that I steadily became more and more impressed with the work that Jay Flowers has done on the CIFactory package architecture.  It makes either creating a new package or reverse-engineering an existing one quite straightforward (assuming the naming and organizational conventions are followed).

In the end, I sent my prototyped FxCop CIFactory package to Jay for him to include in a subsequent release of CIFactory (perhaps after a little more fine-tuning and smoothing out of any rough spots in my work).  His response was very positive, which again reinforced my steadily-improving impression of the open-source software community’s willingness to accept contributions from outside a core development team.

Hopefully my FxCop package (or some incarnation of it!) will appear in a subsequent CIFactory release, but either way the package already works for me, and I’m able to use it now to provide valuable additional feedback from our build server for the SkillPortal project and others in our company.

Which, thankfully, ends part 3 of the 3-part detour that shanghaied me in the middle of this project and took me on a winding journey into the bowels of the .NET open-source software world before I could return to the work at hand.

Even though it was a longer-than-hoped journey, the results were well worth it:

  • an updated VAULT Package for CIFactory
  • an MbUnit NAnt task library extended to support filtering of unit tests by category, author, and namespace
  • a brand-new FxCop package for CIFactory

All of this will let us (finally) return to the main project in the next post!