Bradley Braithwaite

Measuring Code Improvements in ASP.Net

Posted in: code quality, methodology

I wrote a blog post titled The Four Pillars of a Good Software Developer that covered some of the topics that we, as developers, need to be mindful of when writing or reviewing code. Even if your team does code reviews, it can still be a challenge for us mere human beings to retain all the useful information from the books mentioned in that post whilst following the context of the application being reviewed, looking out for code smells, potential performance issues, memory management and so on. So yeah, it’s pretty hard.

Reality suggests that most code bases in production carry some technical debt, which is why I’m a huge fan of incremental development metrics. Using metrics, it’s possible to get some visibility of a project improving with each release.

What tools can be used in a release/deployment cycle that will automate some of the heavy lifting and provide a “metric” that all stakeholders can see with each release as an indicator that the code base is getting “better”?

An Example of “Bad” Code

First, let's consider some less than ideal code. If you’ve been working with ASP.Net for long enough, you’ve undoubtedly seen/written code like this…yes, it’s the canonical GridView (DataGrid if you’re really old school) mess*:

The .ASPX

<%@ Page Language="C#" AutoEventWireup="true" CodeBehind="Default.aspx.cs" Inherits="WhatMakesGoodCode.Default" %>

<!DOCTYPE html>

<html xmlns="http://www.w3.org/1999/xhtml">
<head runat="server">
    <title>What Makes Good Code Good?</title>
</head>
<body>
    <form id="form1" runat="server">
    <div>
        <asp:GridView ID="GridView1" runat="server" AutoGenerateColumns="False" OnRowDataBound="GridView1_OnRowDataBound" CellPadding="8">
            <Columns>
                <asp:BoundField DataField="ProductTitle" HeaderText="Product" />
                <asp:BoundField DataField="Description" HeaderText="Desc." />
                <asp:BoundField DataField="Price" HeaderText="Wholesale Price" />
                <asp:BoundField DataField="Stock" HeaderText="No. in Stock" />
                <asp:BoundField DataField="ReleaseDate" HeaderText="Release Date" />
                <asp:TemplateField HeaderText="RRP">
                    <ItemTemplate>
                        <asp:Label runat="server" ID="RRP"></asp:Label>
                    </ItemTemplate>
                </asp:TemplateField>
            </Columns>
        </asp:GridView>
    </div>
    </form>
</body>
</html>

The Code Behind:

using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using System.Linq;
using System.Web;
using System.Web.UI;
using System.Web.UI.WebControls;

namespace WhatMakesGoodCode
{
    public partial class Default : System.Web.UI.Page
    {
        protected void Page_Load(object sender, EventArgs e)
        {
            SqlConnection sqlConnection1 = new SqlConnection(@"Data Source=(localdb)\v11.0;Initial Catalog=GoodCodeStore;Integrated Security=True;MultipleActiveResultSets=True");
            SqlCommand cmd = new SqlCommand("SELECT * FROM PRODUCTS", sqlConnection1);

            sqlConnection1.Open();
            this.GridView1.DataSource = cmd.ExecuteReader(CommandBehavior.CloseConnection);
            this.GridView1.DataBind();
            sqlConnection1.Close();
        }

        protected virtual void GridView1_OnRowDataBound(object sender, GridViewRowEventArgs e)
        {
            if (e.Row.RowType == DataControlRowType.DataRow)
            {
                Label Label1 = (Label) e.Row.FindControl("RRP");

                /*
                 * Some basic (silly?) logic for the purposes of an example.
                 * Logic will mark up the RRP by 35% if there are fewer than 25 copies in stock and the release date is this year.
                 * Else, it will add 25% to the price.
                 */
                if (int.Parse(e.Row.Cells[3].Text) < 25 && DateTime.Parse(e.Row.Cells[4].Text).Year == DateTime.Now.Year)
                {
                    Label1.Text = (decimal.Parse(e.Row.Cells[2].Text) + (decimal.Parse(e.Row.Cells[2].Text) * (decimal)0.35)).ToString("N2");
                }
                else
                {
                    Label1.Text = (decimal.Parse(e.Row.Cells[2].Text) + (decimal.Parse(e.Row.Cells[2].Text) * (decimal)0.25)).ToString("N2");
                }
            }
        }
    }
}

* It’s not really THAT bad, but it’s not ideal for larger applications with a lifespan longer than 12 months.

If we apply automated tools to this application, they would likely blow up with errors, right? Let’s see…

My vanilla setup covers (in order of priority):

  • Test Coverage
  • Measuring Complexity
  • Basic Correctness & Static Code Analysis
  • Style Consistency

If there is a budget for 3rd party tools:

  • Feature Rich Code Analysis (which could incorporate some or all of the previous points)

Test Coverage

There are no unit tests for this project, so code coverage is not applicable.

This is a simple metric to measure. The aim is to increase the test coverage percentage of the code base with each release. 100% coverage is probably not realistic or pragmatic, and anything above 60% is very good going. In some cases it is more beneficial to have 100% coverage of a complex part of an application whilst the wider coverage is less than 50%.

There are various options for measuring code coverage that can be integrated into build cycles, such as the coverage runner built into Visual Studio, JetBrains dotCover or the open-source OpenCover.
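As a simple illustration of how a coverage figure can feed into a release cycle, here is a minimal sketch of a build gate. The function name and the 60% default are my own illustration, not part of any particular tool:

```python
def coverage_gate(coverage_percent, threshold=60.0):
    """Fail a build step when test coverage drops below an agreed threshold.

    The 60% default mirrors the rule of thumb above; a real project would
    tune the threshold per assembly, or ratchet it up with each release.
    """
    if coverage_percent < threshold:
        raise SystemExit(
            "Coverage %.1f%% is below the %.1f%% gate" % (coverage_percent, threshold))
    return True
```

A CI server would run something like this after the coverage tool, failing the build (and therefore the release) whenever the metric regresses.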

Measuring Complexity: Visual Studio Code Metrics

Visual Studio has built-in Code Metrics reporting which, when run on this application, indicate that there are no major problems:

[Screenshot: Visual Studio Code Metrics results]

Click the Analyze menu, then click Calculate Code Metrics for Solution.

Basic Correctness & Static Code Analysis: Visual Studio Code Analysis

Visual Studio has integrated code analysis features (version dependent). The settings can be accessed via the project properties under “Code Analysis”. As a starting point I use the “Microsoft Basic Design Guideline Rules”:

[Screenshot: Code Analysis settings under project properties]

The analysis doesn’t yield much in this example, since there isn't a lot of managed code:

[Screenshot: Code Analysis results]

Style Consistency: StyleCop

Running StyleCop on this project returns some minor complaints regarding layout and the ordering of namespace imports:

[Screenshot: StyleCop warnings]

After adding some simple comments to the methods and a documentation header to the code-behind, the violation warnings clear.

Note: changes need to be made to the project file to run StyleCop as part of a build. The following post has details of how-to: HowTo: apply StyleCop Settings on several projects.

Feature Rich Code Analysis: NDepend

NDepend is a paid 3rd party tool that analyses assemblies and provides code metrics incorporating some of the points previously discussed. A diagram I like is Abstractness vs. Instability.

The Abstractness versus Instability diagram helps to detect which assemblies are potentially painful to maintain (i.e. concrete and stable) and which assemblies are potentially useless (i.e. abstract and unstable).

  • Abstractness: if an assembly contains many abstract types (i.e. interfaces and abstract classes) and few concrete types, it is considered abstract.
  • Stability: an assembly is considered stable if its types are used by many types in other assemblies. In this sense, stable means painful to modify.
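These two axes come from Robert C. Martin’s package metrics, which NDepend builds on. The calculation behind the diagram can be sketched as follows (the type and coupling counts passed in are illustrative):

```python
def distance_from_main_sequence(abstract_types, total_types,
                                efferent_coupling, afferent_coupling):
    """Distance from the 'main sequence' in the Abstractness vs. Instability plot.

    Abstractness A = abstract types / total types.
    Instability  I = outgoing deps / (incoming + outgoing deps).
    Distance     D = |A + I - 1|: near 0 is the balanced green zone,
    near 1 falls into the zone of pain (concrete and stable) or the
    zone of uselessness (abstract and unstable).
    """
    a = abstract_types / float(total_types)
    i = efferent_coupling / float(afferent_coupling + efferent_coupling)
    return abs(a + i - 1)
```

For example, a fully concrete assembly that many others depend on (A = 0, I = 0) scores D = 1, squarely in the zone of pain.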

This diagram indicates that the project is within the green zone, but scores highly on instability. Perhaps an indication that something isn’t quite right? This metric suggests that the entire implementation is in one place, which is true!

[Diagram: Abstractness vs. Instability]

Initial Conclusion

This example serves to demonstrate that there is a limit to what these automated tools can offer and that there really is no substitute for skilled developers.

Even with this simple example, this data can be used to measure incremental improvements as the code base evolves.

Refactoring to “Better” Code

Having refactored this example code to an MVP pattern approach, we can review the same metrics and compare the results. There is now no logic in the UI, no direct data access in the code-behind, and the application pays better attention to separation of concerns.

I’ve written about the MVP pattern before and rather than repeating all of the code we will just review how the code-behind of the .ASPX now looks with the dependencies abstracted:

using WhatMakesCodeGood.Model;
using WhatMakesCodeGood.Presentation;
using WhatMakesCodeGood.Views;

namespace WhatMakesGoodCode.Web
{
    using System;
    using System.Collections.Generic;

    /// <summary>
    /// The products page.
    /// </summary>
    public partial class Default : System.Web.UI.Page, IProductView
    {
        /// <summary>
        /// The product presenter.
        /// </summary>
        private readonly ProductPresenter _productPresenter;

        /// <summary>
        /// Initializes a new instance of the <see cref="Default"/> class.
        /// </summary>
        public Default(ProductPresenter productPresenter)
        {
            _productPresenter = productPresenter;
        }

        /// <summary>
        /// The Page_Load event handler.
        /// </summary>
        /// <param name="sender">
        /// The sender.
        /// </param>
        /// <param name="e">
        /// The page event args.
        /// </param>
        protected void Page_Load(object sender, EventArgs e)
        {
            _productPresenter.Initialize();
        }

        /// <summary>
        /// Sets the products.
        /// </summary>
        public IEnumerable<Product> Products
        {
            set
            {
                this.uiProducts.DataSource = value;
                this.uiProducts.DataBind();
            }
        }
    }
}

The project also contains the following layers:

  • Web – the .ASPX pages and code-behind files
  • Presentation – the presenters
  • Model – the domain model
  • DataMappers – data access code

The Abstractness vs. Instability diagram from NDepend now looks as follows:

[Diagram: Abstractness vs. Instability after refactoring]

This diagram suggests that the Web and DataMappers projects are not abstract. It also places the Presentation project closer to “uselessness” because it contains little code; this will change over time. Generally speaking, the markers are moving towards the centre of the diagram, which suggests a better balance of dependencies.

Comparing the Results

Comparing the initial Code Metrics with the refactored version of only the ASP.Net project/code-behind shows the following differences:

Metric                 | Last Release | Current Release | Description
Maintainability Index  | 69           | 92              | Higher is better
Cyclomatic Complexity  | 6            | 3               | Lower is better
Class Coupling         | 16           | 10              | Lower is better
Lines of Code          | 13           | 5               | Lower is better (sometimes)

Overall there are more lines of code across the Presenter, Model and Data Mappers in the solution, but with better layering and abstraction, which reminds us that blindly following the numbers doesn’t indicate good design.

Creating a Baseline

The first two comparisons were extreme opposites and unrealistic examples, used for contrast. Starting with the MVP implementation as a baseline, we get the following metrics:

[Screenshot: baseline Code Metrics for the MVP implementation]

Note that the DataMappers project scores below 70 for the maintainability index, which flags it as a candidate for refactoring.
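For context, Visual Studio’s own documentation defines coarser traffic-light bands for the index (20-100 green, 10-19 yellow, 0-9 red); the 70 cut-off here is a stricter team-level rule layered on top. A sketch combining the two (the function name is my own):

```python
def flag_for_refactoring(maintainability_index, team_threshold=70):
    """Map a maintainability index to Visual Studio's colour band, plus a
    stricter team-level flag for refactoring candidates."""
    if maintainability_index < 10:
        band = "red"
    elif maintainability_index < 20:
        band = "yellow"
    else:
        band = "green"
    return band, maintainability_index < team_threshold
```

A score of 63 is comfortably green by Visual Studio’s bands yet still flagged by the stricter rule, which is the point: the built-in bands are forgiving, so teams often set their own bar.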

For the next release, ideally there would be a limited decrease in the existing metrics and, time permitting, an increase in the maintainability index of the underperformers.

Refactoring Phase 2

The DataMappers assembly, which was the lowest scoring in terms of maintainability from the benchmark, contains only one class. Based on the hints provided in the metrics, what could be changed to make this class have a higher maintainability index? And does it even matter?

Here is the class in question:

using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using WhatMakesCodeGood.Model;

namespace WhatMakesCodeGood.DataMappers
{
    public class ProductDataMapper : IProductDataMapper
    {
        public IEnumerable<Product> Products
        {
            get
            {
                SqlConnection sqlConnection1 = new SqlConnection(@"Data Source=(localdb)\v11.0;Initial Catalog=GoodCodeStore;Integrated Security=True;MultipleActiveResultSets=True");
                SqlCommand cmd = new SqlCommand("SELECT Id, ProductTitle, Description, Price, Stock, ReleaseDate FROM PRODUCTS", sqlConnection1);

                sqlConnection1.Open();

                List<Product> products = new List<Product>();

                SqlDataReader reader = cmd.ExecuteReader(CommandBehavior.CloseConnection);

                while (reader.Read())
                {
                    Product product = new Product();

                    product.Id = (int)reader.GetValue(0);
                    product.ProductTitle = (string)reader.GetValue(1);
                    product.Description = (string)reader.GetValue(2);
                    product.Price = (decimal)reader.GetValue(3);
                    product.Stock = (short)reader.GetValue(4);
                    product.ReleaseDate = (DateTime)reader.GetValue(5);

                    products.Add(product);
                }

                sqlConnection1.Close();

                return products;
            }
        }
    }
}

After refactoring the ProductDataMapper (code snippets to follow) the comparisons are:

Metric                 | Benchmark | Refactored | Description
Maintainability Index  | 63        | 84         | Higher is better
Cyclomatic Complexity  | 3         | 9          | Lower is better
Class Coupling         | 9         | 15         | Lower is better
Lines of Code          | 18        | 14         | Lower is better (sometimes)

What’s changed? Let’s start with the ProductDataMapper class. The dependencies on SqlConnection, SqlCommand and SqlDataReader have been abstracted away:

using System;
using System.Data;
using WhatMakesCodeGood.Model;

namespace WhatMakesCodeGood.DataMappers
{
    public class ProductDataMapper : AbstractDataMapper<Product>, IProductDataMapper
    {
        protected override string FindAllSelect
        {
            get { return "SELECT Id, ProductTitle, Description, Price, Stock, ReleaseDate FROM PRODUCTS"; }
        }

        protected override Product Map(IDataRecord reader)
        {
            return new Product()
                {
                    Id = (int)reader.GetValue(0),
                    ProductTitle = (string)reader.GetValue(1),
                    Description = (string)reader.GetValue(2),
                    Price = (decimal)reader.GetValue(3),
                    Stock = (short)reader.GetValue(4),
                    ReleaseDate = (DateTime)reader.GetValue(5)
                };
        }
    }
}

The code for the abstract class AbstractDataMapper:

using System.Collections.Generic;
using System.Configuration;
using System.Data;
using System.Data.SqlClient;

namespace WhatMakesCodeGood.DataMappers
{
    public abstract class AbstractDataMapper<T>
    {
        protected abstract string FindAllSelect { get; }

        protected IDbConnection Connection
        {
            get
            {
                return new SqlConnection(ConfigurationManager.ConnectionStrings["WhatMakesCodeGood"].ConnectionString);
            }
        }

        public virtual IEnumerable<T> FindAll()
        {
            SqlCommand cmd = new SqlCommand(FindAllSelect, Connection as SqlConnection);

            List<T> items = new List<T>();

            // The connection must be opened before the reader executes.
            cmd.Connection.Open();

            SqlDataReader reader = cmd.ExecuteReader(CommandBehavior.CloseConnection);

            while (reader.Read())
            {
                items.Add(Map(reader));
            }

            return items;
        }

        protected abstract T Map(IDataRecord reader);
    }
}

Since there was only one implementation of the ProductDataMapper, this simple example doesn’t convey the full potential impact of this refactor.

Consider if there were four different variants of the initial un-refactored ProductDataMapper class:

[Class diagram: four variants of the un-refactored data mapper]

Refactoring these classes and abstracting away the dependencies should demonstrate a more profound impact on the metrics. The refactored classes now look as per the class diagram:

[Class diagram: refactored data mappers deriving from AbstractDataMapper&lt;T&gt;]

Here is how the metrics compare:

Metric                 | Last Release | Current Release | Description
Maintainability Index  | 68           | 88              | Higher is better. Indicates a 20% improvement.
Cyclomatic Complexity  | 12           | 18              | Lower is better.
Class Coupling         | 15           | 20              | Lower is better. Increases slightly, since the abstract class is introduced.
Lines of Code          | 48           | 26              | Lower is better (sometimes). A significant reduction, even for only 4 classes.

The metric suggests a 20% improvement. However, in my opinion the improvement is more like 1000%, which suggests there is still a requirement for developer judgement.

Calculating Maintainability

The formula used by the Visual Studio Code Metrics is:

MAX(0,(171 - 5.2 * ln(Halstead Volume) - 0.23 * (Cyclomatic Complexity) - 16.2 * ln(Lines of Code))*100 / 171)

  • Halstead Volume: a measure of program size derived from the counts of operators and operands in the code. Think of how much “stuff” is in the code!
  • Cyclomatic Complexity: directly measures the number of linearly independent paths through a program's source code. Think of lots of nested IF statements!
  • Lines of Code: the lines of code within a class/project/application.
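The formula translates directly into code. A quick sketch (note that Visual Studio computes Halstead Volume and line counts from the compiled IL, so hand-fed values will only approximate its output):

```python
import math

def maintainability_index(halstead_volume, cyclomatic_complexity, lines_of_code):
    """The 0-100 Maintainability Index, per the Visual Studio formula above."""
    raw = (171
           - 5.2 * math.log(halstead_volume)
           - 0.23 * cyclomatic_complexity
           - 16.2 * math.log(lines_of_code))
    return max(0.0, raw * 100 / 171)
```

The logarithms mean the index punishes the first jump in size or volume far more than later growth, while every extra unit of cyclomatic complexity costs the same.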

You can read more about the Maintainability Index Range and Meaning.

Build Integration

There are many combinations of how these tools can be used, e.g. TeamCity with ReSharper, or TFS with StyleCop and Code Analysis reporting, to name a few.

Conclusion

These tools won't write great code for you; they are little more than productivity aids. When integrated as part of a release cycle, these metrics offer a very useful "heads up display" of how a code base is evolving (or degrading?). They can help you identify problem areas more efficiently, but they won't tell you how to resolve them.

Disclaimer: I was given a free NDepend license for developer evaluation. I’m not associated with NDepend. I just think it has some nice features that are useful.
