Monday, May 20, 2013

ASP.NET NLog Sql Server Logging and Error Handling Part 1

It was the eleventh hour before a web app was to go live and there was nothing in place for error handling or logging. You're shocked I'm sure (that's my sarcastic voice). In past web apps I've normally used ELMAH for unhandled error logging and NLog with text file targets for other logging.

ELMAH works great, but in this case I was told it had been pulled for some reason. With no time to find out the story there, I threw together something to capture basic error details and send an email. Later that grew and was customized, and ELMAH never returned, though perhaps it should have. Still, there are benefits to doing this yourself, including more customization, less bloat, and less hunting through sometimes-lacking documentation.

On the logging front I still wanted to use NLog, but text file logging was no longer a fit. This app ran as a few instances behind a load balancer, so central logging was needed, and some troublesome workflows called for a higher volume of logging than the occasional error or warning here and there. I had not used NLog with SQL Server before, but this seemed like the perfect time to do so.

Series Overview

Part 1 - Setting up logging with ASP.NET MVC, NLog and SQL Server

Part 2 - Unhandled exception processing, building an error report, emailing errors, and custom error pages.

Setup and Configuration

I began by installing the NLog NuGet package and the NLog.Extended package (for ASP.NET-specific information) into the ASP.NET MVC 4 project. The NLog.Config package is useful for a standalone NLog config file, but for web apps with existing config transforms I find embedding the configuration in Web.config easier.

In web.config under configuration/configSections, the NLog section is defined:
	<section name="nlog" type="NLog.Config.ConfigSectionHandler, NLog" />
Under <configuration> my initial NLog config looked like the below, using the async attribute on targets to make each target work asynchronously:
<nlog xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" throwExceptions="true" 
    internalLogLevel="Debug" internalLogFile="${basedir}/NlogInternal.log">
    <!-- go to http://nlog-project.org/wiki/Configuration_file for more information -->
    <extensions>
      <add assembly="NLog.Extended" />
    </extensions>
    <targets async="true">
      <target xsi:type="Database" name="dbTarget" connectionStringName="Logger" 
	  commandText="exec sLogEvent_Insert @time_stamp, @level, @logger, @userName, @url, 
	    @machineName, @sessionId, @threadId, @referrer, @userAgent, @code, @message">
        <parameter name="@time_stamp" layout="${date}" />
        <parameter name="@level" layout="${level}" />
        <parameter name="@logger" layout="${logger}" />
        <parameter name="@userName" layout="${identity}" />
        <parameter name="@url" layout="${aspnet-request:serverVariable=Url}" />
        <parameter name="@machineName" layout="${machinename}" />
        <parameter name="@sessionId" layout="${aspnet-sessionid}" />
        <parameter name="@threadId" layout="${threadid}" />
        <parameter name="@referrer" 
			layout="${aspnet-request:serverVariable=HTTP_REFERER}" />
        <parameter name="@userAgent" 
			layout="${aspnet-request:serverVariable=HTTP_USER_AGENT}" />
		<parameter name="@code" layout="${event-context:item=Code}" />
        <parameter name="@message" layout="${message}" />
      </target>
      <target name="debugTarget" xsi:type="Debugger" 
		layout="${time}|${level:uppercase=true}|${logger}|${message}" />
    </targets>
    <rules>
      <!-- Levels: Off, Trace, Debug, Info, Warn, Error, Fatal -->
      <logger name="*" minlevel="Trace" writeTo="debugTarget,dbTarget" />
    </rules>
</nlog>
Later I switched to explicit AsyncWrapper targets, which let me control the queuing behavior more precisely and mirrored what actually happens at runtime, at the cost of more verbose configuration.
<nlog xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" throwExceptions="true" 
    internalLogLevel="Debug" internalLogFile="${basedir}/NlogInternal.log">
    <!-- go to http://nlog-project.org/wiki/Configuration_file for more information -->
    <extensions>
      <add assembly="NLog.Extended" />
    </extensions>
    <targets>
      <target name="asyncDbWrapperTarget" xsi:type="AsyncWrapper" queueLimit="10000" 
	  timeToSleepBetweenBatches="50" batchSize="100" overflowAction="Block">
        <target xsi:type="Database" name="dbTarget" connectionStringName="Logger" 
		commandText="exec sLogEvent_Insert @time_stamp, @level, @logger, @userName, @url, 
		@machineName, @sessionId, @threadId, @referrer, @userAgent, @code, @message">
          <parameter name="@time_stamp" layout="${date}" />
          <parameter name="@level" layout="${level}" />
          <parameter name="@logger" layout="${logger}" />
          <parameter name="@userName" layout="${identity}" />
          <parameter name="@url" layout="${aspnet-request:serverVariable=Url}" />
          <parameter name="@machineName" layout="${machinename}" />
          <parameter name="@sessionId" layout="${aspnet-sessionid}" />
          <parameter name="@threadId" layout="${threadid}" />
          <parameter name="@referrer" 
			layout="${aspnet-request:serverVariable=HTTP_REFERER}" />
          <parameter name="@userAgent" 
			layout="${aspnet-request:serverVariable=HTTP_USER_AGENT}" />
          <parameter name="@code" layout="${event-context:item=Code}" />
          <parameter name="@message" layout="${message}" />
        </target>
      </target>
      <target name="asyncDebugWrapperTarget" xsi:type="AsyncWrapper" queueLimit="10000" 
	  timeToSleepBetweenBatches="50" batchSize="100" overflowAction="Block">
        <target name="debugTarget" xsi:type="Debugger" 
			layout="${time}|${level:uppercase=true}|${logger}|${message}" />
      </target>
    </targets>
    <rules>
      <!-- Levels: Off, Trace, Debug, Info, Warn, Error, Fatal -->
      <logger name="*" minlevel="Trace" 
		writeTo="asyncDebugWrapperTarget,asyncDbWrapperTarget" />
    </rules>
 </nlog>
The database target's connectionStringName value of Logger points to the DB connection string to use.
<connectionStrings>
<add name="Logger" providerName="System.Data.SqlClient" 
   connectionString="Data Source=(local);Initial Catalog=Chinook.Logs;Persist Security Info=True;User ID=ChinookLogger;Password=L0gger!" />
</connectionStrings>

The Database

Our DBA set up a separate database for logging so its DB properties could be tuned differently; log data and transactional app data are different beasts. The sample app for this post is written against the Chinook sample database, so I named the log database Chinook.Logs. The primary table script follows, minus the indexes, SET statements and the like.
CREATE TABLE [dbo].[LogEvent](
	[LogId] [int] IDENTITY(1,1) NOT NULL,
	[LogDate] [datetime] NOT NULL,
	[EventLevel] [nvarchar](50) NOT NULL,
	[LoggerName] [nvarchar](500) NOT NULL,
	[UserName] [nvarchar](50) NULL,
	[Url] [nvarchar](1024) NULL,
	[MachineName] [nvarchar](100) NOT NULL,
	[SessionId] [nvarchar](100) NULL,
	[ThreadId] [int] NULL,
	[Referrer] [nvarchar](1024) NULL,
	[UserAgent] [nvarchar](500) NULL,
	[Code] [nvarchar](10) NULL,
	[LogMessage] [nvarchar](max) NOT NULL,
	[PartitionKey] [tinyint] NOT NULL,
PRIMARY KEY CLUSTERED 
(
	[LogId] ASC,
	[PartitionKey] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON)
)
The table is pretty straightforward, with perhaps a couple of exceptions. The Code field is used for custom error codes (HTTP or otherwise) or for "tagging" log statements. PartitionKey was added by our DBA mainly for archiving log data but also for efficient querying by day; more on this to come. Finally, I originally had the string columns as varchar, but we found that NLog passed nvarchar parameters, so an implicit conversion happened on every insert with a slight performance impact.

The insert sproc grabs the day of the week and inserts that as the partition key and the rest of the data just passes through.
CREATE PROCEDURE [dbo].[sLogEvent_Insert]
	@time_stamp datetime,
	@level nvarchar(50),
	@logger nvarchar(500),
	@userName nvarchar(50),
	@url nvarchar(1024),
	@machineName nvarchar(100),
	@sessionId nvarchar(100),
	@threadId int,
	@referrer nvarchar(1024),
	@userAgent nvarchar(500),
	@code nvarchar(10),
	@message nvarchar(max)
AS
BEGIN
	SET NOCOUNT ON;
	Declare @currentDate Datetime
	declare @partitionKey tinyint
	
	set		@currentDate = getdate()
	set		@partitionKey = DATEPART(weekday, @currentDate)

	INSERT INTO [dbo].[LogEvent]
           ([LogDate]
           ,[EventLevel]
           ,[LoggerName]
           ,[UserName]
           ,[Url]
           ,[MachineName]
           ,[SessionId]
           ,[ThreadId]
           ,[Referrer]
           ,[UserAgent]
		   ,[Code]
           ,[LogMessage]
		   ,[PartitionKey])
     VALUES
           (@time_stamp
           ,@level
           ,@logger
           ,@userName
           ,@url
           ,@machineName
           ,@sessionId
           ,@threadId
           ,@referrer
           ,@userAgent
		   ,@code
           ,@message
		   ,@partitionKey);
END
An identical log table was created but with the name LogEvent_Switched. The cleanup sproc was created by our DBA:
CREATE PROCEDURE [DBA].[WeekdayPartitionCleanup_PartitionSwitching] 
AS
BEGIN
	SET NOCOUNT ON;
	declare @partitionKey int
	declare @SQLCommand nvarchar(1024)
	truncate table [Chinook.Logs].dbo.LogEvent_Switched;

	set @partitionKey = datePart(weekday, getdate()) + 1
	if(@partitionkey >7)
		set @partitionKey = 1

	set @SQLCommand = 'alter table [Chinook.Logs].dbo.LogEvent switch partition  ' 
	  + cast(@partitionKey as varchar) + ' to [Chinook.Logs].dbo.LogEvent_Switched;'
	exec sp_executesql @SQLCommand
END
That was scheduled to run daily. So if today is Friday, datePart(weekday, getdate()) returns 6; adding 1 gives 7 (Saturday), so all of last Saturday's log records get automagically switched over to the LogEvent_Switched table. This leaves the last 6 days of log records in LogEvent and the 7th day in LogEvent_Switched. If you don't have Enterprise edition and can't use partitioning, regular deletes may work, but deleting a large number of rows amid frequent inserts and selects can be problematic.

Finally, permission to execute the sproc was granted, and then it was back to .NET land.
GRANT EXECUTE on dbo.sLogEvent_Insert to ChinookLogger;

Wiring up the Logging Code

First an interface that mostly mirrors NLog's Logger class.
namespace NLogSql.Web.Infrastructure.Diagnostics.Logging
{
    public interface ILog
    {
        void Debug(string format, params object[] args);
        void Error(string format, params object[] args);
        void Fatal(string format, params object[] args);
        void Info(string format, params object[] args);
        void Trace(string format, params object[] args);
        void Warn(string format, params object[] args);

        // custom
        void Write(LogType type, object properties, string message, params object[] args);

        bool IsDebugEnabled { get; }
        bool IsErrorEnabled { get; }
        bool IsFatalEnabled { get; }
        bool IsInfoEnabled { get; }
        bool IsTraceEnabled { get; }
        bool IsWarnEnabled { get; }
    }
}
The functionality of the Log class is provided mostly by inheriting from NLog's Logger class. One custom method is defined to allow passing custom log properties via an anonymous object, much as anonymous objects are used elsewhere in ASP.NET MVC (new {code = "404"}).
using System.ComponentModel;
using System.Globalization;
using NLog;

namespace NLogSql.Web.Infrastructure.Diagnostics.Logging
{
    public class Log : Logger, ILog
    { 
        public void Write(LogType type, object properties, string message, 
			params object[] args)
        {
            var info = new LogEventInfo(LogLevel.FromOrdinal((int)type), Name, 
				CultureInfo.CurrentCulture, message, args);

            if (null != properties)
            {
                foreach (PropertyDescriptor propertyDescriptor 
					in TypeDescriptor.GetProperties(properties))
                {
                    var value = propertyDescriptor.GetValue(properties);
                    info.Properties[propertyDescriptor.Name] = value;
                }
            }

            Log(info);
        }
    }

    public enum LogType
    {
        Trace, Debug, Info, Warn, Error, Fatal
    }
}
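As a quick usage sketch (the `_log` and `requestUrl` names here are hypothetical), a property named Code on the anonymous object flows through LogEventInfo.Properties into the ${event-context:item=Code} layout and from there into the @code parameter of the insert sproc:

```csharp
// Hypothetical usage: tag a warning with a custom code that the
// ${event-context:item=Code} layout picks up for the @code parameter.
_log.Write(LogType.Warn, new { Code = "404" }, "No page found at {0}", requestUrl);
```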
A log instance is typically created via injecting ILog into the constructor of the class needing logging. Here I use Ninject to wire this up.
using System;
using System.Linq;
using System.Web.Mvc;
using NLog;
using NLogSql.Web.Infrastructure.Diagnostics.Logging;
using NLogSql.Web.Infrastructure.ErrorHandling;
using Ninject.Activation;
using Ninject.Modules;
using Ninject.Web.Mvc.FilterBindingSyntax;

namespace NLogSql.Web.DI
{
    public class DiagnosticsModule : NinjectModule
    {
        public override void Load()
        {
            Bind<ILog>().ToMethod(CreateLog);
            Bind<IErrorReporter>().To<ErrorReporter>();
            Bind<IEventLogWriter>().To<EventLogWriter>();
            Bind<IErrorEmailer>().To<ErrorEmailer>();

            Kernel.BindFilter<AppErrorHandlerAttribute>(FilterScope.Controller, 0);
        }

        private static ILog CreateLog(IContext ctx)
        {
            var p = ctx.Parameters.FirstOrDefault(x => 
				x.Name == LogConstants.LoggerNameParam);
            var loggerName = (null != p) ? p.GetValue(ctx, null).ToString() : null;

            if (string.IsNullOrWhiteSpace(loggerName))
            {
                if (null == ctx.Request.ParentRequest)
                {
                    throw new NullReferenceException(
                        "ParentRequest is null; unable to determine logger name; " 
						+ "if not injecting into a ctor a parameter for the "
						+ "logger name must be provided");
                }

                var service = ctx.Request.ParentRequest.Service;
                loggerName = service.FullName;
            }

            var log = (ILog)LogManager.GetLogger(loggerName, typeof(Log));
            return log;
        }
    }
}
In some cases we can't easily do ctor injection - for example in the application class of Global.asax.cs or in a filter attribute. For these exceptional cases AppLogFactory is used to create an instance of ILog.
using System;
using System.Diagnostics;
using System.Web.Mvc;
using Ninject;
using Ninject.Parameters;

namespace NLogSql.Web.Infrastructure.Diagnostics.Logging
{
    /// <summary>
    /// Creates a log object for those instances where one cannot be injected (i.e. app startup).
    /// Generally you should just ctor inject ILog
    /// </summary>
    public static class AppLogFactory
    {
        public static ILog Create()
        {
            var declaringType = new StackTrace(1, false).GetFrame(1)
				.GetMethod().DeclaringType;

            if (declaringType != null)
            {
                var loggerName = declaringType.FullName;
                return Create(loggerName);
            }

            throw new InvalidOperationException(
				"Could not determine declaring type; specify logger name explicitly");
        }

        public static ILog Create<T>()
        {
            return Create(typeof(T).FullName);
        }

        public static ILog Create(string loggerName)
        {
            var log = Kernel.Get<ILog>(
				new ConstructorArgument(LogConstants.LoggerNameParam, loggerName));
            return log;
        }

        private static IKernel Kernel
        {
            get
            {
                return DependencyResolver.Current.GetService<IKernel>();
            }
        }
    }

    public class LogConstants
    {
        public const string LoggerNameParam = "loggerName";
    }
}
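For example, in Global.asax.cs the factory might be used like this (a sketch; the actual startup code isn't shown in this post):

```csharp
public class MvcApplication : System.Web.HttpApplication
{
    protected void Application_Start()
    {
        // routes, bundles and DI registration happen first so the
        // Ninject kernel is resolvable via DependencyResolver
        var log = AppLogFactory.Create<MvcApplication>();
        log.Info("Application started");
    }
}
```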

Exercising the Log

First a simple test: injecting ILog into a controller ctor, ensuring it isn't null using the Ensure.That NuGet package, and logging a count around a database fetch.
public partial class HomeController : Controller
{
	private readonly IMappingService _mappingService;
	private readonly ILog _log;
	private readonly IMusicService _musicService;

	public HomeController(IMusicService musicService, ILog log, IMappingService mappingService)
	{
		_mappingService = Ensure.That(mappingService, "mappingService").IsNotNull().Value;
		_log = Ensure.That(log, "log").IsNotNull().Value;
		_musicService = Ensure.That(musicService, "musicService").IsNotNull().Value;
	}
	
	public virtual ActionResult Index()
	{
		_log.Debug("Retrieving genres");
		var genres = GetGenres().Result;
		_log.Info("Retrieved {0} music genres from the database", genres.Count);
		// ...
	}
}
To get a little more value, an action filter logs each controller action executed along with its execution time.
public class ActionTrackerAttribute : ActionFilterAttribute
{
	private Stopwatch Watch { get; set; }
	private ILog Log { get; set; }
	private ActionExecutingContext FilterContext { get; set; }

	private string ActionName
	{
		get { return FilterContext.ActionDescriptor.ActionName; }
	}

	private string ControllerName
	{
		get { return FilterContext.ActionDescriptor.ControllerDescriptor.ControllerName; }
	}

	private Uri Url
	{
		get { return FilterContext.RequestContext.HttpContext.Request.Url; }
	}

	public override void OnActionExecuting(ActionExecutingContext filterContext)
	{
		base.OnActionExecuting(filterContext);

		try
		{
			FilterContext = filterContext;
			Log = AppLogFactory.Create<ActionTrackerAttribute>();
			Log.Trace("Executing {0}.{1}", ControllerName, ActionName);
			Watch = Stopwatch.StartNew();
		}
		catch (Exception ex)
		{
			Trace.WriteLine(ex);
		}
	}

	public override void OnResultExecuted(ResultExecutedContext filterContext)
	{
		base.OnResultExecuted(filterContext);

		try
		{
			Watch.Stop();
			Log.Info("Executed {0}.{1} for {2} in {3:##0.000} second(s)", 
				ControllerName, ActionName, Url, Watch.Elapsed.TotalSeconds);
		}
		catch (Exception ex)
		{
			Trace.WriteLine(ex);
		}
	}
}
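The attribute can be applied per controller or action, or registered globally so every action gets tracked; a global registration might look like this sketch (assuming the standard FilterConfig pattern):

```csharp
public static class FilterConfig
{
    public static void RegisterGlobalFilters(GlobalFilterCollection filters)
    {
        // track trace/timing log entries for every controller action
        filters.Add(new ActionTrackerAttribute());
    }
}
```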

Troubleshooting Failed Logging

When setting up logging initially or making changes later, logging might not work. Usually I'd first fire up SQL Server Profiler and watch for calls to the log insert sproc; if I didn't see them, it was probably a configuration issue. In the nlog tag in the config file I set throwExceptions="true" internalLogLevel="Debug" internalLogFile="${basedir}/NlogInternal.log", though in practice that rarely seemed to surface anything; temporarily turning off async can also help troubleshoot.

First, when creating the log, I'd check whether the log levels were enabled as expected from the config.
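In code that check might look something like this sketch, using the level flags on the ILog interface defined earlier (the logger name is arbitrary):

```csharp
var log = AppLogFactory.Create("ConfigCheck");
// each flag reflects what the current NLog rules enable for this logger
System.Diagnostics.Debug.WriteLine(
    "Trace: {0}, Debug: {1}, Info: {2}, Warn: {3}, Error: {4}",
    log.IsTraceEnabled, log.IsDebugEnabled, log.IsInfoEnabled,
    log.IsWarnEnabled, log.IsErrorEnabled);
```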


Next I'd check the configuration in code with a small class, also useful for changing config at runtime.
public class LogManagement
{
	public static AsyncTargetWrapper GetAsyncDbWrapperTarget()
	{
		var target = (AsyncTargetWrapper)LogManager.Configuration
			.FindTargetByName("asyncDbWrapperTarget");
		return target;
	}
}
From there I'd further inspect the configuration to see if everything looks okay.
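A sketch of that inspection in code, pulling the wrapper's settings and its wrapped database target:

```csharp
var wrapper = LogManagement.GetAsyncDbWrapperTarget();
Debug.WriteLine("QueueLimit={0}, BatchSize={1}, OverflowAction={2}",
    wrapper.QueueLimit, wrapper.BatchSize, wrapper.OverflowAction);

// the database target is nested inside the async wrapper
var dbTarget = (DatabaseTarget)wrapper.WrappedTarget;
Debug.WriteLine("ConnectionStringName={0}", dbTarget.ConnectionStringName);
```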


If I did see the insert sproc call in SQL Server Profiler then I'd check DB setup and permissions and grab the SQL from Profiler...


... and paste into a query window, then format the Sql (I use Poor Man's TSql Formatter) and execute to check for errors.


Evaluating Async Logging

Because I'm paranoid, OCD, detailed and curious, I wanted to verify the async behavior and compare it to non-async. On the home page I added a link to generate a batch of log records, corresponding to this controller action.
public virtual ActionResult BatchTest()
{
	var sw = Stopwatch.StartNew();
	const int count = 10000;
	for (var i = 0; i < count; i++)
	{
		_log.Trace("Testing {0}", i);
	}
	sw.Stop();
	var msg = string.Format("{0} log invokes in {1:##0.000} seconds", count, 
		sw.Elapsed.TotalSeconds);
	_log.Info(msg);

	return new ContentResult { Content = msg };
}
The async result:


And the non-async result:


Evaluating Log Queries

Usually when inspecting log records I'd select just those columns I'm interested in and order by LogId DESC.


Once enough log data was there, it was helpful to query on the columns that would typically be filtered on and evaluate the execution plans and response times. From there, indexes were added as needed for columns such as EventLevel, Url and UserName, and execution times were compared afterwards.

Preparing for Deployment

In Web.Release.config, NLog config transformations change the log level from Trace to Info, remove the debugger output, and tone down NLog's internal issue reporting.
<nlog xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xdt:Transform="SetAttributes"
        throwExceptions="false" internalLogLevel="Warn" 
		internalLogFile="${basedir}/NlogInternal.log">
    <rules>
       <!-- Levels: Off, Trace, Debug, Info, Warn, Error, Fatal -->
       <logger name="*" minlevel="Info" writeTo="asyncDbWrapperTarget" 
	      xdt:Transform="SetAttributes" xdt:Locator="Match(name)" />
    </rules>
</nlog>
Slow Cheetah preview:
Note that removing the nlog xsd attribute in the root Web.config (and all transforms) was needed for the transformation to work correctly. Also, at one point I tried a Replace transform on the entire nlog element, but it didn't work correctly and created too much duplicated content between configurations.

Some Queries


Everything today:
SELECT *
FROM LogEvent WITH (NOLOCK)
WHERE PartitionKey = (
		SELECT DATEPART(weekday, getdate())
		)
ORDER BY logid DESC;
Errors and warnings in the past 2 hours:
SELECT *
FROM LogEvent WITH (NOLOCK)
WHERE eventlevel IN (
		'Error'
		,'Warn'
		)
	AND DATEDIFF(hh, LogDate, GETDATE()) <= 2
ORDER BY logid DESC;
Errors by day and url:
SELECT cast(logdate AS DATE) [Day]
	,url
	,count(*) ErrorCount
FROM LogEvent WITH (NOLOCK)
WHERE eventlevel = 'Error'
	AND len(url) > 0
GROUP BY cast(logdate AS DATE)
	,url
ORDER BY Day DESC
	,Url;
Errors by logger name / class name / component area:
SELECT LoggerName
	,count(*) AS ErrorCount
FROM LogEvent WITH (NOLOCK)
WHERE EventLevel = 'Error'
GROUP BY LoggerName
ORDER BY count(*) DESC;
Unhandled exceptions today:
SELECT *
FROM LogEvent WITH (NOLOCK)
WHERE PartitionKey = (
		SELECT DATEPART(weekday, getdate())
		)
	AND eventlevel = 'Error'
	AND LoggerName IN ('NLogSql.Web.Infrastructure.ErrorHandling.IErrorReporter')
ORDER BY logid DESC;

Future Enhancements / Considerations

  • Web view to read, search, filter log maybe with some SignalR goodness
  • AppId type column if logging from multiple apps w/o desire for multiple log databases
  • Possibly a separate error table that the log table could link to for errors with the full error report split up

The Code

The code for this series is available at https://github.com/thnk2wn/NLogSql.Web
Monday, Oct 01, 2012

Using T4 to Generate Typesafe Enum Classes and Resource Files

In working on the domain layer of an application, I wrote a couple of typesafe enumeration classes that mirrored data in reference tables in a database. If you are not familiar with the pattern, Jimmy Bogard's Enumeration classes post explains the pattern and rationale. The classes I wrote inherit from a modified version of the Enumeration class presented in Jimmy's post. The basic idea was to avoid switch statements and provide more functionality than an enum could offer. In our case the enum members mirrored a reference table and would be used for domain logic and for dropdown lists.

As the app progressed, we needed to do this for more reference tables, some of which had many records. At that point we decided to look into T4 to generate the typesafe enum classes. Since this application has a worldwide audience, the display names of the enum members needed globalization consideration, as some of the enum values would end up in dropdown lists displayed to end users. I decided to create two T4 files: one to generate resource files with display name strings, and another to generate the enum classes that would use the resource strings.
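The Enumeration base class itself isn't shown in this post; a minimal sketch along the lines of Jimmy Bogard's pattern (the app's actual class is a modified version of something like this):

```csharp
using System;
using System.Linq;
using System.Reflection;

// Minimal typesafe enum base class: each derived type declares its members
// as public static readonly fields carrying a value and a display name.
public abstract class Enumeration
{
    protected Enumeration() { }

    protected Enumeration(int value, string displayName)
    {
        Value = value;
        DisplayName = displayName;
    }

    public int Value { get; private set; }
    public string DisplayName { get; private set; }

    // Look up a member by value, e.g. Enumeration.FromValue<ContactType>(2)
    public static T FromValue<T>(int value) where T : Enumeration
    {
        var match = typeof(T)
            .GetFields(BindingFlags.Public | BindingFlags.Static)
            .Select(f => f.GetValue(null))
            .OfType<T>()
            .FirstOrDefault(e => e.Value == value);

        if (match == null)
            throw new ApplicationException(
                string.Format("'{0}' is not a valid value for {1}", value, typeof(T)));

        return match;
    }

    public override string ToString()
    {
        return DisplayName;
    }
}
```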

Reference Datasource

This application uses XML files for various database definitions including reference data; these files get parsed to build the SQL Server database in automated fashion.

A sample of reference_data.xml looks something like:
<referenceData>
	<refTable name="CONTACT_TYPE" alias="ContactType" default="Person">
		<record CONTACT_TYPE_ID="1" CONTACT_TYPE_NAME="Company" RANK="2">
			<langs field="CONTACT_TYPE_NAME">
				<lang name="en">Company</lang>
				<lang name="de">Gesellschaft</lang>
				<lang name="es">Empresa</lang>
				<lang name="zh-CN">公司</lang>
				<lang name="fr">Société</lang>
			</langs>
		</record>
		<record CONTACT_TYPE_ID="2" CONTACT_TYPE_NAME="Person" RANK="1">
			<langs field="CONTACT_TYPE_NAME">
				<lang name="en">Person</lang>
				<lang name="de">Person</lang>
				<lang name="es">Persona</lang>
				<lang name="zh-CN">人</lang>
				<lang name="fr">Personne</lang>
			</langs>
		</record>
	</refTable>
</referenceData>

Schema Metadata and Parsing

Since multiple projects needed to perform T4 code generation off database schema information, I defined some T4 include files in a shared project to hold the schema structure and XML parsing logic. This made the T4 code simpler and prevented duplicating XML parsing logic across multiple T4 files.

First SchemaMetadata.ttinclude defines the classes that hold the schema information in a T4-friendly manner:


Next SchemaReader.ttinclude would get consumed by T4 files to parse the XML data and return friendly objects defined in SchemaMetadata.ttinclude:


This shared T4 folder also included MultipleOutputHelper.ttinclude from this Damien Guard post to make splitting T4 output into multiple files a bit easier.

RefDataResources.tt

This T4 file generates one resource file per language and sets up the custom tool to produce the designer-generated file that references the resources in code:

Inside the function block of this file the schema data is loaded and the resource files are generated:
const string ReferenceData = "ReferenceData";

private SchemaMetadata LoadSchema()
{
	var loader = new SchemaReader(Host);
	var schema = loader.Load(SchemaReader.LoadOption.ReferenceData);
	return schema;
}

private void GenerateResourceFiles(SchemaMetadata schema, Manager manager, 
	DirectoryInfo t4DirInfo)
{
	var distinctLanguages = schema.ReferenceData.DistinctLanguages;
	foreach (var lang in distinctLanguages)
	{
		var resxNameNoExt = ("en" != lang) ? ReferenceData + "." + lang : ReferenceData;
		var resxName = resxNameNoExt + ".resx";
		manager.StartNewFile(resxName);

		var resxFilename = Path.Combine(t4DirInfo.FullName, resxName);
		// use .net's ResXResourceWriter so we don't have to worry about the XML format
		using (ResXResourceWriter  resx = new ResXResourceWriter(resxFilename))
		{
			var strings = schema.ReferenceData.StringsForLanguage(lang);

			foreach (var warn in schema.Warnings)
			{
				base.Warning(warn);
			}

			foreach (var de in strings)
			{
				try 
				{
					resx.AddResource(de.Key, de.Value);
				}
				catch (Exception ex)
				{
					base.Warning(ex.ToString());
				}
			}

			resx.Generate();
			resx.Close();
		}

		// we've written the file, but outside the T4 process.
		// To get it automatically added as a new output file
		// underneath the .tt file, the generated content must be written to the output stream
		Write(File.ReadAllText(resxFilename));

		manager.EndBlock();
		
	} // end for each lang loop
}

In the above code block there are a couple of things worth pointing out. First, the filename of the main / default language resource file will be ReferenceData.resx for English (en), and ReferenceData.{lang}.resx (e.g. ReferenceData.de.resx) for other languages. Second, the output from .NET's ResXResourceWriter gets read back in with File.ReadAllText and written to the T4 output with Write; otherwise the generated content would just exist on disk and would not get added to the project nested under the T4 file.

Finally, to get the designer-generated class created, a function sets the custom tool property on the main ReferenceData.resx file that was generated. For the initial add that would be enough. However, we also invoke the custom tool with RunCustomTool() to handle the case where reference data is modified later on and the T4 transformation is performed again:
private void SetCustomToolOnMainResourceFile(DirectoryInfo t4DirInfo)
{
	// WARNING: You are entering the dark land of EnvDTE COM. You've been warned
	var hostServiceProvider = (IServiceProvider) Host;
	var dte = (EnvDTE.DTE) hostServiceProvider.GetService(typeof(EnvDTE.DTE));
	var filename = Path.Combine(t4DirInfo.FullName, "ReferenceData.resx");
	var projectItem = dte.Solution.FindProjectItem(filename);
	projectItem.Properties.Item("CustomTool").Value = "PublicResXFileCodeGenerator";
	projectItem.Properties.Item("CustomToolNamespace").Value = "App.Domain";
	var projItemObj = (VSProjectItem)projectItem.Object;
	projItemObj.RunCustomTool();
}

Running the T4 produces the following files:


The resource strings are created with a TableName_FieldNameValue format:


Enums.tt

Enums.tt reads the schema data just as in RefDataResources.tt and generates the C# typesafe enum class. The full source is available with the sample code for this post.


A small example of the generated output (ContactType.generated.cs):
//------------------------------------------------------------------------------
// <auto-generated>
//     This code was generated from a template.
//
//     Changes to this file may cause incorrect behavior and will be lost if
//     the code is regenerated.
// </auto-generated>
//------------------------------------------------------------------------------
using System;
using System.CodeDom.Compiler;
using System.Diagnostics;
using System.Diagnostics.CodeAnalysis;
using JetBrains.Annotations;

namespace App.Domain
{
	public partial class ContactType : Enumeration
	{		
		/// <summary>
		/// ContactType of Company (ID: 1, "Company")
		/// </summary>
		[GeneratedCode("TextTemplatingFileGenerator", "11")]
		public static readonly ContactType Company 
			= new ContactType( 1, ReferenceData.ContactType_Company );

		/// <summary>
		/// ContactType of Person (ID: 2, "Person")
		/// </summary>
		[GeneratedCode("TextTemplatingFileGenerator", "11")]
		public static readonly ContactType Person 
			= new ContactType( 2, ReferenceData.ContactType_Person );


		[ExcludeFromCodeCoverage, DebuggerNonUserCode, GeneratedCode("TextTemplatingFileGenerator", "11")]
		private ContactType( int value, string displayName ) : base( value, displayName ) { }

		[ExcludeFromCodeCoverage, DebuggerNonUserCode, GeneratedCode("TextTemplatingFileGenerator", "11")]
		public static ContactType Default()
		{
			return Person;
		}

		[UsedImplicitly, Obsolete("ORM runtime use only")]
		[ExcludeFromCodeCoverage, DebuggerNonUserCode, GeneratedCode("TextTemplatingFileGenerator", "11")]
		private ContactType() { }
	}
}
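Consuming the generated class then looks like this sketch (DisplayName and FromValue come from the Enumeration base class):

```csharp
// Typesafe, localized, and no switch statements needed
var contactType = ContactType.Default();              // ContactType.Person
var company = Enumeration.FromValue<ContactType>(1);  // ContactType.Company
var label = company.DisplayName;                      // localized via ReferenceData resources
```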

Of course there's more value in automatically generating all of the reference enums or at least those with more members:


Testing It Out

First some basic tests just to ensure the resource files and strongly typed resource class generated correctly:
using System.Globalization;
using NUnit.Framework;
using System.Threading;

namespace App.Domain.Tests
{
    [TestFixture, Category("Unit")]
    public class ReferenceDataI18NTests
    {
        [Test]
        public void English_To_French_Strings_Change()
        {
            var orig = Thread.CurrentThread.CurrentUICulture;
            Assert.AreEqual("Company", ReferenceData.ContactType_Company);
            Thread.CurrentThread.CurrentUICulture = new CultureInfo("fr-FR");
            Assert.AreEqual("Société", ReferenceData.ContactType_Company);
            Thread.CurrentThread.CurrentUICulture = orig;
        }

        [Test]
        public void English_To_Spanish_Strings_Change()
        {
            var orig = Thread.CurrentThread.CurrentUICulture;
            Assert.AreEqual("Company", ReferenceData.ContactType_Company);
            Thread.CurrentThread.CurrentUICulture = new CultureInfo("es");
            Assert.AreEqual("Empresa", ReferenceData.ContactType_Company);
            Thread.CurrentThread.CurrentUICulture = orig;
        }

        [Test]
        public void English_To_German_Strings_Change()
        {
            var orig = Thread.CurrentThread.CurrentUICulture;
            Assert.AreEqual("Company", ReferenceData.ContactType_Company);
            Thread.CurrentThread.CurrentUICulture = new CultureInfo("de");
            Assert.AreEqual("Gesellschaft", ReferenceData.ContactType_Company);
            Thread.CurrentThread.CurrentUICulture = orig;
        }

        [Test]
        public void English_To_Chinese_Strings_Change()
        {
            var orig = Thread.CurrentThread.CurrentUICulture;
            Assert.AreEqual("Company", ReferenceData.ContactType_Company);
            Thread.CurrentThread.CurrentUICulture = new CultureInfo("zh-CN");
            Assert.AreEqual("公司", ReferenceData.ContactType_Company);
            Thread.CurrentThread.CurrentUICulture = orig;
        }
    }
}

Just to make sure the Enumeration class works as expected, some tests to exercise it through one of the concrete classes:
using System.Linq;
using FluentAssertions;
using NUnit.Framework;

namespace App.Domain.Tests
{
    [TestFixture(
        Description = "Tests base Enumeration class via PhoneType concrete class")]
    [Category("Unit")]
    public class EnumerationTests
    {
        [Test]
        public void FromValue_Matches_Type_Value()
        {
            var fromValue = Enumeration.FromValue<PhoneType>(PhoneType.Cell.Value);
            Assert.AreEqual(PhoneType.Cell, fromValue);

            fromValue = Enumeration.FromValue<PhoneType>(PhoneType.Voice.Value);
            Assert.AreEqual(PhoneType.Voice, fromValue);
        }

        [Test]
        public void FromDisplayName_Matches_Type_Name()
        {
            var fromName = Enumeration.FromDisplayName<PhoneType>(PhoneType.Fax.DisplayName);
            Assert.AreEqual(PhoneType.Fax.DisplayName, fromName.DisplayName);

            fromName = Enumeration.FromDisplayName<PhoneType>(PhoneType.Pager.DisplayName);
            Assert.AreEqual(PhoneType.Pager.DisplayName, fromName.DisplayName);
        }

        [Test]
        public void ToString_Equals_DisplayName()
        {
            Assert.AreEqual(PhoneType.Cell.DisplayName, PhoneType.Cell.ToString());
        }

        [Test]
        public void Absolute_Difference_Math_Is_Correct()
        {
            var diff = Enumeration.AbsoluteDifference(PhoneType.Cell, PhoneType.Voice);
            Assert.AreEqual(3, diff);
            diff = Enumeration.AbsoluteDifference(PhoneType.Voice, PhoneType.Cell);
            Assert.AreEqual(3, diff);
        }

        [Test]
        public void GetAll_Contains_Expected_Members()
        {
            var all = Enumeration.GetAll<PhoneType>().ToList();
            Assert.AreEqual(4, all.Count);
            Assert.NotNull(all.FirstOrDefault(x => x == PhoneType.Cell));
            Assert.NotNull(all.FirstOrDefault(x => x == PhoneType.Fax));
            Assert.NotNull(all.FirstOrDefault(x => x == PhoneType.Pager));
            Assert.NotNull(all.FirstOrDefault(x => x == PhoneType.Voice));
            Assert.NotNull(all.FirstOrDefault(x => x == PhoneType.Default()));
        }

        [Test]
        public void Equality_Two_Are_Equal()
        {
            var one = PhoneType.Cell;
            var two = PhoneType.Cell;
            Assert.AreEqual(one, two);
            Assert.True(one == two);
        }

        [Test]
        public void Equality_Two_Different_Not_Equal()
        {
            var one = PhoneType.Cell;
            var two = PhoneType.Voice;
            Assert.AreNotEqual(one, two);
            Assert.True(one != two);
        }

        [Test]
        public void Equality_One_Null_Not_Equal()
        {
            PhoneType.Cell.Equals(null).Should().BeFalse();
        }

        [Test]
        public void Compare_To_Succeeds()
        {
            var diff = PhoneType.Cell.CompareTo(PhoneType.Fax);
            diff.Should().Be(1);

            diff = PhoneType.Fax.CompareTo(PhoneType.Cell);
            diff.Should().Be(-1);
        }

        [Test]
        public void Invalid_Parse_Number_Throws()
        {
            Assert.That(() => { Enumeration.FromValue<PhoneType>(999); }, Throws.Exception);
        }
    }
}

There's More

  • For simplicity some complexity was removed from the T4. This includes reference tables that have foreign keys to other reference tables (including strongly typed properties and ctor params) and resource strings for fields other than the primary display name.
  • One thing that bothered me initially was that it felt a bit like introducing Data Driven Design into an otherwise Domain Driven Design approach. This was offset somewhat with aliases for table names (or a naming convention pattern), and reference tables could be selectively excluded from code generation through attributes or other means. Another possible issue is class name conflicts with existing types in the domain project; this could be offset with a different namespace and/or naming convention.
  • Many apps load reference data from the database each time it's needed, or load it once and cache it until invalidated. Other apps may let users edit select sets of reference type data. If the data is likely to change during an app session or if users can edit some of it, chances are it isn't truly reference data to begin with. With a good deployment process, any compiled reference data info can be easily deployed and various reference data is likely to be tied to app business and presentation rules anyway.
  • Several of the enum classes would have corresponding partial classes for extended logic. For this reason, various code generation type attributes were placed directly on generated members and not on the class itself, per various guidance.
  • An example use in the domain would be a ContactType enum property on a Contact object, requiring different fields when attempting to add contacts of different types. This app uses a component mapping in Fluent NHibernate to map the data from the database reference table into the domain class. On the UI side, Enumeration.GetAll can be used along with AutoMapper to get the id and text values into simple ViewModel types for select lists.
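As a concrete illustration of that last point, the select-list projection might look something like the following. ContactTypeItemModel is a hypothetical ViewModel, and a plain LINQ projection is shown here in place of the AutoMapper configuration:

```csharp
// Hypothetical sketch: project Enumeration members into a simple
// ViewModel for a select list (ContactTypeItemModel is illustrative).
var contactTypeItems = Enumeration.GetAll<ContactType>()
	.Select( ct => new ContactTypeItemModel { Id = ct.Value, Text = ct.DisplayName } )
	.ToList();
```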

Code

T4RefDataCode.zip
Friday
Aug 3, 2012

AutoMapper's ForMember

Calling AutoMapper's ForMember has been bugging me lately because of having to deal with its member configuration options, like so:
Mapper.CreateMap<Contact, AddressBookDetailsModel>()
	.ForMember( d => d.PhoneNumbers, o => o.MapFrom( s => s.Phones ) );
The member configuration options provide much more than just MapFrom, but 99 times out of 100 MapFrom is all I'm dealing with. What I'd really like is something like this:
Mapper.CreateMap<Contact, AddressBookDetailsModel>()
	.ForMember( d => d.PhoneNumbers ).MapFrom( s => s.Phones );
In looking around I did find http://trycatchfail.com/blog/post/A-More-Fluent-API-For-AutoMapper.aspx. It looked promising but (a) the code was incomplete and (b) it broke down for more complex mappings.

I'd like the syntax above, but for now I went with something quicker to implement that still gives a shorter syntax, even if it's not quite as syntactically sweet:
using System;
using System.Linq.Expressions;
using AutoMapper;

public static class AutoMapperExtensions
{
	public static IMappingExpression<TSource, TDestination> MapItem<TSource, TDestination, TMember>(
		this IMappingExpression<TSource, TDestination> target,
		Expression<Func<TDestination, object>> destinationMember,
		Expression<Func<TSource, TMember>> sourceMember )
	{
		return target.ForMember( destinationMember, opt => opt.MapFrom( sourceMember ) );
	}
}
Now when creating maps I don't need to fuss with an extra lambda and method call:
				
Mapper.CreateMap<Contact, AddressBookDetailsModel>()
	.MapItem( d => d.PhoneNumbers, s => s.Phones );
I can do something similar to simplify Ignore calls:
		
public static class AutoMapperExtensions
{
	public static IMappingExpression<TSource, TDestination> Ignore<TSource, TDestination>(
		this IMappingExpression<TSource, TDestination> target,
		Expression<Func<TDestination, object>> destinationMember)
	{
		return target.ForMember( destinationMember, opt => opt.Ignore() );
	}
	
	/* other code removed for brevity */
}
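Usage then reads the same way as MapItem. In this hedged example, Notes is a hypothetical destination member standing in for any property that shouldn't be mapped:

```csharp
// Hypothetical mapping; Notes is an illustrative destination member
// that we want AutoMapper to skip.
Mapper.CreateMap<Contact, AddressBookDetailsModel>()
	.MapItem( d => d.PhoneNumbers, s => s.Phones )
	.Ignore( d => d.Notes );
```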