Friday, May 25, 2007

Domain Driven Design - Why we need to keep it simple

I'm really happy to see how much discussion there is of DDD these days and it really helps that technologies like NHibernate make it work. I'm also hopeful that ADO.NET Entity Framework will support this style of development whilst also addressing lingering issues, including how your domain model can live happily in a system that has lots of reports or a legacy database.

However, the content of some of the discussions is starting to surprise me: instead of discussing real domain modeling issues, many seem to want to focus on technology.

The first thing I should say is that discussing DDD without discussing persistence, usually ORM, is silly. Persistence is pervasive in many systems so you need to discuss how you will handle it.

However, the same does not go for things like AOP; it's perfectly possible to discuss DDD without going near it. In fact I would say that it is preferable to avoid bringing these complex options into every discussion of DDD, for a few reasons:

  1. It will turn off too many readers.
  2. The more complex technologies are sometimes not required.
  3. There is a risk that many people will end up with overly complex solutions if they start out with the view that they have to use something like Spring.NET to make DDD a reality.

Having said that, it's obviously good that we have the option to use things like AOP, but I personally prefer to avoid them until I really feel they are needed.


Thursday, May 24, 2007

Sandcastle

Just tried out Sandcastle and it seems to have progressed a long way. After downloading it I'd recommend you look at the wiki, not least because it points to some GUIs for it, which you'll need as the normal procedure for generating the help is a bit complex. I downloaded the NDoc-style one, which seemed quite good.

The help file it produces looks quite professional, but generation was fairly slow considering how small the codebase is, though I seem to remember NDoc wasn't too rapid either.


Tuesday, May 22, 2007

Spec#

Great podcast about Spec#; there is also a link that you can use to ask Microsoft to add the features to C#.

Until the features are in C# I don't think I'll be looking at using them but they do sound useful.


Sunday, May 20, 2007

Problems With Team System

The Cost

I don't mind Microsoft charging large amounts for VSTS for developers; it's par for the course. However, there are some things I do resent:

  1. To do basic things like maintain the (AWFUL) VSMDI file or view code coverage results you seem to need to have the tester edition.
  2. If your users want to interact with Team System, using the portal or TeamPlain, they need expensive CALs.
  3. There is no longer any database support included, so you also need the Team System edition for database developers.

To me it's the first two that are the real issue, and they really should be dealt with.

Feedback

I've found that the quality of Team System isn't that great, but I guess you expect that from the first version of such a large project. Features that we consider key, such as unit testing and check-in policies, just do not work as well as we'd expect.



However, what bothers me is that when I log these issues using Microsoft Connect all I get is "cannot reproduce". I can't argue with that, but there seem to be a few problems:

  1. Far from fixing the issues I've reported, SP1 seems to have introduced new ones.
  2. When you do hit problems you get very vague error messages that give you no chance to fix things yourself and that don't provide Microsoft with the information they need to fix them either. Take the date/time related error message that I get when my testing check-in policy fails (which it does all the time).
  3. Very old issues like this and this have been reported by several people, but Microsoft make no attempt to indicate whether they are doing anything about them.
As an example of an error message...

[screenshot of the vague error dialog omitted]

...needless to say the related issue was immediately marked as not reproducible.

VSMDI File
I hate it; it's a maintenance nightmare that we have to keep around because the check-in policies and code coverage functionality need it.

Release Cycle
With open source software you tend to get releases relatively regularly and on a good project each release improves the quality. I'm not sure if I would say this happens with VSTS.

Documentation
The online documentation is quite often very poor and you end up looking at the odd blog entry to get information. As an example we wanted to have one of our team system builds do the following:
  1. Get the database scripts
  2. Build them
  3. Run the integration tests

Simple, you might think. So how do we get the database scripts? Well, we quickly identified that we needed to use CoreGet. Do a search on MSDN and you get this page with this example:

[example from MSDN omitted here]

"What?", we thought. We can't specify a path within the team project to get, so we couldn't specify that we only wanted the "CrmDatabase" within the "Database" team project.

Our search continued and eventually we found this blog entry. Superb, just what we needed. So why didn't MSDN provide us with this information?

To be honest this is no isolated case; in general I'd say that 80% of the time we need Team System information we find it in blogs rather than on MSDN. Not ideal.

Open Source
Since we have so many problems getting testing and code coverage to work well in Team System I'm wondering why any company evaluating VSTS would choose it over open source.

I guess the whole integrated environment idea is a good one but when many of the integrated features don't work particularly well, and when each of them is expensive, and when the open source alternatives are in most cases at least as good...


Tuesday, May 08, 2007

BDD

Brian Button's blog had a link to an excellent video on BDD by Dave Astels.

I wasn't familiar with the term, but I really enjoyed Astels' book on TDD so hearing about BDD was very interesting. BDD just seems to be a refocusing on doing TDD as it was supposed to be done. I have a few issues with the idea though:

  1. We already have a lot of acronyms; I'm not entirely sure a new one will help, especially because the difference between TDD and BDD is quite subtle.
  2. TDD serves several purposes, one of which is testing. Sure, it's testing against the contracts, but the tests themselves are very useful when you redesign and refactor.

I definitely agree with a lot of it, especially things like Should/Can based test naming and not mapping your test classes one-to-one with the classes being tested.
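
To make the naming point concrete, here's a minimal sketch (using NUnit and an Order class I've made up purely for illustration) of the kind of behaviour-focused naming I mean, with the fixture named after a context rather than matching a production class one-to-one:

    using NUnit.Framework;

    // Hypothetical domain class, used only to illustrate the naming style.
    public class Order
    {
        private readonly System.Collections.Generic.List<string> lines =
            new System.Collections.Generic.List<string>();

        public void AddLine(string product) { lines.Add(product); }
        public bool CanBeSubmitted { get { return lines.Count > 0; } }
    }

    // The fixture describes a context/behaviour, not a class under test.
    [TestFixture]
    public class When_an_order_has_no_lines
    {
        [Test]
        public void Should_not_be_submittable()
        {
            Order order = new Order();
            Assert.IsFalse(order.CanBeSubmitted);
        }

        [Test]
        public void Can_be_submitted_once_a_line_is_added()
        {
            Order order = new Order();
            order.AddLine("Widget");
            Assert.IsTrue(order.CanBeSubmitted);
        }
    }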

Having said that, it's a big step from "doing TDD well" to calling it a completely different concept. Anyway, the video itself is superb.


Monday, May 07, 2007

NHibernate and Coarse Grained Locking

One of the requirements of the aggregate pattern is that the aggregate has a root and a boundary and that the aggregate as a whole is the unit that we consider when looking at concurrency.

This has bothered me for a while, as NHibernate's concurrency is based on a version column at the row level, which is a completely different way of doing things.

For example, if a Customer has a CustomerIdentification and both are in one row then we're OK: modifying either and saving the Customer will cause NHibernate to check that the row hasn't changed since we loaded it.

But let's say that the CustomerIdentification is in its own table. Now if we modify the CustomerIdentification and save, the save will succeed even if someone else has modified the Customer in the meantime. That can cause problems (see Eric Evans' book on DDD for concrete examples of the issues it can cause).
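
To spell out the row-level behaviour I mean, here's a rough sketch assuming a Customer entity mapped with an NHibernate <version> column (the entity, mapping and session factory setup are all illustrative, not lifted from a real project):

    using NHibernate;

    // Illustrative entity; Version is assumed to be mapped with <version>.
    public class Customer
    {
        private int id;
        private int version;
        private string name;

        public virtual int Id { get { return id; } set { id = value; } }
        public virtual int Version { get { return version; } set { version = value; } }
        public virtual string Name { get { return name; } set { name = value; } }
    }

    public class RowLevelConcurrencyExample
    {
        // sessionFactory is assumed to be configured elsewhere.
        public static void Demonstrate(ISessionFactory sessionFactory, int customerId)
        {
            Customer stale;
            using (ISession session = sessionFactory.OpenSession())
            {
                stale = (Customer)session.Get(typeof(Customer), customerId);
            }

            // A second user loads and updates the same Customer row,
            // which bumps the version column on that row.
            using (ISession session = sessionFactory.OpenSession())
            using (ITransaction tx = session.BeginTransaction())
            {
                Customer other = (Customer)session.Get(typeof(Customer), customerId);
                other.Name = "Changed by someone else";
                tx.Commit();
            }

            // When the first user saves their stale copy, the row-level
            // version check fails and NHibernate throws a
            // StaleObjectStateException on commit.
            using (ISession session = sessionFactory.OpenSession())
            using (ITransaction tx = session.BeginTransaction())
            {
                stale.Name = "Changed by me";
                session.Update(stale);
                tx.Commit();
            }
        }
    }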

I looked it up in Nilsson's "Applying Domain-Driven Design and Patterns" and was in luck, as page 127 had the following:

"...an Order including its OrderLines is a concurrency conflict detection unit of it's own or - more conceptually - an Aggregate...".

He then goes on to say that because the aggregate is the unit as far as concurrency goes, we don't need to worry about other users interfering. Exactly what we need. I have read the book before, so I was surprised that I'd missed a solution to this if it had one, so I read on; page 309 indicates two things:

  1. We want to use coarse-grained locks at the aggregate root level.
  2. We want to use a version field on the aggregate roots.

The second bit confused me, but it's only for optimistic offline locking (though I would have thought you'd use that for all objects or for none). It doesn't seem to get me any closer to actually getting coarse-grained locking working though, so I skipped ahead; page 341 covers how NHibernate works with concurrency, and it tells me that NHibernate doesn't give us coarse-grained locking. And that's all: no more reference to how we can get this important capability, even though the earlier examples assumed that we had a way of doing it. Back on the shelf for you.

Having said that, I was particularly interested in the discussion of coarse-grained locking of aggregates in Fowler's excellent "Patterns Of Enterprise Application Architecture". On page 439 the idea of a "root lock" is introduced: the root of the aggregate would have a lock and all parts of the aggregate would use it.

The book points out that for this to work we need to be able to navigate from any part of the aggregate back to the root, either indirectly (with each part holding a reference to its "parent") or directly (each part holding a reference to the root).
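
To make the shape of that concrete, here's a rough sketch of the kind of thing I understand Fowler to mean, assuming each part holds a reference back to the root and the root carries the version that NHibernate checks (the names and the LastModified trick are mine, not from the book, and I haven't actually wired this up):

    using System;

    // Aggregate root: owns the version that acts as the "root lock".
    public class Customer
    {
        private int version;            // assumed to be mapped as the NHibernate <version> column
        private DateTime lastModified;  // mapped column; touching it dirties the root's row

        public virtual int Version
        {
            get { return version; }
            set { version = value; }
        }

        public virtual DateTime LastModified
        {
            get { return lastModified; }
            set { lastModified = value; }
        }

        // Every change anywhere in the aggregate funnels through here, so the
        // root's row is updated (and its version checked and bumped) on commit.
        public virtual void MarkModified()
        {
            lastModified = DateTime.Now;
        }
    }

    // Part of the aggregate: coupled back to the root, which is exactly the
    // coupling (and extra persisted relationship) that bothers me.
    public class CustomerIdentification
    {
        private Customer root;
        private string passportNumber;

        public CustomerIdentification(Customer root)
        {
            this.root = root;
        }

        protected CustomerIdentification() { } // for NHibernate

        public virtual string PassportNumber
        {
            get { return passportNumber; }
            set
            {
                passportNumber = value;
                root.MarkModified(); // modifying a part versions the root
            }
        }
    }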

For me, for now, this is too much: having to couple each part of the aggregate directly to the root is too much, and even coupling each part to its parent is troublesome, not least because to ensure that relationship survives reloading from persistence I'm going to need to store it in the database. Unless I'm missing something obvious, which is quite possible.

For now I'll forget it :)


Saturday, May 05, 2007

TypeMock / Rhino.Mocks - Designing for testability

On the project I work on we use a combination of Rhino.Mocks and TypeMock.

I'd used TypeMock a lot in the past and found it to be superb, and I have always been skeptical about the advantages of designing for testability when dealing with domain classes. TypeMock also helped me avoid having to look at IoC, simplifying my domain and the related infrastructure code.

However, I decided to give Rhino Mocks a shot and I did find it useful, if you are prepared to accept the limitation that it can only mock interfaces, virtual methods and delegates.

In general I decided to favor TypeMock and only use Rhino Mocks where it fitted the design I wanted; for example, I use Rhino Mocks when mocking out dependencies on infrastructure.

There's a lot of debate about the use of TypeMock though, some people seem to disagree with it:

http://vadim-net.blogspot.com/2007/04/typemock-too-powerful-to-use.html
http://weblogs.asp.net/rosherove/archive/2007/04/26/choosing-a-mock-object-framework.aspx
http://www.mockobjects.com/2007/03/stop-designing-for-testability.html

There are also a few people who seem to think it is worth using though:

http://www.paraesthesia.com/blog/comments.php?id=1086_0_1_0_C
http://www.elilopian.com/2007/02/26/object-oriented-testable-designed-you-must-be-out-of-your-mind/

It’s an interesting discussion and I guess your take on it is affected by many factors including the types of systems you work on and where in the code base you’re working.

As an example, in our application we have persistence ignorant domain assemblies (Customer/Order) and separate persistence assemblies containing the repositories (CustomerRepository/OrderRepository). Nine times out of ten we can ensure our domain assemblies never need to access repositories by careful use of NHibernate's features, making the domain classes very easy to unit test.

However, in some cases the domain objects do need to call out to the repositories; what should we do then? We need to make sure that we maintain the ability to test the domain simply. We could have looked at IoC, but instead I went for a simpler Registry pattern based approach. You pass a RepositoryFactory into this registry during initialization (constructor injection), and the interfaces of the repositories (though not the implementations) live in the domain assembly (Separated Interface).

This works fine, and because the repositories and the factory implement interfaces I can use Rhino Mocks. Superb: I can create a mock repository factory, pass it to the registry, and have that factory hand out mock repositories.
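
As a sketch of what I mean (the interface and class names here are illustrative rather than our real ones, and the record/replay calls are the Rhino Mocks style as I remember it):

    using NUnit.Framework;
    using Rhino.Mocks;

    // Illustrative types; the real interfaces live in our domain assembly
    // (Separated Interface), with implementations in the persistence assembly.
    public class Customer { }

    public interface ICustomerRepository
    {
        Customer GetById(int id);
    }

    public interface IRepositoryFactory
    {
        ICustomerRepository CreateCustomerRepository();
    }

    [TestFixture]
    public class RepositoryMockingTests
    {
        [Test]
        public void Factory_and_repository_can_both_be_mocked()
        {
            MockRepository mocks = new MockRepository();
            IRepositoryFactory factory = mocks.CreateMock<IRepositoryFactory>();
            ICustomerRepository repository = mocks.CreateMock<ICustomerRepository>();
            Customer expected = new Customer();

            // Record: the registry asks the factory for a repository, and the
            // domain code asks that repository for customer 42.
            Expect.Call(factory.CreateCustomerRepository()).Return(repository);
            Expect.Call(repository.GetById(42)).Return(expected);
            mocks.ReplayAll();

            // In a real test the factory would be handed to our registry and the
            // domain code exercised; here the calls are shown directly for brevity.
            ICustomerRepository fromFactory = factory.CreateCustomerRepository();
            Assert.AreSame(expected, fromFactory.GetById(42));

            mocks.VerifyAll();
        }
    }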

This is fine and is a good example of designing for testability. However, we also have cases where one domain object (A) uses another complex domain object (B). When testing A we want to mock B, but we're happy with the design as it is, so we don't want to start shoving interfaces onto B or introducing dependency injection just for the tests. In those cases I turn to TypeMock.
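
By way of illustration, and going from memory of TypeMock's reflective API (so treat the exact calls as an assumption rather than a reference), mocking a concrete B that has no interface looks something like this:

    using NUnit.Framework;
    using TypeMock;

    // Illustrative concrete domain classes; neither implements an interface.
    public class RiskCalculator                     // "B"
    {
        public decimal CalculateRisk()
        {
            // stands in for expensive/complex domain logic
            return 0m;
        }
    }

    public class Order                              // "A"
    {
        public bool RequiresApproval()
        {
            RiskCalculator calculator = new RiskCalculator();
            return calculator.CalculateRisk() > 100m;
        }
    }

    [TestFixture]
    public class OrderApprovalTests
    {
        [Test]
        public void Orders_with_high_risk_require_approval()
        {
            MockManager.Init();

            // The next RiskCalculator created (inside Order) is faked,
            // with no interfaces or injection needed.
            Mock calculatorMock = MockManager.Mock(typeof(RiskCalculator));
            calculatorMock.ExpectAndReturn("CalculateRisk", 150m);

            Assert.IsTrue(new Order().RequiresApproval());

            MockManager.Verify();
        }
    }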


NDepend Podcast

Excellent podcast about NDepend.

It definitely seems like NDepend has moved on a long way from when I originally used it, since it was good then I can only imagine what you can do with it now.

One unfortunate thing is that Robert Martin's .NET book, on which a lot of the package dependency analysis is based, does not seem to mention NDepend, and the NDepend website makes little mention of Robert Martin.

It seems like the book and the tool could each do with referencing the other, to help get word of their existence out to the masses.


Thursday, May 03, 2007

First Impressions Of Validation Application Block with DDD

I would highly recommend the Enterprise Library drill down webcasts:

http://blogs.msdn.com/tomholl/archive/2007/04/25/enterprise-library-3-0-drill-down-webcasts.aspx

I've just watched the Validation Application Block one and it left me with a few questions. It looks good, but I'm not sure how well it works in practice for proper domain model based development. In particular, a few things worried me:


  1. Primitive Validation - The view seems to be that the primitive validators supplied with the application block will deal with most cases, but I'm not sure this is true... I think I'll end up creating a lot of custom rules just as I do now (there's a rough sketch of the attribute-based usage after this list).
  2. Configuration Flexibility - The configuration flexibility might be useful in some cases, associating the rules with the domain objects in XML configuration where appropriate. However, the video was slightly confusing on how they were advising we use this. David was saying start with attributes and then move to a configuration based approach once your domain model was "set". Tom was saying use configuration if the rules are likely to be flexible or if you don't have the code that you want to apply the rules to. Personally I think Tom's approach seemed more appropriate for the type of development I do.
  3. Attributes v Configuration Files - The whole Ruleset approach seems to be a bit basic. It seems odd that rulesets are not considered the default; lots of companies are applying domain models/layers where that layer is written in C#/Java and the rules are written in those languages. Business rules engines are not the only way to do it, and the offhand way it was discussed left me a little worried. In our project most of our domain objects do have some inbuilt states and the validation rules are based on that state, so I'm not sure Rulesets give us enough flexibility.
  4. Self validation - Where it's necessary you can put validation methods into the classes being validated. Evans deals with the issues with such an approach on p. 224 of his book, the obvious one being that if you go down that path your domain objects will be full of rule code. To me a Specification based approach is far better, and so I would imagine we'd be using CustomValidators instead.
  5. Custom Validators - Getting more into a Specification style approach. Your rules must derive from a supplied base class; these classes look quite like the implementations of our IDomainRule interface, all very simple. You also need to write an attribute; again it's simple enough, and if you don't use this approach you'd need to write your own code to ensure validation rules are applied.
  6. Policy Injection App Block Integration - The example integrating the Policy Injection Application Block seemed to me to result in very ugly code; attributes like that on every argument would not be nice at all. I like the declarative approach, but a one-line call to something like ArgumentValidation.ThrowIfOutsideRange(....) at the top of the method would seem much nicer to me.
  7. Integration Adapters - An interesting way to integrate your domain validation into the UI; it'll be interesting to see how it works in practice and whether it's flexible enough for us. In particular, I'm not sure how this will work with the ASP.NET proxy validators, as you need to specify the RulesetName but in our case which ruleset to use depends on the context. Having said that, this could prove very useful.
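
For reference, the attribute-based usage mentioned above looks roughly like this. This is only a sketch from my memory of the webcast and docs: the domain class is hypothetical, and the validator attribute names and the Validation facade should be checked against the block itself.

    using Microsoft.Practices.EnterpriseLibrary.Validation;
    using Microsoft.Practices.EnterpriseLibrary.Validation.Validators;

    // Hypothetical domain class with primitive rules attached via attributes.
    public class Customer
    {
        private string name;
        private int age;

        [NotNullValidator]
        [StringLengthValidator(1, 50)]
        public string Name
        {
            get { return name; }
            set { name = value; }
        }

        [RangeValidator(18, RangeBoundaryType.Inclusive, 120, RangeBoundaryType.Inclusive)]
        public int Age
        {
            get { return age; }
            set { age = value; }
        }
    }

    public class CustomerValidationExample
    {
        public static bool IsValid(Customer customer)
        {
            // The static facade runs whatever rules are attached (attributes here,
            // but they could equally come from configuration or a named ruleset).
            ValidationResults results = Validation.Validate<Customer>(customer);
            return results.IsValid;
        }
    }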

On the Ruleset point I added a topic to CodePlex about it so hopefully I'll get some feedback.
