So What is Microsoft Trying to Tell Me?

by jmorris 30. December 2009 23:10

This is the error message I woke up to this morning after installing VS 2010 Beta 2 last night:

One would think that an operation that had completed successfully would not be a problem, eh?



Comparing ORMs

by jmorris 26. December 2009 09:33

I am currently using Linq, EF, ADO.NET, and NHibernate (trying to, at least) in various projects. Here is my current feeling about each:

  • Linq2Sql - easy, works, obsolete, lacks many features...
  • EF - easy, fast, low barrier to entry... the Linq tooling in v3.5 is pretty much useless... v4 seems better
  • ADO.NET - simple, easy, no hidden "gotchas", repetitious, flexible... in almost all cases requires a framework built around it to scale developer-wise
  • NHibernate - flexible, complex, high barrier to entry, lots of hidden "gotchas" regarding performance, mappings, etc., poor (IMO) query API (syntax), outdated documentation, dedicated community albeit scattered

When one says NHibernate is missing VS integration, I don't think it's the GUI modeler they're missing. It's the fact that they can't point at a database and hit "Go"... I think model-first is important, but not everyone works that way.

Low-friction prototypes are often the start of more complex finished products. We don't need complexity in prototypes; we tailor our finished work to our problem domain - this is where we tweak the API (away from the GUI) to fit the specific requirements of our problem. The ability to quickly create prototypes from the IDE is incredibly important. Without IDE integration, creating these prototypes is difficult... friction.

The thing that a MS solution offers is IDE integration, which you do not find in the NHibernate suite. If I could go to the NHForge.com site and download a VS plug-in, point it at a db, and then start playing... I would be much more inclined to use NHibernate in a project.

This post was motivated by this post...


Overly Complex Solutions to Simple Problems: Take 1 - Local Time to GMT Conversions

by jmorris 11. December 2009 22:41


Generating Data Transfer Objects with Separate Files from a Schema Using T4

by jmorris 3. December 2009 04:21

T4, or Text Template Transformation Toolkit, is a template-based solution for generating code that is built into VS2008 (it's also available as an add-in for VS2005). Alas, it has minimal support in VS2008 in that there is no Visual Studio template for adding a T4 template - you cannot just right-click Add > New Item and add a T4 file to your project. However, you can create a new file and change the extension to ".tt", and VS will recognize that you are adding a T4 file to the project (after a security prompt asking if you really want to add a T4 file) and create the appropriate T4 template and the .cs code-behind file that accompanies each .tt file. For a detailed explanation of how to do this, please see Hanselman's post here.

T4 templates are pretty cool once you get the hang of some of the nuances of the editor, which is somewhat lacking in features (it reminds me of using Notepad to write assembly code in college). There are in fact at least two VS add-ins that add some degree of IntelliSense and code/syntax highlighting: Clarius Visual T4 and Tangible T4 Editor for VS. Both offer free developer editions with limited functionality if you just want to get a feel for what they can do without forking out the cash.

Out of the box, T4 templates have one glaring weakness: only one template (.tt) file can be associated with one output file (.cs, et al.). This is not ideal, in that we typically give each source code artifact (class, sproc, etc.) its own file. This makes a project's source easier to grok, manage, and read, and also makes versioning easier with version control software such as Git or Subversion. It's much easier to track changes to a single class in a file than to multiple classes in one file. With a little work and a little help, however, it is possible to generate multiple source files from template files.

So, T4 aside, what are Data Transfer Objects (DTOs) and why do we need them? DTOs are exactly what they propose to be: objects containing data, typically corresponding to a single record in a database table. In my opinion, they are anemic in that they contain NO behavior whatsoever. They are the data... contrast this with domain objects, which are data and behavior. A very typical scenario involving DTOs is one where data must be moved from one part of the system to another, where it is consumed and used, likely by a domain object(s). For instance, we may use an ORM such as NHibernate or Entity Framework to access the database, bring back a set of data, and then map it to a DTO.

In many cases your DTOs map directly to your database schema, in that there is a one-to-one mapping between table column and object field or property. In this situation, manually creating an object per entity becomes tedious and scales poorly in terms of developer productivity. This is where a code generation solution, such as one created with T4, really shines.

For example, given the following schema, generate a DTO for each entity:

The first step is getting enough information about the tables from the database metadata tables so that you can generate an object for each table. Assuming you are mapping to pure DTOs, you can ignore any of the relationships in the form of 'Has A' in your objects. It's not that these relationships do not exist; they do, just not explicitly. Instead you maintain the relationships through properties that represent the foreign keys between the objects. An obvious benefit of this sort of convention is that you immediately resolve the potential n+1 problems inherent in ORMs and lazy loading, at the expense of some other ORM features, such as object tracking. IMO, ignoring these relationships is a personal preference as well as an architectural concern; for this example I am ignoring these relations.

The following sproc is an example of how to get this data from a database, in this case I am using MS SQL Server 2008:
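The original listing did not survive the page; the following is a hypothetical sketch of such a sproc against SQL Server's catalog views. The procedure name and column aliases are my own, not the author's:

```sql
-- Hypothetical reconstruction: one row per column, carrying the
-- table name, column name, data type, and a primary-key flag.
CREATE PROCEDURE dbo.GetTableDefinitions
AS
BEGIN
    SELECT
        t.name AS TableName,
        c.name AS ColumnName,
        ty.name AS DataType,
        CASE WHEN pk.column_id IS NOT NULL THEN 1 ELSE 0 END AS IsPrimaryKey
    FROM sys.tables t
    INNER JOIN sys.columns c ON c.object_id = t.object_id
    INNER JOIN sys.types ty ON ty.user_type_id = c.user_type_id
    LEFT JOIN (
        -- Columns participating in a primary-key index.
        SELECT ic.object_id, ic.column_id
        FROM sys.indexes i
        INNER JOIN sys.index_columns ic
            ON ic.object_id = i.object_id AND ic.index_id = i.index_id
        WHERE i.is_primary_key = 1
    ) pk ON pk.object_id = c.object_id AND pk.column_id = c.column_id
    ORDER BY t.name, c.column_id;
END
```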



This stored procedure returns a record describing the table and each column of the table, or at least the relevant parts: name, data type, and whether or not the column is a primary key. This sproc is called from a T4 template to load a description of each table and its columns into memory. Here is the relevant code:
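The original snippet is missing; here is a hypothetical sketch of the loader a template like this would call. Only GetTypes() is named in the post - TableInfo, ColumnInfo, SchemaReader, and the sproc name are illustrative assumptions:

```csharp
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

// Assumed shapes for the in-memory table descriptions.
public class ColumnInfo
{
    public string Name { get; set; }
    public string DataType { get; set; }
    public bool IsPrimaryKey { get; set; }
}

public class TableInfo
{
    public TableInfo() { Columns = new List<ColumnInfo>(); }
    public string Name { get; set; }
    public IList<ColumnInfo> Columns { get; private set; }
}

public static class SchemaReader
{
    // Calls the metadata sproc and groups the rows by table.
    public static IList<TableInfo> GetTypes(string connectionString)
    {
        var tables = new Dictionary<string, TableInfo>();
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand("dbo.GetTableDefinitions", connection))
        {
            command.CommandType = CommandType.StoredProcedure;
            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    var tableName = reader.GetString(0);
                    TableInfo table;
                    if (!tables.TryGetValue(tableName, out table))
                    {
                        table = new TableInfo { Name = tableName };
                        tables.Add(tableName, table);
                    }

                    table.Columns.Add(new ColumnInfo
                    {
                        Name = reader.GetString(1),
                        DataType = reader.GetString(2),
                        IsPrimaryKey = reader.GetInt32(3) == 1
                    });
                }
            }
        }
        return new List<TableInfo>(tables.Values);
    }
}
```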



In order to generate separate files for each artifact generated, we will be using three separate T4 templates: GenerateEntities.tt, Entity.tt, and MultiOutput.tt. GenerateEntities.tt contains the code above as well as another code block in which it loops through the results returned from GetTypes() and uses the Entity.tt template to generate the artifact. MultiOutput.tt takes the code generated by GenerateEntities.tt and Entity.tt, writes the output file to disk, and adds the file to Visual Studio. Note that MultiOutput.tt comes from one of many excellent posts by Oleg Sych, and that an updated version of the file is purported to be available with the T4 Toolbox up on CodePlex.com.
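The driver block itself is not reproduced in the post. A sketch of what its control-block contents might look like follows; note that CallContext is only my guess at the "remoting" mechanism mentioned below, TableInfo is an assumed type, and the ProcessTemplate argument list is assumed rather than copied from MultiOutput.tt:

```csharp
// Hypothetical control-block contents of GenerateEntities.tt.
foreach (TableInfo table in GetTypes(connectionString))
{
    // Make the current table definition visible to Entity.tt
    // across the template boundary.
    System.Runtime.Remoting.Messaging.CallContext.SetData("Table", table);

    // Render Entity.tt, write the result to its own file
    // (e.g. Role.cs), and add that file to the VS project.
    ProcessTemplate("Entity.tt", table.Name + ".cs");
}
```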




The code above loops through the table definitions and uses remoting to set a property on the Entity.tt template with each value. Finally, MultiOutput.tt's ProcessTemplate(...) method is called, which writes the output of Entity.tt to disk and adds the file to Visual Studio. Entity.tt is pretty straightforward:
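The template listing is gone, but its output is easy to picture. For a hypothetical Role table (the table and property names here are assumptions), Entity.tt would emit a plain DTO along these lines:

```csharp
// Illustrative Entity.tt output: a pure DTO - data only, no
// behavior - with foreign keys (if any) surfaced as plain
// properties rather than object references.
public partial class Role
{
    public int RoleId { get; set; }
    public string Name { get; set; }
    public string Description { get; set; }
}
```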



When the solution is saved, the T4 templating engine will process the templates, and a file representing a DTO for each table in the database will be added to Visual Studio under the GenerateEntities.tt file.



References:

  1. http://www.hanselman.com/blog/T4TextTemplateTransformationToolkitCodeGenerationBestKeptVisualStudioSecret.aspx
  2. http://www.codeplex.com/t4toolbox
  3. http://www.olegsych.com/2008/03/how-to-generate-multiple-outputs-from-single-t4-template/


Fluent NHibernate Mapping Identity Columns to Properties

by jmorris 1. December 2009 06:37

I decided to jump into the NHibernate lovefest and use it in an upcoming project that I am planning right now. I have been following the NHibernate project for some years, but never actually committed to using it in a project because, frankly, it was a bit intimidating in size and complexity. Now, of course, this was my biased assumption, and boy was I wrong! The new Linq 2 NHibernate and Fluent NHibernate APIs are awesome and relatively simple to get up and running.

Although I still have some reservations about the completeness and performance of Linq 2 NHibernate, Fluent NHibernate seems to be pretty mature. Additionally, the Fluent NHibernate community is robust, friendly, and very quick to lend a hand when I ran into some trouble with AutoMapping.

AutoMapping is a convention-based feature of Fluent NHibernate in which, with very little configuration, you can map your entire schema to your domain model. This feature is a tremendous time saver and gives the illusion of "it just works"! As awesome as AutoMapping is, there are certain situations where it will choke. In these cases you must add a little "help" to make the mapping work correctly. Take the following table:
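The screenshot of the table is gone; hypothetical DDL for a Role table matching the description below would look roughly like this (the Name column is an assumption):

```sql
-- Basic Role table; the [entity name]+Id primary-key naming is
-- the detail that matters for the mapping discussion.
CREATE TABLE dbo.[Role]
(
    RoleId INT IDENTITY(1,1) NOT NULL PRIMARY KEY,
    Name   NVARCHAR(50)      NOT NULL
);
```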

 


Just your basic Role table, but notice how the primary key column is named [entity name]+Id: RoleId? This is a convention that I use for naming the primary keys of all tables I create. It is simple, easy to understand, and works! Now here is the domain model object that it maps to:
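The class listing did not survive; a sketch of the entity-plus-base-class shape the post describes follows (names other than Id and Role are assumptions):

```csharp
// The identifier lives on a common base class as Id, so no
// entity declares a RoleId property of its own. Members are
// virtual so NHibernate can proxy them.
public abstract class EntityBase
{
    public virtual int Id { get; set; }
}

public class Role : EntityBase
{
    public virtual string Name { get; set; }
}
```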

 


Notice that the domain object does not have a field called RoleId? Instead we have another field in our base class called Id. Now, seeing how AutoMapping relies on conventions (namely naming conventions) to map entities, how does Fluent NHibernate map this with AutoMapping? Well, unfortunately, it can't:

 



However, with a little help from the Conventions API, we can easily resolve this mapping with a minimal amount of code. What are Conventions, you might ask? According to the Fluent NHibernate documentation:

"Conventions are small self-contained chunks of behavior that are applied to the mappings Fluent NHibernate generates. These conventions are of varying degrees of granularity, and can be as simple or complex as you require. You should use conventions to avoid repetition in your mappings and to enforce a domain-wide standard consistency."

Conventions are a set of base classes and interfaces that, when implemented, allow you to override the default AutoMapping behavior. Pretty sweet.

OK, so how did I resolve the mapping exception above? First, the Convention implementation:
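The original snippet is missing. With Fluent NHibernate's conventions API, an ID convention along these lines would do the job; this is a sketch, not the author's code, and the class name is mine:

```csharp
using FluentNHibernate.Conventions;
using FluentNHibernate.Conventions.Instances;

// Maps the Id property of every entity to a column named
// [entity name]+Id, e.g. Role.Id -> RoleId.
public class PrimaryKeyConvention : IIdConvention
{
    public void Apply(IIdentityInstance instance)
    {
        instance.Column(instance.EntityType.Name + "Id");
    }
}
```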


And finally the configuration with Fluent NHibernate AutoMapping API:
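The configuration snippet is also missing; hypothetical wiring with the AutoMapping API might look like the following. PrimaryKeyConvention stands in for whatever the convention class above was actually named, and the namespaces reflect later Fluent NHibernate releases, which may differ slightly from the 2009 ones:

```csharp
using FluentNHibernate.Automapping;
using FluentNHibernate.Cfg;
using FluentNHibernate.Cfg.Db;
using NHibernate;

public static class SessionFactoryBuilder
{
    // Auto-maps every entity in Role's assembly and registers the
    // ID convention so RoleId-style keys resolve to Id properties.
    public static ISessionFactory Build(string connectionString)
    {
        return Fluently.Configure()
            .Database(MsSqlConfiguration.MsSql2008
                .ConnectionString(connectionString))
            .Mappings(m => m.AutoMappings.Add(
                AutoMap.AssemblyOf<Role>()
                    .Conventions.Add<PrimaryKeyConvention>()))
            .BuildSessionFactory();
    }
}
```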





Jeff Morris
