# Monday, 01 October 2007

Previously, I mentioned I was working on an example of using Visual Studio to create a concrete domain model using object thinking, and here it is.  The domain I ended up modeling was that of a shared event calendar, including event registration and agenda planning.  This is something that's been kind of rolling in and out of my mind for quite a while now because it seems that we need a good system for this for all the code camps and like events that occur.  Of course, lately I've come across a few solutions that are already built1, but it seemed like a domain I knew enough about that I could take a whack at modeling it on my own.  I also figured it was small enough in scope for a sample.

So without further ado, I present you with the domain model:
Click to See Full Event Calendar Domain Model

I put this together in about an hour, maybe an hour and a half, on the train up to SD Best Practices.  When I started out modeling it, I was actually thinking more generally in the context of a calendar (like in Outlook), but I transformed the idea more towards the event planning calendar domain.  So you see some blending of an attendee being invited to a meeting with the event planning objects & behaviors (agenda, speaker, etc.).  Interestingly, they seem to meld okay, though it probably needs a bit of refactoring to, e.g., have an Attendee Register(Person) method on the Event object.
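To make the refactoring idea concrete, here is a rough sketch (in Java rather than C#, and with all names invented for illustration) of what a behavior-first Event object with a Register(Person) method might look like -- the point being that the model exposes responsibilities and collaborations, not data attributes:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical names, not from the actual model diagram.
class Person {
    private final String name;
    Person(String name) { this.name = name; }
    String introduceSelf() { return "Hi, I'm " + name; }
}

class Attendee {
    private final Person person;
    Attendee(Person person) { this.person = person; }
    Person asPerson() { return person; }
}

class Event {
    private final List<Attendee> attendees = new ArrayList<>();

    // The behavior discussed above: registering a Person with
    // the Event yields an Attendee of that Event.
    Attendee register(Person person) {
        Attendee attendee = new Attendee(person);
        attendees.add(attendee);
        return attendee;
    }

    int headCount() { return attendees.size(); }
}
```

Note that nothing here is a bare property; each object answers for its own behavior, which is exactly the quality the diagram is meant to convey.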

So the interesting thing to see here, contrasting it to the typical model you see in the .NET world (if you're lucky enough to see one at all!), is that there is pretty much no data, no simple properties or attributes, in the model.  The model is entirely objects and their behaviors and relationships to other objects.  You can look at this model and get a pretty darn good feel for the domain and also how the system functions as a whole to serve this domain.  I was able to identify and model the objects without once thinking about (and getting distracted with) particular data attributes.2

In the story of our Tangerine project, I describe in some depth the compromise I had to make with the .NET framework when it comes to data properties.  I think if I were to continue with this event calendar project, after I had nailed down the objects based on their behaviors (as begun in this example) and felt pretty good that it was spot on, at that point, I'd think about the data and do something like I did on Tangerine, having the open-ended property bag but also adding strongly-typed properties as needed to support framework tooling.3 
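As a rough illustration of that compromise (a sketch only -- these names are hypothetical and not from the Tangerine code), the open-ended property bag with strongly typed properties layered on top might look like this:

```java
import java.util.HashMap;
import java.util.Map;

// Open-ended bag: any value can be stored by name.
class DomainObject {
    private final Map<String, Object> properties = new HashMap<>();

    public void set(String name, Object value) { properties.put(name, value); }
    public Object get(String name) { return properties.get(name); }
}

// Strongly typed property added as needed, backed by the bag,
// so tooling that expects getters/setters still works.
class Speaker extends DomainObject {
    public String getBio() { return (String) get("Bio"); }
    public void setBio(String bio) { set("Bio", bio); }
}
```

The typed accessors are just a thin veneer over the bag, so you can add them lazily, only where framework tooling demands them.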

I hope you can imagine how you could sit with your clients or whoever your domain experts are and quickly map out a lightweight model of the domain using the VS Class Designer DSL.  I'll wager that if we took this diagram and showed it to a non-technical person, with a little help (maybe adding a key/legend), they'd quickly understand what's going on with the system.  And if you're building it with the domain expert, you'll have that dialog done already so that everyone will be on the same page.

Sure, there will be further refinement of both the domain model and the code; the nice thing about using the class designer DSL is that tweaking the model tweaks the code, so the two stay in sync.  We already mentioned the need to focus on the data at some point, and depending on your situation, you can do this with the domain experts or maybe you'll have an existing data model to work with.  As the developer, you're going to want to get in there and tweak the classes and methods to use best coding and framework practices, things that aren't best expressed in such a model.  You will have other concerns in the system to think about like security, performance, logging, user interface, etc., but that's all stuff you need to do regardless of how you approach analyzing and modeling your domain. 

In the end, you will have a fairly refined model of the domain (or part of the domain) that is using a language that everyone gets and agrees on (Eric Evans's "ubiquitous language"); you'll have identified the objects in the domain accurately based on their behaviors and relationships, and you'll even have a starting point in code for the implementation.  You also have objects that are responsible and that collaborate to get the job done, so in that way you avoid code complexity by reducing imperative control constructs.  All in all, it seems like a great foundation upon which to build the software.

Notes
1. Such as Microsoft Group Events, Community Megaphone, and Eventbrite.
2. Okay, so maybe I was tempted once or twice, but I fought the urge. :)
3. I suppose another option would be to create LINQ-based DTOs; I have to think more about how best to meld this kind of domain modeling with LINQ.

Monday, 01 October 2007 16:59:37 (Eastern Daylight Time, UTC-04:00)
# Saturday, 29 September 2007

I finally got around to finishing The Timeless Way of Building, by Christopher Alexander (best known in software as the source of the patterns movement).  The last part of the book is called "The Way" to build things.  His focus is physical architecture, but it is interesting how closely it resembles agile software development.

There are a few similarities that I see.  First, he advocates (or at least shows in his example) working directly with the folks who are going to be using the building(s) when designing it with the pattern language.  You design it together with them.  Similarly, agile seems to advocate the same process of working as closely as possible with those who will be using the system.1

But Alexander goes on to say, using this real-world example of a health care complex he helped to build, that it almost failed (in terms of having the quality without a name) because, even though it was initially designed using the pattern language, it was in the end passed off to builders who conformed the design to "drawings" (think UML), which ultimately caused it to lose a large amount of the quality.

The point he goes on to make is that you can't just use the language up front and then go translate it into formal design techniques and end up with the quality.  Rather, you have to build using the language, and in particular, build each part of the structure piecemeal, to best fit its particular environment, forces, context, and needs.  This is the only way that you can get the quality.  Here I see another similarity with agile and its focus on iterations and regular feedback.  You build in pieces, adapting each piece to its whole as it is built and ensuring that it best fits the needs, context, forces,  and environment. 

He also says that invariably our initial ideas and designs for a solution don't exactly reflect the ways in which the solution will be used.  And this disparity between our design and reality gets worse as the solution grows in scope.  Again, this is true in software and why the regular feedback is important, but Alexander proposes repair as a creative process in which we better adapt the solution to its environment based on deepening understanding of needs or when the design just isn't working or breaks.  This is akin to what we call refactoring, and like we do in software, Alexander advocates a continual process of repair (refactoring).  And this process doesn't stop when the initial thing is built--we keep tweaking it ad infinitum.

This seems somewhat intuitive, yet in software we're always talking about legacy systems and many have and continue to suggest "rewrites" as the answer to software woes.  While I understand that this is one area where software differs from real-world building (in the relative ease that something can be redone), I do think that we software folks tend to err too much on the side of rewriting, thinking that if only we can start from scratch, our new system will be this glorious, shining zenith of elegance that will last forever. 

It is this thinking, too, that even causes many of these rewrites to fail because so much time is spent trying to design a system that will last forever that the system is never completed (or becomes so complex that no one can maintain it), providing the next impetus for another "rewrite of the legacy system."  On the contrary, some of the best software I've seen is that which has simply been continuously maintained and improved, piece by piece, rather than trying to design (or redesign) an entire system at once. 

What is interesting to me in all this is the similarity between the process of building physical structures and that of building software, the general applicability of Alexander's thought to the creation of software.  I continually see this in Alexander's writing.  In part, it is good to see a confirmation of what we've been realizing in the software industry: that waterfall just doesn't work; that pre-built, reusable modules don't really work well; that we need regular, repeated input from stakeholders and users; that we shouldn't try to design it all up front; that we shouldn't use formal notations and categories that create solutions fitting the notations and categories better than their contexts, environments, and needs; that we should create and use pattern languages that are intelligible to ordinary people; and more.

There is one last observation I'd make about The Timeless Way of Building, regarding the "kernel of the way."  Alexander says that when it comes down to it, the core (the kernel) of the timeless way of building is not in the pattern language itself (the language is there to facilitate learning the timeless way); he says the core is in building in a way that is "egoless." 

In some ways, I think the concern about ego is less pronounced in the software world--rarely is a piece of software admired as a piece of art--but at the same time, the underlying message is that you build something to fit just so--not imposing your own preconceptions on how the thing should be built.  For software developers, I think the challenge is more in learning to see the world for what it is, to really understand the problem domain, to look at it through the eyes of the users and design a solution to fit that rather than trying to foist the software worldview onto the users.  To put it another way, we need to build software from the outside in, not the inside out.  The timeless way is really about truly seeing and then building to fit what you see.

Notes
1. At this point, another interesting thought occurs to me about pattern languages; I see a relation to Eric Evans's "ubiquitous language" in that the language you use needs to be shared between the builders and those using the thing being built.  What stands out to me is the idea of building a pattern language that is intelligible enough to non-software experts to be incorporated into the ubiquitous language shared by both the domain experts and the software experts.  Software patterns vary on this point; some are intelligible and some are not so intelligible; we need to make them intelligible.

Saturday, 29 September 2007 21:31:12 (Eastern Daylight Time, UTC-04:00)
# Friday, 21 September 2007

As I sit here on the train home, I've been thinking (and writing) about a lot of stuff.  But I figured I should put this post together for completeness and finality, even though I only made it to one session today before I left early.  Last night I was shocked and somewhat dismayed to find that I had somehow managed to book the train for a return on Saturday afternoon rather than today.  I looked at my reservation email, thinking surely the ticket was misprinted, but nope, the reservation says the 22nd clearly in black and white.

Now, those who spend much time with me know that I tend to be sort of the absent-minded professor type.  I often have trouble keeping track of the details of day-to-day things (but I can tie my shoes!).  I like to think there are good reasons for this, but whatever the reasons, that's me.  So I can totally imagine that somehow I tricked my brain into thinking that the 22nd was the day I wanted to return when I booked the train.

That said, I think this is a good opportunity to observe a way in which the UX of the reservations system could be improved.  If it had simply said somewhere that the return trip was on SATURDAY, instead of just using these obscure things called numeric dates, I'd immediately have seen and avoided my mistake.  But nowhere online, nor in the email, nor on the ticket does it say Saturday.  In fact, there is SO MUCH GARBAGE on the ticket that the non-initiate has trouble finding anything of value.  So think about that if you're designing some sort of booking system--show the day of the week, please. :)

Lean Process Improvement
So this morning, on top of being tired because I stayed up late writing, I was late for the class I wanted to attend, one called Agile Architecture.  Unfortunately, it was in the smallest room in the conference (same one as the manager meeting yesterday), and unfortunately, the planners didn't anticipate attendance to that session correctly.  Plus, this room had this odd little old lady who felt it was her duty to prevent anyone from attending who had to stand. 

Yesterday, I watched her try to turn away quite a few folks (a few successfully), even though there was plenty of room on the far side to stand.  She kept saying "there really is no room," but there was.  What made the whole scene kind of comical was that she refused to go sit OUTSIDE the door, so rather than simply preventing folks from coming in and causing a distraction, she let them come in, then animatedly tried to convince them to leave, causing even more distraction.

Well, when I peeked in the door this morning, saw the full room and saw her start heading toward me, I knew I was out of luck.  I just didn't have the heart to muscle by her and ignore her pleading to go stand on the other side, and besides, I don't like standing still for 1.5 hours anyway.  So I was off to find an alternative.

I knew there wasn't much else I wanted to see during that hour, but by golly I was there and this was the only slot I could make today, so I was going to make it to a session!  After two more failed entries into full sessions and studiously avoiding some that sounded extremely dull by their titles, I finally found one that sounded nominally interesting and had a lot of open space.  I really had no clue what I was getting into...

It ended up being somewhat interesting.  It was about applying the "lean process" from the manufacturing space to software development.  I'm personally not really into process and methodologies, particularly when they come from disciplines that are only marginally like our own.  But this did sound like it could be useful in some situations, particularly in software product (i.e., commercial) development. 

He talked about value stream mapping, which is basically modeling the process flow of specific activities in product development from beginning to end (so you'd do one for new feature dev, one for enhancements, one for hot fixes, etc.).  It sounds like it does have potential to be useful as long as you don't spend too much time on it.  Particularly if you think you have a problem in your process, this method can help you to both visualize and identify potential problems.  If you do product development, it's worth a look.

Final Thoughts
After that session, I made off to go to the 12:05 mass at the chapel outside the convention center.  My deacon friend had let me know about it, and I was glad of it.  And he was there, so after mass, we went back into the conference to grab lunch together.  Talked more about the usual, and then I had to run off to catch my train.

Looking back, I feel that this is definitely a conference worth attending.  Of course, your mileage will vary.  I wouldn't come here to go to a bunch of sessions on topics you're already an expert on.  But the nice thing about this conference over others I've been to is that it really is focused on best practices.  It's not really focused much on technology-specific stuff (though there was a bit of that), so you can derive value whether you do Java, C/C++, .NET, or whatever. 

Also, it is a good place for meetings of minds with other technology experts, so you get more exposure than you might normally get to how folks are doing software outside of your technological community.  And one interesting thing I noticed is that there is a tangible presence of software product developers, and that's a different and valuable perspective for those who are more used to, say, standard custom/consulting/corporate IT software.

Overall, if you look over the sessions and see topics that you haven't had a chance to explore in depth or maybe you want to just get exposed to other ideas in the software space, this seems like a good conference for that.  I really enjoyed it.

Friday, 21 September 2007 16:27:11 (Eastern Daylight Time, UTC-04:00)
# Thursday, 20 September 2007

Today I stumbled into Barnes & Noble (because it had the nearest Starbucks), wandered into the notebook section, and was reminded that my current Moleskine notebook was almost full.  Silly me, I still have two back at the office, so I thought it must be fate for me to go ahead and restock while I'm here.  I highly recommend Moleskine; I like the small, book-like ones without lines, because they're small enough to fit in a pocket and I don't like to conform to lines (or have even the suggestion that I should), but they have all kinds.  Good, tough little notebooks, and supposedly they've been used by some famous people.  This has not been a paid advertisement for Moleskine.  Now we return you to your regular program.

Applying Perspectives to Software Views (Cont'd)
Yesterday I talked about Rebecca Wirfs-Brock's session on software views.  There's a lot more to what she said than what I communicated, but I'm just propounding what stuck with me.  Looking at my notes, I forgot to mention another key thing, which is that you should model these views and model them in a way that effectively communicates to the stakeholders that their needs are being addressed.  She threw up some UML diagrams, commenting that they're probably not good for most business folks.  (I think UML is not good for most technical folks either, but I'm a rebel like that.)  The point she made, though, was that regardless of what notation you use, provide a key that lets people know how to unlock the meaning of the model.  Good point for sure.

Actually, this reminds me of Beautiful Evidence, by Edward Tufte.  I recommend Tufte for his area of expertise, though I'd suggest skipping the chapter on PowerPoint (which sadly was released as a separate booklet) because it's not his area of expertise and it shows.  Anyways, when he is sticking to the realm of visual communication, he is excellent, and Beautiful Evidence is a pretty easy read that helps you start thinking about how to communicate "outside the box" as it were.  I bring it up here because applying his ideas in the area of modeling software, particularly to non-technical audiences, is something we should explore.

Now, back to Day III.

Software Managers
The first session I made it to kind of late (and it was absolutely packed--standing room only) was a session on tips for being a good technical/software manager.  Having become one of these this year, I find it definitely a subject of interest, and I'm always on the lookout for more tips, though I must say that I think management books (as a rule) are really bad about regurgitating each other.  It gets increasingly hard to find new, good insights the more of them you read.

But I thought this session would be good since it is specifically focused on managing technical teams.  Some of her points were standard managerial stuff, but it was nice to have it focused in on the IT industry.  I always end up feeling a bit guilty, though, because I know I've already made numerous faux pas (not sure how to pluralize that).  I hope my guys know I love them even though I screw up being a good manager at times. :) 

One recurring theme I keep coming across is having regular 1-1s with your peeps.  I've heard weekly and bi-weekly, but it seems like both of those would be overkill for my group since we have daily meetings, often go out to lunch, etc., so I'm going to try monthly first.  It'll be better than nothing! 

I have to say that managing well is a lot harder than I expected it to be.  For those of us who aren't natural people persons, it is definitely an effort.  I'm sure it is tough regardless, but I gotta think that it'd be easier if I was naturally more a people person.  Anyways, I keep tryin' for now at least.

Designing for User Success
Went to another Larry Constantine session around UX.  This one was really good.  He, like Patton, affirmed that "user experience is about everything."  Again, it's nice to know I'm not crazy, and it takes a burden off me knowing that I won't be a lone voice crying out about that.  It seems that maybe just those who don't know anything about UX think it is "just another term for UI."  Of course, these "UX professionals" are naturally focused in on their areas of expertise (usability, information architecture, human factors, human-computer interaction, visual design, interaction design, etc.), so maybe I'm still a bit odd in my contention that architects must be the chief experience officers on their projects.

Anyhoo, this session focused in on "user performance" as a distinct focus, meaning that you are providing the tools to get the best performance out of people.  Though none of the session was spent explicitly justifying the importance of a focus on UX, implicitly the whole session was an illustration of why it is important.  I have a ton of good notes from this session, but I won't bore you with them (you can probably get most of it from his slides or other presentations he's done).  If you get nothing else, though, it's to change the way you think about designing software--design from the outside in.  If you're a smart person, you'll realize this has huge implications.  And also, recognize that you won't make all parts of your system perfectly usable, so prioritize your usability efforts based first on frequency of use and second on severity of impact (i.e., those things that will have serious ramifications if not done correctly).
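That prioritization advice is mechanical enough to sketch in code.  Here's a toy illustration (Java, with invented names and scales -- nothing from the session itself): rank candidate areas by frequency of use first, then by severity of impact.

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

class UsabilityTarget {
    final String name;
    final int usesPerDay;   // frequency of use
    final int severity;     // 1 = cosmetic .. 5 = serious consequences if done wrong
    UsabilityTarget(String name, int usesPerDay, int severity) {
        this.name = name;
        this.usesPerDay = usesPerDay;
        this.severity = severity;
    }
}

class UsabilityTriage {
    // Highest frequency first; ties broken by highest severity.
    static List<UsabilityTarget> prioritize(List<UsabilityTarget> targets) {
        List<UsabilityTarget> sorted = new ArrayList<>(targets);
        sorted.sort(Comparator
            .comparingInt((UsabilityTarget t) -> t.usesPerDay).reversed()
            .thenComparing(Comparator
                .comparingInt((UsabilityTarget t) -> t.severity).reversed()));
        return sorted;
    }
}
```

The point isn't the code, of course; it's that "frequency first, severity second" gives you an objective ordering for where to spend scarce usability effort.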

Human Factors in API Design
The next session I hit was one related to UX for developers.  Here are some salient one-liners:

  • Consistency is next to godliness.
  • API = Application Programmer Interface
  • When in doubt, leave it out. <-- More specifically, unless you have at least two, real use cases, don't stick it in your API.
  • Use the Iceberg Principle. <-- This means what people see of your code should only be the tip of the iceberg--keep it small, simple, and focused.

This session actually seemed to be a blend of general UX guidelines (yes, they apply here, too, not just on end-user interfaces) and more general framework design principles that only had varying degrees of pertinence to ease of use.  Some highlights:

  • Default all members to private; only raise visibility with justification.
  • Prefer constructors to the factory/builder pattern, and set up the object fully with the constructor where possible.
  • Use domain-specific vocabulary.
  • Prefer classes to interfaces.  Amen!
  • Prefer finality (sealing) to inheritance--minimize potential for overriding.

There's a good deal more, and I'm not offering the justification he proposed (for brevity's sake).  I agree, with varying degrees of vehemence, with most of what he said, but one area where I think I have to disagree is his advice to only refactor to patterns.  I can imagine where this comes from--because patterns can be abused ("patternitis," as he said).  But I think saying refactor to patterns shows a big misunderstanding of the point and value of patterns.  This is why it's important to pay attention to the context and rationale in a pattern--so you know when to apply it.  But patterns should be used where they apply--they're known, established, tried-and-true ways of solving particular problems in particular contexts!  If consistency is akin to godliness, using patterns is ambrosia.
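A few of the guidelines in that list can be shown in one small sketch (illustrative only -- the class and its names are made up, and it's Java rather than C#): private by default, fully set up via the constructor, sealed rather than open for inheritance, and named with domain vocabulary.

```java
import java.util.List;

// final: prefer sealing to inheritance.
final class Agenda {
    // Members private by default; visibility raised only with justification.
    private final List<String> sessionTitles;

    // Constructor sets the object up fully; no half-built state.
    Agenda(List<String> sessionTitles) {
        this.sessionTitles = List.copyOf(sessionTitles); // defensive copy
    }

    // Iceberg Principle: only a small, focused surface is public.
    public int sessionCount() {
        return sessionTitles.size();
    }
}
```

Everything the caller sees is the tip of the iceberg: one constructor, one query, domain words throughout.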

One last interesting note from this session was the admonition to consider using or creating a domain-specific language where it helps with the usability of the API.  His example contrasted JMidi and JFugue: where JMidi is a terribly verbose API, requiring the construction and coordination of a host of objects to do something as simple as playing a note, JFugue offers a simple string-based DSL, based on musical notation, that lets you place a whole series of notes very compactly.  Good/interesting advice.
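To give a flavor of the idea (this is emphatically not the actual JFugue API, just a toy sketch of a string-based music DSL), compare building note objects by hand with parsing a compact notation string:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

class ToyMusicDsl {
    // Semitone offsets of the natural notes within an octave.
    private static final Map<Character, Integer> SEMITONES = Map.of(
        'C', 0, 'D', 2, 'E', 4, 'F', 5, 'G', 7, 'A', 9, 'B', 11);

    // Parses a string like "C D E" into MIDI note numbers
    // (middle C = 60), standing in for whatever richer note
    // objects a verbose API would make you assemble one by one.
    static List<Integer> parse(String tune) {
        List<Integer> notes = new ArrayList<>();
        for (String token : tune.trim().split("\\s+")) {
            notes.add(60 + SEMITONES.get(token.charAt(0)));
        }
        return notes;
    }
}
```

One short string replaces a pile of object construction and wiring--which is exactly the usability win the presenter was after.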

Pair Programming
The last session I went to today was one on practical pair programming.  I was actually on my way to a class on Business Process Modeling Notation, which would have been potentially more intellectually stimulating, but I walked by the room where the Pair Programming session was being held and had a sudden feeling I should attend it.  When I thought about it, I figured that I'd put off giving the idea fair play long enough and that I should take the time to hear it out in more depth.  I figured it'd have more immediate relevance to my current work situation in any case.

I won't belabor all the points because I suspect with good reason that they're all the standard arguments for pair programming, along with a good bit of the "how" to do it in real situations.  He actually has a number of patterns and anti-patterns to further illustrate good/bad practices in pair programming.  It was an interesting extension of the pattern-based approach (to people).  Suffice it to say, I think if you can get buy-in within your organization, it is definitely worth a try.  There are numerous difficulties with it, the chief one being that it is hard to do effectively in a non-co-located environment, but I think I'd try it given the opportunity. 

Random Thoughts
One conclusion I've come to while being here is that TDD seems to be unanimously accepted as a best practice by those who have actually tried it.  The API guy went so far as to say that he won't hire devs who don't have TDD experience.  (I think that's a bit short-sighted, but I take his point.)  It's something to think about for those still hesitating to adopt TDD.

I met up again with the same fella I met last night.  We were both in the pair programming class at the end of the day; he's been doing pair programming on a few teams at his company for years and is a fan, though he definitely attests to the difficulty of dealing with prima donnas, which apparently are more tolerated in his LOB (because they have very specialized knowledge that requires PhD level education).  So he wasn't able to carry XP to his entire company.  He also said that pairing (which was echoed by the presenter) is a taxing process; 4-5 hours max is good.

We also had a good long chat about things Catholic.  It's good to know that we Catholics will be getting another good, solid deacon in him.  I imagine tonight won't be the last time we talk.

All in all, another great day.  Learned a bunch.  No sessions I regret going to thus far, which is I think a big compliment for a conference. :)

Thursday, 20 September 2007 22:48:34 (Eastern Daylight Time, UTC-04:00)
# Wednesday, 19 September 2007

Hi again.  Today was another good day at the conference. 

User Experience Distilled

The first class I attended was a whirlwind tour of user experience.  I was heartened to learn that I am not alone or crazy in recognizing that there are a number of disciplines that go into this thing we call UX.  The presenter, Jeff Patton, also recognizes that virtually every role in developing software has an effect on UX, which is a conclusion I have come to as well (as I hint at on IG's UX area).  I develop the idea more explicitly in an unpublished paper I'm working on.  (I'm hoping the inputs I get from this conference will help me to finish that out.)  

I actually think that all of this UX stuff falls under the architect's purview because (in my mind at least) he or she is primarily responsible for designing the software as a whole.  This means that architects need to have a conversational familiarity (at least) with the different disciplines that people traditionally think of as user-oriented disciplines, but I'd take it a step further and say that the architect needs to be the chief experience officer, as it were, on a software project.  The architect needs to ensure that the appropriate expertise in user-oriented disciplines is brought to bear on his or her project and also needs to understand how the other aspects of software design and development impact UX and optimize them for good UX. 

That discussion aside, Jeff had a pretty clever graph that showed how the kind of software being developed affects the perceived ROI of expenditure on UX.  His talk also was about as effective an introduction to UX that I can imagine.  He dealt with what it is, why it's important, and then offered a high-level overview of key bits of knowledge for people to make use of.  I want to steal his slides! :)

Global Teams & Outsourcing Agilely

The keynote during lunch today was done by Scott Ambler.  It was nice to finally see/hear him in person since I've heard so much about him.  I got the feeling (from what he even admitted) that he was presenting stuff that wasn't just his--he was from what I could tell presenting an overview of a book that IBM publishes (related) on the subject.  But that didn't take away from the value of the knowledge by any means.  I'd definitely check it out if you're going to be dealing with geographically distributed teams.

Usability Peer Reviews

In my continuing quest to learn more about UX (part of which is usability), I attended a class by Larry Constantine about lightweight usability practice through peer review/inspection (related paper).  I was actually surprised because he has a very formal methodology for this, which means he's put a lot of thought into it but, more importantly, he's used it a lot in consulting, so it is tested.  I personally am not a big fan of being too formal with these things.  I understand the value in formalizing guidance into repeatable methodology, but I've always felt that these things should be learned for their principles and less for their strictures.  Of course, that runs the risk of missing something important, but I guess that's a trade-off.  Regardless of whether you follow it to a T or not, there's a ton of good stuff to be learned from this technique on how to plug usability QA into the software process.

Applying Perspectives to Software Views

After that, I slipped over to another Rebecca Wirfs-Brock presentation on applying perspectives to software views in architecture.  (She was presenting the subject of this book.)  To me, the key takeaway was that we should figure out the most important aspects of our system and focus on those.  It echoed (in my mind) core sentiments of domain-driven design, though it used different terminology and approach.  I think the two are complementary--using the view approach helps you to think about the different non-functional aspects.  Using strategic DDD (in particular, distilling the domain) helps you and stakeholders to focus in on the most important aspects of the system from a domain strategy perspective, and that will inform which views and perspectives are the ones that need the focus. 

This approach also echoes the sentiment expressed by Evans yesterday that says you can't make every part of the system well-designed (elegant/close to perfection).  Once you accept that, you can then use these approaches to find the parts of the systems where you need to focus most of your energies.  I really like that this practical truth is being made explicit because I think it can help to overcome a lot of the problems that crop up in software development that have to do with the general idealistic nature that we geeks have.

Expo

After the classes today, they had the expo open.  In terms of professional presentation, it was on par with TechEd's Expo, but certainly the scope (number of sponsors) was far smaller.  That said, I popped into the embedded systems expo.  That was a new experience for me.  It was interesting to see almost every booth with some kind of exposed hardware on display.  As a software guy, I tend to take all that stuff for granted.  They even had a booth with specialized networked sensors for tanks of liquid.  This stuff stirred recollections of Weird Science and all the other fun fantasies that geeky kids have about building computerized machines.  The coolest thing there was the Intel chopper, which apparently was built by the Orange County Choppers guys, but it had a lot of fancy embedded system stuff on it.  I didn't stick around to hear the spiel, but it was pretty cool.

After the expo, I bumped into a guy at the Cheesecake Factory.  We started chatting, and it turns out that he's in the process of becoming a Roman Catholic deacon.  Pretty cool coincidence for me!  We talked about two of my top passions--my faith and software development (as exemplified here on dotNetTemplar!).  It was a good dinner.  He works at a company that does computer aided engineering; sounds like neat stuff with all that 3D modeling and virtual physics.  Way out of my league!

As I said, another good day here at SD Best Practices.

Wednesday, 19 September 2007 21:54:31 (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [0]  | 

I meant to write this last night, but I didn't get back to my room till late and just felt like crashing.  I'm at the SD Best Practices conference in Boston this week, which is a new experience for me.  It's one of a very few non-MS-oriented conferences I've attended, and I really wanted to come because best practices are a passion for me (and part of my job).  Infragistics was kind enough to send me.  I thought I'd share my experiences for anyone else considering going (and just for my own reference... hehe).  Anyways, enough of the intro...

Day 1 - Tuesday, 18 September 2007

First off, let me say I like the idea of starting on a Tuesday.  It let me work for a good part of the day on Monday and still make it out here by train on Monday night.  I've found in the past that attending sessions non-stop for a few days can really wear you out, so four days seems about right.

The conference is in the Hynes convention center, and I'm at the Westin, a stone's throw away.  Also, it's right next to the Back Bay Station, so thus far the logistics aspect has worked out quite well for me.  I'd personally much rather take a train over a plane anytime. 

Responsibility-Driven Design

Tuesday was a day of "tutorials," which are half-day sessions.  So in the morning, I attended Rebecca Wirfs-Brock's tour of responsibility-driven design (RDD?).  I actually had her book at one point because it was mentioned in a good light by Dr. West in his Object Thinking, but somewhere along the line I seem to have lost it.  Anyways, I was glad to get a chance to learn from the author directly and to interact. 

From what I can ascertain, RDD has some good insight into how to do good object design.  It seems to me that thinking in terms of responsibilities can help you properly break apart the domain into objects if you struggle with just thinking in terms of behavior.  It's potentially easier than just thinking in terms of behaviors because while behaviors will certainly be responsibilities, objects can also have the responsibility to "know" certain things, so it is a broader way of thinking about objects that includes their data.

That said, it doesn't really negate the point of focusing on behaviors, particularly for folks with a data-oriented background because I do think that focusing on the behaviors is the right way to discover objects and assign them the appropriate responsibilities.  I think the key difference is that with the object-thinking approach, you know that there will be data and that it is important to deal with, but you keep it in the right perspective--you don't let it become the focus of your object discovery.
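The "knowing" vs. "doing" distinction can be made concrete with a few lines of code.  This is my own illustrative sketch (in Java, not anything from the book, and the names are hypothetical): one small object that carries both kinds of responsibility.

```java
// Illustrative only: an object with both "knowing" and "doing" responsibilities.
class Invoice {
    private long balanceCents; // what the invoice is responsible for knowing

    Invoice(long amountCents) { this.balanceCents = amountCents; }

    // "Knowing" responsibilities: answering questions about itself.
    long balance() { return balanceCents; }
    boolean isSettled() { return balanceCents == 0; }

    // "Doing" responsibility: a behavior that changes the object's state.
    void applyPayment(long cents) {
        balanceCents = Math.max(0, balanceCents - cents);
    }
}
```

The point is that the balance is not exposed as a raw read-write property; it exists only in service of the object's responsibilities.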

Another beneficial thing I think Ms. Wirfs-Brock has is the idea of using stereotypes as a way to discover objects in the domain.  This is more helpful, I think, when dealing with objects that are more part of the software domain than those in the business domain because the stereotypes are very software-oriented (interfacers, information holders, etc.).

In terms of process, she advocates this idea of having everyone on a team write their thoughts down about the problem being faced in a few sentences, focusing on what seems like it'll be a challenge, what will be easy, what you've run into before, etc.  Then have everyone bring those to the initial design meetings.  I like the idea because it bypasses the introvert-extrovert problem you sometimes get in meetings, and you can start out with a lot of ideas to really jump start the design.  It's a good way to ensure you don't miss out on ideas due to personality issues.

The other thing I like in her process is writing down a purpose statement for objects as you discover them and thinking of them as candidates.  This is part of the CRC card process (the first C is now "candidates").  The reason I like it is that it helps you to focus on the point of the object and sort of justify its existence, which can help weed out some bad ideas. 

What I don't like about the process is the overall CRC card idea.  While it surely is more lightweight than many ways to approach object design, you still end up with a bunch of paper that you then have to translate into code at some point.  I much prefer to use a tool that will literally be creating the code as I design.  I've found the VS class designer serves this purpose quite well.  In fact, on the way up here, I spent some time doing up a sample class diagram using the object thinking approach to share as an example of domain modeling.  I'll be sharing it soon, but I just mention it to say this is not just speculation.  It was actually very lightweight and easy to discover objects and model the domain that way, and at the end I had literal code that I can either fill out myself or hand off to other devs, who can then further refine it.

Domain-Driven Design

The second session I attended was one by Eric Evans on strategic domain-driven design.  Eric wrote a book on the subject that's been well received by everyone I've encountered who spent time with it.  I've seen a presentation on it, and I've read parts of Jimmy Nilsson's Applying Domain-Driven Design and Patterns book.  So I thought I was acquainted well enough with the ideas, but as I often find to be the case, if you rely on second-hand info, you'll inevitably get a version of the info that has been interpreted and is biased towards that person's point of view.

For instance, most of what I've seen on DDD is focused on what Eric calls "tactical" DDD, i.e., figuring out the objects in the domain and ensuring you stay on track with the domain using what he calls the "ubiquitous language."  Eric presented parts of his ideas yesterday that he calls "strategic" because they are more geared towards strategic level thinking in how you approach building your software.  Two key takeaways I saw were what he calls context mapping, which seems to be a really effective way to analyze existing software to find where the real problems lie, and distilling the domain, which is a way to really focus in on the core part of a system that you need to design.

In short (very abbreviated), he claims (and I agree) that no large system will be completely well designed, nor does it need to be.  This isn't to say you should be sloppy, but it helps you to focus your energies where they need to be focused--on the core domain.  Doing this can actually help businesses figure out where they should consider buying off-the-shelf solutions and/or outsourcing, as well as where to focus their best folks.  It's a pretty concrete way to answer the buy vs. build question.

Anyways, I'm definitely going to get his book to dig in deeper (it's already on the way).  Please don't take my Cliff's Notes here as the end of your exploration of DDD.  It definitely warrants further digging, and it is very complementary to a good OOD approach.

After all this, I was privileged enough to bump into Eric and have dinner, getting to pick his brain a bit about how all his thinking on DDD came together, his perspectives on software development, and how to encourage adoption of better design practices (among other things).  Very interesting conversation, one that would have been good for a podcast.  I won't share the details, but I'm sure folks will eventually see some influence this conversation had on me.  Good stuff.

Software for Your Head

I almost forgot about Jim McCarthy's keynote.  I've only seen Jim twice (once in person and once recorded).  He's a very interesting and dynamic speaker, which makes up for some of the lack of coherence.  I find the best speakers tend to come across a bit less coherent because they let speaking become an adventure that takes them where it will.  But I do think there was definitely value in his message.  I tend to agree that he's right in asserting that what we all do on a daily basis has a larger impact on humanity than we realize, and I can't argue with his experience in building teams that work.  http://www.mccarthyshow.com/ is definitely worth a look.

Overall, Tuesday was a big success from an attendee perspective.  So far so good!

Wednesday, 19 September 2007 11:17:20 (Eastern Daylight Time, UTC-04:00)
# Monday, 17 September 2007

When searching recently for further reading to link for "domain model" in a recent post, I was quite surprised to find that there seemed to be no good definition readily available (at least not by Googling "domain model").  Since I tend to use this term a lot, I figured I'd try to fill this gap and, at the very least, provide a reference for me to use when I talk about it.

So What is a Domain Model?
Put simply, a domain model is the software model of a particular domain of knowledge (is that a tautology?).  Usually, this means a business domain, but it could also mean a software domain (such as the UI domain, the data access and persistence domain, the logging domain, etc.).  More specifically, this means an executable representation of the objects in a domain with a particular focus on their behaviors and relationships1.

The point of the domain model is to accurately represent these objects and their behaviors such that there is a one-to-one mapping from the model to the domain (or at least as close as you can get to this).  The reason this is important is that it is the heart of software solutions.  If you accurately model the domain, your solution will actually solve the problems by automating the domain itself, which is the point of pretty much all business software.  It will do this with much less effort on your part than other approaches to software solutions because the objects are doing the work that they should be doing--the same that they do in the physical world.  This is part and parcel of object-oriented design2.
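As a rough sketch of what "behaviors and relationships" looks like in code, here is a tiny slice of an event-calendar domain.  This is my own illustrative example (written in Java here, though the post's context is .NET, and all names are hypothetical): the Event's registration behavior produces its relationship to Attendees, and there are no free-standing data attributes in sight.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch of a behavior-and-relationship-only domain model.
class Person {}

class Attendee {
    private final Person person; // relationship back to the Person
    Attendee(Person person) { this.person = person; }
    Person person() { return person; }
}

class Event {
    // A relationship to other domain objects, not a simple data attribute.
    private final List<Attendee> attendees = new ArrayList<>();

    // Behavior: registering a person yields the Attendee relationship.
    Attendee register(Person person) {
        Attendee attendee = new Attendee(person);
        attendees.add(attendee);
        return attendee;
    }

    int registrationCount() { return attendees.size(); }
}
```

Even in this toy form, you can read the domain off the model: people register for events and thereby become attendees.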

Nothing New
By the way, this is not a new concept--OO theory and practice have been around for decades.  It's just that somewhere along the line, the essence of objects (and object-oriented design) seems to have been lost or at least distorted, and many, if not most, Microsoft developers have probably not been exposed to it, have forgotten it, or have been confused into designing software in terms of data.  I limit myself to "Microsoft developers" here because they are the ones with whom I have the most experience, but I'd wager, from what I've read, the same is true of Java and other business developers. 

I make this claim because everyone seems to think they're doing OO, but concrete examples of OOD using Microsoft technologies are few and far between.  Those who try seem to be more concerned with building in framework services (e.g., change tracking, data binding, serialization, localization, and data access & persistence) than with actually modeling a domain.  Not that these framework services are unimportant, but it seems to me that this approach is fundamentally flawed because the focus is on software framework services and details instead of on the problem domain--the business domain that the solutions are being built for. 

The Data Divide
I seem to write about this a lot; it's on my mind a lot3.  Those who try to do OOD with these technologies usually end up being forced into doing it in a way that misses the point of OOD.  There is an unnatural focus on data and data access & persistence.  Okay, maybe it is natural or it seems natural because it is ingrained, and truly a large part of business software deals with accessing and storing data, but even so, as I said in Purporting the Potence of Process4, "data is only important in as much as it supports the process that we’re trying to automate." 

In other words, it is indeed indispensable but, all the same, it should not be the end or focus of software development (unless you're writing, say, a database or ORM).  It may sound like I am anti-data or being unrealistic, but I'm not--I just feel the need to correct for what seems to be an improper focus on data.  When designing an application, think and speak in terms of the domain (and continue to think in terms of the domain throughout the software creation process), and when designing objects, think and speak in terms of behaviors, not data. 

The data is there; the data will come, but your initial object models should not involve data as a first class citizen.  You'll have to think about the data at some point, which will inevitably lead to specifying properties on your objects so you can take advantage of the many framework services that depend on strongly-typed properties, but resist the temptation to focus on properties.  Force yourself to not add any properties except for those that create a relationship between objects; use the VS class designer and choose to show those properties as relationships (right-click on the properties and choose the right relationship type).  Create inheritance not based on shared properties but on shared behaviors (this in itself is huge).  If you do this, you're taking one step in the right direction, and I think in time you will find this a better way to design software solutions.
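The point about basing inheritance on shared behaviors rather than shared properties can be sketched briefly.  In this illustrative example (Java, hypothetical names of my own), a Speaker and a Room have no data shape worth abstracting over, but they share a behavior--being bookable--so that behavior is the abstraction.

```java
import java.util.HashSet;
import java.util.Set;

// The abstraction is a shared behavior, not a set of shared attributes.
interface Schedulable {
    boolean isFreeAt(int hour);
    void book(int hour);
}

class Speaker implements Schedulable {
    private final Set<Integer> bookedHours = new HashSet<>();
    public boolean isFreeAt(int hour) { return !bookedHours.contains(hour); }
    public void book(int hour) { bookedHours.add(hour); }
}

class Room implements Schedulable {
    private final Set<Integer> inUse = new HashSet<>();
    public boolean isFreeAt(int hour) { return !inUse.contains(hour); }
    public void book(int hour) { inUse.add(hour); }
}

// Code written against the behavior works for both, with no shared properties.
class Scheduler {
    static boolean tryBook(Schedulable s, int hour) {
        if (!s.isFreeAt(hour)) return false;
        s.book(hour);
        return true;
    }
}
```

Had the abstraction been driven by data, Speaker and Room would likely never have ended up under a common type at all.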

My intent here is certainly not to make anyone feel dumb, stupid, or like they've wasted their lives in building software using other approaches.  My intent is to push us towards what seems to be a better way of designing software.  Having been there myself, I know how easy it is to fall into that way of thinking and to imagine that simply by using these things called classes, inheritance, and properties that we're doing OOD the right way when we're really not.  It's a tough habit to break, but the first step is acknowledging that there is (or at least might be) a problem; the second step is to give object thinking a chance.  It seems to me that it is (still) the best way to do software and will continue to be in perpetuity (because the philosophical underpinnings are solid and not subject to change).

Notes
1. An object relationship, as I see it, is a special kind of behavior--that of using or being used.  This is also sometimes represented as a has-a relationship, e.g., this object has one or more of these objects.  It is different from data because a datum is just a simple attribute (property) of an object; the attribute is not an object per se, at least not in the domain model, because it has no behaviors of its own apart from the object it is attached to.  It is just information about a domain object.

2. I go into this in some depth in the Story paper in the Infragistics Tangerine exemplar (see the "To OOD or Not to OOD" section).  I use the exemplar itself to show one way of approaching domain modeling, and the Story paper describes the approach.

3. Most recently, I wrote about this in the Tangerine Story (see Note 2 above).  I also wrote publicly about it back in late 2005, early 2006 in "I Object," published by CoDe Magazine.  My thought has developed since writing that.  Interestingly, in almost two years, we seem to have only gotten marginally better ways to deal with OOD in .NET. 

4. In that article, I put a lot of focus on "process."  I still think the emphasis is valid, but I'd temper it with the caveat that however business rules are implemented (such as in the proposed workflow-driven validation service), you still think of that as part of your domain model.  The reason for separating them into a separate workflowed service is a compromise between pragmatism and idealism given the .NET platform as the implementation platform.  I've also since learned that the WF rules engine can be used apart from an actual .NET workflow, so depending on your application needs, just embedding the rules engine into your domain model may be a better way to go than using the full WF engine.  If your workflow is simple, this may be a better way to approach doing validation.

Monday, 17 September 2007 11:41:54 (Eastern Daylight Time, UTC-04:00)
# Saturday, 15 September 2007

As I sit here on my deck, enjoying the cool autumn breeze1, I thought, what better thing to write about than Web services!  Well, no, actually I am just recalling some stuff that's happened lately on the MSDN Architecture forums and in some coding and design discussions we had this week, both of which involved the question of best practices for Web services.

Before we talk about Web services best practices, it seems to me that we need to distinguish between two kinds of application services.  First, there are the services that everyone has been talking about for the last several years--those that pertain to service-oriented architecture (SOA).  These are the services that fall into the application integration camp, so I like to call them inter-application services. 

Second, there are services that are in place to make a complete application, such as logging, exception handling, data access and persistence, etc.--pretty much anything that makes an application go and is not a behavior of a particular domain object.  Maybe thinking of them as domain object services would work, but I fear I may already be losing some, so let's get back to it.  The main concern of this post is those services used within an application, so I call them intra-application services.

It seems like these latter services, the intra-application ones, are often being confused with the former--the inter-application services.  It's certainly understandable because there has been so much hype around SOA in recent years that the term "service" has been taken over and has lost its more generic meaning.  What's worse is that there has been a lot of confusion around the relationship between the terms Web service and just plain service (in the context of SOA).  The result is that you have folks thinking that all Web services are SO services and sometimes that SO services are always Web services.

My hope here is to offer some clarification as to the way I think we should be thinking about all this.  First off, Web services are, in my book at least, simply HTTP-protocol-based services, usually involving XML as the message format.  There is no, nor should there be, any implicit connection between the term Web service and service-oriented service.  So when you think Web service, don't assume anything more than that you're dealing with a software service that uses HTTP and XML. 

The more important distinction comes in the intent of the service--the purpose the service is designed for.  Before you even start worrying about whether a service is a Web service or not, you need to figure out what the purpose of the service is.  This is where I get pragmatic (and those who know me know that I tend to be an idealist at heart).  You simply need to determine if the service in question will be consumed by a client that you do not control. 

The reason this question is important is that it dramatically affects how you design the service.  If the answer is yes, you automatically take on the burden of treating the service as an integration (inter-application) service, and you must concern yourself with following best practices for those kinds of services.  The core guideline is that you cannot assume anything about the way your service will be used.  These services are the SO-type services that are much harder to design correctly, and there is tons of guidance available on how to do them2.  I won't go in further depth on those here.

I do think, though, that the other kind of services--intra-application services--have been broadly overlooked or just lost amidst all the discussion of the other kind.  Intra-application services do not have the external burdens that inter-application services have.  They can and should be designed to serve the needs of your application or, in the case of cross-cutting services (concerns) to serve the needs of the applications within your enterprise.  The wonderful thing about this is that you do have influence over your consumers, so you can safely make assumptions about them to enable you to make compromises in favor of other architectural concerns like performance, ease of use, maintainability, etc.

Now let's bring this back to the concrete question of best practices for intra-application Web services.  For those who are using object-oriented design, designing a strong domain model, you may run into quite a bit of trouble when you need to distribute your application across physical (or at least process) tiers.  Often this is the case for smart client applications--you have a rich front end client that uses Web services to communicate (usually for data access and persistence).  The problem is that when you cross process boundaries, you end up needing to serialize, and with Web services, you usually serialize to XML.  That in itself can pose some challenges, mainly around identity of objects, but with .NET, you also have to deal with the quirks of the serialization mechanisms.

For example, the default XML serialization requires that properties be public and read-write and that you have a default constructor.  These requirements can break encapsulation and make it harder to design an object model that you can count on to act the way you expect it to.  WCF makes this better by letting you use attributes to gain better control over serialization.  The other commonly faced challenge is on the client.  By default, if you use the VS Add Web Reference, it takes care of the trouble of generating your service proxies, but it introduces a separate set of proxy objects that are of different types than your domain objects.

So you're left with the option of either using the proxy as-is and doing a conversion routine to convert the proxy objects to your domain objects, or you can modify the proxy to use your actual domain objects.  The first solution introduces both a performance (creating more objects and transferring more data) and a complexity (having conversion routines to maintain) hit; the second solution introduces just a complexity hit (you have to modify the generated proxy a bit).  Neither solution is perfectly elegant--we'd need the framework to change to support this scenario elegantly; as it is now, the Web services stuff is designed more with inter-application services in mind (hence the dumb proxies that encourage an anemic domain model) than the intra-application scenario we have where we intend to use the domain model itself on the client side.
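The first option--a conversion routine from proxy types to domain types--looks roughly like this.  This is a hedged sketch with hypothetical names of my own (written in Java for illustration; a real .NET proxy would be generated by Add Web Reference or a command-line tool), not the actual generated code.

```java
// What a proxy generator typically emits: all data, no behavior, no invariants.
class EventProxy {
    public String title;
    public int capacity;
}

// The real domain object: encapsulated, with its invariants enforced.
class DomainEvent {
    private final String title;
    private final int capacity;
    DomainEvent(String title, int capacity) {
        if (capacity < 0) throw new IllegalArgumentException("capacity");
        this.title = title;
        this.capacity = capacity;
    }
    String title() { return title; }
    int capacity() { return capacity; }
}

// The conversion routine whose maintenance (and extra allocation) cost
// is the complexity hit described above.
class ProxyConverter {
    static DomainEvent toDomain(EventProxy p) {
        return new DomainEvent(p.title, p.capacity);
    }
}
```

Every new field on the service contract means touching both the proxy and this routine, which is exactly the maintenance burden in question.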

If you take nothing else away from this discussion, I'd suggest the key takeaway is that when designing Web services, it is perfectly valid to do so within the scope of your application (or enterprise framework).  There is a class of services for which it is safe to make assumptions about the clients, and you shouldn't let all of the highfalutin talk about SOA, WS-*, interoperability, etc. concern you if your scenario does not involve integration with other systems that are out of your control.  If you find the need for such integration at a later point, you can design services (in a service layer) then to meet those needs, and you won't be shooting yourself in the foot trying to design one-size-fits-all services now that make so many compromises as to make the app either impossible to use or very poorly performing.

My own preference, and what I'd recommend, is to use the command-line tools that will generate proxies for you (you can even include a batch file in your project to do this) but then modify them to work with your domain model--you don't even need your clients to use the service proxies directly.  If you use a provider model (plugin pattern) for these services, you can design a set of providers that use the Web services and a set that talk directly to your database.  This enables you to use your domain model easily in both scenarios (both in a Web application that talks directly to the db as well as a smart client that uses Web services). 
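The provider (plugin) idea can be sketched as one persistence interface with swappable implementations.  This is a minimal illustration with hypothetical names of my own (in Java; the in-memory sets stand in for the real Web service calls and SQL), not a real provider framework.

```java
import java.util.HashSet;
import java.util.Set;

// One persistence abstraction the domain code depends on.
interface EventStore {
    void save(String eventId);
    boolean exists(String eventId);
}

// Provider for the smart client: would call the Web service proxies.
class WebServiceEventStore implements EventStore {
    private final Set<String> remote = new HashSet<>(); // stand-in for the service
    public void save(String id) { remote.add(id); }
    public boolean exists(String id) { return remote.contains(id); }
}

// Provider for the Web app: would talk to the database directly.
class DirectDbEventStore implements EventStore {
    private final Set<String> rows = new HashSet<>();   // stand-in for the db
    public void save(String id) { rows.add(id); }
    public boolean exists(String id) { return rows.contains(id); }
}

// The domain-facing service never knows which provider it got.
class EventService {
    private final EventStore store; // chosen by configuration in a real app
    EventService(EventStore store) { this.store = store; }
    void register(String eventId) { store.save(eventId); }
    boolean isRegistered(String eventId) { return store.exists(eventId); }
}
```

In a real application the provider would be selected from configuration, so the same domain model runs unchanged against either backing.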

It requires a little extra effort, but it means you can design and use a real domain model and make it easier to use by hiding the complexity of dealing with these framework deficiencies from consumers of the domain model.  This is especially helpful in situations where you have different sets of developers working on different layers of the application, but it is also ideal for use and reuse by future developers as well.

One of these days, I'll write some sample code to exemplify this approach, maybe as part of a future exemplar.

Notes
1. The weatherthing says it's 65 degrees Fahrenheit right now--at 1pm!
2. My observation is that it is safe to assume that when other people talk about services and Web services, these are the kind they're thinking of, even if they don't make the distinction I do in this post. 

Saturday, 15 September 2007 18:00:03 (Eastern Daylight Time, UTC-04:00)
# Monday, 10 September 2007

I wasn't going to post about it, but after reading Don's post, I realized that I should so that I can thank those involved in presenting me with this honor.  I was surprised when I was contacted about being nominated to be an INETA speaker, and I was even more surprised when I heard that I'd been voted in.  Looking over the folks on the list, I feel hardly qualified to be named among them.

So without further ado, let me thank David Walker (who's an all around great guy and VP of the Speakers Bureau), Nancy Mesquita (who I've not had the pleasure to meet personally but has been very helpful in her role as Administrative Director), as well as everyone else involved on the Speaker Committee and others (whom I know not of specifically) in welcoming me into the INETA speaker fold.  It's a great honor--thank you. 

Now, I have to get back to work!  My group, UXG, just released Tangerine, the first of our exemplars, and now we're on to the next great thing!

Monday, 10 September 2007 10:19:19 (Eastern Daylight Time, UTC-04:00)

Disclaimer
The opinions expressed herein are solely my own personal opinions, founded or unfounded, rational or not, and you can quote me on that.

Thanks to the good folks at dasBlog!

Copyright © 2017 J. Ambrose Little