On this page:
Who is dotNetTemplar?
Love the Bomb
Silverlight Controls
Cooper's Keynote at Agile 2008
Get My UX On!
Wrox Silverlight 2 Programmer's Reference Early Access
Podder Skinning Contest Extended
Minimizing Email Distractions
OpenSocial - The OpenID for Social Networks?
We Don't Need No Architects--Really!
DeveloperDeveloperDeveloper Ireland
Creating Software is Not Like Building
Just Do It! or: How I Learned to Stop Worrying and Love the Job
This is My Bamboo
IT Architect Regional Conference - NYC
Silverlight 2 Sample Application - faceOut
One System to Rule Them All - Managing Complexity with Complexity
Software as a Biological Ecosystem
Blog Notes Live Writer Plug-in
Favor Thoughtful Adherence Over Blind Adherence
Are We Missing the Point of Patterns?
SQL Toolbelt (Mug) by Red Gate
Object Thinking Domain Model Example
The Timeless Way is Agile
Report from SD Best Practices Day IV (Final)
Report from SD Best Practices Day III
Report from SD Best Practices Day II
Report from SD Best Practices Day 1
What is a Domain Model?
Web Services Best Practices
Me? An INETA Speaker?
From Interaction Design to Context Awareness
Are You Passionate?
Tampa Code Camp 2007 [Download]
Review: Essential Windows Presentation Foundation
How Would You Like Your Bits?
Notes on the Notes of the Synthesis of Form
Reminder - NYC Code Camp!
Best Thoughts on SOA
blogmailr Beta
New Jersey CodeCamp III - This Weekend!
Got ASP.NET AJAX? Get ASP.NET AJAX for NetAdvantage!
New Web Site Released for Infragistics!
BEWARE the EnableCaching of XmlDataSource
Tulsa Tech Fest Results Are In
Adding Custom Browser Capabilities in ASP.NET
Tulsa TechFest 2006
Cross-Frame Scripting in Firefox Using XMLHttpRequest
Strongly-Typed Profiles in Web Application Projects (WAP)
Totally Awesome Software Company Wants You
Web Dev with IE
Output Caching Profiles and Custom Caching
Yes, Dorothy, There Really is a Web Architect
Oracle ELMAH Provider
Right-Click to Run ASP.NET Development Server
Live from Redmond Series 2 from the .NET FX Product Teams
Philosophy and IT Architecture
Blog Reading and Authoring Review
The Cat's Out of the Box: The ASP.NET Sand Box
Using Live.com as Blog Reader
NJ.NET User Group Meeting TONIGHT!
Password Policy Policy
Vista 64 bit Up and Running
Accelerating Web Dev with Enterprise Library 2 Session at TechEd
New Blog at Infragistics
Snap-In Failed to Initialize - Indexing Service
VS 2005 Web Application Project V1.0 Released
More Oracle Gotchas
Updated DasBlog
Visual Inheritance with User Controls in ASP.NET 2.0 Web Applications
Launching New Convenience Categories
OPC Odyssey
# Sunday, September 25, 2011

This blog has been retired. For the foreseeable future, I will maintain the content here as is.

Check out my current blog(s).

Sunday, September 25, 2011 5:20:15 PM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [343]  | 
# Wednesday, September 14, 2011


Courtesy of ZDNet

Much ado has been made in recent months about the impending death of <insert name of MS UI stack here>. This week, at BUILD, Microsoft finally stepped up and showed us its view of the future. Note that, in its architecture diagram, the current "desktop" technologies are represented as peers. Included on the right side is Windows Forms, the .NET technology whose death has been exaggerated longer than any other; and yet it is still alive.

The point is, despite all the "future is Metro" talk by analysts (e.g., Mary Jo herself), the fact remains that these are peer technologies, and Microsoft is not "killing" anything on the right side. In fact, no such intent has been expressed, implicitly or explicitly.

That's not to say, of course, that nothing has changed. That's not to say that we can or should ignore Metro/WinRT (duh!). But there seems to be a common knee-jerk reaction whenever a new technology is released: declaring that the current ones are now dead or somehow not worth investing in further. That reaction just doesn't reflect reality.

As impressed and (mostly) happy as I am about the direction expressed in the Win8 stack, we need to keep in mind that we are still in the early stages, still in gestation. The baby isn't even born yet, and once it is born, it will take time to grow up and mature. In the meantime, we have these mature, stable, currently released technologies that are great to build on.

I think it's great that Microsoft has released this stuff early. I like that about them more than I do other tech vendors. Although they've been more tight-lipped about this than about any other tech they've released, the fact remains we have plenty of time to plan, prepare, design, prototype, explore, and ultimately build for the new stack. In the meantime, we can still safely invest in the current technologies.

The future is uncertain. That is the nature of the future. Devs need to quit unrealistically asking Microsoft to guarantee them the future of technology. We know that it would be bad business for Microsoft to kill off these current technologies--so bad that we can take it as a practical guarantee that they are here to stay for any future we should currently be planning for. We will always have legacy. Someday, the Win8 stack will, I assure you, be legacy.

The things that remain constant are:

  • Understand the needs of your application context.
  • Understand the capabilities, strengths, and weaknesses of the various technologies you can build on, including current investments.
  • Understand your team's capabilities, strengths, and weaknesses, including current investments.
  • And choose the technology stack that makes the most sense, best balancing all these considerations, realizing that you won't make all the right choices (in retrospect) and that this is just life as a software professional.

Everything else is just a bunch of unnecessary worry and hullabaloo.

Wednesday, September 14, 2011 9:58:18 PM (Eastern Daylight Time, UTC-04:00)
# Wednesday, October 29, 2008
The good folks over at Microsoft have released their Silverlight Toolkit today (or I guess yesterday, now).  It's a good start to complement what you get in the core/box.  I personally found the AutoCompleteBox probably the most interesting control, and the theming capabilities being introduced with the ImplicitStyleManager are promising, too.  I also like the way the team is running the toolkit project on codeplex and all.  Shawn Burke (of Ajax Control Toolkit fame) is a great guy to be leading that group of talented folks.
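To give you a feel for it, here's a minimal sketch of dropping an AutoCompleteBox onto a page. Fair warning: the clr-namespace and assembly names below are my recollection of the CTP-era toolkit bits, so double-check them against the toolkit's own docs before copying this verbatim.

```xml
<!-- Minimal AutoCompleteBox sketch. The xmlns mapping and assembly name
     are assumptions based on the toolkit release bits; verify them. -->
<UserControl x:Class="ToolkitDemo.Page"
    xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
    xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
    xmlns:input="clr-namespace:Microsoft.Windows.Controls;assembly=Microsoft.Windows.Controls.Input">
  <StackPanel>
    <!-- Suggest matches after the first typed character -->
    <input:AutoCompleteBox x:Name="SearchBox"
                           MinimumPrefixLength="1"
                           FilterMode="StartsWith" />
  </StackPanel>
</UserControl>
```

Set ItemsSource in code-behind to whatever collection you want suggestions drawn from (SearchBox.ItemsSource here is just a hypothetical name), and you get filtered suggestions as you type.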

Of course, in my humble opinion ;-), no discussion of Silverlight controls would be complete without looking at the outstanding work that Infragistics is doing, particularly in the data visualization space.  In our current CTP that you can download now, we have the xamWebChart, xamWebGauge, xamWebMap, xamWebTimeline, and xamWebZoombar.

If you go to http://infragistics.com/silverlight, you can read about all those controls, but I encourage everyone to just go play with them in our really awesome (if I do say so myself!) Silverlight controls samples browser.  I mean, it is just plain fun tweaking around with it.  I could sit and watch the datapoint transitions all day, and the chart zooming is freakin' cool.  Needle dragging gauge, overview plus detail implementation for the map, timeline...  Heck, who am I kidding?  It's all sweet!

    

I'm not just saying that because I work there; I really was just having fun and am frankly impressed with the work our guys are doing.  Kudos to our entire Silverlight team!   Great job, guys!  I'd name names, but I'm bound to forget someone.  You know who you are!

And the fact that I'm playing with this in Safari on my Mac over my crappy hotel internet access just makes it that much cooler and fun.  You gotta love Silverlight! 

If you're more interested in Line of Business controls (e.g., Outlook bar, hierarchical data grid, tree, etc.), check out our info on Silverlight LOB controls.  According to our roadmap published there, you should see a CTP of our first release towards the end of this year.  Silverlight rocks, and I'm looking forward to seeing it develop and to being a part of making it even better.

You should stop by our booth at PDC, if you haven't already, and ask about all this.  We're Booth #201 (about 3 down on the left from the middle entrance).  There's still some time left, and you can pummel the guys and gals there with your questions.  Of course, you can always just call, chat, or email someone as well (or use the contact link on this blog, and I'll put you in touch). 

Also, feel free to stop me in the hall at PDC if you want.  I won't bite.  Everyone says I look like Kevin Smith (which is why I was Silent Bob at the Expo reception Monday), so you should be able to recognize me, even without the trenchcoat. :)  Now I have to get to sleep so I can keep my promise and not be a zombie (and ergo not bite).  G'night!

Wednesday, October 29, 2008 4:10:19 AM (Eastern Standard Time, UTC-05:00)
# Saturday, August 16, 2008

I just ran across Alan Cooper's keynote at Agile 2008.  The gist is that he's making the case for integrating interaction design into Agile development, something that is near and dear to me, as well.  I was pleasantly surprised by his talk, and I recommend it to all my dev friends. 

You can quickly scan through the slides and his notes to get the whole story.  I'm not sure if I could have said it better myself!

Saturday, August 16, 2008 2:45:23 PM (Eastern Daylight Time, UTC-04:00)
# Friday, August 8, 2008

I'm heading out to San Francisco Monday to get my UX on at UX Week 2008! It's my first time both in the city and at that conference. Looking forward to meeting new folks and talking about making great software experiences. If you're in the area or at the conference, send me an email (ambrogio[at]gmail) or ping me on crowdvine. I'd be glad to get together to talk about UX, software, architecture, whatever!

Friday, August 8, 2008 10:36:18 PM (Eastern Daylight Time, UTC-04:00)
# Tuesday, July 29, 2008

Wrox has recently started something they're calling Wrox First.  It's essentially a wiki of the book that my fellow Silverlight 2 authors and I are working on--Silverlight 2 Programmer's Reference.  Not only do you get early access, you can also shape how the book develops by making comments and suggestions.  My understanding is that it's just $19.99 and will get you access to drafts, revisions, and the final chapters as they appear in the book for up to a year after publication.

Seems like an interesting option for those who want the book and sample code now rather than waiting until later this year when it is published.  Let me know what you think!

Tuesday, July 29, 2008 2:35:41 PM (Eastern Daylight Time, UTC-04:00)
# Wednesday, July 2, 2008

Phew!  I just moved yesterday (actually all weekend and yesterday, and there's still more unpacking to go!).  Man, all that moving is starting to wear on me, but we're very happy in the new place.  A lot more space to make room for number four! :)

On to the point.  Josh Smith has extended his Podder skinning competition.  For those who don't know, Podder is this nifty WPF-based podcasting client/player.  He designed it so that you can completely change the look and feel using skins.  I suggested a better term would be skeletoning, since you can change the structure in addition to the styling, but so far that hasn't caught on.  Be sure to tell him you think that's a better term!

Wednesday, July 2, 2008 9:39:04 AM (Eastern Daylight Time, UTC-04:00)
# Friday, June 20, 2008

I'm not sure why this didn't occur to me before...  I read another brief article the other day about the negative impact of email on productivity, so I was thinking about a way to deal with it that didn't involve, e.g., closing Outlook or even setting an "I'm not available by email until 3p today" out-of-office type message--that seems a bit extreme, and it would also preclude my getting meeting reminders.

It occurred to me that what usually happens is I get the nifty little toaster popup notification while doing something, almost always click on it for more detail, and then get drawn into a distraction over it.  Similarly, I was using one of those Gmail Vista gadgets that would highlight when I had Gmail waiting, or I'd leave it open and minimized and see the Inbox count in the taskbar.  The problem was not (for me) so much getting too much email as having the regular interruptions that were occasioned by these terribly useful notification mechanisms. 

Having isolated the problem, i.e., having framed the question correctly (which is usually the most important part of solving a problem), I asked "How can I make these notifications go away?"  And the answer was immediately apparent: turn them off. :)

To that end, I went into Outlook's advanced email options (Tools -> Options -> Email Options -> Advanced Email Options--who knew notifications were advanced?!) and deselected all the notification options:

Advanced E-mail Options Dialog

I then removed the Gmail notifier gadget, and I now close Gmail when I'm done with it.  The magic is that I still get my task and meeting reminders, but I don't get the regular interruptive notifications.  This had an immediate, noticeable effect--I could work through to a good stopping point on the thing I was working on, i.e., a point where I'd normally take a break anyway, and then check my email.  Wow!  Who knew something so simple could make such a difference?  I figure if it is critical, somebody will call or come knocking on my door. :)

As a complementary technique, I have taken my Inbox strategy to the next level, following a bit of advice from Mark Hurst (who wrote a book on Bit Literacy [that I haven't read]).  One of his suggestions for avoiding information overload is to keep your Inbox empty.  I was already working toward that because I used my Inbox like a to-do list (and don't like having a long to-do list), but Mark's advice is precisely not to do that--use it as an Inbox and get stuff out of it immediately.

Having not read the book (which I'm sure is full of helpful little tidbits), I take that to mean: act on it immediately if possible, file it if need be, or set up a task to do something with it later.  I was already doing the first two, but I've found the third technique to be a nice addition.  There is a distinct satisfaction (for me, anyway) in having an empty inbox--maybe it's my personality type. :)

I hope this maybe helps others out there in the same boat.

Friday, June 20, 2008 5:28:31 PM (Eastern Daylight Time, UTC-04:00)
# Thursday, May 15, 2008

I haven't done any research, so maybe it is already out there.  But I had a thought the other day as I accepted yet another invitation to connect, on yet another social networking site, from someone I have already connected with numerous times elsewhere.

Wouldn't it be great if I could have one, unified set of social contacts, my social network, that I could then just share out to various social networking sites?  I mean, sure, folks would have to opt into it, someone would have to think about the privacy issues, but good grief, it seems like we need something like that...

Thursday, May 15, 2008 1:02:49 PM (Eastern Daylight Time, UTC-04:00)
# Wednesday, April 23, 2008

Just reading the first article in the latest edition of Microsoft's The Architecture Journal.  It's called "We Don't Need No Architects" by Joseph Hofstader.  I thought, oh good, someone voicing a dissident opinion, but the article is rather a rebuttal to that claim.  I figure maybe a response to the response is in order. :)

Mr. Hofstader suggests that architects think in terms of bubbles and devs think in terms of code and, by extension, only see part of the picture.  He describes various "architectural" activities such as analyzing the problem domain, choosing and applying appropriate technologies to solve problems, and the use of patterns.

Is it just me, or is this a sort of dumbing down of the developer role in order to support a potentially unnecessary distinction between it and "the architect"?  I mean, a smart developer needs to do all of these things, too.  Developers are not just code monkeys.

In fact, by introducing such a division of responsibilities, we would seemingly perpetuate a long-standing problem in software development--a disjuncture between the problem and solution spaces--because we keep trying to insert these business translators (call them technical business analysts, software architects, whatever you want) into our methodology.

What's wrong with this?  First, it puts the burden for understanding the business onto one (or a few) persons, but more importantly, it limits that mind share to those individuals.  That is never a good thing, but it is especially bad for software.  In so doing, it also puts a burden on those individuals to correctly interpret and translate (a considerable challenge) and finally to sufficiently communicate a design to developers--enter large specification documents, heavier process, and more overhead.

On the other hand, domain-driven design, for instance, is all about instilling domain knowledge into the solution and coming to a common alignment between the business and the solution creators.  It's axiomatic in business that you need a shared vision to be successful, and this approach to software creation is all about that.  Shared vision, mutual cooperation, and a shared language. 

It eliminates the need for a translator because both learn to speak the same domain language.  It eliminates the knowledge bottlenecks (or at least really reduces them), and it increases shared knowledge.  And DDD is not burdened with the distinction between an architect and a developer.  Agile methodologies in general are geared towards reducing barriers and overhead in the creation of software (and that's why they're generally more successful, and they can scale).

I hope that all the brilliant, better-known, and more respected folks will forgive me; this is not intended as a slight, but I have to ask--are we creating the "architecture" profession unconsciously just to create a more defined career path (i.e., a way for us techies to move up the ranks)?  Are we just going with the flow of an old but broken analogy?  Are we introducing roles that really would be better served through other, non-architecty roles?

To this last point, I see some folks suggesting "infrastructure" and "business" and "software" and "what-have-you" architects.  Why are we so keen on the term "architect"?  I'll grant, it does sound really fancy, but it is so, so painfully clear that it is ambiguous and overloaded (and inaccurate, if you ask me).  Maybe these other roles do need to exist in some organizations, but it seems like we're just bent on calling them "architect" for no apparent good reason other than that we've latched onto it as a respectable (and well-paid) moniker.

In choosing to proliferate the "architect" terminology, we're perpetuating and extending the confusion around it.  We're purporting to solve the problem of it being ill-defined, but in reality we're doing the opposite.  And everyone (IASA, Open Group, Microsoft, to name some just in the latest issue of the Journal) is trying to do it all at once with little coordination. 

It seems borderline insane. 

Or maybe I'm the crazy one?

there is no spoon

Wednesday, April 23, 2008 3:15:42 PM (Eastern Daylight Time, UTC-04:00)
# Tuesday, April 15, 2008

A few buddies of mine, Phil Winstanley and Dave Sussman, have asked me to pass along that they're doing an upcoming DeveloperDeveloperDeveloper event in Galway, Ireland on 3 May.  So on the off chance I have some readers in that area, I figured I'd pass it along.

Enjoy!

Tuesday, April 15, 2008 5:13:13 PM (Eastern Daylight Time, UTC-04:00)

I'm becoming more and more averse to the term architecture and architect in terms of creating software, partially because it is such an overloaded term that seems to cause so much continual opining about its meaning but, more importantly, because I don't think it is like what we do, at least not to the extent that seems to be commonly thought.

We are not building houses (or bridges, or skyscrapers, or cities).  Houses, and other physical constructions, rely on pretty much immutable laws of nature--of physics, chemistry, etc.  These sciences are sciences in the established sense: you can perform experiments repeatedly and get the same results, and others who perform those experiments will get the same results.  Physical building, architecture, and engineering are fundamentally scientific endeavors because they essentially serve scientific laws.1

Software Serves Human Social Needs
Software, on the other hand, is fundamentally a human and social endeavor.  Above the basic electrical and magnetic level, i.e., hardware, it is purely human constructs--layers of human-created abstractions serving human social needs--for, ultimately, business or pleasure.  As such, we (as a human industry) are pretty much free to create those abstractions as we see fit.

Beyond the basic hardware translation layer, we are not bound by elemental laws, only by our imagination.  The problem is, it seems to me, that early software development was very closely tied to the electrical engineering disciplines that gave birth to computing machinery, so the early abstractions were engineering-oriented and assumed an unnecessary scientific and engineering bent.  Subsequent developments, for the most part, have built on this engineering basis, and our educational system has perpetuated it.  Even though relatively few software creators these days need to understand the inner workings of the hardware (and even one layer of abstraction up), such low-level engineering is at the core of many computer science curricula.

As the power of computing machinery has grown, we've expanded the uses of software to take advantage of the new power, but we have remained an essentially engineering-based culture and have accrued other engineering-related words such as architecture and architect.  We have engineers and developers, systems analysts, and architects.  We have projects and project managers, and many try to manage software projects as if they were building projects.  We have builds, and we say we're building or developing or engineering software.

We have, built into our very language, an implicit association with physical building, and we have that association repeatedly reinforced by those who want to draw direct analogies between our trades.  Certainly, there are similarities, but I tend to think many of those similarities have been manufactured--they're not inherent to the nature of software.  We've painted ourselves into a corner with such analogies and borrowed techniques and language.

Perceived Crisis of Complexity and Terminology
Now we're having this crisis, as some seem to paint it, where we need to elaborate further and push the idea that our systems are like cities and that we need varying levels and kinds of architects to help plan, build, maintain, and expand these software cities.  We have folks struggling to define what an architect is, what architecture is, and creating various stratifications within it to expand on this analogy.  We purportedly need enterprise architects, solutions architects, infrastructure architects, data architects, and more.

There is, I think, a well-intentioned effort to fix it because we do see the corner we've painted ourselves into, but we're reaching for the paint brush and bucket to solve it--reaching for those same ill-fashioned analogies, techniques, mindsets, and culture.  We see all this accrued complexity, and our solution is to make things even more complex, both terminologically and systematically, because we're engineers and scientists, and scientific problems are solved with scientific methods and precision, no?

It seems the underlying problem is that we're approaching the problem all wrong.  The problems we're solving are fundamentally human problems, particularly social problems.  And by social, I don't mean social networking software that is now en vogue; I mean social in the basic sense of dealing with interactions between humans, be that economic, entertainment, education, social connection, or whatever.  It follows, then, that the best solution will be fundamentally human in nature, not scientific, not engineering.

Realigning with Our Core Problem Domain
Maybe we should avoid likening ourselves to engineering and scientific disciplines, and especially, we should shun terminology that ties us to them and binds our thinking into those molds.  As a man thinks, so is he, as the saying goes.  Surely, we can and should learn what we can from other disciplines, but we need to be more reluctant to insinuate them into our own as we have done with building.

I do think various solutions have been tried to better align software with its problem domain.  Object-oriented design is at a generic level an attempt to urge this sort of alignment, as is its more developed kin, domain-driven design.  Agile and its like work toward human-oriented processes for creating software.  Natural language systems, workflow systems, small-scale (solution-level) rule engines, and even some higher-level languages have attempted this.  And in fact, as a rule, I think they succeed better than those more closely tied to the computing and building conceptual models, except that even these more human-oriented abstractions are chained by the lower level abstractions we've created.

What we need to do is continue to develop those human-oriented models of creating software.  It seems that we may be at a breaking point, however, for our continued use of the building paradigm.  Our repeated struggles with the terminology certainly seem to speak to that.  Our terribly confused and complicated enterprise systems landscape seems to speak to that.  And our control-driven, formal, gated processes have been shown most clearly of all to be broken and inappropriate to the task of software creation.

New Terminology
To make the next step, perhaps we should reexamine at a fundamental level how we think about software, both the artifacts and how we create them.  I think we need to make a clean break with the engineering and building analogy.  Start fresh.  Unshackle our minds.  Maybe we need to drill down the abstraction layers and figure out where we can most effectively make the transition from hardware control to our human, social domain.  I imagine it would be lower than we have it now.  Or maybe it is just a matter of creating a better language, an intentional language (or languages) and a move away from our control-oriented languages. 

At a higher level, we certainly need to rethink how we think about what we do.  Some folks talk about the "architect" being the "bridge" (or translator) between the business and the technical folks.  If that is a technical role, which I tend to doubt, it seems like a more appropriate title would be Technical Bridge or Technical Translator or Technical Business Facilitator or even just Software Facilitator.  Call it what it is--don't draw unnecessarily from another dubiously-related profession.

But maybe thinking this role is best served with a technical person is not ideal.  Maybe we technical folks are again trying to solve the problem with the wrong tools--us.   Well-intentioned though many are, if we are technical in tendency, skills, talent, and experience, we are not as well equipped to understand the squishy, human needs that software serves or best identify how to solve such squishy human problems.

Since software is essentially a human-oriented endeavor, perhaps we need a role more like that which has been emerging on the UX side of things, such as [user] experience designer or interaction designer.  They are better-equipped to really grok the essentially human needs being addressed by the software, and they can provide precise enough specification and translation to technical folks to create the experiences they're designing, even with the tools we have today.

Then again, some say that architects are the ones concerned with high-level, "important" views of a solution and the interactions among its individual pieces--that they are the ones who model these high-level concerns and even provide concrete tools and frameworks to help piece them together effectively.  I say that we could call this role solution coordinator, solution designer, or solution modeler.  But then, according to people like Eric Evans, those in this role should be hands-on to be effective,2 which I also believe to be true.  In that case, what they become, really, is a kind of manager or, simply, team leader--someone who's been there and done that and can help guide others in the best way to do it.  At that point, the skills needed are essentially technical and usually just a matured version of those of the people actually crafting the solution.

Instead of software developers and architects, how about we just have technical craftsmen?  The term is appropriate--we are shaping (crafting) technology for human use; it also scales well--you can add the usual qualifiers like "lead," "manager," "senior," whatever fits your needs.  There's no unnecessary distinction between activities--whether the craftsman is working on a higher-level design or a lower-level one, it is all essentially the activity of shaping technology for human use.  Depending on the scale of the team/endeavor, one craftsman may handle all levels of the craft or only part, and in the latter case, the division can easily be made based on experience and leadership.  And finally, it does not introduce cognitive dissonance through an extremely overextended and inaccurate analogy (like developer and architect).

Even if you don't like the term craftsman--we could collaborate to choose another that doesn't chain us to wrong thinking--the point remains that we should recognize that we've introduced unnecessary and unhelpful distinction in our discipline by using the dev and architect terminology.  We could begin to solve the conundrum by abandoning these titles.

Resisting the Urge to Rationalize and Control
Also, by looking at each solution as a craft--an individual solution tailored to address a particular human problem--it becomes clearer that we need not be so ready to try to rationalize all of these solutions into some greater system.  As soon as we do that, we fall back into the engineering and computing mode of thinking that begins to impose unnatural constraints on the solutions and inhibits their ability to precisely and accurately solve the particular human need.3

As I suggested before, we should rather treat these solutions more like a biological ecosystem--letting selection and genetic mutation mechanisms prevail in a purely pragmatic way that such systems have so well embedded in their nature.  I believe it is a misplaced good intention to try to govern these systems in a rationalistic, control-driven way.  We deceive ourselves into thinking that we are managing complexity and increasing efficiency when in reality we are increasing complexity that then, recursively, also has to be managed in such a philosophy (creating an infinite complexity management loop).  We also reduce efficiency and effectiveness (well-fittedness) of solutions by interfering with solutions with controls and imposing artificial, external constraints on them to serve our governance schemes.4

Wrapping It All Up
Once we stop trying to align ourselves with a fundamentally different endeavor--physical building--we free ourselves to essentially orient what we're doing towards the right domain--human social problems.  In doing so, we can re-examine our abstraction layers to ensure they most effectively fit that domain at the lowest possible level, and then we can start building new layers as needed to further enable effective (well-fitted) solutions for that domain.  By changing our language, we solve cognitive dissonance and illuminate where distinctions are truly needed, or not needed, and may even recognize where skills that are not inherently technical would better serve our solutions (such as UX pros).  And lastly, by treating the solutions as fundamentally human, we recognize that the most efficient, effective, time-tested5 and proven technique for managing them is more biological and less rational.  We see that they can best manage themselves, adapting as needed, to fit their environment in the most appropriate way possible.

If we're going to have a go at fixing the perceived problem of complexity in software and, by extension, further understand how to solve it through our profession, I suggest that a somewhat radical departure from our current mode of thinking is needed--that we break away from the physical building analogy--and it seems to me that something like what I propose above holds the most long-term promise for such a solution.  What do you think?

Notes
1. I should note that I recognize the artistic and ultimately social aspects of physical constructions; however, they are still fundamentally physical in nature--bridges are physically needed to facilitate crossing water or an expanse; buildings are needed physically for shelter.  The social aspects are adornments not inherent to the basic problems that these constructions solve.  The same cannot be said of software; it exists solely to serve human social advancement in one form or another.
2. See Eric Evans's "Hands-On Modeler" in Domain-Driven Design: Tackling Complexity in the Heart of Software.
3. As an aside, I truly do wonder why we should have to convince businesses of the need for the "architect" role.  If you ask me, the need, and our value/solution, should be obvious.  If it takes a lot of talking and hand waving, maybe we should question whether the solution we're proposing is actually the right one.
4. I have to nuance this.  Obviously, if there are governmental regulations you have to follow, some such controls are required; however, if you think about it, this is still adapting the solution to best fit the human problem, because the human problem likely involves some need of societal protection.  Certainly not all systems need such controls; even within an organization, only some do.  Keep the controls scoped to the solutions that require them due to the human social conditions.  On the whole, I'd say there are far more systems that don't need them, though the ones that do loom large in our minds.
5. By this I mean to say that, according to evolutionary theory, biological processes have developed over many millions of years and have proven themselves an effective means for highly emergent, living systems to self-govern.  Businesses and human social structures in general, especially these days, are highly emergent, dynamic, and living and need software that reflects that mode of being.

Tuesday, April 15, 2008 4:04:01 AM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [1]  | 
# Saturday, April 12, 2008

In his article, "Who Needs an Architect?", Martin Fowler says:

At a fascinating talk at the XP 2002 conference1, Enrico Zaninotto, an economist, analyzed the underlying thinking behind agile ideas in manufacturing and software development. One aspect I found particularly interesting was his comment that irreversibility was one of the prime drivers of complexity. He saw agile methods, in manufacturing and software development, as a shift that seeks to contain complexity by reducing irreversibility—as opposed to tackling other complexity drivers. I think that one of an architect’s most important tasks is to remove architecture by finding ways to eliminate irreversibility in software designs.

How interestingly this melds with my recent thoughts on managing complexity.2 You see, adding processes, management systems, and "governance" in general makes things more ossified, more difficult to change, i.e., less reversible.  According to Zaninotto, this would mean that the more governance we put in place to, theoretically, manage the complexity of our software systems, the more complex they are bound to become, which I think logically means that we are increasing our complexity woes rather than helping them through such efforts.
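To make the reversibility idea concrete, here's a minimal sketch of one common way a design decision is kept reversible--callers depend on an abstraction rather than a concrete implementation.  (Shown in Java for brevity; all names are invented for the illustration, not taken from any real system.)

```java
// Illustrative sketch (invented names): callers depend on the OrderStore
// abstraction, so the choice of storage engine remains a reversible decision.
interface OrderStore {
    void save(String orderId);
    boolean contains(String orderId);
}

// Today's concrete choice; tomorrow's could be a database-backed store.
class InMemoryOrderStore implements OrderStore {
    private final java.util.Set<String> orders = new java.util.HashSet<>();
    public void save(String orderId) { orders.add(orderId); }
    public boolean contains(String orderId) { return orders.contains(orderId); }
}

// The only "architecture" here is the seam: OrderService never names a
// concrete store, so replacing one later is a local, reversible change.
class OrderService {
    private final OrderStore store;
    OrderService(OrderStore store) { this.store = store; }
    void placeOrder(String orderId) { store.save(orderId); }
}
```

Swapping in, say, a database-backed store later is then a contained change rather than an upheaval--which, as I read it, is the sort of "removing architecture" Fowler has in mind.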

I came across this in a recent thread on our (now-retired) architect MVP email list, where the age-old discussion of "what is an architect?" has come up again.  I have to admit, when I first seriously confronted this question, I was drawn in and fascinated.  I even wrote an article about it on ASPAlliance.3  Since writing that, I've been keeping an eye on the developments at IASA and elsewhere in this space, and developing my own thoughts.

I've delved even more into agile approaches, particularly Scrum and domain-driven design (DDD), and into this thing we call "user experience,"4 which at first glance seems counter to our architectural/engineering approaches to building software.  I've gained more experience building software as an architect and manager and observing software being built at the commercial level.  I've been more involved in the business and marketing side of things, and I've been blessed with the opportunity to learn from some of the leading minds in our profession. 

At this point, I'm of the get 'er done school, which I suppose might map loosely to Fowler's Architectus Oryzus, Eric Evans' Hands On Modeler, and others along those lines.  I'm bought into User-Centered Design (or human-centered design, for those who prefer that), though I think we need to figure out a good way to merge DDD with UCD and a smattering of service orientation (as needed!) to make software the best it can be.

Software is complex enough without our making it more so with artificial taxonomic and gubernatorial schemes.  Software should be teleological by nature.  It exists to serve an end, a purpose, and if it isn't serving that purpose, the answer is not to create counterproductive metastructures around it but rather to make the software itself better.

One of the chief complaints about IT is that we seem resistant to change or at least that we can't change at the speed of business.  Putting more processes, formalization, standardization, etc. in place exacerbates that problem.  The other biggie is that software doesn't meet the need it was designed to meet.  Both of these, at their core, have the same problem--ineffective and inefficient processes that are put in place to manage or govern the project.

I tend to think that projects need managing less than people need managing or, rather, coaching.  You get the right people, you give them the equipment, the training, and the opportunity to do the right thing, and you get out of the way and help them do it.  You don't manage to dates (or specs!); you manage to results.  If you don't have a solution that meets or exceeds the need at the end of the day, you failed.  In fact, I might go as far as to say that if what you built matches the original specs, you did something wrong.

Any managerial involvement should have a concrete and direct end in mind.  For instance, coordination with marketing and other groups requires some management, but such management should be communication-oriented, not control-oriented.  Start small and evolve your management over time.  Management, like other things that are designed, is best evolved over time5 to meet these concrete, real needs--and you should keep an eye out for vestigial management that can be extracted. 

Similarly, I don't think we need to tackle the software (IT) profession by trying to define and stratify everything we do.  In fact, I feel it would be a rather monumental waste of our collective valuable time.  One thing is certain: our profession will change.  New technologies and new ideas will combine with rapidly changing business needs; new roles will emerge while old roles become irrelevant (or are at least subsumed into new roles).  Monolithic efforts at cataloguing and defining (and by extension attempting to control) will, in the best of all possible worlds, be useful only for a short time.

It's clear that there are many approaches to doing software.  It's axiomatic that there are many distinct, even unique business needs (inasmuch as there are many unique individuals in the businesses).  What we should be doing as a profession (indeed, what I imagine and hope most of us are doing) is focusing on how to make great, successful software, not whiling away our lives and energy talking about ourselves.

If you ask me what I do (e.g., on a demographic form), I tend to put software maker, or just software.  Obviously, that's not specific enough for hiring purposes.  But in hiring, we're really looking for knowledge, experience, skills, talents, and attributes, not a role or title.  A title is just a hook, a handy way to get someone interested.  If the market shows that using "architect" in a title catches the attention you want, use it (whether you're a worker or looking for workers).  The job description and interview process will filter at a finer level to see if there's a match.

Outside of that, we don't really need to spend a lot of time discussing it.  We're all just making software.  We all have unique knowledge, experience, talents, skills, and attributes, so there really is very little use in trying to categorize it much beyond the basic level.  So how about we stop agonizing over defining and stratifying "architecture" and "architect," stop worrying about controlling and governing and taxonomifying, and instead invest all that valuable time in just doing what we do--better!?

Notes
1.  More at http://martinfowler.com/articles/xp2002.html.
2. See One System to Rule Them All - Managing Complexity with Complexity and Software as a Biological Ecosystem.
3.  Read "What is Your Quest?" - Determining the Difference Between Being an Architect and Being a Developer.
4.  Good place to start: http://en.wikipedia.org/wiki/User_experience
5.  This principle is discussed, for example, in Donald Norman's The Design of Everyday Things.  Christopher Alexander also discusses a similar principle in The Timeless Way of Building.

Saturday, April 12, 2008 4:06:39 PM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [0]  | 
# Friday, March 28, 2008

I finally gave in and bought a graphics tablet.  My budget being as huge as it was, I opted for the Wacom Bamboo, which retails at $79, but ANTOnline (via Amazon) had it for $50 plus shipping ($58 total).  I haven't been this tickled to get a new gadget in a while.

The whole experience thus far has been grand.  I placed the order at about 10 p.m. on Tuesday night.  I got an email Wednesday night saying it had shipped, and when I opened it Thursday morning and clicked the tracking number, I was informed it was out for delivery--and I paid for standard shipping.  Awesome.

I got the box later Thursday morning, and opened it to find a sleek box wrapped in tissue paper, as if it were a gift.  After sliding it out of the tissue paper, here's what I saw:
Wacom Bamboo Box

Not bad styling.  Let's open 'er up:
Wacom Bamboo Welcome Messages

"This is your Bamboo.  Use it to get more out of your computer.  Let us know how it goes..."  In many languages.  Then it is signed by, presumably, the creators.  Very nice touch, I thought.  I felt like a proud owner already.  Then you lift up that insert, and there's the tablet in all its beauty.  Grab it out--there's the cord, the pen, the pen holder.  Great.  Simple. Obvious.  Beneath that is another tissue wrapped gift, a stylish little black box that has some simple instructions on getting going and the DVD.

Wacom Bamboo Open Box

Just opening the thing was a pleasure.  Honestly, these folks know what UX is, and this is just for an $80 graphics tablet. 

I plugged it in, and it immediately just worked.  Having read a comment somewhere, I just went to the Web site to download the latest drivers.  That was easy.  Install.  I had to try twice; it got hung up for some reason, but then, I did have 30 apps open at the time and they did suggest closing them all. :)

I immediately opened OneNote and went to town.  I started drawing the simple stuff as Dan Roam suggests in his new book, The Back of the Napkin.  (I attended his session at Mix and liked it enough to buy the book.)  Then I really went out on a limb and drew a self-portrait:

Ambrose Self Portrait

Not bad, eh? 

Well, it was a first shot.  I tried writing and realized just how bad my penmanship has become over the years.  Trust me; it's bad.  Nice thing is that maybe I'll get some of it back and improve it now that I have this (who knows?). 

I'm now on Day 2 of using my Bamboo, and I really like it.  My wrist, which had been hurting more of late, has been loving me.  One of the reasons I tried this was to see if it'd help me avoid repetitive strain injury, and I noticed an immediate difference.  The other reason was that I get so tired of being constrained by drawing programs in terms of what I want to represent visually.  SmartArt in Office really, truly (as cool as it is) only goes so far. :)

So my first real use was to start diving into my Agile UX Design Process diagram to replace a particularly painful slide (Slide 19) in my Building Good UX talk.  It (both the drawing and the process) is a work in progress; just trying to visualize some of my thinking about it right now.

Agile UX Design Process

If you look hard, you can see my chicken scratch compared to the nice, free Journal font I picked up.  The point of this diagram is to show how to integrate UX pros into an Agile process.  Not saying this is all fleshed out or perfect, but it's a start. :)  One important point is that even if you don't have the pros, you can start doing the UX stuff yourself.

A Few Tips Using Bamboo (thus far)

  1. Use Mouse mode.  When you install the driver, it switches to Pen mode, which tries to map your screen(s) to the tablet.  Even though Wacom recommends this mode (and even provides exercises to get used to it), I found it frustrating when trying to draw on my right screen--I felt too close to the edge for comfort.
  2. Disable acceleration.  While it can be a nice feature when using it literally like a mouse, it messes you up when drawing.
  3. Switch to the dreaded single-click mode in Explorer.  Back when the single click mode was added (XP?), I tried it out and was disgusted.  But double-clicking w/ the pen is just not easy, and actually, the single-click mode feels really natural with the pen.
  4. Switch to scroll on touch ring. I don't feel too strongly about this, but honestly, I don't use zoom (the default) enough to have it as a top-level feature on the tablet.
  5. Upgrade to Vista?  I think you may not get ink in Office 2007 w/o Vista.  I can't figure it out, but it's not there for me in XP.  The Wacom site mentions Vista explicitly, and my searches haven't turned up anything useful.  Folks talk about "Start Inking" as if it is just always there, but it may also have something to do with Tablet PC.  I'll let you know if I figure it out.

It is taking some getting used to, of course, but so far I think it's a big improvement.  Ask me in a few weeks. :)

And now for the gratuitous signature:

J. [Ambrose] Little

Nice.

Friday, March 28, 2008 5:32:00 PM (Eastern Standard Time, UTC-05:00)  #    Disclaimer  |  Comments [2]  | 
# Wednesday, March 19, 2008

Just wanted to let you all know that I'll be speaking at and attending the upcoming ITARC in NYC, May 22-23.  The conference is grassroots and platform-agnostic.  Grady Booch is giving the keynote from Second Life.  There are some great roundtables and panel discussions on SOA, SOAP vs. REST, and other topics.

It should be a good opportunity to get involved with the local architecture community and participate in discussions on what is currently happening.  The registration price is lower than at other conferences because we are non-profit and just trying to cover the costs.

There is an attendance limit and the early bird registration ends this month so we encourage you to sign up as soon as possible.  Register Now!

Wednesday, March 19, 2008 10:10:19 AM (Eastern Standard Time, UTC-05:00)  #    Disclaimer  |  Comments [0]  | 
# Monday, March 17, 2008

For those of you who don't keep an eye on my work blog, my team at Infragistics just published a new Silverlight 2 sample application, faceOut, built using prototypes of Infragistics' Silverlight controls.  If you're interested in Silverlight 2 and/or in what Infragistics is doing with Silverlight, you should check it out.

Monday, March 17, 2008 8:54:27 AM (Eastern Standard Time, UTC-05:00)  #    Disclaimer  |  Comments [0]  | 
# Friday, January 11, 2008

After posting my ramblings about software as a biological ecosystem last night, I kept thinking a bit more about the topic of managing complexity and what seems to be the high-end industry response to it.  Put simply, it seems that we're trying to manage complexity with yet more complexity (the whole adding-gasoline-to-the-fire analogy).  The more I think about it, the more ludicrous this approach seems.

And it suddenly came to me--we are seeking The Ring:

One Ring to rule them all, One Ring to find them,
One Ring to bring them all and in the darkness bind them.

This is the solution the industry seems to propose with things like enterprise rule management software and other centralized IT governance initiatives. 

One Policy to rule them all, one Policy to find them,
One Policy to bring them all and in the darkness bind them. 
[Feel free to substitute System, Architecture, or any other grandiose schemes.]

Do we really want to be Dark Lords?  Is "the architecture group" the land where the shadows lie?  I guess some might indeed aspire to be dark lords ruling from a land of shadow, but it never ends well for dark lords.  As the history of Middle Earth shows (and indeed human history), you can't oppress life, creativity, passion, and freedom, at least not for long.  The yoke of tyranny will always be thrown off.  Life will find a way.  Attach other pithy axiom here.

Create software, systems, and policies that are alive, that encourage life, that can grow, adapt, and evolve.

Friday, January 11, 2008 9:19:33 PM (Eastern Standard Time, UTC-05:00)  #    Disclaimer  |  Comments [0]  | 

This thought occurred to me the other day.  Maybe the right approach to managing complexity in business software is something akin to creating a biological ecosystem.  By this, I mean designing correcting mechanisms to address chaos as it emerges and, ultimately (the dream), designing systems that are biologically aggressive--that is, they look for niches to fill and also take steps to preserve themselves.

I don't know.  I'm sure I'm not the first person to think about this.  It just hit me the other day as I was walking into work.  It seems like the more common approach we take is to try to create a mechanical system as if the complexities of human interactions (i.e., business) can be specified and accounted for in a closed system.

I attended a session on managing complexity at the ITARC in San Diego last October, and the presenter was, if I recall correctly, advocating more precise specification of business rules through the use of Object Role Modeling (in fact, Dr. Terry Halpin was in attendance at that session and was an active participant).  I had attended another session the previous day by a fellow from Fair Isaac on business rule management software.

All of these folks struck me as very intelligent and knowledgeable, and yet it seems to me that they are going in exactly the wrong direction.  In fact, I left that conference feeling very whelmed.  I felt as if I were living in a separate universe; at least I got the sense that there is a software multiverse, parallel software development universes, with me living in one and a lot of those guys in another.  All this talk of "governance" and highfalutin systems (e.g., grid SOA) leaves one feeling so disconnected from the everyday experience of being a software professional.

It seems to me that the solution to complexity in IT is not to create ever more complex mechanical systems, policies, and infrastructure to "govern" the problem.  It seems like that's throwing gasoline on the fire.  Not only that, it seems fundamentally opposed to the reality that is business, which is essentially a human enterprise based on humans interacting with other humans, usually trying to convince other humans to give them money instead of giving it to some other humans that want their money.

Because humans are intelligent and adaptable, particularly humans driven by, dare I say, greed (or at least self-preservation), these humans are constantly tweaking how they convince other humans to give them money.  The point is, business is fundamentally a human, and an aggressively biological, enterprise.  It consists of humans who are constantly on the lookout to fill new niches and aggressively defend their territories.  So it seems to me that business software should be modeled, at a fundamental level, on this paradigm rather than on the mechanical paradigm.

Of course, the problem is that the materials we're working with are not exactly conducive to that, but therein lies the challenge.  I tend to think that the efforts and direction being made by the agile community and approaches like domain-driven design are headed in the right direction.  At least they're focusing on the human aspects of software development and focusing in on the core business domains.  That's the right perspective to take.

Extend that to IT governance, and that means giving various IT departments within an enterprise the freedom to function in the best way that meets the needs of their local business units rather than trying to establish a monolithic, central architecture that attempts to handle all needs (think local government versus federal government).  It means developing with a focus on correction rather than anticipation, building leaner so that when change is needed, it is less costly (in a retrospective sense as well as in total cost of ownership).

I'm not advocating giving ourselves over to the chaos; I'm just thinking that this is a better way to manage the chaos.  And as we learn the best patterns to manage complexity in this way, it seems not too far a stretch to think that we could start automating mechanisms that help software systems be ever more agile and ultimately even anticipate the change that is needed by the business, either automatically making the adjustments needed or at the very least suggesting them.  That would be true business intelligence in software.

Maybe it's a pipe dream, but I think that without such dreams, we don't improve.  At the very least, I think it suggests that the agile approach to software is the right one, and that this approach should be extended and improved, not only in software development but also in architecture and IT in general.

What do you think?

Friday, January 11, 2008 12:01:33 AM (Eastern Standard Time, UTC-05:00)  #    Disclaimer  |  Comments [2]  | 
# Saturday, December 22, 2007

I've been getting friendly with Windows Live lately, and after getting terribly tired of having to switch to HTML view in Windows Live Writer in order to insert a note (could be a footnote or endnote depending on how you look at it), I decided to see if I could write a plug-in to make my life easier.

So was born the Blog Notes plug-in.  Unfortunately, there is no extensibility for just marking up existing text (e.g., adding a superscript button to the markup toolbar), so I had to go with the option to insert some HTML using the  interface.  I really was trying to keep it simple and lightweight (for my own sanity), so it is pretty basic.

The functionality is pretty straightforward.  Thanks to Mark James for the free icons.  Once the plug-in is installed, you should see an "Insert Blog Notes..." option in the Insert pane on the right side as shown below.

Insert Blog Notes in Insert Pane

Clicking on it brings up the Blog Notes dialog:

Blog Notes Dialog

Clicking "New Note" will insert a new superscript number (the next one in the sequence).

Clicking "Reference Note" will insert the selected number as superscript.  You can also just double-click the number to do that.

The "Notes Section" button will insert a notes section.1

Lastly, "Write Note" simply adds the selected note plus a period and couple spaces.

As you can see, it's pretty basic, but it saves a few seconds for each note (assuming you bother to switch to HTML view, find the number, and put <sup></sup> tags around it like I do [did]).  You can also tweak one option/setting.  Go to Tools -> Options, and select the Plug-ins tab:

Live Writer Plug-ins Options

Clicking Options... on the Blog Notes plug-in brings up a très simple dialog:

Blog Notes Options

This one option will toggle whether or not the plug-in uses in-page anchor links for the notes so that the superscript numbers would link down to the corresponding note in the Notes section.  I originally added this feature without realizing the implications.  Because blog posts are often aggregated and otherwise viewed in unexpected places, using in-page anchors is iffy at best.  Community Server seems to strip them out, and dasBlog keeps them, but since it emits a <base /> tag pointing to the site root, all of the anchor links are relative to the site homepage instead of the current post, which effectively renders them useless.  I looked at the dasBlog code where this happens, and it's in the core assembly.  I was concerned about what side effects changing it to use the current URL would have, so I didn't do that.  But if you have blog software that will let you use this feature, by all means, enjoy!

Caveats

  • Because of the way the plug-in framework works, I use a static/shared collection to keep track of the notes.  This means it acts a tad goofy if you close out of Live Writer or write multiple posts while it is open.  If you close and come back to a post, the notes count is reset.  To "fix" this, just re-add however many notes you had (if you want to bother).  If you write multiple posts, you just have to deal with it.  I don't know if there is post-local storage for plug-ins, but I didn't have time to dig into it.
  • Your mileage may vary.  I wrote this mainly to save myself time and get familiar with the Live Writer extensibility model, so it ain't a finished product to be sure.

Get It!
Since there are numerous tutorials on the Web (that I learned from) covering how to write Live Writer plug-ins, I won't go into those details here, but you're welcome to download my code and learn from it directly if you want.  I think I have comments and such in there.

  • Download the Plug-in Only - If you just want to use this plug-in, this is what you want.  Drop the DLL into your plug-ins directory and go (typically C:\Program Files\Windows Live\Writer\Plugins).
  • Download the Source Code - This is a VS 2008 solution for those who want to learn, enhance, extend, whatever.  The license is more or less the MIT license.  You'll need Live Writer installed to reference its API.

Notes
1. This is the "Notes Section."  The button adds the "Notes" header and writes out any existing note numbers.

Saturday, December 22, 2007 2:07:12 PM (Eastern Standard Time, UTC-05:00)  #    Disclaimer  |  Comments [0]  | 
# Tuesday, December 11, 2007

Far be it from me to put words in Phil's mouth, but I hope that folks recognize that his post about favoring composition over inheritance is not specifically about that one best practice (the comments seem to indicate this is being missed).  It's pretty clear to me that the thrust of that post is around a philosophical approach that he thinks the ALT.NET community should make.

Two things stand out from Phil's post in this respect: 1) don't appeal to authority, and 2) don't organize yourself around a set of technical principles (best practices), but rather organize yourself around the non-technical values of independent thinking and desire to improve.  I hope that everyone can agree that these latter two values are good ones that should indeed be encouraged.

That said, should a community like ALT.NET eschew forming a more formal consensus on technical best practices?  I tend to think not.  While independent, critical thinking is valuable, it is not the summit of perfection.  The summit of perfection, in the realm of ideas at least, is conformance with truth (what actually is versus what I think is), and independent thinking at odds with what is true is not only not valuable in itself, it can be downright detrimental. 

For instance, what if you independently and critically think that security and privacy are not important aspects of the online banking application you are tasked with building?  Is that kind of independent, critical thinking valuable in itself?  Or will it potentially lead to great harm?  Independent, critical thinking is valuable only in as much as it deepens one's understanding of and conformance to truth.

So I think that there is value in a community such as ALT.NET expending the effort to define principles through critical thinking and argumentation that it will hold up as ideals, i.e., things that seem most in accord with the truth as we know it.  This is where things like patterns and best practices come into play; they are the shared, accumulated wisdom of the technical community.
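Since the best practice Phil's post started from keeps coming up, here is a minimal sketch of the distinction itself--favoring composition over inheritance.  (Sketched in Java for brevity; all names are invented, purely illustrative.)

```java
// Inheritance: TimestampedLogger is bound to Logger at compile time.
class Logger {
    String log(String msg) { return "[log] " + msg; }
}
class TimestampedLogger extends Logger {
    // Locked to Logger's behavior unless it overrides methods.
}

// Composition: the formatting behavior is a collaborator that can be
// swapped out without touching ComposedLogger itself.
interface Formatter {
    String format(String msg);
}
class PlainFormatter implements Formatter {
    public String format(String msg) { return "[log] " + msg; }
}
class ComposedLogger {
    private final Formatter formatter;
    ComposedLogger(Formatter formatter) { this.formatter = formatter; }
    String log(String msg) { return formatter.format(msg); }
}
```

The inherited variant fixes its behavior when it is written; the composed variant receives its behavior at construction time, which is the flexibility the GoF advice is after.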

Now what about the broader idea of eschewing appeals to authority?  Far be it from me to claim to be an authority in logic, but it seems to me that not all appeals to authority are invalid (the Wikipedia article Phil links to discusses this to some degree but does not go far enough, in my estimation).  The valid reasons for appealing to authority are discussed at the bottom of that article: 1) not enough time and 2) concern about one's ability to make the other understand the reasoning underlying the truth being expressed.

In terms of logic, it is not a fallacy to appeal to an authority on a topic that is accepted by all those involved in an argument.  We're talking about presuppositions here, and without them, we'd never get anywhere in our search for truth.  If you always have to argue from first principles (if you even acknowledge those), you simply get stuck in a quagmire.  In terms of the topic at hand, if folks accept (as they generally do) that the GoF et al are authorities on the subject of OOD, then it is valid, logically speaking, to appeal to their authority to establish the principle that you should favor composition over inheritance.

The things to watch out for in appeals to authority are 1) thinking that the authority is incapable of being wrong and 2) failing to ensure that the parties involved accept the authority.  With the latter, you simply cannot argue (or at least the argument won't carry weight) from an authority that is not accepted.  With the former, unless it is a presupposition shared by those involved that the authority is indeed infallible, you should keep in mind that even if you buy into the authority's credentials, it is still possible for the authority to be wrong.

So I would nuance what Phil says and say that if the ALT.NET community agrees that GoF is an authority, it is valid to appeal to them, while remaining open to criticism of the concepts involved (even those backed by an authority).  The authority adds logical weight; it does not impose absolute authority.

We just don't have time to argue everything from first principles.  Others who are generally acknowledged to be qualified have already taken the time to research, think about, and propose some good patterns and practices, and unless there is good reason to object, there is no need to rehash those.  Instead, I'd suggest that the community focus on spreading knowledge of these patterns and practices all the while refining them, functioning essentially as a group in the way that Phil recommends individuals function--thinking critically and always working to improve.  Doing this will help ensure that the community does not fall into a quagmire of unnecessary argumentation, and it will ensure that the patterns and practices that they agree upon can be continuously refined and enhanced as new technologies emerge and greater wisdom is gained over time. 

Further, it gives the group a purpose that has meaning.  After all, if the group's message is only "think for yourself and be all that you can be," there isn't much of substance to say after that.  On the other hand, because it is a technical community that espouses that philosophy, it should take that philosophy on itself (as a group, not just the individuals in it).  I would suggest this includes establishing greater consensus on best practices and patterns and then spreading the word about them to others.  Be better together. :)

You see, it is not about setting down an infallible manifesto and excluding those who disagree, which, I think, is what Phil is concerned about more than anything.  However, it also isn't about best practices being true for you but not for me (best practices relativism?).  Put another way, I suggest ALT.NET should favor thoughtful adherence to best patterns and practices, not blind adherence.

Tuesday, December 11, 2007 9:56:00 AM (Eastern Standard Time, UTC-05:00)  #    Disclaimer  |  Comments [2]  | 
# Sunday, December 9, 2007

As I read1 the works of Christopher Alexander, I grew increasingly concerned that the software industry may be missing the point of patterns.  Well, maybe that's not the right way to put it.  I think we may be missing the real value that patterns bring to the table.

For whatever reason, it seems we approach them (broadly speaking) almost as loose algorithms to be applied here and there as seems fit.  Or maybe we just see them as convenient ways to talk about things we already know, or maybe we even use them to learn particular solutions to particular problems.  And then maybe we just use them because they are in vogue. 

It seems to me that the real value to derive from patterns (as an idea, not necessarily as they are often proposed in the software world) is in learning to see and think about creating software in the best way.  What Alexander proposes at the end of The Timeless Way is that it isn't using patterns or pattern languages, per se, that gives our creations the quality without a name.  No, he proposes that the value lies in helping us to recognize the quality and teaching us to build in the timeless way.

The timeless way is more than patterns.  The thing is, patterns help us to get there.  I think in some ways, we do get it.  Those who are really into patterns do seem to recognize that patterns are not the solution to everything.  The problem, I think, is that we are not using patterns in the most profitable way. 

I think part of the problem is in not using patterns as a language.  We have numerous catalogues of patterns.  To be sure, we do not lack for patterns, and sure, there is obviously value just in having these catalogues and in using the patterns here and there.  But I think that as long as we see patterns as individual things in a pattern catalogue, we won't use them to their full effectiveness.

Perhaps what we need to do is to figure out how to use them as a language.  Perhaps we need to weave them into our thoughts so that when we approach the problem of building software, patterns are there, guiding our thinking, helping us to best arrange a solution to fit the problem.  When we use our natural language, it does the same thing.  Our thoughts are constrained by our languages, but at the same time, our thoughts are guided by our languages.  The ideas form in our heads and rapidly coalesce into some structure that is based upon our language, and the structure works because of the language--it tells us what works and what doesn't work to articulate our ideas.

I think that a pattern language would have the same power.  If we get the patterns into our heads, then when we're faced with articulating a solution to a problem, we will think in terms of the patterns.  The patterns will give form to our solution, and because they are patterns, the solution will work.  The pattern language will both guide and shape our thinking towards solutions that have the quality without a name.

But then, as Alexander says of "the kernel," once we master the language, we move beyond it, so to speak.  The language is not an end in itself but a means to an end, a means to learn the timeless way.  It shapes our thinking to the extent that we are able to perceive the way even without a pattern.  And this is the superlative value in patterns that I think we're missing. 

Patterns, in themselves, have value, but as many have noted, they can be abused and misapplied.  The reason for this is not that patterns are bad but that we use them as an end in themselves.  If we simply let patterns shape the way we think about designing software, if we let them become a language, then we will learn to use them in ways that make sense and ultimately go beyond them and build great software even where a pattern doesn't exist.

So how do we do this?  Well, I think to some extent, we already do it.  I think there are people who use the language, who know the way, without necessarily being conscious of it.  And I think that there is a lot of great guidance out there that in a roundabout way does lead to building great software, even though it may not consciously be using patterns as a language.  But I do tend to think that there is far more bad or failed software out there that has come about because the language is not known; it is not explicit.

I think that what we need to do is to continue identifying patterns as best we can, but we need to start thinking about how to more firmly incorporate them into how we create software.  In fact, I think doing this, attempting to incorporate patterns more into development, will drive the further identification of patterns, to fill out patterns where we are lacking.  I also think it will help us to realize how patterns relate to each other, which is a big part of using them as a language and not just a bunch of monads floating about in the ether.  As we see them relating, see how they work together to form complete solutions, we'll better understand the language as well as the value of the language, and ultimately, we'll be able to impart that language to enable more of us to speak it.

This calls for those who build great software, who theoretically already know the way, to be introspective and retrospective.  It's not just a matter of looking about in the software world for repeated, similar solutions.  It's about identifying good solutions, solutions that bring software to life, not just addressing functional requirements, and forming from those solutions a language of patterns for building such software.  What do you think?

Notes
1. See Notes on the Notes of the Synthesis of Form and The Timeless Way is Agile.

Sunday, December 9, 2007 2:46:34 PM (Eastern Standard Time, UTC-05:00)  #    Disclaimer  |  Comments [2]  | 
# Wednesday, December 5, 2007

I'm so pumped!  I just got my SQL Toolbelt mug from Red Gate.  I proudly display it on my desk (below) and in meetings. :)

You may look at my Red Gate mug.  You may not borrow it.

The mug reads (this is awesome):  "You may look at my Red Gate mug.  You may not borrow it."  Like the mug, Red Gate's software is awesome, too.  I've used SQL Compare and SQL Data Compare for a long time, and I love the (relatively new) SQL Prompt.  They have a ton of other tools in their toolbelt targeted more at DBA types than devs/architects like me.  I highly recommend them if you do much SQL development or administration!

Wednesday, December 5, 2007 10:49:34 AM (Eastern Standard Time, UTC-05:00)  #    Disclaimer  |  Comments [2]  | 
# Monday, October 1, 2007

Previously, I mentioned I was working on an example of using Visual Studio to create a concrete domain model using object thinking, and here it is.  The domain I ended up modeling was that of a shared event calendar, including event registration and agenda planning.  This is something that's been kind of rolling in and out of my mind for quite a while now because it seems that we need a good system for this for all the code camps and like events that occur.  Of course, lately I've come across a few solutions that are already built1, but it seemed like a domain I knew enough about that I could take a whack at modeling it on my own.  I also figured it was small enough in scope for a sample.

So without further ado, I present you with the domain model:
Click to See Full Event Calendar Domain Model

I put this together in about an hour, maybe an hour and a half, on the train up to SD Best Practices.  When I started out modeling it, I was actually thinking more generally in the context of a calendar (like in Outlook), but I transformed the idea more towards the event planning calendar domain.  So you see some blending of an attendee being invited to a meeting with the event planning objects & behaviors (agenda, speaker, etc.).  Interestingly, they seem to meld okay, though it probably needs a bit of refactoring to, e.g., have an Attendee Register(Person) method on the Event object.

So the interesting thing to see here, contrasting it to the typical model you see in the .NET world (if you're lucky enough to see one at all!), is that there is pretty much no data, no simple properties or attributes, in the model.  The model is entirely objects and their behaviors and relationships to other objects.  You can look at this model and get a pretty darn good feel for the domain and also how the system functions as a whole to serve this domain.  I was able to identify and model the objects without once thinking about (and getting distracted with) particular data attributes.2

In the story of our Tangerine project, I describe in some depth the compromise I had to make with the .NET framework when it comes to data properties.  I think if I were to continue with this event calendar project, after I had nailed down the objects based on their behaviors (as begun in this example) and felt pretty good that it was spot on, at that point, I'd think about the data and do something like I did on Tangerine, having the open-ended property bag but also adding strongly-typed properties as needed to support framework tooling.3 
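To make the approach concrete, here is a small, hypothetical C# sketch of the style described above: behavior-first objects (an Event that knows how to Register a Person) combined with the open-ended property bag plus strongly-typed properties compromise mentioned from Tangerine.  The class and member names echo the diagram, but the code itself is my own invention for illustration, not the actual model.

```csharp
using System;
using System.Collections.Generic;

public class Person
{
    private readonly Dictionary<string, object> properties =
        new Dictionary<string, object>();

    // Open-ended property bag: data attributes can be added later
    // without disturbing the behavioral model.
    public object this[string name]
    {
        get { object v; return properties.TryGetValue(name, out v) ? v : null; }
        set { properties[name] = value; }
    }

    // Strongly-typed property layered over the bag to support
    // framework tooling (data binding, designers, etc.).
    public string Name
    {
        get { return (string)this["Name"]; }
        set { this["Name"] = value; }
    }
}

public class Attendee
{
    private readonly Person person;
    public Attendee(Person person) { this.person = person; }
    public Person Person { get { return person; } }
}

public class Event
{
    private readonly List<Attendee> attendees = new List<Attendee>();

    // Behavior, not data: the Event itself is responsible for
    // registration, as suggested by the Register(Person) refactoring.
    public Attendee Register(Person person)
    {
        Attendee attendee = new Attendee(person);
        attendees.Add(attendee);
        return attendee;
    }

    public int AttendeeCount { get { return attendees.Count; } }
}
```

The point of the sketch is that collaborators talk in terms of behaviors (Register) while the data attributes stay out of the way until they are actually needed.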

I hope you can imagine how you could sit with your clients or whoever your domain experts are and quickly map out a lightweight model of the domain using the VS Class Designer DSL.  I'll wager that if we took this diagram and showed it to a non-technical person, with a little help (maybe adding a key/legend), they'd quickly understand what's going on with the system.  And if you're building it with the domain expert, you'll have that dialog done already so that everyone will be on the same page.

Sure, there will be further refinement of both the domain model and the code; the nice thing about using the class designer DSL is that tweaking the model tweaks the code, so the two stay in sync.  We already mentioned the need to focus on the data at some point, and depending on your situation, you can do this with the domain experts or maybe you'll have an existing data model to work with.  As the developer, you're going to want to get in there and tweak the classes and methods to use best coding and framework practices, things that aren't best expressed in such a model.  You will have other concerns in the system to think about like security, performance, logging, user interface, etc., but that's all stuff you need to do regardless of how you approach analyzing and modeling your domain. 

In the end, you will have a fairly refined model of the domain (or part of the domain) that uses a language that everyone gets and agrees on (Eric Evans's "ubiquitous language"); you'll have identified the objects in the domain accurately based on their behaviors and relationships, and you'll even have a starting point in code for the implementation.  You also have objects that are responsible and that collaborate to get the job done, so in that way you avoid code complexity by reducing imperative control constructs.  All in all, it seems like a great foundation upon which to build the software.

Notes
1. Such as Microsoft Group Events, Community Megaphone, and Eventbrite.
2. Okay, so maybe I was tempted once or twice, but I fought the urge. :)
3. I suppose another option would be to create LINQ-based DTOs; I have to think more about how best to meld this kind of domain modeling with LINQ.

Monday, October 1, 2007 4:59:37 PM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [1]  | 
# Saturday, September 29, 2007

I finally got around to finishing The Timeless Way of Building, by Christopher Alexander (best known in software as the source of the patterns movement).  The last part of the book, called "The Way," is about how to build things.  His focus is physical architecture, but it is interesting how closely it resembles agile software development.

There are a few similarities that I see.  First, he advocates (or at least shows in his example) working directly with the folks who are going to be using the building(s) when designing it with the pattern language.  You design it together with them.  Similarly, agile seems to advocate the same process of working as closely as possible with those who will be using the system.1

But Alexander goes on to say, using this real-world example of a health care complex he helped to build, that it almost failed (in terms of having the quality without a name) because even though it was initially designed using the pattern language, it was in the end passed off to builders who conformed the design to "drawings" (think UML) that ultimately caused it to lose a large amount of the quality.

The point he goes on to make is that you can't just use the language up front and then go translate it into formal design techniques and end up with the quality.  Rather, you have to build using the language, and in particular, build each part of the structure piecemeal, to best fit its particular environment, forces, context, and needs.  This is the only way that you can get the quality.  Here I see another similarity with agile and its focus on iterations and regular feedback.  You build in pieces, adapting each piece to its whole as it is built and ensuring that it best fits the needs, context, forces,  and environment. 

He also says that invariably our initial ideas and designs for a solution don't exactly reflect the ways in which the solution will be used.  And this disparity between our design and reality gets worse as the solution grows in scope.  Again, this is true in software and why the regular feedback is important, but Alexander proposes repair as a creative process in which we better adapt the solution to its environment based on deepening understanding of needs or when the design just isn't working or breaks.  This is akin to what we call refactoring, and like we do in software, Alexander advocates a continual process of repair (refactoring).  And this process doesn't stop when the initial thing is built--we keep tweaking it ad infinitum.

This seems somewhat intuitive, yet in software we're always talking about legacy systems and many have and continue to suggest "rewrites" as the answer to software woes.  While I understand that this is one area where software differs from real-world building (in the relative ease that something can be redone), I do think that we software folks tend to err too much on the side of rewriting, thinking that if only we can start from scratch, our new system will be this glorious, shining zenith of elegance that will last forever. 

It is this thinking, too, that even causes many of these rewrites to fail because so much time is spent trying to design a system that will last forever that the system is never completed (or becomes so complex that no one can maintain it), providing the next impetus for another "rewrite of the legacy system."  On the contrary, some of the best software I've seen is that which has simply been continuously maintained and improved, piece by piece, rather than trying to design (or redesign) an entire system at once. 

What is interesting to me in all this is the similarities between the process of building physical structures and that of building software, the general applicability of Alexander's thought to the creation of software.  I continually see this in Alexander's writing.  In part, it is good to see a confirmation of what we've been realizing in the software industry--that waterfall just doesn't work, that pre-built, reusable modules don't really work well, that we need regular, repeated input from stakeholders and users, that we shouldn't try to design it all up front, that we shouldn't use formal notations and categories that create solutions that fit the notations and categories better than their contexts, environments, and needs, that we should create and use pattern languages that are intelligible by ordinary people, and more.

There is one last observation I'd make about The Timeless Way of Building, regarding the "kernel of the way."  Alexander says that when it comes down to it, the core (the kernel) of the timeless way of building is not in the pattern language itself (the language is there to facilitate learning the timeless way); he says the core is in building in a way that is "egoless." 

In some ways, I think the concern about ego is less pronounced in the software world--rarely is a piece of software admired as a piece of art--but at the same time, the underlying message is that you build something to fit just so--not imposing your own preconceptions on how the thing should be built.  For software developers, I think the challenge is more in learning to see the world for what it is, to really understand the problem domain, to look at it through the eyes of the users and design a solution to fit that rather than trying to foist the software worldview onto the users.  To put it another way, we need to build software from the outside in, not the inside out.  The timeless way is really about truly seeing and then building to fit what you see.

Notes
1. At this point, another interesting thought occurs to me about pattern languages; I see a relation to Eric Evans's "ubiquitous language" in that the language you use needs to be shared between the builders and those using the thing being built.  What stands out to me is the idea of building a pattern language that is intelligible enough to non-software experts to be incorporated into the ubiquitous language shared by both the domain experts and the software experts.  Software patterns vary on this point; some are intelligible, and some are not so intelligible; we need to make them intelligible.

Saturday, September 29, 2007 9:31:12 PM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [0]  | 
# Friday, September 21, 2007

As I sit here on the train home, I've been thinking (and writing) about a lot of stuff.  But I figured I should put this post together for completeness and finality, even though I only made it to one session today before I left early.  Last night I was shocked and somewhat dismayed to find that I had somehow managed to book the train for return on Saturday afternoon rather than today.  I looked at my reservation email, thinking surely the ticket was misprinted, but nope, the reservation said the 22nd clearly in black and white.

Now, those who spend much time with me know that I tend to be sort of the absent-minded professor type.  I often have trouble keeping track of the details of day-to-day things (but I can tie my shoes!).  I like to think there are good reasons for this, but whatever the reasons, that's me.  So I can totally imagine that somehow I tricked my brain into thinking that the 22nd was the day I wanted to return when I booked the train.

That said, I think this is a good opportunity to observe a way in which the UX of the reservations system could be improved.  If it had simply said somewhere that the return trip was on SATURDAY and not just used these obscure things called numeric dates, I'd immediately have seen and avoided my mistake.  But nowhere online nor in the email nor on the ticket does it say Saturday.  In fact, there is SO MUCH GARBAGE on the ticket, that the non-initiate has trouble finding anything of value.  So think about that if you're designing some sort of booking system--show the day of the week, please. :)

Lean Process Improvement
So this morning, on top of being tired because I stayed up late writing, I was late for the class I wanted to attend, one called Agile Architecture.  Unfortunately, it was in the smallest room in the conference (same one as the manager meeting yesterday), and unfortunately, the planners didn't anticipate attendance to that session correctly.  Plus, this room had this odd little old lady who felt it was her duty to prevent anyone from attending who had to stand. 

Yesterday, I watched her try to turn away quite a few folks (a few successfully), even though there was plenty of room on the far side to stand.  She kept saying "there really is no room," but there was.  What made the whole scene kind of comical was that she refused to sit OUTSIDE the door, so rather than simply preventing folks from coming in and causing a distraction, she let them come in and then animatedly tried to convince them to leave, causing even more distraction.

Well, when I peeked in the door this morning, saw the full room and saw her start heading toward me, I knew I was out of luck.  I just didn't have the heart to muscle by her and ignore her pleading to go stand on the other side, and besides, I don't like standing still for 1.5 hours anyway.  So I was off to find an alternative.

I knew there wasn't much else I wanted to see during that hour, but by golly I was there and this was the only slot I could make today, so I was going to make it to a session!  After two more failed entries into full sessions and studiously avoiding some that sounded extremely dull by their titles, I finally found one that sounded nominally interesting and had a lot of open space.  I really had no clue what I was getting into...

It ended up being somewhat interesting.  It was about applying the "lean process" from the manufacturing space to software development.  I'm personally not really into process and methodologies, particularly when they come from disciplines that are only marginally like our own.  But this did sound like it could be useful in some situations, particularly in software product (i.e., commercial) development. 

He talked about value stream mapping, which is basically modeling the process flow of specific activities in product development from beginning to end (so you'd do one for new feature dev, one for enhancements, one for hot fixes, etc.).  It sounds like it does have potential to be useful as long as you don't spend too much time on it.  Particularly if you think you have a problem in your process, this method can help you to both visualize and identify potential problems.  If you do product development, it's worth a look.

Final Thoughts
After that session, I made off to go to the 12:05 mass at the chapel outside the convention center.  My deacon friend had let me know about it, and I was glad of it.  And he was there, so after mass, we went back into the conference to grab lunch together.  Talked more about the usual, and then I had to run off to catch my train.

Looking back, I feel that this is definitely a conference worth attending.  Of course, your mileage will vary.  I wouldn't come here to go to a bunch of sessions on topics you're already an expert on.  But the nice thing about this conference over others I've been to is that it really is focused on best practices.  It's not really focused much on technology-specific stuff (though there was a bit of that), so you can derive value whether you do Java, C/C++, .NET, or whatever. 

Also, it is a good place for a meeting of minds with experts from other technology communities, so you get more exposure than you normally might to how folks are doing software outside of your own.  And one interesting thing I noticed is that there is a tangible presence of software product developers; that's a different and valuable perspective for those who are more used to, say, standard custom/consulting/corporate IT software.

Overall, if you look over the sessions and see topics that you haven't had a chance to explore in depth or maybe you want to just get exposed to other ideas in the software space, this seems like a good conference for that.  I really enjoyed it.

Friday, September 21, 2007 4:27:11 PM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [0]  | 
# Thursday, September 20, 2007

Today I stumbled into Barnes & Noble (because it had the nearest Starbucks), wandered into the notebook section, and was reminded that my current Moleskine notebook was almost full.  Silly me, I still have two back at the office, so I thought it must be fate for me to go ahead and restock while I'm here.  I highly recommend Moleskine; I like the small, book-like ones without lines (small enough to fit in a pocket, and I don't like to conform to lines or have even the suggestion that I should), but they have all kinds.  Good, tough little notebooks, and supposedly they've been used by some famous people.  This has not been a paid advertisement for Moleskine.  Now we return you to your regular program.

Applying Perspectives to Software Views (Cont'd)
Yesterday I talked about Rebecca Wirfs-Brock's session on software views.  There's a lot more to what she said than what I communicated, but I'm just relaying what stuck with me.  Looking at my notes, I forgot to mention another key thing, which is that you should model these views, and model them in a way that effectively communicates to the stakeholders that their needs are being addressed.  She threw up some UML diagrams, commenting that they're probably not good for most business folks.  (I think UML is not good for most technical folks either, but I'm a rebel like that.)  The point she made, though, was that regardless of what notation you use, you should provide a key that lets people know how to unlock the meaning of the model.  Good point for sure.

Actually, this reminds me of Beautiful Evidence, by Edward Tufte.  I recommend Tufte for his area of expertise, though I'd suggest skipping the chapter on PowerPoint (which sadly was released as a separate booklet) because it's not his area of expertise, and it shows.  Anyways, when he is sticking to the realm of visual communication, he is excellent, and Beautiful Evidence is a pretty easy read that helps you start thinking about how to communicate "outside the box," as it were.  I bring it up here because applying his ideas to the area of modeling software, particularly for non-technical audiences, is something we should explore.

Now, back to Day III.

Software Managers
The first session, which I made it to kind of late (and it was absolutely packed--standing room only), was one on tips for being a good technical/software manager.  Having become one of these this year, it is definitely a subject of interest, and I'm always on the lookout for more tips, though I must say that I think management books (as a rule) are really bad about regurgitating each other.  You get to where it becomes increasingly hard to find new, good insights the more you read them.

But I thought this session would be good since it is specifically focused on managing technical teams.  Some of her points were standard managerial stuff, but it was nice to have it focused in on the IT industry.  I always end up feeling a bit guilty, though, because I know I've already made numerous faux pas (not sure how to pluralize that).  I hope my guys know I love them even though I screw up being a good manager at times. :) 

One recurring theme I keep coming across is having regular 1-1s with your peeps.  I've heard weekly and bi-weekly, but it seems like both of those would be overkill for my group since we have daily meetings, often go out to lunch, etc., so I'm going to try monthly first.  It'll be better than nothing! 

I have to say that managing well is a lot harder than I expected it to be.  For those of us who aren't natural people persons, it is definitely an effort.  I'm sure it is tough regardless, but I gotta think it'd be easier if I were naturally more of a people person.  Anyways, I keep tryin' for now at least.

Designing for User Success
Went to another Larry Constantine session around UX.  This one was really good.  He, like Patton, affirmed that "user experience is about everything."  Again, it's nice to know I'm not crazy, and it takes a burden off me knowing that I won't be a lone voice crying out about that.  It seems that maybe just those who don't know anything about UX think it is "just another term for UI."  Of course, these "UX professionals" are naturally focused in on their areas of expertise (usability, information architecture, human factors, human-computer interaction, visual design, interaction design, etc.), so maybe I'm still a bit odd in my contention that architects must be the chief experience officers on their projects.

Anyhoo, this session focused in on "user performance" as a distinct focus, meaning that you are providing the tools to get the best performance out of people.  Though none of the session was spent explicitly justifying the importance of a focus on UX, implicitly the whole session was an illustration of why it is important.  I have a ton of good notes from this session, but I won't bore you with them (you can probably get most of it from his slides or other presentations he's done).  If you get nothing else, though, it's to change the way you think about designing software--design from the outside in.  If you're a smart person, you'll realize this has huge implications.  And also, recognize that you won't make all parts of your system perfectly usable, so prioritize your usability efforts based first on frequency of use and second on severity of impact (i.e., those things that will have serious ramifications if not done correctly).

Human Factors in API Design
The next session I hit was one related to UX for developers.  Here are some salient one-liners:

  • Consistency is next to godliness.
  • API = Application Programmer Interface
  • When in doubt, leave it out. <-- More specifically, unless you have at least two, real use cases, don't stick it in your API.
  • Use the Iceberg Principle. <-- This means what people see of your code should only be the tip of the iceberg--keep it small, simple, and focused.

This session actually seemed to be a blend of general UX guidelines (yes, they apply here, too, not just on end-user interfaces) and more general framework design principles that only had varying degrees of pertinence to ease of use.  Some highlights:

  • Default all members to private; only raise visibility with justification.
  • Prefer constructors to factory/builder patterns, and set up the object fully with the constructor where possible.
  • Use domain-specific vocabulary.
  • Prefer classes to interfaces.  Amen!
  • Prefer finality (sealing) to inheritance--minimize potential for overriding.
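As a quick, hedged illustration (the Invoice domain and every name below are invented for this sketch, not from the session), several of these bullets can show up together in one small C# class:

```csharp
using System;

// Illustrative only: a sealed class with private members, fully
// initialized by its constructor, using domain-specific vocabulary.
public sealed class Invoice          // prefer finality (sealing) to inheritance
{
    private readonly string customer; // default members to private
    private readonly decimal total;

    // Prefer a constructor to a factory/builder; the object is
    // completely set up here and immutable afterwards.
    public Invoice(string customer, decimal total)
    {
        if (customer == null) throw new ArgumentNullException("customer");
        this.customer = customer;
        this.total = total;
    }

    // Visibility raised only with justification: callers need to read these.
    public string Customer { get { return customer; } }
    public decimal Total { get { return total; } }

    // "When in doubt, leave it out": no setters, no speculative members.
}
```

Keeping the surface this small is also the Iceberg Principle at work: the public tip is tiny relative to whatever the class does internally.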

There's a good deal more, and I'm not offering the justification he proposed (for brevity's sake).  I agree, with varying levels of vehemence, with most of what he said, but one area where I have to disagree is his advice to only refactor to patterns.  I can imagine where this comes from--because patterns can be abused ("patternitis," as he said).  But I think saying "refactor to patterns" shows a big misunderstanding of the point and value of patterns.  This is why it's important to pay attention to the context and rationale in a pattern--so you know when to apply it.  But patterns should be used where they apply--they're known, established, tried-and-true ways of solving particular problems in particular contexts!  If consistency is akin to godliness, using patterns is ambrosia.

One last interesting note from this session was the admonition to consider using or creating a domain-specific language where it helps with the usability of the API.  His example contrasted JMidi and JFugue: where JMidi is a terribly verbose API, requiring the construction and coordination of a host of objects to do something simple like play a note, JFugue offers a simple string-based DSL, based on musical notation, that lets you place a whole series of notes very compactly.  Good/interesting advice.
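To give a feel for the idea, here's a sketch of a string-based mini-DSL.  To be clear, this is not JFugue's actual API--it's a hypothetical toy of my own, just to show how a compact, notation-like string can replace a pile of hand-wired objects:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical mini-DSL sketch (NOT the real JFugue API): a compact,
// musical-notation-like string stands in for the verbose object wiring
// a low-level API would require just to play a few notes.
final class NotePlayer {

    private static final String SCALE = "CDEFGAB";

    // Parses a string like "C E G" into MIDI-style note numbers
    // (middle C = 60).  A real DSL would also handle octaves,
    // durations, chords, and so on.
    static List<Integer> parse(String tune) {
        // Semitone offsets of C, D, E, F, G, A, B from C.
        int[] offsets = {0, 2, 4, 5, 7, 9, 11};
        List<Integer> notes = new ArrayList<>();
        for (String token : tune.trim().split("\\s+")) {
            int idx = SCALE.indexOf(token.charAt(0));
            if (idx < 0) {
                throw new IllegalArgumentException("unknown note: " + token);
            }
            notes.add(60 + offsets[idx]);
        }
        return notes;
    }
}
```

Compare `NotePlayer.parse("C E G")` with instantiating and coordinating sequencer, track, and event objects by hand--that compactness is the whole point of the advice.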

Pair Programming
The last session I went to today was one on practical pair programming.  I was actually on my way to a class on Business Process Modeling Notation, which would have been potentially more intellectually stimulating, but I walked by the room hosting the pair programming session and had a sudden feeling I should attend it.  When I thought about it, I figured that I'd put off giving the idea fair play long enough and that I should take the time to hear it in more depth.  I figured it'd have more immediate relevance to my current work situation in any event.

I won't belabor all the points because I suspect, with good reason, that they're the standard arguments for pair programming, along with a good bit of the "how" of doing it in real situations.  He actually has a number of patterns and anti-patterns to further illustrate good/bad practices in pair programming.  It was an interesting extension of the pattern-based approach (to people).  Suffice it to say, I think if you can get buy-in in your organization, it is definitely worth a try.  There are numerous difficulties with it, the chief one being that it is hard to do effectively in a non-co-located environment, but I'd try it given the opportunity. 

Random Thoughts
One conclusion I've come to while being here is that TDD seems to be unanimously accepted as a best practice by those who have actually tried it.  The API guy went so far as to say that he won't hire devs who don't have TDD experience.  (I think that's a bit short-sighted, but I take his point.)  It's something to think about for those still hesitating to adopt TDD.

I met up again with the same fella I met last night.  We were both in the pair programming class at the end of the day; he's been doing pair programming on a few teams at his company for years and is a fan, though he definitely attests to the difficulty of dealing with prima donnas, which apparently are more tolerated in his LOB (because they have very specialized knowledge that requires PhD level education).  So he wasn't able to carry XP to his entire company.  He also said that pairing (which was echoed by the presenter) is a taxing process; 4-5 hours max is good.

We also had a good long chat about things Catholic.  It's good to know that we Catholics will be getting another good, solid deacon in him.  I imagine tonight won't be the last time we talk.

All in all, another great day.  Learned a bunch.  No sessions I regret going to thus far, which is I think a big compliment for a conference. :)

Thursday, September 20, 2007 10:48:34 PM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [1]  | 
# Wednesday, September 19, 2007

Hi again.  Today was another good day at the conference. 

User Experience Distilled

The first class I attended was a whirlwind tour of user experience.  I was heartened to learn that I am not alone or crazy in recognizing that a number of disciplines go into this thing we call UX.  The presenter, Jeff Patton, also recognizes that virtually every role in developing software has an effect on UX, a conclusion I have come to as well (as I hint at on IG's UX area).  I develop the idea more explicitly in an unpublished paper I'm working on.  (I'm hoping the inputs I get from this conference will help me to finish that out.)  

I actually think that all of this UX stuff falls under the architect's purview because (in my mind at least) he or she is primarily responsible for designing the software as a whole.  This means that architects need to have a conversational familiarity (at least) with the different disciplines that people traditionally think of as user-oriented disciplines, but I'd take it a step further and say that the architect needs to be the chief experience officer, as it were, on a software project.  The architect needs to ensure that the appropriate expertise in user-oriented disciplines is brought to bear on his or her project and also needs to understand how the other aspects of software design and development impact UX and optimize them for good UX. 

That discussion aside, Jeff had a pretty clever graph that showed how the kind of software being developed affects the perceived ROI of expenditure on UX.  His talk was also about as effective an introduction to UX as I can imagine.  He dealt with what it is and why it's important, and then offered a high-level overview of key bits of knowledge for people to make use of.  I want to steal his slides! :)

Global Teams & Outsourcing Agilely

The keynote during lunch today was done by Scott Ambler.  It was nice to finally see/hear him in person since I've heard so much about him.  I got the feeling (which he even admitted) that he was presenting stuff that wasn't just his--he was, from what I could tell, presenting an overview of a book that IBM publishes (related) on the subject.  But that didn't take away from the value of the knowledge by any means.  I'd definitely check it out if you're going to be dealing with geographically distributed teams.

Usability Peer Reviews

In my continuing quest to learn more about UX (part of which is usability), I attended a class by Larry Constantine about lightweight usability practice through peer review/inspection (related paper).  I was actually surprised because he has a very formal methodology for this, which means he's put a lot of thought into it but, more importantly, he's used it a lot in consulting, so it is tested.  I personally am not a big fan of being too formal with these things.  I understand the value in formalizing guidance into a repeatable methodology, but I've always felt that these things should be learned for their principles and less for their strictures.  Of course, that runs the risk of missing something important, but I guess that's a trade-off.  Regardless of whether you follow it to a T, there's a ton of good stuff to be learned from this technique on how to plug usability QA into the software process.

Applying Perspectives to Software Views

After that, I slipped over to another Rebecca Wirfs-Brock presentation on applying perspectives to software views in architecture.  (She was presenting the subject of this book.)  To me, the key takeaway was that we should figure out the most important aspects of our system and focus on those.  It echoed (in my mind) core sentiments of domain-driven design, though it used different terminology and approach.  I think the two are complementary--using the view approach helps you to think about the different non-functional aspects.  Using strategic DDD (in particular, distilling the domain) helps you and stakeholders to focus in on the most important aspects of the system from a domain strategy perspective, and that will inform which views and perspectives are the ones that need the focus. 

This approach also echoes the sentiment expressed by Evans yesterday that says you can't make every part of the system well-designed (elegant/close to perfection).  Once you accept that, you can then use these approaches to find the parts of the systems where you need to focus most of your energies.  I really like that this practical truth is being made explicit because I think it can help to overcome a lot of the problems that crop up in software development that have to do with the general idealistic nature that we geeks have.

Expo

After the classes today, they had the expo open.  In terms of professional presentation, it was on par with TechEd's expo, but certainly the scope (number of sponsors) was far smaller.  That said, I popped into the embedded systems expo.  That was a new experience for me.  It was interesting to see almost every booth with some kind of exposed hardware on display.  As a software guy, I tend to take all that stuff for granted.  They even had a booth with specialized networked sensors for tanks of liquid.  This stuff stirred recollections of Weird Science and all the other fun fantasies that geeky kids have about building computerized machines.  The coolest thing there was the Intel chopper, which apparently was built by the Orange County Choppers guys, but it had a lot of fancy embedded system stuff on it.  I didn't stick around to hear the spiel, but it was pretty cool.

After the expo, I bumped into a guy at Cheesecake factory.  We started chatting, and it turns out that he's in the process of becoming a Roman Catholic deacon.  Pretty cool coincidence for me!  We talked about two of my top passions--my faith and software development (as exemplified here on dotNetTemplar!).  It was a good dinner.  He works at a company that does computer aided engineering; sounds like neat stuff with all that 3D modeling and virtual physics.  Way out of my league!

As I said, another good day here at SD Best Practices.

Wednesday, September 19, 2007 9:54:31 PM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [0]  | 

I meant to write this last night, but I didn't get back to my room till late and just felt like crashing.  I'm at the SD Best Practices conference in Boston this week, which is a new experience for me.  It's one of very few non-MS-oriented conferences I've attended, and I really wanted to come because best practices are a passion for me (and part of my job).  Infragistics was kind enough to send me.  I thought I'd share my experiences for anyone else considering going (and just for my own reference... hehe).  Anyways, enough of the intro...

Day 1 - Tuesday, 18 September 2007

First off, let me say I like the idea of starting on a Tuesday.  It let me work for a good part of the day on Monday and still make it out here by train on Monday night.  I've found in the past that attending sessions non-stop for a few days can really wear you out, so four days seems about right.

The conference is in the Hynes convention center, and I'm at the Westin, a stone's throw away.  Also, it's right next to the Back Bay Station, so thus far the logistics aspect has worked out quite well for me.  I'd personally much rather take a train over a plane anytime. 

Responsibility-Driven Design

Tuesday was a day of "tutorials," which are half-day sessions.  So in the morning, I attended Rebecca Wirfs-Brock's tour of responsibility-driven design (RDD?).  I actually had her book at one point because it was mentioned in a good light by Dr. West in his Object Thinking, but somewhere along the line I seem to have lost it.  Anyways, I was glad to get a chance to learn from the author directly and to interact. 

From what I can ascertain, RDD has some good insight into how to do good object design.  It seems to me that thinking in terms of responsibilities can help you properly break apart the domain into objects if you struggle with just thinking in terms of behavior.  It's potentially easier than just thinking in terms of behaviors because while behaviors will certainly be responsibilities, objects can also have the responsibility to "know" certain things, so it is a broader way of thinking about objects that includes their data.

That said, it doesn't really negate the point of focusing on behaviors, particularly for folks with a data-oriented background because I do think that focusing on the behaviors is the right way to discover objects and assign them the appropriate responsibilities.  I think the key difference is that with the object-thinking approach, you know that there will be data and that it is important to deal with, but you keep it in the right perspective--you don't let it become the focus of your object discovery.

Another beneficial idea I think Ms. Wirfs-Brock has is using stereotypes as a way to discover objects in the domain.  This is more helpful, I think, when dealing with objects that are more part of the software domain than those in the business domain, because the stereotypes are very software-oriented (interfacers, information holders, etc.). 

In terms of process, she advocates the idea of having everyone on a team write down their thoughts about the problem being faced in a few sentences, focusing on what seems like it'll be a challenge, what will be easy, what you've run into before, etc.  Then everyone brings those to the initial design meetings.  I like the idea because it bypasses the introvert-extrovert problem you sometimes get in meetings, and you can start out with a lot of ideas to really jump start the design.  It's a good way to ensure you don't miss out on ideas due to personality issues.

The other thing I like in her process is writing down a purpose statement for objects as you discover them and thinking of them as candidates.  This is part of the CRC card process (the first C is now "candidates").  The reason I like it is that it helps you to focus on the point of the object and sort of justify its existence, which can help weed out some bad ideas. 

What I don't like about the process is the overall CRC card idea.  While it surely is more lightweight than many ways to approach object design, you still end up with a bunch of paper that you then have to translate into code at some point.  I much prefer to use a tool that will literally be creating the code as I design.  I've found the VS class designer serves this purpose quite well.  In fact, on the way up here, I spent some time doing up a sample class diagram using the object thinking approach to share as an example of domain modeling.  I'll be sharing it soon, but I just mention it to say this is not just speculation.  It was actually very lightweight and easy to discover objects and model the domain that way, and at the end I had literal code that I can then either fill out or hand off to other devs to work on who can then further refine it.
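To make the purpose-statement idea concrete, here's what a candidate might look like once it lands in code.  This is Java for illustration, and the candidate, its purpose, and its behaviors are all my own hypothetical invention, not an example from the session:

```java
/**
 * Candidate: ShoppingCart.
 * Purpose: responsible for knowing which items a shopper has chosen
 * and for answering what the shopper owes.
 *
 * Writing the purpose down first forces the candidate to justify its
 * existence before any code gets written, which helps weed out bad ideas.
 */
final class ShoppingCart {

    private final java.util.List<Long> itemPricesCents = new java.util.ArrayList<>();

    // Behaviors, not raw data access: the cart *does* things.
    void add(long priceCents) {
        itemPricesCents.add(priceCents);
    }

    long totalCents() {
        return itemPricesCents.stream().mapToLong(Long::longValue).sum();
    }
}
```

The nice thing about doing this directly in code (or in the VS class designer) rather than on cards is that the candidate, its purpose, and its responsibilities all survive straight into the implementation.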

Domain-Driven Design

The second session I attended was one by Eric Evans on strategic domain-driven design.  Eric wrote a book on the subject that's been well received by everyone I've encountered who has spent time with it.  I've seen a presentation on it, and I've read parts of Jimmy Nilsson's Applying Domain-Driven Design and Patterns book.  So I thought I was acquainted well enough with the ideas, but as I often find to be the case, if you rely on second-hand info, you'll inevitably get a version of it that has been interpreted and is biased towards that person's point of view.

For instance, most of what I've seen on DDD is focused on what Eric calls "tactical" DDD, i.e., figuring out the objects in the domain and ensuring you stay on track with the domain using what he calls the "ubiquitous language."  Eric presented parts of his ideas yesterday that he calls "strategic" because they are more geared towards strategic level thinking in how you approach building your software.  Two key takeaways I saw were what he calls context mapping, which seems to be a really effective way to analyze existing software to find where the real problems lie, and distilling the domain, which is a way to really focus in on the core part of a system that you need to design.

In short (very abbreviated), he claims (and I agree) that no large system will be completely well designed, nor does it need to be.  This isn't to say you should be sloppy; rather, it helps you focus your energies where they need to be focused--on the core domain.  Doing this can actually help a business figure out where to consider buying off-the-shelf solutions and/or outsourcing, as well as where to focus its best folks.  It's a pretty concrete way to answer the buy vs. build question.

Anyways, I'm definitely going to get his book to dig in deeper (it's already on the way).  Please don't take my Cliffs Notes here as the end of your exploration of DDD.  It definitely warrants further digging, and it is very complementary to a good OOD approach.

After all this, I was privileged enough to bump into Eric and have dinner, getting to pick his brain a bit about how all his thinking on DDD came together, his perspectives on software development, and how to encourage adoption of better design practices (among other things).  Very interesting conversation, one that would have been good for a podcast.  I won't share the details, but I'm sure folks will eventually see some influence this conversation had on me.  Good stuff.

Software for Your Head

I almost forgot about Jim McCarthy's keynote.  I've only seen Jim twice (once in person and once recorded).  He's a very interesting and dynamic speaker, which makes up for some of the lack of coherence.  I find the best speakers tend to come across a bit less coherent because they let speaking become an adventure that takes them where it will.  But I do think there was definitely value in his message.  I tend to agree that he's right in asserting that what we all do on a daily basis has a larger impact on humanity than we realize, and I can't argue with his experience in building teams that work.  http://www.mccarthyshow.com/ is definitely worth a look.

Overall, Tuesday was a big success from an attendee perspective.  So far so good!

Wednesday, September 19, 2007 11:17:20 AM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [0]  | 
# Monday, September 17, 2007

While searching recently to provide further reading on "domain model" for a recent post, I was quite surprised to find that there seemed to be no good definition readily available (at least not by Googling "domain model").  Since I tend to use this term a lot, I figured I'd try to fill this gap and, at the very least, provide a reference for me to use when I talk about it.

So What is a Domain Model?
Put simply, a domain model is the software model of a particular domain of knowledge (is that a tautology?).  Usually, this means a business domain, but it could also mean a software domain (such as the UI domain, the data access and persistence domain, the logging domain, etc.).  More specifically, this means an executable representation of the objects in a domain with a particular focus on their behaviors and relationships1.

The point of the domain model is to accurately represent these objects and their behaviors such that there is a one-to-one mapping from the model to the domain (or at least as close as you can get to this).  The reason this is important is that it is the heart of software solutions.  If you accurately model the domain, your solution will actually solve the problems by automating the domain itself, which is the point of pretty much all business software.  It will do this with much less effort on your part than other approaches to software solutions because the objects are doing the work that they should be doing--the same that they do in the physical world.  This is part and parcel of object-oriented design2.

Nothing New
By the way, this is not a new concept--OO theory and practice have been around for decades.  It's just that somewhere along the line, the essence of objects (and object-oriented design) seems to have been lost or at least distorted, and many, if not most, Microsoft developers have probably not been exposed to it, have forgotten it, or have been confused into designing software in terms of data.  I limit myself to "Microsoft developers" here because they are the ones I have the most experience with, but I'd wager, from what I've read, the same is true of Java and other business developers. 

I make this claim because everyone seems to think they're doing OO, but concrete examples of OOD using Microsoft technologies are few and far between.  Those who try seem to be more concerned with building in framework services (e.g., change tracking, data binding, serialization, localization, and data access & persistence) than with actually modeling a domain.  Not that these framework services are unimportant, but it seems to me that this approach is fundamentally flawed because the focus is on software framework services and details instead of on the problem domain--the business domain that the solutions are being built for. 

The Data Divide
I seem to write about this a lot; it's on my mind a lot3.  Those who try to do OOD with these technologies usually end up being forced into doing it in a way that misses the point of OOD.  There is an unnatural focus on data and data access & persistence.  Okay, maybe it is natural or it seems natural because it is ingrained, and truly a large part of business software deals with accessing and storing data, but even so, as I said in Purporting the Potence of Process4, "data is only important in as much as it supports the process that we’re trying to automate." 

In other words, it is indeed indispensable but, all the same, it should not be the end or focus of software development (unless you're writing, say, a database or ORM).  It may sound like I am anti-data or being unrealistic, but I'm not--I just feel the need to correct for what seems to be an improper focus on data.  When designing an application, think and speak in terms of the domain (and continue to think in terms of the domain throughout the software creation process), and when designing objects, think and speak in terms of behaviors, not data. 

The data is there; the data will come, but your initial object models should not involve data as a first class citizen.  You'll have to think about the data at some point, which will inevitably lead to specifying properties on your objects so you can take advantage of the many framework services that depend on strongly-typed properties, but resist the temptation to focus on properties.  Force yourself to not add any properties except for those that create a relationship between objects; use the VS class designer and choose to show those properties as relationships (right-click on the properties and choose the right relationship type).  Create inheritance not based on shared properties but on shared behaviors (this in itself is huge).  If you do this, you're taking one step in the right direction, and I think in time you will find this a better way to design software solutions.
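To contrast that last bit of advice in code--basing inheritance on shared behaviors rather than shared properties--here is a hypothetical sketch (Java for illustration; the types are my own invention):

```java
// Hypothetical sketch: inheritance driven by a shared *behavior*
// ("things that can be approved"), not by shared properties.
abstract class Approvable {

    private boolean approved;

    // The shared behavior every subclass inherits.
    final void approve() {
        approved = true;
    }

    final boolean isApproved() {
        return approved;
    }
}

// A purchase order and a timesheet have almost no data in common,
// but they share the approve behavior, so they share a base class.
// Their relationships (supplier, employee, etc.) would be the only
// properties shown on the initial model.
final class PurchaseOrder extends Approvable {
    // relationship to a Supplier and LineItems would go here
}

final class TimeSheet extends Approvable {
    // relationship to an Employee and a PayPeriod would go here
}
```

Had the hierarchy been driven by shared data (both might have an ID, a date, a description), the base class would say nothing about what these objects *do*--which is exactly the trap.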

My intent here is certainly not to make anyone feel dumb, stupid, or like they've wasted their lives in building software using other approaches.  My intent is to push us towards what seems to be a better way of designing software.  Having been there myself, I know how easy it is to fall into that way of thinking and to imagine that simply by using these things called classes, inheritance, and properties that we're doing OOD the right way when we're really not.  It's a tough habit to break, but the first step is acknowledging that there is (or at least might be) a problem; the second step is to give object thinking a chance.  It seems to me that it is (still) the best way to do software and will continue to be in perpetuity (because the philosophical underpinnings are solid and not subject to change).

Notes
1. An object relationship, as I see it, is a special kind of behavior--that of using or being used.  This is also sometimes represented as a having, e.g., this object has one or more of these objects.  It is different from data because a datum is just a simple attribute (property) of an object; the attribute is not an object per se, at least not in the domain model because it has no behaviors of its own apart from the object it is attached to.  It is just information about a domain object.

2. I go into this in some depth in the Story paper in the Infragistics Tangerine exemplar (see the "To OOD or Not to OOD" section).  I use the exemplar itself to show one way of approaching domain modeling, and the Story paper describes the approach.

3. Most recently, I wrote about this in the Tangerine Story (see Note 2 above).  I also wrote publicly about it back in late 2005, early 2006 in "I Object," published by CoDe Magazine.  My thought has developed since writing that.  Interestingly, in almost two years, we seem to have only gotten marginally better ways to deal with OOD in .NET. 

4. In that article, I put a lot of focus on "process."  I still think the emphasis is valid, but I'd temper it with the caveat that however business rules are implemented (such as in the proposed workflow-driven validation service), you still think of that as part of your domain model.  The reason for separating them into a separate workflowed service is a compromise between pragmatism and idealism given the .NET platform as the implementation platform.  I've also since learned that the WF rules engine can be used apart from an actual .NET workflow, so depending on your application needs, just embedding the rules engine into your domain model may be a better way to go than using the full WF engine.  If your workflow is simple, this may be a better way to approach doing validation.

Monday, September 17, 2007 11:41:54 AM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [0]  | 
# Saturday, September 15, 2007

As I sit here on my deck, enjoying the cool autumn breeze1, I thought, what better thing to write about than Web services!  Well, no, actually I am just recalling some stuff that's happened lately--on the MSDN Architecture forums and in some coding and design discussions we had this week--both of which involved the question of best practices for Web services.

Before we talk about Web services best practices, it seems to me that we need to distinguish between two kinds of application services.  First, there are the services that everyone has been talking about for the last several years--those that pertain to service-oriented architecture (SOA).  These are the services that fall into the application integration camp, so I like to call them inter-application services. 

Second, there are services that are in place to make a complete application, such as logging, exception handling, data access and persistence, etc.--pretty much anything that makes an application go and is not a behavior of a particular domain object.  Maybe thinking of them as domain object services would work, but I fear I may already be losing some of you, so let's get back to it.  The main concern of this post is those services used within an application, so I call them intra-application services.

It seems like these latter services, the intra-application ones, are often being confused with the former--the inter-application services.  It's certainly understandable because there has been so much hype around SOA in recent years that the term "service" has been taken over and has lost its more generic meaning.  What's worse, there has been a lot of confusion around the interaction of the terms Web service and plain service (in the context of SOA).  The result is that you have folks thinking that all Web services are SO services and, sometimes, that SO services are always Web services.

My hope here is to make some clarification as to the way I think we should be thinking about all this.  First off, Web services are, in my book at least, simply a way of saying HTTP-protocol-based services, usually involving XML as the message format.  There is no, nor should there be, any implicit connection between the term Web service and service-oriented service.  So when you think Web service, don't assume anything more than that you're dealing with a software service that uses HTTP and XML. 

The more important distinction comes in the intent of the service--the purpose the service is designed for.  Before you even start worrying about whether a service is a Web service or not, you need to figure out what the purpose of the service is.  This is where I get pragmatic (and those who know me know that I tend to be an idealist at heart).  You simply need to determine if the service in question will be consumed by a client that you do not control. 

The reason this question is important is that it dramatically affects how you design the service.  If the answer is yes, you automatically take on the burden of treating the service as an integration (inter-application) service, and you must concern yourself with following best practices for those kinds of services.  The core guideline is that you cannot assume anything about the way your service will be used.  These services are the SO-type services that are much harder to design correctly, and there is tons of guidance available on how to do them2.  I won't go in further depth on those here.

I do think, though, that the other kind of services--intra-application services--have been broadly overlooked or just lost amidst all the discussion of the other kind.  Intra-application services do not have the external burdens that inter-application services have.  They can and should be designed to serve the needs of your application or, in the case of cross-cutting services (concerns) to serve the needs of the applications within your enterprise.  The wonderful thing about this is that you do have influence over your consumers, so you can safely make assumptions about them to enable you to make compromises in favor of other architectural concerns like performance, ease of use, maintainability, etc.

Now let's bring this back to the concrete question of best practices for intra-application Web services.  For those who are using object-oriented design, designing a strong domain model, you may run into quite a bit of trouble when you need to distribute your application across physical (or at least process) tiers.  Often this is the case for smart client applications--you have a rich front end client that uses Web services to communicate (usually for data access and persistence).  The problem is that when you cross process boundaries, you end up needing to serialize, and with Web services, you usually serialize to XML.  That in itself can pose some challenges, mainly around identity of objects, but with .NET, you also have to deal with the quirks of the serialization mechanisms.

For example, the default XML serialization requires properties to be public and read-write, and you must have a default constructor.  These requirements can break encapsulation and make it harder to design an object model that you can count on to act the way you expect.  WCF improves on this by letting you use attributes to take better control over serialization.  The other commonly faced challenge is on the client.  By default, if you use the VS Add Web Reference feature, it takes care of generating your service proxies, but it introduces a separate set of proxy objects that are of different types than your domain objects.

So you're left with the option of either using the proxy as-is and doing a conversion routine to convert the proxy objects to your domain objects, or you can modify the proxy to use your actual domain objects.  The first solution introduces both a performance (creating more objects and transferring more data) and a complexity (having conversion routines to maintain) hit; the second solution introduces just a complexity hit (you have to modify the generated proxy a bit).  Neither solution is perfectly elegant--we'd need the framework to change to support this scenario elegantly; as it is now, the Web services stuff is designed more with inter-application services in mind (hence the dumb proxies that encourage an anemic domain model) than the intra-application scenario we have where we intend to use the domain model itself on the client side.
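The first option--keeping the generated proxies and writing a conversion routine--might look something like this sketch (Java for illustration; the types are hypothetical stand-ins for a generated proxy and a real domain object):

```java
// Hypothetical sketch of the conversion-routine option: the generated
// proxy is a dumb data holder, and a converter maps it onto the real
// domain object.  The cost is extra objects created per call plus a
// mapping routine you now have to maintain by hand.
final class CustomerProxy {          // stands in for a generated proxy type
    public String id;
    public String name;
}

final class Customer {               // the real domain object
    private final String id;
    private String name;

    Customer(String id, String name) {
        this.id = id;
        this.name = name;
    }

    String id() { return id; }

    String name() { return name; }

    // Domain behavior lives here, not on the proxy.
    void rename(String newName) { this.name = newName; }
}

final class CustomerConverter {
    // The conversion routine you end up writing and maintaining.
    static Customer toDomain(CustomerProxy proxy) {
        return new Customer(proxy.id, proxy.name);
    }
}
```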

If you take nothing else away from this discussion, I'd suggest the key take away is that when designing Web services, it is perfectly valid to do so within the scope of your application (or enterprise framework).  There is a class of services for which it is safe to make assumptions about the clients, and you shouldn't let all of the high-falutin talk about SOA, WS-*, interoperability, etc. concern you if your scenario does not involve integration with other systems that are out of your control.  If you find the need for such integration at a later point, you can design services (in a service layer) then to meet those needs, and you won't be shooting yourself in the foot trying to design one-size-fits-all services now that make so many compromises so as to make the app either impossible to use or very poorly performing.

My own recommendation is to use the command-line tools to generate the proxies for you (you can even include a batch file in your project to do this) but then modify them to work with your domain model--your clients don't even need to use the service proxies directly.  If you use a provider model (the plugin pattern) for these services, you can design one set of providers that uses the Web services and another that talks directly to your database.  This lets you use your domain model easily in both scenarios (a Web application that talks directly to the database as well as a smart client that uses Web services).
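To make that arrangement concrete, here is a minimal sketch of the provider shape (all names hypothetical); configuration would decide which concrete provider gets loaded:

```csharp
// Hypothetical sketch of the provider model (plugin pattern) described above.
// The rest of the app codes against CustomerProvider; configuration decides
// whether the database-backed or service-backed implementation is used.
public abstract class CustomerProvider
{
    public abstract Customer GetCustomer(int id);
}

// Used when the app can talk directly to the database (e.g., a Web app).
public class SqlCustomerProvider : CustomerProvider
{
    public override Customer GetCustomer(int id)
    {
        // Load the domain object straight from the database.
        throw new NotImplementedException("db access elided in this sketch");
    }
}

// Used when the app must go through Web services (e.g., a smart client).
public class WebServiceCustomerProvider : CustomerProvider
{
    public override Customer GetCustomer(int id)
    {
        // Call the generated proxy, then hand back the real domain type.
        throw new NotImplementedException("service call elided in this sketch");
    }
}
```

Either way, consumers of the domain model never see the proxies; they only ever deal in domain types.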

It requires a little extra effort, but it means you can design and use a real domain model and make it easier to use by hiding the complexity of these framework deficiencies from consumers of the domain model.  This is especially helpful when different sets of developers work on different layers of the application, but it is also ideal for use and reuse by future developers.

One of these days, I'll write some sample code to exemplify this approach, maybe as part of a future exemplar.

Notes
1. The weatherthing says it's 65 degrees Fahrenheit right now--at 1pm!
2. My observation is that it is safe to assume that when other people talk about services and Web services, these are the kind they're thinking of, even if they don't make the distinction I do in this post. 

Saturday, September 15, 2007 6:00:03 PM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [0]  | 
# Monday, September 10, 2007

I wasn't going to post about it, but after reading Don's post, I realized that I should so that I can thank those involved in presenting me with this honor.  I was surprised when I was contacted about being nominated to be an INETA speaker, and I was even more surprised when I heard that I'd been voted in.  Looking over the folks on the list, I feel hardly qualified to be named among them.

So without further ado, let me thank David Walker (who's an all around great guy and VP of the Speakers Bureau), Nancy Mesquita (who I've not had the pleasure to meet personally but has been very helpful in her role as Administrative Director), as well as everyone else involved on the Speaker Committee and others (whom I know not of specifically) in welcoming me into the INETA speaker fold.  It's a great honor--thank you. 

Now, I have to get back to work!  My group, UXG, just released Tangerine, the first of our exemplars, and now we're on to the next great thing!

Monday, September 10, 2007 10:19:19 AM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [1]  | 
# Tuesday, August 14, 2007

Thanks to a sharp co-worker of mine, I was recently introduced to "Magic Ink: Information Software and the Graphical Interface," by Bret Victor.  It was quite an interesting read; Victor makes a lot of good points.  For instance, he suggests that we should view information software as graphic design, i.e., taking the concerns of traditional graphic design as paramount and then taking it to the next level by availing ourselves of context-sensitivity, which he defines as inferring the context from the environment, history, and, as a last resort, interaction.

Minimizing Interaction

The thrust of the argument is around reducing interaction and making software smarter, i.e., more context aware and, eventually, able to learn through abstractions over learning algorithms.  I think we can all agree with this emphasis, but I do think he unnecessarily latches onto the term "interaction" as a bad thing, or rather, I think he presents "interaction design" in an overly negative light. 

True, the smarter we can make computers (and consequently the less interaction we require from users) the better, but that doesn't negate the usefulness of interaction design, human factors, information architecture, and usability.  There are many valuable things to be learned and used in all of these interaction-oriented fields, and we shouldn't deride or dismiss them because they focus on interaction.  I felt that Victor's negative emphasis on this, and his speculation about why software sucks in relation to it, took away from the value of his overall message.

The Problem of Privacy

There is one problem that I don't think he addressed in terms of increasing environmental context awareness, and that is security, specifically, privacy.  It is tempting to think about how wonderful it would be for a computer to know more about our environment than us and thus be able to anticipate our needs and desires, but in order to do this, we, as humans, will have to sacrifice some level of privacy.  Do we really want a totally connected computer to know precisely where we are all the time?  Do we really want it to be "reporting" this all the time by querying location aware services?  Do we really want a computer to remember everything that we've done--where we've been, who we've interacted with, when we did things?

I think the trickier issues with context awareness have to do with questions like these.  How do we enable applications to interact with each other on our behalf, requiring minimal interaction from us, while maintaining our privacy?  How does an application know when it is okay to share X data about us with another application?  Do we risk actually increasing the level of interaction (or at least just changing what we're interacting about) in order to enable this context sensitivity? 

If we're not careful, we could end up with a Minority Report world.  People complain about cookies and wiretaps now; a world of computer context-sensitivity will increase privacy concerns by orders of magnitude.  This is not to negate the importance of striving towards greater context sensitivity.  It is a good goal; we just need to be careful how we get there.

Towards Graphic Design

One of the most effective points he made was in illustrating the difference between search results as an index and search results as a tool for evaluation in themselves, i.e., thinking about lists of information in terms of providing sufficient information for a comparative level of decision making.  It is a shift in how developers can (and should) think about search results (and lists in general).

Similarly, his example of the subway schedule and comparing it to other scheduling applications is a critical point.  It illustrates the value of thinking in terms of what the user wants and needs instead of in terms of what the application needs, and it ties in the value of creating contextually meaningful visualizations.  He references and recommends Edward Tufte, and you can see a lot of Tufte in his message (both in the importance of good visualizations and the bemoaning of the current state of software).  I agree that too often we developers are so focused on "reuse" that we fail miserably in truly understanding the problems we are trying to solve, particularly in the UI.

That's one interesting observation I've had the chance to make in working a lot with graphic/visual designers.  They want to design each screen in an application as if it were a static canvas so that they can make everything look and feel just right.  It makes sense from a design and visual perspective, but developers are basically the opposite--they want to find the one solution that fits all of their UI problems.  If you give a developer a nicely styled screen, he'll reuse that same style in the entire application.  In doing so, developers accidentally stumble on an important design and usability concept (that of consistency), but developers do it because they are reusing the design for maximum efficiency, not because they're consciously concerned about UI consistency!  It is a kind of impedance mismatch between the way a designer views an application UI and the way a developer does.

The Timeless Way

I'm currently reading Christopher Alexander's The Timeless Way of Building, which I hope to comment on in more depth when done.  But this discussion brings me back to it.  In fact, it brings me back to Notes on the Synthesis of Form as well, which is an earlier work by him.  One of the underlying currents in both is designing a form (solution, if you will) that best fits the problem and environment (context).  The timeless way (and patterns and pattern language, especially) is all about building things that are alive, that flow and thrive and fit their context, and the way you do that is not by slapping together one-size-fits-all solutions (i.e., reusing implementations) but in discovering the patterns in the problem space and applying patterns from the solution space that fit the problem space just so.  The reuse is in the patterns, at the conceptual level, but the implementation of the pattern must always be customized to fit snugly the problem. 

This applies in the UI as well as other areas of design, and that's the underlying current behind both Tufte's and Victor's arguments for the intelligent use of graphic design and visualization to convey information.  You must start by considering each problem in its context, learn as much as you can about the problem and context, then find patterns that fit and implement them for the problem in the way that makes the most sense for the problem.  But more on the timeless way later.

A Good Read

Overall, the paper is a good, thought-provoking read.  I'd recommend it to pretty much any software artisan as a starting point for thinking about these issues.  It's more valuable knowledge that you can put in your hat and use when designing your next software project.

Tuesday, August 14, 2007 10:41:14 AM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [0]  | 
# Monday, July 30, 2007

Are you passionate about software development?  Do you love to share your knowledge with others?  Do you like working in a vibrant, fun culture working on the latest and greatest technologies with other smart and passionate people?  If so, I think I may have your dream job right here.

We're looking for another guidisan to help craft guidance using best practices for .NET development.  The word guidisan ('gId-&-z&n) comes from a blending of "guidance" and "artisan," which really speaks to the heart of the matter.  We're looking for software artisans who have the experience, know-how, and gumption to explore strange new technologies, to seek out new applications and new user scenarios, to boldly go where other developers only dream of going in order to provide deep, technical guidance for their colleagues and peers.

What do guidisans do? 

  • Help gather, specify, and document application vision, scope, and requirements.
  • Take application requirements and create an application design that meets the requirements and follows best known practices for both Microsoft .NET and Infragistics products.
  • Implement applications following requirements, best practices, and design specifications.
  • Create supplemental content such as articles, white papers, screencasts, podcasts, etc. that help elucidate example code and applications.
  • Research emerging technologies and create prototypes based on emerging technologies.
  • Contribute to joint design sessions as well as coding and design discussions.

What do I need to qualify?

  • Bachelor’s Degree.
  • 4+ years of full-time, professional experience designing and developing business applications.
  • 2+ years designing and developing .NET applications (UI development in particular).
  • Be able to create vision, scope, and requirements documents based on usage scenarios.
  • Demonstrated experience with object-oriented design; familiarity with behavior-driven design, domain-driven design, and test-driven development a plus.
  • Demonstrated knowledge of best practices for .NET application development.
  • Accept and provide constructive criticism in group situations.
  • Follow design and coding guidelines.
  • Clearly communicate technical concepts in writing and speaking.

If you think this is your dream job, contact me.  Tell me why it's your dream job and why you think you'd be the next great guidisan.

Monday, July 30, 2007 3:01:27 PM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [1]  | 
# Sunday, July 15, 2007

Thanks to all who came to my "suave sessions" session yesterday at Tampa Code Camp.  Now you're all "it getters," and you get some free code, too.

Download the Session Management Code

Enjoy!

Sunday, July 15, 2007 7:50:42 AM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [1]  | 
# Sunday, April 29, 2007

Essential Windows Presentation Foundation is precisely what the title says it is.  What more can you ask for in a book?  There are already several books on the RTM of WPF, and there are bound to be more.  The unique value this one has is that it is written by Chris Anderson, who, as most know, was an instrumental architect in designing WPF, and this (along with his direct connection to the others who worked on it) gives him insight that you just otherwise can't get. 

In particular, I like that he often provides the thinking that went into particular design decisions.  He readily admits in several places that the design of this or that was hotly debated, and one can only imagine that they would be.  Having worked at a few commercial software vendors myself, I know how difficult it can be to know the best way to design a thing, and it can only be more challenging as your audience widens. 

After this, the main thing that makes the book valuable is that it is deeply conceptual.  The point of the book is not to be a reference, a recipe book, or a smattering of tutorials.  Rather, the book provides, in a coherent form, the key principles underlying the different aspects of WPF.  And by elaborating these principles, Chris establishes a strong sense that the Foundation was designed in a similarly coherent manner.

My favorite chapters were the one on Data, the one on Actions, and the Appendix.  For a solutions architect and developer, these I think provide the most interesting meat.  Of course, these types will likely want to delve into the first three chapters as well.  In fact, the only one that I'd suggest you can probably get away with skipping is the one on Visuals; I found it pretty dry and hard to push through.  Designers and those more interested in graphics per se will likely enjoy it.

The chapter on Styles took me by surprise, but then, that's because the concept of styles in WPF is a tad surprising.  Being the language-oriented person that I am, I am a bit bothered by the choice of Style to encompass everything that you can do with styles in WPF.  Needless to say, it's not just UI goodness--devs will need to be pretty familiar with this stuff.

Other than that, my only contention is with the assertion that apps today are all about data.  This won't come as a surprise to those who've read my articles or talked to me about architecture much, but despite my philosophical objection, when it comes to UI, I'll admit that LOB apps are in fact largely about the data, i.e., largely about displaying and manipulating data since thus far, we seem to have mainly used computers to help with data storage and retrieval.  In any case, it is certainly important to have good data binding mechanisms in the UI, and I have to say, WPF nails this better than any UI tech I've bumped into thus far.

But I digress.  The book is good; I recommend it as a starting point or to complement other WPF learning resources.  It is the essentials with which you can start effectively creating WPF applications.  You'll need the docs and/or other more comprehensive books to really figure it all out, but you should read this one regardless.

Sunday, April 29, 2007 8:12:40 PM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [0]  | 
# Friday, April 27, 2007
Just a one question survey. 
 
If you are evaluating a software product, what do you prefer to do:
A) Download everything, including help, samples, SDK, etc. at once, even if it may be half a gig.
B) Just download the product bits first and then either download the help, samples, SDK, etc. separately as you need them (or never download those and just use online help/samples).
C) Download a shell installer that lets you pick what you want and only downloads/installs what you pick.
D) Try out the bits in an online VM environment.
E) Other, please specify.
 
You can either just pick one or put them in order of preference.
 
Thanks in advance for any opinions!
Friday, April 27, 2007 2:31:06 PM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [1]  | 
# Wednesday, April 11, 2007

Christopher Alexander is a noted traditional (i.e., not software) architect who's been writing about design since well before I was born.  Some of his books, most notably A Pattern Language, are the basis of the patterns movement (for lack of a better word) in the software industry.  Anyone who writes on software patterns includes his works in their bibliographies, so I figured there must be something to it.

Not being one to trust others' reductions and paraphrasing any more than I have to, I've been wanting to dig into his work myself for some time.  I finally got around to it in early March.  I've started with Notes on the Synthesis of Form, which seems to be the first book in the series on patterns.

Apart from loving the plain black cover and white block lettering and of course the obscure sounding title, I also enjoyed the innards.  It really is interesting how similar the problems and processes of three-dimensional design and architecture are with those of software design and architecture.

I dare not reduce this work or ask you to depend upon my fuzzy recollections for a precise summary, but what follows is what I recall of the book, those things which made enough of an impression to stick with me at least these few weeks since my reading.

First, I recall the observation that we often only really know the proper form (solution) by recognizing things that are out of place (misfits).  What's interesting about this is how utterly incompatible it is with the idea of waterfall design, i.e., trying to imagine and gather all the requirements of a solution up front.  We simply lack the imagination to create solutions that fit perfectly using the waterfall approach, and the more complex the problem, the more likely this approach is to fail.

This is in part why agile, iterative development and prototyping works better.  It enables us to create a form (a solution) and see how well it fits against the actual problem.  We can easily recognize the misfits then by comparing the prototype or iteration to the problem and make small adjustments to eliminate the misfits, ultimately synthesizing a much better-fitting form than we could ever imagine up front.

Second, I found the approach to the composition of the individual problems into the most autonomous groups (problem sets) possible to be insightful.  But the key observation here is that this composition should be based in the realities of the problems, not in the preconceived groupings that our profession has set out for us. 

For instance, rather than starting with the buckets of security, logging, exception handling, etc., you identify the actual individual problems that are in the problem domain, group them by their relative interconnectedness, and then attempt to design solutions for those groupings.  The value in this observation lies in keeping us focused on the specifics of the problem at hand rather than attempting to use a sort of one-size-fits-all approach to solving design problems. 

Further, if we take this approach, we will have more success in creating a form that fits because the groupings are along natural boundaries (i.e., areas of minimal connectedness) in the problem domain.  Thus when we create a solution for a set of problems, the chance that the solution will cause misfits in other sets is diminished.

Finally, as we identify these natural sets in the problem domains, we see recurring, like solutions (patterns) emerge that can be generalized to create a sort of rough blueprint for solving those sets of problems.  The patterns are not rote algorithms with no variation or creativity but rather are like an outline from which the author can craft the message using his or her particular genius. 

This avoids the pitfall of the one-size-fits-all solution, provides for competition and creativity, and ultimately has the best chance of enabling designers to create a system of forms that integrate harmoniously and address the actual problems at hand.

And the idea is that these sets are also hierarchical in nature such that one can create sets of sets of problems (and corresponding patterns) to create higher and higher level coherent views of extremely complex problem domains.  This, of course, fits nicely with the way we deal with problems in the software world as well (or in managing people, for that matter), dealing with problem sets and patterns all the way from enterprise application integration down to patterns governing individual instructions to a CPU (or from the C-level management team down to the team supervisors).  What can we say, hierarchies are convenient ways for us to handle complex problems in coherent ways.

So what does it all mean?  Well, I think it in large part validates recent developments in the industry.  From agile development (including test-driven design) to domain-driven design to, of course, the patterns movement itself.  We're seeing the gradual popular realization of the principles discussed in this book. 

It means that if we continue to explore other, more mature professions, we might just save ourselves a lot of trouble and money by learning from their mistakes and their contributions to human knowledge.  It's like avoiding a higher-level Not Invented Here Syndrome, which has long plagued our industry.  We're a bunch of smart people, but that doesn't mean we have to solve every problem, again!  Why not focus on the problems that have yet to be solved?  It makes no more sense for a developer to create his own custom grid control than it does for our industry to try to rediscover the nature of designing solutions to complex problems.

It also means that we have a lot of work to do yet in terms of discovering, cataloguing, and actually using patterns at all levels of software design, not for the sake of using patterns but, again, for the sake of focusing on problems that have yet to be solved.  I look forward to continuing reading The Timeless Way of Building and to the continued improvements of our profession.

Wednesday, April 11, 2007 11:27:33 PM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [0]  | 
# Thursday, March 1, 2007

Don't forget!  This Saturday is NYC Code Camp III!  Unfortunately, I won't be able to make it as I'm helping to prepare for our upcoming 2007 Volume 1 launch next week.  We've got some great stuff in the works from the User Experience Group, so keep an eye out for it. :)

Thursday, March 1, 2007 11:39:10 AM (Eastern Standard Time, UTC-05:00)  #    Disclaimer  |  Comments [0]  | 
# Tuesday, November 14, 2006

I finally got a chance to start looking at the blog posts that have been piling up in my Newsgator for a few months now, and I was pleasantly surprised by probably the best thoughts on SOA I've seen in a long time.  I'm really glad that Rocky's fighting the good fight on this one, and he's been consistent, too.  Another, more thorough commentary on the subject is provided by a good friend of mine, Tom Fuller, in his article last year on The Good, the Bad, and the Ugly of Service-Oriented Architecture.

The bottom line, IMO, is that working towards SOA is a good thing, but we have to be very cautious and extremely deliberate in how we get there.  I think most good architects know this, but we have to get the message out there and overcome the hype to minimize the trough of despair.

Tuesday, November 14, 2006 6:47:42 PM (Eastern Standard Time, UTC-05:00)  #    Disclaimer  |  Comments [0]  | 
# Monday, November 13, 2006

Last week, the head geek at Telligent told me about this new service they’re offering called blogmailr.  It’s a pretty cool concept; it allows folks to post to their blogs using email.  So I thought I’d try it out.  If you’re reading this, it went through blogmailr.  It’s worth a look.

Monday, November 13, 2006 11:28:40 AM (Eastern Standard Time, UTC-05:00)  #    Disclaimer  |  Comments [0]  | 

I was just reminded by our local Dev Evangelist, Peter Laudati, that we've got our third NJ CodeCamp coming up this weekend.  Code camps are a fun way to get to know other local devs, learn some cool stuff, and generally get at least a free lunch!  So you should go!

Monday, November 13, 2006 9:55:40 AM (Eastern Standard Time, UTC-05:00)  #    Disclaimer  |  Comments [0]  | 
# Friday, November 10, 2006

So Infragistics had a pretty cool release today, if I do say so myself.  We've released a beta patch for our NetAdvantage for ASP.NET product that supports Microsoft's ASP.NET AJAX Beta 1 and Beta 2. 

Support Details

  • Our controls will register themselves with the UpdatePanel to ensure proper operation within it.
  • Our Javascript Client-Side Object Model (CSOM) continues to work alongside the Microsoft AJAX Library.
  • Infragistics controls will not interfere with the Microsoft AJAX Library.
  • Infragistics controls can be embedded in and work with ASP.NET AJAX Control Toolkit controls.

I'm pretty pumped about ASP.NET AJAX, especially the Microsoft AJAX Library.  It should make cross-browser client-side development much easier, and with the AJAX Extensions, adding AJAX to your ASP.NET apps gets considerably simpler; the UpdatePanel is an indisputable help in that respect.

Infragistics is committed to the ASP.NET AJAX platform.  We'll be supporting it throughout the beta, the release, and beyond.  We've been adding AJAX-powered features since 2004, and it is only going to get better for us and everyone else thanks to the new platform and tool enhancements that are coming down the line.

Friday, November 10, 2006 7:39:04 PM (Eastern Standard Time, UTC-05:00)  #    Disclaimer  |  Comments [0]  | 
# Thursday, October 26, 2006

Today we launched our new web site.  It was not just a simple update; we revamped the whole deal and made it Web 2.0 compliant <grin>.  If you remember our old site, I trust you'll immediately see the improvement.  Please take a minute to check it out and let me know what you think.  Also, if you run into any problems with it, please feel free to let us know.

Thursday, October 26, 2006 7:24:38 PM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [3]  | 
# Monday, October 23, 2006

I just spent an hour or more that I DON'T HAVE debugging a mysterious caching issue.  I suppose in some cases it might be obvious, but in this one, it was not.  To sum up, we're using an XmlDataSource control generically and setting its Data property programmatically (and using an XSL--don't know if that matters). 

Anyways, apparently the dang control defaults to "cache indefinitely" and won't refresh until the file it depends on changes.  I guess the thing is that it doesn't look for changes when you set the Data property, so it caches indefinitely to be sure.  Set EnableCaching to false, and voila, the problem is solved.
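In code, the fix is a one-liner before setting the Data property (the control and variable names here are illustrative):

```csharp
// Turn off the XmlDataSource control's default indefinite caching so that
// a programmatically assigned Data value actually takes effect on each use.
myXmlDataSource.EnableCaching = false;
myXmlDataSource.Data = transformedXml;
```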

This just highlights a rule that all general APIs should follow--don't do any automatic caching.  You can't account for all the ways your customers will use your stuff, so just don't do it.  It's not hard to make them flip a bit to turn it on. 

Argh!

Monday, October 23, 2006 10:39:43 PM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [1]  | 
# Tuesday, October 17, 2006

And overall I think it went quite well.  My Suave Sessions session was attended by a whopping ONE PERSON!  I seem to recall his name is Mark, and he runs the Ft. Worth DNUG, so kudos to him for picking a great session!  I know it wasn't sexy, but good session handling is something we should all be concerned about, certainly more so than getting an intro to DNN by the great Shaun Walker (who was presenting at the same time and whom I blame for stealing all my potential attendees).  The good news is that it's recorded and Wrox will be hosting it on their web site, so all of you folks who made the unfortunate decision not to attend can still get the session.  :)

Download the dotNetTemplar Session Management Module (for the Suave Sessions Session) - Even if you didn't see the session, you can start adding good session handling to your pages right away.  There's a demo web project there to show how to use it.  If you want the demos from the presentation, let me know.

The EntLib session didn't go quite so well.  Apparently, I should really check to ensure my old demos work before the day of when I give a repeat session, he thought, embarrassed.  So I apologize again to all the troopers who toughed out the session with no running demos.  Thankfully, the core concepts could still be expressed; it just wasn't as fun as it could be.

Download the ELMAH EntLib Exception Handler/Logger - This can be used to both specify ELMAH as a custom EntLib exception handler and use EntLib for your db access in ELMAH.

To use it, configure ELMAH as usual.  If you want to use the EntLib logger, use GotDotNet.Elmah.EntLibErrorLog as the error log type instead of the standard SQL one. 

To use the custom exception handler in EntLib, you just need to choose it in the EntLib GUI by loading the ELMAH DLL and picking the GotDotNet.Elmah.ElmahEntLibExceptionHandler as the handler type.  It should look something like this in the standard config:

<add type="System.Exception, mscorlib, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089"
     postHandlingAction="None" name="Exception">
  <exceptionHandlers>
    <add type="GotDotNet.Elmah.ElmahEntLibExceptionHandler, GotDotNet.Elmah, Version=2.0.50727.42, Culture=neutral, PublicKeyToken=58d6fbf09c89f721"
         name="Elmah EntLib Exception Handler" />
  </exceptionHandlers>
</add>

The public key token will differ, though.  I just reconditioned this for public use real quick like, so let me know if you have any issues.

Download the Slides From Both Presentations - In case you didn't get the DVD.

Other than that, I have to give some big kudos to David Walker and his team for putting the conference together.  I've spoken at a number of code camp activities, and this was definitely one of the best organized and professionally done.  I can't help but think that their not shunning sponsors (like Infragistics) helped in making it better.  While I appreciate the academic ideal of trying to keep the code camp focused on devs sharing with devs, I think it is perhaps not in the best interests of anyone to shun sponsorship.  The vendors who sponsor conferences like that have tools that are supposed to make devs lives better, so in my opinion, it only makes sense to welcome them in as long as it is done tastefully.

And no, I didn't just start thinking this now that I'm working for a vendor; you can ask Joe Healy--I was pushing for sponsors when I was helping organize the Tampa code camp.  After all, it's not like Microsoft's stuff is free, and if the conference is about using Microsoft's technologies, why limit the vendor sponsors and topics to Microsoft?  Microsoft does a lot to make software development better, and we all welcome that.  I'm just suggesting the same thinking be extended to other companies who do the same thing.

Anyways, I didn't intend to rant about that really; I mainly wanted to say that David et al did a great job.  It was good to visit my hometown again, and while I didn't make it out to Ron's Chili & Hamburgers Too for that sausage chili cheeseburger I've been missing, I still thoroughly enjoyed the visit.  Tulsa certainly has been growing its dev community, and I hope they continue to do so.

That's it.  Hope everyone's having a great day!  Sorry bout the delay in getting this up.

Tuesday, October 17, 2006 3:55:26 PM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [0]  | 
# Saturday, September 30, 2006

A while back, I decided I needed to add browser-specific capabilities to my web application.  While there are those who advocate capability testing over browser sniffing, there is at least one good reason to prefer sniffing: you want to be sure your site works as well as possible in all browsers while taking advantage of capabilities available only in some.

In itself, this is not a reason to prefer sniffing; the key, however, is that you "want to be sure," which means testing; otherwise, your stuff may or may not work, which isn't very reassuring.  If you don't have the resources to do testing in all target browsers or the time to develop Javascript workarounds based strictly on specific capability detection, then sniffing is a good alternative because it allows you to only use "advanced" functionality in the browsers that you have tested and fall back to standard functionality for the rest.  This of course assumes you have architected your stuff in such a way as to make downgrading possible and still offer fairly equivalent services in a less rich presentation.  That in itself can be challenging and is far too involved and a bit off topic for this post.

So let's just assume you can fall back.  The next question is where you do the downgrading.  You can do it in JavaScript, which is fine if you only want to alter some functionality on the client side; but if you want to, say, avoid JavaScript altogether or emit significantly different script per browser, the better choice is to detect on the server and act appropriately.

Thankfully, most web server technologies support browser sniffing, and ASP.NET has expanded and improved on this with control adapters in 2.0.  But you can still use the old browser capabilities approach.  To do this in 2.0, you simply add the special App_Browsers folder to your web site or project (if you're using WAP).  In it, you add a file with the .browser extension containing your own custom browser capabilities.  Here's an example:

<browsers>
  <browser refID="MozillaFirefox">
    <capabilities>
      <capability name="supportBubblePopup" value="true" />
      <capability name="supportAjaxNavigation" value="true" />
    </capabilities>
  </browser>
  <browser refID="IE6to9">
    <capabilities>
      <capability name="supportBubblePopup" value="true" />
      <capability name="supportAjaxNavigation" value="true" />
    </capabilities>
  </browser>
</browsers>

That's it, if all you want to do is extend the existing browser definitions.  If you want to define new ones or to find out more about the browser schema, you can consult the MSDN docs.  Using refID just lets you reference an existing definition and extend it.  You can view existing definitions that ship with .NET in the Framework CONFIG\Browsers directory (e.g., C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\CONFIG\Browsers), but be sure not to modify those directly to avoid your customizations being overwritten with later patches.

Note: You might be tempted to set a default value using refID="Default" as the docs suggest, but I've confirmed that there is a bug that causes the default to actually overwrite the more specific settings.  Microsoft tells me that they have scheduled a fix for this bug and that it will be released with the next Service Pack, but if you need it sooner, you can create a support incident and get a hotfix.  So the workaround is to not use Default and have your code check against null to determine default.  It's not the nicest approach, especially when you're using a Boolean value that would be better to just parse to bool, but it's not a travesty either.
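To make that null check concrete, a small helper like this might do (the helper is my own sketch, not from the docs):

```csharp
// Sketch of the workaround: a browser with no explicit capability entry
// comes back as null, which we treat as the default ("not supported")
// rather than relying on refID="Default".
private bool SupportsCapability(string name)
{
    string value = this.Page.Request.Browser[name];
    return value == null ? false : bool.Parse(value);
}
```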

You can then test for the capability in your code like so:

this.Page.Request.Browser["supportAjaxNavigation"] == "true"

and do your downgrading if need be.  Of course, if you have some serious alternative rendering that needs to occur, you should consider using control adapters (especially the PageAdapter) to avoid complicating code with lots of conditional statements.

 

Saturday, September 30, 2006 2:30:26 PM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [0]  | 
# Friday, September 15, 2006

Hi all, I just wanted to spread the word about the Tulsa TechFest next month.  As some of you know, I originally hail from Tulsa, OK, and I'm glad to be going back to my home town for this conference.  I'll be doing one session on EntLib for ASP.NET (modified version of my TechEd talk) and one on professional session handling (talking about things like HTTP modules, handlers, custom controls, etc. to gracefully handle sessions in ASP.NET). 

Of course, I'm just one of many speakers who'll be presenting on a large variety of topics, not just .NET.  They'll also have vendors with lots of goodies--I know Infragistics has some neat giveaways planned.  So if you're in the region, you should definitely check it out.  It looks like it'll be the best tech event that the heartland has seen.  Kudos to David Walker and the other organizers.

Friday, September 15, 2006 12:09:19 PM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [0]  | 
# Monday, September 11, 2006

I just ran across what appears to be a nasty bug in Firefox today.  Without explaining why I'm doing it, suffice it to say that I just wanted to call a script in a parent frame (from an IFrame) that ultimately results in an XMLHttpRequest.  It works in IE 6 and 7, but in FF 1.5, it just doesn't.  In fact, the result of the request seemed to be the result of the previous request that was executed, and to make it more interesting, the responseXML returned null while the text showed the results of the previous call.

Oddly enough, no errors were thrown--it acts like a regular call with an error-free response; it just doesn't actually seem to perform the request and in the meantime loses its XML document.

Anyways, I was starting to despair when I ran across this blog post.  It seems I'm not the only one who's bumped into this one (or some variation), yet I must say it was hard to find that blog entry based on my searching.  So I want to raise its visibility with this post. 

For me, the solution was simply to use window.top.setTimeout(myfun, 50).  That appears to give FF the context it needs to properly execute the request.  But of course, that breaks it for IE, so you have to check for Firefox (e.g., if (navigator.userAgent.indexOf('Firefox') != -1)) and do the timeout if so; otherwise, make the call directly.
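Put together, the workaround might look something like this (the function names are mine; the post only describes the technique):

```javascript
// Simple UA sniff for Firefox, as described above.
function isFirefox(userAgent) {
    return userAgent.indexOf('Firefox') != -1;
}

// From inside the IFrame: defer the parent-frame call via the top window's
// setTimeout in Firefox (giving it the right script context for the
// XMLHttpRequest), but call directly in IE.
function invokeInParent(win, myfun) {
    if (isFirefox(win.navigator.userAgent)) {
        win.top.setTimeout(myfun, 50);
    } else {
        myfun();
    }
}
```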

Phew!  Thanks to those who've gone before me! 

               
Monday, September 11, 2006 10:06:32 PM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [0]  | 
# Wednesday, September 6, 2006

I just want to spread the word about a little utility on gotdotnet that generates a strongly-typed profile for WAPs.  It's not the easiest thing to come across:

http://www.gotdotnet.com/workspaces/workspace.aspx?id=406eefba-2dd9-4d80-a48c-b4f135df4127

Note that the profile goo in 2.0 should just work, even with WAPs.  This just gives you a better design-time experience.

Wednesday, September 6, 2006 11:04:03 AM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [0]  | 
# Wednesday, August 30, 2006

I know the job market for .NET devs right now is really hot, and those with experience typically have a lot of choices.  So let me add another choice to the list.  Yes, I'm talking about Infragistics.  No, they're not paying me to say this.  Well, yeah, they're paying me, but not to say this. Rather than just stick a job description out, I hope you'll let me tell you why I like working for Infragistics.  I've been here nigh on four months, and I'm still liking it, so that's something.

Why I Like Infragistics

I've worked at more than my fair share of companies in my career, sometimes consulting, more often as an employee, and this is the best company I've worked at.  Sure, at other companies there are good people; I've worked with many.  But Infragistics not only has good people, it is a good company--it has good culture and actively works to improve it.

It's not just that it is a software company, though I think that helps.  I've worked at more than one commercial software company, and the cultures between them are as starkly contrasted as light is from darkness, happiness from sorrow.  It's not just its size because I've worked at others about our size, larger and smaller.  It's not just that the dress code is as relaxed as it gets, though that's nice. 

Other companies talk about passion, but here it is ingrained in the culture.  People care about what they are doing and strive to do their best, and it starts at the top.  Probably no one here is more passionate about the company than our CEO and the rest of our management team.  And the enthusiasm spreads into every department, even to folks like me, in case that wasn't obvious.  For me, that makes all the difference--working with positive, enthusiastic, and creative individuals at a company that fosters that kind of an environment.

For those who are still new enough in their career to think that stuff like this isn't important, that salary is all that matters, I hope you can take it to heart from me that probably more than anything--more than salary, more than benefits, more than location--the culture of your company is what makes or breaks whether or not you are happy in your job.  Now I'm not naive enough to think that our culture fits everyone; it won't.  But if you are a motivated and talented individual who likes to be challenged, likes to make a difference, and wants to get experience working with, learning from, and sharing your knowledge with other similar individuals, Infragistics just may be the place for you.

Being a community guy, I also like that Infragistics has done and is doing a lot of community support.  We host the local .NET and Java user groups at our HQ.  We often sponsor and send speakers to code camps and tech fests.  We sponsor user groups worldwide, and if you like being involved in the community, Infragistics goes out of its way to help you in that.  And we're always looking for new ways that we can support our communities, so if you have ideas, shoot them over to me.

Of course, being a geek, I like that Infragistics gives me the opportunity to work on the latest and greatest technologies.  If you're stuck in a job where they're taking the "safe" course of not upgrading, you won't face that problem here.  In fact, we're challenged to be and stay on the edge of the technological spectrum.

And as an aesthetically sensitive person, I appreciate that Infragistics provides a good working environment.  The building is nice; the work area is nice; the equipment is nice, and there is a degree of freedom to make your space your own.  If you want to have medieval action figures along the tops of your cube (like I do), you can do that.  Or if you are into feng shui, so be it. 

In terms of location, I think it's great.  I've already blogged about that.  We're now rounding into fall (already!) and our hottest temps this summer were a few days of maybe a hundredish.  Coming from Tampa, I can now avow that it in fact does not get (or at least seem) as hot and humid here--it was stickier when I left Tampa in early May than it got here the entire summer.  And you don't have to worry about hurricanes really.  Of course, Wally recently counseled me that I might want to wait until I've lived through a winter before I sing the praises of the weather, but I'll take a livable summer and cold winter any day over unbearable, six-month-long summers and mild winters (that don't really even qualify as winters).  I've always said, you can bundle up as much as you need to, but you can only take so much off! 

But again, like company culture, I know there are different strokes for different folks (Joe!). :)  The only reason I mention this is to counteract the common misconception about New Jersey being an undesirable place to live.  If you're into culture, plays, or clubbing, New York City and Philadelphia are just a stone's throw away by car or train.  If you like small town family feel, we've got that, too; I just went to a butterfly festival with my family a couple weeks ago, and they've had others (like insect, peach, etc. festivals all over the place).  If you like history, you can't go far without running into some monument commemorating where Washington did something or where, e.g., some of our founding fathers went to college.  Shopping?  Route 1 is the place to be (or, again, NYC).  Like to travel?  The Newark airport is one of the largest in the US.  Compared to the other places I've lived, it fares quite well on the pros v. cons.  So if the "armpit of the US"/Sopranos stereotype is all that's stopping you from joining us here in central Jersey, don't let it!

What's Available

Now that I know you're chomping at the bit to work for Infragistics, I guess it wouldn't hurt to mention the positions we're hiring for.  You can see a full list of open positions on our careers pages.  You'll need to use the quick links to see the list by location.  Yes, we do have spots open at locations beyond our HQ here in NJ, and if those appeal to you, the more the merrier.  But looking specifically at our HQ openings (and since this is a developer-oriented blog), I'll highlight the Sr. R&D Engineer position for our .NET web controls.  We need someone who is very strong with web UI development and, of course, .NET.  It's a tall order, but I'm sure you're out there.  If you think you're an ASP.NET web UI expert, you should definitely consider it.  That position has challenges that most of us devs never have to face.

We also have some other dev positions in the internal systems department, so if the R&D position doesn't seem like it would fit you, you might check those out.  There is a lot of mobility possible in this company, so you might start in internal systems and then move to other areas that you later find more interesting, e.g., evangelism, R&D, etc., as positions become available. 

It's certainly a fun, interesting, agile, and challenging place to work.  All of these positions involve cutting edge technologies, working with great people, in a great culture.  Maybe you see some other position that's open and interests you, or even if there isn't a perfect fit on the web site but you think you have something to offer a company like I've been describing, you can just send me your resume, and I'll ensure it gets into the right hands.  Yes, we do have a referral program, and yes, I will take you out to lunch if you get hired on as my referral.  I had to mention that because, hey, you wouldn't believe me if I said I'm just doing it to help my company (no matter how true it is). :-p No more waiting.  Do it!

Wednesday, August 30, 2006 7:36:22 PM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [0]  | 
# Wednesday, August 23, 2006

Everybody who has done any cross-browser web development is aware of the many quirks of IE6, so it is good to read, as Scott points out, that the IE team is addressing many of them with IE7.  While it isn't 100% yet, I've noticed that more often than not, if you get it right in Firefox, it looks right in IE7 and vice-versa.  That certainly cannot be said of IE6.

And on that note, I'd like to draw more attention to the IE web developer toolbar, which is mentioned in the IE team's blog.  Those who use Aardvark in Mozilla should definitely give this add-in a whirl.  It's not the same, but it does offer many similar features and some that Aardvark doesn't.  It helps take the guesswork out of troubleshooting layout and styling issues.

Wednesday, August 23, 2006 11:43:45 AM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [0]  | 
# Friday, August 18, 2006

Here's a quick tip for something pretty neat in ASP.NET 2.0.  You can specify output cache profiles in the application configuration like so (this is under the system.web configuration section):

<caching>
  <outputCache enableOutputCache="true" />
  <outputCacheSettings>
    <outputCacheProfiles>
      <add name="StandardPages" duration="300" varyByCustom="browser;culture" />
    </outputCacheProfiles>
  </outputCacheSettings>
</caching>

Then in your Global.asax code, add the following handler:

public override sealed string GetVaryByCustomString(HttpContext context, string custom)
{
    string[] variances = custom.Split(';');
    System.Text.StringBuilder response = new System.Text.StringBuilder();
    foreach (string variance in variances)
    {
        switch (variance)
        {
            case "browser":
                response.Append(this.Request.Browser.Type);
                break;
            case "culture":
                response.Append(System.Globalization.CultureInfo.CurrentUICulture.LCID.ToString());
                break;
        }
    }
    return response.ToString();
}

Now, on any pages you want to be cached like this, you can just add the output cache directive like so:

<%@ OutputCache CacheProfile="StandardPages" %>

Of course, you can add other profiles for pages that, say, vary by parameters, controls, and the like, but this makes it easy to control the caching of all pages of a given kind via the web.config, and it also shows how you might set up a profile that varies by browser and culture.  And with an implementation of GetVaryByCustomString like this, you can add your own variances and mix and match them in your profiles as desired.  It makes for a pretty flexible caching system.  For instance, you could add a check for authentication and, if authenticated, effectively turn off caching by appending a user ID or name and the current date and time (DateTime.Now.ToString()).  That way pages would be cached for anonymous visitors but not for authenticated users, whose content is more dynamic.
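For instance, the authentication check just described could be one more case in the switch shown above (the "auth" variance name is my own invention, not a built-in):

```csharp
case "auth":
    // Hypothetical "auth" variance: append a unique, ever-changing value for
    // authenticated users so their pages effectively bypass the output cache.
    if (this.Request.IsAuthenticated)
    {
        response.Append(this.User.Identity.Name);
        response.Append(DateTime.Now.ToString());
    }
    break;
```

You could then reference it in a profile with something like varyByCustom="browser;culture;auth".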

Friday, August 18, 2006 4:54:02 PM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [0]  | 
# Wednesday, August 2, 2006

I just ran across Ryan Plant's post about the "Web Architect."  He seems affronted at the idea, but this was one of my titles in a former life.  So it's not so surprising to me.  There are indeed a lot of considerations (some of which Ryan talks about) that you have to think about when designing web applications that you don't have to think about in other kinds of applications.  It really does take a specialized set of skills.

That said, I don't think that having those specialized skills would necessarily qualify one as a web architect.  Given my previous thoughts on the subject (illuminated here and elsewhere), many of which seem to echo or be echoed in other publications on the question of what IT architecture is, I tend to think that the web architect role would be a valid one if it were thought of as the individual responsible for a company's web presence.  There are, I think, distinct questions that have to be thought about in terms of the business and how it is represented on the web (at least on the properties controlled by the business). 

Depending on the company, there may be warrant for an individual in a web architect role, which would of course assume knowledge of the specific skills Ryan speaks to; but, more importantly, this role would be responsible for considering how to strategically take advantage of the web to address business needs.  In some companies, such a role may be subsumed into the greater enterprise architect or solutions architect roles, but in others, I could see it being a peer of (or possibly a report to) the enterprise architect and a peer of other IT architects, working with them to coordinate technology application for the business specifically on the web.  This assumes that there is sufficient business need for such a distinguished role, not just a need for the web skill set.

Wednesday, August 2, 2006 2:13:41 PM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [0]  | 
# Tuesday, August 1, 2006

I was just reminded via email today that I never came through on my intention to post something about my Oracle ELMAH provider.

There's not a lot to it, and if you want to understand it, just look at the ELMAH docs for how to write a provider.  There are some SQL files in the project that you'll need to run on your Oracle db.  Alternatively, there is a CreateElmahTable static method you can call from code to automate setting the table up.

Note that in addition to your standard connection string, there is a schema attribute you need to set in your ELMAH config to tell Oracle what schema to use for the ELMAH stuff.  Other than that, it should be plug-n-play just like ELMAH.  Enjoy!
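Just as a sketch (the actual section, type, and attribute names come from the download and the ELMAH docs; everything below is a placeholder), the error log entry would look something like:

```xml
<!-- Placeholder sketch only: take the real type and attribute names from the
     download. Note the schema attribute for the Oracle schema mentioned above. -->
<errorLog type="Oracle error log type name, assembly name"
          connectionString="your standard Oracle connection string"
          schema="YOUR_ELMAH_SCHEMA" />
```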

Without further ado: Oracle ELMAH Provider (15kb)

 

Tuesday, August 1, 2006 11:31:30 AM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [0]  | 
# Wednesday, July 26, 2006

Okay, so I'm a little late to this bandwagon, but on an email list I'm on, Michael Campbell alluded to Robert McLaws' shell extension idea to run the ASP.NET web development server for any folder.  Then, in a comment on that blog, Chris Frazier provides the code for a console application that overcomes some of the limitations that Robert's idea had and adds the nice feature of opening your default browser to the new root.

This is all fine and dandy, but where's the dang EXE?!  So I took Chris' code and incorporated Daniel Fisher's suggestion for getting the path to the web dev server.  In fact, I took it a step further and created a simple installer, so taking advantage of this neat idea is as simple as running it.  So without further ado:

OpenCassini.msi (355kb) <-- This will do it all for you, including making it easy to remove via Add/Remove Programs

OpenCassini.zip (11kb) <-- You'll have to manually place the files and modify the .reg file to get it going.

I hope others find this useful and maybe a bit easier than setting all this up on your own.  Also, I'll be happy to share the project with anyone who cares, but there's not much to it beyond what's outlined in Robert's post.

UPDATE: I just found the post that Robert alluded to (too bad it wasn't linked from the original).  Unfortunately, his link is giving me a 404, so maybe my time wasn't totally wasted after all. :)

Wednesday, July 26, 2006 4:48:28 PM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [0]  | 
# Thursday, July 20, 2006
I've recently been informed that "the .NET Framework groups are offering another series of Live Meeting presentations called “Live from Redmond”. These talks are given by product team members, so it’s content directly from the program managers and developers on the team. The feedback from the first series that was done in partnership with INETA was overwhelmingly positive so Microsoft has decided to offer another set of talks."  Looking over the list, it looks like there's a lot of good stuff, so I thought I'd pass 'em along.

Client Talks

16-Aug: Smart Client: Offline Data Synchronization and Caching for Smart Clients (Steve Lasker)
23-Aug: Windows Forms: An Overview of Windows Forms in Microsoft Visual Studio 2005 (Saurabh Pant)
30-Aug: Visual Studio: Developing Local and Mobile Data Solutions with SQL Server Everywhere (Steve Lasker)
13-Sep: (WinFX) Windows Forms: How to Leverage Windows Forms and Windows Presentation Foundation in a Single Hybrid Application (Scott Morrison)
20-Sep: Windows Forms: Solutions to the Most Common Windows Forms Development Challenges (Scott Morrison)

Web Talks

25-Jul: ASP.NET: An Overview of ASP.NET and Windows Workflow Foundation Integration (Kashif Alam)
3-Aug: ASP.NET: Building Real-World Web Application UI with Master Pages, Themes and Site Navigation (Pete LePage)
10-Aug: ASP.NET: Creating Web Applications Using Visual Studio 2005 Team System (Jeff King)
17-Aug: ASP.NET Atlas: A Developers Introduction to Microsoft Atlas (Joe Stagner)
22-Aug: Best Practices and Techniques for Migration Visual Studio 2003 Web Projects to Visual Studio 2005 (Omar Khan)
24-Aug: ASP.NET: An ASP.NET Developer’s Look at Using RSS (Joe Stagner)
7-Sep: ASP.NET: Under the Covers - Creating High-Availability, Scalable Web Applications (Stefan Schackow)
14-Sep: ASP.NET: Using User, Roles, and Profile in ASP.NET 2.0 (Joe Stagner)
21-Sep: ASP.NET: Comparing PHP and ASP.net (Joe Stagner)
28-Sep: ASP.NET: Security Tips & Tricks for ASP.NET Developers (Joe Stagner)

Commerce Server Talks

1-Aug: Multi-Channel, Connected Commerce (BTS/CS integration) (Caesar Samsi)
15-Aug: Commerce Server 2007 Overview (Mark Townsend)
12-Sep: Commerce Server 2007 Architectural Deep-Dive (David Messner)

.NET Compact Framework Talks

29-Aug: .NET Compact Framework 2.0: Optimizing for Performance (Ryan Chapman)


Thursday, July 20, 2006 8:49:01 PM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [0]  | 
# Wednesday, July 19, 2006

While reading over the latest issue of Perspectives from the IASA, it struck me that my current thinking about philosophy rings true for how I'm thinking about architecture, at least in one rather important aspect.  You see, in considering all of the various philosophical systems developed over human history, it strikes me that there is no one philosophy that suits all people, at least not realistically speaking.

Sure, as a devout Roman Catholic and amateur philosopher myself, I do think, ideally speaking, that Catholicism is the best philosophy for all human beings.  The problem is that, first, not all humans are philosophers.  Second, and vastly more importantly, all philosophers and non-philosophers alike are humans. 

As humans, we're made up of more than just plain ol' objective reasoning.  Indeed, I rather think that we are first and foremost a bundle of nerves and emotions, and only a few among us even try to tame that bundle into something resembling objective and rational thought.  Even those are still far and away subject to the non-rational whims of humanity, including prejudices, presuppositions, and all that other non-rational goo that makes us who we are.

This is why I say, realistically speaking, there is and can be no unifying philosophy that all humans can follow, as much as I might like for it to be otherwise.  I think this much has proven true in that neither by force nor by argument has any one philosophy been able to subdue humanity in all our history, despite attempts by the very strong, the very intelligent, and the very persuasive among us.

If this is true, what is then the best thing that we can do?  Right now, it seems to me that perhaps the best thing that philosophers can do is to try to discover philosophies that are the best for persons with a given background, a given culture, and at a given time.  I don't think this is the same thing as relativism because, first, we can still talk about the best objective philosophy for all humans (even if all humans will never follow it), and second, we can talk about an objectively best philosophy for persons of similar backgrounds, cultures, and times.  We can still say that our philosophy is the best for humanity while realizing that perhaps the best for this person over here is another, given all the factors that have shaped him or her.

About now, my technical readers will be wondering when I'll get back to talking about architecture and how it relates to these ramblings, and, happily for them, here we are.  The most recent issue from the IASA has several articles on what it means to be an architect, how to become an architect, and how best to educate for architecture, among other things.  In reading these, I was struck (I should say again) that there doesn't seem to be one unifying idea of what it means to be an IT architect or how to become one.

Certainly, there are commonalities and core competencies, but I think that ultimately, the question of whether or not one can know if he is an IT architect (shall we say, the epistemology of IT architecture) and consequently whether or not you can tell someone else you are one, depends largely on the context of the question.  Just as there are many different industries, company sizes, and corporate cultures, so it seems there should be many different categories of architects to match. 

In an earlier blog post and article this year, I tried to throw out some ideas about what software architecture is and how we should be thinking about it.  I still think that the distinctions I was drawing are valid as are the key differentiators between software architects and developers, and incidentally, I'd suggest that the distinctions are also valid for the infrastructure side of IT.  It seems to me that the key defining aspect of an architect is the ability to tangle with both the business and the technology problems and effectively cut through that Gordian Knot, arriving at the best solution.

If so, then what makes a person an IT architect depends on the business at hand and the technology at hand, not on some presupposed host of experience with different businesses and architectures.  The issue I take with Mr. Hubert in his "Becoming an IT Architect" (IASA Perspectives, Issue 4) is that it sounds as if one must have visited all his "stations" in order to know one is an architect.  While he starts out the article saying he is just recounting his particular journey, most of the article smacks of an attempt to generalize his individual experience into objective truth, in much the same way that some philosophers have tried to draw out the best objective philosophy based on their own experiences and cultures.  In the end, such attempts invariably fall flat. 

Without digging into the specifics of the "stations" that I don't think are core to becoming an IT architect, let's stick to the central proposition at hand (which makes such a specific deconstruction unnecessary), namely that IT architecture at its essence is the previously described weaving of business and technology skill, with an admittedly stronger technical than business bent.  If that is the case, there is no one definition for what it means to be an IT architect, nor is there consequently any one path to become one.  With that in mind, reading Mr. Hubert's story is valuable in as much as one wants to know how to become a software architect at the kinds of companies, projects, and technologies that Mr. Hubert works with today, but it is only one story among many in the broader realm of IT architecture.

Rather than trying to establish some single architect certification that costs thousands of dollars and requires specific kinds of experience to achieve, we should think in terms of what it means to be an architect for a company of this size, with this (or these) primary technologies, this culture, and at this time in the company's life.  Only within that spectrum can we realistically determine the best definition of an IT architect, much like there may be a best philosophy for individuals within the spectrum of particular backgrounds, cultures, and times.

Does this mean we can't talk about skills (truths) that apply to all architects?  I don't think so.  The chief skill is what I've already mentioned (solving business problems with technology), but perhaps we could say that all architects need deep experience and/or training in a technology (or technologies).  Similarly, we could say that architects need training or experience in business in general (those concepts and skills that span different industries).  We might also say that they need training or experience in particular industries, at least one.  These individual truths combine to form something of an objectively best architect, but the specific best architect definition will vary depending on the context.

This kind of talk provides a broad framework for speaking about IT architecture as a profession while leaving room for more specific categories that could be defined to enable better classification of individuals, aiding both education and recruiting.  We already have some of these definitions loosely being developed in terms like "solutions architect," "enterprise architect," and "infrastructure architect."  However, I feel that these may still be too broad to support a real epistemology of IT architecture.  Maybe "enterprise" is the best of the three in that it historically implies a large part of the context needed for a meaningful category within IT architecture, but I tend to think that "solutions" and "infrastructure" are still too vague and lacking in context.

I don't propose to have the solution all worked out, but I do think that the key, in both philosophy and software architecture, is to provide the contextual trappings needed to determine the most meaningful solution to the problem at hand.  If that means speaking of a software architect for a local, small, family-owned brewery on the one hand, and an infrastructure architect for a multinational, Fortune 500 telecom company on the other, so be it.  But if we can generalize these sorts of highly contextual categorizations into something more usable for education and certification, all the better.  Granted, we won't have categories that sufficiently address every meaningful variation (as is the case with all taxonomies), but as long as we're working forward with the necessary framework of context, I think we'll get a lot closer than many of the current attempts, which result in overgeneralization (and thus lose meaning as categories per se).

In the meantime, I'd suggest that my assertion that the key distinction is in one's purpose (see the aforementioned article) is the best way to establish a basic epistemology of IT architecture.  I think it is certainly sufficient for individual knowledge and broad group identification, though clearly more needs to be worked out to assist in the development of training, education, and certification that will feed into trustworthy standards in the various categories of IT architecture.

Wednesday, July 19, 2006 10:30:40 PM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [2]  | 
# Saturday, July 8, 2006

Well, I've finally settled on the software I'm going to use to write my blogs as well as read them.  I'm pretty picky about UI these days, and most of the software out there is just not that great when it comes to that.  But of course, I also need it to integrate and minimize the time it takes to set things up and the time I have to spend jacking with it on a regular basis. 

Authoring
For authoring, I've decided to go with WB Editor 2.5.1 by Yiyi Sun.  I like the UI.  It currently uses the webby L&F, which when done right has a pleasant, light feel to it.  One thing that immediately strikes me as nice is that when I save a post for the second time, unlike BlogJet, it just saves to the same file I saved before and doesn't prompt me to pick a new file and then, when I select the same file, ask me if it's okay to overwrite it.  That really bugged me about BlogJet.  With WB Editor 2.5.1, CTRL-S works just like you'd hope, although it does pop up a notification saying it was saved, which is a bit annoying but can be dismissed with a spacebar slap.  I'd prefer the notification be in the status bar, but it's still much better than BlogJet in terms of saving drafts.

Post Authoring 
Post Authoring with WB Editor 2.5.1

Note that the color is green; it comes with three theme (skin) options: Blue (default), Green, and Pink.  I've always had a penchant for green.  Note also that the coloring of the post itself matches my dotNetTemplar blog; you can set this up in the options by specifying styles.  It's nice because you get a better feel for what the post will look like, but it would be helpful to 1) allow for a stylesheet per blog and 2) encapsulate the entire post in a div to better capture the L&F of a single entry on a blog site (not sure how this would work in the editor, though).

I also like how ridiculously easy it is to insert images and screen shots.  When you click the insert/upload image icon, it has a friendly dialog that lets you pick the image or even paste from the clipboard.  It offers the option to automatically create a thumbnail and upload them both either via FTP or to Flickr.  I haven't tried the Flickr option, but it works great with FTP.

Adding Images
Adding Images with WB Editor 2.5.1

The HTML itself is clean, too, and it has a nifty little snippet-insert drop-down for common stuff.  Clean HTML is important to me because I don't want my editor injecting presentational markup; I want to leave the look to my style sheets.  And it seems to play well with that.  It also highlights nicely, and the highlighting colors are customizable.

HTML Editor
HTML Editing with WB Editor 2.5.1

Being a sucker for good UI, I enjoy the main screen that shows your registered blogs.  Yiyi has gone to the trouble to get images for the major blog engines (needs to update .Text to CS), so you get that along with a screen shot of your blog, the URL, and the categories.  And yes, you can of course cross post to multiple blogs, which is one reason to use an editor like this.

 WB Editor Home
WB Editor 2.5.1 Home

One of the really nice things about WB Editor from a .NET developer's perspective is that it has a plug-in architecture (currently running on .NET 1.1). 

Plugins
WB Editor 2.5.1 Plugins

An important plug-in for devs is a code highlighter.  It may not be the nicest formatting, but it works.  If you don't like it, you could easily write a plug-in to use a formatter that you do like.

[Serializable]
[XmlRoot("links")]
public class NavigationRoot
{
    NavigationLinkCollection items = 
        new NavigationLinkCollection();
    [XmlElement("link")]
    public NavigationLinkCollection Items 
    { 
        get 
        { 
            return items; 
        } 
        set 
        { 
            items = value; 
        } 
    }
}

Another feature that I like about WB Editor is its roadmap, which promises to stay on top of the latest technologies from Microsoft, such as .NET 2.0 and ClickOnce (coming in the next version) and ultimately .NET Framework 3.  It's a project that I could get excited about working on, and as you can see from the blog, it is actually being worked on.  Of course, it has other features that you can read about in its features list; I'm just highlighting the ones I think are cool.

So in short, it has everything that I'm looking for in a rich-client blog editor, and I'd recommend it over the much lauded BlogJet.  It is also competitive in pricing, currently at $19.99, which for a great piece of software like this is outstanding.

Reading
Now, I did mention at the beginning that I'd also settled on an RSS reader.  I looked at a few: RSS Bandit, FeedDemon, Windows Live, Awasu, and probably a few others that don't readily come to mind.  My issue with all of these is the amount of work involved in setting them up.  It's not that they're particularly troublesome if you can live with a straight list of blogs from your OPML file, but if you like to categorize like I do, then it becomes troublesome, especially when you use multiple machines with multiple OSes on them.  Having to repeatedly set up my subscriptions kills me, and it's one reason I have always avoided using newsgroups.

Ideally, I'd like to just set them up once, be able to read them either online or in a rich client, and have both of those stay in sync.  The only such RSS reader I ran across that fit the bill was NewsGator, and in particular, their Inbox product that integrates with Outlook.  I might have gone with their FeedDemon product, except I am in fact one of those users who almost always has Outlook open, and I figure, why have another app that is always running?  Also, I bought NewsGator way back in '04 when it first came out, so having an already-purchased license (with a free upgrade to the latest version) helped me decide.  Naturally, I had long since lost my license info, but they have a nifty license retrieval mechanism, so it was painless to get it going.

The feature I most like, in case I wasn't already clear, is that they integrate and synchronize with their online reader, so I can have the best of both worlds, and when I want to set the rich client up, I can just grab the stuff from my already setup online source.  I can make new subscriptions and reorganize them on either the rich client or web, and it will keep them in sync.  It's the best of Plaxo for RSS.  Ah, now this is software for the connected world.

So that's about it.  I just thought I'd pass along my findings to you in case you've found yourself in a similar predicament.  I'm not trying to convince you to change if you're already happy with your setup, but if you aren't happy, consider these two products for the total blog/RSS reading and authoring solution.  I hope it helps!

Saturday, July 8, 2006 1:45:13 PM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [0]  | 
# Saturday, July 1, 2006

This is a repost.  The new blog authoring software I tried tonight overwrote this post instead of posting a new one.  I'll be reporting that bug. :)

------

For some, this may be old news, but I was just visiting again today, and I thought this is something that more people should know about.  So in case you haven't heard, the ASP.NET teams have put up a little section on asp.net to host side projects that they're working on; they call it the sand box.  Right now, they have 10 projects up there, including the famed "Atlas."

There are a number of other tantalizing tidbits that you might find interesting.  I just downloaded and installed the CSS Properties Window, which is a pretty neat way to edit your styles.  It will let you see the styles applied to a particular element while in design view and let you modify them, even ones in a linked style sheet, even using themes.

One they just recently released is this thing called the Blinq Prototype, which as I understand it is a tool that will generate an ASP.NET application using DLinq (ORM paradigm) based on a database.  Now, I'm a strong advocate of object-oriented and domain-driven design, but if you need a good starting place for an application that is based on an existing database schema that uses objects (not data sets) and integrated language query (which is totally awesome!), this promises to be a RAD tool (in both senses).

Yet another nice tool is the table profile provider.  The out-of-the-box profile provider in ASP.NET stores profile data in a blob-like manner; this tool enables you to store the profile properties separately in the database as distinct columns.  I don't know about you, but that's how I prefer it.  It's always nice when you get free code done pretty much the way you want it.

One tool there that I find absolutely indispensable is the Web Application Projects.  That one's actually a tad more official than some of the others, as it is a released add-on, but despite its official suit-and-tie status, it's still a really cool tool.  I find that I pretty much want to do all my ASP.NET apps as WAPs, maybe because it is more familiar from the 1.x days, but mostly because most of the apps I work on are team projects, and WAP plays much better with VSS, in my experience, than the 2.0 web site projects.

So the sand box is definitely a spot you should keep your eye on for new, cool stuff that comes out of band from the regular release cycles from Microsoft.  As far as I know, there is not currently an RSS feed for the sand box, but if you watch Scott Guthrie's blog, he's usually good about talking up the stuff they put out there.

Saturday, July 1, 2006 9:44:01 PM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [0]  | 
# Tuesday, June 27, 2006

I recently got an email about a new service to add my blog RSS feed to Live.com (note the new icon on my blog, if you visit it).  For some time, I've been wanting to look into an AJAX-based web client for reading my blogs because I've found, like newsgroups, I just don't like having to set up everything I'm subscribed to on every computer I use.  At the same time, I do want a good UI.

Well, I've been putting off doing the research for it (and my blog reading has suffered for it).  Today I thought I'd check out what Live.com is like as an RSS reader, so I first tested using my new link to add to Windows Live.  It works and basically adds a little RSS reader gadget for my blog.  So then I thought I'd check out how it'd work with all my blogs, so I got the latest OPML (based on my blog roll here) and used Live.com's import feature to import them all.

At first, I was a bit disoriented because it said it imported but it wasn't showing them anywhere (I expected them to be put on the page I was looking at when I imported them).  But then I found them in the My Stuff section.  So I started building out my layout.

I started with the default two-column, but I quickly realized that wouldn't work, so I switched to four column, which seems to be just right at 1280x1024.  I knew I wouldn't want them all on one page, but I did want some categorization, so I came up with non-technical blogs, architecture blogs, and other technical blogs, one Live page for each.  Then, if it made sense, I categorized by column.  The results follow.

Other Technical Blogs
This is the "Other Technical Blogs" page.

Architecture Blogs
This is the "Architecture Blogs" page.

Now great, you may be thinking, I can use this as well.  Let me warn you, there were a couple MSDN blogs that repeatedly and totally hosed IE7 (I'm running Vista B2 x64 on this box).  I figured out which blogs they were and removed them from my stuff.  But even doing that, IE was still having problems, and as you can see from the image below, there's a reason for that.

Task Manager showing IE's resource usage

Note the top entry.  IE is running at 50% CPU, but this is a dual-core Athlon, so that's a whole core; on a single-core machine it'd be trying to use 100%.  And the memory usage is out of this world (350MB), even bigger than Visual Studio!

From the people I've talked to, having Live.com eat up CPU and RAM is not unusual.  Not being a Javascript and AJAX guru, I'll withhold any harsh judgments as I can readily imagine how it could be problematic.  But all I'm sayin' is that it ain't ready for primetime blog reading at a very basic level.

Beyond the performance issues, it also has no tracking of read/unread and no notification of new posts, both of which I think are indispensable for any kind of RSS reader.  Now, I understand that maybe I'm abusing what they intend the usage scenarios to be, but why else make it possible to subscribe to RSS feeds if not to be an RSS reader?  As it is, the gadget is only good for limited use, for maybe news services or the like where you don't care about having your read/unread tracked.

I will say that it has a neat little image capture feature where it'll grab any images in the feed and thumbnail them for you, even doing a fade in/out if there is more than one.  It also has neat little mouseover previews, which I like.  It's not totally unusable in terms of features, to be sure, but it would be nice to see a better blog reader gadget that maybe would offer some basic categorization, read tracking, and possibly some sort of notification, though I'm not sure how that'd fly given it is web based.  I'm going to keep trying out Live.com like this to see if they improve it.

On the positive side, this motivated me to blog about it and, in the process, try out a new blog authoring tool, WB Editor 2, based on the recommendations of John Forsythe.  It has a pretty friendly interface, is easy to set up (as these things go), and it is cheap.  This post is being authored with it, so if there are any issues, well, there you go, but it was easy to add the images, and it created the thumbnails for me and uploaded them along with the main images.  I also like that it has a plug-in architecture that is .NET based, even if it is 1.1.  So far, I like it even better than BlogJet.

Tuesday, June 27, 2006 3:10:45 PM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [0]  | 
# Thursday, June 22, 2006
Come see .NET guru Bill Wolff tonight as he presents on SQL Reporting Services for the central New Jersey user group!

Time:
6:15 - 6:30pm - Pizza
6:30 - 6:45pm - Intro, Announcements
6:45 - 8:15pm - Main Speaker
8:15 - 8:30pm - Raffle

For Directions

Check out NJ.NET for more info and to sign up for future meeting notices.

Spread the word!  I'll see you tonight!
Thursday, June 22, 2006 10:15:24 AM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [0]  | 
# Monday, June 19, 2006

Is anyone else as frustrated as I am with the multifarious password policies you run into across systems?  It seems like everyone and his brother has "the best" idea of what a strong password should be, which translates into having to keep up with N passwords and which systems they map to. 

That's bad enough, but then you have these people who think that making you change your password every N days is a good idea and that you can't use the last N passwords you've already used.  To make it worse, some brilliant minds out there think that forcing us to have "strong" usernames is a good idea too, so you end up with something like N^N permutations of usernames and passwords that you have to track. 

"So what?" you say.  "We've got a nifty 'Forgot Password' option on our site/app/etc.." 

But I have to ask, is that really ideal?  Perhaps if we didn't have to keep track of N^N passwords mapped in matrices to the N! systems we use, we wouldn't forget them so often! 

I'm not saying that having strong passwords is a bad idea, not at all.  I'm suggesting that we all work toward agreeing on what a strong password is and come up with, dare I suggest, standards based on data sensitivity.  So for instance, here are some ideas:

  1. If all you've got for a particular system is generic profile data, that would require a very low strength password, say minimum six characters, no special chars or numbers required. 
  2. Then you might have a next level for systems that keep your order history (but no financial data per se).  These kinds of systems might require eight characters with at least one number.
  3. You might then have systems that store financial data, such as credit cards, but are still a commerce system; these could require eight characters with at least one number and one special character.
  4. Then there are the actual banking, trading, etc. systems, and these might require ten characters with at least one number and special character.
  5. For systems above this level (e.g., company VPN), you would want to have some kind of dual authentication with a strong password and RSA tag, smart card, bio, etc.
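To make the tiers concrete, here's a minimal sketch of how such a standard might look in code.  The type and member names are entirely my own invention for illustration, and the rules are just the ones from the list above, not an actual proposed standard:

```csharp
using System;

// Hypothetical sensitivity tiers, mirroring the five levels above
// (the fifth level needs dual authentication, not just a password rule).
public enum DataSensitivity { Profile, OrderHistory, Commerce, Financial }

public static class PasswordPolicy
{
    public static bool IsAcceptable(string password, DataSensitivity tier)
    {
        bool hasDigit = false, hasSpecial = false;
        foreach (char c in password)
        {
            if (char.IsDigit(c)) hasDigit = true;
            else if (!char.IsLetter(c)) hasSpecial = true;
        }

        switch (tier)
        {
            case DataSensitivity.Profile:      // generic profile data only
                return password.Length >= 6;
            case DataSensitivity.OrderHistory: // order history, no financials
                return password.Length >= 8 && hasDigit;
            case DataSensitivity.Commerce:     // commerce site storing cards
                return password.Length >= 8 && hasDigit && hasSpecial;
            case DataSensitivity.Financial:    // banking, trading, etc.
                return password.Length >= 10 && hasDigit && hasSpecial;
            default:
                return false;
        }
    }
}
```

The point is that if something like this were standardized and published, every new system could pick a tier instead of inventing yet another policy.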

Anyways, the point is not necessarily that these are the best specific guidelines; I don't consider myself a security expert.  But I know enough to understand that what we have going on is not likely adding to our general security: in order to keep track of all these authentication tokens, we have to write them down somewhere, store them in some vault, file, or sticky pad, which in the end likely makes us less secure, and it certainly adds to both individual and organizational administrative overhead to manage password resets, fetches, and the like.

If we had standards like I'm suggesting that were well published, then every Joe who goes to write a new system would easily be able to put in place a policy that is secure, appropriate for the data being protected, and manageable for everyone involved.  If we only had maybe four passwords to remember, even if they're odd and funky (with special characters and numbers) or if they were pass phrases, we wouldn't have to write them down, forget them, or manage getting them reset all the time.  In other words, we'd be more secure and happier.  And if we do have such standards, they need to be far more publicized and talked about when the subject comes up because I've not heard of them, and I don't think I live in the proverbial cave.

Monday, June 19, 2006 1:53:29 PM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [1]  | 
# Sunday, June 18, 2006

I just finally got Windows Vista up and running on my DFI LANPARTY SLI-DR board (has nVidia's nForce4 chipset).  Plugged into that are an AMD Athlon Dual Core X2 3800+ chip, 2 GB RAM, 2 Western Digital 36.7 GB 10K SATA (setup as RAID 0), and an nVidia GeForce 6600 GT (not running SLI yet), among other less important peripherals. 

It wasn't easy getting this going.  nVidia has 64-bit Vista drivers for its chipset, but they're incomplete, and the instructions they post on their site don't work for me (and others).  Thankfully, someone else has put together an install guide, but even with that, it took me two tries to get it going (it didn't like my USB drive the first time, apparently).

The silly thing is that Vista B2 won't ask me for my drivers before it summarily decides that it can't find any information about my disks, so you have to start from an existing XP installation and run the installer from there and install on a secondary partition.  I hope they get this resolved by release because I'd really like to repartition my drives and install it on my C drive.  Maybe my blogging this with my specs will help others who are in a similar situation.

Anyways, it's up and running, and it is pretty nifty so far.  I'm one to go in for eye candy, and I love the new Flip3D and Glass (about all I've really had a chance to play with thus far).  I can say it is a bit annoying that when it prompts you to run something as admin, the whole screen blanks out; I don't know what that's all about.  Maybe it's intentional, just to ensure they have your attention...

One thing I can't seem to get working now is the gadgets.  I show the sidebar and it is just blank.  When I try to add gadgets, nothing happens.  Will google more for the answer...

Sunday, June 18, 2006 11:33:23 PM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [0]  | 
# Wednesday, June 14, 2006

If you're reading this and you attended my session on Monday morning at 9a but haven't yet filled out an evaluation, please do so.  I've been told the room seats over 800 and it was packed, but only 208 thus far have submitted evals.  I'd really like to know what EVERYONE thought, not just those few who've filled it out thus far.  It only takes a minute, and you get a chance to win an XBox if you do it sooner rather than later. 

Info again: 6/12/2006 - 9:00-10:15, WEB301 - Accelerating Web Development with Enterprise Library.

Just go to: http://msteched.com, log in, and go to fill out evals for breakouts (menu on left).

And remember, be honest but kind!  :-p

Wednesday, June 14, 2006 5:32:30 PM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [0]  | 
# Monday, June 5, 2006

As most of you know who follow my blog at all, I recently joined Infragistics.  Well, I finally got around to getting my company blog set up, so if you're curious or interested, feel free to check it out and maybe subscribe.  While you're there, if you are a customer or potential customer, you might want to look around at the other blogs and maybe subscribe to some of them to stay on top of Infragistics stuff.

Monday, June 5, 2006 11:16:05 AM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [0]  | 
# Sunday, June 4, 2006

I ran into an odd problem the other day that I figured I'd blog for any other poor souls stricken with the same enigma.  Without going into the details of why I was trying to setup the indexing service on my Windows XP SP2 box, I found that when I tried to get into it from MMC (Computer Management), it would give me an error when I tried to expand the Services and Applications node saying that it failed to initialize the snap-in for the indexing service.

Searching on various combinations of the error message really didn't help, on Google or MS.  Everything appeared to be in order (the service acted like it was running) except that it wouldn't run the snap-in, and when I looked at the Windows Components tab in Add/Remove Programs, it showed that Indexing Service was unchecked.  Even if I checked it and clicked Next (at which point it'd act like it was installing and configuring it), it would still show up as unchecked.

I had also noticed in recent days that I'd occasionally get one of those application crashed, do you want to debug messages about this SearchFilterHost.exe app.  When I first got the message, nothing came up for it on Google.  When I searched again on Friday, I found a few indicating that it was part of Office 2007 Beta 2, which I've been running since the day it was released, more or less.  I had kind of assumed that, but I just ignored the error and moved on.

Well, those two things gelled in my mind to suggest that maybe it was something with Office 2007 Beta 2 that was hosing up the Indexing Service.  More specifically, I suspected it had to do with the Windows Desktop Search that Outlook and OneNote 2007 prompt you to install.  On this hunch, I uninstalled the desktop search, and voila, my Indexing Service snap-in worked again, as did the program I was running that wanted to use it.

So the moral is that if you're having odd issues with Indexing Service, this is one thing you'll want to try.  It worked for me.  Now, I wish I could run the desktop search to optimize searching in Office.  I logged a bug on the beta site, but I figure my problem is probably just odd enough as to not be reproducible. :)  We'll see...

Sunday, June 4, 2006 8:40:25 PM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [9]  | 
# Tuesday, May 9, 2006

Just doing my part to spread the good word:

http://weblogs.asp.net/scottgu/archive/2006/05/08/445742.aspx

Tuesday, May 9, 2006 4:59:25 PM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [0]  | 
# Wednesday, May 3, 2006

Just thought I'd stick these out there for anyone else who might run across them.  Those of us reared under the friendly wing of SQL Server are in for regular surprises when interacting with Oracle...  But hey, what doesn't drive you mad makes you stronger, right?

1. Using a DDL statement inside a transaction automatically commits any outstanding DML statements.  I ran into this the other day when I was trying to have a transaction that added a row to a table and added a trigger (dependent on that row) to another table.  (This is actually part of my implementation of an OracleCacheDependency, which I intend to share in an article at some point.)  If you stepped through the code, everything appeared to function as expected, the exception would be thrown on the add trigger statement, RollBack would be called on the OracleTransaction, and... the new row would remain in the database.

It was actually driving me buggy.  I was beginning to wonder if Oracle supported ADO.NET transactions at all because every example (all two of them) that I could find looked just like my implementation.  I even tried both the System.Data.OracleClient and the Oracle.DataAccess.Client, which, by the way, require different implementations as the Transaction property on the Oracle-provided provider is read only (you have to create the commands from the connection after starting the transaction, which is, umm, inconvenient in some scenarios).

So I was pulling my hair out, about to give up, when I ran across a single line in the help docs that says "The execution of a DDL statement in the context of a transaction is not recommended since it results in an implicit commit that is not reflected in the state of the OracleTransaction object."

Okay, I guess I'm just spoiled by Microsoft (yes, I am), but I would expect an EXCEPTION to be thrown if I try to do this and not have the code happily carry on as if everything was hunky dory.  You'd think that a database that is picky enough to be case sensitive might be picky enough to not let you accidentally commit transactions.  And that leads in my #2 gotcha for the day.
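To see why this bites, here's roughly the shape of the code in question, sketched with System.Data.OracleClient.  The table and trigger names are invented for illustration; don't take this as my actual implementation:

```csharp
using System.Data.OracleClient;

public static class ImplicitCommitDemo
{
    // Sketch only: CACHE_KEYS and CACHE_KEYS_TRG are made-up names.
    public static void AddRowAndTrigger(string connString)
    {
        using (OracleConnection conn = new OracleConnection(connString))
        {
            conn.Open();
            OracleTransaction tx = conn.BeginTransaction();
            try
            {
                OracleCommand insert = conn.CreateCommand();
                insert.Transaction = tx;
                insert.CommandText =
                    "INSERT INTO CACHE_KEYS (KEY_NAME) VALUES ('sample')";
                insert.ExecuteNonQuery();

                OracleCommand ddl = conn.CreateCommand();
                ddl.Transaction = tx;
                // This is DDL: merely executing it implicitly commits the
                // INSERT above, with no exception and no change to tx's state.
                ddl.CommandText = "CREATE TRIGGER CACHE_KEYS_TRG ...";
                ddl.ExecuteNonQuery(); // suppose this statement throws

                tx.Commit();
            }
            catch
            {
                tx.Rollback(); // too late: the INSERT is already committed
                throw;
            }
        }
    }
}
```

If the CREATE TRIGGER fails, the catch block runs, Rollback returns without complaint, and yet the new row is still sitting in the database, exactly the behavior described above.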

2. Oracle is case sensitive when comparing strings.  Let me say that again (so I'll remember it).  Oracle is case sensitive when comparing strings.  Now this point, in itself, is not particularly gotchaful; however, when coupled with a red herring bug report, it can really sneak up on ya and bite ya in the booty.  This is just one of those things that you need to keep in the upper buffers when debugging an app with Oracle.

3. (This one is just for good measure; I ran into it a while back.)  Oracle 10 no longer uses the old Oracle names resolution service.  This means that if you try to use the nifty Visual Studio Add-in and your organization is still using the old Oracle names resolution, you'll have to create manual entries in your tnsnames.ora file(s) just so that you can connect.  Even when you do this, it has to be just so or it won't work. 

I've had it where you can connect in the Net Manager but can't connect in the Oracle Explorer using the connections, which it sees and reads from the tnsnames file.  In particular, if I removed the DNS suffix from the name of the connection (to make it pretty), it wouldn't work.  It'd see the connection but not be able to connect.

4. (Another oldie, but importantie.)  Oracle, as of now, does not support ADO.NET 2 System.Transactions at all, if you use the Oracle-provided provider.  From what I could tell, although I wasn't able to test successfully, the Microsoft-provided one looks like it should, at least it should use DTC, but the jury is out.  Feel free to post if you've gotten it to work.

5. There is no ELMAH provider for Oracle.  I implemented one, though, and will be sharing in an article at some point.  Feel free to email me for it in the meantime.

6. There is no Oracle cache dependency.  See #5.

7. There is no Oracle roles, membership, etc. provider.  Sorry, I've not done that yet.

There are other bumps and bruises that you will get when dealing with Oracle if your main experience is SQL Server.  Many of them are just due to lack of familiarity, but there are some issues that I think truly make it a less desirable environment to work with.  So I thought I'd just share a few of them here for others who might find themselves in similar binds and need the help, which is so hard to find for Oracle.

Wednesday, May 3, 2006 2:34:41 PM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [0]  | 
# Saturday, April 29, 2006

I just updated this site to the latest version of dasBlog.  Many, many thanks to Scott for helping me out with getting it (given that I am a total noob to CVS and, apparently, picked a bad time to start since SF was having issues).  Most notably (that I know of), this version incorporates using Feedburner, which I guess is the latest and greatest for distributing your feed and lowering bandwidth usage, though I'm sure there are some other goodies in there.

Anyhoo, let me know if you suddenly start running into any problems with my blog.  Have a good un!

Saturday, April 29, 2006 2:19:18 PM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [0]  | 
# Thursday, April 27, 2006

I decided to try something new in ASP.NET 2.0 based on our needs on a current project.  To do this, I followed the tip that K. Scott Allen shares in his blog.  The difference in my case is that I thought I'd make the user control in App_Code the base control for a handful of others.  The point here is to achieve visual polymorphism.  Put another way, I want to load up the control differently based on the subclasses rather than, say, having a big switch statement.  This is just good OOD.

The problem is that if you just set a user control in the App_Code directory as a base for other user controls, you will most likely see the following exception.  I was able to consistently reproduce it.  When ASP.NET constructs your derived user control, .NET, as usual, calls the base constructors as well.  When the base constructor is called like this (for reasons beyond my ken), I see this error:

 System.Web.HttpException: An error occurred while try to load the string resources (FindResource failed with error -2147023083).

Looking at what is actually happening (in the call stack), the error appears to be in a call that the parsed constructor makes to System.Web.StringResourceManager.ReadSafeStringResource(Type t).  If you Google like I did, you probably won't find any help (except this blog now, of course).  So on a hunch, I called the LoadControl overload like Scott Allen suggests in his piece, and it loaded fine.  On a further hunch, I then tried to use the derived control again, and voila, it worked fine.

So, since I don't have time to open a case with Microsoft, I just created that as a workaround.  I have the application call LoadControl once on the base control so that ASP.NET will properly set it up for use, and then I can use it as a base.  You could do this (presumably) in Global.asax, but I just put it in the page that uses the control(s) in question.

static bool _baseUCLoaded = false;

protected override void OnInit(EventArgs e)
{
  base.OnInit(e);
  if (!_baseUCLoaded)
  {
    // Load the base control once so ASP.NET compiles it and sets up
    // its string resources; the returned instance itself isn't needed.
    this.LoadControl(typeof(ASP.MyUserControlBase), null);
    _baseUCLoaded = true;
  }
}

And it appears to work.  Maybe someone will enlighten me on this, but I have a hunch that it's just an unexpected scenario that wasn't covered by testing, and my workaround is admittedly a hack.  In any case, it's pretty neat that it works.

Thursday, April 27, 2006 10:41:34 AM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [0]  | 
# Monday, April 24, 2006

Not long ago, I polled subscribers as to what they're interested in.  There seemed to be a fairly even divide between what I'll roughly call Technical posts and Non-Technical posts.  In fact, my goal with this blog is to be a blend of those two general categories.  At the same time, as much as it hurts to admit it, I know that some folks really don't care about my opinions on non-technical matters.  So it struck me (some time ago, actually; I've just been lazy) to create two general categories using the creative taxonomy of Technical and Non-Technical. 

Why?  Because dasBlog (and most other blog systems, I imagine) allows you to subscribe to category-based RSS feeds as well as view posts by category.  So from this day forward, in addition to the more specific categories, I'll be marking all posts as either Technical or Non-Technical.  If all you care about is one or the other, you can just subscribe to that one and never be bothered with the stuff you don't care about.

You can view/subscribe to the feeds using the feed icon next to each category in the list (of categories).  Here are direct links as well:

Technical

Non-Technical

I hope this helps!

Monday, April 24, 2006 10:28:33 AM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [0]  | 

In a recent post, I mentioned the acronym OPC, meaning "Other People's Code."  Somehow I doubt I'm the first person to use it, so I don't intend to claim it as my own.  I can't say I've seen it before, but it seems so obvious that it should exist, because OPC is so prevalent and we should all be talking about it much more than we do.  In an industry that arguably values reuse over every other virtue, you'd think that OPC would have long been canonized.

Yet it seems to me that when most people speak of reuse, they mean Their Own Code (TOC; I love overloading acronyms!) or, from their perspective, My Own Code (MOC).  In essence, they want other people to reuse their code, but there ain't no chance in heck that they're going to use OPC as a means to achieve the ultimate goal.  I want MOC to be reusable.  How can I impress my friends by writing code that can be reused by as many other people as possible?  This is something I think most of us who strive to be great at software think at one point or another, perhaps not in so many words; ultimately, there is a sense of great pride when you polish off that last speck on your chrome-plated masterpiece, showing it to your buddies or the world in general and saying "that's MOC." 

The funny thing is that, more often than not, the really ardent software folks among us, and even the less ardent, have a predilection for NIH (Not Invented Here) syndrome.  It's because we're all so damned smart, right?  Surely those other folks at NIH Co. couldn't possibly have done it as well as I could have!?  Of course, we've got all the rationalizations lined up for when the business folks ask:

1) "I can't support that because I don't know it--I need the source code."
2) "You know, it won't meet our needs just right, not like I could do it for you custom."
3) "How much?  Geez.  I could write that in a day!"
4) "It's not using X, which you know is our preferred technology now."
5) "Did they promise to come work here if they dissolve the company?  I mean, you're just gambling on them."

And the list goes on.  We've probably all done it; I know I have.  Why?  Because, as one developer friend once put it (paraphrased): "I love to invent things.  Software is an industry where you get to invent stuff all the time."  In other words, we're creative, smart people who don't feel that we're adequately getting to express our own unique intelligence unless we write the code ourselves.

And now we finally come to what prompted this post.  I recently looked over an article by Joshua Greenberg, Ph.D. on MSDN called "Building a Rule Engine with SQL Server."  I'm not going to comment on the quality of the solution offered because I hardly think I am qualified to do so.  What I was completely flabbergasted by is the total omission of the rules engine being built into Windows Workflow Foundation.  Surely someone who has put that much thought into the theory behind rules engines, which, as is mentioned in his conclusion, are probably best known in workflow systems, would be aware of WF's own?  Surely one of the editors at MSDN Mag, which has done numerous articles on WF, including one on the engine itself published in the same month, would think it worth noting and perhaps comparing and contrasting the approaches?
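For reference, driving WF's built-in rules engine directly takes only a few lines.  Here's a rough sketch using the System.Workflow.Activities.Rules API (the Order type and the discount rule are invented for illustration; in practice you'd typically author rules in the designer and deserialize the RuleSet rather than build it in code):

```csharp
using System.CodeDom;
using System.Workflow.Activities.Rules;

// Hypothetical target type the rules run against.
public class Order
{
    public decimal Total;
    public decimal Discount;
}

public static class RulesDemo
{
    public static void Run()
    {
        // Build a rule: IF Total > 100 THEN Discount = 10.
        // WF rules are expressed as CodeDom trees, where "this"
        // refers to the rule set's target object.
        Rule discountRule = new Rule("BigOrderDiscount");
        discountRule.Condition = new RuleExpressionCondition(
            new CodeBinaryOperatorExpression(
                new CodeFieldReferenceExpression(
                    new CodeThisReferenceExpression(), "Total"),
                CodeBinaryOperatorType.GreaterThan,
                new CodePrimitiveExpression(100m)));
        discountRule.ThenActions.Add(new RuleStatementAction(
            new CodeAssignStatement(
                new CodeFieldReferenceExpression(
                    new CodeThisReferenceExpression(), "Discount"),
                new CodePrimitiveExpression(10m))));

        RuleSet rules = new RuleSet("OrderRules");
        rules.Rules.Add(discountRule);

        // Validate the rule set against the target type, then
        // execute it against an instance.
        Order order = new Order();
        order.Total = 250m;
        RuleValidation validation = new RuleValidation(typeof(Order), null);
        RuleExecution execution = new RuleExecution(validation, order);
        rules.Execute(execution);
        // If the rule fired as expected, order.Discount is now 10.
    }
}
```

Not a line of SQL in sight, and you get chaining, priorities, and forward-checking for free.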

Now, I don't want to draw too much negative attention to the article or Mr. Greenberg.  He and the editors are no more guilty of ignoring OPC than most of us are.  It is just a prime example of what we see over and over again in our industry.  On the one hand, we glorify reuse as the Supreme Good, but then we turn around and, when reusable code (a WinFX library, no less!) is staring us in the face, an SEP ("Somebody Else's Problem") field envelops reuse, enabling us to conveniently ignore OPC and start down the joyous adventure of reinventing the wheel.

This has got to stop, folks.  I'm not saying that this ignorance of OPC is the primary cause of the problems in our industry (I happen to think it is only part of the greater problem of techies not getting the needs of business while being smart enough to hide it).  But it is certainly one that rears its ugly head on a regular basis, as we guiltily slap each others' backs in our NIHA (NIH Anonymous) groups.  We have a responsibility to those who are paying us, and a greater responsibility to the advancement of our industry (and ultimately the human race), to stop reinventing the wheel and start actually reusing OPC.  I'm not saying there is never a justification for custom code (God forbid!), but that custom code needs to address something that truly cannot be adequately addressed by OPC. 

There will always be plenty of interesting problems to solve, which give way to creative and interesting solutions.  Just imagine if all this brainpower that goes into re-solving the same problems over and over again were to go into solving new problems.  Where would we be now?  Now that's an interesting possibility to ponder.

Monday, April 24, 2006 10:21:02 AM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [2]  | 

Disclaimer
The opinions expressed herein are solely my own personal opinions, founded or unfounded, rational or not, and you can quote me on that.

Thanks to the good folks at dasBlog!

Copyright © 2014 J. Ambrose Little