# Friday, June 20, 2008

I'm not sure why this didn't occur to me before...  I recently read another brief article about the negative impact of email on productivity, so I was thinking about a way to deal with it that didn't involve, e.g., closing Outlook or even setting an "I'm not available by email until 3p today" out-of-office type message--that seems a bit extreme, and it would also preclude my getting meeting reminders. 

It occurred to me that what usually happens is I get the nifty little toaster popup notification while doing something, almost always click on it for more detail, and then get drawn into a distraction over it.  Similarly, I was using one of those Gmail Vista gadgets that would highlight when I had Gmail waiting, or I'd leave it open and minimized and see the Inbox count in the taskbar.  The problem was not (for me) so much getting too much email as having the regular interruptions that were occasioned by these terribly useful notification mechanisms. 

Having isolated the problem, i.e., having framed the question correctly (which is usually the most important part of solving a problem), I asked "How can I make these notifications go away?"  And the answer was immediately apparent: turn them off. :)

To that end, I went into Outlook advanced email options (Tools -> Options -> Email Options -> Advanced Email Options--who knew notifications were advanced?!) and deselected all the notification options:

Advanced E-mail Options Dialog

I then removed the Gmail notifier gadget, and I close my Gmail when done with it.  The magic is that I still get my task and meeting reminders, but I don't get the regular interruptive notifications.  This had an immediate noticeable effect--I could work through to a good stopping point on the thing I was working on, i.e., a point I'd normally take a break, and then I'd check my email.  Wow!  Who knew something so simple could make such a difference?  I figure if it is critical, somebody will call or come knocking on my door. :)

As a complementary technique, I have taken my Inbox strategy to the next level, following a bit of advice given by Mark Hurst (who wrote a book on Bit Literacy [that I haven't read]).  One of his suggestions for avoiding information overload is to keep your Inbox empty.  I already worked to do that because I used my Inbox like a to-do list (and don't like having a long to-do list), but Mark's advice is precisely not to do that--use it as an Inbox and get stuff out of it immediately. 

Having not read the book (in which I'm sure are tons of helpful little tidbits), I take that to mean act on it immediately if possible, file it if need be, or set up a task to do something with it later.  I was already doing the first two, but I've found this additional third technique to be a nice add.  There is a distinct satisfaction (for me anyway) to having an empty inbox--maybe it's my personality type. :)

I hope this maybe helps others out there in the same boat.

Friday, June 20, 2008 5:28:31 PM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [0]  | 
# Thursday, May 15, 2008

I haven't done any research, so maybe it is out there.  But I had a thought the other day as I accepted yet another invite to connect from yet another social networking site from someone I have connected with numerous times. 

Wouldn't it be great if I could have one, unified set of social contacts, my social network, that I could then just share out to various social networking sites?  I mean, sure, folks would have to opt into it, someone would have to think about the privacy issues, but good grief, it seems like we need something like that...

Thursday, May 15, 2008 1:02:49 PM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [1]  | 
# Friday, March 28, 2008

I finally gave in and bought a graphics tablet.  My budget being as huge as it was, I opted for the Wacom Bamboo, which retails at $79, but ANTOnline (via Amazon) had it for $50 plus shipping ($58 total).  I haven't been this tickled to get a new gadget in a while.

The whole experience thus far has been grand.  I placed the order at about 10p on Tuesday night.  I got an email Wednesday night saying it had shipped, and when I opened it Thursday morning and clicked the tracking number, I was informed it was out for delivery--and I paid for standard shipping.  Awesome.

I got the box later Thursday morning, and opened it to find a sleek box wrapped in tissue paper, as if it were a gift.  After sliding it out of the tissue paper, here's what I saw:
Wacom Bamboo Box

Not bad styling.  Let's open 'er up:
Wacom Bamboo Welcome Messages

"This is your Bamboo.  Use it to get more out of your computer.  Let us know how it goes..."  In many languages.  Then it is signed by, presumably, the creators.  Very nice touch, I thought.  I felt like a proud owner already.  Then you lift up that insert, and there's the tablet in all its beauty.  Grab it out--there's the cord, the pen, the pen holder.  Great.  Simple. Obvious.  Beneath that is another tissue wrapped gift, a stylish little black box that has some simple instructions on getting going and the DVD.

Wacom Bamboo Open Box

Just opening the thing was a pleasure.  Honestly, these folks know what UX is, and this is just for an $80 graphics tablet. 

I plugged it in, and it immediately just worked.  Having read a comment somewhere, I just went to the Web site to download the latest drivers.  That was easy.  Install.  I had to try twice; it got hung up for some reason, but then, I did have 30 apps open at the time and they did suggest closing them all. :)

I immediately opened OneNote and went to town.  I started drawing the simple stuff as Dan Roam suggests in his new book, The Back of the Napkin.  (I attended his session at Mix and liked it enough to buy the book.)  Then I really went out on a limb and drew a self-portrait:

Ambrose Self Portrait

Not bad, eh? 

Well, it was a first shot.  I tried writing and realized just how bad my penmanship has become over the years.  Trust me; it's bad.  Nice thing is that maybe I'll get some of it back and improve it now that I have this (who knows?). 

I'm now on Day 2 of using my Bamboo, and I really like it.  My wrist, which had been hurting more as of late, has been loving me.  One of the reasons I tried this was to see if it'd be better to avoid "repetitive strain injury," and I noticed an immediate difference.  The other reason was because I get so tired of being constrained by drawing programs in terms of what I want to represent visually.  SmartArt in Office really, truly (as cool as it is) only goes so far. :)

So my first real use was to start diving into my Agile UX Design Process diagram to replace a particularly painful slide (Slide 19) in my Building Good UX talk.  It (both the drawing and the process) is a work in progress; just trying to visualize some of my thinking about it right now.

Agile UX Design Process

If you look hard, you can see my chicken scratch compared to the nice, free Journal font I picked up.  The point of this diagram is to show how to integrate UX pros into an Agile process.  Not saying this is all fleshed out or perfect, but it's a start. :)  One important point is that even if you don't have the pros, you can start doing the UX stuff yourself.

A Few Tips Using Bamboo (thus far)

  1. Use Mouse mode.  When you install the driver, it switches to Pen mode, which tries to map your screen(s) to the tablet.  Even though Wacom recommends this mode (and even provides exercises to get used to it), I found it frustrating when trying to draw on my right screen--I felt too close to the edge for comfort. 
  2. Disable acceleration.  While it can be a nice feature when using it literally like a mouse, it messes you up when drawing.
  3. Switch to the dreaded single-click mode in Explorer.  Back when the single click mode was added (XP?), I tried it out and was disgusted.  But double-clicking w/ the pen is just not easy, and actually, the single-click mode feels really natural with the pen.
  4. Switch to scroll on touch ring. I don't feel too strongly about this, but honestly, I don't use zoom (the default) enough to have it as a top-level feature on the tablet.
  5. Upgrade to Vista?  It seems you don't get ink in Office 2007 w/o Vista.  I can't figure it out for certain, but it's not there for me on XP.  The Wacom site mentions Vista explicitly, and my searches haven't turned up anything useful.  Folks talk about "Start Inking" as if it is just always there, but it may also have something to do with Tablet PC.  I'll let you know if I figure it out.

It is taking some getting used to, of course, but so far I think it's a big improvement.  Ask me in a few weeks. :)

And now for the gratuitous signature:

J. [Ambrose] Little

Nice.

Friday, March 28, 2008 5:32:00 PM (Eastern Standard Time, UTC-05:00)  #    Disclaimer  |  Comments [2]  | 
# Saturday, December 22, 2007

I've been getting friendly with Windows Live lately, and after getting terribly tired of having to switch to HTML view in Windows Live Writer in order to insert a note (could be a footnote or endnote depending on how you look at it), I decided to see if I could write a plug-in to make my life easier.

So was born the Blog Notes plug-in.  Unfortunately, there is no extensibility point for just marking up existing text (e.g., adding a superscript button to the markup toolbar), so I had to go with the option to insert some HTML using the interface.  I really was trying to keep it simple and lightweight (for my own sanity), so it is pretty basic.

The functionality is pretty straightforward.  Thanks to Mark James for the free icons.  Once the plug-in is installed, you should see an "Insert Blog Notes..." option in the Insert pane on the right side as shown below.

Insert Blog Notes in Insert Pane

Clicking on it brings up the Blog Notes dialog:

Blog Notes Dialog

Clicking "New Note" will insert a new superscript number (the next one in the sequence).

Clicking "Reference Note" will insert the selected number as superscript.  You can also just double-click the number to do that.

The "Notes Section" button will insert a notes section.1

Lastly, "Write Note" simply adds the selected note plus a period and couple spaces.

As you can see, it's pretty basic, but it saves a few seconds for each note (assuming you bother to switch to HTML view, find the number, and put <sup></sup> tags around it like I do [did]).  You can also tweak one option/setting.  Go to Tools -> Options, and select the Plug-ins tab:

Live Writer Plug-ins Options

Clicking Options... on the Blog Notes plug-in brings up a très simple dialog:

Blog Notes Options

This one option will toggle whether or not the plug-in uses in-page anchor links for the notes so that the superscript numbers would link down to the corresponding note in the Notes section.  I originally added this feature without realizing the implications.  Because blog posts are often aggregated and otherwise viewed in unexpected places, using in-page anchors is iffy at best.  Community Server seems to strip them out, and dasBlog keeps them, but since it emits a <base /> tag to the site root, all of the anchor links are relative to the site homepage instead of the current post, which effectively renders them useless.  I looked at the dasBlog code where this happens, and it's in the core assembly.  I was concerned what side effects changing it to use the current URL would have, so I didn't do that.  But if you have blog software that will let you use this feature, by all means, enjoy!
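To illustrate the anchor problem, here is a simplified sketch (not dasBlog's actual markup): once the page head contains a `<base />` pointing at the site root, every in-page anchor resolves against that root instead of the current post's URL.

```html
<head>
  <!-- emitted by the blog engine -->
  <base href="http://example.com/" />
</head>
<body>
  <!-- intended: jump down this page; actual target: http://example.com/#note1 -->
  <a href="#note1">1</a>
</body>
```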

Caveats

  • Because of the way the plug-in framework works, I use a static/shared collection to keep track of the notes.  This means it acts a tad goofy if you close out of Live Writer or write multiple posts while it is open.  If you close and come back to a post, the notes count is reset.  To "fix" this, just re-add however many notes you had (if you want to bother).  If you write multiple posts, you just have to deal with it.  I don't know if there is post-local storage for plug-ins, but I didn't have time to dig into it.
  • Your mileage may vary.  I wrote this mainly to save myself time and get familiar with the Live Writer extensibility model, so it ain't a finished product to be sure.
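The static-collection caveat above behaves roughly like this sketch (illustrative names, not the plug-in's actual code): because the collection is static, every post open in the same Live Writer process shares one counter, and closing Writer resets it.

```csharp
using System.Collections.Generic;

// Illustrative sketch of shared note state; not the real plug-in source.
public static class BlogNotesState
{
    // Static, so all plug-in instances in the process share this list,
    // and it is lost when the process exits.
    private static readonly List<int> notes = new List<int>();

    public static int AddNote()
    {
        int next = notes.Count + 1; // next number in the sequence
        notes.Add(next);
        return next;
    }
}
```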

Get It!
Since there are numerous tutorials on the Web (that I learned from) to write Live Writer plug-ins, I won't go into those details here, but you're welcome to download my code and learn from it directly if you want.  I think I have comments and such in there.

  • Download the Plug-in Only - If you just want to use this plug-in, this is what you want.  Drop the DLL into your plug-ins directory and go (typically C:\Program Files\Windows Live\Writer\Plugins).
  • Download the Source Code - This is a VS 2008 solution for those who want to learn, enhance, extend, whatever.  The license is more or less the MIT license.  You'll need Live Writer installed to reference its API.

Notes
1. This is the "Notes Section."  The button adds the "Notes" header and writes out any existing note numbers.

Saturday, December 22, 2007 2:07:12 PM (Eastern Standard Time, UTC-05:00)  #    Disclaimer  |  Comments [0]  | 
# Monday, September 17, 2007

When searching for further reading on "domain model" to link from a recent post, I was quite surprised to find that there seemed to be no good definition readily available (at least not by Googling "domain model").  Since I tend to use this term a lot, I figured I'd try to fill this gap and, at the very least, provide a reference for me to use when I talk about it.

So What is a Domain Model?
Put simply, a domain model is the software model of a particular domain of knowledge (is that a tautology?).  Usually, this means a business domain, but it could also mean a software domain (such as the UI domain, the data access and persistence domain, the logging domain, etc.).  More specifically, this means an executable representation of the objects in a domain with a particular focus on their behaviors and relationships1.

The point of the domain model is to accurately represent these objects and their behaviors such that there is a one-to-one mapping from the model to the domain (or at least as close as you can get to this).  The reason this is important is that it is the heart of software solutions.  If you accurately model the domain, your solution will actually solve the problems by automating the domain itself, which is the point of pretty much all business software.  It will do this with much less effort on your part than other approaches to software solutions because the objects are doing the work that they should be doing--the same that they do in the physical world.  This is part and parcel of object-oriented design2.
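A minimal sketch of what a behavior-focused domain object might look like, assuming a simple ordering domain for illustration (Order, Product, and friends are hypothetical names, not from any framework):

```csharp
using System;
using System.Collections.Generic;

public class Product
{
    public Product(string name, decimal price)
    {
        Name = name;
        Price = price;
    }
    public string Name { get; private set; }
    public decimal Price { get; private set; }
}

public class OrderLine
{
    private readonly Product product;
    private readonly int quantity;

    public OrderLine(Product product, int quantity)
    {
        this.product = product;
        this.quantity = quantity;
    }

    // Behavior, not exposed data: the line knows how to price itself.
    public decimal Subtotal()
    {
        return product.Price * quantity;
    }
}

public class Order
{
    private readonly List<OrderLine> lines = new List<OrderLine>();

    // The domain action a clerk would perform, rather than a bag of setters.
    public void AddLine(Product product, int quantity)
    {
        if (quantity <= 0)
            throw new ArgumentOutOfRangeException("quantity");
        lines.Add(new OrderLine(product, quantity));
    }

    public decimal CalculateTotal()
    {
        decimal total = 0m;
        foreach (OrderLine line in lines)
            total += line.Subtotal();
        return total;
    }
}
```

The point is that the objects carry the behaviors they'd have in the real domain; the data they hold exists to support those behaviors.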

Nothing New
By the way, this is not a new concept--OO theory and practice have been around for decades.  It's just that somewhere along the line, the essence of objects (and object-oriented design) seems to have been lost or at least distorted, and many, if not most, Microsoft developers have probably not been exposed to it, have forgotten it, or have been confused into designing software in terms of data.  I limit myself to "Microsoft developers" here because they are the ones with whom I have the most experience, but I'd wager, from what I've read, the same is true of Java and other business developers. 

I make this claim because everyone seems to think they're doing OO, but concrete examples of OOD using Microsoft technologies are few and far between.  Those who try seem to be more concerned with building in framework services (e.g., change tracking, data binding, serialization, localization, and data access & persistence) than actually modeling a domain.  Not that these framework services are unimportant, but it seems to me that this approach is fundamentally flawed because the focus is on software framework services and details instead of on the problem domain--the business domain that the solutions are being built for. 

The Data Divide
I seem to write about this a lot; it's on my mind a lot3.  Those who try to do OOD with these technologies usually end up being forced into doing it in a way that misses the point of OOD.  There is an unnatural focus on data and data access & persistence.  Okay, maybe it is natural or it seems natural because it is ingrained, and truly a large part of business software deals with accessing and storing data, but even so, as I said in Purporting the Potence of Process4, "data is only important in as much as it supports the process that we're trying to automate." 

In other words, it is indeed indispensable but, all the same, it should not be the end or focus of software development (unless you're writing, say, a database or ORM).  It may sound like I am anti-data or being unrealistic, but I'm not--I just feel the need to correct for what seems to be an improper focus on data.  When designing an application, think and speak in terms of the domain (and continue to think in terms of the domain throughout the software creation process), and when designing objects, think and speak in terms of behaviors, not data. 

The data is there; the data will come, but your initial object models should not involve data as a first class citizen.  You'll have to think about the data at some point, which will inevitably lead to specifying properties on your objects so you can take advantage of the many framework services that depend on strongly-typed properties, but resist the temptation to focus on properties.  Force yourself to not add any properties except for those that create a relationship between objects; use the VS class designer and choose to show those properties as relationships (right-click on the properties and choose the right relationship type).  Create inheritance not based on shared properties but on shared behaviors (this in itself is huge).  If you do this, you're taking one step in the right direction, and I think in time you will find this a better way to design software solutions.
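Here is a rough sketch of inheritance driven by shared behavior rather than shared properties, as suggested above (CreditCard and PurchaseOrder are illustrative names): the two subclasses share almost no data, but both can authorize a payment, and that shared behavior is what justifies the hierarchy.

```csharp
using System;

public abstract class PaymentMethod
{
    // The shared behavior that defines the family of types.
    public abstract bool Authorize(decimal amount);
}

public class CreditCard : PaymentMethod
{
    public override bool Authorize(decimal amount)
    {
        // In a real model this would call out to a card processor.
        return amount > 0m;
    }
}

public class PurchaseOrder : PaymentMethod
{
    private readonly decimal creditLimit;

    public PurchaseOrder(decimal creditLimit)
    {
        this.creditLimit = creditLimit;
    }

    public override bool Authorize(decimal amount)
    {
        // Different data, different rules, same behavioral contract.
        return amount > 0m && amount <= creditLimit;
    }
}
```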

My intent here is certainly not to make anyone feel dumb, stupid, or like they've wasted their lives in building software using other approaches.  My intent is to push us towards what seems to be a better way of designing software.  Having been there myself, I know how easy it is to fall into that way of thinking and to imagine that simply by using these things called classes, inheritance, and properties that we're doing OOD the right way when we're really not.  It's a tough habit to break, but the first step is acknowledging that there is (or at least might be) a problem; the second step is to give object thinking a chance.  It seems to me that it is (still) the best way to do software and will continue to be in perpetuity (because the philosophical underpinnings are solid and not subject to change).

Notes
1. An object relationship, as I see it, is a special kind of behavior--that of using or being used.  This is also sometimes represented as a having, e.g., this object has one or more of these objects.  It is different from data because a datum is just a simple attribute (property) of an object; the attribute is not an object per se, at least not in the domain model because it has no behaviors of its own apart from the object it is attached to.  It is just information about a domain object.

2. I go into this in some depth in the Story paper in the Infragistics Tangerine exemplar (see the "To OOD or Not to OOD" section).  I use the exemplar itself to show one way of approaching domain modeling, and the Story paper describes the approach.

3. Most recently, I wrote about this in the Tangerine Story (see Note 2 above).  I also wrote publicly about it back in late 2005, early 2006 in "I Object," published by CoDe Magazine.  My thought has developed since writing that.  Interestingly, in almost two years, we seem to have only gotten marginally better ways to deal with OOD in .NET. 

4. In that article, I put a lot of focus on "process."  I still think the emphasis is valid, but I'd temper it with the caveat that however business rules are implemented (such as in the proposed workflow-driven validation service), you still think of that as part of your domain model.  The reason for separating them into a separate workflowed service is a compromise between pragmatism and idealism given the .NET platform as the implementation platform.  I've also since learned that the WF rules engine can be used apart from an actual .NET workflow, so depending on your application needs, just embedding the rules engine into your domain model may be a better way to go than using the full WF engine.  If your workflow is simple, this may be a better way to approach doing validation.

Monday, September 17, 2007 11:41:54 AM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [0]  | 
# Saturday, September 15, 2007

As I sit here on my deck, enjoying the cool autumn breeze1, I thought, what better thing to write about than Web services!  Well, no, actually I am just recalling some stuff that's happened lately--on the MSDN Architecture forums and in some coding and design discussions we had this week, both of which involved the question of best practices for Web services.

Before we talk about Web services best practices, it seems to me that we need to distinguish between two kinds of application services.  First, there are the services that everyone has been talking about for the last several years--those that pertain to service-oriented architecture (SOA).  These are the services that fall into the application integration camp, so I like to call them inter-application services. 

Second, there are services that are in place to make a complete application, such as logging, exception handling, data access and persistence, etc.--pretty much anything that makes an application go and is not a behavior of a particular domain object.  Maybe thinking of them as domain object services would work, but I fear I may already be losing some, so let's get back to it.  The main concern of this post is those services used within an application, so I call them intra-application services.

It seems like these latter services, the intra-application ones, are often confused with the former--the inter-application services.  It's certainly understandable, because there has been so much hype around SOA in recent years that the term "service" has been taken over and has lost its more generic meaning.  What's worse is that there has been a lot of confusion around the interaction of the terms Web service and plain service (in the context of SOA).  The result is that you have folks thinking that all Web services are SO services, and sometimes that SO services are always Web services.

My hope here is to make some clarification as to the way I think we should be thinking about all this.  First off, Web services are, in my book at least, simply a way of saying HTTP-protocol-based services, usually involving XML as the message format.  There is no, nor should there be, any implicit connection between the term Web service and service-oriented service.  So when you think Web service, don't assume anything more than that you're dealing with a software service that uses HTTP and XML. 

The more important distinction comes in the intent of the service--the purpose the service is designed for.  Before you even start worrying about whether a service is a Web service or not, you need to figure out what the purpose of the service is.  This is where I get pragmatic (and those who know me know that I tend to be an idealist at heart).  You simply need to determine if the service in question will be consumed by a client that you do not control. 

The reason this question is important is that it dramatically affects how you design the service.  If the answer is yes, you automatically take on the burden of treating the service as an integration (inter-application) service, and you must concern yourself with following best practices for those kinds of services.  The core guideline is that you cannot assume anything about the way your service will be used.  These services are the SO-type services that are much harder to design correctly, and there is tons of guidance available on how to do them2.  I won't go in further depth on those here.

I do think, though, that the other kind of services--intra-application services--have been broadly overlooked or just lost amidst all the discussion of the other kind.  Intra-application services do not have the external burdens that inter-application services have.  They can and should be designed to serve the needs of your application or, in the case of cross-cutting services (concerns), to serve the needs of the applications within your enterprise.  The wonderful thing about this is that you do have influence over your consumers, so you can safely make assumptions about them to enable you to make compromises in favor of other architectural concerns like performance, ease of use, maintainability, etc.

Now let's bring this back to the concrete question of best practices for intra-application Web services.  For those who are using object-oriented design, designing a strong domain model, you may run into quite a bit of trouble when you need to distribute your application across physical (or at least process) tiers.  Often this is the case for smart client applications--you have a rich front end client that uses Web services to communicate (usually for data access and persistence).  The problem is that when you cross process boundaries, you end up needing to serialize, and with Web services, you usually serialize to XML.  That in itself can pose some challenges, mainly around identity of objects, but with .NET, you also have to deal with the quirks of the serialization mechanisms.

For example, the default XML serialization is such that you have to have properties be public and read-write, and you must have a default constructor.  These can break encapsulation and make it harder to design an object model that you can count on to act the way you expect it to.  WCF makes this better by letting you use attributes to have better control over serialization.  The other commonly faced challenge is on the client.  By default, if you use the VS Add Web Reference, it takes care of the trouble of generating your service proxies, but it introduces a separate set of proxy objects that are of different types than your domain objects.

So you're left with the option of either using the proxy as-is and doing a conversion routine to convert the proxy objects to your domain objects, or you can modify the proxy to use your actual domain objects.  The first solution introduces both a performance (creating more objects and transferring more data) and a complexity (having conversion routines to maintain) hit; the second solution introduces just a complexity hit (you have to modify the generated proxy a bit).  Neither solution is perfectly elegant--we'd need the framework to change to support this scenario elegantly; as it is now, the Web services stuff is designed more with inter-application services in mind (hence the dumb proxies that encourage an anemic domain model) than the intra-application scenario we have where we intend to use the domain model itself on the client side.
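The serialization constraint above can be sketched like this (Invoice is an illustrative name): the classic XmlSerializer wants public read-write properties and a parameterless constructor, while WCF's data contract attributes can reach a private field, preserving encapsulation.

```csharp
using System.Runtime.Serialization;

[DataContract]
public class Invoice
{
    [DataMember] // DataContractSerializer can serialize this private field
    private decimal amount;

    public Invoice(decimal amount)
    {
        this.amount = amount;
    }

    // Read-only to callers; no public setter or default constructor is
    // needed for WCF serialization, unlike with XmlSerializer.
    public decimal Amount
    {
        get { return amount; }
    }
}
```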

If you take nothing else away from this discussion, I'd suggest the key take away is that when designing Web services, it is perfectly valid to do so within the scope of your application (or enterprise framework).  There is a class of services for which it is safe to make assumptions about the clients, and you shouldn't let all of the high-falutin talk about SOA, WS-*, interoperability, etc. concern you if your scenario does not involve integration with other systems that are out of your control.  If you find the need for such integration at a later point, you can design services (in a service layer) then to meet those needs, and you won't be shooting yourself in the foot trying to design one-size-fits-all services now that make so many compromises so as to make the app either impossible to use or very poorly performing.

My own preference that I'd recommend is to use the command-line tools that will generate proxies for you (you can even include a batch file in your project to do this) but then modify them to work with your domain model--you don't even need your clients to use the service proxies directly.  If you use a provider model (plugin pattern) for these services, you can design a set of providers that use the Web services and a set that talk directly to your database.  This enables you to use your domain model easily in both scenarios (both in a Web application that talks directly to the db as well as a smart client that uses Web services). 
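The provider (plug-in) approach above might be sketched like this (all names are illustrative, not from any real framework): the client codes against one interface, and configuration or a factory decides which implementation to load for a given deployment.

```csharp
using System;
using System.Collections.Generic;

public class Customer
{
    public Customer(int id, string name)
    {
        Id = id;
        Name = name;
    }
    public int Id { get; private set; }
    public string Name { get; set; }
}

// The contract both providers implement; clients see only this.
public interface ICustomerProvider
{
    Customer GetById(int id);
    void Save(Customer customer);
}

// Smart client deployment: wraps the generated service proxy and converts
// its messages to/from real domain objects (proxy call elided here).
public class WebServiceCustomerProvider : ICustomerProvider
{
    public Customer GetById(int id)
    {
        throw new NotImplementedException("call the service proxy and map the result");
    }

    public void Save(Customer customer)
    {
        throw new NotImplementedException("call the service proxy");
    }
}

// Web app deployment: talks directly to the data store.
// (An in-memory dictionary stands in for the database in this sketch.)
public class DatabaseCustomerProvider : ICustomerProvider
{
    private readonly Dictionary<int, Customer> store = new Dictionary<int, Customer>();

    public Customer GetById(int id)
    {
        Customer c;
        return store.TryGetValue(id, out c) ? c : null;
    }

    public void Save(Customer customer)
    {
        store[customer.Id] = customer;
    }
}
```

With this split, the same domain model runs unchanged in both deployments; only the provider registration differs.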

It requires a little extra effort, but it means you can design and use a real domain model and make it easier to use by hiding the complexity of dealing with these framework deficiencies from consumers of the domain model.  This is especially helpful in situations where you have different sets of developers working on different layers of the application, but it is also ideal for use and reuse by future developers as well.

One of these days, I'll write some sample code to exemplify this approach, maybe as part of a future exemplar.

Notes
1. The weatherthing says it's 65 degrees Fahrenheit right now--at 1pm!
2. My observation is that it is safe to assume that when other people talk about services and Web services, these are the kind they're thinking of, even if they don't make the distinction I do in this post. 

Saturday, September 15, 2007 6:00:03 PM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [0]  | 
# Monday, September 10, 2007

I wasn't going to post about it, but after reading Don's post, I realized that I should so that I can thank those involved in presenting me with this honor.  I was surprised when I was contacted about being nominated to be an INETA speaker, and I was even more surprised when I heard that I'd been voted in.  Looking over the folks on the list, I feel hardly qualified to be named among them.

So without further ado, let me thank David Walker (who's an all around great guy and VP of the Speakers Bureau), Nancy Mesquita (who I've not had the pleasure to meet personally but has been very helpful in her role as Administrative Director), as well as everyone else involved on the Speaker Committee and others (whom I know not of specifically) in welcoming me into the INETA speaker fold.  It's a great honor--thank you. 

Now, I have to get back to work!  My group, UXG, just released Tangerine, the first of our exemplars, and now we're on to the next great thing!

Monday, September 10, 2007 10:19:19 AM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [1]  | 
# Tuesday, August 14, 2007

Thanks to a sharp co-worker of mine, I was recently introduced to "Magic Ink: Information Software and the Graphical Interface," by Bret Victor.  It was quite an interesting read; Victor makes a lot of good points.  For instance, he suggests that we should view information software as graphic design, i.e., taking the concerns of traditional graphic design as paramount and then taking it to the next level by availing ourselves of context-sensitivity, which he defines as inferring the context from the environment, history, and, as a last resort, interaction.

Minimizing Interaction

The thrust of the argument is around reducing interaction and making software smarter, i.e., more context aware and, eventually, able to learn through abstractions over learning algorithms.  I think we can all agree with this emphasis, but I do think he unnecessarily latches onto the term "interaction" as a bad thing, or rather, I think he presents "interaction design" in an overly-negative light. 

True, the smarter we can make computers (and consequently the less interaction we require from users) the better, but that doesn't negate the usefulness of interaction design, human factors, information architecture, and usability.  There are many valuable things to be learned and used in all of these interaction-oriented fields, and we shouldn't deride or dismiss them because they focus on interaction.  I felt that Victor's negative emphasis on this, and his speculation that it is why software sucks, took away from the value of his overall message.

The Problem of Privacy

There is one problem I don't think he addressed in terms of increasing environmental context awareness, and that is security--specifically, privacy.  It is tempting to think about how wonderful it would be for a computer to know more about our environment than we do and thus be able to anticipate our needs and desires, but to do this, we, as humans, will have to sacrifice some level of privacy.  Do we really want a totally connected computer to know precisely where we are all the time?  Do we really want it to be "reporting" this all the time by querying location-aware services?  Do we really want a computer to remember everything that we've done--where we've been, who we've interacted with, when we did things?

I think the trickier issues with context awareness have to do with questions like these.  How do we enable applications to interact with each other on our behalf, requiring minimal interaction from us, while maintaining our privacy?  How does an application know when it is okay to share X data about us with another application?  Do we risk actually increasing the level of interaction (or at least just changing what we're interacting about) in order to enable this context sensitivity? 

If we're not careful, we could end up with a Minority Report world.  People complain about cookies and wiretaps; a world of computer context-sensitivity will increase privacy concerns by orders of magnitude.  This is not to negate the importance of striving towards greater context sensitivity.  It is a good goal; we just need to be careful how we get there.

Towards Graphic Design

One of the most effective points he made was in illustrating the difference between search results as an index and search results as an evaluation tool in their own right, i.e., thinking about lists of information in terms of providing sufficient information for a comparative level of decision making.  It is a shift in how developers can (and should) think about search results (and lists in general).

Similarly, his example of the subway schedule and comparing it to other scheduling applications is a critical point.  It illustrates the value of thinking in terms of what the user wants and needs instead of in terms of what the application needs, and it ties in the value of creating contextually meaningful visualizations.  He references and recommends Edward Tufte, and you can see a lot of Tufte in his message (both in the importance of good visualizations and the bemoaning of the current state of software).  I agree that too often we developers are so focused on "reuse" that we fail miserably in truly understanding the problems we are trying to solve, particularly in the UI.

That's one interesting observation I've had the chance to make in working a lot with graphic/visual designers.  They want to design each screen in an application as if it were a static canvas so that they can make everything look and feel just right.  It makes sense from a design and visual perspective, but developers are basically the opposite--they want to find the one solution that fits all of their UI problems.  If you give a developer a nicely styled screen, he'll reuse that same style in the entire application.  In doing so, developers accidentally stumble on an important design and usability concept (that of consistency), but developers do it because they are reusing the design for maximum efficiency, not because they're consciously concerned about UI consistency!  It is a kind of impedance mismatch between the way a designer views an application UI and the way a developer does.

The Timeless Way

I'm currently reading Christopher Alexander's The Timeless Way of Building, which I hope to comment on in more depth when done.  But this discussion brings me back to it.  In fact, it brings me back to Notes on the Synthesis of Form as well, an earlier work of his.  One of the underlying currents in both is designing a form (solution, if you will) that best fits the problem and environment (context).  The timeless way (and patterns and pattern language, especially) is all about building things that are alive, that flow and thrive and fit their context, and the way you do that is not by slapping together one-size-fits-all solutions (i.e., reusing implementations) but by discovering the patterns in the problem space and applying patterns from the solution space that fit just so.  The reuse is in the patterns, at the conceptual level, but the implementation of the pattern must always be customized to fit the problem snugly.

This applies in the UI as well as other areas of design, and that's the underlying current behind both Tufte's and Victor's arguments for the intelligent use of graphic design and visualization to convey information.  You must start by considering each problem in its context, learn as much as you can about the problem and context, then find patterns that fit and implement them for the problem in the way that makes the most sense for the problem.  But more on the timeless way later.

A Good Read

Overall, the paper is a good, thought-provoking read.  I'd recommend it to pretty much any software artisan as a starting point for thinking about these issues.  It's more valuable knowledge that you can put in your hat and use when designing your next software project.

Tuesday, August 14, 2007 10:41:14 AM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [0]  | 
# Monday, July 30, 2007

Are you passionate about software development?  Do you love to share your knowledge with others?  Do you like working in a vibrant, fun culture on the latest and greatest technologies with other smart, passionate people?  If so, I think I may have your dream job right here.

We're looking for another guidisan to help craft guidance using best practices for .NET development.  The word guidisan ('gId-&-z&n) comes from a blending of "guidance" and "artisan," which really speaks to the heart of the matter.  We're looking for software artisans who have the experience, know-how, and gumption to explore strange new technologies, to seek out new applications and new user scenarios, to boldly go where other developers only dream of going in order to provide deep, technical guidance for their colleagues and peers.

What do guidisans do? 

  • Help gather, specify, and document application vision, scope, and requirements.
  • Take application requirements and create an application design that meets the requirements and follows best known practices for both Microsoft .NET and Infragistics products.
  • Implement applications following requirements, best practices, and design specifications.
  • Create supplemental content such as articles, white papers, screencasts, podcasts, etc. that help elucidate example code and applications.
  • Research emerging technologies and create prototypes based on emerging technologies.
  • Contribute to joint design sessions as well as coding and design discussions.

What do I need to qualify?

  • Bachelor’s Degree.
  • 4+ years of full-time, professional experience designing and developing business applications.
  • 2+ years designing and developing .NET applications (UI development in particular).
  • Be able to create vision, scope, and requirements documents based on usage scenarios.
  • Demonstrated experience with object-oriented design; familiarity with behavior-driven design, domain-driven design, and test-driven development a plus.
  • Demonstrated knowledge of best practices for .NET application development.
  • Accept and provide constructive criticism in group situations.
  • Follow design and coding guidelines.
  • Clearly communicate technical concepts in writing and speaking.

If you think this is your dream job, contact me.  Tell me why it's your dream job and why you think you'd be the next great guidisan.

Monday, July 30, 2007 3:01:27 PM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [1]  | 
# Friday, April 27, 2007
Just a one question survey. 
 
If you are evaluating a software product, what do you prefer to do:
A) Download everything, including help, samples, SDK, etc. at once, even if it may be half a gig.
B) Just download the product bits first and then either download the help, samples, SDK, etc. separately as you need them (or never download those and just use online help/samples).
C) Download a shell installer that lets you pick what you want and only downloads/installs what you pick?
D) Try out the bits in an online VM environment.
E) Other, please specify.
 
You can either just pick one or put them in order of preference.
 
Thanks in advance for any opinions!
Friday, April 27, 2007 2:31:06 PM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [1]  | 
# Tuesday, December 19, 2006

It's that time again.  Time for the SYS-CON Readers' Choice Awards voting.  Actually, it's been that time for a little while now; I'm just slow.

Infragistics has been nominated for several categories in several publications, so if you like Infragistics or even if you're looking for a way to kill a few minutes at the airport, go vote for us.

Here's your friendly voter's guide to make your life easier (note that if you don't know or prefer any choice in a particular category, you can just click Continue to move on):

http://java.sys-con.com/general/readerschoice.htm

  • #11 - NetAdvantage for JSF
  • #12 - NetAdvantage for JSF
  • #20 - JSuite
  • #24 - NetAdvantage for JSF
  • #28 - NetAdvantage for JSF

http://soa.sys-con.com/general/readerschoice.htm

  • #4 - NetAdvantage for .NET
  • #14 - NetAdvantage for .NET

http://webddj.sys-con.com/general/readerschoice.htm

  • #3 - NetAdvantage for ASP.NET
  • #6 - Infragistics Training and Consulting
  • #7 - Infragistics Training and Consulting
  • #8 - NetAdvantage for ASP.NET

http://dotnet.sys-con.com/general/readerschoice.htm

  • #3 - NetAdvantage AppStylist
  • #4 - Infragistics Training and Consulting
  • #5 - Infragistics NetAdvantage for .NET
  • #10 - TestAdvantage for Windows Forms

Tuesday, December 19, 2006 6:28:11 PM (Eastern Standard Time, UTC-05:00)  #    Disclaimer  |  Comments [0]  | 
# Monday, November 13, 2006

I was just reminded by our local Dev Evangelist, Peter Laudati, that we've got our third NJ CodeCamp coming up this weekend.  Code camps are a fun way to get to know other local devs, learn some cool stuff, and generally get at least a free lunch!  So you should go!

Monday, November 13, 2006 9:55:40 AM (Eastern Standard Time, UTC-05:00)  #    Disclaimer  |  Comments [0]  | 
# Thursday, October 26, 2006

Today we launched our new web site.  It was not just a simple update; we revamped the whole deal and made it Web 2.0 compliant <grin>.  If you remember our old site, I trust you'll immediately see the improvement.  Please take a minute to check it out and let me know what you think.  Also, if you run into any problems with it, please feel free to let us know.

Thursday, October 26, 2006 7:24:38 PM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [3]  | 
# Saturday, July 08, 2006

Well, I've finally settled on the software I'm going to use to write my blogs as well as read them.  I'm pretty picky about UI these days, and most of the software out there is just not that great when it comes to that.  But of course, I also need it to integrate and minimize the time it takes to set things up and the time I have to spend jacking with it on a regular basis. 

Authoring
For authoring, I've decided to go with WB Editor 2.5.1 by Yiyi Sun.  I like the UI.  It currently uses the webby L&F, which when done right has a pleasant, light feel to it.  One thing that immediately strikes me as nice is that when I save a post for the second time, unlike BlogJet, it just saves to the same file I saved before and doesn't prompt me to pick a new file and then, when I select the same file, ask me if it's okay to overwrite it.  That really bugged me about BlogJet.  With WB Editor 2.5.1, CTRL-S works just like you'd hope, although it does pop up a notification saying it was saved, which is a bit annoying but can be dismissed with a spacebar slap.  I'd prefer the notification be in the status bar, but it's still much better than BlogJet in terms of saving drafts.

Post Authoring 
Post Authoring with WB Editor 2.5.1

Note that the color is green; it comes with three theme (skin) options: Blue (default), Green, and Pink.  I've always had a penchant for green.  Note also that the coloring of the post itself is like my dotNetTemplar blog; you can set this up using the options by specifying styles.  It's kind of nice so that you get a better feel for what it looks like, but it would be helpful to 1) allow for a stylesheet per blog and 2) encapsulate the entire post in a div to better capture the L&F of a single entry on a blog site (not sure how this would work in the editor, though).

I also like how ridiculously easy it is to insert images and screen shots.  When you click the insert/upload image icon, it has a friendly dialog that lets you pick the image or even paste from the clipboard.  It offers the option to automatically create a thumbnail and upload them both either via FTP or to Flickr.  I haven't tried the Flickr option, but it works great with FTP.

Adding Images
Adding Images with WB Editor 2.5.1

The HTML itself is clean, too, and it has a nifty little snippet-insert drop-down for common stuff.  This is important to me because I don't want my editor injecting any presentational markup--I want to leave that to my style sheets.  And it seems to play well with that.  It also highlights nicely, and the highlighting colors are personalizable.

HTML Editor
HTML Editing with WB Editor 2.5.1

Being a sucker for good UI, I enjoy the main screen that shows your registered blogs.  Yiyi has gone to the trouble to get images for the major blog engines (needs to update .Text to CS), so you get that along with a screen shot of your blog, the URL, and the categories.  And yes, you can of course cross post to multiple blogs, which is one reason to use an editor like this.

 WB Editor Home
WB Editor 2.5.1 Home

One of the really nice things about WB Editor from a .NET developer's perspective is that it has a plug-in architecture (currently running on .NET 1.1). 

Plugins
WB Editor 2.5.1 Plugins

An important plug-in for devs is a code highlighter.  It may not be the nicest formatting, but it works.  If you don't like it, you could easily write a plug-in to use a formatter that you do like.

[Serializable]
[XmlRoot("links")]
public class NavigationRoot
{
    NavigationLinkCollection items = 
        new NavigationLinkCollection();
    [XmlElement("link")]
    public NavigationLinkCollection Items 
    { 
        get 
        { 
            return items; 
        } 
        set 
        { 
            items = value; 
        } 
    }
}

Another feature that I like about WB Editor is its roadmap, which promises to stay on top of the latest technologies from Microsoft, such as .NET 2.0 and ClickOnce (coming in the next version) and ultimately .NET Framework 3.  It's a project that I could get excited about working on, and as you can see from the blog, it is actually being worked on.  Of course, it has other features that you can read about in its features list; I'm just highlighting the ones I think are cool.

So in short, it has everything that I'm looking for in a rich-client blog editor, and I'd recommend it over the much lauded BlogJet.  It is also competitive in pricing, currently at $19.99, which for a great piece of software like this is outstanding.

Reading
Now, I did mention at the beginning that I'd also settled on an RSS reader.  I looked at a few: RSS Bandit, FeedDemon, Windows Live, Awasu, and probably others that don't readily come to mind.  My issue with all of these is the amount of work involved in setting them up.  It's not that they're particularly troublesome if you can live with a straight list of blogs from your OPML file, but if you like to categorize like I do, then it becomes tedious, especially when you use multiple machines with multiple OSes on them.  Having to repeatedly set up my subscriptions kills me, and it's one reason I have always avoided using newsgroups.

Ideally, I'd like to set them up once, be able to read them either online or in a rich client, and have both stay in sync.  The only RSS reader I ran across that fit the bill was NewsGator, and in particular their Inbox product that integrates with Outlook.  I might have gone with their FeedDemon product, except I am in fact one of those users who almost always has Outlook open, and I figure, why have another app that's always running?  Also, I bought NewsGator way back in '04 when it first came out, so having an already-purchased license (with a free upgrade to the latest version) helped me decide.  Naturally, I had long since lost my license info, but they have a nifty license retrieval mechanism, so it was painless to get it going.

The feature I most like, in case I wasn't already clear, is that they integrate and synchronize with their online reader, so I can have the best of both worlds, and when I want to set the rich client up, I can just grab the stuff from my already setup online source.  I can make new subscriptions and reorganize them on either the rich client or web, and it will keep them in sync.  It's the best of Plaxo for RSS.  Ah, now this is software for the connected world.

So that's about it.  I just thought I'd pass along my findings to you in case you've found yourself in a similar predicament.  I'm not trying to convince you to change if you're already happy with your setup, but if you aren't happy, consider these two products for the total blog/RSS reading and authoring solution.  I hope it helps!

Saturday, July 08, 2006 1:45:13 PM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [0]  | 
# Tuesday, June 27, 2006

I recently got an email about a new service to add my blog RSS feed to Live.com (note the new icon on my blog, if you visit it).  For some time, I've been wanting to look into an AJAX-based web client for reading my blogs because I've found, like newsgroups, I just don't like having to set up everything I'm subscribed to on every computer I use.  At the same time, I do want a good UI.

Well, I've been putting off doing the research for it (and my blog reading has suffered for it).  Today I thought I'd check out what Live.com is like as an RSS reader, so I first tested using my new link to add to Windows Live.  It works and basically adds a little RSS reader gadget for my blog.  So then I thought I'd check out how it'd work with all my blogs, so I got the latest OPML (based on my blog roll here) and used Live.com's import feature to import them all.

At first, I was a bit disoriented because it said it imported but it wasn't showing them anywhere (I expected them to be put on the page I was looking at when I imported them).  But then I found them in the My Stuff section.  So I started building out my layout.

I started with the default two-column, but I quickly realized that wouldn't work, so I switched to four column, which seems to be just right at 1280x1024.  I knew I wouldn't want them all on one page, but I did want some categorization, so I came up with non-technical blogs, architecture blogs, and other technical blogs, one Live page for each.  Then, if it made sense, I categorized by column.  The results follow.

Other Technical Blogs
This is the "Other Technical Blogs" page.

Architecture Blogs
This is the "Architecture Blogs" page.

Now great, you may be thinking, I can use this as well.  Let me warn you, there were a couple MSDN blogs that repeatedly and totally hosed IE7 (I'm running Vista B2 x64 on this box).  I figured out which blogs they were and removed them from my stuff.  But even doing that, IE was still having problems, and as you can see from the image below, there's a reason for that.

taskman.JPG

Note the top entry.  IE is running at 50% CPU, but this is a dual-core Athlon, so that's one core fully pegged (on a single-core machine it'd be trying to use 100%), and the memory usage is out of this world (350MB)--even bigger than Visual Studio!

From the people I've talked to, having Live.com eat up CPU and RAM is not unusual.  Not being a Javascript and AJAX guru, I'll withhold any harsh judgments as I can readily imagine how it could be problematic.  But all I'm sayin' is that it ain't ready for primetime blog reading at a very basic level.

Beyond the performance issues, it also has no tracking of read/unread and no notification of new posts, both of which I think are indispensable for any kind of RSS reader.  Now, I understand that maybe I'm abusing their intended usage scenarios, but why else make it possible to subscribe to RSS feeds if not to be an RSS reader?  As it is, the gadget is only good for limited use, maybe for news services or the like where you don't care about having your read/unread tracked.

I will say that it has a neat little image-capture feature where it'll grab any images in the feed and thumbnail them for you, even doing a fade in/out if there is more than one.  It also has neat little mouseover previews, which I like.  It's not totally unusable in terms of features, to be sure, but it would be nice to see a better blog reader gadget that offered some basic categorization, read tracking, and possibly some sort of notification, though I'm not sure how that'd fly given it's web based.  I'm going to keep trying out Live.com like this to see if they improve it.

On the positive side, this motivated me to blog about it and, in the process, try out a new blog authoring tool, WB Editor 2, based on the recommendation of John Forsythe.  It has a pretty friendly interface, is easy to set up (as these things go), and it's cheap.  This post is being authored with it, so if there are any issues, well, there you go, but it was easy to add the images, and it created the thumbnails for me and uploaded them along with the main images.  I also like that it has a plug-in architecture that is .NET based, even if it is 1.1.  So far, I like it even better than BlogJet.

Tuesday, June 27, 2006 3:10:45 PM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [0]  | 
# Monday, June 19, 2006

Is anyone else as frustrated as I am with the multifarious password policies you run into across systems?  It seems like everyone and his brother has "the best" idea of what a strong password should be, which translates into having to keep up with N passwords and which systems they map to. 

That's bad enough, but then you have these people who think that making you change your password every N days is a good idea and that you can't use the last N passwords you've already used.  To make it worse, some brilliant minds out there think that forcing us to have "strong" usernames is a good idea too, so you end up with something like N^N permutations of usernames and passwords that you have to track. 

"So what?" you say.  "We've got a nifty 'Forgot Password' option on our site/app/etc.." 

But I have to ask, is that really ideal?  Perhaps if we didn't have to keep track of N^N passwords mapped in matrices to the N! systems we use, we wouldn't forget them so often! 

I'm not saying that having strong passwords is a bad idea, not at all.  I'm suggesting that we all work toward agreeing on what a strong password is and come up with, dare I suggest, standards based on data sensitivity.  So for instance, here are some ideas:

  1. If all you've got for a particular system is generic profile data, that would require a very low strength password, say minimum six characters, no special chars or numbers required. 
  2. Then you might have a next level for systems that keep your order history (but no financial data per se).  These kinds of systems might require eight characters with at least one number.
  3. You might then have systems that store financial data, such as credit cards, but are still a commerce system; these could require eight characters with at least one number and one special character.
  4. Then there are the actual banking, trading, etc. systems, and these might require ten characters with at least one number and special character.
  5. For systems above this level (e.g., company VPN), you would want to have some kind of dual authentication with a strong password and RSA tag, smart card, bio, etc.
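To make the proposal concrete, here's a minimal sketch of what a tiered policy check might look like (in Python purely for illustration; the tier numbers and thresholds are just the suggestions above, not any published standard, and tier 5 isn't represented since it's about dual-factor authentication rather than password shape):

```python
import re

# Hypothetical tiers mirroring the list above; these thresholds are the
# post's suggestions, not an actual published standard.
POLICIES = {
    1: {"min_len": 6,  "digits": 0, "specials": 0},  # generic profile data
    2: {"min_len": 8,  "digits": 1, "specials": 0},  # order history
    3: {"min_len": 8,  "digits": 1, "specials": 1},  # stored financial data
    4: {"min_len": 10, "digits": 1, "specials": 1},  # banking, trading, etc.
}

def meets_policy(password, tier):
    """Check a password against the rules for a data-sensitivity tier."""
    rules = POLICIES[tier]
    if len(password) < rules["min_len"]:
        return False
    if len(re.findall(r"\d", password)) < rules["digits"]:
        return False
    if len(re.findall(r"[^A-Za-z0-9]", password)) < rules["specials"]:
        return False
    return True

print(meets_policy("kittens", 1))      # → True  (six-plus characters is enough for tier 1)
print(meets_policy("kittens", 4))      # → False (too short, no digit or special character)
print(meets_policy("p4ss!word42", 4))  # → True
```

The point is that a site could simply declare "we're a tier 2 system" instead of inventing yet another bespoke policy.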

Anyways, the point is not that these specific guidelines are the best ones; I don't consider myself a security expert.  But I know enough to understand that what we have going on is not likely adding to our general security: in order to keep track of all these authentication tokens, we have to write them down somewhere (in some vault, file, or sticky pad), which in the end likely makes us less secure, and it certainly adds to both individual and organizational administration overhead to manage password resets, fetches, etc.

If standards like the ones I'm suggesting were well published, then every Joe who goes to write a new system would easily be able to put in place a policy that is secure, appropriate for the data being protected, and manageable for everyone involved.  If we only had maybe four passwords to remember, even if they're odd and funky (with special characters and numbers) or if they were pass phrases, we wouldn't have to write them down, forget them, or manage getting them reset all the time.  In other words, we'd be more secure and happier.  And if we do have such standards, they need to be far more publicized and talked about when the subject comes up, because I've not heard of them, and I don't think I live in the proverbial cave.

Monday, June 19, 2006 1:53:29 PM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [1]  | 
# Sunday, June 18, 2006

I just finally got Windows Vista up and running on my DFI LANPARTY SLI-DR board (has nVidia's nForce4 chipset).  Plugged into that are an AMD Athlon Dual Core X2 3800+ chip, 2 GB RAM, 2 Western Digital 36.7 GB 10K SATA (setup as RAID 0), and an nVidia GeForce 6600 GT (not running SLI yet), among other less important peripherals. 

It wasn't easy getting this going.  nVidia has 64 bit Vista drivers for its chipset, but they're incomplete and the instructions they post on their site don't work for me (and others).  Thankfully, someone else has put together an install guide, but even with that, it took me two tries to get it going (it didn't like my USB drive the first time apparently). 

The silly thing is that Vista B2 won't ask me for my drivers before it summarily decides that it can't find any information about my disks, so you have to start from an existing XP installation and run the installer from there and install on a secondary partition.  I hope they get this resolved by release because I'd really like to repartition my drives and install it on my C drive.  Maybe my blogging this with my specs will help others who are in a similar situation.

Anyways, it's up and running and it is pretty nifty so far.  I'm one to go in for eye candy, and I love the new Flip3D and Glass (about all I've really had a chance to play with thus far).  I can say it is a bit annoying that the whole screen blanks out when it prompts you to run something as admin; don't know what that's all about.  Maybe it's intentional, just to ensure they have your attention...

One thing I can't seem to get working now is the gadgets.  I show the sidebar and it is just blank.  When I try to add gadgets, nothing happens.  Will google more for the answer...

Sunday, June 18, 2006 11:33:23 PM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [0]  | 
# Monday, June 05, 2006

As most of you know who follow my blog at all, I recently joined Infragistics.  Well, I finally got around to getting my company blog set up, so if you're curious or interested, feel free to check it out and maybe subscribe.  While you're there, if you are a customer or potential customer, you might want to look around at the other blogs and maybe subscribe to some of them to stay on top of Infragistics stuff.

Monday, June 05, 2006 11:16:05 AM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [0]  | 
# Sunday, June 04, 2006

I ran into an odd problem the other day that I figured I'd blog for any other poor souls stricken with the same enigma.  Without going into the details of why I was trying to set up the Indexing Service on my Windows XP SP2 box, I found that when I tried to get into it from MMC (Computer Management), it would give me an error when I tried to expand the Services and Applications node, saying that it failed to initialize the snap-in for the Indexing Service.

Searching on various combinations of the error message really didn't help, either on Google or MS.  Everything appeared to be in order (the service acted like it was running) except that it wouldn't run the snap-in, and when I looked at the Windows Components tab in Add/Remove Programs, it showed that Indexing Service was unchecked.  Even if I checked it and clicked Next (at which point it'd act like it was installing and configuring it), it would still show up as unchecked.

I had also noticed in recent days that I'd occasionally get one of those "application crashed, do you want to debug?" messages about this SearchFilterHost.exe app.  When I first got the message, nothing came up for it on Google.  When I searched again on Friday, I found a few results indicating that it was part of Office 2007 Beta 2, which I've been running since the day it was released, more or less.  I had kind of assumed that, but I just ignored the error and moved on.

Well, those two things gelled in my mind to suggest that maybe it was something with Office 2007 Beta 2 that was hosing up the Indexing Service.  More specifically, I suspected it had to do with the Windows Desktop Search that Outlook and OneNote 2007 prompt you to install.  On this hunch, I uninstalled the desktop search, and voila, my Indexing Service snap-in worked again, as did the program I was running that wanted to use it.

So the moral is that if you're having odd issues with Indexing Service, this is one thing you'll want to try.  It worked for me.  Now, I wish I could run the desktop search to optimize searching in Office.  I logged a bug on the beta site, but I figure my problem is probably just odd enough as to not be reproducible. :)  We'll see...

Sunday, June 04, 2006 8:40:25 PM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [9]  | 
# Wednesday, May 03, 2006

Just thought I'd stick these out there for anyone else who might run across them.  Those of us reared under the friendly wing of SQL Server are in for regular surprises when interacting with Oracle...  But hey, what doesn't drive you mad makes you stronger, right?

1. Using a DDL statement inside a transaction automatically commits any outstanding DML statements.  I ran into this the other day when I was trying to have a transaction that added a row to a table and added a trigger (dependent on that row) to another table.  (This is actually part of my implementation of an OracleCacheDependency, which I intend to share in an article at some point.)  If you stepped through the code, everything appeared to function as expected: the exception would be thrown on the add-trigger statement, Rollback would be called on the OracleTransaction, and... the new row would remain in the database.

It was actually driving me buggy.  I was beginning to wonder if Oracle supported ADO.NET transactions at all because every example (all two of them) that I could find looked just like my implementation.  I even tried both the System.Data.OracleClient and the Oracle.DataAccess.Client providers, which, by the way, require different implementations because the Transaction property on the Oracle-provided provider is read-only (you have to create the commands from the connection after starting the transaction, which is, umm, inconvenient in some scenarios).

So I was pulling my hair out, about to give up, when I ran across a single line in the help docs that says "The execution of a DDL statement in the context of a transaction is not recommended since it results in an implicit commit that is not reflected in the state of the OracleTransaction object."
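In other words, the sequence I was running amounted to something like the following sketch (table, column, and trigger names here are hypothetical stand-ins, not my actual implementation):

```sql
-- DML inside what I thought was a transaction; not yet committed.
INSERT INTO cache_keys (key_name, changed) VALUES ('Orders', 0);

-- The DDL below performs an implicit COMMIT *before* it executes,
-- silently committing the INSERT above.
CREATE OR REPLACE TRIGGER trg_orders_changed
  AFTER INSERT OR UPDATE OR DELETE ON orders
BEGIN
  UPDATE cache_keys SET changed = 1 WHERE key_name = 'Orders';
END;
/

-- If the CREATE TRIGGER fails and you roll back, it's too late:
ROLLBACK;  -- no effect on the INSERT; the implicit commit already happened
```

And, per the docs quoted above, the OracleTransaction object gives you no indication that the implicit commit occurred.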

Okay, I guess I'm just spoiled by Microsoft (yes, I am), but I would expect an EXCEPTION to be thrown if I try to do this and not have the code happily carry on as if everything was hunky dory.  You'd think that a database that is picky enough to be case sensitive might be picky enough to not let you accidentally commit transactions.  And that leads into my #2 gotcha for the day.

2. Oracle is case sensitive when comparing strings.  Let me say that again (so I'll remember it).  Oracle is case sensitive when comparing strings.  Now this point, in itself, is not particularly gotchaful; however, when coupled with a red herring bug report, it can really sneak up on ya and bite ya in the booty.  This is just one of those things that you need to keep in the upper buffers when debugging an app with Oracle.
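For example (hypothetical table and data), a query that would happily match under SQL Server's default collation finds nothing in Oracle when the case differs:

```sql
-- Assume the stored value is 'SMITH'; hypothetical customers table.
SELECT * FROM customers WHERE last_name = 'Smith';  -- finds nothing in Oracle

-- Common workaround: fold both sides to the same case.  (Note that this
-- can defeat a plain index on last_name; a function-based index on
-- UPPER(last_name) restores index use.)
SELECT * FROM customers WHERE UPPER(last_name) = UPPER('Smith');
```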

3. (This one is just for good measure; I ran into it a while back.)  Oracle 10 no longer uses the old Oracle names resolution service.  This means that if you try to use the nifty Visual Studio Add-in and your organization is still using the old Oracle names resolution, you'll have to create manual entries in your tnsnames.ora file(s) just so that you can connect.  Even when you do this, it has to be just so or it won't work. 

I've had it where you can connect in the net manager but can't connect in the Oracle Explorer using the connections, which it sees and reads from the tnsnames file.  In particular, if I removed the DNS suffix from the name of the connection (to make it pretty), it wouldn't work.  It'd see the connection but not be able to connect.
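For reference, a working entry looks something like this (host and service names are hypothetical; the point is that, in my experience, the connection name had to keep the full DNS suffix intact):

```
MYDB.EXAMPLE.COM =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = dbhost.example.com)(PORT = 1521))
    (CONNECT_DATA =
      (SERVICE_NAME = MYDB.EXAMPLE.COM)
    )
  )
```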

4. (Another oldie, but importantie.)  Oracle, as of now, does not support ADO.NET 2 System.Transactions at all, if you use the Oracle-provided provider.  From what I could tell, although I wasn't able to test successfully, the Microsoft-provided one looks like it should, at least it should use DTC, but the jury is out.  Feel free to post if you've gotten it to work.

5. There is no ELMAH provider for Oracle.  I implemented one, though, and will be sharing in an article at some point.  Feel free to email me for it in the meantime.

6. There is no Oracle cache dependency.  See #5.

7. There are no Oracle roles, membership, etc. providers.  Sorry, I've not done those yet.

There are other bumps and bruises that you will get when dealing with Oracle if your main experience is SQL Server.  Many of them are just due to lack of familiarity, but there are some issues that I think truly make it a less desirable environment to work with.  So I thought I'd just share a few of them here for others who might find themselves in similar binds and need the help, which is so hard to find for Oracle.

Wednesday, May 03, 2006 2:34:41 PM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [0]  | 
# Saturday, April 29, 2006

I just updated this site to the latest version of dasBlog.  Many, many thanks to Scott for helping me out with getting it (given that I am a total noob to CVS and, apparently, picked a bad time to start since SF was having issues).  Most notably (that I know of), this version incorporates using Feedburner, which I guess is the latest and greatest for distributing your feed and lowering bandwidth usage, though I'm sure there are some other goodies in there.

Anyhoo, let me know if you suddenly start running into any problems with my blog.  Have a good un!

Saturday, April 29, 2006 2:19:18 PM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [0]  | 
# Monday, April 24, 2006

Not long ago, I polled subscribers as to what they're interested in.  There seemed to be a fairly even divide between what I'll roughly call Technical posts and Non-Technical posts.  In fact, my goal with this blog is to be a blend of those two general categories.  At the same time, as much as it hurts to admit it, I know that some folks really don't care about my opinions on non-technical matters.  So it struck me (some time ago, actually; I've just been lazy) to create two general categories using the creative taxonomy of Technical and Non-Technical. 

Why?  Because dasBlog (and most other blog systems, I imagine) allows you to subscribe to category-based RSS feeds as well as view posts by category.  So from this day forward, in addition to the more specific categories, I'll be marking all posts as either Technical or Non-Technical.  If all you care about is one or the other, you can just subscribe to one or the other and never be bothered with the stuff you don't care about.

You can view/subscribe to the feeds using the feed icon next to each category in the list (of categories).  Here are direct links as well:

Technical

Non-Technical

I hope this helps!

Monday, April 24, 2006 10:28:33 AM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [0]  | 

In a recent post, I mentioned the acronym OPC, meaning "Other People's Code."  Somehow I doubt I'm the first person to use the acronym, so I don't intend to claim it as my own.  I can't say I've seen it before, but it seems so obvious that it should be a common one because OPC is so prevalent and we should all be talking about OPC much more than we do.  In an industry that arguably values reuse over every other virtue, you'd think that OPC would have long been canonized.

Yet it seems to me that when most people speak of reuse, they mean Their Own Code (TOC, I love overloading acronyms!) or, from their perspective, My Own Code (MOC).  In essence, they want other people to reuse their code, but there ain't no chance in heck that they're going to use OPC as a means to achieve the ultimate goal.  I want MOC to be reusable.  How can I impress my friends by writing code that can be reused by as many other people as possible?  This is something I think most of us that strive to be great at software think at one point or another, perhaps not in so many words, but ultimately, there is a sense of great pride when you polish off that last speck on your chrome-plated masterpiece, showing it to your buddies or the world in general and saying "that's MOC." 

The funny thing is that more often than not, the really ardent software folks among us, and even the less ardent, have a predilection for the NIH (Not Invented Here) syndrome.  It's because we're all so damned smart, right?  Surely, those other folks at NIH Co. couldn't possibly have done it as well as I could have!?  Of course, we've got all the rationalizations lined up for when the business folks ask:

1) "I can't support that because I don't know it--I need the source code."
2) "You know, it won't meet our needs just right, not like I could do it for you custom."
3) "How much?  Geez.  I could write that in a day!"
4) "It's not using X, which you know is our preferred technology now."
5) "Did they promise to come work here if they dissolve the company?  I mean, you're just gambling on them."

And the list goes on.  We've probably all done it; I know I have.  Why?  Because, as one developer friend once put it (paraphrased): "I love to invent things.  Software is an industry where you get to invent stuff all the time."  In other words, we're creative, smart people who don't feel that we're adequately getting to express our own unique intelligence unless we write the code ourselves.

And now we finally come to what prompted this post.  I recently looked over an article by Joshua Greenberg, Ph.D. on MSDN called "Building a Rule Engine with SQL Server."  I'm not going to comment on the quality of the solution offered because I hardly think I am qualified to do so.  What I was completely flabbergasted by is the total omission of the rules engine being built into Windows Workflow Foundation.  Surely someone who has put that much thought into the theory behind rules engines, which, as is mentioned in his conclusion, are probably best known in workflow systems, would be aware of WF's own?  Surely one of the editors at MSDN Mag, which has done numerous articles on WF, including one on the engine itself published in the same month, would think it worth noting and perhaps comparing and contrasting the approaches?

Now, I don't want to draw too much negative attention to the article or Mr. Greenberg.  He and the editors are no more guilty of ignoring OPC than most of us are.  It is just a prime example of what we see over and over again in our industry.  On the one hand, we glorify reuse as the Supreme Good, but then we turn around and when reusable code (a WinFX library, no less!) is staring us in the face, an SEP ("Somebody Else's Problem") field envelops reuse, enabling us to conveniently ignore OPC and start down the joyous adventure of reinventing the wheel.

This has got to stop, folks.  I'm not saying that this ignorance of OPC is the primary cause of the problems in our industry (I happen to think it is only part of the greater problem of techies not getting the needs of business and being smart enough to hide it).  But it is certainly one that rears its ugly head on a regular basis, as we guiltily slap each other's backs in our NIHA (NIH Anonymous) groups.  We have a responsibility to those who are paying us and a greater responsibility to the advancement of our industry (and ultimately the human race) to stop reinventing the wheel and start actually reusing OPC.  I'm not saying there is never a justification for custom code (God forbid!), but that custom code needs to be code that addresses something that truly cannot be adequately addressed by OPC. 

There will always be plenty of interesting problems to solve, which give way to creative and interesting solutions.  Just imagine if all this brainpower that goes into re-solving the same problems over and over again were to go into solving new problems.  Where would we be now?  Now that's an interesting possibility to ponder.

Monday, April 24, 2006 10:21:02 AM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [2]  | 
# Monday, April 17, 2006

I've been running XP 64 on my new box for a few weeks now, but I've managed to get by thus far without tweaking IIS much.  Then I got a pleasant surprise while troubleshooting something that required the ASP.NET process identity to have write access (dasBlog).  I immediately went to the dir and added ASPNET with modify perms on the dir and still got the permission-denied error.  So I went back into IIS and lo and behold, it was running IIS 6.  Too cool!

Of course, you may be wondering how I managed to overlook this fact for so long.  It is because I just assumed that it was just running the IIS 6 Manager client.  I usually install that on my XP boxes because I like the updated interface, and it allows me to connect to 2003 IIS servers.  So it didn't faze me at all to see the IIS 6 client.  Somehow I managed to get by without noticing that it was in fact running IIS 6. 

So I just changed the user with write access to Network Service and voila!  Pretty snazzy.  The point is to let you know that if you want IIS 6 with XP, one option is to go out and buy a 64-bit machine and run Windows XP 64 on it.  :)  I can tell you that it has a few quirks, but most of them you can work around.  I've not had to boot to my 32-bit dual boot at all.  So if you're in the market for a new machine, you should definitely consider 64-bit.

Monday, April 17, 2006 2:44:37 PM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [2]  | 
# Thursday, April 13, 2006

I'm very happy to announce that I'll be joining Infragistics soon.  Officially, my first day will be May 15th; however, I'll be going to the Alabama Code Camp to represent them next weekend (4/22).  If you're in the Huntsville area, you should definitely check it out; there are tons of great speakers and sessions lined up.  (Mine's not up there yet as it is still undecided which of the ones I submitted I'll be doing.)

Anyhoo, I'll be working for Infragistics as their Codemunicator, a title that they let me coin because the position is kind of a blend of things.  Codemunicator is a portmanteau, as you might guess, from "code" and "communicator."  It sounds like it'll be a lot of fun; I'll get to do a number of things that I enjoy--writing, designing, coding, and even the occasional speaking from what I hear.  And I'll get to work with great guys like Jason Beres (noted author and INETA speaker), Devin Rader (also noted author and ASPInsider), and others whom I've had the pleasure to meet. 

Plus, some other really cool peeps are not far away, like DonXML (MVP and XML extraordinaire), Scott Watermasysk (Mr. .Text himself, ASPInsider, MVP, etc.), Doug Reilly (author, ASPInsider, MVP, etc.), Terri Morton (author, ASPInsider, MVP, etc.), DotNetDude (author, INETA speaker, MVP, etc., though I hear rumors of his not being long for the area), and I'm sure I'm not aware of or forgetting others and/or not getting all of the accolades right (that's my official apology if that's the case).  So all I'm saying is it's a really cool area for .NET experts and ubergeeks. :)  Hopefully we can all get together occasionally for dotNetMoots of some kind.

Of course, this change in employment constitutes a change in locale for me and my family.  We'll be moving from sunny Tampa, FL up to Princeton, NJ (right near East Windsor, home of Infragistics HQ).  I'm sure a lot of folks think such a move is crazy, but the wife and I are not especially keen on the six-month summers down here in Tampa.  We both grew up in cooler climes that have all four seasons, so we're actually looking forward to having them again.  That's not to say that the Tampa area doesn't have lots to recommend it, most notably family, friends, and mild winters, but we still feel this is the right move for us.

We've heard a lot of good stuff about the area we'll be in, both from folks who live there now and who lived there in the past.  Apparently, the whole "armpit of the US" epithet only applies to the Newark/Elizabeth area (in the NE near NYC), and having flown into and out of Newark and driven by Elizabeth, I can believe that.  (No offense to anyone who lives there and likes it!)  But central NJ is actually quite nice from what we saw when we toured the area a bit and from what we've heard.  It's about an hour by train from NYC and Philly, not far from some mountains in Pennsylvania, and not far from the beach and Atlantic City, so we're actually looking forward to it a lot.

All in all, we're pretty psyched about the move, and I'm especially juiced about going to work for a great commercial software company like Infragistics.  They still have openings, so if you think any of them sound interesting, let me know.  I'd love to have more good people come on to work with us.  If any geeks or ubergeeks live in the area and read my blog that I don't know about, give me a shout out.  I'll be helping Jason Beres et al pump up the NJDOTNET user group, so join and come if you're in the area.  You WILL join and come! <waves hand using Jedi mind trick>

TTFN!

Thursday, April 13, 2006 10:13:08 PM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [9]  | 
# Tuesday, April 11, 2006

Do you find yourself being overwhelmed by the amount of email flowing into your inbox?  Here are some tips I've used to keep my inbox nearly empty over the years.  And most of these tips extend to your real-world and voicemail inbox as well and will (I think) help you remain more sane and organized (and polite).

1 - Do not subscribe to email lists that you are not actively interested in.  This seems obvious to some people, but if you find yourself just deleting every message from a particular email list, newsletter, or other source of emails, just unsubscribe.  Maybe you were really keen on the particular subject at the time you subscribed; maybe you thought it'd be neat to stay on top of X, but if you find that's just not the case--that it's not important enough to you to do so, just cut it out; you can almost always subscribe again later if you get interested.

2 - Think of your inbox like a to-do list; it already is one in a certain sense, but make it more formal in your head.  Anything in your inbox needs attention, and it needs it as soon as you can get to it.  The reason this is helpful is that it can help motivate you not to let things pile up.  It also leads naturally into the next few tips.

3 - Try to take action on every email as soon as you read it.  If it requires a response, try to respond right away.  If you need to think on it, it's okay to leave it there for the next pass.  If you think it will be a while until you can respond like you think you need to and the missive is personal (from a real person to one or few persons), respond right away saying you're thinking about it and give a timeframe within which you intend to respond.  This is just polite and will probably save you from getting more emails from that person asking for a status.  If it is something from a list or newsletter that you are interested in, leave it there for the next pass.

4 - I mentioned the next pass in the previous tip.  This is simply a way of further weeding out your inbox, especially for non-personal emails.  If you truly didn't have time to properly take action on the first pass, the next time you sit down to look at your email, give everything a second look.  This takes far less time, typically, than the first pass, and allows you to quickly determine if you feel you can take action on the second pass items.  By the second pass, you should have taken action on 80% or more of the emails in the previous first pass.  Yes, I'm making the percentage up, but I'm just pointing out that if you're finding most emails in the inbox survive the second pass, you're probably not devoting sufficient time to it.  .NET developers can liken this process to .NET garbage collection: if emails survive the first pass, they're promoted to gen1, and so forth.  But the higher the generation, the fewer remaining emails there should be. 

5 - Aggressively delete.  Be willing to decide that you just are not going to get to something and either file it away or, preferably, delete it.  This only applies to non-personal emails that don't require a response (e.g., the newsletter/email list variety).  You may think that you'll get time some day to look at it, but I assure you, if it reaches the third pass and is still not important enough to make time for, you probably never will make time for it.  In my opinion, the only things that should survive the third pass are items that truly require action on your part but that may require more time than the average email response.  For instance, if you get a bill reminder, you can't just choose to delete and ignore that, but you may not have time until, say, the weekend to get to it.  It's fine to just let these lie, but leave them in the inbox so that you don't forget.  You should have very, very few emails that survive the third pass.  If you have lots, you're not giving your email enough time.

6 - I should mention that in the last three tips, there is implied prioritization.  In my book, emails from one person directly to you should always take precedence, even if you're not particularly keen on it (e.g., if someone is asking you for help, which happens for folks like me who publish helpful stuff).  I consider it rude to ignore personal emails, even from recruiters, so I always make an effort to respond, if nothing else than to say that I'm sorry that I don't have time.  To me, this is just common sense politeness, and I hate to say it, but it really irks me when folks don't extend the same courtesy to me.  The good news is that if you follow my tips, you can continue to be a polite person, at least in that regard, because your inbox never gets so full that you don't have time at least for the personal emails.  (And by "personal" I don't mean non-business; I mean from a real person to a real person, so business-related missives are not excluded from this rule.)

7 - Check your email proportionately to how often you get personal email.  It's okay to let newsletters and lists pile up because you can safely delete huge swaths of those if they get piled up, but it is not okay (IMO) to let personal emails pile up.  If that's happening, you need to check email more often and/or make more time for it.  Maybe it's not your favorite thing, but it is just part of life.  If you're important enough, get someone to screen your emails for you.

If you follow these guidelines and still find your inbox piling up, you're either really, really important and famous, or you're just not being honest with yourself about what you do and don't have time for.  If nothing else, find a way to stay on top of the personal email.  Even if you don't like my approach to everything else, it is just the polite thing to do.

Tuesday, April 11, 2006 9:45:11 PM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [0]  | 
# Friday, March 10, 2006

asp.netPRO is doing their yearly survey to see who's the best of the best.  I hope you will consider voting for ASPAlliance.  We've been working to improve your experience over the last year, and our vision is to keep improving and providing high-quality content for Microsoft developers in the future.

Also, if you don't know whom to vote for in hosting, go for Server Intellect.  They do an awesome job of providing affordable, high-quality hosting and enabling you to take control.

Go vote!

Friday, March 10, 2006 6:18:18 PM (Eastern Standard Time, UTC-05:00)  #    Disclaimer  |  Comments [0]  | 
# Friday, February 17, 2006

Have you heard of the International Association of Software Architects?  If you’re a software architect, or even an aspiring one, you need to be aware of this inspiring organization.  While some architecture organizations focus on a top-down, global or vendor-based approach, the IASA focuses on all IT architects at every level, regardless of their vendor affiliations, starting at the local area.  To better serve this end in the central Florida area, Tom Fuller and I have started up the Tampa Bay chapter.

The IASA’s sole aim is to provide value to each other and to make IT/software architecture a full-fledged profession with a significant knowledge base and quality controls.  Key goals of the IASA are:
   • To provide the latest news and articles in the architecture discipline.
   • To support the establishment of strong relationships among architects both as peers and as mentors.
   • To support and fulfill the needs for working groups as challenges in our industry call for them.
   • To provide both local and global forums for debate of issues pertinent to the profession.
   • To enable each and every architect to grow in the profession and to impact the software industry in positive ways.

If you want to know more about this organization, please visit the IASA Web site or contact Tom or me directly (or even comment on this blog).  To get involved and be aware of important announcements (such as meeting times) in the Tampa Bay and central Florida area, register on the site for the Tampa Bay chapter.   We look forward to seeing you there!

Friday, February 17, 2006 4:00:07 PM (Eastern Standard Time, UTC-05:00)  #    Disclaimer  |  Comments [0]  | 
# Wednesday, February 15, 2006

This is a continuation of a discussion I began with a short article on ASPAlliance.  Tom Fuller had some good comments, and I've since been discussing the subject with some other experienced developer-architects.  Both Tom and they have very valid and interesting concerns about what I'm proposing.  The biggest pushback has been on my definition of the architect role as being 1) not an überdeveloper and 2) not necessarily more valued or well-paid than developers.  Of course, there is also some vagueness around just what an architect would do, and that much, at least, was intentional.

First, let me say these propositions are not thoroughly baked in my mind, and rather than trying to tell everyone exactly how it should be, I’m just tossing out some ideas that may or may not pan out in the end.  This is the nature of thoughtful dialogue, and I appreciate anyone who has constructively critical feedback.  As such, I’m not promising any sort of unbreakable consistency on my part as the ultimate solution is the goal and not the presupposition of this dialogue.

Now, the core issues with both of the objections mentioned above are, I think, inherent in the way the architect role is perceived today.  Because architects are expected to design systems down to even the class/interface level, they are expected to be developers, and because they’re expected to know everything developers know and more, they should naturally be more valued and paid better by the business.  It may be true that this is the way things should be and continue to be, but what I’m suggesting is that this view of the architect role is in fact not optimal to achieve successful projects or even enterprises.

I think that if we did have a complete career track (such that architecture is in fact a distinct but related discipline), an architect's duties would be distinct enough from a developer's to not require “real-life” dev experience in order to be good.  This would be achieved not just through training or book knowledge but also from experience as an architect, starting as a junior and being mentored by senior architects.  This way you are still learning from experience but instead of learning developer lessons, you're learning architect lessons.

Similarly, because architects are responsible for a different set of problems, they don’t need to be an expert developer in the first place.  While their role may ultimately be perceived as more valuable to a business, it would not be due to simply being a developer++ but rather due to the distinct value they bring to the business.  And even within the architecture discipline, I imagine there’d be different levels of experience as well as different kinds of architecture responsibilities (solution architect, enterprise architect, and infrastructure architect, for example).  You’d have some junior architects who are less valued (paid less) than some mid-level or senior developers.

This does not preclude some cross-over between the disciplines—a senior developer would likely have a faster track to a senior architect role than a junior architect right out of college because there is most definitely a shared body of knowledge between the two. It may be that said developer simply has not had the opportunity to pursue an architecture path, or it may be that due to fiscal constraints, he’s had to play both roles and hasn’t been able to fully pursue architectural interests, which is often the case for smaller shops and consulting firms.

Of course, all of this implies a redefinition to some extent of the role of an architect--he'd take on some of what some people think of as business analyst functions, some of what might today be seen as project manager functions, some of what may be seen as QA functions, and even perhaps some of what might be seen as developer functions. 

The architect would take on the responsibility of understanding and assisting BAs in analysis of the business in terms of how and where technology would best apply to solve business problems.  She’d be responsible not for scheduling and facilitating the keeping of the project on track but rather in ensuring that the business needs are sufficiently being addressed by what the developers are building.  In ensuring the output of the project is aligning with standards and business needs, she serves in a kind of QA role.  And she might do some development of cross-functional solutions like authentication, authorization, and auditing, but at least she’d take on the specification of how these kinds of things should take place to ensure consistency in areas of cross-functional functionality and within the greater enterprise architecture.

I don’t think it is fruitful to further delineate responsibilities at this level because the specific responsibilities will vary based on the size of the development team(s) and, more importantly, the methodology being used.  The key realization is that the architect is the hub between these various disciplines, not some other person (such as the BA or PM).  The reason I think the architect should be this person is that the product being built is software, and you need an individual who is very well-versed in software design, the trade-offs, the technologies, and the patterns but who can also engage and jibe with the other disciplines to ensure that everything works together like a well-oiled machine.  It is a very strongly technical position, but it is also a very strongly people- and business-oriented position.

To facilitate this change or (I’d suggest) better definition of an architect’s role, he'd be divesting himself of what some people see as being the “ivory tower” architecture role--the idea of a specification of the system to its lowest level of abstraction and the handing down of such a specification from on high to developers.  This is key in getting developer buy in to any kind of idea of architect, and it is key in the architect being able to take on a more holistic role in terms of the project.

Most devs want to do some design work—within their technologies and within specified business requirements, and I think in this world I'm proposing, they would.  The role of the architect in terms of system design would definitely be the big picture concerns--cross-functional and non-functional, cross-application, technical requirements, etc.--and the architects and devs would have to work together.  Design could not be done in a vacuum in either role.

At this point, the term "architect" comes into question as the metaphorical value becomes less.  Indeed, I think architect is probably not the best metaphor because we're dealing with designing computer systems that make business processes more efficient, not with designing a free-standing physical building.  So perhaps we've approached the problem wrong in the first place--the laws of physics don't change every week, but business processes can and do.

It may be that the role I’m envisioning is not in fact a refining of the architect role but rather the specification of a new distinct role.  But if that is the case, I question the value of thinking of architects and developers as distinct roles, which speaks volumes to the current confusion around the roles today.  Most developers do design work, and if the only distinction between the roles is whether or not design work is done, where’s the value in the distinction?  Why not just call all developers architects or all architects developers?

Truly, though, I think there is a distinction—the one I am trying to draw out via further refinement of what we might think of as the architecture discipline.  It’s partial refinement, partial redefinition, but I tend to think that this refinement and redefinition is necessary to not only enable the discipline to grow but also to be able to communicate a distinct value that a person in the “architect” role brings to the table.  He’s not just a developer (though he certainly can have development skills), he’s the key role—the hub between the disciplines we’ve come to realize should be involved in software creation—that ultimately makes or breaks a project.

That's not to imply that the other disciplines do not add value, or that their failure cannot break a project as well.  But the state of things today seems to be that these disciplines have a hard time coordinating and actually coming up with a coherent solution that truly meets the business needs.  Up to this point, we've been refining those disciplines in themselves and trying to define processes, methodologies, and documentation to solve the problem of failing projects, or projects that just aren't solving business issues.  And last I read, as an industry, we're still doing terribly when it comes to project success from a business perspective.

So rather than solving the issue by further refinement of the currently known disciplines, processes, and methodologies, I think we need to pull architecture out of the development discipline, pull out of development whatever needs to go with it (which does not mean all design), and give architecture new responsibilities (or at least better-defined ones) that essentially amount to being the technical hub of the project or enterprise.

Whether at the end of the day we still call it the architect role is irrelevant to me.  I don't think the term fully speaks to the role I'm imagining, but it does to some extent, and since it is fully entrenched, it may not be worthwhile to replace it completely; better to redefine it into something more pertinent to an industry that deals with business processes, not gravity, mortar, wood, plumbing, and electricity.

On the other hand, software is a new thing in human experience, and I tend to think the repeated attempts to borrow terms from other industries may not be the best approach.  Whether it's likening it to authoring, architecting, developing, constructing, etc., none of these will fully speak to the roles necessary for successful software.  So we need to keep that in mind as we further refine our industry, and be willing to coin new roles as necessary.  But that, I suppose, could be a discussion all its own.

Wednesday, February 15, 2006 5:13:31 PM (Eastern Standard Time, UTC-05:00)  #    Disclaimer  |  Comments [1]  | 
# Tuesday, February 07, 2006

Check out my latest piece on ASPAlliance and let me know what you think!

Tuesday, February 07, 2006 9:00:23 AM (Eastern Standard Time, UTC-05:00)  #    Disclaimer  |  Comments [0]  | 
# Tuesday, December 13, 2005

The script below will grant execute on all stored procedures in the selected database to the specified role.  Just change it to use your database and, if desired, change the role name.  After that, just add users to that role.  You should be able to rerun this script as needed.  It was tested on SQL Server 2005; note that it relies on the sys.objects and sys.database_principals catalog views, which don't exist in SQL Server 2000 (there you'd query sysobjects and sysusers instead).

use [YourDatabase]
go

declare @sprocName sysname,
    @roleName sysname,
    @grantStatement nvarchar(4000);
select @roleName = N'db_exec_all_sprocs';

-- Create the role if it doesn't already exist.
if not exists(select * from sys.database_principals
    WHERE name = @roleName AND type = 'R')
        exec sp_addrole @roleName;

-- Enumerate all stored procedures (including CLR procedures).
declare sprocs cursor local fast_forward for
    SELECT [name] FROM sys.objects
        WHERE type in (N'P', N'PC')

open sprocs
while (1=1)
begin
    fetch next from sprocs into @sprocName
    if @@fetch_status != 0 break;
    -- quotename() guards against names that need [] delimiters.
    select @grantStatement = N'grant execute on '
        + quotename(@sprocName) + N' to ' + quotename(@roleName);
    print N'Granting: ' + @grantStatement;
    exec sp_executesql @grantStatement;
end
close sprocs
deallocate sprocs
go
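Once the script has run, you populate the role with sp_addrolemember.  A minimal sketch (YourUser here is a hypothetical database user name; substitute your own):

```sql
-- Add an existing database user to the role created above.
exec sp_addrolemember N'db_exec_all_sprocs', N'YourUser';
```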

Tuesday, December 13, 2005 5:10:03 PM (Eastern Standard Time, UTC-05:00)  #    Disclaimer  |  Comments [1]  | 
# Wednesday, November 16, 2005

Yesterday, Mrs. dotNetTemplar wanted to order pizza for lunch.  (I was working from home.)  So I pulled up papajohns.com and put in an order.  About 45 minutes later, the delivery guy arrived, and I met him at the door.  I just happened to be wearing one of the shirts that Microsoft has given me.

"You know, you oughta put that shirt up on eBay," he said.  "You might be able to get two Linux shirts for that one Microsoft shirt."

"Well, ya know, it pays the bills, hehe," I retorted kindly, and he acquiesced on that point, returning to his car.

It wasn't until a minute or two later that the full irony of the exchange hit me.

"You might be able to get two Linux shirts for that one Microsoft shirt," said.. the.. pizza.. delivery guy.. to.. the.. full-time.. professional.. software developer.

Hmm...  maybe he just loves delivering pizza...

Wednesday, November 16, 2005 1:47:10 PM (Eastern Standard Time, UTC-05:00)  #    Disclaimer  |  Comments [3]  | 
# Thursday, November 10, 2005

Yesterday on the aspnet-architecture list (on AspAdvice.com), we had a little fun going off topic, but rather than continue that discussion there, I thought I'd just post my thoughts here on something that sprang up (again), namely, the "outsourcing threat." 

Now I'm going to be (potentially) elitist.  Simply put, if you are valuable enough, there is no such thing as an outsourcing threat.  All natural-born Americans have something unique to offer, which is their expertise in American culture and American English.  These are not skills that someone can just pick up at an ESL course or even at a four-year college.  This is something you only get by living here for a long time and, likely, since childhood, and these can be used to your advantage in business.

Granted, I'm not going to say that outsourcing isn't replacing jobs that would otherwise be filled by Americans (that would require living in a fantasy world).  But the jobs that can go offshore are only those that don't require the skills mentioned above.  In the software industry, that typically means low-level support and programming--jobs that require only a basic level of proficiency and little interaction with the business.  I know, for instance, a company that sends a lot of coding off to India, but it still employs plenty of Americans to do the business analysis and design, and even some development. 

The reasoning is obvious--the business is American and Americans like working with Americans because of the aforementioned skills; there is no language or culture barrier to overcome.  Getting software right is hard enough without introducing these barriers, so any company worth its salt will see the value in that and hire Americans wherever interaction with American business is required.

So that's one way in which any American can get the upper hand.  Take advantage of your American-ness.  That's something you can't just ship off shore.  And I can promise you there are tons of businesses, especially smaller businesses, that will never consider off-shoring their work or working with non-American companies.

Another way you can make off-shoring irrelevant to you is simply by becoming good and staying good at what you do.  This requires extra effort, but it will pay off.  It requires passion about what you do.  If you're just in the software industry because you think it is a cakewalk and pays well, you probably won't get very far and off-shoring should be a concern for you.  But if you're passionate about software and translate that into just becoming d**n good at it, you will never lack for employment, whether that's working for a company or as an independent contractor.

Let's quit victimizing ourselves.  The American economy is built on the free market, and it has proven time and again that it works.  The latest example is the recent oil conundrum caused by the hurricanes.  Many people predicted ridiculously high prices and suggested government controls, but simply by rising as high as they did, prices strangled demand and were then forced back down.  The same applies to the software industry; it is a matter of supply and demand.  You need to identify where there is a demand and fill it.  There will always be demand for Americans who understand American culture and business as well as software, and there will always be demand for good people. 

So rather than complaining about the free market and trying to get the government to control American businesses, which is rarely a good thing, focus on yourself, make yourself an indispensable commodity, and you won't have to worry about the fact that some jobs are being off-shored.

Thursday, November 10, 2005 9:24:05 AM (Eastern Standard Time, UTC-05:00)  #    Disclaimer  |  Comments [4]  | 
# Thursday, May 19, 2005

I just ran across the patent application for what appears to be some rendition of the purportedly defunct ObjectSpaces.  At least, I hope that's what it is and not some attempt by Microsoft to patent the idea of entity mapping itself.  I didn't read the whole thing (who has that kind of time?!), but I can only assume (because patenting entity mapping itself would be preposterous) that it is a patent for the particular solution they are working on for the WinFS timeframe.

In any case, I guess those who were trying to model their own entity mapping utilities on ObjectSpaces need to be careful if/when MS gets the patent.  I'm not really sure I see what's to be gained by patenting their approach; Microsoft will squash any competition in the space when they get something out there anyway...

Thursday, May 19, 2005 3:47:05 PM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [0]  | 
# Friday, May 06, 2005
Why pirating music, software, and movies--and other seemingly victimless crimes--is not okay.
Friday, May 06, 2005 2:51:10 AM (Eastern Daylight Time, UTC-04:00)  #    Disclaimer  |  Comments [26]  | 
# Thursday, March 31, 2005
Looks like Server 2003 SP1 went to RTM recently (like yesterday).  As you can see in the list of Top 10 Reasons to install it, you really should seriously consider getting it ASAP for your 2003 servers.  It adds some great security features. 
Thursday, March 31, 2005 10:01:09 AM (Eastern Standard Time, UTC-05:00)  #    Disclaimer  |  Comments [0]  | 

Disclaimer
The opinions expressed herein are solely my own personal opinions, founded or unfounded, rational or not, and you can quote me on that.

Thanks to the good folks at dasBlog!

Copyright © 2014 J. Ambrose Little