# Saturday, 15 September 2007

As I sit here on my deck, enjoying the cool autumn breeze [1], I thought, what better thing to write about than Web services!  Well, no, actually I'm just recalling some things that have happened lately--on the MSDN Architecture forums and in some coding and design discussions we had this week--both of which involved the question of best practices for Web services.

Before we talk about Web services best practices, it seems to me that we need to distinguish between two kinds of application services.  First, there are the services that everyone has been talking about for the last several years--those that pertain to service-oriented architecture (SOA).  These are the services that fall into the application integration camp, so I like to call them inter-application services. 

Second, there are services that are in place to make a complete application, such as logging, exception handling, data access and persistence, etc.--pretty much anything that makes an application go and is not a behavior of a particular domain object.  Maybe thinking of them as domain object services would work, but I fear I may already be losing some of you, so let's get back to it.  The main concern of this post is those services used within an application, so I call them intra-application services.

It seems like these latter services, the intra-application ones, are often confused with the former--the inter-application services.  It's certainly understandable because there has been so much hype around SOA in recent years that the term "service" has been taken over and has lost its more generic meaning.  What's worse is that there has been a lot of confusion around the interaction of the terms Web service and just plain service (in the context of SOA).  The result is that you have folks thinking that all Web services are SO services, and sometimes that SO services are always Web services.

My hope here is to clarify the way I think we should be thinking about all this.  First off, Web services are, in my book at least, simply a way of saying HTTP-protocol-based services, usually involving XML as the message format.  There is no, nor should there be, any implicit connection between the term Web service and service-oriented service.  So when you think Web service, don't assume anything more than that you're dealing with a software service that uses HTTP and XML.

The more important distinction comes in the intent of the service--the purpose the service is designed for.  Before you even start worrying about whether a service is a Web service or not, you need to figure out what the purpose of the service is.  This is where I get pragmatic (and those who know me know that I tend to be an idealist at heart).  You simply need to determine if the service in question will be consumed by a client that you do not control. 

The reason this question is important is that it dramatically affects how you design the service.  If the answer is yes, you automatically take on the burden of treating the service as an integration (inter-application) service, and you must concern yourself with following best practices for those kinds of services.  The core guideline is that you cannot assume anything about the way your service will be used.  These services are the SO-type services that are much harder to design correctly, and there is tons of guidance available on how to do them [2].  I won't go into further depth on those here.

I do think, though, that the other kind of services--intra-application services--have been broadly overlooked or just lost amidst all the discussion of the other kind.  Intra-application services do not have the external burdens that inter-application services have.  They can and should be designed to serve the needs of your application or, in the case of cross-cutting services (concerns), to serve the needs of the applications within your enterprise.  The wonderful thing about this is that you do have influence over your consumers, so you can safely make assumptions about them, which lets you make compromises in favor of other architectural concerns like performance, ease of use, maintainability, etc.

Now let's bring this back to the concrete question of best practices for intra-application Web services.  For those who are using object-oriented design, designing a strong domain model, you may run into quite a bit of trouble when you need to distribute your application across physical (or at least process) tiers.  Often this is the case for smart client applications--you have a rich front-end client that uses Web services to communicate (usually for data access and persistence).  The problem is that when you cross process boundaries, you end up needing to serialize, and with Web services, you usually serialize to XML.  That in itself can pose some challenges, mainly around object identity, but with .NET, you also have to deal with the quirks of the serialization mechanisms.

For example, the default XML serialization requires that your properties be public and read-write and that your types have a default constructor.  These requirements can break encapsulation and make it harder to design an object model that you can count on to act the way you expect it to.  WCF makes this better by letting you use attributes to take finer control over serialization.  The other commonly faced challenge is on the client.  By default, the VS Add Web Reference feature takes care of the trouble of generating your service proxies, but it introduces a separate set of proxy objects that are of different types than your domain objects.
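To make that concrete, here's a minimal sketch of the difference, assuming WCF's DataContractSerializer; the Customer class and its members are hypothetical stand-ins for a real domain type:

```csharp
using System;
using System.Runtime.Serialization;

// Under the default XML serialization (XmlSerializer), this class would need
// a public parameterless constructor and public get/set on every property,
// even though the domain says a customer must always have a name.
//
// WCF's DataContractSerializer is opt-in via attributes and can serialize
// private fields, so the domain model keeps its encapsulation.
[DataContract]
public class Customer
{
    [DataMember(Name = "Name")]
    private string name;                 // serialized even though it's private

    [DataMember(Name = "CreditLimit")]
    private decimal creditLimit;

    public Customer(string name)         // no default constructor required
    {
        if (name == null || name.Length == 0)
            throw new ArgumentException("A customer must have a name.", "name");
        this.name = name;
    }

    public string Name
    {
        get { return name; }             // read-only to callers
    }

    public decimal CreditLimit
    {
        get { return creditLimit; }
    }

    public void ExtendCredit(decimal amount)  // behavior stays on the model
    {
        creditLimit += amount;
    }
}
```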

So you're left with the option of either using the proxy as-is and writing a conversion routine to turn the proxy objects into your domain objects, or modifying the proxy to use your actual domain objects.  The first solution introduces both a performance hit (creating more objects and transferring more data) and a complexity hit (having conversion routines to maintain); the second introduces just a complexity hit (you have to modify the generated proxy a bit).  Neither solution is perfectly elegant--we'd need the framework to change to support this scenario elegantly.  As it is now, the Web services support is designed more with inter-application services in mind (hence the dumb proxies that encourage an anemic domain model) than the intra-application scenario we have here, where we intend to use the domain model itself on the client side.
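For what it's worth, the first option tends to look something like the following sketch.  CustomerProxy stands in for a generated proxy type, and Customer is the hypothetical domain class from the sketch above:

```csharp
// Stand-in for what Add Web Reference generates: a dumb data holder with
// public read-write members and no behavior.
public class CustomerProxy
{
    public string Name;
    public decimal CreditLimit;
}

// The conversion routine you end up writing and maintaining in option one.
public static class CustomerTranslator
{
    // Proxy -> domain: the extra object creation here is the performance
    // hit mentioned above.
    public static Customer ToDomain(CustomerProxy proxy)
    {
        if (proxy == null) return null;
        Customer customer = new Customer(proxy.Name);
        customer.ExtendCredit(proxy.CreditLimit);
        return customer;
    }

    // Domain -> proxy, for sending changes back through the service.
    public static CustomerProxy ToProxy(Customer customer)
    {
        if (customer == null) return null;
        CustomerProxy proxy = new CustomerProxy();
        proxy.Name = customer.Name;
        proxy.CreditLimit = customer.CreditLimit;
        return proxy;
    }
}
```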

If you take nothing else away from this discussion, I'd suggest the key takeaway is that when designing Web services, it is perfectly valid to do so within the scope of your application (or enterprise framework).  There is a class of services for which it is safe to make assumptions about the clients, and you shouldn't let all of the highfalutin talk about SOA, WS-*, interoperability, etc. concern you if your scenario does not involve integration with other systems that are out of your control.  If you find the need for such integration at a later point, you can design services (in a service layer) then to meet those needs, and you won't be shooting yourself in the foot now by designing one-size-fits-all services that make so many compromises that the app ends up either painful to use or very poorly performing.

My own recommendation is to use the command-line tools to generate your proxies (you can even include a batch file in your project to do this) and then modify them to work with your domain model--your clients don't even need to use the service proxies directly.  If you use a provider model (plugin pattern) for these services, as in the sketch below, you can design one set of providers that uses the Web services and another set that talks directly to your database.  This enables you to use your domain model easily in both scenarios (a Web application that talks directly to the db as well as a smart client that uses Web services).
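Here's a rough sketch of that provider approach, reusing the hypothetical Customer, CustomerProxy, and CustomerTranslator types from above; the provider and table names are made up for illustration:

```csharp
using System.Data.SqlClient;

// The contract the rest of the application codes against.
public interface ICustomerProvider
{
    Customer GetCustomerByName(string name);
}

// Provider for deployments that can reach the database directly
// (e.g., a Web application).
public class SqlCustomerProvider : ICustomerProvider
{
    private readonly string connectionString;

    public SqlCustomerProvider(string connectionString)
    {
        this.connectionString = connectionString;
    }

    public Customer GetCustomerByName(string name)
    {
        using (SqlConnection connection = new SqlConnection(connectionString))
        using (SqlCommand command = new SqlCommand(
            "SELECT Name, CreditLimit FROM Customers WHERE Name = @name",
            connection))
        {
            command.Parameters.AddWithValue("@name", name);
            connection.Open();
            using (SqlDataReader reader = command.ExecuteReader())
            {
                if (!reader.Read()) return null;
                Customer customer = new Customer(reader.GetString(0));
                customer.ExtendCredit(reader.GetDecimal(1));
                return customer;
            }
        }
    }
}

// Stand-in for a command-line-generated service proxy class; in reality
// this is where the SOAP call would happen.
public class CustomerServiceProxy
{
    public CustomerProxy GetCustomerByName(string name)
    {
        return null; // placeholder for the generated service call
    }
}

// Provider for the smart client: it goes through the Web service and
// converts the result, so consumers of the domain model never see proxies.
public class WebServiceCustomerProvider : ICustomerProvider
{
    private readonly CustomerServiceProxy proxy = new CustomerServiceProxy();

    public Customer GetCustomerByName(string name)
    {
        return CustomerTranslator.ToDomain(proxy.GetCustomerByName(name));
    }
}
```

Which provider gets loaded can then be driven by configuration, much like the built-in ASP.NET 2.0 provider model does it.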

It requires a little extra effort, but it means you can design and use a real domain model and make it easier to use by hiding the complexity of these framework deficiencies from consumers of the domain model.  This is especially helpful in situations where you have different sets of developers working on different layers of the application, but it is also ideal for use and reuse by future developers.

One of these days, I'll write some sample code to exemplify this approach, maybe as part of a future exemplar.

Notes
1. The weatherthing says it's 65 degrees Fahrenheit right now--at 1pm!
2. My observation is that it is safe to assume that when other people talk about services and Web services, these are the kind they're thinking of, even if they don't make the distinction I do in this post. 

# Monday, 10 September 2007

I wasn't going to post about it, but after reading Don's post, I realized that I should so that I can thank those involved in presenting me with this honor.  I was surprised when I was contacted about being nominated to be an INETA speaker, and I was even more surprised when I heard that I'd been voted in.  Looking over the folks on the list, I feel hardly qualified to be named among them.

So without further ado, let me thank David Walker (who's an all-around great guy and VP of the Speakers Bureau), Nancy Mesquita (whom I've not had the pleasure to meet personally but who has been very helpful in her role as Administrative Director), as well as everyone else involved on the Speaker Committee and elsewhere (whom I don't know specifically) in welcoming me into the INETA speaker fold.  It's a great honor--thank you.

Now, I have to get back to work!  My group, UXG, just released Tangerine, the first of our exemplars, and now we're on to the next great thing!

# Monday, 27 August 2007

A week or so ago, I received this nice little letter saying that I'd been nominated to the Cambridge Who's Who, which purports to be an organization that recognizes industry professionals.  All I had to do was fill out a simple form online and I'd be entered, so I did (it never hurts to add a tick mark to your resume...).  A few days later--today, in fact--they called and asked for information about me, which I provided.  After congratulating me on being inducted, they introduced me to their "Platinum" and "Gold" membership options, which cost several hundred dollars.

At this point, I'm getting a tad suspicious, and being one who rarely buys something over the phone, I said thanks for the info but I'd have to think about it more.  It was at this point that the true colors of the whole deal became clear.  I was told that in order to publish my info and get me access to all these wondrous benefits of being a member, I needed to decide if I wanted to be gold or platinum.  I balked, saying that most industry accolades don't come with a price tag (at least not the ones I've received).  In fact, they tend to come with benefits.

Well, not so with the Cambridge Who's Who.  You have to pay hundreds of dollars for the honor of being a member.  Maybe for some, it'd be worth it.  But considering I'd never heard of them prior to the letter I was sent, I wasn't about to fork over cash to join.  The "services" they provide are publishing my info and connecting me to the other 250,000 notables.  Wait a sec.  Don't I get that and more for free using things like LinkedIn and Facebook? 

So if you get a letter from them, be forewarned.  Don't waste your time unless you intend to fork over a handful of cash for services you can get for free.

# Tuesday, 14 August 2007

Thanks to a sharp co-worker of mine, I was recently introduced to "Magic Ink: Information Software and the Graphical Interface," by Bret Victor.  It was quite an interesting read; Victor makes a lot of good points.  For instance, he suggests that we should view information software as graphic design, i.e., treating the concerns of traditional graphic design as paramount and then going a step further by availing ourselves of context-sensitivity, which he defines as inferring the context from the environment, history, and, as a last resort, interaction.

Minimizing Interaction

The thrust of the argument is around reducing interaction and making software smarter, i.e., more context aware and, eventually, able to learn through abstractions over learning algorithms.  I think we can all agree with this emphasis, but I do think he unnecessarily latches onto the term "interaction" as a bad thing, or rather, I think he presents "interaction design" in an overly negative light.

True, the smarter we can make computers (and the less interaction we consequently require from users), the better, but that doesn't negate the usefulness of interaction design, human factors, information architecture, and usability.  There are many valuable things to be learned and used in all of these interaction-oriented fields, and we shouldn't deride or dismiss them because they focus on interaction.  I felt that Victor's negative emphasis on this, and his speculation about why software sucks in relation to it, took away from the value of his overall message.

The Problem of Privacy

There is one problem that I don't think he addressed in terms of increasing environmental context awareness, and that is security--specifically, privacy.  It is tempting to think about how wonderful it would be for a computer to know more about our environment than we do and thus be able to anticipate our needs and desires, but in order to do this, we, as humans, will have to sacrifice some level of privacy.  Do we really want a totally connected computer to know precisely where we are all the time?  Do we really want it to be "reporting" this all the time by querying location-aware services?  Do we really want a computer to remember everything that we've done--where we've been, who we've interacted with, when we did things?

I think the trickier issues with context awareness have to do with questions like these.  How do we enable applications to interact with each other on our behalf, requiring minimal interaction from us, while maintaining our privacy?  How does an application know when it is okay to share X data about us with another application?  Do we risk actually increasing the level of interaction (or at least just changing what we're interacting about) in order to enable this context sensitivity? 

If we're not careful, we could end up with a Minority Report world.  People complain about cookies and wiretaps now; a world of computer context-sensitivity will increase privacy concerns by orders of magnitude.  This is not to negate the importance of striving towards greater context sensitivity.  It is a good goal; we just need to be careful how we get there.

Towards Graphic Design

One of the most effective points he made was in illustrating the difference between search results as an index and search results as a tool for evaluation in themselves, i.e., thinking about lists of information in terms of providing enough detail for a comparative level of decision making.  It is a shift in how developers can (and should) think about search results (and lists in general).

Similarly, his example of the subway schedule and comparing it to other scheduling applications is a critical point.  It illustrates the value of thinking in terms of what the user wants and needs instead of in terms of what the application needs, and it ties in the value of creating contextually meaningful visualizations.  He references and recommends Edward Tufte, and you can see a lot of Tufte in his message (both in the importance of good visualizations and the bemoaning of the current state of software).  I agree that too often we developers are so focused on "reuse" that we fail miserably in truly understanding the problems we are trying to solve, particularly in the UI.

That's one interesting observation I've had the chance to make in working a lot with graphic/visual designers.  They want to design each screen in an application as if it were a static canvas so that they can make everything look and feel just right.  It makes sense from a design and visual perspective, but developers are basically the opposite--they want to find the one solution that fits all of their UI problems.  If you give a developer a nicely styled screen, he'll reuse that same style in the entire application.  In doing so, developers accidentally stumble on an important design and usability concept (that of consistency), but developers do it because they are reusing the design for maximum efficiency, not because they're consciously concerned about UI consistency!  It is a kind of impedance mismatch between the way a designer views an application UI and the way a developer does.

The Timeless Way

I'm currently reading Christopher Alexander's The Timeless Way of Building, which I hope to comment on in more depth when done.  But this discussion brings me back to it.  In fact, it brings me back to Notes on the Synthesis of Form as well, an earlier work of his.  One of the underlying currents in both is designing a form (solution, if you will) that best fits the problem and environment (context).  The timeless way (and patterns and pattern languages, especially) is all about building things that are alive, that flow and thrive and fit their context, and the way you do that is not by slapping together one-size-fits-all solutions (i.e., reusing implementations) but by discovering the patterns in the problem space and applying patterns from the solution space that fit the problem space just so.  The reuse is in the patterns, at the conceptual level, but the implementation of a pattern must always be customized to fit the problem snugly.

This applies in the UI as well as other areas of design, and that's the underlying current behind both Tufte's and Victor's arguments for the intelligent use of graphic design and visualization to convey information.  You must start by considering each problem in its context, learn as much as you can about the problem and context, then find patterns that fit and implement them for the problem in the way that makes the most sense for the problem.  But more on the timeless way later.

A Good Read

Overall, the paper is a good, thought-provoking read.  I'd recommend it to pretty much any software artisan as a starting point for thinking about these issues.  It's more valuable knowledge that you can put in your hat and use when designing your next software project.

# Monday, 30 July 2007

Are you passionate about software development?  Do you love to share your knowledge with others?  Do you like working in a vibrant, fun culture working on the latest and greatest technologies with other smart and passionate people?  If so, I think I may have your dream job right here.

We're looking for another guidisan to help craft guidance using best practices for .NET development.  The word guidisan ('gId-&-z&n) comes from a blending of "guidance" and "artisan," which really speaks to the heart of the matter.  We're looking for software artisans who have the experience, know-how, and gumption to explore strange new technologies, to seek out new applications and new user scenarios, to boldly go where other developers only dream of going in order to provide deep, technical guidance for their colleagues and peers.

What do guidisans do? 

  • Help gather, specify, and document application vision, scope, and requirements.
  • Take application requirements and create an application design that meets the requirements and follows best known practices for both Microsoft .NET and Infragistics products.
  • Implement applications following requirements, best practices, and design specifications.
  • Create supplemental content such as articles, white papers, screencasts, podcasts, etc. that help elucidate example code and applications.
  • Research emerging technologies and create prototypes based on emerging technologies.
  • Contribute to joint design sessions as well as coding and design discussions.

What do I need to qualify?

  • Bachelor’s Degree.
  • 4+ years of full-time, professional experience designing and developing business applications.
  • 2+ years designing and developing .NET applications (UI development in particular).
  • Be able to create vision, scope, and requirements documents based on usage scenarios.
  • Demonstrated experience with object-oriented design; familiarity with behavior-driven design, domain-driven design, and test-driven development a plus.
  • Demonstrated knowledge of best practices for .NET application development.
  • Accept and provide constructive criticism in group situations.
  • Follow design and coding guidelines.
  • Clearly communicate technical concepts in writing and speaking.

If you think this is your dream job, contact me.  Tell me why it's your dream job and why you think you'd be the next great guidisan.

# Sunday, 15 July 2007

Thanks to all who came to my "suave sessions" session yesterday at Tampa Code Camp.  Now you're all "it getters," and you get some free code, too.

Download the Session Management Code

Enjoy!

# Tuesday, 29 May 2007

As a subscriber to First Things and Touchstone, I know that musings upon the compatibility or incompatibility of Christian faith with evolutionary theory are not in short supply.  Neither, of course, are the unceasing dialectics on the truth or falsity of evolution, including all the usual suspects and alternatives.  But this may come as a surprise to those who only follow these topics as flare-ups occur in the national media--there are no assertions being made in these thoughtful magazines that Catholics, or any Christians, must adhere to literal creationism.

The authors writing for these magazines are well-known names in general (the Pope himself wrote a recent article, as did Supreme Court Justice Scalia), and First Things in particular hosts some of the icons in the evolutionary debate.  These aren't oddball nobodies, and I only say that to quell any imaginings that the voices in these magazines are on the sidelines--these are people of note and, in many cases, authorities in the fields upon which they're opining.

The most recent discussions have been around the feasibility of the formal notion of "intelligent design," which as I understand it revolves around arguments against chance-based evolution due to irreducible complexity in organisms, such as the eye.  One may note that this is not an argument against all evolution, nor is it an argument for literal (six literal days as in the biblical account) creationism.  I am not going to say much more on the details of the theory because I'm not qualified and would probably get it wrong anyways.  My point is simply that there are respectable Christian positions in the evolutionary debates that are not the oft-touted literalist creationism.

This came up recently for me at work.  We were brainstorming ideas for visualizing something, and the idea of evolution came into play, so I tossed out, half-joking, that we should show a fish crawling out of water and turning into a monkey or something like that.  Rather joltingly, a co-worker blurted out "hey, I thought you didn't believe in that," to which I, dumbfounded that my beliefs were brought up in that context but more annoyed at the misconception of my beliefs, just stared, smiled, and moved on.

Being busy like we are, I had to set it aside and just focus on what needed to be done, but now that I have the luxury to propound what I actually believe (in what I think is a proper medium and place for such exposition), let me say that Christian faith does not presuppose literal creationism.  In fact, it doesn't even preclude strict evolutionary theory, biologically speaking.

This message still seems newsworthy; the popular misconception of the Christian being the ostrich with his head in the sand in regards to evolution (and science in general) is still in force, as evidenced by my co-worker's remark.  This doesn't surprise me; I still recall reading the headline "Pope Says Evolution Compatible with Faith" back at university in 1996 (before I became Catholic myself).  It made an impression on me because I was in fact raised in the milieu that evolution is inimical to the faith, so there is definitely some truth to the stereotype.  Plus, the literal creationists tend to be the ones who make the most noise and controversy, which is likely why the stereotype exists and persists.  So I have to be patient and understanding with those who hold the stereotype, but I also want to do what I can to dispel it--to make some noise of my own.  Sadly, "Catholic Software Creator Says Evolution Compatible with Faith" doesn't promise to make much noise, but I can try.   

As this article mentions, Catholicism has long been reconciled to the possibility of an evolutionary biological mechanism in nature.  Despite the ever-popular sensationalizing of the Inquisition and the Church's treatment of Galileo, Catholicism has a very positive view of reason and science.  Philosophy and learning have ever been a bulwark of Catholic (Christian) faith.

For example, St. Justin Martyr, an early second-century Christian (as in less than 100 years after the Christian Church was founded), championed the idea that there is truth and wisdom to be found in non-Christian learning.  He specifically builds on St. John the Apostle's (the disciple of Jesus and author of several New Testament books) description of Jesus as the "Word" (i.e., Logos, which is Greek for the faculty of reasoning) of God, the Word made flesh.  This passage has been the basis for much deep theological reflection over the millennia, and St. Justin is just one of the earliest examples of the friendship of Christian faith and reason.

One only has to lightly peruse a book on the Fathers of the Church, the discussions and resolutions of Church councils, or a handbook on medieval scholasticism to see that from its very origins and consistently throughout its 2000-year history, Christian faith has been deeply rational and embracingly friendly to learning.  The first universities were Catholic, and many of the greatest thinkers throughout Western history have been Catholic, including several of our current Supreme Court justices.

The exceptions to this friendship have occurred only when there is a perceived threat to the faith, and in those cases, it is not a fear of science per se but rather a sincere and generally well-founded concern for souls.  While I think it is true that this concern was misdirected and even abused at times, the point remains that it is not a general enmity for science or learning that animates those actions we see as negative but rather an overreaching of the pastoral impulse--to protect souls at all costs, even at the cost of the body or of freedom.  It is hard to understand the medieval mind on this point because our society today is different in very dramatic ways,  but that particular point is subject enough for multiple books (and numerous books have indeed been written, such as Characters of the Inquisition). 

The issue here is that even the sensationalist examples that are usually used to support the assertion that Christianity (and, in particular, the Catholic Church) is anti-rational, anti-science, and/or anti-learning just don't hold up.  In fact, they're patently false, at least for Catholicism.  There are some branches of Christianity, particularly the Protestant fundamentalist ones, that may live up to the stereotype, but the vast majority of Christianity in general, and Catholicism specifically, embraces and has embraced learning that does not directly come from Divine Revelation.

The point at which we depart from a secular approach to learning is the point at which it becomes irreconcilable with Divine Revelation.  And it is, in fact, this point which is the crux when a Christian is bound to deny some scientific theory.  Evolution in particular has long been bound up with an underlying materialist philosophy, and it is this philosophy, rather than the biological theory of evolution, that a Christian should reject.  The essential problem of the materialist evolutionary philosophy is the underlying assertion that "this is all there is," i.e., that the material world is all there is, that there is no spiritual reality and, correspondingly, no Supreme Spiritual Being (God).

The popular view of evolution is imbued with this philosophy, and that is why there has been (and remains) so much debate between Christians and non-Christians around evolution (excepting, of course, the literal creationists, who object to anything but a literal interpretation of the creation account).  Those who believe in evolution are stereotypically also materialists because, theoretically, evolutionary processes free one from having to believe in a creator.  If we are, after all, just the product of chance mutations over millions of years, what need have we for a God to have created us?  This thinking extends into cosmology, where the study of physics enables us to theorize about a universe that either always has been and/or continually recreates itself.  Freed from a physical or biological need for God, those who desire to reject Him now seemingly have a scientific basis to do so.

Historically, Christians have (and rightly so to some extent) seen these scientific theories as inimical to Christian faith.  The key lies in disentangling the materialist philosophy from the biological theory of evolution and from theories pertaining to the formation of the cosmos.  Inasmuch as a theory does not entail the rejection of Christian faith (which does include God's creation of the cosmos, including humans), Christians are free to believe it.

In the case of evolution, if you don't read the creation account strictly literally, it is conceivable that God could have created the world and in a manner that accords with the theory of evolution, i.e., using natural mechanisms that he built into the fabric of the universe.  The key moment of creation, inasmuch as man is concerned, comes with God's "breathing life" into us. 

It is in fact oddly believable that God did use evolution, allowing our human form to develop until the point at which he imbued us with spiritual life.  This would explain the seemingly sudden generation of civilization from what we think of as pre-history.  It could allow for the development of other physically similar, human-like species that ultimately died out.  The creation account certainly follows something of an evolutionary account from the creation of the cosmos, to the formation of the earth, to the growth of vegetation, to the animal life originating in the seas, then the air, then on land, and then ultimately humans whom he gave the "breath of life." He did not breathe on the other creatures that were also alive, so clearly the creationary moment for man was not the giving of physical life but of spiritual, and it is this that makes us different from the animals--our spiritual, God-like (we were made "in his image") nature.

Divine Revelation is even less specific about the creation and nature of the universe, so many of the theories about the universe that cosmology proposes are acceptable--as long as the universe can ultimately be held to be a creation of God.  I think we could even say that a universe that keeps recreating itself in time could be synthesized with Christian faith because there is still room for God to have set this self re-creation in place.  Even an eternal universe could be conceived of, as long as the quality of "eternal" is understood to mean existing as long as time has existed.  In other words, it is possible to conceive of God's eternal nature as such that he existed prior to the creation of time, that he is "eternal" in the sense that he is outside of time, so that temporal terminology and thinking doesn't apply (is absurd) when speaking of him apart from how he interacts with time as a created thing.  Thus, in one sense of the word "eternal" (existing as long as time has existed and continuing to exist as long as time does), the universe could be eternal without denying that God created the very concept and reality of "eternal," because he himself is "eternal" in the sense that he exists outside of time.

The point is not so much to theorize about what is or is not the truth in terms of the creation of the universe and man but rather to illustrate how Christians can faithfully accept what science has to offer.  I should note that although Christian faith can be compatible with evolution and theories about the universe in general, we are under no compulsion to adhere to any of these particular scientific theories. 

I am often amazed at what seem to be boundless extrapolations (from the specifics of dinosaurs to evolution to the creation of the universe), but I am more amazed that popular society seems to accept them all without any critical thinking.  I for one remain uncommitted to these theories; I retain the same healthy skepticism for them that many reserve for propositions about God.

For me, God is much more real, more verifiable than the theory of evolution or the big bang, and I also happen to think that my relationship (or lack thereof) with God has a much greater potential impact on my personal happiness (and those around me).  Therefore, I think it is a far better use of my time to invest in my spiritual life than worrying about whether or not I share 97% of my genes with a chimp.  Seems logical and reasonable to me. :)

To wrap things up, Christian faith is not at enmity with reason or even with material science--it cannot be--because, as St. Justin Martyr highlighted, truth is truth and can be found outside of Divine Revelation in non-Christian philosophy and the material sciences.  Where there is truth, we should embrace it.  Where it seems to conflict with our faith, we should strive to understand how it does not.  Science, when understood correctly, can only serve to enhance our faith, for as our understanding of the amazing complexity and beauty of the material world increases, so should our amazement at and love for our Creator increase.   Our faith should complement and enhance our learning.  Like cocoa without sugar is bitter, so is learning without faith (Eccl. 1:18; a.k.a., "ignorance is bliss").  However, when we combine faith with our learning, we get something joyous, sweet, and delicious.

# Monday, 21 May 2007

If you haven't heard about it yet, Jeffrey Palermo is throwing another Party with Palermo, one of his pre-conference bashes, at TechEd 2007.  These are always a lot of fun--free food, stuff, and fun people.

This year, Infragistics is sponsoring, which means you get free food and drink on us, we'll be there to chit-chat, and of course, we'll be giving away some really cool swag.

So if you'll be in town Sunday night, you should definitely stop by and hang out for a bit.  It's THE place to be on Sunday night. :)


Disclaimer
The opinions expressed herein are solely my own personal opinions, founded or unfounded, rational or not, and you can quote me on that.

Thanks to the good folks at dasBlog!

Copyright © 2017 J. Ambrose Little