Geek of the Week – Ken Henderson

An interview with Ken Henderson

by Douglas Reilly

I have never met Ken Henderson, but I have followed his career, read his books, and even reviewed one or two of them for Dr. Dobb’s Journal. When I first saw Ken’s The Guru’s Guide to Transact-SQL, I almost did not pick it up. As a general rule, I don’t pick up the guru’s guide to anything. I’m glad I overcame this reluctance, since the book is little short of amazing.

Ken surprised me again with his latest book, The Guru’s Guide to SQL Server Architecture and Internals. About 400 pages into it, I was impressed by the introduction to what is involved in creating a server-based application, but wondered about its inclusion in a SQL Server book. The next 400 pages made it clear that understanding how servers work on Windows is critical to understanding the implications of how SQL Server runs.

The fact that I was impressed with the book’s server coverage is particularly noteworthy, as I have written a book on server-based applications myself!

Ken is a consultant who often works for high-profile customers, including the U.S. Navy and Air Force, as well as H&R Block. In addition to books, Ken writes articles covering all aspects of SQL Server, especially the low-level details that can trip up all but the most intense database geek. A recent MSDN article of Ken’s, and one that he is uniquely qualified to write, covers the perils of fiber mode. It is a rare person who is at home with both the intricacies of Transact-SQL and the details of Win32 I/O completion ports. Rarer still is someone who has the ability to communicate this sort of information to his readers.

The question-and-answer session with Ken that follows was conducted via email.

Doug: For years interviewers have tried to ask edgy questions. This being a geek-related interview, I ask you, “Stored procedures or ad-hoc SQL? And why?”

Ken: I favor stored procedures for several reasons. First, they generally perform better. Stored procedures encourage plan reuse and leverage the optimizer’s ability (I work mostly with SQL Server these days) to create efficient execution plans for carrying out user requests. Second, stored procedures encourage modularity in applications. You can update a stored procedure independently of the applications that use it, as long as you keep its interface (what it expects and what it returns) the same. You can’t do that with ad hoc SQL. Third, stored procedures are more secure. Many SQL injection attacks occur because people use ad hoc SQL where they should instead use parameterized stored procedures.
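To make the injection point concrete, here is a minimal sketch contrasting concatenated ad hoc SQL with a parameterized stored procedure. The table, column, and procedure names (Customers, LastName, usp_GetCustomerByName) are hypothetical, invented for illustration rather than taken from Ken’s own code.

```sql
-- Ad hoc, concatenated SQL: the user-supplied value becomes part of the
-- statement text, so input such as  ' OR 1=1 --  rewrites the query itself.
DECLARE @name varchar(50), @sql varchar(500);
SET @name = ''' OR 1=1 --';
SET @sql = 'SELECT * FROM Customers WHERE LastName = ''' + @name + '''';
EXEC (@sql);   -- returns every row instead of one customer
GO

-- Parameterized stored procedure: the value is always treated as data,
-- and the procedure's plan can be reused across calls.
CREATE PROCEDURE usp_GetCustomerByName
    @LastName varchar(50)
AS
    SELECT CustomerID, FirstName, LastName
    FROM Customers
    WHERE LastName = @LastName;
GO

EXEC usp_GetCustomerByName @LastName = 'O''Brien';
```

In the ad hoc version the user-supplied string becomes part of the statement itself; in the procedure, @LastName is only ever treated as a value.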

Doug: When you program in .NET, do you favor VB.NET, C#, or some other language?

Ken: C#, hands-down. As a longtime C/C++ developer, I am at home in C#. It is really just another member of the C family of languages, and one that I wish had existed years ago. To a veteran C++ coder, it feels very much like a cleaned-up C++. Another reason I like it is because it was designed by Anders Hejlsberg, one of the top scientists in the industry, in my opinion. During his time at Borland, Anders also designed Delphi and every version of Turbo Pascal, so he is in my own personal pantheon of heroes.

For obvious reasons, people like to compare C# to Java, but I have a different perspective. As a longtime Delphi aficionado, I see a lot of similarity between C# and Delphi’s Object Pascal language. I see a lot in common between the .NET Framework and the Delphi Visual Component Library. In fact, in many cases all you would need to do to translate a typical app between Delphi and C# is a series of simple find-and-replace operations. They’re that similar. Given that the same person designed both Delphi and C#, and given Anders’ influence on the Framework itself, it’s not that surprising, but it’s something that I don’t think many people are aware of.

Before C#, I don’t think Anders’ work got the attention it deserved. Delphi was extremely popular for a year or two after its release, but it was soon overshadowed by Java and newer releases of the Microsoft language tools. It’s good to see Anders finally get the credit he deserves, and it’s good to see him in a position to help so many more developers with his unique take on what constitutes software elegance.

As for VB, I have to confess that I have never really liked the language. It has its place, I suppose, but by the time VB became really big, I was already using what I considered to be a superior general-purpose Windows language tool, Delphi, and have never looked back. In fact, until C# came out, I considered Delphi to be the best all-around development environment for Windows application development, bar none.

Until VB.NET, the Visual Basic language lacked true inheritance or polymorphism, and what OOP inclinations it had seemed to be mostly an afterthought. That all changed with VB.NET, though: it’s a full-blown, powerful OOP language in the same way that the other CLR languages are. That said, I don’t see any real reason to use it over C# in my own work, and I see several things that make me wonder whether that would even be wise.

For one thing, it is still weighed down with the historical baggage of the Visual Basic language. Things that take a single keyword in C# or J# take multiple keywords in VB.NET. Things that are consistent and orthogonal in C# are uneven and sometimes even a bit unwieldy in VB.NET. From my perspective, due to the necessity of maintaining some amount of compatibility with VB6 and earlier, VB.NET is not nearly as lean or as elegant a language as C#. I believe this also makes it more difficult to use.

Ironically, C#, the scion of the language with the most historical baggage of them all (C), is a simpler and more concise language than VB.NET because it was able to more or less start with a clean slate. VB.NET, on the other hand, had a lot of external forces pressuring it to meet some arbitrary standard of backward compatibility and forcing things on it that would have been better left behind. Sadly, the very things that were bolted on to it to make older code easier to port may eventually be its undoing if they make it so onerous to use that C# becomes a more attractive option for people migrating apps from VB6.

Doug: Do you do much unmanaged code these days? That is, non-.NET, Win32 code?

Ken: For unmanaged code, I work pretty much in C++. Occasionally, I still get back to Delphi, but not as much as I’d like.

Doug: There have been a number of blog posts, even petitions, to bring back official Microsoft support for VB6. What are your thoughts on this?

Ken: It’s really a shame that such a cottage industry grew up around VB6 and earlier and that so many people are now left with code that is difficult to port due to the language being cleaned up with VB.NET. Many of the changes to the language that have come out in VB.NET were long overdue but, of necessity, break existing applications. I’m all for the cleanup and the changes I’ve seen in VB.NET and, if anything, I would have taken them further.

I’m actually fonder of the Visual Basic language now than I have ever been before. And I don’t think I’m alone in that. A lot of experienced Delphi and C++ developers (people traditionally outside the VB camp) have taken a second look at VB.NET and come to the conclusion that it is, at long last, a real OOP language and that it’s just as powerful as many others out there. What remains to be seen is whether there is a compelling reason for people like me to use VB.NET over C# and whether people porting code from VB6 would be better served going straight to C#.

I know a lot of people are feeling some pain due to the differences between VB6 and VB.NET, but extending support for VB6 is not the way to fix those things. Microsoft is already doing things to bring disaffected VBers back into the fold, which is a good thing, and I think the VB community has to take some responsibility itself.

The wisest course would be for VBers to look at the language more holistically, outside the context of whether the changes break existing apps or alter the way they’re used to thinking about things. They should look at the evolution of the language as a whole and ask themselves whether the changes are best in the long run. In most cases, an informed and honest answer to this will be “Yes.” And I think if we can get on the same page vis-à-vis whether or not the changes represent what’s best for the language as a whole, we can then proceed with how best to migrate old apps to it.

Doug: How did you start out working on databases? What was your first database-related job?

Ken: I started working on databases on old mainframes and midrange computers before the PC existed. I guess the first real relational database I used was on the IBM System/38. I had written code for a few machines before that, but they had an assortment of non-relational data management systems such as flat-file DBMSs and hierarchical databases. My first database-related job was writing COBOL, BASIC and RPG for the System/38 in the early 80s.

Doug: How did you learn about databases?

Ken: Having written code for many non-relational databases, I was instantly struck by the logic, efficiency, and the all-around general coolness of the relational database the first time I coded for one. I was hooked. The System/38 was then, and still is, a great machine for its intended user base. Its successor, the AS/400, only furthered that. Believe it or not, it had features 20 years ago that DBMSs like SQL Server are just now getting.

When the PC came out, I began dabbling in PC programming and eventually in PC databases. The first one I worked with was dBase II, an old floppy-based DBMS from Ashton-Tate. (Years later, Ashton-Tate would partner with Sybase and Microsoft to help launch the first release of SQL Server for the PC.) dBase II had a lot of features for a PC program but was relatively slow because its programming language was interpreted. It was also prone to corruption because it was a PC app that often stored data on floppy disks.

From dBase II, I moved to dBase compilers such as WordTech’s db2Compiler and, later, Quicksilver. Quicksilver produced native machine code, but was so tedious to use (it had some five separate “compilation”-type stages) that I wrote my own RAD-style development environment for it. That tool became the Cheetah Integrated Programming Environment, which was a popular multi-language development environment for the PC for a few years, and was especially popular for database application development.

I guess I’ve learned most of what I know about databases by building database applications and managing database systems. I’ve read plenty of books on the subject, but when I came through school, there wasn’t much in the way of good instruction on databases, let alone PC databases. By being there as the PC database evolved, I was able to evolve with it. When the industry went from flat-file databases to network DBMSs, I was there. When it went from network DBMSs to client/server DBMSs, I made the move. When the industry went from client/server database applications to multi-tier apps, I was there as well. And when we went from multi-tier databases to disconnected databases and XML-driven data management, I made that jump too.

Doug: Have you done much with XML?

Ken: Well, I wrote a book about it. Seriously, my second SQL Server book, The Guru’s Guide to SQL Server Stored Procedures, XML, and HTML (Addison-Wesley, 2001), covered a lot of the types of work I’ve done with XML. I’m a big fan of it, and rarely a day goes by that I don’t use it in my work in some way or another.

Doug: You currently use Microsoft SQL Server. How did you get to that database?

Ken: After I sold Cheetah and the company I had founded to market it, Software Springs, I returned to building database applications. I had been looking at client/server systems for a few years and was particularly intrigued by one called Emerald Bay, which had been designed by Wayne Ratliff, the original designer of dBase II. Unfortunately, the company that was marketing Emerald Bay ran out of money, and that software never gained much traction in the industry.

Next I took a look at Sybase, the first RDBMS to feature a cost-based optimizer, but, at the time, it ran exclusively on non-PC platforms, mostly UNIX, so I didn’t do much with it. I also did some beta testing of a product called Oracle dbXL, the result of a joint venture between WordTech and Oracle. It used WordTech’s dBase-clone as its front end and Oracle as the back end. Although the product itself had many problems, I could see a lot of potential in the client/server model and hoped to soon be building real apps on it.

Around this time, in the late 80s, Sybase partnered with Microsoft and Ashton-Tate to bring its SQL Server product to the PC running on OS/2. I was doing a consulting gig at the time for a client that was contemplating bigger hardware, such as an AS/400 or a DEC machine, to host what was then considered a huge database: approximately 200GB over its lifetime. I recommended that they give this new PC-based product a chance and did a proof-of-concept project for them that had one expected result and one not-so-expected result.

The first result was that they bought into the idea of hosting their data on the PC and gave the green light to using SQL Server, then at version 1.1. That wasn’t that much of a surprise to me. At least on paper, the PC-based solution would cost a fraction of anything implemented on bigger hardware. The thing that surprised me, though, was how much I enjoyed the new platform. I loved it. I instantly got the utility and elegance of the client/server model. I could see so much promise in building rich client apps that deferred to another special-purpose box to handle database work. We got started on the new project in January 1990, and I’ve never looked back.

Doug: What sort of work are you doing these days?

Ken: I wrote the new SQLDiag tool that is coming out in SQL Server 2005. Previous users of the PSSDiag tool will recognize it: PSSDiag was used across Microsoft’s SQL Server Support organization for collecting SQL Server-related diagnostic data. The new tool replaces the old SQLDiag utility and will be included in the box with SQL Server 2005.

Doug: Have you had any chance to work with the betas of Microsoft SQL Server 2005? What do you think about possibly using VB.NET or C# for stored procedures?

Ken: Yes, I’ve worked some with the SQL Server 2005 betas. I’m excited about the prospect of having CLR support in the database engine, but worry about the supportability of it. I shudder to think of some of the support scenarios customers may find themselves in when they begin mixing CLR code, T-SQL, COM objects, managed code, interop, and who knows what else in complex applications. Some of the problems that may arise may be nigh impossible to solve. That said, it’s still an exciting addition to the SQL Server stable of features, and I welcome it.
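For readers who have not tried the betas, here is a minimal sketch of what the T-SQL side of CLR integration looks like in SQL Server 2005. The assembly path, assembly name, and class and method names below are hypothetical, invented purely for illustration.

```sql
-- Enable CLR integration (off by default in SQL Server 2005).
EXEC sp_configure 'clr enabled', 1;
RECONFIGURE;
GO

-- Register a compiled .NET assembly (name and path are hypothetical).
CREATE ASSEMBLY MyClrRoutines
FROM 'C:\clr\MyClrRoutines.dll'
WITH PERMISSION_SET = SAFE;
GO

-- Expose a static method from that assembly as a stored procedure.
CREATE PROCEDURE dbo.usp_DoWork
AS EXTERNAL NAME MyClrRoutines.[StoredProcedures].DoWork;
GO
```

PERMISSION_SET = SAFE is the most restrictive option; the looser settings (EXTERNAL_ACCESS and UNSAFE) are exactly the kind of mixing-and-matching that feeds the supportability worries Ken describes.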

Doug: I independently “discovered” markup languages long after Tim Berners-Lee developed HTML (http://infomesh.net/html/history/early/). My markup language involved lots of @ signs and was not terribly device independent; I hard-coded things that should not have been. Have you ever “invented” anything, only to discover that something better already exists?

Ken: That’s happened to me, too! And with an HTML-like markup language I developed, no less, so I know how you feel! When this happens to me, I usually think: “It must be a fair idea if someone other than me, especially someone I respect, arrived at it independently,” and “Gee, I wish I’d acted on that sooner so I could have made some money from it!”

Doug: Have you recently read any good database-related books or general programming books? Are you in the process of writing any developer-oriented books?

Ken: I’m always reading. I recently re-read Bruce Eckel’s “Thinking In…” books (http://www.mindview.net/). Bruce is such a master and a pleasure to read. I also like Don Box and Jeffrey Richter, also masters in their own right. And I’m always working, either in my head or literally, on my next book. I’ll have some new stuff out in the near term, though exactly what or when I can’t say.

Doug: Can you think of a particularly cool tip or trick that many database developers don’t know about?

Ken: With a few sp_OA calls, you can make use of many COM components within Transact-SQL. Many of these components are already installed on the typical box. For example, VBScript features a powerful regular-expression object, RegExp. By accessing it from T-SQL using sp_OA calls, you can write queries that feature regular expressions in their WHERE clauses. I explain how to do this in my latest book, The Guru’s Guide to SQL Server Architecture and Internals (Addison-Wesley, 2003).
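As a sketch of the general technique (not the code from Ken’s book), a scalar UDF can wrap the sp_OA calls so that a regular-expression test is usable in a WHERE clause. The function name, the pattern, and the pubs sample-database query are illustrative.

```sql
-- Illustrative sketch only: wrap the OLE Automation calls in a scalar UDF
-- so a VBScript regular expression can be evaluated per row.
CREATE FUNCTION dbo.fn_RegExMatch
(
    @pattern varchar(255),
    @input   varchar(8000)
)
RETURNS bit
AS
BEGIN
    DECLARE @obj int, @match int;
    SET @match = 0;

    EXEC sp_OACreate 'VBScript.RegExp', @obj OUT;   -- instantiate the COM object
    EXEC sp_OASetProperty @obj, 'Pattern', @pattern;
    EXEC sp_OASetProperty @obj, 'IgnoreCase', 1;
    EXEC sp_OAMethod @obj, 'Test', @match OUT, @input;  -- run the match
    EXEC sp_OADestroy @obj;

    RETURN CASE WHEN @match <> 0 THEN 1 ELSE 0 END;
END
GO

-- Hypothetical usage against the pubs sample database:
SELECT au_lname, phone
FROM   pubs..authors
WHERE  dbo.fn_RegExMatch('^415 ', phone) = 1;
```

Each call creates and destroys a COM object, so this is handy rather than fast, and error handling via the sp_OA return codes is omitted for brevity.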

Doug: What do you do when you’re not working?

Ken: I don’t watch much TV, so music and books are my usual getaway. I have my CD collection ripped into my XP Media Center Edition machine, so I’ve been known to spend hours mellowing out to either my own collection or the vast archives of Napster and MSN Music.

Do you know someone who deserves to be a Database Geek of the Week? Or perhaps that someone is you? Send me an email at editor@simple-talk.com and include “Database Geek of the Week suggestion” in the subject line.