Last week, I was in a bookstore in New York City and saw for the first time Glenn Johnson’s new book, Programming Microsoft ADO.NET Applications – Advanced Topics. Virtually all of my work involves databases, and virtually all of my work uses ADO.NET, and so picking up Glenn’s book was an easy decision. Glenn runs Glenn Johnson Technical Training, and has more certifications than you can shake a stick at.
The following questions were asked by me and answered by Glenn via email.
Doug: I picked up a copy of your book, and immediately saw that it will be indispensable in the work that I do. I noticed that, like me, you write books as a solo author – and this is a pretty big book for a single author. How do you cope with the ever-shrinking time constraints that publishers place on technical authors these days?
Glenn: Well Doug, I like writing as a solo author because I have full control and take full responsibility for the book. Yup, I’m a control freak. That’s not to say that I’ll always write solo, but I prefer solo to multi-author books because I think that with multiple authors, it’s difficult to produce a book that has a consistent style and flow.
Regarding time constraints, writing a book certainly has the ability to suck the life out of a person for several months; it takes discipline.
Doug: How did you end up doing ADO.NET related work? I see you also work with ASP.NET. Did working on data driven web sites drive you into doing database work?
Glenn: Actually, I have been working with database products since dBASE II/III, FoxPro, and Clipper. When MS Access came out, I made the jump for a short period, but then I found SQL Server 6.5, sometime in 1996. In 1997, I passed the SQL 6.5 exam, and I was the primary developer for an extension to a data-driven Windows application.
This is where I really learned the benefits of transactional, SQL based, database products. I really didn’t get into Internet development until 1998ish.
Doug: You are a Microsoft Certified Professional, with many different Microsoft certifications. What do you think of the Microsoft certification programs?
Glenn: I always get this question when I’m teaching, so here’s my answer. First, I love certifications. Certifications don’t replace experience, but they do provide goals that are measurable and obtainable. If you’re competing for a job, the certifications on your resume won’t necessarily get you the job, but they’ll at least get you past the first round of filtering.
I have taught classes at companies where the manager demanded that I refrain from talking about certifications. The company was paying for the employees to learn what they needed to do their job, and the manager didn’t want the employees to get certified and start looking for a new job. So here’s the question: after a person is trained, is that person worth more money? I think so. So the employer should also budget a raise for the employee that goes hand in hand with the training. This will keep the employee from racing off to find a better opportunity.
Think about this: should a company make a person sign a contract that states that if the person gets training, that person cannot leave the company for a year? I’ve seen this, but I personally wouldn’t agree to this type of agreement. Remember that getting training is a benefit for the company. If it wasn’t, you can be assured that it wouldn’t happen.
In a nutshell, here is what I tell students about certification: it’s something that you own, that you can take with you when you leave the company. It will also help you get the interview; after that, you have to show your experience and, yes, you need to have the ability to sell yourself.
Doug: As far as I can tell, you have thus far resisted creating a blog. Any reason in particular?
Glenn: Funny you should ask, Doug. I am right in the middle of creating a new Web site, based on DotNetNuke, and I’m trying to decide if this site will contain a blog. I like blogs, but I certainly understand that a good blog is one that is updated regularly. My schedule has been too hectic, but who knows, maybe you’ll see a blog after I finish my current book project, MCTS/MCPD Self-Paced Training Kit (Exams 70-528): Developing Web Applications (available mid-2006).
Doug: Do you regularly read any blogs?
Glenn: I’m a big fan of the DataWorks blog, but I’m afraid that I only get brief opportunities to peek at this blog.
Doug: I am in the process of writing an article on the role of technical books in the overall career development of a software developer. Do you think there is still a place for books in this world of ubiquitous Internet access, blogs and so on?
Glenn: I think that there is a place for the Internet, blogs, and development books. There are plenty of people who simply refuse to read large articles on their computer. For them, the book is the answer. Also, many people like the ability to take the book to places where there is either no Internet access (I’m not sure where that is these days) or no power outlets (like at the airport).
I think that the Internet is a place to go to for specific articles, but a book typically provides a means for more structured learning of a variety of topics.
Doug: Have you read any good database-related books lately? Any good, general software development books?
Glenn: No. Actually, I have been on a Dan Brown kick. After reading The Da Vinci Code, I read Angels and Demons, Digital Fortress, and Deception Point. With a busy schedule, these books provide some escape. I have to admit, though, that I “read” these as audio books :-)
Doug: Reading your latest book, you give what I consider is a great example of a task best done using SQLCLR stored procedures (cloning large objects). Can you briefly give that example here?
Glenn: Well, Doug, I don’t know if I can do this briefly, but here goes. Picture the following scenario:
You have an application that is very object-centric, perhaps because it uses a database that can store and retrieve objects natively – for example, a Smalltalk application that uses a GemStone/S object database server. This application needs to be ported to C#, and the database needs to be moved to SQL Server.
The application maintains very complex technical sales quotes, where over 100,000 data objects are used to calculate a typical quote. Also, this application contains its own language, which allows the users to enter complex formulas and perform “what-ifs” when working on a given quote. The point is that a quote is a very complex object with tight relationships between all of its data objects.
Quotes are rarely created from scratch. Instead, the application has a complex, multipurpose cloning framework that is used to copy a quote; the copy is then modified to create a new quote. This cloning process requires the application to load the quote onto the user’s machine, clone it, and save the new object back to the database. This is where the problem surfaces. In the C# application, the quote object is represented as a DataSet, and a DataAdapter is used to send the data back to the database. The DataAdapter works on a row-by-row basis, so this translates to 100,000 INSERT statements hitting the database, which is terrible for performance.
So here’s an idea. Why don’t we perform the cloning in SQL Server instead of moving the data back and forth? In a relational environment, we use stored procedures to perform operations such as bulk pricing updates, which is faster than moving the data to the client, modifying it, and sending the data back to the server. Why not use the SQLCLR to do the same thing? Makes sense to me.
Of course, the next question is: “Why should we use the SQLCLR as opposed to just using T-SQL statements in a stored procedure?” The cloning framework already exists in C#, so it can be moved into the SQLCLR relatively easily. Also, the cloning framework cannot be eliminated from the client, because there are many situations where cloning takes place using client-side data. Given the complexity of the cloning framework, it would be very difficult (although not impossible) to rewrite in T-SQL, and maintaining parallel T-SQL and client-side C# implementations would be painful. I envision creating a common component that could be installed at the client and in the SQLCLR.
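To make the idea concrete, deploying such a shared C# component into SQL Server 2005 might look roughly like the following T-SQL; the assembly path, class name, and procedure signature here are hypothetical, not Glenn’s actual code:

```sql
-- Catalog the compiled C# cloning component inside SQL Server
-- (assembly, path, and names are illustrative only).
CREATE ASSEMBLY QuoteCloning
    FROM 'C:\Deploy\QuoteCloning.dll'
    WITH PERMISSION_SET = SAFE;
GO

-- Expose a CLR method as a T-SQL stored procedure.
CREATE PROCEDURE dbo.CloneQuote
    @quoteId int,
    @newQuoteId int OUTPUT
AS EXTERNAL NAME QuoteCloning.[QuoteCloning.Cloner].CloneQuote;
GO
```

A client could then clone a quote with a single round trip (EXEC dbo.CloneQuote @quoteId = 42, @newQuoteId = @id OUTPUT) instead of pushing 100,000 rows across the wire.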
I’m rambling, and that certainly wasn’t brief, but hopefully you got the general idea…
Doug: Security in the SQLCLR environment will be critical in many environments. What do you think of how security is implemented for SQLCLR assemblies?
Glenn: I like all of the security changes that have been implemented in SQL Server 2005, including some of the last-minute changes that Microsoft made to the SQLCLR, which require the TRUSTWORTHY database attribute to be set before an assembly’s permission set can be raised to either EXTERNAL_ACCESS or UNSAFE.
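For readers who haven’t run into this yet, the relevant statements look something like the following; the database and assembly names are hypothetical:

```sql
-- EXTERNAL_ACCESS and UNSAFE assemblies will not run unless the database
-- is marked TRUSTWORTHY (or the assembly is signed with a key that has
-- been granted the corresponding permission at the server level).
ALTER DATABASE SalesDb SET TRUSTWORTHY ON;

ALTER ASSEMBLY QuoteCloning WITH PERMISSION_SET = EXTERNAL_ACCESS;
```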
Doug: Any suggestions for improving ease of use for Visual Studio V.next for any developers who might be reading?
Glenn: Well Doug, when I was working on the ADO.NET book, I turned in 54 bug and suggestion reports. One item that comes to mind is the behavior of the new Class Diagrams. If you right-click on a class and click “Add Derived Classes”, only the classes that are in your current project will be added. I find this to be a problem because there are items in the Framework Class Library for which I may want to create diagrams, and in this scenario the “Add Derived Classes” option simply doesn’t work. For example, try creating a new project and adding a reference to the System.Web.dll assembly. Add a Class Diagram to the project. Open the Object Browser and drag DataControlFieldCell onto the Class Diagram. Next, right-click on DataControlFieldCell and click “Add Derived Classes”. Note that nothing is added; instead, you have to add the classes manually. I would like to have the option to add all classes from all assemblies to which I have a reference.
Doug: In the C# versus Visual Basic discussion, you appear to be a C# user. How did you end up using C# as your primary .NET language?
Glenn: You’re correct; C# is definitely my primary language, but I am fluent in both languages. Funny thing is that I was a VB programmer for years. When the original Visual Studio .NET was in Beta 2, I got a contracting gig writing C# courseware. I didn’t know anything about C#, but I decided that I’d give it a try. Well, I swore at C# for a month while I was under pressure to learn it and write about it. After that, I started to swear by C#. All of my current customers are C# shops, so C# is definitely my stronger language today.
By the way, both languages have some unique features, and I truly believe that VB.NET is a first-class language, but I must admit that my biggest pet peeve about VB.NET is the behavior of arrays. In every other .NET language that I know of, creating an array of 5 produces an array of five elements, indexed 0 through 4. In VB.NET, declaring an array of 5 (Dim a(5)) creates an array of six elements, indexed 0 through 5, because the 5 is an upper bound rather than a length. This kicks my butt every time.
Doug: How do you feel about transmitting results between web services and their clients using DataSets or DataTables? Do you believe it is ever acceptable to use these large, somewhat verbose objects to transfer results between .NET Web Services and their clients?
Glenn: Well, I believe that if you want to build a web service that performs well, you need to keep your messages short. On the other hand, if you need to return lots of data from a web service, then returning a DataSet or DataTable is OK as long as you pay attention to the size of the table and column names. For example, you might need to return a long list of employees from a web method, where the column names are something like EmployeeID, EmployeeFirstName, EmployeeLastName, EmployeeZipCode (get the idea?). An employee in the DataSet will look as follows:
<Employee>
  <EmployeeID>1</EmployeeID>
  <EmployeeFirstName>Glenn</EmployeeFirstName>
  <EmployeeLastName>Johnson</EmployeeLastName>
  <EmployeeZipCode>44012</EmployeeZipCode>
</Employee>
See how the overhead of the resulting XML tags increases the overall size of the message? How do you think that performs compared to the following?
<emp ID="1" FN="Glenn" LN="Johnson" Zip="44012" />
The second example is about one quarter of the size of the first, which can translate to almost four times faster transmission! I’ve seen this problem many times. Keep the table and column names very short, and be sure to change the ColumnMapping property of each of your columns to an attribute. This will cause all of the DataRow elements to be rendered as singleton elements, thus eliminating the end tags. The end result is that you can still use DataSets or DataTables without taking a big performance hit.
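The size difference is easy to verify. The actual .NET mechanism is setting each DataColumn’s ColumnMapping property to MappingType.Attribute; the sketch below is in Python purely to illustrate the byte counts, using hypothetical sample values:

```python
# Illustrative only: compare the size of one row serialized with verbose
# element-per-column names versus short, attribute-mapped names.
verbose = (
    "<Employee>"
    "<EmployeeID>1</EmployeeID>"
    "<EmployeeFirstName>Glenn</EmployeeFirstName>"
    "<EmployeeLastName>Johnson</EmployeeLastName>"
    "<EmployeeZipCode>44012</EmployeeZipCode>"
    "</Employee>"
)
compact = '<emp ID="1" FN="Glenn" LN="Johnson" Zip="44012" />'

ratio = len(verbose) / len(compact)
print(f"{len(verbose)} bytes vs {len(compact)} bytes per row: {ratio:.1f}x larger")
# prints: 175 bytes vs 50 bytes per row: 3.5x larger
```

Multiply that per-row overhead by a long list of employees and the payload difference becomes substantial.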
Doug: Many years ago, I independently “discovered” a markup language. My markup language involved lots of @ signs and was not terribly device-independent, hard-coding lots of things that should not have been. Nevertheless, I was very proud of myself until I came across a much cooler, much better thought-out markup language, namely Tim Berners-Lee’s HTML. Have you ever “invented” anything, only to discover, to your horror, much better prior art?
Glenn: Ha! I know what you mean. I remember when I first got into Web development. I desperately wanted a way to embed VB code into a Web page to create a more robust user experience. I spent lots of time trying to invent the equivalent of ActiveX controls… go figure. I must admit that I felt pretty good about the fact that I identified the same need that Microsoft identified, even if they beat me to the solution.
Doug: Can you think of any particularly cool tip or trick, especially in SQL Server 2005, that many database developers may not know about?
Glenn: Here’s one that you might like. In the past, you would use sp_helptext to find the definition of many of the system objects. New in SQL Server 2005 is OBJECT_DEFINITION, which can be used to show the source for the following object types:
- Check constraint
- Replication filter procedure
- SQL inline table-valued function
- SQL scalar function
- SQL stored procedure
- SQL table-valued function
- SQL trigger
Here is the syntax:
SELECT OBJECT_DEFINITION(OBJECT_ID('sys.objects'));
Do you know someone who deserves to be a Database Geek of the Week? Or perhaps that someone is you? Send me an email to email@example.com and include “Database Geek of the Week suggestion” in the subject line.