I was chatting to Phil Factor the other day about the slow start-up of some CLR applications. He started telling me, with apparent irrelevance, how he once, a long time ago, developed a database system in Z80 assembler, using a large eight-inch floppy disk. The point he was actually making was this: the more you know about what an application is doing, the more likely you are to be able to fix slow-running code. By watching the bare disk drive on his desk as it whirred and clicked, he could actually see the reads and writes, the head position, and the tracking across the drive. It gave him a lesson for life in the importance of performance metrics.
Much has changed, but the need to monitor the performance of an application hasn’t. With the right information, you will see at a glance any unnecessary initialization, premature loading of data, or blocking processes. You still need a tool that will let you monitor down to the details of thread creation, thread switching and context switching.
Performance Counters are the first resort when investigating a performance problem. There are performance counters for almost every aspect of the CLR and .NET Framework. They are always available and do not change the performance characteristics of your application. Once you have the big picture, a profiler will soon tell you where the causes of performance problems lie.
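The same counters-before-profiler idea can be sketched in any runtime. Here is a minimal Python analogy (the CLR counters themselves are Windows performance-counter categories; the names and the particular counters sampled below are my own choices for illustration): take a cheap snapshot of always-available runtime counters, do the work, and compare deltas to get the big picture before attaching a profiler.

```python
import gc
import time

def sample_counters():
    """Cheap, always-available runtime counters, read without a profiler.

    Analogous in spirit to CLR performance counters: sampling them
    barely perturbs the program being measured.
    """
    return {
        # CPU time consumed by this process so far
        "cpu_seconds": time.process_time(),
        # objects currently pending in GC generation 0
        "gc_gen0_pending": gc.get_count()[0],
        # total garbage collections across all generations
        "gc_collections": sum(s["collections"] for s in gc.get_stats()),
    }

# Baseline, then some allocation-heavy work, then compare.
before = sample_counters()
_ = [str(i) for i in range(100_000)]
after = sample_counters()

print(f"CPU used: {after['cpu_seconds'] - before['cpu_seconds']:.3f}s")
print(f"GC collections during work: "
      f"{after['gc_collections'] - before['gc_collections']}")
```

The deltas tell you *whether* the work was CPU-bound or allocation-heavy; only once that big picture points somewhere do you pay the cost of a full profiling run.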
Armed with this information, it is a matter of great job satisfaction to be able to make a big gain through a small adjustment to the code. It is possible to tweak any code to make it run faster, usually by changing the algorithm, but it will only have a perceptible effect if it is done in the right place. The art of improving performance is in knowing where to make those alterations.
There are some dreadful ways of trying to make code run fast. The worst mistake is to write your code in a machine-friendly, human-unfriendly way, an art beloved of a few Perl programmers. Another bad idea is to compromise a good object-oriented model in pursuit of small savings in garbage collection, a process equivalent to the sin of de-normalisation. It is programming without all your faculties, blind to what is going on and, even worse, without the faculty of thought.
We encourage your nominations for the least (and most!) effective ways to optimize your code. The best contribution, added as a comment to this blog, will win a $50 Amazon voucher. The winner last time, for their contribution to the “That’s not a database, it’s a spreadsheet” editorial, is digory!