So, at work yesterday, I finally got around to pointing a profiler at the 10,000 lines of code I wrote in December. It's a security prototype that uses RSA signatures to issue tickets, similar to what Kerberos does. Since it accesses a database to record session information and verify passwords, I fully expected the profiler would show everything in the noise relative to the network roundtrips.
Imagine my surprise when the slowest line of code was this:
RSACryptoServiceProvider rsa = new RSACryptoServiceProvider(csp);
What?! But sure enough, it's "slower than molasses in January", as my dad always says (excuse him, he's from South Dakota). Well, 20 minutes of coding later, I'd added a simple synchronized Queue where I pool the instances when I'm done with them, and the code got a lot faster - like 25% per request. I scheduled some performance tests last night to see exactly how much difference it makes.
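The pool itself is nothing fancy. Here's a rough sketch of the idea, with a lock-guarded Queue standing in for the synchronized Queue I used (the names are illustrative, not the actual prototype code):

```csharp
using System.Collections.Generic;
using System.Security.Cryptography;

// Sketch: reuse RSACryptoServiceProvider instances instead of constructing
// a new one per request, since construction is what dominated the profile.
public static class RsaPool
{
    private static readonly Queue<RSACryptoServiceProvider> pool =
        new Queue<RSACryptoServiceProvider>();
    private static readonly object sync = new object();

    public static RSACryptoServiceProvider Acquire()
    {
        lock (sync)
        {
            if (pool.Count > 0)
                return pool.Dequeue();
        }
        // Pool is empty: pay the expensive construction cost this one time.
        return new RSACryptoServiceProvider();
    }

    public static void Release(RSACryptoServiceProvider rsa)
    {
        lock (sync)
        {
            pool.Enqueue(rsa);
        }
    }
}
```

Callers grab a provider with `Acquire()`, sign or verify with it, and hand it back with `Release()` instead of letting it go to the garbage collector. The explicit `lock` avoids the check-then-dequeue race you'd hit if you tested `Count` on a synchronized wrapper and then dequeued in a separate step.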
What's great is that my client has been after me to make a different optimization - one he thought of. And it's a good optimization, but it's not likely to have a tenth the impact. There's just no way we would have guessed that this particular line of code would be so expensive. In fact, I'm not sure it's worth doing his optimization at all at this point.
The moral of the story is, "beware premature optimization." Profile, fix the slowest thing you can, and for the most part don't bother optimizing at design time.