Sunday 18 October 2009

T(ech)-Time - simple ways to optimize general code

I was reading Lynn Jones' blog about old technology today and it got me thinking about the changes over the years. Hohum - here is a good first tech post, I think...

Since I started to do serious things with computers - about 1992 - I have used Turbo Pascal, ANSI C, assembly (which I hated :) ), VB 5 and 6 (and classic ASP), Magic (v5, v6 and v7) and now program in C#.

When I started, machines were not overly powerful (we thought they were at the time, of course) and you had to think about what you were doing in order to make the program run as quickly as possible.

When I was at university we had basic coding optimisation practices drummed into us in our micro-architecture classes (they didn't teach me to spell it, though :) ) - how to make your app work that little bit better by using some simple techniques.

In the modern world, when writing desktop apps, it's not going to make a world of difference. And that is the problem. By and large you can throw hardware at most performance issues and, let's face it, even a netbook runs at 1.6GHz and, with a decent OS, handles most home-user tasks adequately.

But... The trouble comes when you want to do something with many concurrent users. Say... a web application. Or a mainframe app (OK, you don't see those too often these days). If you have got lazy with your desktop development you have a higher chance of running into problems.

You see, when you have a decent development rig (where I work we use quad-core machines with Raptor 10K HDDs and 4GB of RAM - they rock :) ) you still don't see the problems when you are debugging.

No... The problem shows itself when you have 6000 users trying to use your site at once. And what do the developers say? Throw another server at the problem. If you think about the bottlenecks and possible issues you have, though, you may not need that server.

We rely on a couple of purchased apps for data delivery in the office and we have always complained about one in particular. We can't throw it out as they are the only supplier of this particular data in the country. They made a change recently and as a result we can get rid of 50% of the servers we use *for that one app* and still have more capacity. And all because someone actually started to look at performance seriously.

My first suggestion would be to think about all of the logic of your apps before and during coding. If you have an 'if' statement, think about which of the logic branches is most likely to happen and ensure that this is the true branch of the 'if'. The reason? When an 'if' instruction comes through to the processor, the pipeline is filled with the 'true' logic branch. When you need the false branch, the pipeline first has to be emptied and then refilled with the new code branch.
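
To make that concrete, here's a minimal C# sketch of the idea (the Order class and method names are hypothetical, invented just for this example): the branch you expect to take almost every time goes in the 'true' arm of the 'if', and the rare case goes in the 'else'.

```csharp
using System;

class Order
{
    public bool IsPriority;
}

class OrderProcessor
{
    static void Main()
    {
        new OrderProcessor().Process(new Order { IsPriority = false });
    }

    public void Process(Order order)
    {
        // Almost every order is a standard one, so test for that first: the
        // common case sits in the 'true' branch and the rare priority case is
        // left in the 'else', where the pipeline refill is the price you pay.
        if (!order.IsPriority)
        {
            ProcessStandard(order);
        }
        else
        {
            ProcessPriority(order);
        }
    }

    void ProcessStandard(Order order) { Console.WriteLine("standard"); }
    void ProcessPriority(Order order) { Console.WriteLine("priority"); }
}
```

The point isn't the syntax, it's the habit of asking which branch you expect to take most of the time and writing the 'if' around that answer.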

I'll be honest, with modern processors and their branch predictors I don't know how valid this point is anymore. But the process of thinking about what you need in your app is still worthwhile and should lead to improvements. Just a thought I've been having recently...

2 comments:

  1. > because someone actually started to
    > look at performance seriously.

    When in doubt, blame the server, then the network and then the desktop. Finally, once all of those avenues have been exhausted: look at the code and remove the bloatware :-\

    Not that I'm jaded :D

  2. I'd like to defend my fellow coders... But I can't :)

    Another example I found recently: if you access your SQL query result fields in C# using string literals with the correct case - "Field1" rather than "field1" - it is orders of magnitude quicker. It's a simple job; you wrote the SQL query, so you know what the field names are. Even our architect got on my nerves that day, saying this wasn't an important place for optimization... Needless to say I now make sure that my team does it anyway :) (there's a sketch of what I mean below the comments)

    Of course we still blame the network and servers :p

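To illustrate the point in that second comment, here's a minimal C# sketch (the connection string, table and column names are hypothetical). With ADO.NET's SqlDataReader, a lookup by column name tries an exact, case-sensitive match first and only falls back to a slower case-insensitive search if that fails, so writing the literal in the column's exact case keeps you on the fast path.

```csharp
using System;
using System.Data.SqlClient;

class ReaderExample
{
    static void Main()
    {
        // Connection string, table and column names are hypothetical.
        using (var conn = new SqlConnection("Server=.;Database=Demo;Integrated Security=true"))
        using (var cmd = new SqlCommand("SELECT CustomerId, Name FROM Customers", conn))
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    // Exact case ("CustomerId", not "customerid") lets the
                    // name lookup succeed on the fast, case-sensitive pass.
                    int id = (int)reader["CustomerId"];
                    string name = (string)reader["Name"];
                    Console.WriteLine("{0}: {1}", id, name);
                }
            }
        }
    }
}
```

If the loop is really hot you can go a step further and cache the column ordinals with reader.GetOrdinal(...) before the loop, so the name lookup only happens once rather than on every row.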