Speeding Up Code: The Last Thing You Do Is Optimise

The reason for this post is that on many coding websites you will see comments about the speed of various approaches. Often in the comments you will see something like

I ran code A 200,000 times and code B 200,000 times and code A was 2ms faster than code B.

First of all, am I really going to run this code 200,000 times? For example, I have seen this kind of comment on an article about having ‘namespaces’ in JavaScript! If you are going to have 200,000 different ‘namespaces’ then there is definitely something wrong with your program.

I am not saying that optimising your code is not important, but it must be done in the correct way.

One major problem is that there is often a trade-off between optimisation and the ability to debug, maintain and even optimise your code.

Write Well Structured Code

Well-structured code is not only less prone to bugs and easier to debug and maintain; it also makes the job of optimisation a lot easier. A good modular structure with well-defined interfaces means that individual parts can be optimised without unwanted side effects.

Profile Your Code

Once you have some code that runs do not even think of optimising until you have profiled it.

I do not use bold type often – when I use it I mean it. I have seen many people spend valuable time trying to speed up code that was not the main problem. I have found that profiling often throws up surprises.

Now that you know what the main problem is you can look at ways to improve it. However, do not overdo it – you may find that a few quick tweaks will mean that this is no longer the bottleneck. Optimisation is an iterative process – profile, optimise, re-profile…
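Before reaching for a full profiler, a coarse timer can at least confirm where the time actually goes. A minimal sketch in Node.js (the function name and workload here are purely illustrative):

```javascript
// Illustrative workload – stands in for whatever your profiler flags.
function slowSum(n) {
  let total = 0;
  for (let i = 0; i < n; i++) total += Math.sqrt(i);
  return total;
}

// Coarse wall-clock timing with Node's high-resolution timer.
const start = process.hrtime.bigint();
slowSum(1000000);
const elapsedMs = Number(process.hrtime.bigint() - start) / 1e6;
console.log(`slowSum took ${elapsedMs.toFixed(2)} ms`);
```

A real profiler (for example Node's built-in `--prof` flag) gives a far better picture than ad-hoc timers, but even this much is better than guessing.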

Ways To Improve Speed

Get The Correct Algorithm For Your Problem

There is no point in optimising crap code – make sure you choose an algorithm that is correct for your requirements. Beware of so-called ‘best’ algorithms, since what is best often depends on the particular application – sometimes they are not the best algorithms at all. For example, I have seen many ‘best’ algorithms that were best for old 4-bit or 8-bit processors but not for modern 32-bit or 64-bit processors.

How the various sorting algorithms perform depends on the data set. The standard recursive QuickSort can overflow the stack on large amounts of data, since its recursion depth grows with the input in the worst case. Your data set may be somewhere near the worst case for one algorithm and the best case for another.

The size of a data set may also be important. If you only have five values then an O(n²) algorithm may run faster than an O(n log n) algorithm.
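As an illustration of the kind of O(n²) algorithm meant here, a plain insertion sort has so little overhead that it often wins on tiny arrays (which is why some library sorts switch to it for small partitions):

```javascript
// Simple O(n^2) insertion sort – sorts the array in place and returns it.
function insertionSort(a) {
  for (let i = 1; i < a.length; i++) {
    const v = a[i];
    let j = i - 1;
    // Shift larger elements right to make room for v
    while (j >= 0 && a[j] > v) {
      a[j + 1] = a[j];
      j--;
    }
    a[j + 1] = v;
  }
  return a;
}

console.log(insertionSort([5, 3, 4, 1, 2])); // [ 1, 2, 3, 4, 5 ]
```

Whether it actually beats an O(n log n) sort on your five values is exactly the sort of thing the profiler should decide, not intuition.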

Caching And Look-Up Tables

A very simple case is where you are repeatedly using an expression that is basically a constant – for example ln(π). Rather than evaluating it every time, calculate it once and store it as a constant.
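Using the ln(π) example, the idea is just to hoist the computation out into a named constant (the names here are illustrative):

```javascript
// Computed once, at load time, instead of on every call.
const LN_PI = Math.log(Math.PI);

function scaledValue(x) {
  return x * LN_PI; // uses the precomputed constant
}
```

Many compilers will fold a truly constant expression for you, but in interpreted languages it is not safe to assume so.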

If you repeatedly evaluate an expensive function and the same argument is likely to occur more than once, cache the results:

const lookup = {};

function complicatedFunction(variable) {
    // Evaluate each distinct argument only once, then reuse the result
    if (!(variable in lookup)) {
        lookup[variable] = expensiveCalculation(variable);
    }
    return lookup[variable];
}

If you know the range of your variable and the function is reasonably smooth, it may even be worthwhile creating a look-up table at the start and using interpolation.
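A sketch of that idea, using Math.sin over [0, π] purely as a stand-in for some expensive smooth function (the table size is an arbitrary choice trading memory for accuracy):

```javascript
// Precompute the function at evenly spaced points over the known range.
const STEPS = 256;
const STEP = Math.PI / STEPS;
const table = new Float64Array(STEPS + 1);
for (let i = 0; i <= STEPS; i++) table[i] = Math.sin(i * STEP);

// Approximate the function by linear interpolation between table entries.
// Only valid for 0 <= x <= PI.
function fastSin(x) {
  const pos = x / STEP;
  const i = Math.floor(pos);
  const frac = pos - i;
  const next = Math.min(i + 1, STEPS);
  return table[i] + frac * (table[next] - table[i]);
}
```

Whether the table lookup is actually faster than the real function is, again, something to measure on the target machine – built-in maths functions are often very fast.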

Now Re-profile Again

Just to re-emphasise, but also to make one other point: sometimes it is important to profile on the target machine. I get very different results profiling some PHP code on the local server on my machine (low memory, single processor, also running an editor and a browser) than on the web hosting company’s dedicated server.

Size Does Still Matter

With the increased memory available on modern computers, some people believe that the size of the code does not matter. One person I worked with approached optimisation by inlining all the functions – this is not necessarily a good idea, and the reason is caching.

Smaller code is more likely to fit in the cache resulting in fewer cache misses.

I have rambled on more than I intended. However, once again to quote Donald Knuth:

premature optimization is the root of all evil
