How to Time a Running Program

        If you are trying to measure accurately how long it takes a program, or part of a program, to run, you can't just use a watch or anything like that. It takes some time for a program to start up, or even to notice that you have pressed the enter key. You need to make the program time itself.
        Fortunately, that isn't very difficult. Unix (at least the version we've got on rabbit) provides a little function that reads the system clock and tells you the time down to the nanosecond. At least, that is what it claims, but it is lying: the system clock does not tick 1,000,000,000 times per second, and the computer tells time by counting ticks, so the real accuracy of computer-generated times is less than you might imagine.
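        If you want to see how coarsely your system's clock really ticks, POSIX provides a function called clock_getres that reports a clock's resolution. Here is a minimal sketch (the numbers it prints will of course depend on the machine):
          #include <stdio.h>
          #include <time.h>

          int main(void)
          { struct timespec res;

            /* ask the system how finely the clock actually ticks */
            clock_getres(CLOCK_REALTIME, &res);
            printf("resolution: %ld seconds and %ld nanoseconds\n",
                   (long) res.tv_sec, res.tv_nsec);
            return 0; }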

Here's how it works:
        You must include the library file time.h (that is where POSIX keeps the timing functions), and you need to declare a strange variable whose type is struct timespec to receive the answer from the system clock. The beginning of the program might look like this:
          #include <time.h>

          int main(void)
          { struct timespec tim;
(except of course, you would still probably want all the common libraries like stdio.h etc.)
        Then you perform all the setup tasks (such as asking the user for any inputs) before reading the clock. You don't want to include the time it took for some slow-witted human to answer a question in your measurements of the speed of a program.
        When everything is ready, and you are just about to run the interesting part of your program, you read the clock. The way to read the clock is like this:
            clock_gettime(CLOCK_REALTIME, &tim);
(CLOCK_REALTIME tells it to use a clock that says what the time really is; tim is the variable we declared earlier. Don't forget the ampersand & before tim.)
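        (Incidentally, POSIX also provides a clock called CLOCK_MONOTONIC, which simply counts time from some fixed starting point. It is not disturbed if the real-time clock gets adjusted while your program is running, so it is often a safer choice for measuring intervals. Everything else works exactly the same way:)
            clock_gettime(CLOCK_MONOTONIC, &tim);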
        The special variable tim is a simple structure with two parts, both integers. One counts the number of seconds; the other counts the number of nanoseconds. The best plan is to convert them into a single floating point number, by performing this computation:
            now = seconds + nanoseconds/1000000000;
Unfortunately, we have to be careful of two things. The first is simple: if nanoseconds is an integer, and less than 1,000,000,000 (which it must be, because 1,000,000,000 nanoseconds make 1 second), the division will always give zero, because in C, when you divide two integers the result is rounded towards zero. We need to force a floating point division, and the way to do that is to write 1000000000.0 instead of 1000000000; in mathematics, putting .0 at the end of a number means nothing, but in C it is very significant.
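A tiny demonstration of the difference (ns is a made-up variable, just for illustration):
            long ns = 500000000;                /* half a second           */
            double bad  = ns / 1000000000;      /* integer division:   0.0 */
            double good = ns / 1000000000.0;    /* floating division:  0.5 */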
        The second problem is sadder: the number of seconds in the time is likely to be so large that it uses up all the accuracy available in single precision floating point numbers, and the fractional part would be lost entirely. We have to use double precision floating point numbers. Declare the variables as double; a constant written like 1000000000.0 already counts as double precision, so it needs no special marking. When printing with printf, %f already handles a double (%lf is also accepted and means the same thing); it is scanf that insists on %lf for reading a double (that's a little L, not a one).
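        A quick way to convince yourself that single precision is not enough (the exact digits printed may vary from machine to machine):
            float  f = 1700000000.25f;   /* float keeps only about 7 digits   */
            double d = 1700000000.25;    /* double keeps about 15 or 16       */
            printf("%f\n", f);           /* prints 1700000000.000000: fraction lost */
            printf("%f\n", d);           /* prints 1700000000.250000: fraction kept */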
        One final irritation is that to access the seconds and nanoseconds fields of tim, you have to use tim.tv_sec and tim.tv_nsec.

        A working program would look like this:
          #include <stdio.h>
          #include <time.h>

          int main(void)
          { struct timespec tim;
            double begintime, endtime;

            int i, n, x = 0;     /* x must start at 0, or x=x+1 reads garbage */

            /* interact with the user and prepare for running */

            clock_gettime(CLOCK_REALTIME, &tim);
            begintime=tim.tv_sec + tim.tv_nsec / 1000000000.0;

            n=1000000;
            for (i=0; i<n; i+=1)
              x=x+1;

            clock_gettime(CLOCK_REALTIME, &tim);
            endtime=tim.tv_sec + tim.tv_nsec / 1000000000.0;

            printf("It took %lf seconds to run a loop %d times\n",
                   endtime-begintime, n); }
Remember that the result is not really accurate down to the nanosecond, so don't pay attention to all the digits that are printed.
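        To compile and run it (assuming you saved it as timer.c; on some older systems you may need to add -lrt at the end of the compile line so the linker can find clock_gettime):
            gcc timer.c -o timer
            ./timer
One more warning: compile without optimization (no -O option). An optimizing compiler may notice that the loop computes nothing that is ever used and remove it altogether, leaving you with a suspiciously tiny time.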

Remember

That you share the computer with everyone else in the class. Do not ruin things for everyone by deliberately running programs that will take more than a couple of seconds to complete.