Why does real time differ from difftime?

Hi,


My C++ project simply computes the runtime of a basic sort function.

When I compare the times obtained with difftime against what the time command reports, I see a big difference and I don't understand why. Could someone explain this to me?


Thanks in advance :-)


Here is the code:


//
//  main.cpp
//  sortCPP
//

#include <iostream>
#include <list>
#include <cstdlib>
#include <ctime>

int main()
{
    int n = 41000000;

    time_t timer_start;
    time(&timer_start);

    // Fills a list
    std::list<int> myList;
    for(int i=0; i<n; i++)
         myList.push_back(rand() % 100000);

    time_t timer_mid;
    time(&timer_mid);

    // Sorts the list
    myList.sort (std::less<int>());

    time_t timer_end;
    time(&timer_end);

    std::cout << "total      : " << difftime(timer_end,timer_start) << std::endl; // Displays 41s
    std::cout << "generation : " << difftime(timer_mid,timer_start) << std::endl; // Displays 4s
    std::cout << "sort       : " << difftime(timer_end,timer_mid)   << std::endl; // Displays 37s

    return 0;
}

To be more precise, here is the compile/run output, followed by a question:


g++ test_sort.cpp -o test_sort -O3

time ./test_sort

total : 41

generation : 4

sort : 37


real 0m59.854s

user 0m58.638s

sys 0m1.194s


Is it the destruction of objects that takes so long?

A very simple answer: your timer runs on the same thread as the sort. Hence, your timer is about as accurate as measuring distance by spitting into the wind. If you need accurate time, you need to set a start time and store it, then determine a completion time and subtract the start from the completion time. Any other measurement will always be off.
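For example, a minimal sketch of that start/stop pattern using std::chrono::steady_clock (a monotonic clock, so it is not affected by wall-clock adjustments); this is only an illustration, not your original code:

#include <chrono>
#include <iostream>
#include <list>
#include <cstdlib>

int main()
{
    std::list<int> myList;
    for (int i = 0; i < 1000000; i++)
        myList.push_back(rand() % 100000);

    // Store the start time, do the work, then subtract start from completion.
    auto start = std::chrono::steady_clock::now();
    myList.sort(std::less<int>());
    auto end = std::chrono::steady_clock::now();

    std::chrono::duration<double> elapsed = end - start;
    std::cout << "sort: " << elapsed.count() << " s" << std::endl;

    return 0;
}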

I understand your answer, but I thought the "wind" was included in the sys time reported by the time command. And I would be very surprised if the time command itself took so long (20 s), because when I don't use it and measure with my watch I get the same result. Moreover, at exactly 41 s the difftime() results are displayed by the program, which leaves 20 mysterious seconds.


In the program, the time() function returns the number of seconds elapsed since 01/01/1970. Its result is first stored in the variable timer_start, then in timer_mid and finally in timer_end. So I think I have done things as you advise.


To strengthen this point, I also tried clock() instead of time() and got the same result:


#include <iostream>
#include <list>
#include <cstdlib>
#include <cstdio>
#include <ctime>

int main()
{
    int max = 41000000;

    // Fills a list
    std::list<int> myList;
    for(int i=0; i<max; i++)
         myList.push_back(rand() % 100000);

    // Sorts the list
    myList.sort (std::less<int>());

    printf("clock: %f seconds\n", (double) clock()/CLOCKS_PER_SEC);

    return 0;
}

g++ test_sort.cpp -o test_sort -O3; time ./test_sort

clock: 43.412757 seconds


real 1m4.737s

user 1m1.481s

sys 0m2.320s


My first idea was that the destruction of objects takes the 20 s, but that seems very long when constructing and filling myList only takes 4 s.

Of course, I want to understand these mysterious seconds and how to optimize my program (while keeping the same structures) to avoid them.
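If it helps, one way I could check the destruction hypothesis is to put the list in an inner scope and take one last timestamp after the closing brace, so the destructor runs inside a measured interval. A rough sketch (untested, same time()/difftime() calls as my original code):

#include <iostream>
#include <list>
#include <cstdlib>
#include <ctime>

int main()
{
    int n = 41000000;

    time_t timer_start;
    time(&timer_start);

    {
        // Fill and sort the list inside an inner scope
        std::list<int> myList;
        for (int i = 0; i < n; i++)
            myList.push_back(rand() % 100000);
        myList.sort(std::less<int>());
    }   // myList's destructor runs here, before the next timestamp

    time_t timer_end;
    time(&timer_end);

    std::cout << "fill + sort + destruction : "
              << difftime(timer_end, timer_start) << std::endl;

    return 0;
}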


And of course, thanks for your answer 🙂
