GNOME Bugzilla – Bug 724687
gmain: make monotonic time really monotonic, everywhere
Last modified: 2014-02-20 23:00:08 UTC
Mac OS doesn't have clock_gettime() but it does have mach_absolute_time().
Created attachment 269610 [details] [review]
gmain: fix monotonic time on Mac OS
It's worth noting that we're presently using realtime instead of monotonic time, which means that if someone changes the clock while a program has a timeout source pending, then they could end up waiting for a very long time...
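For illustration, here is a minimal sketch (hypothetical code, not from the patch) of how a realtime-based deadline goes wrong: the deadline below is an absolute CLOCK_REALTIME timestamp, so setting the wall clock back after it is computed pushes the wakeup arbitrarily far into the future.

#include <time.h>

/* Hypothetical helper: compute an absolute deadline `ms` milliseconds
 * from now against the realtime clock.  If the user then sets the
 * clock back an hour, anything waiting on this deadline waits an
 * extra hour; a CLOCK_MONOTONIC deadline would be immune to that. */
static struct timespec
deadline_after_ms (long ms)
{
  struct timespec ts;

  clock_gettime (CLOCK_REALTIME, &ts);
  ts.tv_sec += ms / 1000;
  ts.tv_nsec += (ms % 1000) * 1000000L;
  if (ts.tv_nsec >= 1000000000L)
    {
      ts.tv_sec += 1;
      ts.tv_nsec -= 1000000000L;
    }
  return ts;
}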
Comment on attachment 269610 [details] [review]
gmain: fix monotonic time on Mac OS

Perhaps we should have multiple copies of g_get_monotonic_time() rather than having multiple completely-separate implementations in the same function body?

The code appears to do what it says. However:

>+  if (timebase_info.denom % timebase_info.numer == 0)
...
>+  else
>+    {
>+      /* We could just multiply by timebase_info.numer below, but why
>+       * bother for a case that may never actually exist...

the second google hit for "mach_timebase_info site:developer.apple.com" includes the lines:

  For example, if the numerator returned is 1,000,000,000 and the denominator
  is 6,000,000, the frequency is 6,000,000ths of a second. Every unit of
  absolute time would be 166.67 nanoseconds and host time frequency in ticks
  per second may be calculated like this, 1.0e9 * (denom / numer).
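Since only a fragment of the patch is quoted above, here is a hedged sketch of the conversion under review: mach_absolute_time() returns ticks whose length is numer/denom nanoseconds, and the quoted branch avoids a multiply (and a possible overflow) when numer divides denom evenly.

#include <mach/mach_time.h>
#include <stdint.h>

/* Sketch only -- the real code is attachment 269610.  Converts
 * mach_absolute_time() ticks to microseconds. */
static int64_t
mac_monotonic_time_usec (void)
{
  static mach_timebase_info_data_t timebase_info;
  uint64_t ticks = mach_absolute_time ();

  if (timebase_info.denom == 0)
    mach_timebase_info (&timebase_info);

  if (timebase_info.denom % timebase_info.numer == 0)
    {
      /* Common case (numer is often 1): divide first, no overflow. */
      return ticks / (timebase_info.denom / timebase_info.numer) / 1000;
    }
  else
    {
      /* The "case that may never actually exist": scale by numer too.
       * Note that ticks * numer can overflow for a large numer such
       * as the 1,000,000,000 / 6,000,000 example from Apple's docs. */
      return ticks * timebase_info.numer / timebase_info.denom / 1000;
    }
}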
(In reply to comment #3)
> (From update of attachment 269610 [details] [review])
> Perhaps we should have multiple copies of g_get_monotonic_time() rather than
> having multiple completely-separate implementations in the same function body?

Ya -- that makes sense. I felt sort of uncomfortable adding the variable declaration there, so far from the '{' before.

I may also take the chance to remove the fallback to gettimeofday(). clock_gettime() is in POSIX 2001... I'm actually surprised that Mac OS doesn't have it.

> the second google hit for "mach_timebase_info site:developer.apple.com"
> includes the lines:
>
>   For example, if the numerator returned is 1,000,000,000 and the denominator
>   is 6,000,000, the frequency is 6,000,000ths of a second. Every unit of
>   absolute time would be 166.67 nanoseconds and host time frequency in ticks
>   per second may be calculated like this, 1.0e9 * (denom / numer).

Are you speculating that maybe they didn't just make this up off the top of their heads and that this example actually exists somewhere in the wild?
I have no idea. They certainly appear to be reserving the right to do so though.
Created attachment 269634 [details] [review]
gmain: rework g_get_monotonic_time() a bit

We now assume the existence of clock_gettime() and CLOCK_MONOTONIC as specified by POSIX.1-2001. This means that we always return truly monotonic time, which will prevent problems in the case that the user changes the time.

Mac OS doesn't have clock_gettime() but it does have mach_absolute_time(), so we can use that there.

We keep our Windows case as well (although we should simplify it once XP hits EOL later this year).

This patch removes the fallback to gettimeofday() in case of missing clock_gettime(). We no longer have any way to test this codepath and therefore it must go.

This patch also restructures the #ifdef a bit so that we repeat the entire function definition inside of #ifdef instead of just the entire body of one function.
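The POSIX branch the commit message describes reduces to a few lines. A minimal sketch, assuming clock_gettime() and CLOCK_MONOTONIC per POSIX.1-2001 (the function name here is illustrative; the real function is g_get_monotonic_time()):

#include <glib.h>
#include <time.h>

/* Sketch of the POSIX case: one complete function definition like
 * this is repeated per platform under #ifdef. */
static gint64
monotonic_time_posix (void)
{
  struct timespec ts;
  gint result;

  result = clock_gettime (CLOCK_MONOTONIC, &ts);

  /* No gettimeofday() fallback any more: a missing or broken
   * CLOCK_MONOTONIC is a hard error. */
  if (G_UNLIKELY (result != 0))
    g_error ("GLib requires working CLOCK_MONOTONIC");

  return (((gint64) ts.tv_sec) * 1000000) + (ts.tv_nsec / 1000);
}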
We're going to need to look at g_cond_wait_until() before fixing this... we're already in trouble there on some systems, and this will cause Mac OS to be on that list...
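For context, this is the documented way a deadline reaches g_cond_wait_until(): the caller computes an absolute monotonic time, so the wait is only actually immune to clock changes if the condition variable itself waits on the monotonic clock, which is exactly the per-platform problem mentioned above. A sketch, assuming cond/mutex/data_ready are shared with a producer thread:

#include <glib.h>

static GMutex mutex;
static GCond  cond;
static gboolean data_ready;  /* set under `mutex` by a producer */

/* Wait up to five seconds for data, using an absolute monotonic
 * deadline as the g_cond_wait_until() documentation shows. */
static gboolean
wait_for_data (void)
{
  gint64 end_time = g_get_monotonic_time () + 5 * G_TIME_SPAN_SECOND;

  g_mutex_lock (&mutex);
  while (!data_ready)
    {
      if (!g_cond_wait_until (&cond, &mutex, end_time))
        {
          /* Timed out without data. */
          g_mutex_unlock (&mutex);
          return FALSE;
        }
    }
  g_mutex_unlock (&mutex);
  return TRUE;
}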
Attachment 269634 [details] pushed as d614312 - gmain: rework g_get_monotonic_time() a bit

Note that, with this commit, GLib now has a hard dependency on a present and functional CLOCK_MONOTONIC.