[Tfug] A sense of time

Bexley Hall bexley401 at yahoo.com
Thu Aug 2 16:05:28 MST 2007


Hi,

I've periodically posted this question (or variations
thereof) in a number of different forums.  Obviously,
I've never been quite happy with the answer(s) I've
received (I suspect it is yet another unsolvable
problem  :< )

The issue is tracking calendar time in a system
that exposes it (in a variety of forms) to the user.
And, where that user is capable of "changing" the
"current time".

The problem creeps in when time is changed by the
user -- either *correctly* or *incorrectly*.

To put this in terms of a desktop environment,
imagine the user creating a file.  Then, changing
the time.  The *next* file created (*or*, any
modifications made to this previously created
file) will bear a timestamp in this new "timespace".

E.g., create a file at 1:00PM, set the time to
11:00AM and create a second file -- the second file
*claims* to have been created before the first!

While this is obvious, the effect is also present in
many other (often subtler) cases.  E.g.,  you can't
*look* at any two timestamps and deduce anything about
their relative order in "actual" time (isn't the
purpose of a timestamp supposed to be exactly this??).
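
To make that concrete, here is a rough POSIX sketch
(purely illustrative; the struct and helper are made up,
not anything a filesystem actually records) showing why
the wall-clock stamp alone can't answer the ordering
question -- you'd need a second, unsettable clock
recorded alongside it:

#define _POSIX_C_SOURCE 200809L
#include <stdio.h>
#include <time.h>
#include <unistd.h>

struct stamp {
    struct timespec wall;   /* what a file timestamp records: the settable ToD clock */
    struct timespec mono;   /* CLOCK_MONOTONIC: nobody can set it, so it can't lie    */
};

static struct stamp stamp_now(void)
{
    struct stamp s;
    clock_gettime(CLOCK_REALTIME,  &s.wall);
    clock_gettime(CLOCK_MONOTONIC, &s.mono);
    return s;
}

static int ts_cmp(const struct timespec *a, const struct timespec *b)
{
    if (a->tv_sec  != b->tv_sec)  return (a->tv_sec  < b->tv_sec)  ? -1 : 1;
    if (a->tv_nsec != b->tv_nsec) return (a->tv_nsec < b->tv_nsec) ? -1 : 1;
    return 0;
}

int main(void)
{
    struct stamp first = stamp_now();

    /* Step the ToD clock back in another shell now, e.g. "date 1100". */
    sleep(30);

    struct stamp second = stamp_now();

    printf("wall clock says the first event %s the second\n",
           ts_cmp(&first.wall, &second.wall) < 0 ? "precedes" : "follows");
    printf("monotonic clock says the first event %s the second\n",
           ts_cmp(&first.mono, &second.mono) < 0 ? "precedes" : "follows");
    return 0;
}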

It also leaves open the possibility of creating
"holes" in time.  E.g., if you schedule a job to
happen at 1:05PM but, at 1:04PM you advance the
ToD clock to 3:45PM, then "1:05PM" never happened
(sure, you can design your system to look at everything
between "the time I last checked the list of jobs to
be performed" and "the current time" and run any and
ALL jobs that fell into that interval... but that means
the job that was DELIBERATELY scheduled to occur at
1:15PM -- well AFTER the 1:05PM job -- will end up
running roughly concurrently with its intended
predecessor... so any races that you sought to avoid
will now manifest).
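
Here is a sketch of exactly that "catch-up" scheme, just
to show where the spacing goes (the job table, the names
and the times are stand-ins, not a real API).  Step the
ToD clock forward while it runs and both jobs fire on the
same pass:

#include <stdio.h>
#include <time.h>
#include <unistd.h>

struct job {
    time_t      when;   /* absolute ToD the user asked for */
    const char *name;
    int         done;
};

static void run_job(struct job *j)
{
    time_t t = time(NULL);
    printf("%s fired at %s", j->name, ctime(&t));
    j->done = 1;
}

int main(void)
{
    time_t now = time(NULL);
    struct job jobs[] = {
        { now + 60,  "the 1:05PM job", 0 },  /* stand-ins for the times above */
        { now + 660, "the 1:15PM job", 0 },
    };
    time_t last_checked = now;

    for (;;) {
        sleep(1);
        now = time(NULL);   /* follows the settable ToD clock */
        for (size_t i = 0; i < sizeof jobs / sizeof jobs[0]; i++) {
            /* everything that fell into the "hole" runs on this one pass */
            if (!jobs[i].done &&
                jobs[i].when > last_checked && jobs[i].when <= now)
                run_job(&jobs[i]);
        }
        last_checked = now;
    }
}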

I've tried various "solutions" to this problem in the
past but have not been happy with any of them.  :<
I think the problem lies in defining/assuming what
each time "specification" (i.e. the point at which
a reference to a time is "defined") is intended to
mean.

For example, when I write a routine to blink a light
at 1 Hz, I don't create an "alarm" at "now + 1 second"
(i.e. if now is 07 Aug 02 12:34:56 I will not set
an alarm for "07 Aug 02 12:34:57") but, rather, I
will do a "relative" wait (i.e. a delay) of 1 second.
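
In code, the blinker looks something like this (a sketch;
toggle_led() is a hypothetical stand-in for whatever
actually drives the light).  No calendar alarm is ever
armed, so fiddling with the ToD clock can't change the
blink rate:

#define _POSIX_C_SOURCE 200809L
#include <stdio.h>
#include <time.h>

static void toggle_led(void)            /* hypothetical hardware hook, stubbed */
{
    static int on;
    on = !on;
    printf("LED %s\n", on ? "on" : "off");
}

int main(void)
{
    const struct timespec one_second = { .tv_sec = 1, .tv_nsec = 0 };

    for (;;) {
        toggle_led();
        /* relative wait: "1 second of elapsed time", not
         * "when the calendar reads now+1 second" */
        clock_nanosleep(CLOCK_MONOTONIC, 0, &one_second, NULL);
    }
}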

Yet, when a "user" thinks in terms of time (like for
*appointments*) they are usually thinking of absolute
times (i.e. "I have a doctor appointment on 3 Aug
at 4:15PM") and *not* "relative" times (i.e. "my
doctor appointment is 28 hours, 34 minutes and 18
seconds from now...").

This is important because it suggests how you can
treat times in different contexts (i.e. relative
times IGNORE changes in the ToD clock whereas
absolute times tend to *follow* them).
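
One way to encode that distinction -- again, just a
sketch, and only one of several possible ways -- is POSIX
clock selection: an absolute wait on the settable
calendar clock *follows* the user's changes, while a
relative wait on the monotonic clock ignores them.

#define _POSIX_C_SOURCE 200809L
#include <errno.h>
#include <time.h>

/* "Doctor appointment" style: fire when the calendar reads 'when',
 * however the ToD clock gets adjusted in the meantime. */
void wait_until_calendar(time_t when)
{
    struct timespec target = { .tv_sec = when, .tv_nsec = 0 };
    while (clock_nanosleep(CLOCK_REALTIME, TIMER_ABSTIME, &target, NULL) == EINTR)
        ;   /* restart after a signal; the absolute target is unchanged */
}

/* "Blink the light" style: fire after 'seconds' of elapsed time,
 * no matter what the calendar clock does. */
void wait_for_elapsed(time_t seconds)
{
    struct timespec left = { .tv_sec = seconds, .tv_nsec = 0 };
    while (clock_nanosleep(CLOCK_MONOTONIC, 0, &left, &left) == EINTR)
        ;   /* resume with whatever time remains */
}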

Of course, there is nothing that *forces* a user to
think in these terms.  I.e. they can easily convert
relative time to absolute time before specifying a
time reference (e.g. "I want to take a one hour nap
so I will set the alarm for 3:45PM since it is 2:45PM
currently").

I *think* the right solution is to treat time as
continuous and just let the user's *idea* of
"current time" FLOAT in this continuum (e.g., I
have a set of "clocks" -- timepieces -- that
track "real" time but allow the user to define
an *offset* used in all DISPLAYS of time... so
you can set the clock by your bed to be "10 minutes
fast" to trick you into getting out of bed "on time").

I just haven't been able to come up with a scheme
that is easy for a user to relate to in more complex
devices (e.g., the desktop/workstation environment).

Suggestions?  (off list if others aren't interested
in this musing)

Thx,
--don


       



