Steve Kraus
Guest
I've gotten curious about how accurate these really are. I am speaking of
the motor-driven timers with a 24-hour dial on the front.
Obviously not something someone would use where precision is required so
this is purely an academic question.
I was under the impression that these were driven by tiny sync motors and
therefore as accurate as the power line frequency just like a traditional
electric wall clock. Not easy to set precisely but should repeat nearly to
the second day after day. Or so I thought.
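For a sense of scale, here's a back-of-envelope sketch of that claim (the function and the 60 Hz nominal figure are my assumptions, not measurements): a sync motor counts line cycles, so its daily error is set entirely by the average line frequency over the day.

```python
# A synchronous motor is a cycle counter: its timekeeping error over a day
# depends only on how far the average line frequency sat from nominal.
NOMINAL_HZ = 60.0           # North American grid (assumption)
SECONDS_PER_DAY = 24 * 3600

def drift_per_day(avg_hz):
    """Seconds gained (+) or lost (-) over one day at a given average frequency."""
    return SECONDS_PER_DAY * (avg_hz - NOMINAL_HZ) / NOMINAL_HZ

# Even a 0.01 Hz offset sustained for a whole day is only about 14 seconds:
print(round(drift_per_day(60.01), 1))   # 14.4
print(round(drift_per_day(59.99), 1))   # -14.4
```

Which is why, if these really are sync motors, they should all land within seconds of the set time, and all drift together.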
Recently, using a timer on Christmas lights, I noticed variation of as much as
a minute. But that was a newer timer and I don't know what kind of motor
it has. I know on the older ones, the motor rotor could randomly start in
either direction, but there was a device in the gear train that would kick it
back the other way if it happened to start running in reverse. Doesn't that sound
like a sync motor?
So for a few days I've been running a little experiment. I have a number
of timers of various vintages. I ran each to the point where it just kicked on, then
unplugged it. I put the group on a couple of outlet strips and powered
them at the same time, each hooked to an indicator lamp. At the expected
time each night I put a smartphone running an NTP-connected clock app
nearby and video the scene with another phone so as to produce a record of
when each turned on. I was expecting them to switch on a few seconds
before the designated time (based on how many seconds elapsed during the
setup adjustment between the timer triggering and my unplugging it) but the
variation has been much greater: nearly a minute early on one, but generally
all over the map, different each night, and not in unison the way it would be
if the power line frequency* had varied. Maybe this is from mechanical variations
from the switch part of the timer snapping over. Or maybe they are not
sync motors after all.
* I think the utilities used to make up for lost or gained time, keeping it
to ±5 seconds. Now I think it's ±25 seconds. That doesn't account for the
variation I see, though it could be some of it.
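One way to see why the grid's time error can't explain the scatter: that error is common to every timer on the line, so it shifts them all together, while slop in each switch's snap action is per-unit. A toy simulation (all numbers hypothetical):

```python
import random

def trip_offset(shared_line_error_s, snap_jitter_s, rng):
    """Seconds a timer trips away from its set time: the grid's accumulated
    time error (common to all units on the line) plus this unit's own
    mechanical snap-action slop."""
    return shared_line_error_s + rng.uniform(-snap_jitter_s, snap_jitter_s)

rng = random.Random(1)
line_error = rng.uniform(-5, 5)   # shared grid time error, old +/-5 s band
timers = [trip_offset(line_error, 30, rng) for _ in range(4)]
spread = max(timers) - min(timers)
# With zero mechanical jitter every timer trips at the same offset;
# any spread BETWEEN units has to come from the per-unit term.
```

So timers scattering differently from one another night to night points at the switch mechanisms (or non-sync motors), not the line.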
Again, there is no practical point to this; a digital timer can be used
where precision is needed. I am just curious.