WTF with my computer clock?

Ray L. Volts wrote:

root wrote:
The damned thing loses about 20 minutes/day and has done
so since the machine was new about 3 years ago.

My guess is that it isn't fixable, but maybe you
have some ideas.

TIA.

This thread reminds me of an old Columbo movie.
As I recall, the murderer had reset his PC clock so that certain data
would be erroneously timestamped while his PC was used during his
absence -- thus providing his alibi later. I don't recall how Columbo
realized this bit of trickery had taken place, but, being Columbo, he
did. Nowadays, the culprit would need to remember to also keep the
machine from syncing with online time servers!
Don't remember that one - do remember a Columbo movie where a VCR is used to
timeshift a programme (football game?) which together with drugging is used
to give the murderer a witness to prove that he was at home at the time of
the murder. Quite a new idea at the time - the movie was made before the launch
of Betamax and VHS so was probably either a U-matic or a Philips stacked-reel
machine.
 
Jeff Liebermann wrote:

One machine I worked with had a unique problem. When the machine went
into standby, the clock would just stop. When it came out of standby,
it would continue where it left off, losing the time it was in
standby. It was fixed under warranty. I don't recall the vendor.

Oh yeah, check the button battery that backs up the clock. It might
be dead or dying.
I've had machines with a faulty (or even missing) CMOS battery causing the
clock to stop in standby, but still without any loss of setup data.
 
Arfa Daily wrote:

If you are the
telephone company, or a television broadcaster, though, things really do
work a lot better when the digital signals carried by your network all
are at precisely the same bitrate, no matter where they come from.

Right. At one time TV stations etc had their own accurate pulse generator
referenced to the national standard. Here in the UK it was IIRC from the
National Physical Laboratory.


Dave Plowman dave@davenoise.co.uk London SW
To e-mail, change noise into sound.


I reckon that TV companies must now use these laptops with very rough RTCs
! Have you noticed that now programme material is not networked from one
region into some or all of the others, and adverts are no longer 'local',
there is not any need for accurate cueing points around the network, so
advertised starting times are not even nodded at ? I checked the starting
times of about half a dozen programmes tonight, using the teletext clock,
which I believe to be accurate, and not a single one started within 1
minute of the correct time, and a couple of them were off by several
minutes. Just another manifestation of declining standards throughout the
civilised world ... :-\
It's not just between broadcasters, the BBC does it between their channels
as well. Their 'Points Of View' viewer complaints show has done a few
reports on viewers complaining about different times on BBC1 and BBC2, at
least one of which had one of their presenters switching between the 2
channels at programme change to demonstrate the problem.

The problem (which they actually proved was real - surprised they were
allowed to show that on BBC1) is that BBC1 often runs 2 minutes early and
BBC2 is 2 minutes late. Switch one way and you have to wait 4 mins for
programme start, switch the other and you miss the start.

But then the BBC don't seem to care about viewers anymore - the recent
Wimbledon problems where the schedules for the 2 channels were suddenly
switched at the last second for 2 evenings (causing people recording the
last episode of Robin Hood to miss it, needing it to be rebroadcast a few
weeks later - they switched schedules too late for PVRs to catch the
move).

Quite how they thought that helped anyone is a mystery - anyone recording
Wimbledon would have missed it and had BBC1's normal schedule recorded, and
anyone recording the normal BBC1 schedule would have got home to find a
tape / DVD of Wimbledon (or in the case of PVRs, nothing at all - the
change made recordings just cancel).
 
Arfa Daily wrote:

"Dave Plowman (News)" <dave@davenoise.co.uk> wrote in message
news:508ac7be38dave@davenoise.co.uk...
In article <Kfbhm.225069$xB.193120@newsfe10.ams2>,
Arfa Daily <arfa.daily@ntlworld.com> wrote:
And now that "The Bill" has got a 9 o'clock slot, they've changed the
shooting medium to something that looks altogether 'wrong',

It's called HD. ;-)


Are you sure that's what it is ? Any HD that I've seen is just that. A
perfectly 'normal' looking picture, but with a higher resolution. Why
should a higher res camera change the tonal composition of the picture ?
(assuming that it is being shot on video). Looks more like they've changed
from film to video, or the other way round perhaps. Or are maybe using a
video mode that attempts to simulate film, something like that. I saw it
before on the programme when they did a couple of 'specials'. Didn't like
it then, don't like it now.
Oh dear, sounds like the horrible filmic processing - where they reverse the
order of the 2 interlaced half frames to give the picture a juddering
effect which is claimed to look more like film.
 
In article <7dudnfkL1oZuVhrXnZ2dnUVZ8txi4p2d@brightview.co.uk>,
Nigel Feltham <nigel.feltham@btinternet.com> wrote:
Are you sure that's what it is ? Any HD that I've seen is just that. A
perfectly 'normal' looking picture, but with a higher resolution. Why
should a higher res camera change the tonal composition of the picture
? (assuming that it is being shot on video). Looks more like they've
changed from film to video, or the other way round perhaps. Or are
maybe using a video mode that attempts to simulate film, something
like that. I saw it before on the programme when they did a couple of
'specials'. Didn't like it then, don't like it now.

Oh dear, sounds like the horrible filmic processing - where they reverse
the order of the 2 interlaced half frames to give the picture a
juddering effect which is claimed to look more like film.
No - The Bill has never used that. Or rather not in general - it may have
been tried on a 'special'.
The current ones are shot HD using progressive scan.

But IIRC, they suppress one field and repeat the other for this effect?

--
*He who laughs last, thinks slowest.

Dave Plowman dave@davenoise.co.uk London SW
To e-mail, change noise into sound.
 
Dave Plowman (News) wrote:
In article <7dudnfkL1oZuVhrXnZ2dnUVZ8txi4p2d@brightview.co.uk>,
Nigel Feltham <nigel.feltham@btinternet.com> wrote:
Are you sure that's what it is ? Any HD that I've seen is just that. A
perfectly 'normal' looking picture, but with a higher resolution. Why
should a higher res camera change the tonal composition of the picture
? (assuming that it is being shot on video). Looks more like they've
changed from film to video, or the other way round perhaps. Or are
maybe using a video mode that attempts to simulate film, something
like that. I saw it before on the programme when they did a couple of
'specials'. Didn't like it then, don't like it now.

Oh dear, sounds like the horrible filmic processing - where they reverse
the order of the 2 interlaced half frames to give the picture a
juddering effect which is claimed to look more like film.

No - The Bill has never used that. Or rather not in general - it may have
been tried on a 'special'.
The current ones are shot HD using progressive scan.

But IIRC, they suppress one field and repeat the other for this effect?
Certainly, no one in their right mind would deliberately reverse the
interlace ordering - the result is unwatchable.

Sylvia.
 
Dave Plowman (News) wrote:
In article <zm2hm.283768$Sn5.199463@newsfe26.ams2>,
Arfa Daily <arfa.daily@ntlworld.com> wrote:
I reckon that TV companies must now use these laptops with very rough
RTCs ! Have you noticed that now programme material is not networked
from one region into some or all of the others, and adverts are no
longer 'local', there is not any need for accurate cueing points around
the network, so advertised starting times are not even nodded at ? I
checked the starting times of about half a dozen programmes tonight,
using the teletext clock, which I believe to be accurate, and not a
single one started within 1 minute of the correct time, and a couple of
them were off by several minutes. Just another manifestation of
declining standards throughout the civilised world ... :-\

Depends - the actual ad break times are pretty accurate between some of
the companies - the idea being to prevent channel hopping when the ads
come on. You'll just see ads on the others. Hence the way they crash into
the break on progs not made with this schedule in mind. And most of ITV
comes from just one playout centre, so should be synchronised across the
country.
Start times for progs have never been accurately published. They've always
been approximate - apart from at some fixed points in the evening.
Here in Australia I got documentary proof that a station was
deliberately running late. See

http://groups.google.com/group/aus.tv/browse_frm/thread/703f398d4e875bc6/a1a580334e9ff9f2

I had recorded that channel that evening, on a PC that has its clock
synchronized to an accurate clock, and the times given in that schedule
were to within one second of when the material was actually broadcast.

They just weren't the times that had been advertised.

Sylvia.
 
Nigel Feltham wrote:
Arfa Daily wrote:


If you are the
telephone company, or a television broadcaster, though, things really do
work a lot better when the digital signals carried by your network all
are at precisely the same bitrate, no matter where they come from.
Right. At one time TV stations etc had their own accurate pulse generator
referenced to the national standard. Here in the UK it was IIRC from the
National Physical Laboratory.

Dave Plowman dave@davenoise.co.uk London SW
To e-mail, change noise into sound.

I reckon that TV companies must now use these laptops with very rough RTCs
! Have you noticed that now programme material is not networked from one
region into some or all of the others, and adverts are no longer 'local',
there is not any need for accurate cueing points around the network, so
advertised starting times are not even nodded at ? I checked the starting
times of about half a dozen programmes tonight, using the teletext clock,
which I believe to be accurate, and not a single one started within 1
minute of the correct time, and a couple of them were off by several
minutes. Just another manifestation of declining standards throughout the
civilised world ... :-\

It's not just between broadcasters, the BBC does it between their channels
as well. Their 'Points Of View' viewer complaints show has done a few
reports on viewers complaining about different times on BBC1 and BBC2, at
least one of which had one of their presenters switching between the 2
channels at programme change to demonstrate the problem.

The problem (which they actually proved was real - surprised they were
allowed to show that on BBC1) is that BBC1 often runs 2 minutes early and
BBC2 is 2 minutes late. Switch one way and you have to wait 4 mins for
programme start, switch the other and you miss the start.
Much as I'd like to be able to support the view that the BBC's standards
are falling, I have to advise that I was already being frustrated by the
BBC's apparent inability to keep to its published schedules back in the
early 1980s. This is nothing new.

Australia's counterpart, the government funded ABC which also doesn't
carry advertisements, is also apparently unable, or unwilling, to
broadcast things when they say they will.

I suspect that, as with the commercial stations, it's deliberate. I'm
just less than clear what the motivation would be for a non-commercial
station.

Sylvia.
 
On Sat, 15 Aug 2009 14:56:05 -0700, David Nebenzahl
<nobody@but.us.chickens> wrote:

On 8/14/2009 11:46 PM isw spake thus:

In article <4a8509e4$0$7465$822641b3@news.adtechcomputers.com>,
David Nebenzahl <nobody@but.us.chickens> wrote:

Sounds OK to me, except that I just checked and reset my computah's
clock (I use a little Windoze utility called "NIStime" that gets the
time from NIST); it was off by about 5 minutes. Haven't synched it up
for at least 6 months, so I know my RTCC is at least that accurate.
(Running W2K, so I assume that no software process is adjusting my
clock.) Shouldn't most PC clocks be about that accurate? (Older MB,
forget exactly what, can find out if you're interested.)

Most crystals used in computers are within ten or 20 parts per million
of the frequency stamped on the case (you can get a lot more accurate
ones, but computers don't need it). AFAIR, those little cylindrical
"clock" crystals that run at 32,768 Hz are at least ten times poorer,
and far more temperature sensitive to boot. I think the *best* you could
expect from one of those without special treatment would be about a
minute a month.
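(For reference, those ppm figures convert straight into drift with plain arithmetic - nothing here is specific to any particular board or crystal:)

```python
SECONDS_PER_DAY = 86_400

def drift_seconds_per_day(ppm):
    """Worst-case drift for an oscillator off by `ppm` parts per million."""
    return ppm * 1e-6 * SECONDS_PER_DAY

# A 20 ppm crystal can drift about 1.7 s/day (~52 s/month); a 200 ppm
# clock crystal, about 17 s/day (~9 min/month). The original poster's
# 20 min/day works out to roughly 14,000 ppm - far beyond any crystal
# tolerance, so that points at a hardware fault, not ordinary drift.
print(drift_seconds_per_day(20))   # 1.728
```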

Hope I'm not belaboring the point here. I just ran "net time" again and
got the error message "Could not locate a time-server". So I assume that
even if that process is running on my computer, as someone else here
asserted, it's not doing anything to my RTC, as there are no
time-servers to query (that it knows about). Therefore, the time my
computer displays is the actual RTC value. Therefore, it seems to be at
least as accurate as you've stated (about a minute a month), which
actually seems pretty damn good to me. If it gets off by 12 minutes a
year, resetting the thing once annually would yield a clock that should
be close enough for most folks' purposes.
You have to set net time up before you can use it.
 
On Sun, 16 Aug 2009 01:44:13 +0100, "Arfa Daily"
<arfa.daily@ntlworld.com> wrote:

"Meat Plow" <meat@petitmorte.net> wrote in message
news:3230kb.2rp.19.3@news.alt.net...
On Sat, 15 Aug 2009 01:24:19 +0100, "Arfa Daily"
<arfa.daily@ntlworld.com> wrote:

"Dave Plowman (News)" <dave@davenoise.co.uk> wrote in message
news:508ac7be38dave@davenoise.co.uk...
In article <Kfbhm.225069$xB.193120@newsfe10.ams2>,
Arfa Daily <arfa.daily@ntlworld.com> wrote:
And now that "The Bill" has got a 9 o'clock slot, they've changed the
shooting medium to something that looks altogether 'wrong',

It's called HD. ;-)


Are you sure that's what it is ? Any HD that I've seen is just that. A
perfectly 'normal' looking picture, but with a higher resolution. Why
should a higher res camera change the tonal composition of the picture ?
(assuming that it is being shot on video). Looks more like they've changed
from film to video, or the other way round perhaps. Or are maybe using a
video mode that attempts to simulate film, something like that. I saw it
before on the programme when they did a couple of 'specials'. Didn't like
it then, don't like it now.

My Pana 51" has a different color matrix for SD and HD.

Explain some more ?

Arfa
Not sure what a 'color matrix' is but the color is much more vibrant
on HD. And this is a rear projector circa 1999 so its HD is 480p :)
 
In article <0003dcd7$0$7860$c3e8da3@news.astraweb.com>,
Sylvia Else <sylvia@not.at.this.address> wrote:
Much as I'd like to be able to support the view that the BBC's standards
are falling, I have to advise that I was already being frustrated by the
BBC's apparent inability to keep to its published schedules back in the
early 1980s. This is nothing new.
If you give it some thought, it's near impossible to make a prog run 'to
the second', as some seem to want. You could, of course, always make it
shorter and fill the gaps with trails etc - allowing the next one to start
on the second. But that would bring even more complaints. ;-)

--
*Friends help you move. Real friends help you move bodies.

Dave Plowman dave@davenoise.co.uk London SW
To e-mail, change noise into sound.
 
David Nebenzahl wrote:
Hope I'm not belaboring the point here. I just ran "net time" again and
got the error message "Could not locate a time-server". So I assume that
even if that process is running on my computer, as someone else here
asserted, it's not doing anything to my RTC, as there are no
time-servers to query (that it knows about). Therefore, the time my
computer displays is the actual RTC value. Therefore, it seems to be at
least as accurate as you've stated (about a minute a month), which
actually seems pretty damn good to me. If it gets off by 12 minutes a
year, resetting the thing once annually would yield a clock that should
be close enough for most folks' purposes.

http://download.cnet.com/Atomic-Clock-Sync/3000-18512_4-14844.html


--
You can't have a sense of humor, if you have no sense!
 
Dave Plowman (News) wrote:

In article <0003dcd7$0$7860$c3e8da3@news.astraweb.com>,
Sylvia Else <sylvia@not.at.this.address> wrote:
Much as I'd like to be able to support the view that the BBC's standards
are falling, I have to advise that I was already being frustrated by the
BBC's apparent inability to keep to its published schedules back in the
early 1980s. This is nothing new.

If you give it some thought, it's near impossible to make a prog run 'to
the second', as some seem to want. You could, of course, always make it
shorter and fill the gaps with trails etc - allowing the next one to start
on the second. But that would bring even more complaints. ;-)
Maybe not with live shows but when shows are pre-recorded the broadcaster
knows the exact length of each show long before broadcast so should be able
to make the published schedule fit what is actually broadcast - like if you
broadcast a pre-recorded show at 8pm and you know the recording is exactly
60 mins long then advertise the next one as 9:02 to allow for trailers not
9:00 and run late.

Why is BBC1's 'ONE SHOW' always broadcast 2 minutes early (both start and
end times) - I know it's live but showing just 1 trailer before the show
would make it run to schedule, surely showing extra trailers would bring in
fewer complaints than viewers missing the first 2 minutes of every episode.
 
On 8/16/2009 6:52 AM Meat Plow spake thus:

On Sat, 15 Aug 2009 14:56:05 -0700, David Nebenzahl
<nobody@but.us.chickens> wrote:

Hope I'm not belaboring the point here. I just ran "net time" again
and got the error message "Could not locate a time-server". So I
assume that even if that process is running on my computer, as
someone else here asserted, it's not doing anything to my RTC, as
there are no time-servers to query (that it knows about).
Therefore, the time my computer displays is the actual RTC value.
Therefore, it seems to be at least as accurate as you've stated
(about a minute a month), which actually seems pretty damn good to
me. If it gets off by 12 minutes a year, resetting the thing once
annually would yield a clock that should be close enough for most
folks' purposes.

You have to set net time up before you can use it.
Well, duh; that was kinda my point.

So I take it you don't disagree with what I said, or have nothing else
to add?


--
Found--the gene that causes belief in genetic determinism
 
On 8/16/2009 9:55 AM Michael A. Terrell spake thus:

David Nebenzahl wrote:

Hope I'm not belaboring the point here. I just ran "net time" again and
got the error message "Could not locate a time-server". So I assume that
even if that process is running on my computer, as someone else here
asserted, it's not doing anything to my RTC, as there are no
time-servers to query (that it knows about).

http://download.cnet.com/Atomic-Clock-Sync/3000-18512_4-14844.html
Thanks, but I'm happy with the little utility I already use that
contacts NIST (Nat'l Institute of Standards and Technology); see
http://tf.nist.gov/service/its.htm for more info.


--
Found--the gene that causes belief in genetic determinism
 
On Sun, 16 Aug 2009 13:41:27 -0700, David Nebenzahl
<nobody@but.us.chickens> wrote:

On 8/16/2009 6:52 AM Meat Plow spake thus:

On Sat, 15 Aug 2009 14:56:05 -0700, David Nebenzahl
<nobody@but.us.chickens> wrote:

Hope I'm not belaboring the point here. I just ran "net time" again
and got the error message "Could not locate a time-server". So I
assume that even if that process is running on my computer, as
someone else here asserted, it's not doing anything to my RTC, as
there are no time-servers to query (that it knows about).
Therefore, the time my computer displays is the actual RTC value.
Therefore, it seems to be at least as accurate as you've stated
(about a minute a month), which actually seems pretty damn good to
me. If it gets off by 12 minutes a year, resetting the thing once
annually would yield a clock that should be close enough for most
folks' purposes.

You have to set net time up before you can use it.

Well, duh; that was kinda my point.
But only in W2K. XP, Vista, and 7 all have default SNTP setup although
the default server 'time.windows.com' is a bit dodgy. I use
time-nw.nist.gov.
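(Setting that up is a one-liner on each generation of Windows; the server name below is just an example, substitute whichever NIST server you prefer:)

```shell
# On Windows 2000/XP, point the built-in client at an SNTP server:
net time /setsntp:time.nist.gov

# On XP and later, w32tm gives finer control:
w32tm /config /manualpeerlist:time.nist.gov /syncfromflags:manual /update
w32tm /resync
```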

So I take it you don't disagree with what I said, or have nothing else
to add?
I agree that for most a minute per month is reasonable but I would
expect the same accuracy as my $29.99 Timex wristwatch which is more
like a second a month. If you use the NIST SNTP server you'll be as
accurate as how frequently your SNTP client updates.
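(The reason an SNTP client can beat a wristwatch, despite the network in between, is that it estimates the offset from four timestamps - client send, server receive, server send, client receive - assuming symmetric network delay. A minimal sketch of that standard calculation:)

```python
def ntp_offset(t0, t1, t2, t3):
    """Clock offset from an (S)NTP exchange.
    t0 = client transmit, t1 = server receive,
    t2 = server transmit, t3 = client receive
    (t0 and t3 are read from the client's clock)."""
    return ((t1 - t0) + (t2 - t3)) / 2

def ntp_delay(t0, t1, t2, t3):
    """Round-trip network delay, excluding server processing time."""
    return (t3 - t0) - (t2 - t1)

# Client clock 5 s slow, 1 s network delay each way: client sends at
# 100 (true time 105), server receives and replies at 106, client
# receives at 102 (true time 107).
print(ntp_offset(100, 106, 106, 102))  # 5.0
print(ntp_delay(100, 106, 106, 102))   # 2
```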
 
In article <XMSdnfsBxIPy1RXXnZ2dnUVZ8nSdnZ2d@brightview.co.uk>,
Nigel Feltham <nigel.feltham@btinternet.com> wrote:
If you give it some thought, it's near impossible to make a prog run
'to the second', as some seem to want. You could, of course, always
make it shorter and fill the gaps with trails etc - allowing the next
one to start on the second. But that would bring even more complaints.
;-)

Maybe not with live shows but when shows are pre-recorded the
broadcaster knows the exact length of each show long before broadcast so
should be able to make the published schedule fit what is actually
broadcast - like if you broadcast a pre-recorded show at 8pm and you
know the recording is exactly 60 mins long then advertise the next one
as 9:02 to allow for trailers not 9:00 and run late.
Oh they do know the *exact* length of a pre-recorded show - but even those
won't run on time to the second. And so much is automated these days,
playout wise.

Why is BBC1's 'ONE SHOW' always broadcast 2 minutes early (both start
and end times) - I know it's live but showing just 1 trailer before the
show would make it run to schedule, surely showing extra trailers would
bring in fewer complaints than viewers missing the first 2 minutes of
every episode.
Do people really switch on at the exact minute? More of a problem with VHS
recorders where you're swapping channels to record two progs. Luckily PVRs
get round this - to some extent. But the one thing you can be sure of is
programme companies not cooperating with one another just for the viewer.
;-)

--
*Warning: Dates in Calendar are closer than they appear.

Dave Plowman dave@davenoise.co.uk London SW
To e-mail, change noise into sound.
 
On 8/16/2009 2:51 PM Meat Plow spake thus:

I agree that for most a minute per month is reasonable but I would
expect the same accuracy as my $29.99 Timex wristwatch which is more
like a second a month.
So that kinda begs the question of why computer mfrs. can't (or won't)
include clocks that are at *least* as accurate as a Timex, no? Wouldn't
a computah be a more compelling reason for a more accurate clock? (I
know, $$$ bottom line, right?)

If you use the NIST SNTP server you'll be as accurate as how
frequently your SNTP client updates.
Of course, it would be nice to know one's computer would maintain
accurate time even if, god forbid, it was somehow disconnected from The
Network ...


--
Found--the gene that causes belief in genetic determinism
 
On Sun, 16 Aug 2009 17:34:37 -0700, David Nebenzahl
<nobody@but.us.chickens> wrote:

On 8/16/2009 2:51 PM Meat Plow spake thus:

I agree that for most a minute per month is reasonable but I would
expect the same accuracy as my $29.99 Timex wristwatch which is more
like a second a month.

So that kinda begs the question of why computer mfrs. can't (or won't)
include clocks that are at *least* as accurate as a Timex, no? Wouldn't
a computah be a more compelling reason for a more accurate clock? (I
know, $$$ bottom line, right?)
Because it's difficult. The right way to have done it would have been
to do a function call to the RTC (real time clock) every time some
application needs the actual time. IBM or MS, in their infinite
wisdom, elected to install an RTC on the mainboard, copy its contents
to the operating system, and then let the OS keep the time available
without having to read it from the RTC chip. Great idea in the days
of 4.77MHz CPUs, which didn't execute many operations per second. Not
so great an idea with 3GHz processors, where the much larger number of
operations per second will produce far more lost interrupts per
second. The result is clock drift, always in the form of losing time.
Most apps that require accurate time (e.g. an SMPTE time code
synchronized non-linear editor, SONET, etc) will usually get the time
from an external source, rather than use the OS or even the RTC.
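(The lost-interrupt theory is easy to quantify. The legacy PC timer fires at roughly 18.2 Hz - the standard PIT channel 0 default, assumed here - so every missed tick costs about 55 ms:)

```python
TICK_HZ = 18.2065  # legacy PC timer (PIT channel 0) interrupt rate
SECONDS_PER_DAY = 86_400

def seconds_lost_per_day(missed_ticks_per_second):
    """How far the OS clock falls behind per day when timer
    interrupts are dropped at the given average rate."""
    return missed_ticks_per_second / TICK_HZ * SECONDS_PER_DAY

# Missing even one tick every 4 seconds loses ~20 minutes a day -
# the same order as the loss the original poster describes.
print(round(seconds_lost_per_day(0.25)))  # 1186
```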

If you use the NIST SNTP server you'll be as accurate as how
frequently your SNTP client updates.

Of course, it would be nice to know one's computer would maintain
accurate time even if, god forbid, it was somehow disconnected from The
Network ...
There are internal GPS receivers that will supply accurate bus timing.
<http://www.symmetricom.com/products/gps-solutions/bus-level-timing/>

If you're worried about losing sync when the internet hiccups, you can
go cheap and just use the NMEA 0183 time data from the GPS or the 1pps
time ticks. A last resort is a WWVB time receiver, which works quite
well in the middle of the night, when you probably don't need it.
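(If you do go the GPS route, the time arrives as plain ASCII sentences; a sketch of pulling UTC out of an NMEA 0183 RMC sentence - field layout per the standard, and the sample sentence is the usual textbook example:)

```python
from datetime import datetime, timezone

def gprmc_utc(sentence):
    """Extract UTC time from an NMEA 0183 RMC sentence.
    Field 1 is hhmmss(.sss), field 2 is the fix status ('A' = valid),
    field 9 is ddmmyy."""
    fields = sentence.split(',')
    if not fields[0].endswith('RMC') or fields[2] != 'A':
        raise ValueError("not a valid RMC fix")
    return datetime.strptime(fields[9] + fields[1][:6],
                             "%d%m%y%H%M%S").replace(tzinfo=timezone.utc)

sample = "$GPRMC,123519,A,4807.038,N,01131.000,E,022.4,084.4,230394,003.1,W*6A"
print(gprmc_utc(sample))  # 1994-03-23 12:35:19+00:00
```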

Incidentally, I had an odd experience back in the stone age of PCs. I
was doing work for a local PC dealer. I wrote my first, and almost
last, Turbo Pascal program that displayed an analog clock on the CGA
screen, and planted it on a PC in the window of the store. I knew it
wasn't terribly accurate, but it was tolerable (at about 5 minutes per
day). Shoppers would walk up to the window, look at the computer
screen, and then reset their wrist watches using the PC as a
reference. Of course, the computer MUST be more accurate. I
eventually had to put a sign in the window warning that this was a bad
idea.


--
Jeff Liebermann jeffl@cruzio.com
150 Felker St #D http://www.LearnByDestroying.com
Santa Cruz CA 95060 http://802.11junk.com
Skype: JeffLiebermann AE6KS 831-336-2558
 
In article <4a88a59a$0$7469$822641b3@news.adtechcomputers.com>,
David Nebenzahl <nobody@but.us.chickens> wrote:

So that kinda begs the question of why computer mfrs. can't (or won't)
include clocks that are at *least* as accurate as a Timex, no? Wouldn't
a computah be a more compelling reason for a more accurate clock? (I
know, $$$ bottom line, right?)
Yup. Most of these computers use motherboards which are manufactured
under extreme competitive pressure. Shaving a few pennies off of the
bill-of-materials, per board, can make the difference between getting
the contract and not.

--
Dave Platt <dplatt@radagast.org> AE6EO
Friends of Jade Warrior home page: http://www.radagast.org/jade-warrior
I do _not_ wish to receive unsolicited commercial email, and I will
boycott any company which has the gall to send me such ads!
 
