Digital bullshit

In article <r5u145lecmg2oqgbr6cvvgr4b5ik7uh6kt@4ax.com>,
Jeff Liebermann <jeffl@cruzio.com> wrote:
In the past, the FCC limited commercials to 15 minutes
per hour. Now, it's 30 minutes, which I find oppressive.
Crikey. Hope that doesn't happen in the UK. More than bad enough at 15
minutes.

--
*Make it idiot-proof and someone will make a better idiot.

Dave Plowman dave@davenoise.co.uk London SW
To e-mail, change noise into sound.
 
On 6/23/2009 5:46 AM Meat Plow spake thus:

I don't remember being able to listen to my favorite analog
radio stations while traveling.
So you're telling us that car radios were just a fantasy and never existed?

Oh, you mean that you couldn't listen to your favorite radio station,
say, all the way across country (the U.S.)?

Poor baby.


--
Found--the gene that causes belief in genetic determinism
 
On 23 June, 12:20, b <reverend_rog...@yahoo.com> wrote:
on a related note, a relative bought one of those 'mini' digital set
top boxes that are really just an extended SCART plug (US: an AV
connector used in Europe) with a plug-in IR detector to stick on the
front or top of your TV.
I have never seen such crap in all my life, even with a known good
signal there was breakup and pixellating. They must use a cheap tuner
or something. In the end I used a first generation box and all was
well!
I also bought one of these (an NPG) and it is pretty good, given it was
the cheapest. The tuner is good, somewhat more sensitive than the average
cheap set-top box, but what I really like about it is its separate IR
sensor, which you can place where it will properly see your remote. The
remote works from anywhere, unlike typical boxes that you must point at
very precisely.

I have also found that most of the time, when a particular brand or model
of receiver appears to be less sensitive, it is actually a problem with
self-generated noise or marginal shielding. Not the tuner itself, but
poor supply filtering or shielding of its digital circuits; in most of
them you can see the noise on some weak analog channels (assuming the
receiver is connected between the antenna and the TV). I have found
two effective solutions:
- use better RF cable along with properly shielded antenna connectors
(note that in recent years it has become very common to use cheap
prebuilt antenna cables that have very bad shielding, if any at all)
- connect some other active device between the antenna and the
receiver (eg. a VCR or a satellite receiver with antenna passthrough).
It really helps, I have done that many times, and my theory is that the
lower noise of the other device helps pick up the weak antenna signals
and stops the receiver's noise from reaching the antenna.
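
A rough way to see why putting a quieter active stage ahead of a noisy receiver can help is the standard cascaded noise figure (Friis) formula. The sketch below uses made-up noise figures and gain, so it only illustrates the general effect, not the behaviour of any particular box:

```python
from math import log10

# Friis' cascaded noise figure: F_total = F1 + (F2 - 1) / G1, where F1 and G1
# are the linear noise factor and gain of the first stage (the loop-through
# device) and F2 is the noise factor of the second stage (the noisy receiver).
# All the dB figures below are assumed example values, not measurements.

def db_to_lin(db):
    return 10 ** (db / 10)

receiver_nf_db = 12.0   # assumed: noisy set-top-box front end
front_nf_db = 6.0       # assumed: quieter VCR / sat-receiver loop-through
front_gain_db = 3.0     # assumed: the loop-through has a little gain

f_total = db_to_lin(front_nf_db) + (db_to_lin(receiver_nf_db) - 1) / db_to_lin(front_gain_db)

print(f"Receiver alone:            NF = {receiver_nf_db:.1f} dB")
print(f"With quieter stage ahead:  NF = {10 * log10(f_total):.1f} dB")
# Note: the improvement depends on the first stage actually having some gain;
# with 0 dB gain the cascade is no quieter, and its own noise makes it worse.
```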

Anything that interrupts the datastream is disastrous
for digital, whereas with analogue this just wasn't an issue.
For marginal reception you can still see the program quite well, with
just some small coloured rectangles appearing randomly. However, I
agree that where the analog was still viewable, the digital does not
work.
 
On Mon, 22 Jun 2009 18:19:15 -0400, Hipupchuck
<hipupchuck@roadrunner.com> wrote:

I don't think digital is ready for prime time.
I haven't had a single digital cell phone conversation without some
audio fuckups of some kind. I can't watch a single television program or
documentary without some kind of audio or video fuckup of some kind.
I listen to PBS radio a lot and every day they have some audio or RF
fuckup of some kind or some fucking emergency test fuckup of some kind.
This digital shit is really a fucked up system. Maybe I'm too old or
something, but I don't remember this problem in the old days with analog
things.
If you don't remember a problem with analog you have a very short
memory. Mine isn't perfect, but I can recall:

Phones (land line): Humming, buzzing, echos on the lines.

Phones (cordless): All of the above, plus fading within 100 ft.

Phones (Analog Cell): Big, heavy, short battery life, constant signal
loss problems.

Radio (AM): Interference from any spark source, including lightning,
automobile ignition, electric drills, 'Skil Saws' and electric fence
chargers. Also signal fading and interference from distant stations.

Radio (FM): Multipath interference, drifting, loss of signal while
driving.

Television: Interference from aircraft, lightning, and every other
source imaginable. Also crappy picture, drifting color and sweep
frequency.

Digital TV isn't perfect, but it also hasn't had 6 decades of
improvements. If you want a demonstration of the potential of digital
TV, take a DVD player, ideally one with both S-video and component
outputs, a 19" analog TV, one of Sony's older Digital sets (such as
the KDL-19M4000), and an appropriate antenna (I'm using a Winegard
GS-1000). Play a recent DVD and compare the picture on your 'perfect'
analog TV and on the digital TV. Then tune in to an HD digital
broadcast. A golf broadcast on CBS is ideal.

PlainBill
 
On 23 June, 19:56, Jeroni Paul <JERONI.P...@terra.es> wrote:
On 23 June, 12:20, b <reverend_rog...@yahoo.com> wrote:

on a related note, a relative bought one of those 'mini' digital set
top boxes

I also bought one of these (an NPG) and it is pretty good, given it was
the cheapest. [...]
Well, I had the misfortune to get one of these: http://www.saxem.com/index_electronica.html

...and my experience was the opposite of yours. I have contacted the
manufacturers and will let you know what they say - don't hold your
breath, though!
 
On Tue, 23 Jun 2009 17:50:20 +0100, "Dave Plowman (News)"
<dave@davenoise.co.uk> wrote:

In article <r5u145lecmg2oqgbr6cvvgr4b5ik7uh6kt@4ax.com>,
Jeff Liebermann <jeffl@cruzio.com> wrote:
In the past, the FCC limited commercials to 15 minutes
per hour. Now, it's 30 minutes, which I find oppressive.

Crikey. Hope that doesn't happen in the UK. More than bad enough at 15
minutes.
The average length of a 1-hour program is 40 minutes, with 20 minutes
of commercials here. I absolutely abhor commercials and watch nothing
live. I plan all my watching well in advance with the help of my DVR.
It takes about 5 seconds to ffwd through commercials, and I've gotten
good enough at it now to know where to stop so the DVR can back up to
the start of the next segment. Also, the programmers are pretty clever
with their timing. Most limit the start of the commercials to after
the first 10-12 minutes of the program. Then it's approx. every 8
minutes after that, depending on what channel you're watching. This is
for a 1-hour show. 30-minute shows are around 8 minutes after the start,
then every 5 or so minutes after, and are generally 1/3 shorter in
duration.

I recall watching or reading something about the strategy of
commercial advertising on TV. It's well studied, given the expense
companies incur to produce and air them.
 
Hi!

Which tuner card do you recommend?
I'm not sure I do, yet.

http://greyghost.mooo.com/tv3way/ was my look at one tuner and some
different software packages.

I would like to find one with onboard decoding, not dependent on
the host CPU, that has well-written drivers (DirectShow).
For *analog* TV capture-to-video, Hauppauge had a couple of different
solutions. The WinTV HVR-1600 mentioned in the review linked to above
has an onboard MPEG-2 encoder as part of its Conexant chipset.

But for digital/ATSC, the process is done entirely in software running
on the host CPU. Software that, for no particularly good reason, could
sap a Pentium 4 531 CPU in a Dell OptiPlex 210L PC.

I think it's a question of efficiency somewhere--and I also think that
one of the Digital TV converter box SoCs could be used on these cards
to shuffle the burden away from the main system CPU.

Just recently, I gave an ATI TV Wonder HD650 a spin, thinking that it
might work better than the Hauppauge card. Getting it installed was an
incredible job. Even the latest version of ATI Setup didn't work at
all. It would constantly drop the machine into a Parity Error
bluescreen!

I'm not the only one who has seen this show up as per some searching
that I did on the subject. You'd think anything serious enough to drop
a working machine into a memory parity error (when the memory is
definitely good and of the right spec) would be glaringly obvious to
these people. I guess not.

I finally did the setup myself, by using the Windows Device Manager to
install the drivers. And with a little prodding, I got the ATI
MultiMedia Center installer to run outside of the multi-installer
sequence that ATI steps through when setting up its software. This worked
around the parity error problems. The video quality isn't nearly as
good as what the Hauppauge card offered. It does seem like *maybe* it
doesn't load the CPU as heavily.

ATI uses their own TV Wonder IC on the TV Wonder HD650 board. It's not
well documented like the Conexant part used by Hauppauge, so what all
it has onboard is a mystery. However, the board very clearly sports a
RAM chip, so I suspect that the ATI IC has some sort of integrated
processor core and quite possibly an MPEG encoder of some sort. But
where it works and when it is used is something I do not yet know.

In the end, a Zenith DTT-901 converter box hooked up to my old ATI TV
Wonder PCI board has worked the best of anything so far when it came
to watching ATSC digital TV on a computer.

William
 
Hi!

I don't think digital is ready for prime time.
Most of the TV channels here are actually doing really well. It hasn't
made the programming any better though. But one of them (WCIA-TV) is
plain miserable. It sucks sucks sucks sucks sucks sucks sucks sucks
sucks sucks and sucks so badly on any set I've watched it on. That is
to say it Just Doesn't Work.

Of course, anything I want to watch on TV is probably on CBS. So it
goes.

I don't know why--I've watched it on sets that could receive their
analog signal with rock-solid clarity and sets that were hooked up to
very good directional antennas pointed the right way. Even the
slightest little thing upsets it to the point of unwatchability.

And did I mention that it sucks? I can't see how these people don't
know there is a problem. (And yes, I'm assuming it is *their* fault
judging by the wide variety of TVs I've watched it on in many
different locations.)

I haven't had a single digital cell phone conversation without
some audio fuckups of some kind.
I hate 'em. Whatever compression or encoding they're using on the
majority of digital cellular telephone networks makes any background
noise sound like intrusive mumbling or underwater bubbling. And that's
to say *nothing* of how people behave in public with them!

(Oh wait! I wrote a rant that was partially about that!
http://greyghost.mooo.com/phonerant.htm)

Of course, I know I'm in the minority. I don't carry one with me regularly,
and I have it turned off when I do have it around.

(I also discovered that there was at least one person in this world
who has less to do than the Maytag repairman of yesteryear. That
person is the person at AT&T who writes you a letter to say that
you've had your phone for so long that they are shutting down the
network it runs on.)

This digital shit is really a fucked up system. Maybe I'm too
old or something, but I don't remember this problem in the old
days with analog things.
The problem (at least as I saw it) was that analog couldn't provide
you with 500 subchannels of TV wrestling (!!!) with high definition
pictures so you can see *all* of the compression artifacting and 5.1
surround sound so you can hear it as well. I'm not sure why we needed
it; it seems to me that it was more of a solution looking for a problem
than anything else. They *say* it's for reuse of the old spectrum for
various other purposes, a claim that I guess I find doubtful at best.

Oh, and of course, analog TV *worked*. It might have a snowy picture
or monophonic sound, but who cared. If you wanted to watch (big if)
you *could*.

Wow. Did I just get on a bit of a soapbox or what?

William
 
On Jun 24, 1:47 pm, "William R. Walsh" <wm_wa...@hotmail.com> wrote:
[...]
But for digital/ATSC, the process is done entirely in software running
on the host CPU. Software that, for no particularly good reason, could
sap a Pentium 4 531 CPU in a Dell OptiPlex 210L PC.
[...]
Right now this computer, an AMD Phenom II X4 920 (2.8 GHz quad core), is
recording QAM-256 HDTV with a Hauppauge 1250 card running WinTV V6.
CPU activity is 12-15%. The other Phenom machine (an 8650, 2.3 GHz tri-
core) is recording ATSC HDTV with an ATI HDTV Wonder and MMC 9.14
software. CPU activity is 6-10%. The old Athlon XP 3200 with MMC 9.14
and another HDTV Wonder runs 15-30% CPU time, and while it's less
tolerant of multi-tasking, it has only one purpose - to record HDTV -
so it's no problem. I'm NOT a fan of ATI's software. I went through a
LOT of 'upgrades' until they finally gave up and started over with the
650 family. Seems they still have issues. The 9.14 software has faults,
but once you figure out where the bodies are buried, it's pretty
livable and, most importantly, predictable / reliable. It's actually a
rather rare event for the software to blow a recording. It's much more
likely that I goofed something up, like not turning on the machine.
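
For anyone who wants to log CPU numbers like those while a recording runs, here is a minimal sketch using Python's third-party psutil package (assumed to be installed); the one-second sample interval and one-minute run are arbitrary choices:

```python
# Minimal sketch: log overall CPU utilisation once a second while a
# TV-recording application is running, then summarise it.
import psutil  # third-party package, assumed installed: pip install psutil

SAMPLES = 60  # arbitrary: watch for about one minute

readings = []
for _ in range(SAMPLES):
    # cpu_percent(interval=1) blocks for one second and returns the
    # average system-wide CPU utilisation over that second, in percent.
    readings.append(psutil.cpu_percent(interval=1))

print(f"min {min(readings):.0f}%  max {max(readings):.0f}%  "
      f"avg {sum(readings) / len(readings):.0f}%")
```

Per-process figures (say, for the capture application alone) could be gathered the same way with psutil.Process(pid).cpu_percent().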

 
William R. Walsh <wm_walsh@hotmail.com> wrote in message
news:e34e1bb9-813a-4210-99b5-51794d0c17c3@r3g2000vbp.googlegroups.com...
[...]
They *say* it's for reuse of the old spectrum for various other
purposes, a claim that I guess I find doubtful at best.
[...]

It's about making money. If I had the brass neck to sell empty space for
billions, I'd be done for fraud.

http://www.eweek.com/c/a/Mobile-and-Wireless/Qualcomm-Moves-Into-Vacant-Broadcasters-Spectrum-313032/

"Long dubbed "beachfront" spectrum, the 700MHz band is considered ideal for
advanced wireless services such as mobile television and wireless broadband
because the signals are strong enough to penetrate most interference. In
all, the spectrum auction brought in almost $20 billion, with Verizon
spending $9.6 billion and AT&T dropping another $6.6 billion. "


--
Diverse Devices, Southampton, England
electronic hints and repair briefs , schematics/manuals list on
http://home.graffiti.net/diverse:graffiti.net/
 
"Meat Plow" <meat@petitmorte.net> wrote in message
news:2tpu0u.69v.17.3@news.alt.net...
[...]
The average length of a 1-hour program is 40 minutes, with 20 minutes
of commercials here. I absolutely abhor commercials and watch nothing
live. [...]
I try to do most of my watching via PVR, either later after the programme
has finished, or about 15 minutes behind so that I 'catch up' in the last
segment after the last commercial break. I got pretty savvy at using x30
whizz speed too by looking at the time stamp that comes up when you do this,
but some of the Sky channels over here seem to have gotten wise to this.
Every couple of ads, they stick in a 'sponsor shot' the same as you get
right at the end of the ad slot. Even though you are watching the time
stamp, it still gets you every time, and you instictively hit the "play"
button ... :-\

Arfa
 
It's about making money. If I had the brass neck to sell empty space for
billions, I'd be done for fraud.
[...]
Yes, much of it is indeed about making money. There has been a huge amount
of public brainwashing going on with regard to digital TV. Most people think
it is better than analogue, because that is what the government (here in the
uk) want them to think. That allows a switch-off of analogue TV services
with the minimum of fuss, freeing up the UHF bands for sale to cell phone
operators and others, for billions.

Some aspects of digital are better. It allows many more channels (if that's
what you want) to be crammed into the same space. When it is a full data
rate transmission or in HD, with an appropriate (that's good quality,
operating in native mode) receiver on the end, the results can be quite
stunning. But at grass roots level, it also has many shortcomings, not the
least of which is pixellating or nothing at all on even a slightly marginal
signal feeding an 'average' receiver. Heavily 'charged' air during
thunderstorm activity has a profound effect on satellite digital reception,
and heavy rain disturbs my digital terrestrial TV to the point of
pixellation and total freeze, even though I am only 20 miles from the
transmitter with a good 'line of sight'. Whilst my analogue reception used
to degrade under similar circumstances, it nevertheless remained
watchable.

If you have a crap digital receiver however, the results can be truly
dreadful. I see supermarket brands all the time, and even on a good quality
transmission, they still look awful. For the most part, the same couldn't be
said for analogue receivers. Over the years, I have owned many 'el cheapo'
analogue TV sets, and they have all performed remarkably well on some very
variable signals.

Thing is, love it or hate it, digital is with us now, and we are just going
to have to embrace its advantages, and live with its shortcomings. I think
that the trick is to spend that bit extra on buying quality equipment, and
make sure that you have the best signal possible available to it. It's easy
to become a technology Luddite, especially as you get older, but sometimes,
I think we cling onto elements of our youth just because we feel comfortable
with them ...

Arfa
 
On Sat, 27 Jun 2009 11:49:35 +0100, "Arfa Daily"
<arfa.daily@ntlworld.com> wrote:

"Meat Plow" <meat@petitmorte.net> wrote in message
news:2tpu0u.69v.17.3@news.alt.net...
On Tue, 23 Jun 2009 17:50:20 +0100, "Dave Plowman (News)"
dave@davenoise.co.uk>wrote:

In article <r5u145lecmg2oqgbr6cvvgr4b5ik7uh6kt@4ax.com>,
Jeff Liebermann <jeffl@cruzio.com> wrote:
In the past, the FCC limited commercials to 15 minutes
per hour. Now, it's 30 minutes, which I find oppressive.

Crikey. Hope that doesn't happen in the UK. More than bad enough at 15
minutes.

The average length of a 1 hour program is 40 minutes with 20 minutes
of commercials here. I absolutely abhor commercials and watch nothing
live. I plan all my watching well in advance with help of my DVR.
Takes about 5 seconds to ffwd through commercials and I've gotten good
enough at it now to know where to stop so the DVR can back up to the
start of the next segment. Also, the programmers are pretty clever
with their timing. Most limit the start of the commercials to after
the first 10-12 minutes of the program. Then its approx every 8
minutes after depending on what channel you're watching. This is for a
1 hour show. 30 minute shows are around 8 minutes after the start then
every 5 or so minutes after and are generally 1/3 shorter in duration.

I recall watching or reading something about the strategy of
commercial advertising on tv. It's well studied seeing the expense
companies incur to produce and air them.

I try to do most of my watching via PVR, either later after the programme
has finished, or about 15 minutes behind so that I 'catch up' in the last
segment after the last commercial break. I got pretty savvy at using x30
whizz speed too by looking at the time stamp that comes up when you do this,
but some of the Sky channels over here seem to have gotten wise to this.
Every couple of ads, they stick in a 'sponsor shot' the same as you get
right at the end of the ad slot. Even though you are watching the time
stamp, it still gets you every time, and you instictively hit the "play"
button ... :-\

Arfa
LOL, they do it here on certain shows. I don't have a timestamp on mine,
so I watch the content whiz by. And when I see the ad spot for that
program pop up in between commercials it gets me sometimes, but I've
wised up to them and let it pass. I can usually stop the ffwd at the
point where it rewinds to the restart. My son was watching with me one
night and asked me "how does the DVR know where the starting point is"
after watching me nail it successively a few times :)
 
On Wed, 24 Jun 2009 09:08:09 -0400, Meat Plow <meat@petitmorte.net>
wrote:

The average length of a 1-hour program is 40 minutes, with 20 minutes
of commercials here.
Lovely.

The FCC regulates the amount of advertising during children's programming:
<http://www.fcc.gov/mb/facts/program.html>
How much advertising can a cable system transmit during children's
programming?
Cable operators may transmit no more than 10.5 minutes of commercial
matter per hour during children's programming on weekends and no
more than 12 minutes of commercial matter per hour on weekdays.

However, there's a catch:
Cable operators are responsible for compliance with the commercial
limits on locally originated programming and on cable network
programming, but are not responsible for compliance on passively
transmitted broadcast stations or on access channels over which
the cable operator may not exercise editorial control.
Since little content is actually generated by the cable company, even
this law can easily be circumvented.

I absolutely abhor commercials and watch nothing
live.
Same here. I prefer my dinner and television to be freshly killed and
rather well done.

I plan all my watching well in advance with help of my DVR.
I plan nothing. Planning takes all the fun out of entertainment.

Takes about 5 seconds to ffwd through commercials and I've gotten good
enough at it now to know where to stop so the DVR can back up to the
start of the next segment.
When I first rented my DVR from DirecTV, it would fast forward about
30 seconds in about 2 seconds. After several firmware "enhancements",
it now advances 20 seconds and takes about 3 seconds. My Tivo 2 can
jump 60 seconds in about 1 second. When I complained to DirecTV, they
sorta hinted that I should consider myself fortunate to still have the
feature working.
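
Back-of-the-envelope, those figures work out to quite different effective skip speeds; a trivial sketch using the approximate numbers quoted above:

```python
# Rough effective fast-forward speeds implied by the figures above.
cases = [
    ("DirecTV DVR, before firmware updates", 30, 2),  # ~30 s skipped in ~2 s
    ("DirecTV DVR, after firmware updates", 20, 3),   # ~20 s skipped in ~3 s
    ("Tivo 2 jump", 60, 1),                           # ~60 s skipped in ~1 s
]
for label, skipped_s, elapsed_s in cases:
    print(f"{label}: ~{skipped_s / elapsed_s:.0f}x real time")
```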

Also, the programmers are pretty clever
with their timing.
Methinks the credit goes to management, as pressured by the
advertisers.

Most limit the start of the commercials to after
the first 10-12 minutes of the program. Then its approx every 8
minutes after depending on what channel you're watching. This is for a
1 hour show. 30 minute shows are around 8 minutes after the start then
every 5 or so minutes after and are generally 1/3 shorter in duration.
Yep. The first 8 minutes or so is called the "hook". The idea is to
get you hooked on watching the rest of the show. There are usually
forward flashes and previews of things to come, all designed to be
interesting and intriguing. However, they soon ruin the effect by
either playing the same commercial over and over, or playing spots
that advertise other upcoming programs. I can sorta tolerate
commercials if they help pay the bills, but what I'm seeing is nothing
more than filler.

I recall watching or reading something about the strategy of
commercial advertising on tv. It's well studied seeing the expense
companies incur to produce and air them.
Yep. Everything that generates revenue is very well studied. The
basic idea is how much profitable garbage can be crammed into a
program before the phone starts ringing or customers drop their
subscription. That's the way the compression level was established.
Crank it up until the phone rings. Then back it down, slightly.

Same with advertising. Keep piling it on until the viewers scream.
Actually, what really happened was hilarious. The viewers didn't
complain or jump ship (churn). The other advertisers complained
because too much back to back advertising tends to reduce the
effectiveness of the later ads as viewers just walk away from the TV
and do something else. The solution was to increase the frequency of
advertisement breaks per hour, resulting in today's 50% ads-to-content
ratio and about 8 commercial breaks per hour. If you happen to be a
politician, and pressure the cable or satellite vendors to reduce the
ratio, they do not claim that the advertising revenue will decrease.
They'll claim that the extra 10-15 minutes or so of programming
content per hour will cost them too much to produce.
 
In article <To2dnQuPDpD7nt3XnZ2dnUVZ_tqdnZ2d@giganews.com>,
Hipupchuck <hipupchuck@roadrunner.com> wrote:

I don't think digital is ready for prime time.
I haven't had a single digital cell phone conversation without some
audio fuckups of some kind. I can't watch a single television program or
documentary without some kind of audio or video fuckup of some kind.
I listen to PBS radio a lot and every day they have some audio or RF
fuckup of some kind or some fucking emergency test fuckup of some kind.
This digital shit is really a fucked up system. Maybe I'm too old or
something but I don't remember this problem in the old days with analog
things.
You just haven't put your ears in for the A>D conversion - doesn't hurt
(much) and is cheap. Kids these days are born that way, but us oldies
have to convert - bit like when decimal currency came along

David - who converted back in '79
 
On Sun, 15 Nov 2009 01:43:38 +0000, David wrote:

Some cut out:

You just haven't put your ears in for the A>D conversion - doesn't hurt
(much) and is cheap. Kids these days are born that way, but us oldies
have to convert - bit like when decimal currency came along

David - who converted back in '79

Sorry David, you just have very low standards, like the people who think
CD-based audio is good quality. A-D conversion can never reproduce a real
simple sine wave, nor can it even approximate a complex multi-sine wave
with any accuracy, even with the best smoothing circuits.
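
For anyone who wants to put numbers on the sampled-sine-wave question, here is a minimal sketch of the sampling-and-quantization step being argued about. The 1 kHz test tone, 44.1 kHz rate and 16-bit depth are simply the nominal CD parameters; the figure it prints (roughly 96-98 dB of signal-to-quantization-noise for a full-scale sine) is a property of 16-bit quantization in general, not a claim about any particular player or recording:

```python
# Minimal sketch: sample a 1 kHz sine at the CD rate (44.1 kHz), quantize it
# to 16 bits, and estimate the signal-to-quantization-noise ratio.
import math

FS = 44_100        # CD sampling rate, Hz
BITS = 16          # CD sample depth
F_TONE = 1_000.0   # test tone frequency, Hz
N = FS             # one second of samples

full_scale = 2 ** (BITS - 1) - 1   # 32767 for signed 16-bit samples

signal_power = 0.0
noise_power = 0.0
for n in range(N):
    x = math.sin(2 * math.pi * F_TONE * n / FS)   # ideal sine, -1..1
    q = round(x * full_scale) / full_scale        # value after 16-bit quantization
    signal_power += x * x
    noise_power += (q - x) ** 2

snr_db = 10 * math.log10(signal_power / noise_power)
print(f"16-bit quantization SNR for a {F_TONE:.0f} Hz sine: {snr_db:.1f} dB")
```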

I know the real insides of your so-called A>D conversion and it really
stinks, since it was based on not-ready-for-prime-time theory that was
several years ahead of what the technology was actually able to deliver.
The result was a very poor sampling rate that gives a flat response,
lacking both transient and brilliance reproduction. All the attempts to
cover up its shortcomings (multiple oversampling (probably the best way
of producing muddy sound ever invented), double data rates, etc.) have
never corrected the base 2K sampling rate that was the fastest they could
make at the time with the fastest RAM and A/D converters they had.
Blu-ray is poor quality for that reason, since it uses CD audio recording
as its base instead of using a real high-quality sampling rate like the
alternative format that was not as well funded.

If Sony and Philips had waited just two years they could have used at
least a 4K sampling rate with triple the number of words, and that could
have produced better audio, but they had tied up a lot of money in the
low-quality system and wanted to profit from it. If CD audio was actually
so great, why did Sony have to buy up all the recording studios and kill
all vinyl production in order to sell it?

There was no real demand for CD recordings, so they had to do this in
order to sell them.

Fact: currently all real high-quality recordings are now being reproduced
on vinyl once again, because CDs are poor for audio and really only fit
for data recording, in spite of all the tricks that have been tried to
improve their sound.

I was servicing those stereos you listened to back in 1979, and I remember
the deaf-teenager syndrome caused by kids sitting on top of real 300-watt
audio systems turned up to full volume. Believe me, I encountered more
than one idiot who had a 300-watt system with the speakers sitting on
either side of their desk, running at or near full volume.

Poor kids never had a chance to really hear any actual quality audio,
since they likely lost more than 20% of their overall hearing, and most
of their hearing in the 12,000-20,000 cycle range, while blasting their
ears with high-power bass notes and even higher-powered high frequencies.

I have suffered some high-frequency hearing loss just because I worked
with these high-power systems for years, but I can sure still tell the
difference between digital audio and genuine analog. Digital loses, hands
down, always! I still have my "system destroyer" record; it is a Philips
recording that has sound from 15 to 20,000 cycles, with a brilliance that
no CD can ever approach and no A/D converter can touch.

Gnack
 
In article <pan.2009.11.15.08.15.01.487664@mailinator.com>,
Gnack Nol <mchozfcesujcfc@mailinator.com> wrote:
Fact: currently all real high-quality recordings are now being reproduced
on vinyl once again, because CDs are poor for audio and really only fit
for data recording, in spite of all the tricks that have been tried to
improve their sound.
Words fail me. Another who can't or won't hear the intrinsic faults of
vinyl.

Rip any vinyl to CD and compare. Properly. Double blind. No one will tell
the difference.

Rip a good CD to vinyl - it has been done - and do the same test. Easy to
tell the difference.
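
The bookkeeping for that kind of double-blind (ABX-style) comparison is simple enough to sketch. In the sketch below the actual playback of the two versions is assumed to be handled elsewhere, and the 16-trial count is an arbitrary choice:

```python
# Minimal sketch of the trial randomization and scoring for an ABX-style
# double-blind listening comparison. Playback of clips A, B and X is assumed
# to be handled outside this script.
import random
from math import comb

TRIALS = 16
correct = 0
for trial in range(1, TRIALS + 1):
    x_is_a = random.choice([True, False])  # hidden assignment: X is A or B
    # In a real test the listener would now audition A, B and X before answering.
    answer = input(f"Trial {trial}: is X clip A or B? [a/b] ").strip().lower()
    if (answer == "a") == x_is_a:
        correct += 1

# Chance of scoring at least this well by pure guessing (one-sided binomial).
p = sum(comb(TRIALS, k) for k in range(correct, TRIALS + 1)) / 2 ** TRIALS
print(f"{correct}/{TRIALS} correct; probability of guessing that well: p = {p:.3f}")
```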

Vinyl adds all sorts of distortions to a signal - and not just clicks and
pops. Of course some like that distortion...

--
*Why do we say something is out of whack? What is a whack? *

Dave Plowman dave@davenoise.co.uk London SW
To e-mail, change noise into sound.
 
Sorry David, you just have very low standards, like the people who think
CD-based audio is good quality. A-D conversion can never reproduce a real
simple sine wave, nor can it even approximate a complex multi-sine wave
with any accuracy, even with the best smoothing circuits.
You know not whereof you speak.


I know the real insides of your so-called A>D conversion and it really
stinks, since it was based on not-ready-for-prime-time theory that was
several years ahead of what the technology was actually able to deliver.
The result was a very poor sampling rate that gives a flat response,
lacking both transient and brilliance reproduction.
Stop by and I'll play some SotA recordings, and you can decide for yourself.

"The proof of the pudding is in the eating." I've been listening to recorded
music for over 40 years, and sound reproduction has finally reached the
point where you are "close[ly] approach[ing] ... the original sound" (to
slightly modify a well-known marketing slogan). I no longer have to close my
eyes to (sort of) imagine I'm in the concert hall.
 
Words fail me. Another who can't or won't hear the intrinsic
faults of vinyl.
I've compared 45-year-old LPs to their CD transfers, and they are often
remarkably close. LP /can/ be a good recording medium -- it just isn't a
very good playback medium. Good LP playback costs about 10 times (at least)
what good CD playback costs.


Why do we say something is out of whack? What is a whack?
According to the OED, this expression appeared in print not much earlier
than 1885. The etymology appears to be based on the same root as "wacky"
(crazy, deluded), rather than the onomatopoeic "whack" (to strike or hit).
 
On Sun, 15 Nov 2009 04:18:29 -0800, William Sommerwerck <grizzledgeezer@comcast.net> wrote:
Words fail me. Another who can't or won't hear the intrinsic
faults of vinyl.

I've compared 45-year-old LPs to their CD transfers, and they are often
remarkably close. LP /can/ be a good recording medium -- it just isn't a
very good playback medium. Good LP playback costs about 10 times (at least)
what good CD playback costs.
There is only one situation where I've found vinyl superior to CD,
but it had nothing to do with either medium. I've had CDs made from
20 year old master tapes that had a terrible S/N due to the age of the
tapes. The LP was superior, but only because the LP aged better than
the master tape used for the CD.
 
