Falling off the edge
The problem with digital TV is that you either get the picture or you don’t. I was watching a show this evening when everything went black. I tried other stations and they were black too. I’m fairly certain there is some sort of weather going on between me and the stations’ antennas. In the old analog days, just a couple of months ago, the picture would have filled with static, but I could have watched the end of the show. Now I just have to wait until I can watch it on the net (and they don’t seem to have fast forward), or catch a repeat. I think I’ll just skip it.
I started thinking about this digital drop-off point. Why should the signal cut out so abruptly? I haven’t studied the technology behind it yet, but I wonder why a digital signal should just drop off to nothing. On occasion, I’ve noticed the picture pixelate when the wind starts blowing or rain falls. I’m thinking that an advanced receiver could capture enough pixels to keep the sound and video going for a bit. A time-delay buffer should be able to mix and fill in the missing data. It’s done in digital video and photography, so why not on TV? Just wondering…
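For what it’s worth, here’s a rough sketch of the kind of patching I’m imagining, in Python. The frame layout, block size, and buffer length are all made up for illustration, and real receivers would surely do something smarter: keep a few seconds of decoded frames around, and when a block of the incoming picture arrives corrupted, borrow that block from the most recent clean frame.

```python
from collections import deque

BLOCK = 16          # size of one video block in pixels (illustrative)
BUFFER_FRAMES = 90  # roughly 3 seconds at 30 fps (illustrative)

# Ring buffer of recently decoded frames. A frame here is a 2-D list of
# blocks; a block is None if it arrived corrupted, otherwise pixel data.
recent_frames = deque(maxlen=BUFFER_FRAMES)

def conceal(frame):
    """Fill in missing blocks by copying them from the newest buffered
    frame that still has that block intact."""
    patched = [row[:] for row in frame]            # don't modify the input
    for y, row in enumerate(patched):
        for x, block in enumerate(row):
            if block is None:                      # lost to signal dropout
                for old in reversed(recent_frames):
                    if old[y][x] is not None:      # found a clean earlier copy
                        patched[y][x] = old[y][x]
                        break
    recent_frames.append(patched)                  # remember it for next time
    return patched
```

Copying stale blocks like this would obviously smear motion after a second or two, but it might be enough to ride out a short fade instead of going straight to black.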