Video DECODING is a lossless operation. If two decoders yield different output from the same bitstream, then at least one of them is faulty. That is in clear contrast to video ENCODING. There is only one lossy step in the whole chain, and that is the encoder at the provider; after that, everything (including uplink and reception) is lossless.
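For what it's worth, this is easy to check if you can dump the raw decoded frames from both decoders: hash each frame and compare. A minimal sketch, assuming 8-bit 4:2:0 output and placeholder file names/resolution:

```python
# Sketch: compare two decoders' raw YUV dumps frame by frame.
# File names, resolution and the 4:2:0 8-bit layout are assumptions.
import hashlib

WIDTH, HEIGHT = 1920, 1080
FRAME_SIZE = WIDTH * HEIGHT * 3 // 2   # 8-bit 4:2:0: full-size Y + quarter-size Cb and Cr

def frame_hashes(path):
    hashes = []
    with open(path, "rb") as f:
        while True:
            frame = f.read(FRAME_SIZE)
            if len(frame) < FRAME_SIZE:
                break
            hashes.append(hashlib.md5(frame).hexdigest())
    return hashes

a = frame_hashes("decoder_a.yuv")   # e.g. dump from the STB decoder
b = frame_hashes("decoder_b.yuv")   # e.g. dump from a reference software decoder

mismatches = [i for i, (x, y) in enumerate(zip(a, b)) if x != y]
print(f"{len(a)} vs {len(b)} frames, {len(mismatches)} mismatching frames")
```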
In my opinion it's the video decoder code and algorithms in the Broadcom chips that are worse. Input is digital and output is digital, but you are not forwarding the compressed stream to the TV set for decoding: the STB chip decodes the MPEG-2 or H.264 data and converts it to an HDMI signal, and it's this conversion that I find worse on Broadcom chipsets. I see saturated reds, for example, on all Broadcom chipsets, from the old DM800 to the latest BCM7413 found in the VU+ Ultimo and ET9200. Colors on STi seem much more "correct". When I test picture quality I usually use prerecorded 1080i material, so scaling and deinterlacing are not in the equation.
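To be concrete about that conversion step: after decoding, the chip still has to map YCbCr to RGB (or at least pick a matrix and range when it builds the HDMI output), and the matrix (BT.601 vs BT.709) and limited/full-range handling are exactly the kind of choices that shift reds. Here's a small numeric illustration of how much the matrix choice alone matters; I'm not claiming this is what the Broadcom chips actually do, it just shows the size of error this step can introduce:

```python
# Illustration only, not any chipset's actual pipeline: the sample below is
# limited-range 8-bit YCbCr for a strong red, and the only thing that changes
# between the two prints is the matrix (BT.601 vs BT.709) used for YCbCr -> RGB.
def clip8(v):
    return max(0, min(255, round(v)))

def ycbcr_to_rgb(y, cb, cr, kr, kb):
    # Limited-range conversion: Y in [16, 235], Cb/Cr in [16, 240].
    kg = 1.0 - kr - kb
    y_ = (y - 16) * 255.0 / 219.0
    pb = (cb - 128) * 255.0 / 224.0
    pr = (cr - 128) * 255.0 / 224.0
    r = y_ + 2 * (1 - kr) * pr
    g = y_ - 2 * kb * (1 - kb) / kg * pb - 2 * kr * (1 - kr) / kg * pr
    b = y_ + 2 * (1 - kb) * pb
    return tuple(clip8(v) for v in (r, g, b))

sample = (81, 90, 240)   # pure red, encoded with the BT.601 matrix
print("decoded as BT.601:", ycbcr_to_rgb(*sample, kr=0.299, kb=0.114))    # ~(254, 0, 0)
print("decoded as BT.709:", ycbcr_to_rgb(*sample, kr=0.2126, kb=0.0722))  # ~(255, 24, 0)
```

The same triple comes out as a clean red with one matrix and a visibly orange-shifted red with the other, so a wrong matrix or range choice in the HDMI output stage is enough to explain a colour difference between boxes even when the decoded pixels are identical.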
I believe this is incorrect: decoding is not a lossless operation, and every decoder produces slightly different output. The algorithm leaves room for decoder-specific optimizations that improve quality. At least that was the case for MPEG-2 decoders; I don't know whether this has changed with H.264.
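The MPEG-2 case comes down to the IDCT: the spec only requires the 8x8 inverse DCT to match a reference within a tolerance (IEEE 1180-style conformance), so two conformant decoders can legally produce slightly different pixels from the same coefficients. H.264, as far as I know, defines an exact integer inverse transform, so conformant H.264 decoders should be bit-exact. A toy sketch (not any real decoder's code) comparing a floating-point IDCT with a crude fixed-point one:

```python
# Toy illustration: a floating-point 8x8 IDCT vs a crude fixed-point variant
# disagree by a few pixel levels on the same coefficients. Real decoders use
# better precision, but the MPEG-2 spec only bounds this error, it doesn't
# make it zero.
import numpy as np

# Orthonormal 8-point DCT-II basis matrix.
u = np.arange(8).reshape(-1, 1)
m = np.arange(8).reshape(1, -1)
A = 0.5 * np.cos((2 * m + 1) * u * np.pi / 16)
A[0, :] *= 1 / np.sqrt(2)

def idct_float(X):
    return A.T @ X @ A

def idct_fixed(X, frac_bits=8):
    # Fixed-point variant: basis quantized to `frac_bits` fractional bits.
    Aq = np.round(A * (1 << frac_bits)) / (1 << frac_bits)
    return Aq.T @ X @ Aq

rng = np.random.default_rng(0)
max_diff = 0
for _ in range(1000):
    X = rng.integers(-100, 100, size=(8, 8)).astype(float)  # fake dequantized coefficients
    a = np.round(idct_float(X))
    b = np.round(idct_fixed(X))
    max_diff = max(max_diff, int(np.abs(a - b).max()))
print("largest pixel difference over 1000 random blocks:", max_diff)
```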