OK, to be exact then: given a specific source, the PQ (picture quality) is only driver-dependent.
To be even more exact:
Given a specific source for the decoder, the bitstream coming out over HDMI must be bitwise identical to the reference signal.
If it isn't, either the decoder is applying some form of post-processing, or the decoder is faulty.
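For the skeptics: this is straightforward to check if you can capture the raw HDMI output. A minimal sketch of the comparison, assuming you already have a reference decode and a capture as files on disk (the file names here are made up):

```python
# Sketch: compare a captured decoder output against a reference decode.
# If the hashes differ, something in the chain altered the data.
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Hash the file in chunks so large captures don't need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

reference = sha256_of("reference_decode.raw")  # hypothetical file name
captured = sha256_of("hdmi_capture.raw")       # hypothetical file name
print("bit-perfect" if reference == captured else "decoder altered the data")
```

Matching hashes mean the path is bit-perfect; anything else means post-processing or a fault.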
It never fails to amaze me that people have no difficulty accepting bit-perfect reproduction in a digital audio system, but for some reason the same property in a digital video system is suddenly surrounded by all kinds of black magic.
If you store a text file on your hard disk, read it back, and discover that it isn't equal to the original, you'd certainly consider the system faulty. The same logic applies to a digital video system: if it (unintentionally) alters the data, it's faulty.
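The text-file analogy as code, for what it's worth; the content is arbitrary:

```python
# Round-trip check: write data out, read it back, demand bitwise equality.
original = b"The quick brown fox jumps over the lazy dog."
with open("roundtrip.txt", "wb") as f:
    f.write(original)
with open("roundtrip.txt", "rb") as f:
    read_back = f.read()
assert read_back == original, "storage system is faulty"
```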
It reminds me of the days of analog amplifiers, when people had all sorts of weird notions about what would "sound better". A colleague summed it up in a hilarious but true statement: "Tubes sound different than transistors only when you break them." It's basically the same here: reproduction quality, whether audio or video, is about being as close to the original as possible.