Theoretically, that's true -- and it's a good point to bring up. In practice, though, DSLR manufacturers have had a very hard time bringing the low-light advantage of big pixels to market (the D3S was one exception). As of right now, most 2-micron pixels (e.g. digicams) have about one-third the read noise of most 6-micron pixels (DSLRs), giving the same noise floor per unit area after integration. But even if they fix whatever that problem is in the next generation, the difference will only affect people who shoot at very high ISO (a large percentage of online forum users, but a very low percentage of photographers overall). The majority of photographers are already limited by photon shot noise alone -- which so far has remained the same across a huge variety of pixel sizes, thanks to microlenses. (That's not counting the effects of the OLPF, sharpening, and subjective perception -- all of which favor smaller pixels with respect to noise if and when they come into play.)
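To see why "one-third the read noise" gives the same noise floor per unit area, here is a minimal sketch of the arithmetic. The numbers are illustrative assumptions, not measured camera data: nine 2-micron pixels cover the area of one 6-micron pixel, and uncorrelated read noise adds in quadrature when those pixels are aggregated.

```python
import math

# Illustrative pixel pitches (microns); not tied to any specific camera.
big_pitch = 6.0
small_pitch = 2.0

# Number of small pixels covering one big pixel's area.
n = (big_pitch / small_pitch) ** 2  # 9 small pixels per big pixel

# Assumed read noise in electrons RMS; the small pixel has one-third
# the read noise of the big one, as the text describes.
read_noise_big = 3.0
read_noise_small = read_noise_big / 3.0

# Uncorrelated noise adds in quadrature across the n aggregated pixels.
aggregated_read_noise = math.sqrt(n) * read_noise_small

# The aggregated small-pixel patch matches the big pixel's noise floor.
print(n, aggregated_read_noise, read_noise_big)
```

Under these assumptions the aggregated read noise of the small-pixel patch equals that of the single big pixel, which is the "same noise floor after integration" point above.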
It wouldn't -- unless you assume (as Nikon apparently did for this article) that the viewing conditions are such that lower resolutions (less than 36 MP) cannot take full advantage of the display. If they had instead assumed that all their users only display uncropped images on HD projectors and 4x6 prints, the PDF would have been a lot smaller.
The way your test is set up mixes the effect of sensor and lens size with the effect of pixel size. For example, if you had tried the 20D+70mm against the 5D2+112mm, you would have found the 5D2 far superior, even though the pixel size is the same. That is because of the benefits of sensor size (especially when the f-number is kept constant).
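The 70mm-vs-112mm pairing comes from matching the field of view across the crop factor. A small sketch of that equivalence, assuming the Canon 20D's 1.6x crop factor and an arbitrary, constant f-number (f/4 is just a placeholder):

```python
# Match field of view across formats: full-frame focal length is the
# crop-body focal length times the crop factor (1.6x for the 20D).
crop_factor = 1.6
fl_crop = 70.0                    # mm, on the 20D
fl_ff = fl_crop * crop_factor     # ~112 mm, on the 5D2

# Hold f-number constant, as in the comparison above.
f_number = 4.0                    # assumed placeholder value

# Entrance-pupil diameter = focal length / f-number, so the full-frame
# setup has a larger pupil and gathers more total light for the same
# framing -- the "benefit of sensor size" the text refers to.
d_crop = fl_crop / f_number
d_ff = fl_ff / f_number
light_ratio = (d_ff / d_crop) ** 2  # ~2.56x more light, = crop_factor^2

print(fl_ff, light_ratio)
```

Note that the light-gathering advantage equals the crop factor squared and is independent of the particular f-number chosen, which is why the comparison favors the larger sensor whenever the f-number is held constant.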