Just the other day I came across an article that included a bit on a wireless computer-to-monitor link; let me go dig and see if I can find it.
I wonder why IR didn't catch on. Maybe the available hardware was too anemic at the time? But there have been gigabit IR links in R&D labs. As screens get larger, the bandwidth requirements grow. Maybe that, combined with the negotiation overhead between devices at unknown distances, did it in. (Line of sight is also a problem for some applications, but for my monitor setup it would work fine.)
A small nit: the proposed UWB (ultra-wideband) schemes at 5GHz fall in the SHF band, and the 60GHz ones are EHF (extremely high frequency), i.e. millimeter-wave; IR is generally considered to start at 300GHz or above.
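If it helps, the band names fall straight out of the wavelength arithmetic. A quick Python sketch (the band edges are just the conventional ITU ones, not anything from the proposals themselves):

    # Rough sketch: classify a carrier by the conventional ITU band edges
    # (SHF 3-30 GHz, EHF / millimeter-wave 30-300 GHz, IR above that).
    C = 299_792_458  # speed of light, m/s

    def band(freq_hz):
        if freq_hz < 30e9:
            name = "SHF (centimeter-wave)"
        elif freq_hz < 300e9:
            name = "EHF (millimeter-wave)"
        else:
            name = "infrared / sub-millimeter"
        wavelength_mm = C / freq_hz * 1000
        return f"{freq_hz / 1e9:.0f} GHz -> {wavelength_mm:.2f} mm, {name}"

    for f in (5e9, 60e9, 300e9):
        print(band(f))
    # 5 GHz -> 59.96 mm, SHF (centimeter-wave)
    # 60 GHz -> 5.00 mm, EHF (millimeter-wave)
    # 300 GHz -> 1.00 mm, infrared / sub-millimeter

So a 60GHz carrier has a 5mm wavelength, hence "millimeter-wave", and the IR boundary sits right at the 1mm mark.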
There isn't a heck of a lot of interference per se at 60GHz, but it is very hard to build a link that isn't highly directional and line-of-sight sensitive. My feeling is that the short range and line-of-sight requirements mostly limit current incarnations of this technology to situations that are already amenable to HDMI cables.
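The range problem is mostly just physics: free-space path loss grows with the square of frequency, so going from 5GHz to 60GHz costs you roughly 22 dB that has to be clawed back with antenna gain, i.e. narrow, directional beams. A back-of-envelope sketch using the plain Friis path-loss term (the distances and frequencies here are illustrative, not from any particular product):

    import math

    # Free-space path loss: FSPL(dB) = 20*log10(4*pi*d*f/c).
    # This ignores the oxygen absorption peak near 60 GHz (roughly
    # 15 dB/km), which makes the 60 GHz picture even more lopsided.
    C = 299_792_458  # speed of light, m/s

    def fspl_db(distance_m, freq_hz):
        return 20 * math.log10(4 * math.pi * distance_m * freq_hz / C)

    for f in (5e9, 60e9):
        print(f"{f / 1e9:.0f} GHz over 10 m: {fspl_db(10, f):.1f} dB")
    # 5 GHz over 10 m: 66.4 dB
    # 60 GHz over 10 m: 88.0 dB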
There's a good writeup from earlier this year on Ars Technica ( http://arstechnica.com/gadgets/news/2009/02/cutting-the-cord... ) that goes through some of the commercial offerings around that time. No idea if any of them are still around, or in what form.
Sorry about that; it didn't register that GP's question was tangential to GGP's UWB-focused links. Regarding IR, I agree that interference is the biggest issue. The electronics also get harder as you approach THz frequencies, but that could be worked around if IR radiation weren't so pervasive. Interference is probably the reason you can only get up to 1Gbps or so, even in a lab (and you'd need 10.2Gbps to transmit HDMI 1.3, for instance).
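For reference, the 10.2Gbps figure is just the HDMI 1.3 TMDS arithmetic (a quick sketch using the spec's published maximums):

    # HDMI 1.3: max TMDS clock of 340 MHz, 3 data channels,
    # 10 bits per channel per clock (8b/10b TMDS encoding).
    tmds_clock_hz = 340e6
    channels = 3
    bits_per_clock = 10

    print(f"{tmds_clock_hz * channels * bits_per_clock / 1e9:.1f} Gbps")
    # 10.2 Gbps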
edit: found it:
http://gigaom.com/2008/02/27/too-many-signals-delivering-wir...
It's a 2008 article, so you'd expect some movement since then.
edit2: more info:
http://www.wirelesshd.org/