Forget about CAT7 and beyond unless the client has the endless resources needed for 10-gigabit gear. CAT5e, 6, or 6a is adequate for gigabit.
In my experience with wireless, full 1080p with HD sound lags unless it has around 70-80 Mbps available (although the Blu-ray spec only calls for around 50 Mbps maximum), so gigabit, less overhead (and with greater signal stability than wireless), will easily carry 10-15 full 1080p streams over one line. My gigabit line installs usually test out at around 850 Mbps after network overhead. But will a single line actually have to carry all the data?
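As a sanity check, the arithmetic behind that 10-15 stream estimate can be sketched like this (a rough Python calculation using the ballpark numbers from this thread, not official figures):

```python
# Rough sketch of the stream-count math above. The per-stream figure is
# the worst-case estimate from this thread; the link figure is a typical
# tested gigabit throughput after overhead. Neither is an official spec.
STREAM_MBPS = 80           # 1080p video + HD audio, worst case
MEASURED_LINK_MBPS = 850   # usable gigabit throughput after overhead

streams = MEASURED_LINK_MBPS // STREAM_MBPS
print(f"~{streams} simultaneous 1080p streams per gigabit line")
# → ~10 simultaneous 1080p streams per gigabit line
```

Using the Blu-ray spec's ~50 Mbps instead of the worst-case 80 Mbps is where the upper end of the 10-15 range comes from.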
If you are looking at streaming something like Blu-ray images to many HTPCs from a NAS, then in a genuinely high-use situation you could look at teaming (link aggregation) several NAS connections to the switch, although the limiting factor might become your NAS read speeds. Each individual device would only carry one stream, which is nothing over gigabit.
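The teaming-versus-read-speed bottleneck described above can be sketched as follows. All the numbers here are illustrative assumptions (four teamed links, a hypothetical 2000 Mbps sustained NAS read), not measurements:

```python
# Hedged sketch: the effective feed from a teamed NAS uplink is capped by
# whichever is slower, the aggregated links or the NAS array's read speed.
LINK_MBPS = 850        # usable throughput per gigabit link (as tested above)
TEAMED_LINKS = 4       # e.g. four NICs in a LACP team (assumption)
NAS_READ_MBPS = 2000   # hypothetical sustained read speed of the array

uplink_mbps = LINK_MBPS * TEAMED_LINKS            # raw team capacity
effective_mbps = min(uplink_mbps, NAS_READ_MBPS)  # the bottleneck wins
print(f"Effective NAS feed: ~{effective_mbps} Mbps "
      f"(~{effective_mbps // 80} simultaneous 1080p streams)")
# → Effective NAS feed: ~2000 Mbps (~25 simultaneous 1080p streams)
```

One design note: link aggregation typically balances per-flow, so any single client's stream still tops out at one link's speed; the extra capacity only helps when many clients are pulling streams at once.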
The real question is how many 1080p videos your server can feed out of its port. The chain can only go as fast as its slowest device. It is highly unlikely you will be able to afford anything faster than 1G in a home environment... and as realbeast points out, even 1G is overkill.
1G is fully supported on CAT5e. It does no good to buy more advanced cable; it will still only run at 1G. The only reason to buy CAT6 cable is that sometimes you can get it cheaper than CAT5e.
The only time you should even consider cable faster than CAT5e is if you have 10G ports in your equipment. Then you need CAT6a or CAT7 (plain CAT6 is only rated for 10G on short runs, around 55 m).
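For quick reference, the commonly cited ratings can be captured in a minimal lookup (standard 100 m channel lengths; the CAT6 entry reflects its reduced-distance 10G special case):

```python
# Commonly cited maximum speed / channel length per cable category.
# CAT6 can carry 10GBASE-T, but only on short runs (~55 m); at the full
# 100 m channel it is a 1 Gbps cable like CAT5e.
CABLE_RATINGS = {
    "CAT5e": ("1 Gbps", "100 m"),
    "CAT6":  ("10 Gbps", "55 m"),
    "CAT6a": ("10 Gbps", "100 m"),
}

for cat, (speed, length) in CABLE_RATINGS.items():
    print(f"{cat}: {speed} up to {length}")
```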
Most 10G ports are in very high-end disk arrays and some "servers" that are actually clusters of servers.