I know this was a problem with 802.11b/g, and to a lesser extent n. If a b device connected to a b/g router, the router had to enable b-compatible protection signaling, slowing every g device down. On top of that, airtime was shared per frame rather than per unit of time, so overall throughput degraded toward the speed of the slowest-connected device. Having an old b device in the far corner of your house permanently connected could drag your WiFi network toward 802.11b's 1 Mbps even though the router was capable of 54 Mbps 802.11g.
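To put rough numbers on that airtime effect (my own back-of-the-envelope sketch, not anything from the spec; it ignores preambles, ACKs, and contention): if stations alternate equal-sized frames, the slow station holds the channel for most of each round, so aggregate throughput collapses toward the slow rate.

    def aggregate_throughput_mbps(rates_mbps, frame_bits=12_000):
        # Each station sends one 1500-byte frame per round; a slow station
        # holds the channel far longer per frame, dominating total airtime.
        airtime_us = sum(frame_bits / r for r in rates_mbps)  # microseconds per round
        total_bits = frame_bits * len(rates_mbps)             # bits sent per round
        return total_bits / airtime_us                        # megabits per second

    print(aggregate_throughput_mbps([54, 1]))   # ~1.96 -- one b straggler at 1 Mbps
    print(aggregate_throughput_mbps([54, 54]))  # 54.0  -- two healthy g devices

The slow device doesn't literally force the fast one down to 1 Mbps, but by monopolizing the airtime it achieves nearly the same result.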
802.11n on 2.4 GHz was a little better since n rates were negotiated independently of b/g (n devices couldn't be forced down to b/g rates). But there was still some performance loss from the overhead of coexisting with b/g on the same channel, since every n transmission had to announce itself in a way legacy devices could decode.
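For a sense of the scale of that loss (rough figures of my own, treat them as illustrative): mixed-mode n prepends a legacy-decodable preamble to every frame so that b/g radios know to stay quiet, and that fixed per-frame cost eats a noticeable slice of airtime even when no b/g device is actively transmitting.

    # Rough mixed-mode preamble overhead (illustrative figures only).
    legacy_preamble_us = 20      # approx. legacy-compatible preamble duration
    payload_bits = 12_000        # one 1500-byte frame
    rate_mbps = 72.2             # 20 MHz 802.11n MCS 7, short guard interval

    payload_us = payload_bits / rate_mbps                # ~166 microseconds of data
    overhead = legacy_preamble_us / (legacy_preamble_us + payload_us)
    print(f"~{overhead:.0%} of airtime goes to the legacy preamble")  # ~11%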
I believe the situation improved with 5 GHz 802.11n/ac (ac is 5 GHz only, while n works on both 2.4 GHz and 5 GHz): n and ac were designed so the router can switch between the two seamlessly. Both use adaptive encoding that scales the data rate with the amount of noise on the channel, so you can assign both to the same frequency slots. If at any given moment only one n or ac device is transmitting, it sees a quiet channel and gets the full bandwidth. If both are transmitting, each sees the other's broadcasts as noise, which ends up divvying up the bandwidth between them automatically. I will admit, though, that I haven't really followed the development of n/ac closely (I basically skipped 5 GHz until after ac was out), so this is a bit of conjecture on my part.
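As a toy illustration of that rate-vs-noise adaptation (my own sketch; real 802.11 rate selection is more involved, and the SNR thresholds below are rough guesses): each station measures link quality and picks the fastest modulation/coding scheme the channel can sustain, so a noisier channel automatically yields a lower rate.

    # Toy rate adaptation: pick the fastest scheme the measured SNR allows.
    # (min_snr_db, rate_mbps) pairs for a few rough 20 MHz 802.11n points.
    MCS_TABLE = [
        (25, 72.2),  # 64-QAM 5/6
        (18, 43.3),  # 16-QAM 3/4
        (10, 21.7),  # QPSK 3/4
        (5, 7.2),    # BPSK 1/2
    ]

    def pick_rate(snr_db):
        for min_snr, rate in MCS_TABLE:
            if snr_db >= min_snr:
                return rate
        return 0.0  # too noisy to sustain any scheme

    print(pick_rate(30))  # quiet channel -> 72.2
    print(pick_rate(12))  # neighbor raises the noise floor -> 21.7

Two stations on the same channel each see a worse effective SNR while the other is active, so each backs down a step and the capacity ends up split between them, matching the divvying-up behavior described above.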