It is largely a marketing thing. A Quadro K6000 costs around $4,300, while a K4000 is about $700 and a GTX 980 about $550.
Premiere uses the VRAM on the video card most extensively. The more bandwidth and GB, the better. The GPU only accelerates; it does not do that much of the computation. The GPU can accelerate only what the CPU has already pre-processed, so the CPU is your main bottleneck. This is also why cheaper GTX cards with more memory bandwidth beat Quadros that cost twice as much.
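The "CPU is the bottleneck" point can be sketched with Amdahl's law. This is a hedged illustration, not Adobe's actual pipeline: the 30% figure for the GPU-accelerated share of a timeline is an assumption for the sake of the arithmetic.

```python
def overall_speedup(gpu_fraction, gpu_speedup):
    """Amdahl's law: total speedup when only gpu_fraction of the work
    benefits from a GPU that is gpu_speedup times faster."""
    return 1.0 / ((1.0 - gpu_fraction) + gpu_fraction / gpu_speedup)

# Assumed numbers: 30% of a typical render is GPU-accelerated effects.
# A 2x faster GPU yields only ~1.18x overall; even an (imaginary)
# near-infinitely fast GPU tops out at ~1.43x. Speeding up the CPU-bound
# 70% pays off far more, which is why extra cores beat a pricier card.
print(overall_speedup(0.3, 2))     # ~1.18
print(overall_speedup(0.3, 1000))  # ~1.43
```

The exact fraction varies per project, but the shape of the curve is the point: once the GPU-accelerated share is small, upgrading the GPU hits a hard ceiling very quickly.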
http://www.pugetsystems.com/labs/articles/Adobe-Premiere-Pro-CC-Professional-GPU-Acceleration-502/#1080p%28PPBM6%29Results
Their first result is a real-life scenario. The blue lines are the 4-core i7 results, the red lines the 6-core results. As you can see, going from a K2000 to a K5000 does not gain much, but adding two more cores to the i7 cuts the times drastically.
Their second benchmark is a custom 4K timeline with only GPU-accelerated effects. This is unrealistic, because everyday editing is mostly made up of CPU-bound scenarios.
Also, the advertising on NVIDIA's website is about the Adobe Mercury Playback Engine, which is rather different from rendering. Naturally, for advertising purposes, NVIDIA packed the benchmark with as many GPU-accelerated effects as they could. What you are seeing there is a theoretical best case, which covers less than 1% of real use cases.
At the office, we have four Z420s with 8-core 3.0 GHz Xeons and four more Z420s with 6-core 3.5 GHz Xeons, all fitted with Quadro K4000s. None of us wants a faster video card, but we all want more CPU cores. A K4000 is completely sufficient for 8 cores running at 3.0 GHz; you have to jump to 12 cores before a K5000 offers any performance increase over the K4000.
Going with a K6000 makes sense if your primary software is After Effects, but it makes no sense if your primary software is Premiere. Not to mention that a single GTX Titan for $900 beats the $4,000+ K6000. In this day and age, for professional work, it does not make sense to spend more on the GPU than on the CPU. We are still in the CPU age.