A 'card' is the whole thing you plug into your computer, i.e. the PCB plus all its chips, connectors, etc.
A GPU is the main chip on a card, so there's only one per card except on the very top-end dual-GPU cards like the Radeon HD 4870 X2 or GeForce GTX 295.
I don't actually know the details of how Nvidia's shader clock works, just that it runs a lot higher than the core clock. ATI cards are different: their core and shaders run at the same speed (though ATI core clocks are usually a lot higher than Nvidia core clocks).
It would help if you knew what was in the core, so let me give a basic overview. There are three "clocks" on an Nvidia card. The first isn't really part of your question, but it's still a clock: the memory clock, which is how fast the memory on the card runs. Just like with a CPU, the "CPU" (here called a GPU) and the memory run at different speeds. And just like a CPU, there are different units inside the GPU/core: the shaders, which do most of the actual work, and everything else. That gives you two more frequencies, the core frequency and the shader frequency. Nvidia keeps them unlinked, while AMD keeps them linked (meaning AMD cards run the core and the shaders at the same frequency). There aren't two chips in there, just different parts of one chip running at different speeds.
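To make the linked-vs-unlinked idea concrete, here's a tiny sketch. The specific MHz figures are just illustrative examples roughly in line with cards from that era (a GTX 285-class Nvidia card and an HD 4870-class ATI card), not exact specs:

```python
# Three clock domains on an Nvidia card: core, shader, memory.
# Nvidia runs the shaders on their own (much higher) clock.
nvidia_card = {"core_mhz": 648, "shader_mhz": 1476, "memory_mhz": 1242}

# ATI/AMD links the shader clock to the core clock: one number covers both.
ati_card = {"core_mhz": 750, "memory_mhz": 900}
ati_card["shader_mhz"] = ati_card["core_mhz"]  # linked, by definition

# Nvidia: shaders unlinked and much faster than the core.
assert nvidia_card["shader_mhz"] > nvidia_card["core_mhz"]
# ATI: core and shaders always at the same speed.
assert ati_card["shader_mhz"] == ati_card["core_mhz"]
# And as noted above, the ATI core clock tends to be higher than Nvidia's.
assert ati_card["core_mhz"] > nvidia_card["core_mhz"]
```

So when you overclock an ATI card's core, you're raising the shader speed too; on an Nvidia card the two can be adjusted separately.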