With CRTs, a horizontal line is drawn by sweeping the magnetically deflected, tightly focused electron beams from the red, green, and blue electron guns across a phosphor-coated surface. This is a bit simplified, but good enough for "government work". The magnetic deflection signal that creates the horizontal sweep is typically derived from a simple ramping voltage that resets at every HSYNC. The number of pixels per line is a function of how quickly the electron-gun drive circuitry can modulate the beams.
In an LCD monitor, the pixels are defined absolutely by row and column position. This is also a bit simplified, but the video image is shifted, pixel by pixel, through the drive electronics and latched by the sync signals. The input is either analog voltages, which the monitor must convert to digital internally, or digital to begin with. In either case, a digital clock signal (provided or derived) running at the pixel rate (very high frequency, often over 100 MHz) is used to clock the pixel data into the row drivers.
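To get a feel for where those clock rates come from, you can estimate the pixel clock from the total line and frame sizes (active pixels plus blanking) and the refresh rate. The numbers below are the standard CEA-861 timings for 1920x1080 @ 60 Hz; treat this as an illustrative back-of-the-envelope sketch, not a spec for any particular monitor.

```python
# Back-of-the-envelope pixel clock estimate (illustrative numbers).
# The *total* line/frame sizes include the blanking intervals, which
# is why the clock runs faster than active_width * active_height * refresh.

def pixel_clock_hz(total_pixels_per_line, total_lines_per_frame, refresh_hz):
    """Pixel clock = total pixels shifted out per second."""
    return total_pixels_per_line * total_lines_per_frame * refresh_hz

# CEA-861 timing for 1920x1080 @ 60 Hz:
# 2200 total pixels per line, 1125 total lines per frame.
clk = pixel_clock_hz(2200, 1125, 60)
print(clk / 1e6, "MHz")  # 148.5 MHz
```

That 148.5 MHz figure is why the pixel clock is the fastest signal in the whole video chain.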
Edit: The above explanations center on the respective monitor types. The circuitry used to CREATE the video signals must be able to clock data out of graphics memory at rates corresponding to the resolution of the displayed image. FWIW, it is no mean feat to send these high-speed signals over wires.
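To put a number on the graphics-memory side: scanning out raw frames means sustaining roughly resolution x refresh rate x bytes per pixel of read bandwidth. A rough sketch (assuming 32-bit pixels; actual hardware timings and pixel formats vary):

```python
# Rough framebuffer scan-out bandwidth estimate.
# Only active pixels are counted; blanking intervals carry no pixel data.

def scanout_bandwidth_bytes_per_s(width, height, refresh_hz, bytes_per_pixel=4):
    """Bytes per second that must be read out of graphics memory."""
    return width * height * refresh_hz * bytes_per_pixel

bw = scanout_bandwidth_bytes_per_s(1920, 1080, 60)  # 32-bit pixels
print(bw / 1e6, "MB/s")  # ~497.7 MB/s
```

And that is just the display refresh; any drawing into the framebuffer competes for the same memory.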
Additionally, for LCD monitors, if your displayed image is jittering horizontally, this could indicate an improper phase relationship between the video data and the video clock.
Regards,
Altazi