How can I estimate how power consumption varies with display frame rate and screen resolution?

Nov 15, 2018
I have a display panel with a resolution of 390×390 (1.4-inch circular) running at a 45 Hz frame rate. The supply voltage from the power IC is 6 V. With that setup I measured the current into the display at about 34 mA, giving a power of roughly 240 mW.

I'm wondering whether I can estimate the current and power consumption of another display panel with a higher resolution of 252×572 (1.95-inch circular) at a 60 Hz frame rate. Since changing the frame rate requires modifying the microcontroller firmware, I'd like a formula that predicts how the power scales, ideally with a reasonably precise ratio. Is there a known relationship between frame rate, resolution, and current/power consumption that allows a quick estimate without measuring with a multimeter or oscilloscope? Thank you!
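
Here is the kind of first-order scaling I had in mind, in case it helps clarify what I'm asking for. It assumes the panel's power draw is dominated by pixel switching and therefore scales linearly with pixel count and frame rate; it ignores panel technology (OLED vs. LCD), driver IC efficiency, emission/backlight current, and displayed content, so it is only a rough sketch, not a substitute for measurement:

```python
# Rough first-order estimate, assuming power scales linearly with
# (pixel count x frame rate). All function and variable names here are
# my own illustration, not from any library.

def estimate_power_mw(p_ref_mw, pixels_ref, fps_ref, pixels_new, fps_new):
    """Scale a measured reference power to a new resolution and frame rate."""
    return p_ref_mw * (pixels_new / pixels_ref) * (fps_new / fps_ref)

# Reference panel: 390x390 @ 45 Hz, measured ~240 mW on a 6 V rail
p_ref = 240.0
pixels_ref = 390 * 390      # 152,100 pixels
fps_ref = 45

# Target panel: 252x572 @ 60 Hz
pixels_new = 252 * 572      # 144,144 pixels
fps_new = 60

p_est = estimate_power_mw(p_ref, pixels_ref, fps_ref, pixels_new, fps_new)
print(f"Estimated power:   {p_est:.0f} mW")      # ~303 mW under these assumptions
print(f"Estimated current: {p_est / 6.0:.1f} mA")  # ~50 mA, assuming the same 6 V rail
```

Is a linear model like this reasonable, or do other terms (static bias current, driver overhead, emission current) dominate in practice?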