What's the logic behind setting the min and max to the same value? I'm not clear on the concept. Isn't there a range for a reason? Wouldn't it be better to use a low min like 0 or 16 (I think that's the actual minimum) with a max of 1024, since the data will usually not be exactly 1024? What's the theory behind making them identical? Why is it beneficial? And if the idea is for it to be rarely used at all, why not set it to 16/16, or turn it off entirely? I don't get it.