Let's say you want to OC a 2500K to 5.5 GHz and you have some crazy liquid nitrogen cooling kit (not realistic, just as an example), and you increase the voltage to like 1.8 V or something, but your temps stay at 20-50 Celsius. Will it still kill the CPU? If so, how?
High voltage is what is most likely to kill the CPU instantly; high temps will kill it over time as the doping in the transistors changes, but high voltage can overstress the transistors and cause them to fail rapidly. Every transistor has a maximum voltage it can withstand between its source and drain; exceed that voltage and it breaks down and fails. Modern CPUs use low-voltage transistors to conserve power, but this also means a much lower voltage threshold for the transistors in the chip.
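To get a feel for why a big voltage bump is so much harsher than a clock bump alone, here is a rough back-of-the-envelope sketch. Dynamic switching power scales roughly with C·V²·f, so voltage enters squared; all the numbers below (the capacitance, the stock clocks and voltage) are made-up illustrative values, not Sandy Bridge datasheet figures.

```python
def dynamic_power(capacitance, voltage, frequency):
    """Approximate dynamic switching power: P = C * V^2 * f."""
    return capacitance * voltage ** 2 * frequency

# Hypothetical baseline vs the overclock from the question (values are assumptions):
stock = dynamic_power(capacitance=1e-9, voltage=1.2, frequency=3.3e9)
overclocked = dynamic_power(capacitance=1e-9, voltage=1.8, frequency=5.5e9)

# (1.8/1.2)^2 * (5.5/3.3) = 2.25 * 1.67 = 3.75x the stress on the chip
print(f"Relative power: {overclocked / stock:.2f}x")
```

Even if exotic cooling carries that extra heat away, the squared voltage term is the part the transistors themselves feel, which is why the voltage limit matters independently of temperature.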
Just like any other electronic device, your CPU cannot withstand significantly more voltage than it was designed for. Plug a lamp meant for 110 V into a 220 V outlet and the bulb, and possibly the fixture, will fail quite quickly; the same will happen to your CPU if you exceed its maximum safe voltage.
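The lamp analogy works out numerically too: since P = V²/R, doubling the voltage quadruples the power dissipated in the filament. A quick sketch (it assumes a hypothetical 60 W bulb and ignores the filament's resistance changing with temperature):

```python
RATED_VOLTAGE = 110.0   # volts
RATED_POWER = 60.0      # watts; an assumed typical incandescent bulb

# The filament's (simplified, constant) resistance: R = V^2 / P
resistance = RATED_VOLTAGE ** 2 / RATED_POWER

def power_at(voltage):
    """Power dissipated in the filament at a given voltage: P = V^2 / R."""
    return voltage ** 2 / resistance

print(power_at(110))  # 60.0 W, as rated
print(power_at(220))  # 240.0 W, four times the rated power
```

Four times the rated power burns the filament out almost immediately, which is the same kind of overstress a CPU sees when it is pushed well past its design voltage.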
Where are you located? In the US you don't usually encounter 220 V outlets except in a few specific situations: some garages and workshops have a few because some power tools run on 220 V, and an electric stove or water heater is usually on a high-current breaker and runs on 220 V. But those use very different outlets, so you can't mix them up. A standard 220 V outlet in the US has the right pin rotated 90 degrees so you can't put a 110 V plug in it and damage the device.
Generally, with any incremental improvement of a CPU line you get a bit more OCing headroom, as the tolerances are a bit tighter and the design a bit more refined. If there was a certain part of the CPU that would become unstable first, they might put faster transistors there, which can buy you even more headroom. How much additional headroom you get really depends on what changes they made, but I would expect you'll get to at least the levels of the average SB chip.