I've got a 32A supply in the garage and was looking at various heater elements on ebay.
Notice how many of them are 220VAC? From working on industrial machines a while back, I'm aware that you can normally allow +/- 10% on the voltage, so a 220V element should be fine anywhere from 198 to 242VAC.
I've two concerns. The first is that these are resistive heaters, so if you put 230VAC (UK mains voltage) through them they will pull more current than rated, and so output more heat.
Here's the basic math using V=IR (voltage = current * resistance) & P=VI (power = voltage * current). I'm no electrician, and I'm aware that AC can be a little different to DC so please correct me if I've dropped a bollock!
** Assume a 3kW element on its rated 220VAC supply **
I = P/V = 3000 / 220 = 13.6A
R = V/I = 220 / 13.6 = 16.2 ohms
** Raise the supply to 230VAC **
I = V/R = 230 / 16.2 = 14.2A
P = VI = 230 * 14.2 = 3266W
...so a 4.5% rise in voltage has raised the power consumption by 8.9% (for a fixed resistance the power goes with the square of the voltage).
Going back to my original rule of thumb about +/- 10% on the voltage, that's not too bad - 230V is within it. But my problem is that I've measured my supply at 250VAC, which is within +/- 10% of 230V but pushes the 220VAC elements outside the rule-of-thumb zone! According to the above my power output would be 3874W - 29% above the rated power!
Back to the drawing board: on my 32A supply, two 220VAC 3kW heaters would pull about 26A if I ignore the change in power output at the higher voltage:
I = P/V = 6000 / 230 = 26.1A
** Correct the power for the 230VAC supply (2 x 3266W) **
I = P/V = 6532 / 230 = 28.4A
** Now what about my measured 250VAC? (2 x 3874W) **
I = P/V = 7748 / 250 = 31.0A
All of a sudden I'm knocking on the door of the 32A breaker! As these power ratings are presumably for the element up to temperature, it's likely the elements will pull more current when stone cold. Like a light bulb, I'm assuming these will have an inrush current, so I think I'll be forever popping fuses unless I modulate the supply to the elements - and then the maths & circuit design get a bit more complicated!