An online expert recently expounded on the detailed design of a balun; the following is an excerpt about wire sizing.
> The wire gauge used limits the current-handling capacity of the wire: run too thin a wire and it will heat up; run much too thin a wire for the power in use and it will fuse open. Current-carrying capacity of wire is typically rated for either power-transmission applications or chassis-wiring applications. The latter, higher, rating is the one relevant to designing a balun. How much current your 50 watt signal generates depends on the impedance it's looking into. If you're talking about a 50 ohm system, with a perfect match you'll deliver one amp through your balun wires when driving 50 watts into it. Allowing for, say, a 4:1 SWR, the worst-case current (at 12.5 ohms) is 2 amps. If you're using this as a tuner balun, perhaps to drive a multi-band doublet, the impedance can vary widely, so oversizing the wires is easy insurance. Here's a table of wire current-carrying capability: https://www.powerstream.com/Wire_Size.htm
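The current figures in the excerpt are easy to check from P = I²R (a minimal sketch; the function name is my own):

```python
import math

def rf_current(power_w: float, r_ohms: float) -> float:
    """RMS current delivered into a resistive load: I = sqrt(P / R)."""
    return math.sqrt(power_w / r_ohms)

print(rf_current(50, 50))    # 1.0 A into a matched 50 ohm load
print(rf_current(50, 12.5))  # 2.0 A at the 12.5 ohm extreme of a 4:1 SWR
```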
For convenience, the relevant part of the table linked above is quoted for discussion.
So, the poster recommends wire with a chassis-wiring rating of 2A for 50W, with reserve capacity for a worst case of VSWR=4.
Using his reference table, 0.4mm diameter wire (26AWG, 0.129mm²) is the minimum that meets the requirement, and the table gives its resistance as 134Ω/km (0.134Ω/m).
The issue here is that heating (which is what melts wire or its insulation) is due to I²R, and R at RF can be much greater than the DC resistance shown in the table, on which the current ratings are based.
Let's put some numbers to that at 30MHz.
A simple skin-effect model tells us that at 30MHz the effective resistance of that 0.4mm wire is about 8.7 times its DC resistance.
In fact, it takes a 3.4mm diameter wire to achieve a resistance of 0.134Ω/m at 30MHz; that is larger than 8AWG, or 9.1mm².
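A sketch of those figures, using the textbook skin-depth formula and a common large-argument approximation for round wire. The resistivity constant and the particular approximation are my assumptions, not necessarily those behind the 8.7 quoted above; slightly different constants move the ratio a little.

```python
import math

RHO_CU = 1.72e-8      # resistivity of annealed copper, ohm·m (assumed)
MU0 = 4e-7 * math.pi  # permeability of free space, H/m

def skin_depth(f_hz: float, rho: float = RHO_CU) -> float:
    """Skin depth in metres: delta = sqrt(rho / (pi * f * mu0))."""
    return math.sqrt(rho / (math.pi * f_hz * MU0))

def rac_rdc_ratio(dia_m: float, f_hz: float) -> float:
    """Approximate Rac/Rdc for a round wire with radius a >> delta:
    Rac/Rdc ~= a / (2 * delta) + 0.25."""
    a = dia_m / 2
    return a / (2 * skin_depth(f_hz)) + 0.25

def dia_for_rac(r_per_m: float, f_hz: float) -> float:
    """Diameter needed for a given AC resistance per metre, treating the
    current as confined to a delta-thick surface shell: R ~= rho / (pi * d * delta)."""
    return RHO_CU / (math.pi * skin_depth(f_hz) * r_per_m)

f = 30e6
print(f"skin depth @ 30 MHz: {skin_depth(f) * 1e6:.1f} um")          # ~12 um
print(f"Rac/Rdc for 0.4 mm wire: {rac_rdc_ratio(0.4e-3, f):.1f}")    # ~8.5
print(f"dia for 0.134 ohm/m: {dia_for_rac(0.134, f) * 1e3:.1f} mm")  # ~3.4 mm
```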
Clearly, 50W baluns are not commonly constructed of either 0.4mm (26AWG) or 9.1mm^2 (8+AWG):
- the DC chassis wiring rating is not relevant to the application for many reasons;
- finding a conductor with equivalent RF resistance to the DC rated conductor is not sane either.
## A better way
A more rational approach is to make design decisions about the maximum operating temperature of the components (core, wire insulation, enclosure etc) and the total loss permitted within that temperature limit, apportion that loss to the major elements (core loss, conductor loss etc), then find the minimum wire size that meets the conductor loss budget. When you have done all that, run a long heat test and measure the component temperatures to validate the design process.
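To make the apportionment concrete, here is a sketch with purely hypothetical numbers: the loss budget, the core/conductor split and the wire length are illustrative assumptions, not design values from this article; only the 2A worst-case current comes from the excerpt above.

```python
# All numbers below are illustrative assumptions, not design values.
p_in = 50.0              # W, transmitter power
total_loss_limit = 0.05  # allow 5% of input power as heat (hypothetical)
core_share = 0.6         # apportion 60% of the budget to core loss (assumed)
cond_share = 0.4         # and 40% to conductor loss (assumed)

p_heat_max = p_in * total_loss_limit  # 2.5 W total dissipation budget
p_core_max = p_heat_max * core_share  # 1.5 W for the core
p_cond_max = p_heat_max * cond_share  # 1.0 W for the conductors

# Conductor loss is I^2 * Rac * length, so the budget fixes a maximum Rac:
i_rms = 2.0   # A, worst case at 4:1 SWR in a 50 ohm system (per the excerpt)
length = 1.0  # m of wire in the winding (assumed)
rac_max = p_cond_max / (i_rms**2 * length)
print(f"maximum allowable AC resistance: {rac_max:.2f} ohm/m")  # 0.25 ohm/m
```

The final step is then to pick the smallest wire whose AC resistance at the highest operating frequency is below that figure, and to validate with a heat run.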
An example of conductor loss: a 1m length of 2mm (12AWG) enamelled copper wire pair, twisted with 2.2mm centre-to-centre spacing, dissipates about 2% of the input power as conductor loss at 30MHz with a 50Ω load. Do you see why you are more likely to see that size of wire than the 26AWG arrived at by the online expert's method?
There may be times, though, when the acceptable loss is lower than the heat-limited loss: for example, at low power you might nevertheless choose larger components for higher radiation efficiency (though that is probably not common practice).
Many of my HF tuner balun designs used a ferrite core wound with a transmission line made of 0.9mm PTFE insulated 7-strand silver plated copper wire, twisted (12 twists/m) with centre-to-centre spacing of 1.4mm. This yields a transmission line with Zo of approximately 100Ω and matched line loss of 0.06dB/m @ 30MHz. A 0.9m length (as used in the balun) with a 50Ω load at 30MHz loses about 2.1% of the input power as heat in the conductors, and the PTFE insulation gives it a very high temperature rating, even in excess of the core's Curie temperature. Total loss is dominated by ferrite core losses, so you might ask why so many builders use much heavier wire, commonly with poorer insulation (eg enamelled wire).