There is the main question: why would anyone do that? The lower the arc voltage, the greater the percentage of it that is the voltage drop associated with the electrodes, which generates heat but not light. So the lower the arc voltage, the lower the resulting efficacy of the lamp. So actually the opposite was happening: making the lamps so the arc column drops as high a voltage as possible, while not hitting some other limit.

One of the limits steering most traditional lamp sizes was the ability to operate with just a series choke ballast on the given mains. That is the reason why you find so many lamps with arc voltage in the 50..60V ballpark (about the maximum a 120V mains can support; F20T12, F15T8,...), or 90..120V (the maximum for 230V mains; F40T12,...). Also, when a voltage boost is required, the cheapest and most efficient option is a ballast doubling the mains as OCV (all windings use the same wire), which again leads to the 90..120V ballpark (and tubes like F40T12).

Another restriction, causing mainly the small lamps to deviate from the optimum above, is the intention to have a common ballast for multiple lamp sizes (e.g. the miniature T5 family, where just the F8T5 is at the optimal 65V drop; the 6W and 4W are just designed to share the same ballast). And the last reason to deviate from the optimum is the intention to make the lamp compatible with an existing ballast, but lower the real power to save energy when the technology compensates with higher efficacy, like the F34T12.
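To put rough numbers on the efficacy argument, here is a minimal sketch (my own illustration, not from the post above). It assumes a combined electrode fall of about 15V, a typical textbook figure for hot-cathode fluorescent lamps, and uses a crude ~50% rule of thumb for the maximum arc voltage a plain series choke can support on a given mains; both figures are assumptions for illustration only:

```python
# Assumption for illustration: combined anode + cathode fall of ~15 V,
# a typical figure for hot-cathode fluorescent lamps.
ELECTRODE_FALL_V = 15.0

def electrode_loss_fraction(arc_voltage_v: float) -> float:
    """Fraction of the lamp power dissipated in the electrode falls.

    The electrode fall is part of the total arc (lamp) voltage, so the
    fraction lost as electrode heat is simply V_electrodes / V_arc.
    """
    return ELECTRODE_FALL_V / arc_voltage_v

for v_arc in (30, 60, 100, 120):
    print(f"{v_arc:>4} V arc: {electrode_loss_fraction(v_arc):.0%} lost in electrodes")

# Rough rule of thumb (an assumption, not from the post): for stable operation
# on a plain series choke, the arc voltage should stay below about half of the
# mains RMS voltage.
for mains in (120, 230):
    print(f"{mains} V mains: max ~{mains * 0.5:.0f} V arc on a simple choke")
```

With these assumed figures, a 30V lamp would waste about half of its power in the electrodes, a 60V lamp about a quarter, and a 120V lamp only an eighth, while the choke rule gives roughly 60V on 120V mains and roughly 115V on 230V mains, in line with the ballparks mentioned above.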
So yes, it is possible to modify the lamp designs for lower arc voltage and higher currents, but there is no reason for doing that.