With fluorescents, there are two factors influencing the suitability of a given real power level for a given lamp. The first is the electrode heat requirement: the electrodes must be kept at their optimum temperature, and any deviation shortens the lamp life. With most ballasts, except rapid start (RS) and dimmable ballasts, the cathode heat comes only from the cathode fall losses. That means overdriving a lamp overheats the electrodes with virtually any ballast, while underdriving means the electrodes run cooler than optimum, which shortens life as well. But RS ballasts, as well as decent dimmable ballasts, are designed to provide supplemental heating power to keep the cathode temperature near the optimum even when the cathode fall losses drop at lower arc power, so with those ballasts there is no life issue regardless of ballast factor.
The other factor is mercury vapor pressure. It should be just right, otherwise the efficacy suffers (too low means too little mercury vapor to emit the UV, too high means the radiation gets reabsorbed). Underdriving a lamp usually causes it to operate colder than originally anticipated, so the mercury pressure ends up below its optimum.
The older argon-filled lamps (T12, the old miniature T5, ...) have their optimum performance at rather low temperatures, which are usually exceeded at normal power levels. That means the lamp's highest efficacy is in fact at power levels below the rating. But the rating is optimized not for the highest efficacy alone, but for the lowest total cost of a typical installation (higher power means lower efficacy, so higher total power, but fewer fixtures, so a cheaper installation). Even so, this difference is not more than about 10%, and with modern lamps (krypton-filled T8, ...) it is essentially gone.
And there is yet another aspect: lower power lamps tend to have lower efficacy than their higher power cousins, simply for reasons of scale (with the exception of LEDs, this holds for all known light sources, not just electric ones). So for some lumen requirements it may yield higher overall efficacy to underpower a larger lamp (operating it below its optimum power) than to feed the optimum power into a smaller one: the size advantage gains more efficacy than is lost by the non-optimal drive. This is used mainly in battery powered fixtures and emergency lights. Because of the rather low expected burning hours, it becomes cheaper to use a more efficient larger lamp, underpowered and replaced more frequently due to its shorter life, than to pay for the larger batteries needed to compensate for the lower efficacy of a smaller lamp.
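The battery trade-off above can be sketched numerically. All figures below are illustrative assumptions, not catalogue data: suppose a small lamp delivers 65 lm/W at its rated power, while a larger lamp delivers 90 lm/W at rating and is assumed to lose 15% of that efficacy when underdriven below its optimum.

```python
# Illustrative sketch of the "underdrive a larger lamp" trade-off.
# All numbers are hypothetical assumptions, not real lamp data.

def battery_power_needed(lumens_required, efficacy_lm_per_w):
    """Electrical power the battery must supply for a given light output."""
    return lumens_required / efficacy_lm_per_w

LUMENS_REQUIRED = 500.0  # assumed emergency-light output target

# Option A: small lamp driven at its optimum (rated) power.
small_lamp_efficacy = 65.0  # lm/W, hypothetical

# Option B: larger lamp underdriven below its optimum; it starts from a
# higher rated efficacy (90 lm/W, hypothetical) but is assumed to lose
# 15% of it by running colder than designed.
large_lamp_efficacy = 90.0 * (1.0 - 0.15)  # 76.5 lm/W effective

p_small = battery_power_needed(LUMENS_REQUIRED, small_lamp_efficacy)
p_large = battery_power_needed(LUMENS_REQUIRED, large_lamp_efficacy)

print(f"Small lamp at rated power: {p_small:.1f} W from the battery")
print(f"Large lamp underdriven:    {p_large:.1f} W from the battery")
# The underdriven large lamp still draws less battery power, so a smaller
# (cheaper) battery suffices despite the non-optimal drive and shorter life.
```

With these assumed numbers the underdriven larger lamp needs roughly 6.5 W against 7.7 W for the small one, so the battery savings outweigh the cost of more frequent lamp replacement over the short expected burning hours.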
In commercial installations the smaller lamps are used not because the lower output would be sufficient, but because the larger lamps would simply be too big.
In residential fixtures the requirements are quite different: the required amount of light is usually much lower, so low output lamps are more than sufficient, and the lamps are usually mounted quite low, so lower surface brightness is highly welcome. Now you could use a smaller lamp, but its light would have to be diffused, so besides the lower efficacy you get additional light losses from the diffuser.
If you instead use larger lamps and underdrive them, they are still more efficient than the smaller ones and, most importantly, because of their large size you no longer need the diffuser, so there are no extra light losses from that. And that is where the need for the F40 operated at 25W came from.
Now in Europe there have historically been a lot of "very clever" people in power, who tend to "protect the dumb public" from virtually anything. These people read that significantly underdriving a lamp degrades its performance, so they simply prohibited the sale of ballasts not meeting the "official" specs. That meant the only available fluorescent systems were the bright, full power ones and nothing else. But as these were frequently just too bright for a household, fluorescent technology never gained as much popularity in European homes as the "residential grade" systems did in the US.