Do CFLs Still Pass the Test?

Energy Star cracks down on deficient CFLs.

May 03, 2010
A version of this article appears in the May/June 2010 issue of Home Energy Magazine.

Chris Granda (Image credit: Grasteu Associates)

Glenn Reed (Image credit: Vermont Energy Investment Corporation)
The combined efforts of lighting manufacturers, retailers, energy efficiency program sponsors, and the Energy Star for CFLs program have succeeded in moving CFLs from the margins to mainstream. Energy Star-labeled CFLs still promise some of the cheapest and easiest energy savings available. What do we know about CFL performance, and are manufacturer claims credible? On January 26, 2010, the Department of Energy (DOE) sent out its latest delisting letter, removing 34 CFL models from Energy Star because independent, third-party tests showed that actual performance did not comply with program requirements. The results of four years of this performance testing on 121 models confirm that CFLs are big savers, but also raise concerns about the reliability of some manufacturers’ products.

A Little Background

Before the Energy Star for CFLs program debuted in 1999, energy efficiency program managers often set their own technical criteria for CFLs, and the criteria differed among programs. Energy Star provided a common, consistent standard for CFL performance, and by 2001 most U.S. energy efficiency programs were using the Energy Star specification as a minimum requirement. Non-Energy Star-qualified CFLs are available, but due to the market-driving effect of energy efficiency programs, most CFL manufacturers actively seek the Energy Star label.

To qualify a product under Energy Star, a manufacturer typically pays to test preproduction CFLs at an accredited lighting laboratory. The laboratory provides the test results to the manufacturer, and the manufacturer submits the results to Energy Star to qualify the product. This process creates several potential problems. Preproduction CFLs may be more carefully manufactured than regular-production products. Also, commercial relationships between manufacturers and laboratories create at least the potential for conflicts of interest and the manipulation of test data by manufacturers.

In 2001 a group of energy efficiency program managers became concerned that the Energy Star label did not always mean high-quality CFLs, based on field experience and feedback from program participants. The Program for the Evaluation and Analysis of Residential Lighting (PEARL) was created with leadership from the Natural Resources Defense Council and support from both DOE and the Environmental Protection Agency (EPA) as a way to generate unbiased information on CFL performance. Since 2001, PEARL has gone through nine testing cycles, each time purchasing CFLs at retail and testing them at the Lighting Research Center at Rensselaer Polytechnic Institute in Troy, New York. Lighting manufacturers have no financial involvement with PEARL, and are not informed when their products are chosen for testing. PEARL results are sent to PEARL’s sponsors, to manufacturers, and to Energy Star, but are not released to the general public.

In any given year, thousands of CFL models are qualified with Energy Star. Of course, there are many fewer physically distinct CFLs than there are unique CFL models, because the same CFLs produced at one factory may be packaged and sold under several brand names. In any case, each PEARL cycle was able to test only a small percentage of the contemporary Energy Star-qualified CFLs available on the market (see Table 1). PEARL’s sponsors nominated CFLs for each test cycle based primarily on high sales volumes but also on customer complaints, or in order to look at new types of product. Therefore the CFLs tested by PEARL are not really representative of all Energy Star-qualified CFLs. However, they are generally representative of the Energy Star-qualified CFLs promoted by energy efficiency programs when the tests took place.

PEARL testing (and CFL technology) has evolved over time. Sample sizes and test protocols sometimes changed between the earlier PEARL cycles, so the results of those cycles cannot be compared with one another. The most recent cycles—6 through 9—have consistently used the same tests and sample sizes required by the Energy Star for CFLs specification. Results for PEARL cycles 6 through 9 can therefore be compared both between cycles and against the requirements of the Energy Star specification. DOE agrees, and has usually considered negative PEARL results to be grounds for removing CFLs from Energy Star, resulting in delistings like the most recent one on January 26.

PEARL cycles 6 through 9 tested 121 CFL models, which means that over 1,500 individual CFLs were put through tests covering key Energy Star specification requirements. The CFLs tested came from 29 different brands, including brands from the largest companies (Philips Lighting, Osram/Sylvania, General Electric, Feit, and TCP); and more than 20 other companies. PEARL tested all major CFL types sold in the United States, including bare-tube, covered or encapsulated, reflector, and multiple-lighting-level models, both three-way and dimmable. (PEARL only tested one-piece, self-ballasted screw-based CFLs. Dimmable products were tested at 100% of rated power, as per Energy Star.) All of the CFL models were available for sale in the United States at the time of testing and were purchased at retail directly by the PEARL sponsors.

Performance Relative to Energy Star

At least 20% of the CFL models tested in PEARL cycles 6 through 9 did not meet one or more Energy Star performance requirements. In other words, PEARL found that one in five CFLs did not perform as well as suggested by the laboratory data submitted by the manufacturer during Energy Star qualification. However, the Energy Star for CFLs specification has many criteria, and PEARL’s results do not necessarily mean that 20% of CFLs did not save energy or provide adequate lighting. The following discussion summarizes the results for three critical performance indicators: efficiency, lumen maintenance, and lifetime.


Figure 1. CFL efficiency by wattage and PEARL cycle. (Image credit: Grasteu 2010)

CFL Efficiency

PEARL found that all but 4 of the 121 CFL models tested met the Energy Star efficiency criteria in place at the time the CFLs were purchased. The upper yellow line in Figure 1 represents the Energy Star for CFLs Version 4.0 efficiency requirement for CFLs of 15 watts or greater. The blue bars in Figure 1 show that, on average, even back in 2005 this class of CFL already met or exceeded today’s 65 lumen per watt (Lm/W) efficiency bar. Energy Star efficiency requirements for CFLs of less than 15 watts are lower (see bottom yellow line). PEARL found that CFLs of less than 15 watts also exceeded Energy Star efficiency levels, typically providing more than 60 Lm/W.

As expected, the efficiency of covered and reflector CFLs (not shown) was lower than that of the bare CFLs. However, these products also easily met the relevant Energy Star requirements. One related discrepancy that did show up in the PEARL testing is that CFL packaging often overstated both the actual electricity consumption and the light output of products.


Figure 2. Lumen maintenance by CFL type and PEARL cycle. (Image credit: Grasteu 2010)

CFL Lumen Maintenance

Almost all kinds of lamps get dimmer as they age. CFLs are no exception. PEARL results show that bare CFLs consistently exceeded Energy Star requirements by retaining more than 80% of their original light output at 40% of rated life (see yellow line in Figure 2). Average lumen maintenance for covered CFLs varied for complicated reasons, but was generally acceptable. (It appears that with covered CFLs, PEARL (and Energy Star) may not have established proper initial performance. Many of these CFLs apparently require more than 100 hours of operation to achieve maximum lumen output. The high blue bars in cycles 8 and 9 in Figure 2 were caused by some covered CFLs that were actually brighter at 1,000 hours than at 100 hours.)

The lumen maintenance results for reflectors were disappointing. PEARL tested 20 models of reflector CFLs in cycles 6 through 9, and 8 of them did not meet Energy Star’s 80% lumen maintenance requirement at 40% of rated life. On average, these 8 dim reflector CFLs had lost more than one-quarter of their light output at only 40% of their rated lives. These results raise concerns about recent trends in energy efficiency programs to focus on reflectors as “specialty” CFLs.
A scientist inspects the CFL test rig. (Image credit: Rensselaer’s Lighting Research Center)

CFL Lifetime

CFL lifetime is obviously important to consumers. Lifetime is also important for energy efficiency program sponsors, because the longer the CFL lasts, the more savings program sponsors can claim. Because it takes about a year to fully test a CFL with an 8,000-hour average rated life, PEARL was not able to perform complete life tests. However, because PEARL did test lumen maintenance, and because the test protocol is very similar to the life test, PEARL was able to gather data on CFL lifetime up to 40% of rated life.

Lighting products are not supposed to last exactly as long as the average rated life printed on the packaging. For CFLs, the industry definition of average rated life is the length of time it takes for half of a sample of 100 CFLs to fail, while being switched on and off every few hours in a controlled environment. Out of a sample of 100 CFLs rated at 8,000 hours it is acceptable for some to fail at 7,000 hours as long as an equal number last until 9,000 hours. The sample can still achieve an average rated life of 8,000 hours. In practice, high-quality volume production should result in actual life (measured in the lab) that is close to average rated life, with most failures occurring shortly before or after the rating.
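For readers who want to check the arithmetic, the averaging described above can be sketched in a few lines. This is purely illustrative—a hypothetical 100-lamp sample, not PEARL data:

```python
# Illustrative only: a hypothetical 100-lamp sample in which early
# failures (7,000 hours) are exactly balanced by late survivors
# (9,000 hours), per the rated-life discussion above.
failure_hours = [7000] * 25 + [8000] * 50 + [9000] * 25
average_life = sum(failure_hours) / len(failure_hours)
print(average_life)  # 8000.0 -- the sample still rates at 8,000 hours
```

Even though a quarter of the sample fell 1,000 hours short of the rating, the sample as a whole still supports an 8,000-hour average rated life.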

The Illuminating Engineering Society (IES) estimates that for linear fluorescent lamps, typically less than 5% of a test sample will fail at earlier than 50% of average rated life. PEARL collected data on CFL failures for a shorter period—up to 40% of rated life. Therefore PEARL hoped that the rate of failures would be even less than the IES’s 5% for linear fluorescents.
Energy Star considers each CFL individually, model by model. (DOE tries to keep track of the technical genealogy of CFL models, and to know when different models of a CFL marketed under different brands are actually the same product built by the same manufacturer, as is often the case.) Like Energy Star, PEARL tests ten CFLs per model, and with sample sizes this small, the statistical variance is too large to measure early failure rates with sufficient accuracy. This is why the Energy Star for CFLs specification allows two CFLs out of a sample of ten, 20%, to fail prior to 40% of rated life before denying Energy Star qualification. Twenty percent early failures is obviously much higher than the 5% cited by the IES, but Energy Star considers only this high failure rate to be statistically strong enough to justify denying Energy Star status with a sample size of ten.
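The statistical weakness of ten-lamp samples can be sketched with the exact binomial distribution. The failure rates below are the ones discussed in the text (5% from the IES, 20% from the Energy Star threshold); the calculation itself is our illustration, not part of the Energy Star specification:

```python
from math import comb

def prob_at_least(k, n, p):
    """P(at least k failures in a sample of n, given per-lamp failure rate p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# A product with a true 5% early-failure rate (the IES benchmark) still
# shows 2 or more failures in a ten-lamp sample almost 9% of the time...
p_good_flagged = prob_at_least(2, 10, 0.05)

# ...while a product with a true 20% early-failure rate shows fewer than
# 2 failures -- and so escapes the threshold -- about 38% of the time.
p_bad_escapes = 1 - prob_at_least(2, 10, 0.20)

print(round(p_good_flagged, 3), round(p_bad_escapes, 3))  # 0.086 0.376
```

With samples this small, any threshold tighter than two failures in ten would disqualify too many acceptable products, which is why Energy Star sets the bar where it does.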

In reviewing the PEARL data for cycles 6 through 9, we realized that although only 10 CFLs were tested for each of the 121 models, if we looked at all CFLs in a testing cycle as a single sample, we suddenly had four “cycle samples,” each consisting of hundreds of CFLs—more than enough to reveal statistically strong trends over time (see Table 2).

Table 2 shows a clear increase in the percentage of CFLs that failed prior to 40% of rated life over the four cycles of testing. These results also raise questions about the accuracy of common claims for average rated life. For the purposes of illustration, assume that all the CFLs in cycle 9 were rated at 8,000 hours. If 11.2% failed before 40% of rated life (before 3,200 hours) as shown above, then at least 11.2% would also need to survive beyond 12,800 hours, under laboratory conditions, for the math of average rated life to work out correctly. Until full-life testing can confirm or disprove manufacturers’ claims for average rated life, energy efficiency program managers may want to review their measure-life claims for CFLs.
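The arithmetic behind this illustration can be made explicit. The 8,000-hour rating is the assumed value from the text, and the offset shown is the simplest symmetric case:

```python
rated_life = 8000                       # assumed rating, per the illustration above
early_cutoff = 0.40 * rated_life        # 40% of rated life = 3,200 hours

# For the sample mean to remain at the rating, the hours lost by a lamp
# failing right at the 3,200-hour cutoff must be offset by another lamp
# surviving an equal number of hours beyond the rating.
balancing_point = rated_life + (rated_life - early_cutoff)
print(early_cutoff, balancing_point)    # 3200.0 12800.0
```

Lamps failing even earlier than 3,200 hours would require survivors even further beyond 12,800 hours, which makes the claimed ratings harder still to reconcile with the observed early failures.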

Of equal concern is the impact of failures on customer perceptions of CFLs. The PEARL results suggest that when the average program participant in 2008 purchased an 8,000-hour CFL, there was an 11.2% chance that the CFL would fail before 40% of its rated life. (We did not learn when the tested CFLs actually failed, only that they failed prior to 40% of average rated life. If we assume that these CFLs failed, on average, at 20% of average rated life, then a CFL rated at 8,000 hours failed at 1,600 hours—a bit less than two years at 2.5 hours per day. Of course, this was under controlled conditions. In typical applications, actual life can be shorter.)
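The parenthetical estimate above works out as follows; the 20% failure point is the assumption stated there, not a measured value:

```python
rated_life = 8000         # hours, assumed rating from the text
failure_fraction = 0.20   # assumed average failure point (20% of rated life)
hours_per_day = 2.5       # daily usage assumed in the text

failure_hours = rated_life * failure_fraction   # 1,600 hours
years = failure_hours / hours_per_day / 365     # ~1.75 years
print(failure_hours, round(years, 2))           # 1600.0 1.75
```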

(Image credit: Rensselaer’s Lighting Research Center)
In other words, under typical usage a relatively large number of these CFLs probably failed early enough that program participants recognized them as early failures. The media have begun to target CFL early failure as a consumer issue (Vestel, 2009). Recent research by DOE (Market Profile pdf) suggests that there are many sockets left to fill; but these test results warn of the danger of forming negative impressions about CFLs in consumers who have not tried them yet, creating a barrier to the next phase of market transformation.

The PEARL findings pose a clear question: If early failure is a problem that may be getting worse, and most CFL promotion programs currently focus on decreasing price and increasing availability, are we paying attention to the right market barriers?

Fortunately, the PEARL results also suggest a possible tool that Energy Star could use to reduce early failures. Eighty-seven of the 121 models tested (72%) in PEARL cycles 6 through 9 came from only eight brands. These brands are well known to all energy efficiency programs and are familiar to many consumers. Just as we redefined the samples to include all CFLs tested in a cycle, it is also possible to redefine them once more to include all CFLs of a particular brand. PEARL tested between 5 and 18 CFL models, or between 50 and 180 individual CFLs, for each of the eight most frequently tested brands. A sample of 50 is big enough to give a statistically strong indicator of the tendency for early failure by brand (see Figure 3).


Figure 3. Early failure rates for eight frequently tested CFL brands. (Image credit: Grasteu 2010)
Of the 870 CFLs PEARL tested from the eight frequently tested brands, 48 (5.5%) failed before 40% of average rated life (Figure 3), not too far off from the IES’s projection of 5% at 50% of average rated life. However, there were also large differences in the rates of early failures between the eight frequently tested brands. While the best brand had an early failure rate of only 2%, the worst had an early failure rate of close to 13%. These findings are consistent with research by the National Lighting Product Information Program (NLPIP) that also showed significantly different rates of early failure by brand.
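The pooled early-failure rate cited above is a straightforward proportion of the counts given in the text:

```python
tested = 870          # CFLs from the eight most frequently tested brands
early_failures = 48   # failed before 40% of average rated life

pooled_rate = early_failures / tested
print(f"{pooled_rate:.1%}")  # 5.5%
```

Pooling across brands masks the brand-to-brand spread, however—the same data split by brand ranges from roughly 2% to nearly 13%, as Figure 3 shows.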

Figure 3 tells us that manufacturers can achieve acceptable early failure rates with high-volume, low-cost CFL production. It also tells us that not all brands currently offer this level of quality. Version 4.0 of the Energy Star for CFLs specification (effective July 1, 2009) still considers each CFL individually, model by model, and does not include provisions for acting on this kind of brand-level performance information. However, with Version 4.0, Energy Star has also begun to buy qualified CFLs at retail and put them through a new, independent testing program. In other words, Energy Star has essentially incorporated the function of PEARL.

This new testing program will cover many more CFL models (20% of all qualified models every year) than PEARL was able to, and the results could allow Energy Star to develop new enforcement policies at the brand level. After calculating early failure rates across brands, as we have done here, Energy Star could, for example, prohibit manufacturers with high early failure rates from qualifying new products, or in the worst cases could delist an entire brand. Such enforcement policies would send a clear signal to CFL manufacturers to more closely control manufacturing quality, and help to protect the integrity of the Energy Star brand. In addition, we encourage Energy Star to focus testing on those brands whose models have shown poor compliance.

It will be incumbent upon Energy Star to make the most of this new CFL performance information resource, because energy efficiency program providers will no longer have access to the test data. PEARL does not publish brand names when providing performance information, and the results from the new Energy Star testing program will not be publicly released.

Going Forward

The most recent five years of PEARL testing show that CFLs remain excellent energy efficiency measures. Almost all are very efficient, and most of them perform well. However, lumen maintenance with reflector CFLs still appears to be a problem, and PEARL results also suggest that not all types or brands of Energy Star-qualified CFLs offer equivalent quality.

The high early failure results for some brands of CFL also raise concerns about consumer perception, lifetime savings, and damage to the reputation of Energy Star. PEARL has shown that independent testing programs can provide useful information about which CFLs perform best. However, PEARL does not release brand-specific results, and even less information will be available to energy efficiency programs and the public from the testing conducted as part of Energy Star for CFLs 4.0.

The good news is that Energy Star now has the tools to address these problems. Interested readers should contact DOE about the new Energy Star testing program, participate when possible in Energy Star lighting meetings, and advocate for the development of new enforcement procedures for Energy Star for CFLs through organizations like the Consortium for Energy Efficiency.

FINAL NOTE: It was suggested while we were writing this article that the transition to solid-state lighting (LEDs) and/or to high-efficiency incandescent lamps (when the Energy Independence and Security Act of 2007 becomes effective, starting in 2012) will solve quality problems with energy-efficient lighting. Unfortunately, there is no reason to believe that there will be fewer quality problems with any new, untested lighting technology. Those familiar with CALiPER or other LED test programs know that solid-state lighting, at least in its infancy, seems to have as many quality problems as CFLs.

Chris Granda is a principal consultant at Grasteu Associates. Glenn Reed is a managing consultant at the Vermont Energy Investment Corporation. Granda and Reed are members of the PEARL board.

For more information:

For the IES estimates for linear fluorescent lamp life, see the IES Lighting Handbook.
For more information on recent LED lamp and fixture testing, see DOE’s CALiPER program.

© Home Energy Magazine 2023, all rights reserved. For permission to reprint, please send an e-mail to