
Evaluating DOE's Weatherization Assistance Program

With an unprecedented increase in funding - and expectations - through the American Recovery and Reinvestment Act of 2009, WAP has to figure out what works best and tell Congress.

July 01, 2010
July/August 2010
This article originally appeared in the July/August 2010 issue of Home Energy Magazine.

The Weatherization Assistance Program (WAP) was recently given $5 billion by the federal government. One purpose of an upcoming evaluation is to find out how effectively WAP is spending the money.
 


Students in Springfield, Illinois, take a two-day assessor/final inspector class. (Image credit: Edward Haber)

If you are a weatherization crew chief, technician, or administrator, get ready for a visit. Evaluations of DOE’s Weatherization Assistance Program are moving ahead. The last WAP national evaluation was conducted two decades ago. This time two WAP evaluations are coming up: a retrospective evaluation focused on 2007 and 2008, and a WAP American Recovery and Reinvestment Act (ARRA) period evaluation that will study 2009, 2010, and 2011. Oak Ridge National Laboratory (ORNL) has supervisory responsibilities for both evaluations. (For more on the last evaluation, which took place in 1990, see “Previous Evaluation”.)

An independent team of evaluators was selected by a competitive process to conduct the retrospective evaluation. This team is led by APPRISE, Incorporated, a nonprofit energy research company. Key team members include the Energy Center of Wisconsin, Michael Blasnik & Associates, and Dalhoff Associates, LLC. The plan for the WAP ARRA period evaluation is currently being developed by ORNL.

Previous Evaluation
In 1990, DOE sponsored a comprehensive evaluation of WAP. At that time, the program weatherized about 250,000 dwellings per year through approximately 1,110 local weatherization agencies. The evaluation consisted of five parts: a network study, a resources and population study, a multifamily study, a single-family study, and a fuel oil study. The average savings for all single-family and small multifamily dwellings in 1989 was estimated to be 17.6 million Btu per year, which represents 18.2% of the energy used for space heating and 13.5% of total energy use. It was estimated that over a 20-year lifetime, the program would save the equivalent of 12 million barrels of oil.

Several factors were associated with high energy savings, including the weatherization of high energy users, using an integrated audit of the heating system and envelope, and installing attic and wall insulation. Notable characteristics of higher-saving weatherization programs included limited turnover of weatherization staff, extensive use of blower doors, ability to leverage DOE funding to attract non-DOE sources, and ambitious client education programs.

Recommendations coming out of this evaluation were to enhance workforce training, emphasize replacement of inefficient space-heating systems, increase attention to water-heating measures, and select measures based on savings-investment ratios produced by audits.

Evaluating WAP in 2007 and 2008

The central evaluation question is this: How much energy did WAP save in 2007 and 2008? We’ll use well-known analytical approaches to answer this question. Electricity and natural gas billing histories will be collected pre- and postweatherization for a sample of weatherized homes (the treatment group) and a sample of comparable homes that were not weatherized (the comparison group). The comparison group for the 2007 treatment group will be homes weatherized in 2008. The comparison group for the 2008 treatment group will be homes weatherized in 2009. A national sample of 400 local weatherization agencies will be selected to provide information on about one-third of their weatherized homes. Billing history data will be normalized using three different analytic methods, including PRISM, the Princeton Scorekeeping Method (see “Analytic Methods”).

An important complementary evaluation question is this: How cost-effective are these savings? Theoretically, all installed weatherization measures should be cost-effective, since they all have to meet the savings-investment ratio (SIR) test. Evaluations are conducted to compare theory to reality. We will collect cost information associated with each weatherized home included in the treatment sample. Total energy cost savings over the lifetime of the measures will be divided by total costs to estimate a benefit-cost ratio for WAP.
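To make that arithmetic concrete, here is a minimal sketch of a savings-investment or benefit-cost calculation of the kind described above. The discount rate, measure lifetimes, dollar figures, and helper functions are illustrative assumptions, not values or tools from the evaluation plan.

```python
# Minimal sketch of a savings-investment / benefit-cost ratio calculation.
# Discount rate, lifetimes, and dollar amounts below are illustrative only.

def lifetime_savings(annual_savings_dollars, lifetime_years, discount_rate=0.03):
    """Present value of a constant annual energy cost saving over a measure's life."""
    return sum(
        annual_savings_dollars / (1.0 + discount_rate) ** year
        for year in range(1, lifetime_years + 1)
    )

def benefit_cost_ratio(measures, total_cost_dollars):
    """Discounted lifetime savings across all measures divided by total installed cost."""
    total_benefit = sum(
        lifetime_savings(m["annual_savings"], m["lifetime"]) for m in measures
    )
    return total_benefit / total_cost_dollars

# Example: one weatherized home with two measures (hypothetical numbers).
measures = [
    {"annual_savings": 180.0, "lifetime": 20},  # attic insulation
    {"annual_savings": 120.0, "lifetime": 15},  # heating system work
]
print(round(benefit_cost_ratio(measures, total_cost_dollars=3500.0), 2))  # ~1.17
```

A ratio greater than 1 means the discounted lifetime savings exceed the installed cost, which is the test that installed measures are expected to pass.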

We want to quantify nonenergy benefits as well as energy benefits. Utility providers benefit because weatherization reduces arrearages and service shutoffs. Occupants benefit because weatherization makes homes more comfortable, healthier, safer, and more valuable. When housing is affordable for people and families with low incomes, there are fewer homeless people living on the streets of our cities and towns. Society benefits because weatherization reduces greenhouse gas emissions and other forms of pollution and conserves water. It also increases local employment and local economic activity through the multiplier effect. This project will use primary data and a wide range of secondary data sources to estimate total nonenergy benefits. These will be broken down into three categories, as outlined above: utility benefits, occupant benefits, and societal benefits.

Analytic Methods
Natural gas and electric billing histories will be analyzed using three separate approaches. As a baseline, the popular Princeton Scorekeeping Method (PRISM) will be used. PRISM relies on a linear model of the relationship between weather and energy consumption, estimated for each household (see “Advancing the Art of PRISM Analysis,” HE July/Aug ’95, p. 19). Because of extreme variability in energy consumption data due to changes in household size, undersized heating and/or cooling systems, and periods of zero or below-normal energy consumption because of service shutoffs, among many factors, the PRISM approach can lead to high failure rates for the individual household models. A second approach will therefore be used that aggregates all household data into one model. This is known as the ORNL Aggregate Model (see Appendix P of the evaluation plan, available at http://weatherization.ornl.gov). Lastly, the team of independent evaluators will use a third approach of their own choosing.
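As an illustration of the PRISM-style approach described above, here is a simplified sketch that regresses daily consumption on heating degree-days (HDD) and computes a weather-normalized annual consumption (NAC). Real PRISM also searches for the best-fit balance-point temperature and reports standard errors; the fixed 65°F balance point, the billing figures, and the normal-year degree-days below are made-up assumptions for illustration only.

```python
# Simplified, illustrative PRISM-style fit: daily consumption is modeled as a
# base load plus a heating slope times heating degree-days per day. The billing
# data and normal-year HDD below are hypothetical.

def fit_prism(hdd_per_day, use_per_day):
    """Ordinary least squares for use = alpha + beta * hdd (per-day values)."""
    n = len(hdd_per_day)
    mean_x = sum(hdd_per_day) / n
    mean_y = sum(use_per_day) / n
    beta = sum((x - mean_x) * (y - mean_y) for x, y in zip(hdd_per_day, use_per_day)) \
        / sum((x - mean_x) ** 2 for x in hdd_per_day)
    alpha = mean_y - beta * mean_x
    return alpha, beta  # base load (therms/day), heating slope (therms/HDD)

def normalized_annual_consumption(alpha, beta, normal_annual_hdd):
    """Weather-normalized annual consumption: base load plus heating use in a normal year."""
    return alpha * 365.0 + beta * normal_annual_hdd

# Hypothetical monthly billing periods: average HDD per day and therms per day.
hdd = [28.0, 22.0, 15.0, 6.0, 1.0, 0.0, 0.0, 0.0, 2.0, 9.0, 18.0, 26.0]
use = [6.1, 5.0, 3.8, 1.9, 0.9, 0.7, 0.7, 0.8, 1.1, 2.4, 4.3, 5.7]

alpha, beta = fit_prism(hdd, use)
nac = normalized_annual_consumption(alpha, beta, normal_annual_hdd=6200.0)
print(f"base load {alpha:.2f} therms/day, slope {beta:.3f} therms/HDD, NAC {nac:.0f} therms")
```

In an evaluation of this kind, a home's savings would then be estimated as the difference between pre- and postweatherization NAC, adjusted for the change observed in the comparison group.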

 

To evaluate program processes, all WAP grantees and subgrantees will be surveyed to collect data about their operations, approaches to training and client education, and quality assurance procedures. Data gleaned from these surveys will provide a snapshot of WAP in 2008. Respondents will also be given the opportunity to indicate strengths and weaknesses of the program during that year.

The process evaluation will also include case studies and a field study. These studies are designed to furnish insights on the implementation of WAP.
 

  • The evaluation team will conduct case studies with 6 to 10 high-performing weatherization agencies to learn how the administrative and operational procedures used by these agencies lead to improved weatherization outcomes.
  • The evaluation team will observe a sample of homes being treated by 20 agencies to learn how audit procedures, client education, weatherization staff training, and quality assurance affect weatherization program outcomes for those homes.

 

It is difficult to measure the impact of energy efficiency programs on homes that use bulk fuels, such as propane and fuel oil. This is because residents use different suppliers and sometimes use additional fuels, and because they do not always fill up the tank completely. To meet this challenge, the evaluation team will install submeters to measure energy savings in homes heated with bulk fuels. These special studies will include single-family homes heated with propane, single-family homes heated with fuel oil, mobile homes heated with propane, and large multifamily buildings heated with fuel oil. Results will be compared to results from homes that do not use bulk fuels to determine if there are differences in energy savings.

A sample of homes will also be monitored to determine the impact of weatherization on indoor air quality (IAQ). Depending on the availability of funds, additional studies could focus on refrigerators and air conditioning. Treatment and control groups for each of these studies will consist of about 200 buildings. The only exception will be the study of large multifamily buildings heated with fuel oil, which will include only a treatment group of 24 buildings. Although these studies are being conducted as part of the retrospective evaluation, they will also be used to collect submetering data in 2010.

The retrospective evaluation includes two additional surveys. The occupant survey covers occupants’ knowledge of energy use, the nonenergy benefits that they have received from the program, their health, and their satisfaction with the program. The first three parts will be administered pre- and postweatherization. Approximately 800 occupants will be recruited to take this survey, with a similar number of occupants from comparison homes. The resulting data will be used—among other things—to estimate the nonenergy benefits of the program and for the evaluation of the weatherization process.

The weatherization staff survey is designed to collect data about weatherization as a career and about the usefulness and extent of the weatherization staff’s training. This is a first-of-its-kind survey. Like the special studies discussed above, these two surveys will collect data relevant to the evaluation of WAP in 2009–2011.
 






Illinois’ Weatherization Certification classes include instruction on building fundamentals, building diagnostics, advanced furnace systems, indoor air pollutants, and lead-safe weatherization. (Image credits: Edward Haber)

Evaluating WAP in 2009, 2010, and 2011

WAP is a much different program than it was in the past. There are four key differences between the pre-ARRA program and the post-ARRA program.

First, expending $5 billion and increasing weatherization activity by 300% required recruiting, training, organizing, and fielding a greatly expanded weatherization workforce. To support this expansion, the percentage of WAP funds allowed for training and technical assistance has been raised from 10% to 20%.

Second, all states and U.S. territories have received unprecedented increases in their weatherization funding, and some states and local agencies are grappling with budgets that are several times larger than anything they previously managed. Faced with this massive program expansion, many states and local agencies have chosen to implement innovations in program delivery and management.

Third, in recent years, DOE has made provisions to set aside substantial sums to support innovations in program funding and design. The first of these provisions, for the Sustainable Energy Resources for Consumers grants, sets aside up to 2% of WAP funds to encourage innovative projects by local agencies that further the purposes of weatherization but that are outside the scope of existing program regulations and restrictions. The second sets aside $35 million to encourage the formation of partnerships between traditional and nontraditional weatherization providers to leverage nonfederal resources to further the purposes of the program. This money will be distributed as grants to states and localities, nonprofit organizations, and for-profit firms. Grantees will be required to report estimates of energy savings to DOE. These reports will provide the foundation of ORNL’s evaluation of this aspect of WAP ARRA.

Lastly, to accommodate the expansion of the weatherization program, several major changes were incorporated into the program structure. The household income threshold was increased from 150% to 200% of the federal poverty income guidelines. Also, the average cost ceiling (the average amount of money that grantees can spend per weatherized home) was increased from $2,500 to $6,500. Finally, wages for weatherization workers were subject to Davis-Bacon Act prevailing wage requirements for the first time.

Potential Evaluation Topics

Many aspects of the 2009–2011 evaluation will mirror those of the retrospective evaluation. Energy savings and cost-effectiveness analyses will be conducted in a similar fashion, although billing histories may be collected to assess particular program innovations (such as the weatherization of public housing) and jurisdictions new to weatherization (for example, Puerto Rico). The same can be said for the nonenergy benefits analysis.

The biggest difference between the retrospective and the 2009–2011 evaluation will be in the process evaluation. As described above, funding and associated provisions have had a significant impact on the weatherization community and program operations. Thus, two special process evaluation studies are under consideration, one dealing with Davis-Bacon rules and the other related to changes in the national weatherization community.


Davis-Bacon. As part of the ARRA legislation, Congress stipulated that projects funded with ARRA money must follow Davis-Bacon rules. Davis-Bacon is the common name applied to legislation passed in the 1930s that requires all federal construction projects to pay prevailing wages. The U.S. Department of Labor is responsible for identifying prevailing wages in the construction industry. These wages are identified for a set of construction industry jobs and are estimated for each county in the United States. Weatherization had not been subject to Davis-Bacon in the past. DOE quickly realized that weatherization-related jobs did not fit neatly into existing categories for construction industry jobs. Predictably, this caused much confusion, and funding was delayed as a result. Here are some process evaluation questions pertaining to Davis-Bacon:

  • What were the actual, monetary administrative costs for complying with Davis-Bacon?
  • How did multicounty weatherization agencies deal with county-specific Davis-Bacon wage rates?
  • How has Davis-Bacon affected weatherization costs associated with multifamily buildings of four or more stories?
  • Have changes in weatherization costs associated with Davis-Bacon altered the choice of measures installed in homes?

Overall, how did Davis-Bacon implementation affect the cost-effectiveness of WAP?
 

National weatherization network. The unprecedented influx of federal funds into low-income weatherization has changed the national weatherization network. The weatherization labor force is larger, and funding increases have drawn new stakeholders into the network. The funding increases have had both positive and negative effects on long-standing leveraging relationships and have increased the visibility of low-income weatherization. The following evaluation questions are designed to document and evaluate changes in the national weatherization network.

  • What types of newcomers have joined the network?
  • Are new leveraging relationships forming?
  • Has the public’s perception of low-income weatherization changed? How has it changed?
  • How have relationships between state weatherization offices and local weatherization agencies changed?
  • What approaches did local agencies and/or contractors use to recruit and train new qualified, reliable, and trustworthy weatherization crew members, and how effective were these approaches?
  • Did ARRA change the way that local agencies procured weatherization services under contract (for example, did it change the use of Requests for Proposals, or RFPs, versus bids)?
  • Did the expanded weatherization workforce find work opportunities in the energy efficiency field outside of WAP?

Using the Results

The results of the evaluation will inform policymakers—most importantly, Congress—about the essential benefits of the program. These include estimated energy savings attributable to WAP for 2007–2011, along with associated estimates of cost-effectiveness, and nonenergy benefits.

The information collected will provide insights into energy savings attributable to particular measures and the strengths and weaknesses of computer audits versus priority lists as a way to select appropriate weatherization measures. Successful and less successful approaches to weatherization training, client education, and quality assurance will be identified. Additional benefits or problems due to changes in IAQ after weatherization will also be documented. For the first time, we will have data collected in the field about how weatherization crews operate and how they interact with clients. All this information will be made available to states and local weatherization agencies.

We will have program operation data for two levels of average home investments ($2,500 and $6,500) and two levels of home eligibility (150% and 200% of the poverty level). We will also have some indication of the impact on program benefits attributable to changing the federal grant formula that distributes more funds to hot-climate states. Policymakers will use this information to determine how WAP should be modified to achieve the greatest potential benefits, and how the program should be funded in the future.

Lastly, the results of both evaluations could affect people on the ground. We will identify and share information on the best software and hardware being used by agencies in the field and in the office. The most effective client education programs will be identified and documented. Characteristics of efficient crews will be described and shared with the weatherization community as well.

Bruce Tonn is a staff scientist in the Environmental Sciences Division at ORNL. Contact him at (865)574-4041 or bet@ornl.gov. Jackie Berger is the director of program evaluation at APPRISE. Contact her at (609)252-8009 or jackie-berger@appriseinc.org.

We thank Joel Eisenberg and Marty Schweitzer for their comments on a draft of this paper. This article was sponsored by DOE's Weatherization Assistance Program, through the Oak Ridge National Laboratory.

 

For more information:

The WAP full evaluation plan can be found at http://weatherization.ornl.gov.
Brown, Marilyn A., and Linda G. Berry. Key Findings of the National Weatherization Evaluation, CONF-9406309-1. O
