Abstract
Wind erosion is detrimental to agriculture. Planting shelterbelt trees is a common strategy to protect vulnerable areas. I estimate the impact of shelterbelts on agriculture, instrumenting the endogenous planting locations with the zone designated under the Great Plains Shelterbelt Project—a massive tree-planting operation implemented in 1935–1942. I find a shift from cropland to pasture associated with higher shelterbelt coverage, driven by differential productivity changes in livestock and crop production. The revenue increase from livestock was driven by cattle, while the decline in crop production mainly occurred in western counties owing to obstacles to adopting new irrigation technology.
1. Introduction
Wind erosion degrades the environment and is detrimental to agricultural production. It is a global problem and is known to affect many arid and semiarid areas around the world (Toy, Foster, and Renard 2002). Perhaps the best-known case is the 1930s American Dust Bowl, one of the largest human-made environmental disasters in U.S. history. The Dust Bowl severely damaged more than 30% of the topsoil in the Great Plains and persistently reduced annual agricultural revenue and farmland value in affected areas by 20%–30% from the 1940s to the 1990s (Hakim 1995; Hornbeck 2012).
Planting windbreak trees, or shelterbelts, is a popular strategy applied worldwide to counter the detrimental effects of wind erosion. Prominent ongoing examples include the massive 3-North Shelter Forest Program, which is planned to cover the northern half of China, and Africa's Great Green Wall and the Algerian Green Dam, both of which are intended to fight wind erosion along the borders of the Sahara Desert. Given the vast geographic coverage of these programs, substantial sums are spent on them every year. The 3-North Shelter Forest Program, for example, which will take more than 70 years to complete, cost China roughly $250 million per year in the 1990s during its initial completed stage, at a time when China's per capita GDP was roughly one-tenth that of the rest of the world.1 These programs typically target extremely poor regions that suffer from harsh climates, such as sub-Saharan Africa and inland China. The Great Green Wall, the flagship program in Africa covering 20 countries in the Sahara and Sahel regions, affects the welfare of nearly 700 million people in the world's most impoverished region (Food and Agriculture Organization 2017).
For various reasons, some of which may be related to the ongoing nature of these programs, the overall efficacy and economic effectiveness of these often-controversial programs have been insufficiently studied. Attention seems to have been limited to the initial survival of the trees, although it typically takes decades for trees to achieve some effectiveness.2 The lack of long-term evidence on the impact of forestation programs thus warrants going back in time to study the large-scale and well-documented effort to fight wind erosion in the American Midwest. Moreover, as I argue, the unique setting of this representative program in history, the Great Plains Shelterbelt Project, provides a way to identify the short- and long-term economic consequences of shelterbelts over eight decades. This also helps us understand whether short-term government interventions have long-term benefits or whether they are undermined by factors such as subsequent technical changes and land use adjustment.
In the 1930s, when severe dust storms threatened the American breadbasket in the Great Plains area, President Franklin D. Roosevelt launched the ambitious Great Plains Shelterbelt Project. Between 1935 and 1942, an unprecedented 220 million trees were planted in a massive shelterbelt zone, which was 100 miles wide and stretched 1,150 miles from the Canadian border into northern Texas, as shown in Figure 1 (Droze 1977). The program unfortunately ceased to exist after the United States entered World War II. Perhaps because shelterbelts are no longer at the center of policy debate in the United States, the program’s long-term impact has never been fully understood in spite of the available detailed documentation.3
Academic evidence on the effectiveness of shelterbelts mostly comes from agroforestry, which largely draws on small-scale field experiments. The primary focus of those studies has been to inform us about the technological effects of shelterbelts or their ability to protect agricultural production. We know that shelterbelts reduce wind velocity and wind-related damage, keep moisture in the soil, protect livestock, and tend to improve air quality. Note, however, that there are also some potential negative side effects associated with shelterbelts. They occupy arable land, may create obstacles for irrigation systems, and, if not properly maintained, may have sapping and shading effects that take water and sunlight from nearby crops (NRCS 2011). Hence, however informative the experimental approach is, we still need empirical evidence on the actual economic impact of such large-scale forestation programs and on whether such interventions are worthwhile. The overwhelming majority of evaluations in this context are relatively short term, but the Great Plains Shelterbelt Project allows for a study of the long-term effect of shelterbelts.
Estimating the impact of planting trees on agricultural revenue and land use is not trivial, since the specific location choice of trees is endogenous. In practice, trees survive in natural conditions that are also favorable to other crops; at the same time, it is also possible that trees will be planted where the opportunity cost is lowest. Hence, there are potentially both upward and downward biases when assessing the impact of shelterbelts on agricultural output and the scale of land use adjustments toward more productive activities. To address these endogeneity concerns, I exploit a special geographic feature of the Great Plains Shelterbelt Project. In particular, I digitized the boundaries of the program and superimposed the borders of all counties in the area. As shown in Figure 1, the two key sources of variation in this article are (1) the 100-mile-wide shelterbelt zone (the belt with a dashed boundary) and (2) the actual area with shelterbelt protection under the project (the black area). I use the proportion of each county that overlaps with the shelterbelt zone as the instrumental variable (IV) to predict the actual region where shelterbelts were planted, and I compare those counties to similar neighboring control counties that lie outside the zone. Conditional on county and year fixed effects, the identification assumptions of the IV are equivalent to those of a difference-in-differences (DID) methodology (Duflo 2001; Hudson, Hull, and Liebersohn 2017). That is, the monotonicity and exclusion restrictions of the IV require that the coverage of the shelterbelt zone increase the actual coverage of shelterbelt protection but not affect the relative pretreatment trends for counties within the zone versus their neighboring control counties outside the zone, given predetermined and observed conditions. These conditions are controlled for by covariates on climate and soil characteristics, a set of observed pretreatment features, and year and county fixed effects. I am careful to establish that there were no differential trends between the more and the less (or not) treated counties in the periods before the program took effect.4
A distinct advantage of my study is its ability to estimate long-term effects: the Great Plains Shelterbelt Project took place around eight decades ago, and a long series of data is available. I draw on county-level data concerning agricultural land use and production from the U.S. Census of Agriculture and Population between 1910 and 1992 to evaluate the short- and long-term effects of shelterbelts. Critical for my study and its identification strategy are maps provided by the U.S. Forest Service (USFS 1935) and Droze (1977), which I digitized with GIS tools. Other related information includes the soil erosion level during the 1930s Dust Bowl from Hornbeck (2012), the coverage of the Ogallala Aquifer from Hornbeck and Keskin (2014), and climate data from Willmott and Matsuura (2001).
My first-stage regression shows that the coverage of shelterbelt protection for counties within the 100-mile-wide shelterbelt zone is 17 percentage points higher than that of their neighboring counties outside the zone, which amounts to almost twice the coverage of the control counties.5 Interestingly, and somewhat surprisingly, the second-stage results show that a 10 percentage point increase in actual shelterbelt protection in a county leads to a switch of 1.3%–3.1% of farmland from cropland to pasture, which in turn brings about a 7–13 percentage point increase in revenue from animal products. Meanwhile, the shelterbelts caused some decrease in crop revenue in earlier decades, as they created physical obstacles to adopting the irrigation systems that became available in the 1950s and were another essential component in fighting the Big Dry. For the subsample of the eastern half, where irrigation is less necessary, a positive impact on crop revenue showed up in later decades, which suggests that shelterbelts would also have benefited crop production had the effect not been swamped by the historical coincidence of the subsequently introduced irrigation technology. To demonstrate that these effects are indeed caused by the shelterbelts, I discuss alternative channels, such as other government programs and agricultural inputs. My estimates are robust to examining various subsamples and to including alternative factors that could confound the baseline results.
My work on the Great Plains Shelterbelt Project is, to the best of my knowledge, the first empirical evaluation of the long-term effects of a large-scale forestation program. A series of agronomy papers adopt field experiments to measure the effect of shelterbelts, but the unavoidable limitations of these much smaller-scale studies typically include the lack of external validity and of evidence on long-term effectiveness.6 Another parallel study with some interesting connections is Kaffine (2019), which finds positive externalities of wind farms on crop yields due to “micro-climate” effects. Compared with wind farms, however, trees planted within farmland have additional features, such as sapping and shading effects, as well as the potential to impede future farm reorganization. These complicating factors of trees cause my conclusions to differ from those of Kaffine (2019).
The effects identified in my article are also distinct from those in the literature on payments for ecosystem services (PES) programs for reforestation. The PES literature mostly addresses the take-up of conservation practices, the general equilibrium effect under compensation, or the opportunity cost of implementation.7 Instead, my article directly evaluates the private benefits generated by the ecosystem service itself, not those generated by direct pecuniary payments to landowners as in the PES literature.
My work is also related to the literature on general cost-sharing programs in the United States. Papers of particular interest are Feng, Hennessy, and Miao (2012), which examines land use changes under the Conservation Reserve Program (CRP), and Goodwin and Smith (2003), which discusses the effect of the CRP and other government programs on soil erosion. These studies evaluate programs with a wider range of conservation practices beyond shelterbelt planting, so they do not directly address the effects of shelterbelts.8 Meanwhile, these programs were also implemented with larger geographic coverage or in later decades (the 1980s or 1990s). I distinguish my findings in terms of timing and mechanism from the effects of these related government programs in the subsection titled “Alternative Channel: Other Programs” in Section 7.
In addition, Hornbeck (2012) and Hornbeck and Keskin (2014) contribute relevant covariates used in my article. Both papers study agriculture in the Great Plains area, but they do not address forestation efforts, and they cover different geographic regions. Hornbeck (2012) evaluates the impact of soil erosion caused by the 1930s Dust Bowl and finds persistent detrimental effects on agricultural production, with a large population decline as the major channel for economic adjustment under soil erosion. In contrast, as noted here, the economic adjustment under the Great Plains Shelterbelt Project mainly came through land use switching from cropland to pasture, while there was no significant population decline.9 Hornbeck and Keskin (2014) examine the effects of the availability of groundwater irrigation from the Ogallala Aquifer on agricultural production. I borrow this variation on the aquifer to demonstrate that my findings are robust to groundwater availability. I also show the effects of shelterbelts on irrigation.
2. Background
In response to the most severe drought and wind erosion in the history of the Great Plains area, the Great Plains Shelterbelt Project, later renamed the Prairie States Forestry Project (1937), was initiated by President Franklin D. Roosevelt in July 1934. Its aim was to fight wind erosion in the region and, as the president advocated, to construct “America’s Great Wall,” which was supposed to be a “one hundred mile wide zone of shelterbelts, spread one mile apart, and running continuously from the Canadian border to the Texas Panhandle” to hold back “the dust, drought, and despair of the Dust Bowl” (Orth 2004, 140). After field surveys and experiments, the U.S. Forest Service published a report in 1935 to guide this flagship project (USFS 1935).
The intent-to-treat variation used in this article is based on the region proposed in this report concerning where the shelterbelts would be planted. Originally, the president’s dream was to fill the 100-mile-wide shelterbelt zone with continuous strips of trees to fundamentally change the climate in the Great Plains area and to shelter the whole eastern half of the United States, which turned out to be problematic. However, the final shape of the shelterbelt zone still somewhat reflects the president’s original vision (Droze 1977). Based on the climate and characteristics of the area, it was argued that the shelterbelt zone could not be placed too far to the west, as the seedlings would die due to lack of water. Neither could it go too far to the east, as trees were less necessary there. Hence, as shown in Figure 1, the U.S. Forest Service proposed a 100-mile-wide belt-shaped shelterbelt zone (the belt with the dashed boundary) that stretched 1,150 miles from the Canadian border into northern Texas, totaling 114,700 square miles (USFS 1935). The western limit of the belt was generally within the sufficient-precipitation boundary (the thick line to the left of the belt in Figure 1), accounting for varying evaporation from the north to the south. In addition, it was acknowledged that 56% of the proposed land area had desirable soil conditions for shelterbelt planting while 5% was entirely unfit, so the planting did not actually form continuous parallel strips of trees. It was thus deemed necessary to adapt the planting of trees to specific local conditions (USFS 1935, 11-17).
The shelterbelt planting started in 1935 and ceased in 1942, as funds were cut off after the United States entered World War II (Droze 1977). Because the U.S. Forest Service stated that it would take at least five years for newly planted shelterbelts to grow high enough “to achieve some degree of effectiveness,” the potentially positive effect should not exist at all before 1940, and it would not apply to the whole region until 1947 (USFS 1935, 24). Based on the current guidelines of the Natural Resources Conservation Service, it takes 20 years for shelterbelts to achieve their designed height (NRCS 2011).10 By 1942, a total of 30,233 shelterbelts containing 220 million trees had been planted within the black area in Figure 1, with nearly $20 million of federal and local investment (Droze 1977).11 One can see that most of the shelterbelts were planted within the 100-mile-wide shelterbelt zone. However, as shelterbelts eventually rose in popularity, many local governments and politicians from outside the belt also advocated for the program’s implementation, which led to a proposed expansion of another 100 miles to the east (200 miles wide in total) in 1937, or even a direct removal of the eastern boundary (Wessel 1969).12 These expansion plans never secured any additional funding, however, before the program ceased in 1942. The originally proposed 100-mile-wide shelterbelt zone is thus still the officially documented eligible area (Droze 1977). My estimation also indicates that counties within the 100-mile-wide belt are significantly more likely to be covered by shelterbelts than neighboring counties outside the belt.
Although the U.S. Forest Service initially wanted the federal government to directly acquire ownership of the land, the project instead operated under cooperative agreements with landowners owing to financial and legal difficulties (Zon 1935; Ballantyne 1949). The agreement specified requirements on location, size, and soil conditions for shelterbelts, as well as rules for landowners to preserve the planted area. Participants needed to prepare their land for planting in return for a shelterbelt, fences, and rodent control. Taking into account the opportunity cost of the land sacrificed for tree planting, the project covered roughly half of the entire cost of shelterbelt planting (Droze 1977). In a typical take-up process, the U.S. Forest Service selected the planting areas state by state within the designated zone, established planting quotas, and decided the location within each county. The organization then needed to negotiate with landowners or tenants for strips on which to plant the trees. From the landowners’ perspective, not all applications were approved, because of the planting quota, unfavorable location, natural conditions, and so on (Droze 1977). While contracts were signed with owners of individual farms separately, it was made clear to the public that “the best results are obtained by grouping belts on a number of adjoining farms” (USFS 1935, 16). Hence, one would expect collective decision-making to have been common.
Another concern was the potentially low survival rate of trees in the semiarid Plains area. It is worth emphasizing that the U.S. Forest Service achieved a tree survival rate as high as 73%, with more than half of the trees rated good or excellent, while less than 5% had disappeared by 1954. According to a subsequent report from the General Accounting Office, the average removal rate of these shelterbelts in 16 counties in Kansas, Nebraska, and Oklahoma was around 9% until 1974, and a majority of counties with the highest removal rates were in Oklahoma.13 Owing to data limitations, I rely on time-invariant variations digitized from historical maps, so I expect increasing inaccuracy in these static measures for later decades.14
3. Trade-Offs and Expected Impacts of Shelterbelts
Existing Evidence
In terms of the technical effects of shelterbelts, the Natural Resources Conservation Service lists the purposes for growing shelterbelts as reducing soil erosion and wind-related damage (such as windfall in orchards), increasing carbon storage in biomass and soils, altering the microenvironment to enhance plant growth, protecting properties and livestock, and improving air quality and irrigation efficiency (NRCS 2011). As the most fundamental function, Appendix Figure A.1 illustrates how shelterbelts reduce wind velocity. The protected zone extends 20 times the height of the trees (so a 30-foot-tall belt shelters roughly 600 feet downwind), and the benefits are thus basically localized within each treated county. An example of shelterbelt planting and protected areas is shown in Appendix Figure A.2. In addition, shelterbelts provide refuge for predatory birds and insects that prey on harmful pests, and they potentially help with the carbon balance equation, easing the economic burdens of climate change (Brandle, Hodges, and Zhou 2004).
A series of studies use field experiments to measure the effect of shelterbelts on crop production on a relatively small scale. Brandle, Johnson, and Akeson (1992) evaluate different sizes of shelterbelts over a range of economic conditions and find that crop yield increases by less than 15%. Helmers and Brandle (2005) further explore the optimal spacing for shelterbelts. However, according to Kort (1988), different crops are heterogeneous in terms of responsiveness, and the actual impact depends on the trees’ height and longevity, field width, shelterbelt orientation, and precipitation, among other things.
In the animal sciences literature, field experiments show that providing shade to cattle improves their dry-matter intake by 6% and their average daily gain by 9%, and it helps suppress heat stress (Allen et al. 2013; Barajas, Garces, and Zinn 2013).15 In other studies, shade serves as an effective prevention strategy in the dairy industry by reducing symptoms and signs of heat stress (Schutz, Cox, and Tucker 2014; Van Laer, Moons, et al. 2015; Van Laer, Tuyttens, et al. 2015). In addition, shelterbelts’ effect on plant growth through “altering [the] micro-environment” should also improve the quality of grass on pasture (NRCS 2011, 1). Hence, pastured livestock should benefit from shelterbelts.
It needs to be mentioned, however, that shelterbelts may have adverse effects. First, shelterbelts by necessity must occupy some arable land. In practice, the two major reasons for the destruction of shelterbelts are freeing land for crop production and eliminating obstacles to sprinkler irrigation systems. Other problems may occur if the shelterbelts do not receive proper maintenance. For example, sapping and shading from shelterbelts can take water and sunlight from nearby crops if the trees are not pruned and thinned periodically (Droze 1977; NRCS 2011).
To sum up, shelterbelts can generate a potential positive impact on both crops and livestock, but the harm stemming from the trees affects crops, not livestock. As a result, shelterbelts may have differential impacts on the productivity of crops versus animal products. Meanwhile, existing studies based on field experiments are unavoidably limited by the lack of external validity and evidence on long-term effectiveness, so the economic impact of large-scale forestation programs over the long term remains an empirical question. In addition, Brandle, Hodges, and Zhou (2004) point out the challenge of understanding why producers are reluctant to adopt shelterbelts—an issue on which this article sheds light.
Theoretical Framework
To clarify what the previous subsection means under an economic framework, I employ a simple model for a representative farmer who can allocate land to produce two types of goods, crops and animal products, based on the profit functions πc(θ, Ac) and πa(1 − θ, Aa), respectively.16 The share of land allocated for crop production is represented by θ, so (1 − θ) is the share for animal products, and Ac and Aa measure the productivity to produce crops and animal products, respectively. The farmer’s objective function is to maximize the total profit generated by the two types of goods:

maxθ πc(θ, Ac) + πa(1 − θ, Aa). [1]
Assume the profit functions to be differentiable and concave; then the first-order condition for θ leads to an interior solution, ∂πc(θ*, Ac)/∂θ = ∂πa(1 − θ*, Aa)/∂(1 − θ), where θ* is the initial equilibrium level of θ.17
The representative farmer, knowing the potential benefits and especially the need to cope with future wind erosion, has to decide whether to plant shelterbelts around the farmland. However, in periods of hardship, such as the Great Depression and the Dust Bowl, farmers may face liquidity constraints and are less likely to afford the practice without government intervention. The government hence decides to alleviate the burden by covering part of the cost (i.e., labor, technique, and materials to plant trees) for any farmer who is willing to pay the remaining cost (i.e., the opportunity cost for some marginal portion of farmland), under the condition that the farms are located within a designated zone, which is determined exogenously, given explicit criteria. As a result, farms within the designated zone more likely end up having shelterbelts planted than farms outside the zone.
Once shelterbelts are planted, Ai goes to Âi, for i = c, a, but suppose the adjustment of land allocation (θ) is costly and cannot be made immediately. Consequently, πi changes instantaneously to π̂i = πi(·, Âi), for i = c, a, but θ stays at θ* in the short run. Hence, there may be a deadweight loss in total profit under this occasion because θ* may not be the optimal land allocation after shelterbelts are planted. Eventually, the farmer is able to adjust the land allocation to its new equilibrium level θ̂*, which will lead to a weakly larger total profit due to the efficiency gain from eliminating the deadweight loss incurred while θ stayed at the previous suboptimal level.
I am interested in the land use adjustment, θ̂* − θ*, as well as its effects on πc and πa. The sign of θ̂* − θ* should depend on the relative changes in profit from crops and animal products. As discussed at the beginning of Section 3, with shelterbelts planted, it is likely that the productivity of animal products would increase, while the impact on the productivity of crops is less obvious. To simplify the discussion, I consider the following three possibilities, given the positive impact on livestock, that is, Âa > Aa:
(1) If the impact of shelterbelts on crop production is negative, that is, Âc < Ac, the marginal profit from cropland at θ* is smaller than that from pasture, that is, ∂πc(θ*, Âc)/∂θ < ∂πa(1 − θ*, Âa)/∂(1 − θ). Consequently, the optimal θ̂* should be smaller than θ*, which means eventually the land allocation will switch from cropland to pasture, such that θ̂* < θ*. Then, we have π̂c < πc and π̂a > πa.
(2) If the impact of shelterbelts on crop production is positive but its scale is weakly smaller than the scale of the positive impact on animal product production, we have ∂πc(θ*, Âc)/∂θ ≤ ∂πa(1 − θ*, Âa)/∂(1 − θ). Similar conclusions follow as in possibility 1: the optimal θ̂* is weakly smaller than θ*, so the land allocation weakly shifts from cropland to pasture. Then we have π̂c ≥ πc and π̂a > πa, but the relative size of π̂c − πc and π̂a − πa is theoretically ambiguous.
(3) If the impact of shelterbelts on crop production is positive and strictly larger than the impact on animal product production, ∂πc(θ*, Âc)/∂θ > ∂πa(1 − θ*, Âa)/∂(1 − θ). The optimal θ̂* is thus strictly larger than θ*, so the land allocation switches from pasture to cropland. Consequently, we have π̂c > πc and π̂a ≥ πa, but the relative size of π̂c − πc and π̂a − πa is theoretically ambiguous.
The above are the three possible outcomes based on the model and the literature. This article empirically tests which of them hold in practice. My estimates in the subsections titled “Main Results” and “Mechanisms for the Eastern and Western Halves of the Zone” in Section 6 support the first scenario (for the overall sample and for the western part) and the second scenario (for the eastern part) but not the third.
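To make the comparative statics concrete, consider a stylized quadratic example of my own construction (the empirical analysis does not impose this functional form):

```latex
% Stylized quadratic example (illustration only). Let
%   \pi_c(\theta, A_c) = A_c\,\theta - \tfrac{b}{2}\theta^2,
%   \pi_a(1-\theta, A_a) = A_a(1-\theta) - \tfrac{b}{2}(1-\theta)^2, \qquad b > 0.
% The first-order condition A_c - b\theta^* = A_a - b(1-\theta^*) yields
\[
\theta^* = \frac{1}{2} + \frac{A_c - A_a}{2b},
\qquad
\hat{\theta}^* - \theta^* = \frac{(\hat{A}_c - A_c) - (\hat{A}_a - A_a)}{2b}.
\]
% Land shifts toward pasture (\hat{\theta}^* < \theta^*) exactly when the
% productivity gain for animal products exceeds that for crops, as in
% possibilities 1 and 2; possibility 3 corresponds to the reverse inequality.
```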
4. Data
The main data set that I use is a county-level panel from the U.S. Census of Agriculture and Census of Population (Gutmann 2005; Haines 2005). Most variables of interest were collected decennially from 1910 to 1930 and approximately every five years from 1945 to 1992. The shelterbelts were planted from 1935 to 1942, most trees needed at least five years to “achieve some degree of effectiveness,” and the maximum height/effectiveness would not be reached until several decades later (USFS 1935; Helmers and Brandle 2005; NRCS 2011). Thus, I have three waves of data through 1930, before the treatment occurred, and 10 waves after, although I do not expect to see much positive effect in 1945.18 This data set contains detailed information on agricultural land use and production.
As for the information on the treatment, I extracted the data on the 100-mile-wide shelterbelt zone and the actual region of shelterbelt protection under the Great Plains Shelterbelt Project based on the maps provided by the U.S. Forest Service and Droze (1977). I digitized the maps and calculated the proportions that overlap with each county’s boundaries. In Figure 1, the shelterbelt zone (the belt with the dashed boundary) runs from the Canadian border into northern Texas with occasional bends that shifted it to the east or west due to local geographic conditions. The counties with more than 50% covered by the belt are dark gray, while other counties in my sample are light gray.
Specifically, those in light gray are either counties with less than 50% of their areas covered by the belt or counties that lie outside the belt but are also immediate neighbors of the covered ones.
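As a rough illustration of this GIS step, the county-level overlap proportions can be computed along the following lines (a minimal sketch; the shapefile names and the county_fips identifier are hypothetical, and the actual digitization was done with standard GIS tools):

```python
import geopandas as gpd

# Hypothetical shapefiles digitized from USFS (1935) and Droze (1977)
counties = gpd.read_file("counties_1930.shp").to_crs(epsg=5070)       # CONUS Albers (equal-area)
zone = gpd.read_file("shelterbelt_zone.shp").to_crs(epsg=5070)        # 100-mile-wide zone
protected = gpd.read_file("protected_region.shp").to_crs(epsg=5070)   # actual protected area

def overlap_share(counties, region, name):
    """Fraction of each county's area that falls inside `region`."""
    inter = gpd.overlay(counties, region, how="intersection")
    inter_area = inter.assign(a=inter.geometry.area).groupby("county_fips")["a"].sum()
    share = inter_area / counties.set_index("county_fips").geometry.area
    return share.fillna(0.0).rename(name)  # counties with no overlap get 0

prop_zone = overlap_share(counties, zone, "PropZone")                 # the instrument
prop_shltrblt = overlap_share(counties, protected, "PropShltrBlt")    # the treatment
```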
In addition, I also use the county-level soil erosion data constructed by Hornbeck (2012) according to the Soil Conservation Service, and the information on the Ogallala Aquifer from Hornbeck and Keskin (2014), based on the U.S. Geological Survey. The precipitation and temperature data are from Willmott and Matsuura (2001) at the University of Delaware.19 More details on data sources and construction are provided in Appendix B.
Table 1 shows the summary statistics in the baseline year 1930 for the more and the less (or not) treated counties, where a county is defined as “more treated” if more than 50% of its area is covered by the 100-mile-wide shelterbelt zone (the dark gray counties in Figure 1) and as “less (or not) treated” otherwise. In this comparison, 117 counties in my sample belong to the more treated group, while the other 117 counties in the less (or not) treated group are either counties with less than 50% of their area covered by the zone or neighboring counties to the east and west of the zone.20 As can be seen in Table 1, the two groups are generally similar to each other in most variables, with a few exceptions. There is a mere four percentage point difference in the fraction of rural population, although it is statistically significant at the 95% level. In terms of land allocation for crops, the more treated counties planted slightly more cotton and less barley/oats/rye than the less (or not) treated counties. Despite statistical significance, these differences are not large, leaving the coverage of shelterbelt protection, shown in the first row, as the only striking difference. None of the observations above is sensitive to the arbitrary 50% cutoff: Appendix Table A.2 shows that the qualitative features do not change when I adjust the cutoff between more treated and less (or not) treated counties to 40% coverage, or even to 0%. In my regression analysis, I control for all the pretreatment characteristics listed in Table 1 in order to account for differential initial conditions. The pretreatment trends of outcome variables are shown in Figure 2 and discussed in the subsection titled “Validity of the Instrumental Variable” in Section 5.
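For illustration, a balance comparison of this kind can be produced with simple two-sample tests (a sketch only; the file and variable names are assumptions):

```python
import pandas as pd
from scipy import stats

df = pd.read_csv("county_data_1930.csv")    # hypothetical 1930 cross section
df["more_treated"] = df["PropZone"] > 0.5   # >50% covered by the shelterbelt zone

def balance_row(var):
    a = df.loc[df.more_treated, var].dropna()
    b = df.loc[~df.more_treated, var].dropna()
    t, p = stats.ttest_ind(a, b, equal_var=False)  # Welch's two-sample t-test
    return {"variable": var, "mean_more": a.mean(), "mean_less": b.mean(),
            "t_stat": t, "p_value": p}

covariates = ["PropShltrBlt", "frac_rural", "frac_cotton", "frac_barley_oats_rye"]
balance = pd.DataFrame([balance_row(v) for v in covariates])
print(balance.round(3))
```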
5. Empirical Strategy
Endogeneity Concerns
I am interested in the effect of shelterbelt planting on agricultural production and land use adjustment. It is difficult to empirically assess this effect because the decision concerning where to plant the trees is endogenous. Indeed, trees survive in natural conditions that are also favorable to other crops. Hence, simply comparing the areas with and without trees may lead to an upward bias for outcome variables on revenue or on land use adjustment toward more productive activities. However, if farmers do not want to sacrifice their best land to plant trees, this could lead to a downward bias for the same outcome variables.
I use the geographical neighbors of counties covered by the 100-mile-wide shelterbelt zone designated under the Great Plains Shelterbelt Project to provide more control counties. In the Great Plains area, geographically neighboring counties almost always have fairly similar natural conditions. Moreover, I include county fixed effects to purge any time-invariant differences, so the concerns about the upward and downward biases mentioned above are mitigated. However, there is still another concern that may potentially produce a downward bias. Since this project was initiated in response to the crisis caused by the Dust Bowl, the farmers who suffered more from the Dust Bowl were perhaps more likely to cooperate with shelterbelt planting. Therefore, an ordinary least squares (OLS) estimation using the actual region with shelterbelt protection is likely biased downward for agricultural revenue and for land use adjustment toward more productive activities.21 Consequently, a more cautious alternative empirical strategy is necessary to address this concern.
Validity of the Instrumental Variable
I adopt an IV method to address the above endogeneity concern. The instrument that I use is the proportion of a county’s area within the 100-mile-wide shelterbelt zone designated by the Great Plains Shelterbelt Project. In other words, I instrument the actual take-up of the treatment (the proportion of a county’s area within the actual protected region of shelterbelts) with the eligibility measure (the proportion of a county’s area within the proposed 100-mile-wide shelterbelt zone). Instrumenting actual treatment take-up with a DID variation has been widely used in empirical research, such as Duflo (2001), although it has not received much formal discussion in the literature on econometric theory.22 Hudson, Hull, and Liebersohn (2017) provide a short note on this “instrumented difference-in-differences” and point out that the parallel trends and monotonicity assumptions are consistent with the DID and IV literature.
As for the monotonicity assumption of my IV, a majority of the shelterbelts were planted within the 100-mile-wide shelterbelt zone (see Figure 1), and my estimates show that counties within the zone are on average 17 percentage points higher in the ratio of shelterbelt protection, a finding that is statistically significant at the 99% level (see column 5 of Table 2). Thus, there is little doubt about the high correlation between the proportion covered by the shelterbelt zone (the IV) and the proportion actually protected by shelterbelts (the endogenous variation of interest).
In order to establish that the exclusion restrictions of the IV are met, I first focus on the determinants of selection. When policy makers determined the location of the shelterbelt zone, they took three factors into account: adequate rainfall under local temperature, soil quality, and longitude. The first two factors ensured the trees’ survival, while the third factor, longitude, was important in determining how far west they could go (Droze 1977). These criteria are easily controlled for by annual rainfall and temperature data as well as by county fixed effects. Other than these selection criteria, I also control for a whole set of pretreatment covariates to account for potentially differential pretreatment characteristics and trends for counties within and outside the shelterbelt zone.
Conditional on the covariates mentioned above, the variation left in the IV that I use for identification comes explicitly from the following two features: (1) the shelterbelt zone is always exactly 100 miles wide, and (2) the shelterbelt zone roughly follows a rectangular shape. These artificial features of the zone originated in the political debate of the time and were especially influenced by President Roosevelt’s personal interest in constructing a 100-mile-wide zone of shelterbelts running continuously from the Canadian border to northern Texas (see Section 2). Although this so-called America’s Great Wall was not actually carried out by the U.S. Forest Service owing to technical and practical reasons, the final shape of the zone still somewhat reflected the president’s vision (Orth 2004). As a result, these artificial features of the IV, driven by political decisions, are arguably not correlated with outcome variables on local agricultural production and land use. I further corroborate this below by exhibiting parallel pretreatment trends.
According to Hudson, Hull, and Liebersohn (2017), the exclusion restriction using the IV conditional on county and year fixed effects is equivalent to the parallel-trends assumption in a DID specification, which requires that the treatment of the shelterbelt zone not affect the relative pretreatment trends for counties within the zone versus their neighboring control counties outside the zone, after controlling for all covariates including the officially declared selection criteria. Figure 2 compares the more treated counties (shelterbelt zone coverage over 50%) with the less (or not) treated counties (shelterbelt zone coverage below 50%) on their revenues from crops (panel A) and from animal products (panel B), respectively.23 Across both panels, I progressively introduce more control variables from graph 1 to graph 4 and check the residuals for each outcome variable. The light gray bars on the graphs mark the program’s implementation periods. Graph 1 shows the raw plots of the outcome variables for the two groups. Because the two groups are made up of fairly similar neighboring counties, they generally exhibit similar trends even without any controls. As more and more control variables are added in graphs 3 and 4, it is evident that the two groups followed parallel pretreatment trends between 1910 and 1930.24 Therefore, the exclusion restriction is satisfied conditional on the control variables included in graphs 3 and 4, and the proportion within the shelterbelt zone is a valid instrument under my empirical specification.
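A residualized pretrend comparison of this kind can be approximated as follows (a sketch under assumed variable names; the actual figure uses the full control set described above):

```python
import pandas as pd
import statsmodels.formula.api as smf
import matplotlib.pyplot as plt

panel = pd.read_csv("county_panel.csv")      # hypothetical county-year panel
pre = panel[panel["year"] <= 1930].copy()    # pretreatment waves: 1910, 1920, 1930

# Residualize the outcome on the officially declared selection criteria
pre["resid"] = smf.ols("log_crop_revenue ~ precip + temp", data=pre).fit().resid

# Plot group means of the residuals by census year
(pre.groupby(["year", "more_treated"])["resid"].mean()
    .unstack("more_treated")
    .plot(marker="o", ylabel="Residualized log crop revenue"))
plt.show()
```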
Estimation Procedure
As stated in Section 2, the shelterbelts were planted between 1935 and 1942. However, also recall that it generally takes at least five years for the trees to become somewhat effective (USFS 1935).25 Hence, among the waves when census data were collected, 1945-1992 are the posttreatment years for this program. Following the empirical specification in Hornbeck (2012), I pool all the data on outcome variables from 1930 to 1992 on the left-hand side of my regressions while using 1930 as the baseline year.26 Meanwhile, I include the pretreatment values of the outcome variables in 1910-1930 as control variables interacted with each posttreatment-year dummy on the right-hand side in order to account for pretreatment trends.
Because the proportion within the actual shelterbelt-protected region in county c (PropShltrBltc) and the proportion within the proposed 100-mile-wide shelterbelt zone in county c (PropZonec—the instrumental variable) are time invariant, I interacted both variables with each of the posttreatment-year dummies to estimate the time-varying effects of shelterbelts. Hence, for each year from 1945 to 1992 (i.e., t ≥ 1945), the first stage of my two-stage least squares (2SLS) regression is the following:

PropShltrBltc × 1(year = t) = α[PropZonec × 1(year = t)] + [Controlc × 1(year = t)]′γt + W′ctδ + fc + ft + ect [2]

for all t ≥ 1945, where 1(year = t) is a dummy variable that equals 1 in year t; Controlc is a large set of time-invariant control variables for county c, including the proportions of high- and medium-eroded regions in the 1930s Dust Bowl (with the proportion of low-eroded regions as the baseline category), the proportion above the Ogallala Aquifer, the pretreatment characteristics listed in Table 1, and the pretreatment values of outcome variables in 1910-1930 (i.e., Yc,1910, Yc,1920, and Yc,1930, which correspond to the left-hand-side variable in equation [3]); Wct is a set of time-varying controls, including the total precipitation and average temperature in county c pooling both year t and t − 1; fc is a county fixed effect, ft is a year fixed effect, and ect is an idiosyncratic error term. Because all variables in Controlc are also time invariant, I interact them with each corresponding posttreatment-year dummy, 1(year = t), too. The coefficient of interest is α because it estimates the impact of the instrumental variable, the eligibility measure of the program, on the actual coverage of shelterbelt protection.
In the second stage, I also need to interact all time-invariant variables on the right-hand side with year dummies in order to estimate their time-varying effects on the outcome variables.27 Hence, the empirical model for my second-stage estimation is

Yct = Σt≥1945 βt[PropShltrBltc × 1(year = t)] + Σt≥1945 [Controlc × 1(year = t)]′Γt + W′ctΔ + Fc + Ft + Ect [3]

where Yct is the outcome variable in county c in year t; PropShltrBltc is instrumented by PropZonec, and both are interacted with 1(year = t), for each t ≥ 1945; similar to the first stage, Fc is a county fixed effect, Ft is a year fixed effect, and Ect is an idiosyncratic error term. Hence, the vector of βt’s for all years with t ≥ 1945 includes all coefficients of interest. Note that the subscript t on βt means that the posttreatment effects are estimated year by year. To depict a general tendency, I pool the years around the same period and report each period’s average βt for all the results in the following sections.
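In practice, a specification of this form can be estimated with an off-the-shelf 2SLS routine. The sketch below collapses the year-by-year interactions into a single posttreatment dummy for readability (so it estimates one pooled β rather than the vector of βt’s); all file and variable names are assumptions:

```python
import pandas as pd
from linearmodels.iv import IV2SLS

panel = pd.read_csv("county_panel.csv")              # hypothetical county-year panel
df = panel[panel["year"] >= 1930].copy()             # baseline year plus posttreatment waves
df["post"] = (df["year"] >= 1945).astype(int)
df["treat_post"] = df["PropShltrBlt"] * df["post"]   # endogenous treatment x post
df["zone_post"] = df["PropZone"] * df["post"]        # instrument x post

# County and year fixed effects enter as dummies; time-varying weather controls enter directly.
formula = ("log_revenue ~ 1 + C(county) + C(year) + precip + temp "
           "+ [treat_post ~ zone_post]")
res = IV2SLS.from_formula(formula, data=df).fit(
    cov_type="clustered", clusters=df["county"])     # cluster standard errors by county
print(res.summary)
```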
6. Results
Main Results
The baseline results are shown in Table 2. I use an IV method to tackle the endogeneity problems that potentially exist for the OLS regressions, as discussed in Section 5.28 The first stage of the 2SLS regressions is shown in column 5, which indicates that counties completely covered by the 100-mile-wide shelterbelt zone are 17 percentage points higher in the coverage of shelterbelt protection than those completely uncovered, or nearly twice the coverage of the latter. This coefficient is statistically significant at the 99% level, which gives us more confidence in the monotonicity requirement of the IV method.
Columns 1–4 exhibit the baseline 2SLS results for several key outcome variables on land use and agricultural revenue.29 I control for year and county fixed effects along with a set of time-invariant pretreatment characteristics interacted with the corresponding postperiod dummies, including the soil erosion levels from the 1930s Dust Bowl, the proportion above the Ogallala Aquifer, and the outcome variable and other control variables listed in Table 1 in all pretreatment years.30 Column 1 indicates the effects of shelterbelt protection on the fraction of cropland relative to the total area of cropland and pasture. Note that the treatment variable in the table is scaled to 100%, so a more meaningful reading of the coefficients comes from scaling down to, say, 10%, which is closer to the 17 percentage point coefficient estimated in the first stage. Hence, farmers in treated counties, when facing a 10 percentage point increase in shelterbelt protection, would switch from cropland to pasture, leading to a 1.3–3.2 percentage point decrease in the fraction of cropland in treated counties since the 1950s.
Corresponding to this adjustment in land use, columns 2 and 3 report the effects of shelterbelt coverage on per acre logarithmic revenues from crops and animal products, respectively. A 10 percentage point increase in shelterbelt protection leads to an 8%–12% decrease in crop revenue in the 1950s and 1960s, as shown in column 2. This unintended negative impact, as I will discuss in more detail in the next subsection, is possibly caused by creating the physical obstacles that prevented the adoption of the new irrigation technology that appeared in the 1950s. On the other hand, it is apparent that animal products are the main contributor to the positive effect of the project. Specifically, the effects in column 3 are statistically significant from the 1950s onward, with the size of the coefficients growing over time. The overall per acre revenue in column 4 exhibits approximately a 5% decrease in earlier periods with a 10 percentage point increase in shelterbelt protection, which is obviously driven by the decline in crop revenue. However, this negative impact eventually dies out after the 1970s, and the increase in animal products leads to positive (although not statistically significant) coefficients in later periods.
In order to shed light on the specific factors behind this adjustment in land use and the effects on agricultural revenue, I look further into the effects of shelterbelt planting on several major crops and livestock in the following two subsections.
Major Crop Production
Table 3 shows some key outcome variables on crop production. The effects on productivity (per acre logarithmic values of yields) for wheat, corn, and hay are reported in columns 1–3. One can see that more shelterbelt protection leads to negative and statistically significant effects for wheat and corn, especially in earlier years, as in columns 1 and 2, but much less so for hay in column 3. More importantly, in column 4, there is also significantly less irrigation in the treated area in earlier periods. This is consistent with the issue pointed out in the literature review in Section 3: shelterbelts can create obstacles to adopting subsequently introduced sprinkler irrigation systems. It seems to have taken a few decades for the farmers to adjust for this disadvantage in irrigation technology. Hence, the negative impact on corn and wheat production in earlier years possibly arises because corn and wheat are more dependent on irrigation than hay is.
Overall, if we define the productivity of cropland as total crop revenue over cropland area as in column 5, the periods when treated areas were suffering from a loss exactly correspond to the periods when there was a smaller irrigated area.
This negative impact of shelterbelts on irrigation and crop production was somewhat unintended, given the timing of the planting within typically rectangular fields: in the 1930s, nobody could predict the prevalence of sprinkler irrigation systems in the 1950s, which significantly improved crop productivity but typically require larger, round-shaped fields. Historically, this unlucky coincidence may have also contributed to people’s misunderstanding of the true effect of shelterbelts and further caused the drop in shelterbelt popularity in the United States. Meanwhile, one may wonder what the shelterbelts’ impact on crops could have been without such a historical coincidence and whether this negative impact is the only factor driving the land use adjustment from cropland to pasture. I provide suggestive evidence on these questions in a later subsection (“Mechanisms for the Eastern and Western Halves of the Zone”) by comparing the eastern half of my sample, where irrigation is less common, with the western half.
Major Livestock Production
In Table 4, I use the logarithmic numbers of cattle, pigs, and chickens in each county (normalized by the county’s farmland area) as the outcome variables in columns 1–3. Given a 10 percentage point increase in shelterbelt protection, the number of cattle in the treated counties in column 1 is 6–10 percentage points higher than in the control counties from the 1950s onward. On the other hand, the numbers of pigs and chickens do not exhibit any robust and statistically significant increase in columns 2 and 3. Recall that column 1 of Table 2 shows a higher fraction of pasture in the treated area. Hence, the increase in cattle probably arises because cattle are more likely to be pastured than pigs and chickens. To further confirm this conclusion, I show the regression result for the logarithmic value of expenditure on livestock feed in column 4 of Table 4; there is no statistically significant increase in feed, which suggests that the additional cattle were indeed mainly raised on pasture, not in feedlots. Overall, if we define the productivity of pasture as total revenue from animal products over pastureland area, as in column 5, we start to see some positive productivity gain beginning in the 1950s. The size continues to grow over time, although the coefficients eventually become less statistically significant in later periods.
To sum up, pastured cattle are likely to be the main contributor to the positive effect of shelterbelt planting on the revenue from animal products. Consistent with columns 1 and 3 of Table 2, where cropland has been replaced by pasture and the per acre revenue from animal products has been increasing since the 1950s, the number of cattle also started to be significantly higher in treated counties from the 1950s onward, as shown in column 1 of Table 4. Therefore, after the farmers started to adjust their agricultural production by replacing cropland with pasture and raising more cattle in order to make the best of an environment with more trees, the productivity loss in crop production was eventually alleviated and further compensated for by the productivity gain from livestock.
Mechanisms for the Eastern and Western Halves of the Zone
One may be concerned that some geographic features to the east or to the west of the shelterbelt zone could have caused my results, so that my findings would be solely driven by one side of the shelterbelt zone. In particular, considering that the prevailing wind in the Great Plains area generally brings dry air masses from the west to the east, the counties to the west are more likely to suffer from drought (Leathers 2011). Moreover, there could be heterogeneity in the mechanisms for eastern and western counties owing to precipitation and the relative importance of irrigation.
To address the potential geographic heterogeneity in my results, I divide my sample into the eastern half and the western half by longitude and show the same baseline regression results for four major outcome variables in Table 5. Columns 1–4 compare the treated counties in the eastern half of the belt with control counties to the east. The first-stage effect is larger, at 26 percentage points, than in column 5 of Table 2 and is statistically significant at the 95% level. Meanwhile, the adjustment of land use from cropland to pasture in column 1 is statistically significant from the 1950s, which is consistent with column 1 of Table 2, although it is slightly smaller in size at 0.9–1.5 percentage points. In column 3 of Table 5, the revenue from animal products also consistently exhibits positive and statistically significant effects beginning in the 1950s. In column 2 of Table 5, however, crop revenue in the eastern half of the counties was not negatively affected in earlier periods and even exhibits statistically significant positive effects in more recent periods, whereas column 2 of Table 2 shows negative and statistically significant effects in earlier periods. To further understand this discrepancy, one can see that irrigated acreage in column 4 of Table 5 exhibits muted effects, and the coefficients for earlier decades are even positive, unlike the negative and statistically significant effects in the corresponding column 4 of Table 3. These results suggest that crop production in the eastern half of the sample was not much hurt by the shelterbelts, perhaps because irrigation was not as crucial there as it was in counties in the west.
On the other hand, columns 5–8 compare the treated counties in the western half with the control counties to the west. The first-stage coefficient shows a slightly smaller difference in shelterbelt coverage, at 14 percentage points, but the coefficients in columns 5 and 7 are larger than those for the eastern half in columns 1 and 3. This is consistent with the agronomy literature: Kort (1988) finds that percentage-yield increases due to shelterbelts are higher in drier regions. In column 5, the land use adjustment is larger, at 2–3 percentage points for a 10 percentage point increase in shelterbelt protection, but it is mostly statistically insignificant. Meanwhile, the production of animal products in column 7 benefits at a larger scale, with increases of up to more than 30%, although the effects also appear less statistically significant in later periods. These results are all qualitatively consistent with the corresponding columns in Table 2. More interestingly, column 6 of Table 5 exhibits drops of more than 20% in earlier periods for a 10 percentage point increase in shelterbelt coverage, unlike column 2 for the eastern half. In addition, the western half of the counties also suffered from less irrigation, as shown in column 8 of Table 5. These results further demonstrate that shelterbelts hurt crop revenue mainly by creating physical obstacles to adopting irrigation systems, and that this mostly applies to the western counties in my sample, where rainfall has been relatively insufficient and the adoption of sprinkler irrigation is more important.
To sum up, columns 1, 3, 5, and 7 of Table 5 demonstrate that my main results on land adjustment and the revenue from animal products are not solely driven by counties on one side of the shelterbelt zone. In particular, the results from the eastern half of the sample help me rule out the possibility that the prevailing westerly wind drives the positive impact on animal products: if my results were actually driven by the prevailing wind from the west, the eastern half of treated counties within the belt, which are located farther west than the control counties to the east of the belt, should have suffered more from the dry air masses from the west. However, the results in columns 1 and 3 suggest that this is unlikely to be what happened.
On the other hand, the heterogeneous results between the eastern and western halves of the counties in columns 2, 4, 6, and 8 suggest a more complicated combination of mechanisms: the land use adjustment toward pasture, driven by shelterbelts’ negative impact on crop revenue and irrigation, is more applicable to western counties where irrigation has played a more crucial role, whereas in eastern counties, the same adjustment should be largely driven by the shelterbelts’ positive impact on pasture and livestock.
7. Robustness Checks
Alternative Channel: Other Natural Conditions
Perhaps one of the most fundamental concerns is that the shelterbelt zone is correlated with certain natural environmental features, so the effects presented in Section 6 are actually not driven by shelterbelts. Although controlling for the county fixed effects should already help address any time-invariant confounding factors, I also choose a comprehensive measure called the Agro-Ecological Suitability Value to check the balance of the treatment in terms of suitability distributions and the aggregate potential productivity of crops.
This Agro-Ecological Suitability Value takes into account soil quality, slope, and climatic conditions, as well as crop fallow-period requirements, environment fallow-period requirements, and management-specific fallow-period requirements, and is constructed by the Food and Agriculture Organization of the United Nations and the International Institute for Applied Systems Analysis.31 Specifically, I extract the data at the county level and test the suitability value for two crop categories, cereals and alfalfa, representing the potential productivity of cropland and pasture, respectively. In panel A of Appendix Table A.4, I calculate the county-level average weighted by farmland area and compare the more treated counties (shelterbelt zone coverage > 50%) and the less (or not) treated counties (shelterbelt zone coverage < 50%).32 One can see that the weighted average values for cereals and alfalfa are quite similar and not statistically different between the two groups. This provides more confidence in the exogeneity of the shelterbelt zone.
The availability of water can be another potential confounding factor. On the one hand, planting shelterbelts can be more feasible in areas where more water is available, which may lead to an overestimation of the actual effect of shelterbelts elsewhere. On the other hand, irrigation directly ensures sufficient water in the soil, so that the shelterbelts’ benefit of containing moisture in the soil can be less necessary.
The actual irrigated area is already explicitly estimated in Tables 3 and 5 and plays a key role in the story discussed in Section 6. Meanwhile, note that I have already controlled for an arguably exogenous proxy for water availability, the proportion of a county above the Ogallala Aquifer, the most important water source for irrigation in the Great Plains area. Hornbeck and Keskin (2014) estimate that irrigated farmland area has been significantly higher in the counties above the aquifer since the 1950s. The geographic coverage of the aquifer from the U.S. Geological Survey is shown in Appendix Figure A.3. Table 1 shows that the coverage of the aquifer is not statistically different between the treated and control counties. Moreover, the main results in Tables 2–5 are all robust to controlling for the county-level proportion above the aquifer. Therefore, the availability of water is not likely to be the channel driving my main results.
Alternative Channel: Other Programs
Although the Great Plains Shelterbelt Project was planned as an independent program, the initiative failed to convince Congress to fund it. Hence, the program was actually financed as a public work under President Roosevelt’s New Deal in the 1930s (Droze 1977). In Appendix Table A.5, I replicate the main results of Table 2 after additionally controlling for per capita New Deal payments in five categories at the county level in 1933-1939.33 The replicated results are fairly close to the baseline results in Table 2. Therefore, general New Deal payments are not likely to be the main driver of my conclusions.
After the shelterbelt project ended, other conservation programs were also implemented by the Soil Conservation Service (currently the Natural Resources Conservation Service) and other agencies under the U.S. Department of Agriculture, such as the Soil Bank Program (1956-1973), the Agricultural Conservation Program (1936-1996), the Conservation Reserve Program (1985-present), and the Environmental Quality Incentives Program (1996-present). These programs promoted various conservation practices, including reducing cropland acreage and preventing soil erosion, which could potentially confound my findings.34 Although I do not have all the specific information needed to test each of these policy factors, I gathered reported actual conservation practices from the U.S. Census of Agriculture for the counties in my sample.
Panel B of Appendix Table A.4 compares the more treated counties (shelterbelt zone coverage > 50%) with the less (or not) treated counties (shelterbelt zone coverage < 50%) on these reported measures for all available years. The first-listed practice, the share of farmland in strip cropping, is the only measure that is consistently different in all reported years (i.e., 1959, 1964, and 1969). The more treated counties used strip cropping on 0.6%-0.9% less of their farmland than the less treated counties, probably because shelterbelts already provided protection against soil erosion. This can lead to a small underestimation of the true effect of shelterbelts, but this specific practice, or any policy promoting it, is not likely to be the main driver of my findings. For the other reported conservation practices, including the shares of cropland in cover crops or in summer fallow and the shares under a few more recent conservation programs, I do not see any systematic differences between the more and the less (or not) treated counties in any year.35
In addition, considering the mechanism and geographic coverage of the shelterbelt project, an important program that requires special attention is the Great Plains Conservation Program (since replaced by the Environmental Quality Incentives Program). Under the Great Plains Conservation Program (GPCP), the Soil Conservation Service provided cost sharing and technical assistance for various conservation practices, including reseeding grassland, plant cover, erosion-control dams, windbreaks, and strip cropping (U.S. Department of Agriculture Soil Conservation Service 1982). The program was approved in 1956, its first contract was signed in 1957, and 519 counties had been designated by 1982, as shown in Appendix Figure A.4 (Helms 1981; U.S. Department of Agriculture Soil Conservation Service 1982). Hence, my main conclusion could be threatened if the increases in pastureland and in the number of cattle since the 1950s shown in Tables 2 and 4 were driven by this program instead. In terms of reported actual practices promoted by the GPCP, I have already shown in panel B of Appendix Table A.4 that the treated and control counties were not consistently statistically different except for strip cropping. To further alleviate the concern about an intent-to-treat effect of the GPCP, I test it formally using spatial and temporal variation in the designation status of the program.
As Appendix Figure A.4 shows, the GPCP designation covers a majority of counties in my sample. For counties with over 50% of their area covered by the shelterbelt zone, the proportion with the GPCP designation is 91.45%; for those with less than 50% covered, the proportion is 79.49%. This difference between the two groups is statistically significant. To alleviate the concern that this difference is driving my main results, I test the robustness of all my main outcome variables while controlling for the GPCP designation status interacted with dummies for all years after 1957 in Appendix Table A.6. Note that years are pooled differently from previous tables: 1954 is separated from the other years in order to distinguish the effects before and after the implementation of the GPCP. As one can see from columns 3 and 4, revenue from animal products increased and the share of cropland dropped in 1954, before the GPCP was even initiated. Meanwhile, even after the GPCP was initiated, the effects from 1959 to 1992 in the two columns remain robust after controlling for the designation of the GPCP. Hence, across all the columns in Appendix Table A.6, the qualitative results are consistent with those in Table 2. The GPCP may be a factor enhancing the effects of shelterbelts, but the main driver is still the shelterbelt mechanism explained in Section 6.
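A minimal sketch of this check follows, under the same hypothetical panel structure as the previous sketch: the GPCP designation dummy is interacted with each census year after 1957, in the spirit of Appendix Table A.6, and the instrumental-variable step is again omitted. Column names (animal_revenue, belt_coverage, gpcp_designated) are illustrative assumptions.

```python
# Sketch of the GPCP robustness check: interact the (time-invariant) GPCP
# designation dummy with every census year after 1957 (hypothetical columns).
import pandas as pd
import statsmodels.formula.api as smf

panel = pd.read_csv("county_panel.csv")  # hypothetical county-by-year panel

# The designation level is absorbed by county fixed effects, so only its
# interactions with post-1957 years enter; these absorb any differential
# trend the GPCP rollout could induce among designated counties.
post_years = sorted(y for y in panel["year"].unique() if y > 1957)
for y in post_years:
    panel[f"gpcp_x_{y}"] = panel["gpcp_designated"] * (panel["year"] == y)

controls = " + ".join(f"gpcp_x_{y}" for y in post_years)
model = smf.ols(
    f"animal_revenue ~ belt_coverage + {controls} + C(county) + C(year)",
    data=panel,
)
res = model.fit(cov_type="cluster", cov_kwds={"groups": panel["county"]})
print(res.params.filter(like="belt_coverage"))
```

If the coefficient on the treatment variable is stable with and without the interaction controls, the GPCP rollout is unlikely to be driving the estimated shelterbelt effects.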
Alternative Channel: Other Agricultural Inputs
Appendix Table A.7 shows some additional regressions on labor and capital inputs, as well as farmland and woodland areas.36 Column 1 shows that shelterbelts do not cause significantly different growth in rural population, so the baseline results in Table 2 are not likely to be driven by any change in rural labor. In column 2, however, the value of farming equipment per acre is significantly lower in the treated counties, especially in earlier years. This is potentially driven by the lack of irrigation systems in treated areas, which is consistent with the results in Section 6. Moreover, the muted results in columns 3 and 4 indicate that neither total farmland area nor woodland area is driving my findings.37
8. Conclusion
My article estimates the short- and long-term effects of shelterbelt planting on agricultural land use and production under the Great Plains Shelterbelt Project, a massive forestation effort following the Dust Bowl in the United States. To address the endogeneity in the location choice of tree planting, I use the 100-mile-wide belt-shaped shelterbelt zone designated by the program as an instrumental variable. Counties within the belt are nearly twice as likely to be covered by shelterbelts as their neighboring counties outside the belt, a difference that is statistically significant at the 99% confidence level. Meanwhile, this instrumental variable is also arguably exogenous, conditional on covariates. My estimates show that a 10 percentage point increase in shelterbelt protection causes 1.3%-3.2% of farmland to be switched from cropland to pasture. This leads to a 7-13 percentage point increase in revenue from animal products in later decades, which is attributable to pasturing cattle. On the other hand, I also find that a decline in crop production mainly occurred in earlier decades and was caused by shelterbelts creating obstacles to subsequently introduced irrigation systems, especially in the western half of the sample, where precipitation is relatively scarce. Moreover, these heterogeneous effects between the eastern and western halves of the sample imply that the land adjustment in the eastern counties is more likely to be driven by the benefits accrued to livestock. Hence, the breadbasket of the United States was protected by shelterbelts through raising livestock rather than through growing crops. To demonstrate that this effect is indeed caused by shelterbelts, I have examined alternative channels, including other natural conditions, government programs, and other agricultural inputs, and have ruled out all of them as major causes.
Studying the Great Plains Shelterbelt Project is meaningful not only for understanding the effect of shelterbelts on the Great Plains but also for sustaining currently ongoing large-scale projects in other arid or semiarid regions of the world, such as the 3-North Shelter Forest Program in China and the Great Green Wall in the Sahara Desert. An important historical lesson drawn from the interplay between shelterbelts and irrigation is that comprehensive land planning that takes into account the adoption of new technologies can be crucial. Another noteworthy issue is that the benefit from shelterbelts only started to appear after farmers in the treated areas adjusted their land and production allocation. This means that policy makers should provide more information and support to help farmers adapt to an agricultural environment with greater shelterbelt protection.
Acknowledgments
I wish to thank my advisors, Jay Shimshack, Sheetal Sekhri, Peter Debaere, and Leora Friedberg, for their guidance throughout this project. I appreciate valuable comments from the anonymous referees, Land Economics Editor Daniel J. Phaneuf, Timothy Beatty, Kelsey Jack, Wallace Huffman, David Hennessy, Kathy Baylis, Ken Leonard, Jessica Goldberg, Molly Lipscomb, Stephen Smith, Emi Uchida, Scott Rozelle, Sarah Turner, John Pepper, William Johnson, Heidi Schramm, Peter Orazem, David Keiser, Yue Hua, Junfu Zhang, and other participants in the 2015 North East Universities Development Consortium, the Association for Public Policy and Management 2015 Fall Research Conference, the third annual Washington Area Development Economics Symposium, the Applied Micro Workshop and the first Economics Research Colloquium at the University of Virginia, the 2018 Institute of Urban Development Symposium on Regional and Urban Economics at Nanjing Audit University, and seminars at REAP of Stanford University and IMPAQ International. Any errors are my own.
Footnotes
1 “3-North” refers to the Northeast, Northwest, and North China regions in mainland China. See the data from the World Bank: https://data.worldbank.org/indicator/NY.GDP.PCAP.KD?locations=CN. In 2019, China’s per capita GDP was about 75% of the world’s average.
2 Based on the current NRCS (2011) guidelines, it takes 20 years for shelterbelts to achieve their designed height, but Helmers and Brandle (2005) assume that windbreak trees reach maturity only after 40 years.
3 Alternatively, the lack of interest in shelterbelts in the United States (compared with China and African countries) probably arises because the effect of the program has been misunderstood owing to a historical coincidence caused by technological advancement in agriculture. More details are discussed in Section 6.
4 Detailed statistics and a discussion are provided in Section 5.
5 This means that the proposed 100-mile-wide shelterbelt zone was not strictly enforced; more details are provided in Section 2.
6 See Kort (1988); Brandle, Johnson, and Akeson (1992); Helmers and Brandle (2005); details are discussed in Section 3.
7 See Uchida, Xu, and Rozelle (2005); Xu et al. (2006); Uchida, Rozelle, and Xu (2009); Jack (2013); Alix-Garcia, Sims, and Yanez-Pagans (2015).
8 Other studies focusing on conservation practices under the CRP include Babcock et al. (1996); Wu and Lin (2010); Jacobs, Thurman, and Marra (2014); Miao et al. (2016). Studies on another similar program, the Environmental Quality Incentives Program (EQIP), include Obubuafo et al. (2008) and Nyaupane, Gillespie, and Paudel (2012).
9 For detailed estimates, see Table 2 and Appendix Table A.7.
10 Helmers and Brandle (2005) assume that windbreak trees will not reach maturity until 40 years later.
11 Measured in nominal dollars of the time, which would be about 18 times higher if measured in 2019 dollars. Perry (1942) recorded that the gross federal expenditure was $13,882,419 and estimated other local and individual donations and cooperation costs at about $5 million.
12 In addition, some shelterbelts were planted in the Sand Hills region of the Nebraska panhandle.
13 For detailed statistics, see Appendix Table A.1.
14 To the best of my knowledge, no more recent documentation exists on the actual shelterbelt coverage. However, the inaccuracy could be reasonably low (10% or less) before 1974 based on the General Accounting Office documentation. The main treatment effects of the program, as shown in Section 6, either persisted or appeared earlier than 1974, so my findings are not likely to be significantly affected by the potential removal of shelterbelts after 1974.
15 Heat stress is a problem that can reduce a cow’s milk production and increase its risk of lameness.
16 This setup borrows from the theoretical framework of Hornbeck (2012).
17 Possible corner solutions are that the farmer produces only crops or only animal products. With a technological shock favoring crop production, some farmers initially producing only animal products may be induced toward an interior solution of producing both goods; farmers initially producing only crops will continue to stick to their corner solution. A similar conclusion follows with a technological shock favoring livestock. Discussing these corner solutions will not change the qualitative outcomes of the model, so I focus only on the interior solution to simplify the discussion.
18 The years included are 1910, 1920, 1930, 1945, 1950, 1954, 1959, 1964, 1969, 1978, 1982, 1987, and 1992. There was another wave taken in 1974, but some main outcome variables are not available in that wave.
19 Data available at https://www.esrl.noaa.gov/psd/.
20 The numbers of counties in the two groups are, by coincidence, the same. The median of coverage in my sample is 49.79%. Among the 234 counties included, 30% are completely covered by the 100-mile-wide shelterbelt zone, 29% are completely outside the zone, and the remaining 41% are partially covered, with coverage strictly between 0 and 1.
21 As shown by comparing the results in Table 2 and Appendix Table A.3.
22 In my case, this is an extension from the discrete treatment case to settings with continuous treatments, which remains valid under an IV approach (Angrist, Graddy, and Imbens 2000).
23 Another main outcome variable, the fraction of cropland and pasture, has unfortunately only been consistently reported since 1930, so I cannot show a similar graph on its pretrend. However, conditional on a whole set of covariates on pretreatment features, land use changes in pretreatment years could be highly correlated with corresponding revenue changes.
24 Endogeneity concerns typically arise when controlling for lagged outcome variables because of autocorrelation. However, this risk is arguably low after clustering the standard errors at the county level and controlling for county fixed effects, as well as for a whole set of other county-level covariates. This argument is detailed in Bertrand, Duflo, and Mullainathan (2002).
25 As mentioned, it takes 20 years for shelterbelts to achieve their designed height (NRCS 2011), and Helmers and Brandle (2005) assume that maturity will not be reached until 40 years later.
26 The earliest year available for the outcome variable on irrigated area is 1935 from ICPSR 4254 Great Plains Population and Environmental Data: Agricultural Data. Hence, 1935 is my best option as the baseline year for regressions on irrigated areas, although the proposal for the Great Plains Shelterbelt Project was announced in the same year. Also, the outcome variable on expenditure on livestock feed is only available for 1910 and 1920, so I use 1920 as the baseline year for this variable.
27 Specifically, the outcome variables of interest include land allocation between cropland and pasture, per acre revenues from crops and livestock and the total of both, per acre yields of several major crops, irrigated area, cropland productivity, densities of major livestock, expenditure on livestock feed, pasture productivity, population density, value of farming equipment, farmland fraction, and woodland fraction in each county. An explanation of these variables is given in Appendix B.
28 The OLS results are shown in Appendix Table A.3. The coefficients in this table are all potentially biased downward in scale when compared with Table 2. The most likely explanation is that farmers in the counties that suffered more during the Dust Bowl were more likely to cooperate in shelterbelt planting, as discussed in Section 5.
29 Cropland includes all the land used for growing crops (harvested or failed), as well as fallowed or idle cropland, but not cropland used solely for pasture. Pasture includes regular pastureland as well as cropland and woodland used for pasture. The division of cropland and pasture was not reported in 1969, so I linearly interpolated pastureland area for 1969.
30 Controlling for or dropping the soil erosion levels does not significantly change the results in Table 2 (not shown), demonstrating that my results are not driven by the recovery from the Dust Bowl.
31 See http://gaez.fao.org/Main.html#.
32 I set the measurement to “rainfed” and “low input” so that there is enough variation across counties.
33 New Deal payments are divided into five categories: payments for the Agricultural Adjustment Act, public works spending, relief spending, New Deal loans, and mortgage loans guaranteed. The per capita amount in each category is interacted with each posttreatment-year dummy to account for its long-term effect. The data are from Hornbeck (2012).
34 Specifically, the Soil Bank Program initiated by the Agricultural Act of 1956 was designed to reduce production of basic crops, maintain farm income, and conserve soil. It included two components: the Acreage Reserve Program, implemented in 1956-1958 for the immediate reduction of basic crops, and the Conservation Reserve Program, for an enduring reduction in cropland acreage. The signing of new contracts for the latter component ceased in 1960, although payments continued until 1973. A contemporary version of the program, also called the Conservation Reserve Program, started in 1985 (Helms 1985). In a separate line of policies, the Agricultural Conservation Program administered by the Farm Service Agency offered cost sharing and technical support to farmers who adopted approved land conservation practices (such as practices to increase the efficiency of fertilizer and pesticide use). This program was replaced by EQIP under the terms of the 1996 Farm Bill (National Center for Environmental Economics 2015).
35 In addition, I have checked that controlling for interpolated/extrapolated measures of these practices does not affect my main results (estimates available upon request).
36 Population data are only available decennially. Data on the value of equipment are missing in the 1950s, 1964, and the 1980s.
37 Although woodland area can be correlated with the shelterbelt-protected area (the black area in Figure 1), it is also important to acknowledge the differences between the two. The actual shelterbelt-protected area indeed includes woodland, but this treatment measure also includes the farmland that is surrounded by shelterbelts (but not actually covered by trees). Moreover, the woodland area includes other forests that are not shelterbelts.
This open access article is distributed under the terms of the CC-BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0) and is freely available online at http://le.uwpress.org.