After more than a century of fighting to keep fire out of forests, reintroducing it is now an important management goal. Yet changes over the past century have left prescribed burning with a big job to do. Development, wildfire suppression, rising global temperatures, extended droughts, exotic species invasions, and longer fire seasons all add complexity to using this practice.
Managers must consider how often, how intensely, and at what time of year to burn; for insights, they often look to how and when fires burned historically. However, attempting to mimic historical wildfires that burned in hot, dry conditions is risky. Burning in fall or spring, when conditions are cooler and moister, reduces the risk of prescribed fires becoming uncontrollable, but does it have the intended effects? How do forest ecosystems that were historically adapted to fire respond when fire is reintroduced after so much time without it?
Forest Service researchers Becky Kerns and Michelle Day conducted a long-term experiment in the Malheur National Forest, Oregon, to assess how season and time between prescribed burns affect understory plant communities in ponderosa pine forests. They found that some native plants persisted and recovered from fire but didn’t respond vigorously, while invasive species tended to spread. These findings may help forest managers design more effective prescribed-fire treatments and avoid unintended consequences.
Predicting how climate change will influence forest diseases can inform forest management practices that minimize the adverse impacts of those diseases. Precise locations of accurately identified pathogens and hosts must be documented and spatially referenced to determine which climatic factors influence species distribution. With this information, bioclimatic models can predict the occurrence and distribution of suitable climate space for host and pathogen species under projected climate scenarios. Predictive capacity is extremely limited for forest pathogens because distribution data are usually lacking. Using Armillaria root disease as an example, we present predictive approaches that use available data.
Influences of environment (indicated by plant associations) and forest management practices on the distribution of Armillaria spp. and genets (vegetative clones) were investigated. A total of 142 Armillaria isolates were collected from various host trees on pristine and managed sites (thinned and/or fertilized) in relatively wet and dry environments in eastern Washington, U.S.A. The incidence of Armillaria spp. was significantly higher in the relatively wetter sites than in the relatively drier sites, as indicated by plant associations. However, no differences in Armillaria occurrence were found among forest management practices (control vs. thinned vs. thinned and fertilized) within either the wetter or the drier sites. Incidence of Armillaria differed significantly among conifer and shrub species, with the highest incidence on grand fir (Abies grandis). Based on pairing tests and rDNA sequencing, the 142 isolates comprised 20 genets representing three Armillaria species. Armillaria spp. were more diverse in the undisturbed control plots than in plots disturbed by forest management practices, in both the relatively wetter and the relatively drier sites. The results from this study provide baseline information toward understanding how environment and forest management practices influence the incidence and diversity of Armillaria species and genets.
As part of a larger effort to assess the distribution and ecology of Armillaria species throughout western North America, we present preliminary survey results for the East Cascades of Oregon. Surveys and sampling were conducted on 260 0.04-ha plots randomly located across diverse environments and geographic locations. Using DNA-based techniques to identify Armillaria spp., we found that the ca. 450 Armillaria samples comprised three genetically distinct species groups. The association of Armillaria species groups with habitat is summarized based on detailed vegetation data. Understanding the habitats in which Armillaria species can potentially occur allows forest management prescriptions to be developed that improve forest health at the stand level by reducing the impacts of Armillaria root disease. These data also provide critical baseline information for evaluating larger-scale impacts in forest ecosystems as trees are subjected to climate-change-induced stress.
Climate change will likely have dramatic impacts on forest health because many forest trees could become maladapted to their climate. Furthermore, climate change will have additional impacts on forest health through changes in the distribution and severity of forest diseases. Methods are needed to predict the influence of climate change on forest disease so that appropriate forest management practices can be implemented to minimize disease impacts. Initial approaches for predicting the future distribution of pathogens depend on reliable data sets that document the current, precise locations of accurately identified pathogens and hosts. Precise distribution information can be used in conjunction with available climate surfaces to determine which climatic factors and interactions influence species distribution. This information can be used to develop bioclimatic models that predict the probability of suitable climate space for host and pathogen species across the landscape. A similar approach, using climate surfaces under predicted future climate scenarios, can project suitable climate space for hosts and pathogens in the future. Such predictions are currently well developed for many forest host species, but predictive capacity is extremely limited for forest pathogens because distribution data are lacking. Continued surveys and research are needed to further refine bioclimatic models that predict the influences of climate and climate change on forest disease.
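The workflow described above, relating documented occurrences to climate surfaces and then projecting under future climate scenarios, can be illustrated with a minimal climate-envelope sketch. The following Python example is purely illustrative and is not the authors' model; the predictors, coefficients, and data are hypothetical stand-ins for real survey and climate records.

```python
# Minimal climate-envelope sketch: fit pathogen presence/absence to climate
# predictors, then project suitability under a hypothetical future climate.
# All data and coefficients below are fabricated for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)

# Hypothetical survey plots: mean annual temperature (degC) and annual
# precipitation (mm), with pathogen presence (True) or absence (False).
n = 300
temp = rng.uniform(2, 14, n)
precip = rng.uniform(300, 1500, n)
# Assume, for illustration, that presence is more likely in warmer,
# wetter climate space.
logit = -8.0 + 0.5 * temp + 0.003 * precip
presence = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([temp, precip])
model = LogisticRegression(max_iter=1000).fit(X, presence)

# Project onto a hypothetical future climate surface
# (e.g., +2 degC, -10% precipitation).
X_future = np.column_stack([temp + 2.0, precip * 0.9])
suitability_now = model.predict_proba(X)[:, 1]
suitability_future = model.predict_proba(X_future)[:, 1]
print(f"mean suitability now:    {suitability_now.mean():.2f}")
print(f"mean suitability future: {suitability_future.mean():.2f}")
```

In practice, such models would be fit to georeferenced occurrence records against gridded climate surfaces, which is exactly why the precise, accurately identified distribution data called for above are the limiting ingredient.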
Forest management planning can be challenging when allocating multiple ecosystem services (ESs) to management units (MUs), given the potentially conflicting management priorities of actors. We developed a methodology to spatially allocate ESs to MUs according to the objectives of four interest groups: civil society, forest owners, market agents, and public administration. We applied a Group Multicriteria Spatial Decision Support System approach, combining (a) Multicriteria Decision Analysis to weight the decision models; (b) a focus group and a multicriteria Pareto frontier method to negotiate a consensual solution for seven ESs; and (c) the Ecosystem Management Decision Support (EMDS) system to prioritize the allocation of ESs to MUs. We report findings from an application to a joint collaborative management area (the ZIF of Vale do Sousa) in northwestern Portugal. The forest owners selected wood production as the first ES allocation priority, with lower priorities for other ESs. In contrast, the civil society group assigned the highest allocation priorities to biodiversity, cork, and carbon stock, and the lowest priority to wood production. The civil society group had the highest mean rank of allocation priority scores. We found significant differences in priority scores between civil society and the other three groups, with civil society and market agents being the most discordant groups. We spatially evaluated the potential for conflict among the groups' ES allocation priorities. The findings suggest that this approach can help decision makers increase the effectiveness of forest management plan implementation.
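The core allocation step can be pictured with a much-simplified sketch: given suitability scores of each MU for each ES and a set of negotiated weights, score every MU-ES pair and allocate each MU to its highest-priority ES. The Python example below is a toy illustration, not the EMDS system or the Pareto-frontier negotiation used in the study; the service names, weights, and suitability scores are all hypothetical.

```python
# Toy weighted-score allocation of ecosystem services (ESs) to management
# units (MUs). Weights and suitability values are hypothetical.
import numpy as np

es_names = ["wood", "cork", "biodiversity", "carbon"]

# Hypothetical consensual weights for each ES (e.g., the output of an
# MCDA weighting and negotiation step).
weights = np.array([0.20, 0.30, 0.30, 0.20])

# Hypothetical suitability of each MU (rows) for each ES (columns), in [0, 1].
suitability = np.array([
    [0.9, 0.2, 0.4, 0.5],
    [0.3, 0.8, 0.6, 0.4],
    [0.2, 0.3, 0.9, 0.7],
])

# Weighted priority score of each ES in each MU; allocate each MU to its
# highest-scoring ES.
scores = suitability * weights
allocation = [es_names[i] for i in scores.argmax(axis=1)]
for mu, es in enumerate(allocation, start=1):
    print(f"MU {mu}: allocate to {es}")
```

Disagreement among interest groups enters through the weights: rerunning the allocation with each group's own weight vector, and comparing the resulting maps, is one simple way to see where allocation conflicts would arise spatially.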
In addition to long-standing concerns about sustaining forest productivity, maintaining forest ecosystems under changing conditions and emerging threats has become increasingly important when planning forest management. To understand the effects of management on both productivity and recovery, we quantified the 25-year impacts of varying degrees of organic matter (OM) removal and soil compaction on aboveground biomass, soil carbon and nutrients, soil bulk density, and stand development in aspen-dominated forests of the upper Lake States region of the US. Treatment impacts were assessed at three sites with comparable overstory composition but varying soil texture, site quality, and climate. Across all sites, soil C and N generally decreased with increasing OM removal, and bulk density increased with increasing compaction; 25-year observations indicate recovery of bulk density at the surface (0–10 cm) but not at deeper portions of the soil profile. At the most productive site (loamy soils), with favorable initial soil porosity, severe compaction decreased mean aboveground biomass (-46%), particularly of trees (-73%). Biomass at 25 years did not differ among OM removal treatments (e.g., stem-only harvest), but soil C increased more with stem-only harvest than with whole-tree harvest plus forest floor removal. In contrast, at a less productive site with sandy soils poorly buffered against nutrient and C removals, whole-tree harvest reduced biomass by 25% (tree biomass declined 35%) relative to stem-only harvest, while compaction treatments did not differ in their effects on biomass production, soil C, or soil N. On clay soils, compaction treatments did not significantly affect biomass production, but whole-tree harvest plus forest floor removal reduced tree biomass by 47% relative to whole-tree harvest alone. Assessment of mean relative density indicates that canopy closure has not yet occurred at the least productive site (clay soils) or in the more severely disturbed stands at the intermediate site (sandy soils), suggesting that treatment impacts not yet discernible may become more pronounced as stands develop and nutrient uptake continues. Our results align with concepts of soil quality and texture-specific limitations to growth, underscoring the need to understand key soil limitations when considering forest management impacts on aboveground structure and productivity.
In young Douglas-fir (Pseudotsuga menziesii (Mirb.) Franco var. menziesii) stands, intensive forest management has resulted in variable responses in survival and growth, and therefore in stand biomass, owing to effects on microclimate and competition for limited resources. As seedling genetics and management practices develop over time, there is a need to re-evaluate how Douglas-fir survival and growth vary with intensive practices. Five-year Douglas-fir diameter at 15 cm (D15) and height growth, biomass, and nitrogen (N) pools after organic matter (OM) removals and soil compaction at a recently established site in central Oregon (NARA) were compared with five-year results from three Douglas-fir sites with OM removal and vegetation control treatments established 10–14 years earlier in central Washington (Fall River and Matlock) and northern Oregon (Molalla). The NARA site had up to two times greater individual-tree foliar and branch biomass for the same stem D15 than the previously established sites, possibly due to a lower planting density, a warmer climate, and/or improved genetics or different seed sources. Stem biomass was predicted similarly at all sites using D15² × height. Periodic D15 and height growth from 1 to 3 years and five-year Douglas-fir biomass were lower in the bole-only (BO) without-compaction treatment at NARA than in other treatment combinations. Greater periodic D15 and height growth after five years of annual vegetation control (Fall River, Matlock, and Molalla) resulted in larger Douglas-fir biomass compared with the initial vegetation control treatments. These results indicate the beneficial effects of vegetation control on early growth across a wide range of conditions. With one exception, whole-tree removal treatments did not affect five-year-old Douglas-fir biomass or N pools, indicating site resilience to increased OM removal across the region. A principal component analysis (PCA) clearly separated all four sites according to climate, soil temperature and moisture, N availability, D15 and height growth, and Douglas-fir and competing vegetation biomass. Within each site, the PCA showed that vegetation control was associated with the greatest differences in Douglas-fir and competing vegetation biomass. This supports the notion that site factors and vegetation control treatments are critically important in mediating responses to intensive management practices.
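The allometric form noted above, predicting stem biomass from the combined variable D15² × height, can be sketched with a simple regression. The Python example below is illustrative only; the measurements and the fitted coefficients are fabricated, not values from the study.

```python
# Illustrative fit of stem biomass to the combined variable D15^2 * height.
# All tree measurements below are synthetic stand-ins for field data.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical young-tree measurements: D15 in cm, height in m, biomass in kg.
d15 = rng.uniform(1.0, 8.0, 50)
height = 0.8 + 1.2 * d15 + rng.normal(0, 0.3, 50)
d2h = d15**2 * height
stem_biomass = 0.015 * d2h + rng.normal(0, 0.2, 50)

# Ordinary least squares fit of biomass on D15^2 * height (through-origin
# forms are also common; an intercept is kept here for simplicity).
slope, intercept = np.polyfit(d2h, stem_biomass, 1)
pred = slope * d2h + intercept
r2 = 1 - np.sum((stem_biomass - pred) ** 2) / np.sum(
    (stem_biomass - stem_biomass.mean()) ** 2)
print(f"biomass ~ {slope:.4f} * D15^2*H + {intercept:.3f}  (R^2 = {r2:.2f})")
```

D²H-type predictors are standard in tree allometry because diameter squared times height scales with stem volume, which is why a single such equation can perform similarly across sites even when crown biomass differs.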
We present the results of a replicated before-after-control-impact study of 33 streams testing whether riparian rules for private and State forests meet stream temperature criteria in western Oregon. Many states have established regulatory temperature thresholds, referred to as numeric criteria, to protect cold-water fishes such as salmon and trout. We examined across-year and within-year patterns of exceedance at control and treatment stream temperature probes. Determining whether an exceedance at the downstream end of a harvest unit was unambiguously related to the harvest proved surprisingly difficult. The likelihood of a site exceeding its numeric criterion appeared related, in part, to the site's preharvest temperature range. Four control reaches and three preharvest treatment reaches exceeded their numeric criteria, necessitating additional analysis to evaluate timber harvest impacts. Nine percent of sites (3 of 33) both exceeded their numeric criteria and exhibited a potential harvest effect (16.7% of private sites [3 of 18]; 0% of State sites [0 of 15]). After harvest, exceedances were typically observed only in the first of the two postharvest years. These findings highlight the importance of including temporal and spatial controls in assessments of numeric criteria when the assessment's purpose is to determine whether exceedances are related to human activities.
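The role of a spatial control in this kind of exceedance analysis can be illustrated with a minimal sketch. The Python example below is hypothetical: the 18 °C threshold, the 7-day moving-average convention, and the temperature records are stand-ins for the study's actual criteria and data, and flagging a treatment-only exceedance is only a first screen, not a full attribution of harvest effects.

```python
# Minimal sketch of flagging numeric-criterion exceedances at paired
# control and treatment probes. Threshold, averaging window, and data
# are hypothetical.
import numpy as np

rng = np.random.default_rng(7)
CRITERION = 18.0  # degC, hypothetical numeric criterion

def seven_day_mean(daily_max):
    """7-day moving average of daily maximum temperatures."""
    kernel = np.ones(7) / 7
    return np.convolve(daily_max, kernel, mode="valid")

# Hypothetical summer records of daily maximum stream temperature (degC).
control = 15 + 2.5 * np.sin(np.linspace(0, np.pi, 90)) + rng.normal(0, 0.5, 90)
treatment = control + 0.8  # assume modest warming at the treatment probe

exceed_control = seven_day_mean(control) > CRITERION
exceed_treatment = seven_day_mean(treatment) > CRITERION

# An exceedance at the treatment probe but not at the control is the
# pattern one would investigate further as a potential harvest effect.
potential_effect = exceed_treatment & ~exceed_control
print(f"control exceedance days:    {exceed_control.sum()}")
print(f"treatment exceedance days:  {exceed_treatment.sum()}")
print(f"treatment-only exceedances: {potential_effect.sum()}")
```

As the study found, control reaches can exceed the criterion on their own, which is precisely why comparing treatment probes against temporal (preharvest) and spatial (control) references is needed before attributing an exceedance to harvest.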