In interior Alaska’s 115 million acres of boreal forest, white and black spruce are the dominant tree species. Climate models suggest that the region is becoming warmer and drier, and some researchers have reported declining growth of black and white spruce as a result. These drier conditions may also increase the risk of stand-replacing wildfires, leading to forests dominated by birch and aspen, both early-successional tree species.
To compare long-term growth trends of the dominant coniferous and deciduous tree species, a team of researchers with the USDA Forest Service Pacific Northwest Research Station and the University of Alaska Anchorage analyzed tree cores collected from the Tanana Valley, measuring tree-ring widths of these four tree species over the past 150 years. They also compared growth against monthly temperature and precipitation data to determine whether climate and growth are correlated.
The team found that white and black spruce have not experienced as rapid a growth decline as earlier studies suggested; instead, their annual growth remains near the long-term mean. Of the four species examined, aspen showed the greatest recent growth decline, likely reflecting a widespread insect outbreak. Among the climate variables that will affect the future growth of these species, summer rainfall was identified as a significant factor.
Data describing aircraft position and attitude are essential to computing return positions from ranging data collected during airborne laser scanning (ALS) campaigns. However, these data are often excluded from the products delivered to the client, and their recovery after the contract is complete can require negotiation with the data provider, involve additional costs, or prove infeasible. This paper presents a rigorous, fully automated, novel method for recovering aircraft positions using only the point cloud. The study used ALS data from five acquisitions in the US Pacific Northwest states of Oregon and Washington and validated derived aircraft positions using the smoothed best estimate of trajectory (SBET) provided for the acquisitions. The computational requirements of the method are reduced and precision is improved by relying on subsets of multiple-return pulses, common in forested areas, with widely separated first and last returns positioned at opposite sides of the aircraft to calculate their intersection, or closest point of approach. To provide a continuous trajectory, a cubic spline is fit to the intersection points. While it varies by acquisition and parameter settings, the error in the computed aircraft position seldom exceeded a few meters. This level of error is acceptable for most applications. To facilitate use and encourage modifications to the algorithm, the authors provide code that can be applied to data from most ALS acquisitions.
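The closest-point-of-approach step described above can be sketched as follows, assuming each multiple-return pulse is reduced to a (last return, first return) pair of coordinates; the function name and the toy geometry are illustrative and not taken from the paper.

```python
def closest_point_of_approach(pulse_a, pulse_b):
    """Estimate a sensor position from two pulses, each a pair
    (last_return, first_return) of (x, y, z) tuples. The ray from the
    last return through the first return points back toward the aircraft;
    the midpoint of the closest approach of the two rays approximates
    the aircraft position when the pulses were emitted."""
    (p1, q1), (p2, q2) = pulse_a, pulse_b
    d1 = [q1[i] - p1[i] for i in range(3)]   # direction of ray A
    d2 = [q2[i] - p2[i] for i in range(3)]   # direction of ray B
    w0 = [p1[i] - p2[i] for i in range(3)]
    dot = lambda u, v: sum(u[i] * v[i] for i in range(3))
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w0), dot(d2, w0)
    denom = a * c - b * b                     # ~0 when rays are near-parallel
    if abs(denom) < 1e-9:
        return None                           # skip poorly conditioned pairs
    t = (b * e - c * d) / denom
    s = (a * e - b * d) / denom
    pa = [p1[i] + t * d1[i] for i in range(3)]
    pb = [p2[i] + s * d2[i] for i in range(3)]
    return [(pa[i] + pb[i]) / 2 for i in range(3)]

# Two synthetic pulses whose rays both pass through an "aircraft" at (0, 0, 1000):
pulse_a = ((100.0, 0.0, 0.0), (50.0, 0.0, 500.0))     # (last, first) returns
pulse_b = ((-100.0, 50.0, 0.0), (-50.0, 25.0, 500.0))
print(closest_point_of_approach(pulse_a, pulse_b))    # → [0.0, 0.0, 1000.0]
```

Because the two rays diverge strongly (returns on opposite sides of the aircraft), the closest-approach solution is well conditioned, which is why the method favors such pulse pairs.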
Evidence has been emerging in the literature of shifting dominance among major forest disturbance agent classes at regional to global scales. For example, climate-related stress and secondary stressors on forests (e.g., insects and disease, fire) have increased dramatically worldwide since the turn of the century, while harvest rates in the western US and elsewhere have declined. For shifts to be quantified, accurate historical forest disturbance estimates are required as a baseline for examining current trends. We report annual disturbance rates (with uncertainties) in the aggregate and by major change causal agent class for the conterminous US and five geographic subregions between 1985 and 2012. Results are based on human interpretations of Landsat time series from a probability sample of 7,200 plots (30 m) distributed throughout the study area. Forest disturbance information was recorded with a Landsat time series visualization and data collection tool that incorporates ancillary high-resolution data. National rates of disturbance varied between 1.5% and 4.5% of forest area per year, with trends strongly affected by shifting dominance among specific disturbance agent influences at the regional scale. Throughout the time series, national harvest disturbance rates varied between 1% and 2% and were largely a function of harvest in the more heavily forested regions of the US (Mountain West, Northeast, and Southeast). During the first part of the time series, national disturbance rates largely reflected trends in harvest disturbance. Beginning in the mid-1990s, forest decline-related disturbances associated with diminishing forest health (e.g., physiological stress leading to tree canopy cover loss, increases in tree mortality above background levels), especially in the Mountain West and Lowland West regions of the US, increased dramatically. Consequently, national disturbance rates greatly increased by 2000 and remained high for much of the decade.
Decline-related disturbance rates reached as high as 8% per year in the western regions during the early-2000s. Although low compared to harvest and decline, fire disturbance rates also increased in the early- to mid-2000s. We segmented annual decline-related disturbance rates to distinguish between newly impacted areas and areas undergoing gradual but consistent decline over multiple years. We also translated Landsat reflectance change into tree canopy cover change information for greater relevance to ecosystem modelers and forest managers, who can derive better understanding of forest-climate interactions and better adapt management strategies to changing climate regimes. Similar studies could be carried out for other countries where there are sufficient Landsat data and historic temporal snapshots of high-resolution imagery.
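As a simplified illustration of how an annual disturbance rate and its uncertainty can be computed from a probability sample of plots, the sketch below assumes simple random sampling; the study's actual design-based estimators reflect its sampling design, and the plot counts here are hypothetical.

```python
from math import sqrt

def disturbance_rate(n_disturbed, n_forest_plots, z=1.96):
    """Estimate the proportion of forest plots disturbed in a year, with a
    normal-approximation 95% confidence interval. Assumes simple random
    sampling; a design-based estimator would be used for stratified or
    clustered designs."""
    p = n_disturbed / n_forest_plots
    se = sqrt(p * (1 - p) / n_forest_plots)
    return p, (p - z * se, p + z * se)

# Hypothetical counts: 216 disturbed plots out of 7,200 sampled plots.
rate, ci = disturbance_rate(216, 7200)
print(f"{rate:.1%} per year, 95% CI ({ci[0]:.2%}, {ci[1]:.2%})")
```

With several thousand plots, annual rates on the order of a few percent carry standard errors of a few tenths of a percent, which is why the national trends can be resolved from the sample.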
In Phase III of the North American Forest Dynamics (NAFD) study, an automated workflow was developed for evaluating forest disturbance history using Landsat observations. It has four major components: an automated approach for image selection and preprocessing, the vegetation change tracker (VCT) forest disturbance analysis, post-processing, and validation. This approach was applied to the conterminous US (CONUS) to produce a comprehensive analysis of US forest disturbance history using the NASA Earth Exchange (NEX) cloud computing system. The resulting NAFD-NEX product includes 25 annual forest disturbance maps for 1986–2010 and two time-integrated maps that provide a synoptic spatial-temporal view of disturbances over this period. These maps were derived from 24,000+ scenes selected from 350,000+ available Landsat images at 30-m resolution, and were validated using a visual assessment of Landsat time-series images in combination with high-resolution and other ancillary data sources over samples selected using a probability-based sampling method. The validation revealed no major biases in the NAFD-NEX maps for disturbance events that resulted in at least 20% canopy cover loss. The average user's and producer's accuracies for the disturbance class were 53.6% and 53.3%, respectively, with individual years' user's accuracy varying from 42.8% to 73.6% and producer's accuracy from 39.0% to 84.8% over the 25-year period. The NAFD-NEX disturbance maps are available from a web portal of the Oak Ridge National Laboratory Distributed Active Archive Center (ORNL-DAAC) at https://doi.org/10.3334/ORNLDAAC/1290.
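User's and producer's accuracies such as those reported above are derived from an error (confusion) matrix of map labels against reference labels. A minimal sketch, using hypothetical counts rather than the study's actual validation data:

```python
def users_producers_accuracy(matrix):
    """Per-class user's accuracy (rows = map labels) and producer's
    accuracy (columns = reference labels) from a square error matrix
    of sample counts."""
    n = len(matrix)
    users = [matrix[i][i] / sum(matrix[i]) for i in range(n)]
    producers = [matrix[i][i] / sum(matrix[r][i] for r in range(n))
                 for i in range(n)]
    return users, producers

# Hypothetical counts: rows/columns = (disturbed, undisturbed).
m = [[50, 20],
     [30, 900]]
users, producers = users_producers_accuracy(m)
print(users[0], producers[0])  # user's ≈ 0.714, producer's = 0.625
```

User's accuracy answers "how often is a mapped disturbance real?" (commission error), while producer's accuracy answers "how often is a real disturbance mapped?" (omission error); the two can differ markedly for rare classes like annual disturbance.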
The Rocky Mountain Research Station works with National Forest planning teams to understand and maximize an important resource: forest data collected by the Forest Service’s Forest Inventory and Analysis (FIA) program. The program’s website, found at https://www.fia.fs.fed.us, provides a variety of tools that allow users to download standard reports and create custom queries that can improve the efficiency of the planning process. By putting FIA data to work, National Forest planners are able to meet the 2012 Planning Rule’s requirements for monitoring and using the best available science. For example, National Forest planning teams can use FIA data to better understand forest characteristics and conditions using readily available data and FIA analysis skills. Additional information on FIA resources for the Interior West region can be found at https://www.fs.usda.gov/rmrs/interior-west-forest-inventory-analysis-fia. Other resources for National Forest plan revision teams include riparian and groundwater-dependent ecosystems assessments and a nationwide toolset of National Forest Climate Change Maps.
Of California’s almost 100 million ac, about a third (32 million ac) are forested. This report, including the accompanying tables, summarizes key findings from the 5,369 Forest Inventory and Analysis (FIA) plots measured in California’s forests during the period 2006–2015. Estimates are provided for forest area, ownership, species composition and distribution, size and age classes, volume, biomass, carbon, dead and downed wood, and understory vegetation. Starting in 2001, plots were measured on a 10-year cycle (10 percent of all plots measured annually). Thus, the plots measured in 2011–2015 represent completion of half of the remeasurement cycle; estimates of growth, mortality, and removals from remeasured plots are also included. The U.S. Forest Service manages about half of California’s forested land (48 percent). Fifty-two percent of California’s forests are categorized as timberland (unreserved forest land capable of producing ≥20 ft³ of wood per acre per year), predominantly consisting of the California mixed-conifer type. The most common forest type on the remaining 48 percent was western oak. Mean annual gross growth was 1.99 billion ft³/year. Subtracting harvest removals (21 percent of growth) and mortality (45 percent of growth) still resulted in a positive net growth of 673 million ft³/year. Among some commercially important tree species, damage was present in 17 to 27 percent of trees, including Douglas-fir (17 percent), white fir (27 percent), ponderosa pine (20 percent), and redwood (17 percent). The two most prevalent nonnative species were both grasses: cheatgrass (an estimated 277,000 ac of cover) and ripgut brome (234,000 ac). During the 10-year period, the years with the most forested acres showing evidence of fire were 2008 and 2015. FIA plots will continue to be measured as stipulated by the 1998 Farm Bill. By the time the next FIA report for California is issued, a full remeasurement cycle will have been completed.
Methods to accurately estimate spatially explicit fuel consumption are needed because consumption relates directly to fire behavior, effects, and smoke emissions. Our objective was to quantify sparkleberry (Vaccinium arboreum Marshall) shrub fuels before and after six experimental prescribed fires at Fort Jackson in South Carolina. We used a novel approach to characterize shrubs non-destructively from three-dimensional (3D) point cloud data collected with a terrestrial laser scanner. The point cloud data were reduced to 0.001-m³ voxels that were either occupied, indicating fuel presence, or empty, indicating fuel absence. The density of occupied voxels was significantly related by a logarithmic function to 3D fuel bulk density samples that were destructively harvested (adjusted R² = 0.32, P < 0.0001). Based on our findings, a survey-grade Global Navigation Satellite System may be necessary to accurately associate 3D point cloud data with 3D fuel bulk density measurements destructively collected in small (submeter) shrub plots. A recommendation for future research is to accurately geolocate and quantify the occupied volume of entire shrubs as 3D objects that can be used to train models to map shrub fuel bulk density from point cloud data binned to occupied 3D voxels.
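The voxel step can be illustrated as follows: a 0.001-m³ voxel corresponds to a cube with a 0.1-m edge, and each point is binned to the voxel containing it. The point coordinates and function name below are hypothetical, not from the study.

```python
from math import floor

def occupied_voxels(points, voxel_volume=0.001):
    """Bin (x, y, z) points (meters) into cubic voxels of the given
    volume (m^3) and return the set of occupied voxel indices.
    A 0.001 m^3 voxel has a 0.1 m edge length."""
    edge = voxel_volume ** (1 / 3)
    return {(floor(x / edge), floor(y / edge), floor(z / edge))
            for x, y, z in points}

# Three points: two fall in the same 0.1-m voxel, one in a neighbor.
pts = [(0.02, 0.03, 0.01), (0.07, 0.09, 0.04), (0.15, 0.03, 0.01)]
vox = occupied_voxels(pts)
print(len(vox))  # → 2
```

The count (or density) of occupied voxels per plot is the predictor that the study then related to destructively sampled bulk density via the fitted logarithmic function.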
Fine particulate matter (PM2.5) is a well-established risk factor for public health. To support both health risk assessment and epidemiological studies, data are needed on spatial and temporal patterns of PM2.5 exposures. This review article surveys publicly available exposure datasets for surface PM2.5 mass concentrations over the contiguous U.S., summarizes their applications and limitations, and provides suggestions on future research needs. The complex landscape of satellite instruments, model capabilities, monitor networks, and data synthesis methods offers opportunities for research development, but would benefit from guidance for new users. Guidance is provided to access publicly available PM2.5 datasets, to explain and compare different approaches for dataset generation, and to identify sources of uncertainties associated with various types of datasets. Three main sources used to create PM2.5 exposure data are: ground-based measurements (especially regulatory monitoring), satellite retrievals (especially aerosol optical depth, AOD), and atmospheric chemistry models. We find inconsistencies among several publicly available PM2.5 estimates, highlighting uncertainties in the exposure datasets that are often overlooked in health effects analyses. Major differences among PM2.5 estimates emerge from the choice of data (ground-based, satellite, and/or model), the spatiotemporal resolutions, and the algorithms used to fuse data sources.
Implications: Fine particulate matter (PM2.5) has large impacts on human morbidity and mortality. Even though methods for generating PM2.5 exposure estimates have improved significantly in recent years, there is a lack of review articles documenting PM2.5 exposure datasets that are publicly available and easily accessible to the health and air quality communities. In this article, we discuss the main methods used to generate PM2.5 data, compare several publicly available datasets, and show the applications of various data fusion approaches. Guidance on accessing and critiquing these datasets is provided for stakeholders in public health sectors.
Given the increasing interest in the use of lidar-based remote sensing to support carbon monitoring systems, including those used for measurement, reporting, and verification requirements of REDD+ (reduced emissions from deforestation and degradation) programs in tropical nations, there is a corresponding need for cost-effective field measurement systems to enable model development and robust assessment of uncertainty in these programs. In this report, we describe an efficient field sampling design and measurement protocol intended to provide field-based estimates of biomass/carbon stored in trees and coarse woody materials to support complex, multilevel carbon monitoring systems using Landsat time series and airborne lidar. Airborne lidar data were collected as a strip sample (single flight lines spaced 5 km apart) at six sites in the United States (Colorado, Maine, Minnesota, Oregon, Pennsylvania/New Jersey, and South Carolina) (298 total flight lines), and about 50 field plots were established within the lidar coverage at each site. Field plots were distributed across 15 (3 cover × 5 height) strata at each site, using field protocols consistent with those of the U.S. Department of Agriculture Forest Service Forest Inventory and Analysis program. Field measurements were collected digitally on personal tablets and uploaded daily into a database. Preliminary regression analyses indicate strong relationships between lidar metrics and tree biomass at all sites.
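The 15-stratum design (3 canopy-cover classes × 5 height classes) amounts to a simple cross-tabulation used when allocating plots; the sketch below uses hypothetical class breakpoints, since the study's actual class limits are not given here.

```python
import bisect
from collections import Counter

# Hypothetical breakpoints; the study's actual class limits may differ.
COVER_BREAKS = [25.0, 60.0]               # % canopy cover -> 3 classes
HEIGHT_BREAKS = [5.0, 10.0, 20.0, 30.0]   # canopy height (m) -> 5 classes

def stratum(cover_pct, height_m):
    """Map a lidar-derived (cover, height) pair to one of 15 strata (0-14)."""
    c = bisect.bisect_right(COVER_BREAKS, cover_pct)   # 0..2
    h = bisect.bisect_right(HEIGHT_BREAKS, height_m)   # 0..4
    return c * 5 + h

# Tally candidate plot locations per stratum before drawing the field sample.
candidates = [(10.0, 3.0), (70.0, 25.0), (40.0, 12.0), (80.0, 35.0)]
tally = Counter(stratum(c, h) for c, h in candidates)
print(sorted(tally.items()))
```

Spreading roughly equal numbers of plots across strata like these ensures the field sample covers the full range of lidar-observed forest structure, which strengthens the biomass models fit to the lidar metrics.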
This report highlights key findings from the most recent 10-year survey of Forest Inventory and Analysis (FIA) data collected across southeast and south-central Alaska and represents the first full remeasurement of all forest plots in the coastal Alaska inventory unit. Estimates of forest area, stand age, volume, aboveground biomass, and carbon are provided across ownerships, forest types, and species throughout the region. Of the 54 million ac in the inventory unit, approximately 15 million ac (28 percent) were considered forest land, most of which is managed by Tongass National Forest in southeast Alaska. Western hemlock (Tsuga heterophylla), mountain hemlock (T. mertensiana), Alaska yellow-cedar (Callitropsis nootkatensis), and Sitka spruce (Picea sitchensis) forest types dominate the region, together accounting for 75 percent of total forest area and 86 percent of total aboveground biomass. Understory vegetation was dominated by oval leaf blueberry, rusty menziesia, and bunchberry dogwood, while nonforest areas were dominated by tall and dwarf shrub community types characterized by Sitka alder (Alnus viridis ssp. sinuata), salmonberry (Rubus spectabilis), and sweet gale (Myrica gale). Over the 10-year remeasurement cycle (1995–2003 to 2004–2013), net change in forest volume was mostly positive, with the exception of privately owned lands, where timber removals exceeded growth. Among softwood species, only lodgepole pine (Pinus contorta) (also known as shore pine) displayed a net loss in biomass, while mountain hemlock, Sitka spruce, western redcedar (Thuja plicata), and all hardwood species exhibited a net increase in biomass. Mortality rate was highest for white spruce (P. glauca), likely driven by a large spruce bark beetle outbreak in the late 1990s. However, white spruce also experienced a higher growth rate than other softwood species, perhaps reflecting a growth release among survivors of the beetle attack. 
This report serves as an updated version to the forest attribute data summarized by Barrett and Christensen (2011) and provides important insight into forest resources for land managers, industry, and researchers.