Decision Support Tools to Inform the Rehabilitation and Management of High Graded Forests
Abstract
Numerous forests in the eastern United States have been degraded due to past exploitative timber harvesting known as high grading. High graded forest stands may not improve without active rehabilitation and may require targeted silvicultural treatments. This study focuses on high graded mixed-oak (mixed-Quercus spp.) stands and aims to develop a model that can identify past high grading and to determine modifications that may improve forest management recommendations provided by the prominent decision support tool, SILVAH. We present a model that uses standard forest inventory measurements and does not require knowledge of preharvest stand conditions to predict with moderate to high accuracy whether a stand was high graded, which could be particularly useful for nonindustrial private forests. Results indicate that modifications to SILVAH may be necessary to improve its utility for prescribing silvicultural treatments in high graded stands.
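The abstract describes predicting past high grading from standard stand-level inventory measurements. As a purely illustrative sketch, the snippet below trains a random-forest classifier on synthetic stand summaries; the feature set (basal area, trees per acre, quadratic mean diameter, oak proportion, acceptable growing stock proportion), the choice of classifier, and the labels are all assumptions for demonstration and are not the model published in the study.

```python
# Hypothetical sketch only: classify past high grading from stand-level
# inventory summaries. Features, labels, and classifier are assumptions,
# not the authors' published model.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Synthetic placeholder data: one row per stand.
# Assumed columns: basal area (ft^2/ac), trees per acre,
# quadratic mean diameter (in), oak share of basal area,
# share of basal area in acceptable growing stock.
n_stands = 200
X = np.column_stack([
    rng.uniform(40, 140, n_stands),   # basal area
    rng.uniform(100, 600, n_stands),  # trees per acre
    rng.uniform(4, 16, n_stands),     # quadratic mean diameter
    rng.uniform(0.1, 0.9, n_stands),  # oak proportion
    rng.uniform(0.1, 0.9, n_stands),  # acceptable growing stock proportion
])
# Synthetic labels (1 = high graded, 0 = not); a real application would use
# stands with documented harvest histories.
y = (X[:, 4] + 0.1 * rng.standard_normal(n_stands) < 0.45).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

clf = RandomForestClassifier(n_estimators=300, random_state=0)
clf.fit(X_train, y_train)
print("holdout accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

Because all inputs here are synthetic, the printed accuracy is meaningless; the sketch only shows how standard inventory summaries could feed a binary classifier of harvest history.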
Study Implications: High graded forest stands are often not readily apparent and likely require specific forest management practices. We present a tool that uses standard forest inventory measurements to predict past high grading, which can be used to inform and prioritize forest management decisions. We also present suggested modifications to the prominent decision support tool, SILVAH, that may improve its ability to prescribe optimal silvicultural treatments for high graded stands. Results from this study provide forestry professionals and landowners working in the mixed-oak forests of the northeastern United States with tools to inform forest management decisions that aim to return degraded stands to healthier and more productive states.
Publication Date: 2022
Credits: Advance Access publication, February 11, 2022
Source: Journal of Forestry, Volume 120, Issue 5, September 2022, Pages 527–542, https://doi.org/10.1093/jofore/fvab077