Effects of Tree Mortality on Power Line Security
by Siegfried Guggenmoos
Abstract. Others have reported that instances where
trees grow into lines rarely result in power outages. The
vast majority of tree-related outages stem from tree
failure, particularly if outages during severe weather events
are included. Generally, tree-conductor conflicts resulting
from tree failure are classified as unpreventable because the
trees are located outside the right-of-way. In the
emerging competitive environment, utilities will require a means
of decreasing so-called unpreventable outages. The
primary locations for unpreventable outages are areas where
lines run adjacent to or through natural forest tree stands.
Tree mortality exposes a power line to a high risk of tree
incidents over time. The risk to the line is directly related to
the number of trees within striking distance of the line.
Conventional clear widths leave a substantial residual tree
risk. Hazard tree removal programs do not provide
enduring reliability gains. A new mathematical model, the
optimal clear width calculator, is used to assess the tree risk
over variable clear widths and line heights. The risk ratings in
the output line strike probability charts permit
quantitative comparisons of construction and maintenance options.
The line strike probability chart indicates that there is a point
of diminishing return in line security for dollars invested
in additional clear width.
Key Words. Utility arboriculture; hazard tree;
tree-related outages; tree-conductor contacts; electric system
reliability; tree mortality; line strike risk; tree risk quantification.
It is estimated that North American utilities spend $2
billion (Rees et al. 1994; EPRI 1995; Goodfellow 2000) to
$10 billion (Transmission & Distribution World 2002)
annually on vegetation management to prevent service
disruptions and safety hazards associated with trees contacting
conductors. Throughout their 100-year history, utilities have
been challenged by tree-conductor conflicts that continue
to plague them today.
Trees are a major cause of power outages,
particularly on distribution systems. On distribution systems,
tree-related outages comprising 20% to 50% of all
unplanned outages are common (Rees et al. 1994; Simpson and
Van Bossuyt 1996; Johnstone 2001). Tree-related outages
exceeding 50% of the total tend to draw attention to
the need for remedial action (St. Petersburg Times
1999; Megawatt Daily 1999; Poole and Clements 2000).
While these percentages indicate that trees are a major threat
to reliability, the convention of excluding outage
statistics arising from severe storm events (Louisiana Public
Service Commission 1998; California Public Utilities
Commission 2000; Carris 2000; Michigan Public Service
Commission 2000; Finch and Allen 2001; Oregon Public Utility
Commission 2001) means the extent of the problem is understated.
Utilities are in the business of generating electricity
and delivering electricity and/or electrical service. They invest
in and install equipment to condition and transport
electricity to the point of use. This equipment, essential to
providing the service that constitutes the business, is an asset.
That which holds the potential of disrupting the service
and thereby the revenue stream constitutes a financial
liability. From a utility business perspective, all trees capable
of growing into a power line, or of striking it on failure, are not
only a legal liability due to human safety and property
concerns but also a financial liability.
Considering the long history of attention and
resources focused on reducing or eliminating tree-conductor
conflicts, the extent of the ongoing level of tree-related
outages suggests something is missing. Utility foresters balance
a plethora of competing interests. Challenged to satisfy
the need for safe, reliable, and economic electric
service; easement conditions; property rights; regulations;
environmental concerns; tree ordinances; public perceptions;
and aesthetics, one understands how it might be difficult "to
see the forest for the trees."
Tree-related outage statistics provide information
about the extent of tree exposure and efficacy of the
line-clearance program. However, these statistics are after the fact.
The intent of this article is to provide a means of
understanding and quantifying tree risk in advance of failure events.
This search for a conceptual framework for sustainable
tree-related outage reductions takes a dispassionate
view, focusing on trees, how forest stands develop and
die, characteristics of the electrical system and utility
business, and how they interact to impact reliability.
In developing the conceptual framework, this article
will examine from a North American perspective:
· the increasing pressure on utilities to reduce tree-related outages
· the source of tree-related outages
· natural tree mortality and predictive modeling of tree mortality
· the implications of tree mortality for
conventional hazard tree identification and removal program
cycles and intensity
· an approach to quantifying the risk of tree-line strikes
· using the cost of changes in tree risk to decide
among maintenance options
· the relationship between target clearance and tree risk
The primary audience for this article is utility
foresters. Secondary audiences include utility asset managers,
utility arboriculture consultants, utility regulators, forest
managers, and other stakeholders. Terminology used draws
both from utility and forestry domains. Standards and units
used in examples have been selected based on wide
commonality in the North American utility industry.
It will be demonstrated that the majority of the
risk associated with the tree liability arises from trees outside
the maintained right-of-way (ROW). Tree failure events
that disrupt electric service are weather related (Simpson
and Van Bossuyt 1996; Desbiens 2001; Finch and Allen
2001; Rogers 2001; Tomich 2001; Keener, no date).
Because severe weather may cause healthy, defect-free trees to
fail (Simpson and Van Bossuyt 1996; Desbiens 2001; Finch
and Allen 2001; Tomich 2001; Keener, no date), all trees
capable of interfering with power lines are included in the
risk assessment. In viewing all trees capable of interfering
with power lines as a liability, the assessment of tree risk
assumes a worst-case scenario. Hence, the quantification of the
tree risk as presented should be seen as the base case that will
be modified by other risk factors such as
species-specific failure characteristics and the frequency of severe
storms, drought, and pest infestations.
Nonetheless, tree risk assessment can be used on
a comparative basis. As such, the quantitative tree
risk assessment constitutes a tool that will provide
· a means for progressive reliability improvements
· another method of assessing the role of trees
on circuits experiencing poor reliability
· a means of setting specific, acceptable, residual tree risk levels
· a basis for prioritizing investment to maximize
reliability gains and minimize losses
· a means of rationalizing capital investment in
equipment or methodologies that prevent tree-caused outages
· a means of illustrating to regulators the need for
and prudence of line-clearance maintenance decisions
The use of a broad, inclusive definition for the
tree liability should not be construed to suggest that
utilities reduce the tree risk to zero by removing all trees capable
of interfering with power lines. Not only would such a
decision be met with public and regulator resistance but also, as
will be demonstrated, it would not be financially prudent.
CONTEXT FOR TREE-CONDUCTOR CONTACTS
AND SERVICE DISRUPTIONS
In little more than a decade, a firestorm in
Washington (Partners in Protection 1999), the burning of a
historic California town (EnergyOnline 1997a, 1997b; Olsen
2001) and two major western U.S. grid crashes
(EnergyOnline 1996a, 1996b, 1996c) have been attributed to
tree-conductor contact. In western North America,
where summer forest conditions tend to be dry,
tree-conductor contacts are a frequent cause of forest fires
(EnergyOnline 1997c, 1998a; Partners in Protection 1999).
Utilities in eastern North America face ice
storms (EnergyOnline 1998a; Desbiens 2001). In the south
and southeastern United States, windstorms are
relatively frequent events (Electric Perspectives 2001; Tomich
2001; Keener, no date). While the stress these events place on
the electrical system results in direct equipment failures,
often the majority of outages associated with these events
are indirect. They are the result of tree failures (Megawatt
Daily 1999; PRNewswire 1999; Tomich 2001).
The risk of major system outages caused by
severe weather events is increasing. Climatologists studying
global warming predict greater variability in weather in the
future. They forecast the number and severity of major
weather events will increase (Watson et al. 1998). The trend
may already be established. During the past 21 years, 48
extreme weather events, each with estimated damages
exceeding US$1 billion, hit the United States. Of these, 41
have occurred in the past 12 years (Hadden 2001).
The transmission component of the electrical system
is experiencing unprecedented load. Due to the
business uncertainty associated with evolving regulation
toward competitive markets and public resistance to siting
new transmission lines, expansion of the transmission system
has not kept pace with growing electricity demand (Owens 2001).
A new trend emerging from public utility commissions
is to specify reliability targets that must be met
(Kjellstrand 1998; Rights-of-Way Online 1999; Grayson 2001).
The number of states in the United States that have set
reliability standards increased from three in 1996 to 27 in 2001
(Bush 2002). A variant is performance-based ratemaking,
under which utilities will be financially rewarded for
exceeding reliability goals and, in some cases, punished for failures
to meet them (Grayson 2001). As of 2001, 11 states in
the United States have penalties and awards for
performance (Bush 2002). The effects of major storms on the
statistics are excluded from the base targets.
However, public utility commissions are increasingly
questioning whether a utility's past maintenance practices have
compounded the extent of storm damage
(EnergyOnline 1998b; Tomich 2001).
With the shift to and expansion of the digital
economy, reliability of the electric system takes on previously
unimagined significance. The annual U.S. economic loss due to
power outages is estimated to range from a conservative
US$50 billion (EPRI 1995) to US$100 billion (Lewis 2001).
In a recent RKS Research & Consulting survey
(Business Wires Features 2001), 75% of the respondents said
it "doesn't matter which company supplies electricity,
as long as delivery is reliable."
Evolving customer and regulator expectations
suggest that an approach of classifying tree-related outages
as nonpreventable will no longer be acceptable. The need
for reliable service has increased dramatically. Because of the
costs involved, the digital economy is intolerant of outages
(Lewis 2001). One might expect the most flexibility and
tolerance with light-load residential and small commercial
customers. However, it is unknown how long these customers,
dependent on electric service for security, comfort,
productivity, convenience, and recreation, will continue to be
forgiving of outages stemming from major storm events. Failure
to address the reliability issue will drive customers to adopt
the emerging distributed generation technologies to
free themselves of the grid.
SOURCE OF TREE-RELATED OUTAGES
To reduce tree-related outages, it is necessary to
examine their origin (Rees et al.
1994; Guggenmoos 1996; Simpson and Van Bossuyt
1996; Goodfellow 2000). Tree-related outages can be
classified into two groups based on fault type: those attributable
to tree growth and those attributable to tree failure.
When a pruning program begins to fall behind,
tree branches grow into conductors. Initially, as branches
begin to make contact with energized distribution conductors,
the shoots tend to be "burned off" through momentary
contact (Figure 1). At this early stage of tree-conductor
contact, rarely would we expect a fault to occur. Rees, of
Baltimore Gas & Electric, attributed only 2% of all tree-related
outages to trees growing up into a line (Rees et al. 1994).
Guggenmoos showed tree growth to account for 2% to 10% of
tree-related outages on TransAlta Utilities' distribution
system (Guggenmoos 1996). Finch, reporting on Niagara
Mohawk's tree-caused outages, indicated that tree growth accounts
for 14% of outages (Finch and Allen 2001), while
Rogers explained that part of the reasoning behind Puget
Sound Energy's Tree Watch program is that only 13.5% of
tree-related outages are attributable to tree growth
(Rogers 2001). From these geographically and ecologically
diverse utility systems, a common thread emerges: Tree growth
into power lines accounts for less than 15% of all
tree-related outages. A marked increase in outages due to growth is
not likely to occur until the pruning program is so far
behind cycle that tree branches are of a more substantial
diameter and in simultaneous contact with two phases (Rees et
al. 1994; Goodfellow 2000; Finch and Allen 2001).
Nevertheless, this finding should not be
interpreted to mean that trees growing into distribution lines are not
a risk. Safety and fire hazard risks increase in relation to
the decreasing clear distance between trees and bare
conductors and the number of incidents of tree-conductor contact.
These risks represent a legal liability not only to utilities but
also directly to utility executives and directors
(Guggenmoos 1996; EnergyOnline Daily News 1997a, 1997b).
Tree-conductor contacts arising from tree failure will
in most cases result in a fault by
· breaking the conductor or bringing it to the ground
· bringing phases into contact with each other
· making a substantive bridge between
phases, allowing a carbon path to develop and leading to a short (Rees
et al. 1994; Goodfellow 2000; Finch and Allen 2001)
Where maintenance practice does not remove overhangs, some electrical faults will arise from trees within
the right-of-way (Finch and Allen 2001); however, the
majority of tree-caused outages are the result of failure of
trees outside the right-of-way. Trees capable
of striking the line from outside the right-of-way
vastly outnumber those on it. This is particularly true
for distribution lines, which comprise roughly 90% of
the electric grid. North American distribution line heights
are generally 6.1 to 9.1 m (20 to 30 ft). Maintained
right-of-way width for distribution lines is commonly 6.1 to 9.1 m, with
a clear width of 3 to 4.5 m (10 to 15 ft) (Figure 2).
Where such distribution lines run through 24.2 m (80 ft) tall
tree stands, the maintained right-of-way area represents
only 12% to 19% of the total area from which
tree-conductor conflicts can arise. Along transmission lines, where
voltages are higher, a greater percentage of the area from which
tree conflicts could arise needs to be and typically is
maintained as right-of-way. The maintained portion may comprise
100% of this area but more commonly comprises 30% to
70%. The tree risks in the transmission line wire zone are
readily recognized (EnergyOnline 1996a, 1996b,
1996c; Goodfellow 2000) and, given adequate funding,
are generally addressed. Hence, it is the off-ROW trees that
will constitute the larger source of outages, particularly
under severe weather conditions.
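The 12% to 19% figure for distribution lines can be roughly reproduced by the triangulation the article implies. This is a sketch, not the author's calculation: it assumes the conductor sits at the pole and measures strike distance on the ground on each side.

```python
import math

def strike_distance(tree_h_ft, conductor_h_ft):
    # Ground distance from the conductor within which a falling tree
    # of the given height can still reach it (simple triangulation).
    return math.sqrt(tree_h_ft**2 - conductor_h_ft**2)

TREE_H = 80.0  # a 24.2 m (80 ft) stand, as in the text

# ROW widths of 20-30 ft and conductor heights of 20-30 ft, as in the text;
# the zone from which conflicts can arise spans the strike distance each side.
shares = [row / (2 * strike_distance(TREE_H, line_h))
          for row in (20.0, 30.0) for line_h in (20.0, 30.0)]
print(f"maintained share: {min(shares):.0%} to {max(shares):.0%}")
```

Under these assumptions the maintained share comes out near 13% to 20%, bracketing the article's 12% to 19%; the exact figures depend on pole geometry and rounding.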
On TransAlta's distribution system, where all
overhangs were removed, 90% to 98% of tree-caused outages
were due to tree failure (Guggenmoos 1996). It is estimated
that 95% or more of these failures were trees beyond
maintained right-of-way. Finch reports 86% of
tree-caused outages result from trees outside the right-of-way
(Finch and Allen 2001). Similarly, on the west coast, Rogers
reports that 66% of PSE's outages are caused by trees greater
than 4.5 m (15 ft) from the nearest conductor (Rogers 2001).
If tree-related outages are to be substantially reduced,
off-ROW trees will need to be addressed.
Addressing the safety and reliability risks of off-ROW
trees represents an enormous challenge to utilities. Power lines
run through or adjacent to a vast number of trees.
Thirty-three percent of the United States and 56% of Canada
are forested (Smith and Sheffield 2000;
Forestinformation.com, no date). Most of these trees are natural. Only 7% are
planted (Forest Service 2001). The extent of the liability will
vary geographically. Maine is 90% forest covered, while Iowa
has only 5.7% forest cover (Forest Service 2000). In U.S.
urban areas, tree cover is 27% (U.S. Forest Service 2001).
On distribution systems where a large portion of
the tree-related outages are attributable to tree growth,
the pruning maintenance cycle is too long and divorced
from the tree inventory and tree growth rates. Conceptually,
this problem is easy to resolve since the underlying cause of
this condition is inadequate funding. Where tree failure is
the major source of tree-related outages, resolution of
the problem is more complex and certainly not as apparent.
TREE MORTALITY AND IMPLICATIONS FOR UTILITY MAINTENANCE PRACTICE
Trees may be power line hazards because of lean; a
poor anchoring medium; poorly formed, narrow angle
crotches; codominant leaders; and other structural defects. Such
trees are removed in a hazard tree program. But what of
trees that have no physical defects yet succumb to
competition for light, water, and nutrients? An examination of
natural tree mortality is warranted.
Data for lodgepole pine (Pinus contorta) in
Alberta, Canada (Johnstone 1976), reveal that between age 20
and 100 years, about 4,000 trees per hectare die (Table
1). Lodgepole pine is a fire-origin species. Stands are
even-age and quite uniform. In the first 30 years, the trees are not
of a height where they are likely to cause service
interruptions. In Table 1, the column titled relevant mortality provides
a cumulative total of dead trees per hectare in excess of 11.8
m (39 ft) in height. The mortality is considered relevant
because these trees, in the vicinity of a distribution line, hold
the potential to cross phases, start fires, and disrupt
service. Typical distribution line height is 6 to 9 m (20 to 30 ft).
Tree mortality that poses a threat to distribution lines is
almost 3,500 trees per hectare over 70 years. Putting that into
a utility context, a hectare is about 1 mi by 20 ft wide. For
a power line running alongside such a lodgepole pine
forest, about 3,500 trees die over a 70-year period within a
20-ft strip just outside the maintained right-of-way.
Table 1. Lowest density (1,236/ha at 70 years)
yield table for lodgepole pine (Pinus
contorta) on an "average site" in Alberta (adapted from Johnstone
1976). Relevant mortality refers to trees greater than 11.8
m (39 ft) in height.
(Table 1 columns: Age; Trees/ac; Trees/ha; Height (ft); Height (m); Relevant mortality. Data rows not reproduced here.)
Taking, for the sake of simplicity, a straight-line average, that
amounts to an annual average of 50 trees per hectare (20 ft
× 1 mi) that become susceptible to failure.
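The hectare-to-strip conversion and the straight-line average can be checked in a few lines; the 3,500 relevant deaths over 70 years are the Table 1 totals cited above.

```python
# A hectare laid out along a power line is roughly a strip
# 1 mile long by 20 ft wide: 10,000 m^2 / 1,609 m ≈ 6.2 m ≈ 20 ft.
METERS_PER_MILE = 1609.34
FT_PER_METER = 3.2808

strip_width_ft = 10_000 / METERS_PER_MILE * FT_PER_METER
relevant_mortality = 3_500   # dead trees/ha over 11.8 m tall (Table 1)
years = 70                   # ages 30 to 100

print(f"strip width ~ {strip_width_ft:.0f} ft")   # ~20 ft
print(f"average ~ {relevant_mortality / years:.0f} trees/ha/yr")  # 50
```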
Of course, only a percentage of these 50 trees
per hectare capable of striking the line on failure will do
so. However, we can reasonably expect that the number of
tree-caused outages is directly proportional to the degree
of exposure, measured in standing hazard trees.
Other fire-origin species, jack pine (Pinus
banksiana) and trembling aspen (Populus
tremuloides), of the Canadian boreal forest (Figure 3) follow a similar pattern of mortality. Over
50 years, the stand density declines 70%. For young
South Carolina forests (Figure 4) predominated by pines
(Pinus spp.), oaks (Quercus spp.), maples
(Acer spp.), yellow poplar (Liriodendron
tulipifera), and black gum (Nyssa
sylvatica), which would typically evolve into uneven-age stands, stand
density is found to decline 60% over 50 years (Crookston 1997).
Tree mortality for the South Carolina forest stands represented
in Figure 4 amounts to 36 trees
mi⁻¹ yr⁻¹ over a 20
ft width. Mortality skewed to small-diameter-class trees, as
one would expect for uneven-age stands, is not a factor
because there is no accretion built into the stand model used
(Lilly 2000). Still, not all of the trees have the height to interfere
with power lines. Most do, however, since they achieve 12.1 m
(40 ft) in height in 20 years. Eliminating the short trees
from the South Carolina data to focus on annual mortality
relevant to utilities reduces the rate to 23 to 25 trees mi⁻¹ yr⁻¹.
Some of the lowest tree mortality rates appear in the
U.S. Pacific Northwest. Using the Suppose simulator (Crookston
1997), forest stands comprising Douglas-fir
(Pseudotsuga menziesii), ponderosa pine
(Pinus ponderosa), western larch (Larix
occidentalis), grand fir (Abies grandis), and
lodgepole pine show a 40% stand reduction over 90 years (Figure
5). Considering accretion and higher mortality of small
diameter trees (Johnson 1990), the average annual
mortality relevant to utilities is 18 trees per hectare.
All of the preceding forest stands examined have
been young stands. Old stands, however, follow the same
pattern (Figure 6), though the average annual mortality in trees
is lower because the number of trees in the stand is lower.
For this 500-year-old stand of ponderosa pine and
Douglas-fir (Crookston 1997), the predicted average annual mortality
is 7 trees per hectare. Given that only 6% of U.S. timber
is more than 175 years old and that 55% of the U.S.
forests are less than 50 years old (U.S. Forest Service 2001),
most electric line exposure will be to forest stands having
a relatively high tree density.
The diverse forest stand examples provided show
that while the total number of trees and the rate of
mortality vary by species and location, the trend of a declining
viable tree population over time is common. It is true for
even-age and uneven-age stands. Inter- and intra-species
competition for light, water, and nutrients drives the decline in
tree population. Periods of stress caused by drought or
pests accelerate the rate of mortality.
To understand the implications of the declining
tree density on power line security, the data for lodgepole pine
in Alberta (Johnstone 1976) are graphed to highlight
the number of dead and decadent trees (Figure 7). The
mortality data have been altered to exclude trees that die
before achieving a height that is likely to pose a serious risk to
a distribution line (relevant mortality column in Table 1).
Death in trees is a process that may occur over months to years,
and dead trees may stand for extended periods of time. Figure
7 shows how the risk to power lines accumulates. It provides
a stimulus for further inquiry and examination of the rate
of hazard tree development in the context of utility maintenance practice.
From the data used to generate Figures 3 through
6, illustrating the viable stand populations, tree mortality
rates ranging from 7 to 50 trees per hectare per year were
derived. It has been stated that 1 ha equals 1 mi by 20 ft. When
the height of trees is considered, evaluating the hazard over
only the first 20 ft of the adjacent forest will generally be
inadequate. Summing mortality for 3 ha, or over 60 ft per
mile, would appear a more reasonable approach. For example,
this assumption would fit a distribution line with a 3 m (10
ft) clear width adjacent to 21.2 to 24.2 m (70 to 80 ft) trees or
a transmission line with a 9.1 m (30 ft) clear width adjacent
to 27.3 to 30.3 m (90 to 100 ft) trees. Based on the need
to consider the risk arising over 60 ft rather than 20 ft,
the number of dead trees posing a risk to power lines
increases by a factor of 3, to a range of 21 to 150 trees
mi⁻¹ yr⁻¹ per ROW side.
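The scaling above can be written out directly; the 7 and 50 trees/ha/yr endpoints are the rates derived from the stand examples.

```python
def per_mile_side(trees_per_ha_yr, strip_hectares=3):
    # 60 ft of forest depth along 1 mile of line is roughly 3 ha,
    # so per-hectare mortality is summed over 3 ha per ROW side.
    return trees_per_ha_yr * strip_hectares

for rate in (7, 50):  # trees/ha/yr, from Figures 3 through 6
    print(f"{rate} trees/ha/yr -> {per_mile_side(rate)} trees/mi/yr per side")
```

This yields the 21 and 150 trees per mile per year per side quoted above.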
The natural phenomenon of decreasing viable
tree density over time for forests represents an enormous risk
to line security. However, this risk is frequently ignored and
not quantified because the trees constituting this risk are
usually outside the right-of-way. While inventories may include
off-ROW hazard trees, such an inventory is a static snapshot of
conditions at a particular time. There has not been a predictive model
for the development of hazard trees. Rather, utilities
that recognize off-ROW hazard trees as substantial risk to
line security tend to have a program of cyclical field
inspections to monitor and identify hazard trees. Assuming a
5-year cycle for hazard tree identification and removal, the
lowest tree mortality rate found (U.S. Pacific Northwest
interior) necessitates the removal of 105 (21 trees
mi⁻¹ yr⁻¹ × 5 yr) trees per mile per ROW side per maintenance cycle.
For lodgepole pine and other fire-origin species, the
required removals expand to 750 (150 trees
mi⁻¹ yr⁻¹ × 5 yr) trees
per mile per ROW side. This requirement is two orders
of magnitude above what the utility industry would
consider a typical hazard tree program. Considering the
rate of tree mortality, a hazard tree program that
successfully removes the emergent risk of line strikes would
necessarily be a major operation. It would be noticed in the
utility arboriculture industry much as Puget Sound Energy's
Tree Watch program has been (Rogers 2001).
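The cycle workload above can be sketched as follows. The 21 and 150 trees/mi/yr rates are from the preceding paragraph; the assumed 10 trees per mile for a typical program is a midpoint of the 5 to 15 trees per mile reported in the informal survey cited later in the article.

```python
CYCLE_YEARS = 5
TYPICAL_REMOVALS_PER_MI = 10  # assumed midpoint of typical programs (5-15/mi)

for label, rate in [("PNW interior", 21), ("fire-origin boreal", 150)]:
    removals = rate * CYCLE_YEARS  # trees/mi per ROW side per cycle
    print(f"{label}: {removals}/mi/side per cycle, "
          f"roughly {removals / TYPICAL_REMOVALS_PER_MI:.0f}x a typical program")
```

The 105 and 750 removals per mile per side per cycle reproduce the figures in the text and illustrate the "two orders of magnitude" gap.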
Utility arborists and researchers have focused
considerable effort on better hazard tree identification. While
this effort is useful, the sheer volume of trees dying under
normal conditions suggests that unless the maintenance cycle
is drastically reduced and the number of hazard trees
identified and removed per cycle is greatly increased, the risk of
tree-line contacts will not be meaningfully reduced.
Complicating the matter is that more than half the trees that fail show
no noticeable defects (Simpson and Van Bossuyt 1996; Finch
and Allen 2001). Shifting from typical hazard tree programs
to ones that fully address natural tree mortality would
escalate the costs at least 20 times, which casts doubts on the
feasibility of improving line security through a hazard tree
program. The examination of natural tree mortality leads to
the conclusions that improving line security through a
hazard tree program will be extremely challenging and, where
gains are made, the results will not be enduring.
QUANTIFYING TREE MORTALITY
Tree mortality rates depend on local tree species
and conditions. A cursory review of various stand data
suggests annual mortality rates ranging from 0.5% to 3% are
quite common (Johnstone 1976; Plonski 1981; Campbell
and Liegel 1996; Crookston 1997; Harmon 1999; Curtis et
al. 2000; U.S. Forest Service 2001). The lowest tree
mortality rates uncovered in this review occurred in the
Pacific Northwest. Annual mortality rates of 0.3% to
0.5% were observed in ponderosa pine forests (Harmon
1999). Mortality in coastal forests of Sitka spruce and
western hemlock is highly variable, ranging from 0.8% to 3.0%
per year (Harmon 1999).
Examination of local forest stand data representative
of the area will reveal the annual mortality. While this
determination is necessary, annual percentage of mortality is
not directly useful. Application requires the current tree
density be known so as to transform percentage to trees per unit
(e.g., mile) of line. It is expected that most North American
utilities will find that tree mortality adds 50 to 150 trees
mi⁻¹ yr⁻¹ per 60 ft of treed ROW side to the
workload. Given such a spread, utilities will want to quantify the tree
risk specific to their area. A model that can predict tree
mortality based on current or found tree densities would be useful.
A comparison of the number of trees added to the
workload through mortality versus the number removed by
the hazard tree program will reveal the impact of tree
mortality on future workload. This information may prove useful
in attracting more funding for vegetation management.
In calculating annual tree losses, as was done
for lodgepole pine, a straight-line average was used for the
sake of convenience. This method may appear fitting for
the Pacific Northwest interior (Figure 5) and even the
South Carolina mixed-wood forests (Figure 4). Mortality rates
in fire-origin species (Figure 3) clearly do not follow a straight line.
The percentage of annual mortality in Figures 3, 4, and
5 is modeled. The same geometric progression was used
in each case, altering the mortality rate for the best fit to
the observed data. The algorithm used is
P(tn) = P(t0) × 1/(1 + MR)^(tn − t0)
where
P is the tree population (trees/ha)
MR is the annual mortality rate
t0 is the stand age in years at the start of the period
tn is the stand age in years at the end of the period considered
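The model can be implemented in a few lines. This is a sketch; the starting density and mortality rate below are illustrative values, not the article's fitted data.

```python
def residual_population(p_start, mortality_rate, t_start, t_end):
    # Geometric decline: each year the viable population is
    # reduced by the factor 1 / (1 + MR).
    return p_start / (1.0 + mortality_rate) ** (t_end - t_start)

# Illustrative: 4,000 trees/ha at age 20 projected to age 70
# at a 3% annual mortality rate.
p70 = residual_population(4000, 0.03, 20, 70)
print(f"{p70:.0f} trees/ha remain at age 70")
```

At 3% annual mortality roughly three-quarters of the stand is lost over 50 years, a decline of the magnitude shown for the fire-origin species.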
Using this equation to predict the residual tree
population, a good fit was achieved for several different forest
types (Figures 3, 4, and 5). It can be seen that a calculated 3%
rate of annual mortality produces a curve that fits
fire-origin species quite well, while an annual mortality rate of 1.7%
fits the South Carolina mixed-wood example where the
tree density changes appear linear. Thus, at low mortality
rates, this equation yields a declining population approaching
a straight line, whereas higher mortality rates result in
a curved population decline. While this algorithm provides
a tight fit between forecast and actual stand mortality for
provided examples, the number of stands considered
is small. This algorithm is not suitable as a predictive tool
for all forest stands. Its output will need to be compared
to local forest stand data to pre-test its utility. Should it
be found lacking, error can be restricted by limiting
forecasts of mortality to 20 years or less. For longer-term
predictions, local forest researchers will be able to
provide suitable algorithms (Crookston 1997).
Discussion has focused on tree mortality in
unmanaged stands. Since this tree mortality is based on competition
for light, water, and nutrients, forest management
practices such as commercial thinning that effectively
decrease competition between trees will decrease both the
mortality rate and the number of dead trees. Lines running
through managed stands are not devoid of line strike risks, but
the rate of tree mortality may be half that of the
unmanaged stands (Curtis et al. 2000).
RISK REDUCTION THROUGH HAZARD TREE REMOVAL
Most utilities strive to handle the risk of off-ROW
tree-caused outages through a hazard tree identification
and removal program. To determine if this constitutes a
reasonable and effective approach requires an examination of
a typical hazard tree program and the rate of development
of hazard trees or tree mortality.
To assess the potential of a hazard tree program
to mitigate the risk of tree-line strikes, an example for a 69
kV line will be used. This voltage, in the lower end of
transmission service, is chosen because the tree-conductor
clearances maintained will fall between those for higher
voltage transmission lines and the lower voltage distribution lines.
Assume the following conditions:
· A 69 kV line is set on an 18.3 m (60 ft) right-of-way.
· The line is built on 3 m (10 ft) cross arms, and
the average conductor height is 12.2 m (40 ft).
· Sixty percent of the line runs adjacent to a forest edge.
· Dominant tree height is 25.9 m (85 ft), and tree
density is 620 trees/ha (250 trees/ac).
· A hazard tree program is in place. It is on a 5-year
cycle and removes an average of 18.75 trees per
kilometer (30 trees per mile).
(An informal survey (2001-2002) of seven utilities
found that hazard trees are removed as a part of the
normal maintenance cycle. The maintenance cycle ranged from 3
to 7 years except in Hawaii, where the cycle is much
shorter. Most hazard tree programs removed about 5 trees per
mile, with the most intense averaging 10 to 15 trees per mile.
A hazard tree program removing 30 trees per mile was
viewed as very aggressive and a major undertaking by all the
utilities surveyed.)
From this, we can calculate:
· Clear width = 7.6 m = 25 ft [(60 ft ROW 10 ft
cross arm)/2]. (Clear width is the distance measured on
the ground from the trunks at the tree line to the
nearest conductor. See Figure 2.)
· Trees up to 22.7 m (75 ft) from the nearest
conductor could strike the line (as determined by triangulation).
· Residual trees occur over 22.7 m - 7.6 m = 15.1 m
(75 ft - 25 ft = 50 ft) per exposed ROW side.
· The residual tree population is 947 trees/km = 1,515
trees/mi [(50 ft × 5,280 ft/mi)/43,560
ft2/ac × 250 trees/ac].
· The residual tree population is decreased by about
2% through the hazard tree program (30 trees/mi out of
1,515 trees/mi).
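These bullet calculations can be reproduced directly. The sketch below is a minimal Python rendering of the arithmetic, using the triangulated strike distance the text describes; all values are the example's assumed conditions:

```python
import math

# Worked 69 kV example from the text. A tree can reach the conductor
# if its height "reaches over" the line: by triangulation, the strike
# distance on the ground is sqrt(tree_height^2 - line_height^2).
tree_height_ft = 85.0      # dominant tree height
line_height_ft = 40.0      # average conductor height
row_ft = 60.0              # right-of-way width
cross_arm_ft = 10.0
density_per_ac = 250.0     # trees per acre
removals_per_mi = 30.0     # very aggressive hazard tree program

clear_width_ft = (row_ft - cross_arm_ft) / 2                  # 25 ft
strike_ft = math.sqrt(tree_height_ft**2 - line_height_ft**2)  # 75 ft
residual_strip_ft = strike_ft - clear_width_ft                # 50 ft
# Trees per mile in the residual strip on one exposed side:
residual_per_mi = residual_strip_ft * 5280 / 43560 * density_per_ac
reduction = removals_per_mi / residual_per_mi                 # ~2%

print(round(clear_width_ft), round(strike_ft), round(residual_per_mi))
print(f"hazard program removes {reduction:.1%} of residual trees")
```

With these inputs the strike distance is 75 ft, the residual population about 1,515 trees/mi, and the removal program addresses roughly 2% of it, as the text states.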
A good hazard tree identification and removal
program may substantially improve line reliability. However, the
risk addressed by the hazard tree program is only that of
trees perceived to be susceptible to failure. The risks
associated with the impact of lightning, severe wind, and ice loading
on healthy, structurally sound trees are not addressed by
a hazard tree program because sound, healthy trees are
not removed. This risk is not insignificant. Many utility
foresters will attest to the fact that as many as half the trees that
fail show no noticeable defects. An Eastern Utilities study
found that only 44% of the trees or limbs that failed had
an indicator of structural weakness (Simpson and Van
Bossuyt 1996). Niagara Mohawk found that, of the trees that failed,
36% were dead and 64% were alive (Finch and Allen 2001).
A 2% reduction in the residual tree population examined
in the context of typical tree mortality rates of 0.5% to 3% per
year (Johnstone 1976; Plonski 1981; Campbell and Liegel
1996; Crookston 1997; Harmon 1999; Curtis et al. 2000; U.S.
Forest Service 2001) reveals that the benefit of even a
very aggressive hazard tree program as used in this example will only
be significant for a relatively short time. Most of the reliability
gain will erode prior to the next maintenance cycle. Since we
cannot predict which of the residual trees will next become
decadent, the enduring outcome of the example hazard tree program is
no more than a 2% reduction in tree-line strike risk.
Given the same tree characteristics, the residual
tree population is lower for higher-voltage transmission lines
due to a greater maintained tree-conductor clearance and
line height. Thus, removing 18.75 trees per kilometer (30
trees per mile) would yield a greater risk reduction. Conversely,
for distribution lines, due to smaller maintained
tree-conductor clearances and lower line heights, the residual tree
population is higher; hence, removal of 18.75 trees per kilometer
(30 trees per mile) yields a smaller percentage change.
This example serves to illustrate that the extent of
the risk arising from off-ROW trees is not meaningfully
addressed, in an enduring fashion, through a typical
hazard tree identification and removal program.
QUANTIFYING THE RISK OF TREE-LINE STRIKES
For power lines running adjacent to or through forests
or natural tree stands, the risk of tree-conductor contact
is directly related to the number of trees within striking
distance of the line. This risk far outweighs the risk arising from
trees considered part of the normal maintenance regimen,
which places the emphasis on trees within the ROW
requiring pruning. The rate of tree mortality has been shown
to constitute a far greater risk to reliability than has
been previously recognized. How might the tree risk be mitigated?
The tree risk can be quantitatively mitigated by
decreasing the number of trees capable of striking the line,
either by increasing the clear width or by raising the line height.
To determine the effects of clear width (clear width
and clear distance are used interchangeably) on line security,
tree canopy height, tree density, and the line height can be used in
a mathematical derivation (optimal clear width
calculator, Guggenmoos 2000) of risk exposure. The graphic output,
the line strike probability chart (Figure 8), shows how the risk
of line strike changes with the clear width. The derivation of
risk shown in the line strike probability chart assumes all
possible directions of tree fall have an equal probability. All
trees capable of striking the line are considered equal regardless
of condition. It reflects differences in mortality based on
species only indirectly through tree density (see Figure 3). The ability
to withstand wind and ice or snow loading,
species-specific patterns of decay and failure, and the probability of
weather events of specified severity are not incorporated
when producing the line strike probability chart. The risk factor
in the line strike probability chart is not a standalone
probability of tree failure. Rather, the risk factor is used to compare two
or more construction or maintenance options for a specific area.
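The general shape of such a chart can be illustrated with a simple geometric model. The sketch below is not the proprietary optimal clear width calculator; it only encodes the assumptions stated above (uniform tree distribution, every direction of fall equally likely, all trees treated alike regardless of condition), with the reach found by triangulation:

```python
import math

def relative_strike_risk(clear_width, tree_height, line_height, steps=1000):
    """Illustrative relative risk of tree-line strikes versus clear width.

    A tree standing d metres (ground distance) from the conductor, with
    reach R = sqrt(tree_height^2 - line_height^2), strikes the line only
    if it falls within the arc of directions that carry its top to the
    conductor. Summing that arc fraction over all trees from the clear
    width out to the reach, and normalizing so the risk factor is 1 at
    a 0 m clear width, gives a curve with the same endpoints as the
    line strike probability chart (1 at 0 m, 0 at the reach).
    """
    reach = math.sqrt(tree_height**2 - line_height**2)

    def exposure(w):
        # Numerically integrate the hit-arc fraction over tree positions.
        total = 0.0
        dd = (reach - w) / steps
        for i in range(steps):
            d = w + (i + 0.5) * dd
            total += (2 * math.acos(d / reach)) / (2 * math.pi) * dd
        return total

    if clear_width >= reach:
        return 0.0
    return exposure(clear_width) / exposure(0.0)
```

The published chart's interior shape reflects inputs (e.g., tree density and height structure) that this sketch omits, so it is a qualitative illustration only.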
The line strike probability chart (Figure 8) shows that
at a 0 m clear width, the risk factor is 1. The risk
factor reaches 0 when the clear width is so great that no
falling tree can strike the line. In this example, the risk
factor reaches 0 at an 18 m clear width. A point of
particular interest evident in the line strike probability chart (Figure
8) is that there is a point of diminishing return in line
security for the dollar invested in increasing clear width. In
this example, that point is at a clear width of 6 to 7 m. At a
clear width of 6 to 7 m, the risk factor passes through the value
of 0.2. Stating it in another way, assuming the variables in
this example, a 6 to 7 m clear width reduces the risk of
trees striking the line by 80%. The percentage of trees removed
to attain the 0.2 risk factor is 39% (7 m/18 m).
The point at which clear width provides a
diminishing return in line security can provide guidance for the extent
of easement required on new lines where an optimal
balance between cost and reliability is desired.
The data produced by the line strike probability
chart can be used in a number of ways. To illustrate,
three examples are provided.
Example 1. A section of distribution line running through a
forested area is identified as problematic. Under windy
conditions, trees fail and take the line out. The hazard tree
removal program has had limited success. Perhaps widening the
right of way is the solution? But it is difficult to justify making
a major investment without a means of forecasting
the benefit: the impact on reliability.
To produce a line strike probability chart, certain
field data are required. Assume the following conditions:
· line height: 9.1 m (30 ft)
· tree height: 27.3 m (90 ft)
· trees/ha: 298 (120 trees/ac)
· current clear width: 3 m (10 ft)
What would be the benefit of increasing the clear
width to 6.1 m (20 ft)?
Reading from the line strike probability chart (Figure
9), at a 3 m (10 ft) clear width, the risk factor is about
0.68 while at a 6.1 m (20 ft) clear width the risk factor is
about 0.42. That information can then be put into a
simple spreadsheet (Figure 10), which shows increasing the
clear width another 3 m would result in a 37% improvement
in line security.
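The improvement figure is simply the relative change in risk factor. A quick check with the rounded chart readings (0.68 and 0.42 are approximate values read from Figure 9; the spreadsheet in Figure 10 refines the result to 37%):

```python
# Example 1: risk factors read from the line strike probability chart
# at the current 3 m and the proposed 6.1 m clear widths.
risk_3m, risk_6m = 0.68, 0.42
improvement = (risk_3m - risk_6m) / risk_3m
# ~38% with these rounded readings; Figure 10 reports 37%.
print(f"improvement in line security: {improvement:.1%}")
```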
Adding unit costs to the spreadsheet facilitates a
quick assessment of the cost versus the benefit in increased
line security. In this way, the cost for improvement in
reliability gained through right-of-way widening can be compared
to alternatives such as increasing line height, differing
construction, undergrounding, or using protective devices.
Example 2. Your utility company plans to build another
transmission line. Due to siting difficulties, the most expedient
approvals are likely if the line is added to an existing
right-of-way. Applying to increase the current easement may also result
in delays, so the company favors using the existing
right-of-way. Before finalizing this decision, management would like
an assessment of the impact this action will have on line security.
Assume the following conditions:
· seventy percent of the line runs through forest;
the clear width will be reduced to 9.1 m (30 ft)
· line height: 18.2 m (60 ft)
· tree height: 27.3 m (90 ft)
· trees/ha: 298 (120 trees/ac)
· current clear width: 19.7 m (65 ft)
A line strike probability chart is produced (Figure
9). From that chart we see that at a 19.7 m (65 ft) clear
width, the risk factor is about 0.03, while at a 9.1 m (30 ft)
clear width, the risk factor is about 0.26. Entering this
information into a spreadsheet (Figure 11) shows that the impact
of decreasing the clear width by 10.6 m (35 ft) is a 767%
increase in line strike risk. In other words, one should expect
about eight times the current number of tree-related outages.
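The 767% figure is the relative increase in the risk factor between the two chart readings:

```python
# Example 2: risk factors at the current 19.7 m (65 ft) clear width
# and at the reduced 9.1 m (30 ft) clear width.
risk_wide, risk_narrow = 0.03, 0.26
increase = (risk_narrow - risk_wide) / risk_wide
# 767%; the new risk is 0.26/0.03, about 8x the current outage rate.
print(f"increase in line strike risk: {increase:.0%}")
```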
Example 3. A 48 km (30 mi) segment of a 240 km (150 mi) 25 kV
circuit is being rebuilt. A new 69 kV line is being constructed
and for the portion of the overlap the 25 kV circuit will
be understrung on the taller 69 kV structures. There have
been a significant number of tree problems on the 25 kV line.
The records indicate that over the length of the line there
were seven tree-caused outages in the past year. Management
has indicated that this level of reliability is unacceptable for a
69 kV line. They have asked what would be required to
reduce the tree incidents to no more than one in 3 years.
Engineering has told you that they plan to build the 69 kV line on
3 m (10 ft) cross arms, and the average line height would
be 12.2 m (40 ft).
Assume you find the following field conditions:
· The current 25 kV line is built on 2.4 m (8 ft)
cross arms, and the average conductor height is 8.5 m (28
ft). Sixty percent of the line runs adjacent to a forest
edge. The poles are situated at the edge of a 20 m (66 ft)
road allowance.
· current 25 kV conditions:
· line height: 8.5 m (28 ft)
· tree height: 25.9 m (85 ft)
· trees/ha: 620 (250 trees/ac)
· current clear width: 4.5 m (15 ft)
· 69 kV conditions as proposed:
· line height: 12.2 m (40 ft)
· clear width: 4.2 m (14 ft)
Two line strike probability charts need to be produced:
one for an 8.5 m (28 ft) line height (Figure 12) and the second
for a 12.2 m (40 ft) line height (Figure 12). The tree risk
factor for the current situation (28 ft height and 15 ft clear
width) is 0.50. The tree risk factor for the 69 kV line on the
same ROW (40 ft height and 14 ft clear width) would be
0.50. Hence, the number of tree-conductor incidents on the
69 kV line would be 0.50/0.50 of those experienced on
the current 25 kV line.
Then you need to determine the amount of
improvement required in line security to achieve the
management objective of no more than one tree-caused outage over
3 years. This could be expressed as 0.33 outages per year.
You calculate the required increase in line security to be 76%:
1 - [0.33 outages/yr ÷ (0.50/0.50 × 7 outages/yr
× 30 mi/150 mi)] = 0.76.
Using a spreadsheet (Figure 13), you enter the tree
and line data and the starting tree risk factor of 0.50
(derived from the chart Figure 12). Then by iterations of
the new risk factor value, you determine the risk
factor value that brings the line security improvement as close
as possible to 76%. That value is 0.12. Returning to Figure
12, we see that a risk factor of 0.12 occurs at a 10.6 m (35
ft) clear width. The distance from the tree line to the center
line needs to be 35 ft + 10 ft/2 (cross arm), or 40 ft. Given
that the line runs along a road allowance, the road side
portion has more than adequate clearance.
You advise management that you can meet the
objective of no more than one outage in 3 years provided the
easement is increased by 20 ft on the off-road side and that
you receive increased funding of $91,638 ($5,091/mi × 30 mi
× 0.6 tree covered) to widen the right-of-way.
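Example 3's chain of calculations can be sketched end to end, using the values given in the text:

```python
# Example 3: from the management target to the required risk factor.
outages_per_yr = 7.0               # history over the full 150 mi circuit
segment_fraction = 30.0 / 150.0    # rebuilt 30 mi portion
risk_ratio = 0.50 / 0.50           # 69 kV vs. 25 kV risk factors (Figure 12)
expected = risk_ratio * outages_per_yr * segment_fraction  # 1.4 outages/yr
target = 1.0 / 3.0                 # one outage in 3 years
required_improvement = 1 - target / expected               # ~76%
# Risk factor delivering that improvement from the current 0.50:
new_risk_factor = 0.50 * (1 - required_improvement)        # ~0.12
print(f"required improvement: {required_improvement:.0%}")
print(f"target risk factor: {new_risk_factor:.2f}")
```

The spreadsheet iteration described in the text converges to the same 0.12 risk factor, which the chart maps back to a 10.6 m (35 ft) clear width.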
These three examples illustrate the utility of
forecasting the impact of actions on line security. While risk is
quantified in percentage terms, where the history of tree
incidents is known, a simple calculation can convert the data to
the number of tree incidents to expect in the future (as
in Example 3). In doing so, it should be recognized that it is
an estimate that assumes the same tree and weather
conditions from one year to the next.
The approach used in arriving at the risk factor
assumes all trees are susceptible to failure and as such, all
trees capable of striking the line represent a liability. While
this represents a worst-case scenario, it recognizes that
we cannot predict which healthy trees in a stand will
next succumb to the stresses of competition. Thus, while
varying tree mortality rates influence the scope and intensity of
the hazard tree program that should be applied to the
residual trees, mortality rates do not alter the risk factor.
One risk factor rating, taken in isolation, provides
no information about the number of tree-caused outages
experienced on or projected for a line segment. It is only when
the risk factor is used on a comparative basis that it becomes a
tool that informs the process of selecting between maintenance
or construction options. The three examples illustrate that
the changes in risk factor arose from changes in the variables
of line height and/or clear width. Tree species failure and
decay characteristics that either contribute to or
decrease the likelihood of line outages are the same for each possible
option. These tree species failure and decay characteristics are
also reflected in the tree-related outage experience. Hence, the
risk factor can be used to derive a percentage change in
reliability without the need to identify the specific tree failure modes.
IMPORTANCE OF TARGET CLEARANCE: A CASE STUDY
To this point, tree risk quantification has been examined
in the context of applying it to new construction and
problematic line segments. However, in a sense, utilities have set
their tree risk exposure on a systemwide basis by the
clearance standards adopted.
The factors affecting tree-related outages are (Guggenmoos 1996)
· tree density (number of trees per mile of line)
· clear distance (horizontal distance, measured on
the ground, from tree edge to nearest conductor)
· tree species (based on specific characteristics such
as mature height, propensity to shed branches,
break, bend or uproot)
· soil characteristics
· disease and insect pests
· weather events such as wind, ice and wet snow
· landscape characteristics such as slope
Examining these factors, there is only one that utilities
can control: the clear distance (Guggenmoos 1996).
The maintenance cycle is not included in this list since it
would be determined by tree species characteristics,
climatic factors, and clear distance. That is, the maintenance
cycle and clear distance are totally interdependent. If the
clear distance is set, then the maintenance cycle is determined
by the time it takes for growth to span the clear distance
or reach the limit of approach. If the maintenance cycle is
set, then the clear distance must be adjusted to reflect
the amount of growth over the cycle.
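The interdependence can be made concrete with a small sketch; the growth rate and limit of approach below are hypothetical values chosen for illustration only:

```python
# If the clear distance is fixed, the maintenance cycle follows from
# how long regrowth takes to span it down to the limit of approach.
annual_growth_m = 0.6        # hypothetical species/site regrowth rate
clear_distance_m = 4.5       # target clearance at time of pruning
limit_of_approach_m = 1.5    # hypothetical minimum permitted clearance
cycle_years = (clear_distance_m - limit_of_approach_m) / annual_growth_m
print(round(cycle_years, 1))           # 5.0

# Conversely, if the cycle is fixed, the clear distance must absorb
# the growth expected over that cycle:
cycle_set_years = 4
required_clearance_m = limit_of_approach_m + annual_growth_m * cycle_set_years
print(round(required_clearance_m, 1))  # 3.9
```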
When a line clearance program is under review, both
the clear distance and maintenance cycle may be reviewed
and adjusted, as was the case when TransAlta Utilities'
program came under review in 1985. In spite of line clearance
budget increases averaging 32% between 1980 and 1985,
tree-related outages were expanding exponentially
(Guggenmoos 1995) (Figure 14). At the time of the review, pruning
clearances averaged 3 m (10 ft). To increase the cycle length,
the target pruning clearance was increased to 4.5 m (15 ft). It
was decided that trees overhanging lines would not be
acceptable. All overhangs would be removed. Side clearance, which
was initially 6 m back in the 1950s when rural lines were
built, had experienced some in-growth to average 5 m. Just
before the new line-clearance program based on a tree
inventory and local growth rates was launched in 1986, a decision
was made to measure the target 6 m side clearance at line
height to the nearest tree part, rather than along the ground
(clear width). The result of this decision was that the clear width,
as measured to the trunks of adjacent trees, was increased
on average by 4 m. This decision had a profound impact on reliability.
Tree-related outages increased from 1985 through
1987 but then began a steady decrease. In fact, by 1991,
tree-related outages dropped 80% from the 1987 levels
(Figure 14). A 70% reduction can be explained by the
increased clear width (Figure 15) using averages for tree height,
tree density, and line height. The remainder of the reliability
gain is attributed to the removal of overhangs (only 0.3% of
total trims) and appropriate pruning cycles.
TransAlta's experience illustrates the clear
distance standard selected is a key determinant of the number
of tree-related outages.
That 85% of the tree-caused outages arise from
off-ROW trees is problematic because it brings the utility into
conflict with property rights. While that would appear to rule
out any increases in clear width without legally increasing
the easement, it need not be so. In fact, TransAlta
distribution lines had no easements except on crown land,
which comprised less than 15% of the line miles. Increasing
the clear width could only be achieved through the
willing cooperation of the landowners.
Considering tree and typical line heights, common
maintained right-of-way widths eliminate tall-growing and
potentially conflicting trees from only a fraction of the area along
power lines. For both transmission and distribution lines, the
major source of tree conflicts is from off-ROW trees.
The clear widths used by North American utilities
are fairly standard and not greatly variable. However,
the exposure of power lines to possible
tree-conductor conflicts is highly variable due to variable amounts of
forest cover. In many states and provinces of North
America, forests cover more than 50% of the land base (Smith
and Sheffield 2000; U.S. Forest Service 2000; Forestinformation.com, no date).
The available information suggests that for
North American utilities having both urban and rural lines,
tree failure will be responsible for about 85% of all
tree-caused outages, regardless of the number of trees per mile of
line. While pruning trees on the right-of-way is essential
for public safety, to make significant improvements in
electric system reliability, the risks arising from adjacent,
off-ROW trees must be identified and addressed. Typically these
risks are addressed through a hazard tree identification
and removal program. There has been an industry focus
on improving the science of hazard tree assessments
and increasing staff and contractor competencies in the
identification of these hazards. However, a gap in
utility arboriculture literature regarding natural tree
mortality rates and their implications for achieving reliability
improvements suggests an area deserving of consideration.
Common annual tree stand mortality rates ranging
from 0.5% to 3% reveal that the rate of hazard tree formation
or additions is substantially higher than removal rates
through hazard tree programs. Given that natural tree mortality
adds 21 to 150 trees/mi/yr per treed ROW side, it is
probable that most hazard tree programs will do little to
appreciably improve reliability for more than a few years. When
a hazard tree program does substantially improve
reliability, over time the majority of the gains will be eroded by
ongoing tree mortality.
Tree-conductor conflict risks are not limited to
dead and decadent trees. High winds, ice, wet snow, and
lightning can cause healthy, structurally sound trees to fail. Hence,
all trees capable of interfering with power lines constitute
a risk to the safe, reliable transmission of electricity.
Tree height, line height, tree density, and clear width
are variables that can be altered to improve line security.
A mathematical derivation of tree risk using these
variables was used to produce line strike probability charts. The
line strike probability chart reveals there is a point of
diminishing return in line security as clear width increases. In
most cases, making a power line tree-free would not be
financially prudent. The nature of the line strike probability
curve presents a visual that both clarifies and simplifies
the understanding of the risk of trees in proximity of
power lines. As such, it may serve as a useful communication
tool among utilities and stakeholders such as federal and
state foresters, community groups, and regulators.
The variables used in the quantification of the tree
risk facilitate the derivation of the cost of specified
reliability improvements. The cost of benefits can be readily
compared to alternatives, such as increasing conductor
height, installing tree wire, or undergrounding. Being able
to calculate the cost of benefits opens an avenue to
balancing the liability associated with incurring an outage with
an acceptable degree of risk.
An examination of North American forest stand
data shows a common trait, that of decreasing tree density
over time. It is recognized, however, that for trees, death is
a process, not an event at one specific point in time. Dead
or decadent trees retain a certain structural strength and
fail when conditions arise that place them under
unbearable stress. The occurrences of such conditions of stress
are weather related. Thus, while this quantitative approach
to managing tree-conductor conflicts offers benefits
under relatively normal weather conditions, the larger
opportunity lies in the avoidance of wind- and storm-related tree outages.
The use of line strike probability charts could
be particularly advantageous when there is a major
pest infestation that significantly increases tree mortality.
The usual maintenance approach would be to make
numerous passes identifying and removing hazard trees. It may
prove more economical to drastically reduce the number of
trees capable of striking the line by widening the right-of-way
to the point where clear width provides a diminishing return
in line security. Under these circumstances, this approach
may appeal to forestry staff because the tree numbers may
be high enough to justify salvage rather than simply
dropping the trees into the forest. In widening, the residual
area requiring hazard tree identification is reduced,
correspondingly avoiding costs. Further, by concentrating a major
tree volume to one maintenance event, the feasibility of
more economical, mechanized removal methods is enhanced.
The derivation of tree risk that has been outlined
views all trees capable of interfering with power lines as a
risk. While a quantitative tree risk factor can be
comparatively applied, it does not provide a measure of risk under
variable weather conditions. A logical extension of this
approach would include a study of tree failure rates, by species,
under various wind intensities; the frequency of destructive ice
or snow and wind storms; and the direction of severe
storms. These factors would serve both to reduce the tree
risk rating and move toward predictive measures of the
probability of experiencing tree-caused outages and the
severity of system damage.
The mathematical quantification of tree risk applied
to priority areas as identified by outage statistics provides
an opportunity to manage so-called unpreventable
tree-caused outages for real and lasting gains in reliability.
Increasing conductor height and the clear width will improve
reliability. However, unless a line is completely tree-free, a hazard
tree program remains an integral part of the
maintenance process. The number and condition of residual trees
will impact line security. The success of the hazard tree
program will be improved because it need only be applied to
a reduced tree population or residual risk. Most important,
a quantitative approach to managing the risk of
tree-conductor conflicts provides a means to progressively
improve electric system reliability.
Bush, R. 2002. In the reliability hot seat. Transmission
& Distribution World Feb. 1.
Business Wires Features. 2001. Home Grown?
U.S. Consumers Show Increasing Interest in Generating
Their Own Electricity. Jan. 29.
California Public Utilities Commission. 2000. Rulemaking
for Electric Distribution Facility Standard Setting. May
4, 2000. Attachment 1, Additional Provisions to G.O. 166.
Campbell, S., and L. Liegel (Tech Coords.). 1996.
Mortality trends. Chapter 1, in Disturbance and Forest Health
in Oregon and Washington. Gen. Tech. Rep.
PNW-GTR-381. U.S. Department of Agriculture, Forest Service,
Pacific Northwest Research Station, Portland, OR;
Oregon Department of Forestry; and Washington Department
of Natural Resources.
Carris, P.S. 2000. Preventing the non-preventable
tree-related outage. UAA Q. 9(1).
Crookston, N.L. 1997. Suppose: An Interface to the
Forest Vegetation Simulator. In Teck, R., M. Moeur, and J.
Adams (Eds.) Proceeding: Forest Vegetation
Simulator Conference. 3–7 Feb. 1997. Fort Collins, CO. Gen.
Tech. Rpt. INT-GTR-373. U.S. Department of Agriculture,
Forest Service, Intermountain Research Station, Ogden, UT.
Curtis, R.O., G.W. Clendenen, and J.A. Henderson,
2000. True Fir-Hemlock Spacing Trials: Design and
Results. Gen. Tech. Rpt. PNW-GTR-492. U.S. Department of Agriculture, Forest Service,
Pacific Northwest Research Station, Portland, OR.
Desbiens, R. 2001. Bulking up for the next
storm. Transmission & Distribution World Apr 1.
Electric Perspectives. 2001. News & trends: Jobs well
done I: Emergency response. Mar/Apr.
EnergyOnline Daily News. 1996a. Energy Department
Calls Last Month's Western Outage 'Preventable'.
. 1996b. California PUC on Big Outage: Let Us
Know When a Line's Down. www.energyonline.com/news/articles/Archive/outage2.asp.
. 1996c. PG&E's Big Bill from the
. 1997a. PG&E Faces Record Fines for 1994
. 1997b. Mother Lode Fire Costs PG&E $2
. 1997c. Wildfires: It's SoCal Ed's Turn on the
. 1998a. At Least 6 Dead, 3 Million Without
Power As Killer Storm Savages Eastern Provinces. www.energyonline.com/news/articles/Archive/a09can.asp.
. 1998b. CPUC Orders Probe of PG&E Tree-Trimming Practices.
. 1998c. SDG&E Damned If It Does or
EPRI. 1995. The Right Tree for the Right-of-Way.
Technical Brief, TB-105029. Abstract at
Finch, K.E., and C. Allen. 2001. Understanding
Tree-Caused Outages. EEI Natural Resource Conference. Apr.
2001. Palm Springs, CA.
Forestinformation.com. Forest Statistics. www.forestinformation.com/index.asp.
Goodfellow, J.W. 2000. Understanding the Way Trees
Cause Outages. ECI and Allegheny Power. Slide
presentation (34 slides).
Grayson, L. 2001. Legislative and regulatory update:
State regulatory issues. UAA Q. 10(1).
Guggenmoos, S. 1995. New program controls
tree management. Electric Light and Power 73(2):15–18.
. 1996. Outage statistics as a basis for
determining line clearance program status. UAA Q. 5(1).
. 2000. Introducing A New VM Tool &
Service, Optimal Clear Width Calculator.
Hadden, E. 2001. Weather lessons. Transmission
& Distribution World Apr. 1.
Harmon, M.E. 1999. Andrews Experimental Forest.
Regional Studies in the Pacific Northwest: H.J.
Andrews Experimental Forest Long-Term Ecological
Research (LTER) Program.
Johnson, R. R. 1990. East Cascades Prognosis
Geographic Variant of the Forest Vegetation Simulator.
WO-TM Service Center, U.S. Department of Agriculture,
Johnstone, R.A. 2001. Changing Expectations in
Utility Arboriculure. EEI Natural Resource Conference.
Apr. 2001, Palm Springs, CA.
Johnstone, W.D. 1976. Variable-density yield tables
for natural stands of lodgepole pine in
Alberta. Can. For. Serv., Dept. Fish. Envir., For. Tech.
Rep. 20, p. 110.
Keener, R.N., Jr. No date. The Estimated Impact of
Weather on Daily Electric Utility Operations. Duke
Power Company, Charlotte, NC.
Kjellstrand, L. 1998. PUC Penalizes EGS for
Inadequate Service, Customers to Get Refund When Order is
Final. Public Utility Commission of Texas, News
Lewis, S.M. 2001. Utilities Cannot Afford to
Become "Sometimes Power & Light." Transmission
& Distribution World Apr. 1.
Lilly, B. 2000. The Southeast Variant of the
Forest Vegetation Simulator. Forest Management
Service Center, Fort Collins, CO.
Louisiana Public Service Commission. 1998. General
Order of April 30, 1998, Ensuring Reliable Electric
Service. Docket No. U-22389.
Megawatt Daily. 1999. Michigan Tells Detroit Edison
to Improve Reliability. Dec. 1.
Michigan Public Service Commission, Electric
Division, 2000. Proposed Michigan Electric Distribution
System Performance Standards. Case No. U-12270. May 1.
Olsen, M. 2001. Life on the lines. Sacramento News
& Review Apr 19.
Oregon Public Utility Commission. 2001. Five-Year
Electric Service Reliability Study 1996–2000. OAR
Owens, D.K. 2001. Strengthening the Critical Link.
EEI Electric Perspectives July/Aug.
Partners in Protection. 1999. FireSmart: Protecting
Your Community from Wildfire. Chapter 1, in Vicars, M.
(Ed.). Partners in Protection, Edmonton, AB.
Plonski, W.L. 1981. Normal Yield Tables (Metric) for
Major Forest Species in Ontario. Ontario Ministry
Natural Resources, For. Res. Group, Toronto, ON.
Poole, D., and K. Clements. 2000. IP wants greater
tree-trimming freedom. The News-Gazette,
PRNewswire. 1999. Detroit Edison Honored for
Electric Service Reliability Programs. Oct. 25.
Rees, W.T., Jr., T.C. Birx, D.L. Neal, C.J. Summerson, F.L.
Tiburzi, Jr., and J.A. Thurber. 1994. Priority Trimming to
Improve Reliability. ISA conference presentation, Halifax, NS.
Rights-of-Way Online. 1999. Regulators Demand
Reliability. BASF. www.rightsofway.com/rightsofway/article.asp?at=
Rogers, B.I. 2001. Puget Sound Energy Tree Watch
Program. EEI Natural Resource Conference, Apr. 2001,
Palm Springs, CA.
St. Petersburg Times. 1999. Florida Electric Utilities
Refocus on Reliability, Customer Service. May 19.
Simpson, P., and R. Van Bossuyt. 1996. Tree-Caused
Electric Outages. J. Arboric. 22(3).
Smith, W.B., and R.M. Sheffield. 2000. A Brief Overview
of the Forest Resources of the United States, 1997.
USDA Forest Service, Washington, DC.
Watson, R.T., M.C. Zinyowera, and R.H. Moss (Eds.).
1998. The Regional Impacts of Climate Change: An Assessment of Vulnerability. A special report
of the Intergovernmental Panel on Climate Change, Working Group II. Cambridge University Press, UK.
Transmission & Distribution World. 2002.
Vegetation Management. industryclick.com/magazinearticle.asp?
Tomich, J. 2001. Arkansas Ice Storm Cruel to
Poorly Maintained Electric Lines. Arkansas
Democrat-Gazette Jan. 9.
U.S. Forest Service. 2000. Forest Inventory and
Analysis. Resource Planning Act Assessments. 1997 RPA:
The United States Forest Resource Current Situation,
Final 1997 RPA Tables, fia.fs.fed.us/rpa.htm
. 2001. U.S. Forest Facts and Historical Trends.
U.S. Dept. of Agriculture, Forest Service, FS-696, fia.fs.fed.us.
President, Ecological Solutions, Inc.
10 Milburn Place
Sherwood Park, AB, T8A 0T8
Figure 1. New growth has been "burnt off" from occasional line
contact (center of photo).
Figure 2. Clear width: The distance measured on
the ground from the trunks at the tree line to the nearest conductor.
Figure 3. Natural tree mortality numbers in the thousands of trees/ha. Most of the dying trees pose
a potential risk to power lines (Johnstone 1976
and Plonski 1981).
Figure 4. Sixty percent of the trees in the stand
die over 50 years. More than 40% are tall enough
to disrupt distribution service.
Figure 5. Mortality is 40% over 90 years, averaging
18 trees/ha annually.
Figure 6. Very old tree stands continue the pattern
of decreasing tree stand density over time.
Figure 7. The risk to a power line due to tree
mortality builds up rapidly. From stand age 30 to 40
years, hazard tree additions are about 100 trees/ha per
year (adapted from Johnstone 1976 and Guggenmoos 1996).
Figure 8. Where clear width is 0, the risk is
considered to be 100%, or 1. When clear width is so
large that trees cannot contact the line on failure (about
18 m in this example), the risk is 0.
Figure 9. Line strike probability chart produced by
the optimal clear width calculator for the tree height,
line height, and tree density variables in Examples 1 and 2.
Figure 10. A spreadsheet permits a quick derivation
of the line-security implications for a proposed change
in clear width or line height. Addition of tree
removal costs reveals the cost versus the line-security benefit.
Figure 11. Tree risk assessment provides a new
means of informing the planning and maintenance
processes for power lines.
Figure 12. Tree risk assessment is used to compare options for a specific section of power line.
Figure 13. The quantification of the cost of
line-security improvement through reduction in tree exposure via
increased clear width may justify investment in alternative
construction such as underground lines.
Figure 14. The 80% tree-caused outage reduction between 1987 and 1991 is attributed to the
chosen clearance standards (Guggenmoos 1995).
Figure 15. The tree risk assessment process reveals that the chosen
clearance standard reduced tree exposure on distribution lines 70%.
Résumé. Others have reported instances where trees growing under power lines rarely caused outages. The vast majority of tree-caused outages occurred through tree failure, particularly when outages during severe weather events were included. Generally, tree-conductor conflicts following tree failure are classified as unpreventable because the trees are located outside the right-of-way. In an emerging, more competitive environment, utilities will require a means of decreasing these so-called unpreventable outages. The primary locations for unpreventable outages are areas where lines run adjacent to or through natural forest stands. Over time, tree mortality exposes a power line to a high risk of tree incidents. The risk to the line is directly related to the number of trees within striking distance of the line. Conventional clear widths leave a substantial residual tree risk. Hazard tree removal programs do not provide enduring gains. A new mathematical model, the optimal clear width calculator, is used to assess tree risk over variable clear widths and line heights. The risk ratings in the resulting line strike probability charts permit quantitative comparisons of construction and maintenance options. The line strike probability chart indicates that there is a point of diminishing return in line security for each additional dollar invested in widening the clear width.
Zusammenfassung. Others have reported that instances where trees grow into lines rarely lead to power outages. The majority of tree-related outages result from tree failure, especially when outages during severe weather events are included. In general, conflicts resulting from tree failure are classified as unpreventable because the trees are located outside the right-of-way. In the competitive environment, utilities will need a means of decreasing these so-called unpreventable outages. The primary areas for unpreventable outages are locations where the line crosses or borders forested terrain. Tree mortality creates a high risk of incidents along the line. The risk to the line is directly related to the number of trees within whose falling range the line lies. Conventional clearances nevertheless leave a substantial residual risk. Hazard tree removal programs do not deliver lasting relief. A new mathematical model is applied here to examine tree risk using parameters such as variable clear width and line height. The risks listed in the line strike probability charts permit a quantitative comparison of construction and maintenance options. These charts show that there is a point at which line security must be weighed against the capital invested in additional clearing.
Resumen. Cases have been reported in which trees growing into power lines rarely result in outages. The vast majority of tree-related outages arise from tree failures, particularly during severe weather conditions. Generally, tree-conductor conflicts resulting from failures are classified as unpreventable because the trees are located outside the right-of-way. These areas are often found where lines run adjacent to or through natural forest stands. Tree mortality exposes the line to a high risk of incidents over time. The risk is directly related to the number of trees within reach of the line. Conventional clearing widths leave a significant residual risk. Hazard tree removal programs are often not cost-effective, and a new mathematical model, the optimal clear width calculator, is used to assess tree risk based on line height and clear width. This permits quantitative comparisons of construction and maintenance options. The strike probability indicates that there is a point of diminishing return for dollars invested in additional clearing.