The duration of load effect in lumber as stochastic degradation
Abstract
This paper proposes a gamma process for modelling the damage that accumulates over time in lumber used in structural engineering applications when stress is applied. The model separates the stochastic processes representing features internal to a piece of lumber from those representing external forces due to applied dead and live loads. It applies those external forces through a time-varying, population-level shape function designed for time-varying loads. This type of model is standard in reliability analysis but novel in this context, which has been dominated by accumulated damage models (ADMs) for more than half a century. The proposed model is compared with one of the traditional ADMs. Our statistical results, based on a Bayesian analysis of experimental data, highlight the limitations of using accelerated testing data to assess long-term reliability, as seen in the wide posterior intervals. This suggests the need for more comprehensive testing in future applications, or for encoding appropriate expert knowledge in the priors used for Bayesian analysis.
Keywords: gamma process; degradation; duration of load; wood products; accumulated damage models
1 Introduction
Wood placed under a sustained load over a period of time in an engineering application will sustain damage due to the "duration of load" (DOL) effect (Karacabeyli and Soltis, 1991). The importance of this effect led to the development of models for predicting it, so that it could be incorporated into the establishment of design values and safety factors in applications. However, shortcomings in these models, which we now describe, led the authors to develop the alternative approach presented in this paper. The two approaches are then compared in an illustrative application.
Long-term damage in a piece of lumber, resulting in a reduction of its load-bearing capacity, depends on the load level and how the load is applied (e.g., via bending, compression, tension). The speed at which a piece weakens over time may also depend on a combination of factors such as the viscoelasticity of the wood, temperature, and moisture. Even a relatively small constant load applied over a sufficiently long period of time may lead to failure (known as creep rupture). According to the review of Rosowsky and Bulleit (2002), the effect was first recognized by Haupt (1867), but it does not seem to have been formally incorporated into design standards until Wood et al. (1960) produced the so-called Madison Curve for doing so. The curve is still in use today for estimating the DOL effect on the strength of wood.
However, the purely empirical approach of Wood et al. led only to a fitted curve, so alternative dynamic models were developed to describe how damage accumulates over time as a function of the stress load profile (Barrett and Foschi, 1978a, b; Gerhards, 1979). These accumulated damage models (ADMs) differ in detail, but the idea is the same: to focus on the rate at which damage accumulates rather than the damage itself, using an ordinary differential equation (ODE)

$$\frac{d\alpha(t)}{dt} = h\big(\alpha(t), \tau(t); \boldsymbol{\theta}\big), \tag{1}$$
which represents the rate of damage accumulation for a randomly selected piece of lumber. The vector $\boldsymbol{\theta}$ contains (random) parameters associated with the piece itself, with their joint probability distribution depending on population parameters that must also be fitted to implement the model. Once a piece is selected, the model (1) deterministically describes the rate at which damage accumulates in that piece.
While the accumulated damage at time $t$, $\alpha(t)$, is unobservable, the ODE provides a framework onto which the other elements of the model can be attached. It is calibrated so that $\alpha(0) = 0$ when no damage has occurred, and $\alpha(T) = 1$ at the time $T$ when the piece fails. Here $\pi(t) = \tau(t)/\tau_s$, where $\tau(t)$ (psi) is the applied stress at time $t$ and $\tau_s$ (psi) is the 'short-term breaking strength' of the piece (commonly defined to be the stress at which the piece would fail were it to be subjected to a ramp load test of duration 1 minute).
For definiteness, this paper will focus on a representative and well-known ADM, the "Canadian model" proposed by Foschi (1984). That model is based on the two-term approximation obtained from a Taylor expansion of the right side of Equation (1), namely

$$\frac{d\alpha(t)}{dt} = a\,\big[\tau(t) - \sigma_0 \tau_s\big]_+^{\,b} + c\,\big[\tau(t) - \sigma_0 \tau_s\big]_+^{\,n}\,\alpha(t), \tag{2}$$

where $a$, $b$, $c$, $n$, $\sigma_0$ are log-normally distributed random effects for the piece. Here $\sigma_0$ is known as the stress ratio threshold, and $[x]_+ = \max(x, 0)$. Thus in this model, no damage accumulates in the piece when $\pi(t) \le \sigma_0$.
However, Ellingwood and Rosowsky (1991) point out that the Canadian model cannot be nondimensionalized. That is a serious issue, since a model that represents a natural process cannot ultimately depend on how the quantities involved are measured. The concern was deemed of sufficient importance that Ellingwood and Rosowsky (1991) excluded the model from their comparative analysis of ADMs. The review of ADMs in Hoffmeyer and Sørensen (2007) instead modifies the Canadian model to correct the dimensions; this difficulty of the Canadian model, and of other ADMs, was also described in Zhai (2011) and Zhai et al. (2012). Wong and Zidek (2016) also address the problem by invoking the Buckingham π theorem to build reparametrized models that are dimensionally consistent while retaining their functional form.
Another difficulty associated with the ADM approach is its computational burden: ODEs such as Equation (2) must be solved numerically for each individual piece of lumber, which restricts the use of standard likelihood-based methods for their analysis. As a result, uncertainties in both the parameter estimates and subsequent reliability calculations are difficult to quantify. Yang et al. (2017) address this latter difficulty by proposing approximate Bayesian computation techniques to perform the analysis on a solid statistical platform; however, a large cluster of CPUs is needed to carry out the subsequent reliability calculations with high-accuracy numerical ODE solvers.
As a final limitation of the ADM approach, randomness in the process of damage accumulation within a given piece is ignored, which may not be realistic. In consequence, estimates of ADM parameters are difficult to interpret, as population-level and piece-specific modeling are inherently intertwined. This paper therefore presents a new approach for modelling the DOL effect that overcomes the difficulties described above. It is based on the gamma process, a standard approach to modelling degradation with a long history (Lawless and Crowder, 2004), but one that has not previously been used for wood products as far as the authors are aware. To successfully apply this approach to lumber, we formulate a model for the time-varying shape parameter that accounts for the time-varying loads which pieces must sustain.
In a major point of departure from the ADM approach above, the degradation of a piece of lumber under the gamma process remains random even conditional on the piece having been selected: it is represented by a stochastic process $Y(t)$ that describes the damage accumulated up to time $t$. That process is internal to the piece, and can be thought of as representing its random progression of damage. The future combination of dead and live loads, which may itself be a random process, is external to the piece. Given a realized load profile, these two ingredients are fused through the deterministic, time-varying, population-level shape parameter. This separation of internal and external sources of variability has advantages for the interpretability of the results, and it facilitates the use of principled statistical methods of analysis.
The paper presents a number of notable findings that we now summarize.

Weak evidence is found of a threshold effect below which no degradation in the population occurs, with an estimated threshold level of 413 psi for the population from which the test data in our illustrative application are drawn. But the posterior credibility bands are wide, making the possibility of no threshold quite plausible.

The experimental data suggest that the degradation as a function of time for a lumber population cannot be explained by the simple power law commonly used in gamma process applications.

Our reliability analysis suggests that under a simulated future dynamic occupancy load, the chance of failure of a piece of lumber before the end of fifty years is 9%. This contrasts with a more optimistic estimate of 1.5% obtained by application of an ADM under the same simulated future.

Finally, the future residual life of a piece of lumber that has survived at least four years under a constant load of 3000 psi has a long right-tailed distribution, with a median survival time in the range of 22 to 333 years if that same load is sustained indefinitely. The substantial amount of uncertainty revealed by our analysis points to the need for much more testing to precisely estimate reliability under low sustained loads.
The paper is organized as follows. Section 2 introduces the gamma process as a way of describing the damage due to the stress applied to wood when placed in service. Section 3 describes how the gamma process may be used to model degradation; in particular, it is shown how the random degradation process for a randomly chosen piece of lumber is coupled to a model for its population, and how the applied stress profile operates to cause damage to accumulate. A major contribution follows in Section 4: a Bayesian approach for applying the gamma process model is developed and applied to data obtained in an accelerated testing experiment designed to explore the duration of load effect. Section 5 discusses the lessons learned in Section 4 about the reliability of lumber. Another application follows in Section 6, where it is shown how the residual life of a piece of lumber in service can be predicted. Further discussion and concluding remarks follow in Section 7.
2 The gamma process as a specimen–specific stochastic model
In this section we briefly review the basics of the gamma process as it relates to modeling lumber degradation.
Let $Y(t)$ be the stochastic process representing the accumulated damage (or degradation, in the terminology of reliability theory) in a piece of lumber at time $t$. Assume $Y(0) = 0$ and that $Y(t)$ is nondecreasing over time, as any damage sustained is irreversible. We say that the piece reaches a state of failure at the time $T$ when the damage first exceeds a prespecified threshold level indicating failure. Without loss of generality, we may scale the degradation process so that failure occurs when $Y(t) \ge 1$, i.e. $T = \inf\{t : Y(t) \ge 1\}$. Notionally, the degradation process can be thought of as continuing for $t > T$, even though by that time the specimen will have failed.
Conditional on the parameters for a randomly selected lumber specimen, assume $Y(t)$ has stochastically independent increments, i.e. for any sequence of times $t_1 < t_2 < \cdots < t_k$, the increments $Y(t_2) - Y(t_1), \ldots, Y(t_k) - Y(t_{k-1})$ are stochastically independent. The distribution of these increments may depend on factors internal to the specimen as well as the external effects of the applied stress, resulting in damage that accumulates as a series of successive jumps of random size. The particularly simple family of models we adopt assumes $Y(t)$ is a compound Poisson process with intensity function $\lambda(t)$, i.e.

$$Y(t) = \sum_{i=1}^{N(t)} X_i,$$

where, conditional on the model parameters,

$$N(t) \sim \text{Poisson}\big(\Lambda(t)\big), \qquad \Lambda(t) = \int_0^t \lambda(s)\,ds,$$

while the random jumps $X_1, X_2, \ldots$, which are independent of the Poisson count process, have a gamma distribution with shape parameter $\delta$ and scale $\beta$. Standard theory then implies that, conditional on the model parameters,

$$E[Y(t)] = \delta\beta\,\Lambda(t), \qquad \mathrm{Var}[Y(t)] = \delta(\delta + 1)\beta^2\,\Lambda(t).$$
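To make the construction concrete, the following sketch simulates the compound Poisson damage process with gamma-distributed jumps and checks its first two moments by Monte Carlo. The intensity and jump parameters here are arbitrary illustrative values, not quantities estimated from the lumber data.

```python
import numpy as np

rng = np.random.default_rng(0)

def compound_poisson_damage(t, lam, delta, beta, rng):
    """Simulate Y(t): a Poisson number of jumps (constant intensity lam
    per unit time), each drawn from Gamma(shape=delta, scale=beta)."""
    n_jumps = rng.poisson(lam * t)
    return rng.gamma(delta, beta, size=n_jumps).sum()

# Monte Carlo check of the moments of Y(t):
#   E[Y(t)] = lam*t * delta*beta,  Var[Y(t)] = lam*t * delta*(delta+1)*beta^2
t, lam, delta, beta = 10.0, 2.0, 0.5, 0.1
samples = np.array([compound_poisson_damage(t, lam, delta, beta, rng)
                    for _ in range(20000)])
print(samples.mean())  # should be near 10*2*0.5*0.1 = 1.0
print(samples.var())   # should be near 10*2*0.5*1.5*0.01 = 0.15
```

The jump sizes and counts are sampled separately, mirroring the separation between the count process and the gamma-distributed jumps in the model.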
As the intensity $\lambda(t)$ increases and the gamma shape parameter $\delta$ decreases, we approach in the limit the so-called gamma process, which has an infinite number of infinitesimally small jumps. That model has been used extensively to model degradation. More formally,

$$Y(t_2) - Y(t_1) \sim \mathrm{Gamma}\big(\eta(t_2) - \eta(t_1),\ \beta\big) \quad \text{independently, for } 0 \le t_1 < t_2,$$

where $\eta(t)$ is a nondecreasing function with $\eta(0) = 0$, and $\mathrm{Gamma}(a, \beta)$ denotes the gamma distribution with scale parameter $\beta$ and shape parameter $a$. The scale $\beta$ is a scalar-valued quantity that could also depend on fixed covariates associated with a specimen, such as the modulus of elasticity. From standard theory we then obtain

$$E[Y(t)] = \beta\,\eta(t), \qquad \mathrm{Var}[Y(t)] = \beta^2\,\eta(t).$$
Provided that multiple gamma processes have the same scale $\beta$, which in effect means each has a scale that is a known multiple of $\beta$, their sum is also a gamma process. More precisely, assume that conditional on $\beta$ and the shape functions $\eta_1(t), \ldots, \eta_J(t)$, the processes $Y_j(t)$, $j = 1, \ldots, J$, are independent gamma processes with shapes $\eta_j(t)$ and common scale $\beta$. Then the sum

$$Y(t) = \sum_{j=1}^{J} Y_j(t) \tag{3}$$

is also a gamma process, with shape $\eta(t) = \sum_{j=1}^{J} \eta_j(t)$ and scale $\beta$.
This is a useful property, since it provides a convenient framework for combining different processes that contribute to degradation. The process corresponding to damage due to the applied load profile is of primary interest in this paper. As potential extensions, other external factors that contribute to damage, such as the time-varying moisture content and temperature of the environment, could be incorporated as separate components in Equation (3). However, we do not at present have the data to illustrate these refinements to the model.
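A minimal sketch of the limiting gamma process: a path is sampled by drawing independent gamma increments whose shapes are the increments of the shape function, and the additivity property of Equation (3) is illustrated for two components sharing the same scale. The shape functions and parameter values are arbitrary illustrations, not fitted quantities.

```python
import numpy as np

rng = np.random.default_rng(1)

def gamma_process_path(times, eta, beta, rng):
    """Sample a gamma-process path at the given times (times[0] = 0):
    the increment over [t_{i-1}, t_i] is Gamma(eta(t_i)-eta(t_{i-1}), scale=beta)."""
    shapes = np.diff(eta(np.asarray(times)))
    increments = rng.gamma(shapes, beta)
    return np.concatenate([[0.0], np.cumsum(increments)])

times = np.linspace(0.0, 5.0, 6)
beta = 0.5
eta1 = lambda t: 0.3 * t           # one degradation component
eta2 = lambda t: 0.2 * t ** 0.8    # a second component, same scale beta

# Summing the two component paths is distributionally the same as one
# gamma process whose shape function is eta1 + eta2.
y_sum = (gamma_process_path(times, eta1, beta, rng)
         + gamma_process_path(times, eta2, beta, rng))
y_one = gamma_process_path(times, lambda t: eta1(t) + eta2(t), beta, rng)
print(y_sum[-1], y_one[-1])  # both have mean beta*(eta1(5)+eta2(5))
```

Both constructions share the mean $\beta\,\eta(t)$ and variance $\beta^2\,\eta(t)$ at every time point, which is the content of the additivity property.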
3 Degradation to failure
3.1 Probability distribution of failure time
The gamma process induces a probability distribution for the failure time $T = \inf\{t : Y(t) \ge 1\}$, which we briefly review as follows; detailed proofs of these results can be found in Paroissin and Salami (2014). The survival function for $T$ is

$$P(T > t) = P\big(Y(t) < 1\big) = 1 - \frac{\Gamma\big(\eta(t), 1/\beta\big)}{\Gamma\big(\eta(t)\big)}, \tag{4}$$

where $\Gamma(a, x) = \int_x^\infty u^{a-1} e^{-u}\,du$ denotes the upper incomplete gamma function. When $\eta(t)$ is differentiable, it follows that the probability density of $T$ needed for the construction of the likelihood function is
$$f_T(t) = \frac{\eta'(t)}{\Gamma(\eta(t))}\left[\big(\psi(\eta(t)) + \ln\beta\big)\,\gamma\big(\eta(t), 1/\beta\big) + \frac{\beta^{-\eta(t)}}{\eta(t)^{2}}\ {}_2F_2\big(\eta(t), \eta(t);\ \eta(t)+1, \eta(t)+1;\ -1/\beta\big)\right], \tag{5}$$

where $\psi$ is the digamma function, $\gamma(a, x) = \Gamma(a) - \Gamma(a, x)$ is the lower incomplete gamma function, and ${}_2F_2$ is the generalized hypergeometric function of order $(2, 2)$.
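Because $T > t$ exactly when $Y(t) < 1$, the survivor function (4) is a single regularized incomplete-gamma evaluation, which scipy exposes directly. The shape function and scale below are arbitrary illustrative choices.

```python
import numpy as np
from scipy.special import gammainc      # regularized lower incomplete gamma
from scipy.stats import gamma as gamma_dist

def survival(t, eta, beta):
    """P(T > t) = P(Y(t) < 1), with Y(t) ~ Gamma(shape=eta(t), scale=beta)."""
    return gammainc(eta(t), 1.0 / beta)

eta = lambda t: 0.8 * t ** 0.4   # hypothetical nondecreasing shape function
beta = 0.3

# Cross-check against the gamma CDF evaluated at the failure threshold 1
print(survival(2.0, eta, beta))
print(gamma_dist.cdf(1.0, a=eta(2.0), scale=beta))  # identical value
```

In practice, the density (5) can also be obtained by numerically differentiating this survivor function, which avoids evaluating the hypergeometric term.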
3.2 Damage due to load applied
Of primary interest is characterizing the gamma process representing damage due to the stress applied. Suppose the load profile $\tau(s)$, $0 \le s \le t$, with which the population is stressed is given. Then $\tau$ has a fundamental role in determining the corresponding value of $\eta(t)$. In particular, $\eta(t)$ must account for the degradation effects of the entire load history profile up to time $t$. For lumber degradation, we assume two basic properties for $\eta$:

If $\tau(s) \le \tau_0$ for $t_1 \le s \le t_2$, then $\eta(t_2) = \eta(t_1)$, where $\tau_0$ is a threshold stress level below which the population does not undergo degradation.

If $\tau(s)$ is held at a constant level larger than $\tau_0$ for $t_1 \le s \le t_2$, then $\eta'(s)$ is decreasing over the interval $(t_1, t_2)$.
The first property implies that degradation does not progress during periods when the stress is too low to cause damage. The threshold $\tau_0$ is a population analogue of the damage threshold commonly seen in ADMs (see the Introduction). The second property captures the DOL effect: if the load is held constant at a stress level high enough to cause failures in the population, degradation continues as that constant load is maintained, but the rate at which it occurs is expected to slow over time. These properties will guide the specific choice of $\eta$.
3.3 A model for the shape parameter
We now develop a specific functional form for the shape parameter $\eta(t)$, along with parameters to be estimated from data in the illustrative example. The "power law" and its variants have been commonly used to model degradation and serve as a useful starting point for developing specific model implementations.
Suppose $\tau$ is a given load level held constant over time. Then we can conceive a simple form for $\eta$ to characterize the degradation in a population of pieces subject to that load from time $0$ to $t$, namely

$$\eta_\tau(t) = \kappa(\tau)\,\big[1 + g(t)\big], \qquad \kappa(\tau) = \big(\tau/\tau_0 - 1\big)_+, \tag{6}$$

where $g$ is an increasing function with $g(0) = 0$ that captures the DOL effect, and $(x)_+ = \max(x, 0)$. Here the term $\kappa(\tau)$ is constant over time, depending only on the size of the load. It is zero when that stress level is sufficiently low, in accord with property (i): that is, when $\tau \le \tau_0$, which corresponds to the stress threshold below which no degradation occurs. The function $g$ governs the rate of degradation in the population over time under that fixed load. The simple form $g(t) = b t^a$ would reproduce the well-known power law. Various modifications can be made to increase its flexibility to model the degradation behaviour, and we will perform our subsequent analysis using the form

$$g(t) = b_1 t^{a_1} + b_2 t^{a_2},$$

where $a_1, a_2, b_1, b_2$ are all positive parameters with $a_1 < a_2$, which has the feature of mixing two different power-law growth rates. In particular, by setting the constraint $a_1 < a_2$ we expect that $b_1 t^{a_1}$ will capture the shorter-term effect well, while the role of $b_2 t^{a_2}$ becomes more important over longer time durations.
In practice, the load may vary over time. Let $\tau_1 < \tau_2 < \cdots < \tau_J$ denote a sequence of load levels spanning the range of loads to which the population may be subjected. Then for each load level $\tau_j$, $j = 1, \ldots, J$, we can consider the amount of incremental degradation due to load $\tau_j$ beyond that which was sustained from load $\tau_{j-1}$. A natural analogue of Equation (6) for this load increment, for time $0$ to $t$, is

$$\big[\kappa(\tau_j) - \kappa(\tau_{j-1})\big]\,\big[1 + g(s_j(t))\big],$$

where $s_j(t)$ is the total time duration within $[0, t]$ for which the load equalled or exceeded $\tau_j$. Thus the constant term $\kappa(\tau_j) - \kappa(\tau_{j-1})$ captures the incremental 'jump' in $\eta(t)$ that occurs due to load level $\tau_j$ being reached. Similarly $g(s_j(t))$, as a function of the total length of time for which the load level is sustained, now models its corresponding DOL effect.
We can then combine the contributions of all the load levels to construct $\eta(t)$ for any arbitrary given load profile. Using our chosen form for $g$, we thus obtain

$$\eta(t) = \sum_{j=1}^{J} \big[\kappa(\tau_j) - \kappa(\tau_{j-1})\big]\,\Big[1 + b_1 s_j(t)^{a_1} + b_2 s_j(t)^{a_2}\Big]\,\mathbf{1}\{s_j(t) > 0\}, \tag{7}$$

with $\kappa(\tau_0) \equiv 0$, which reduces to Equation (6) (up to the discretization of the load levels) in the special case that the load is held constant at $\tau$ from time $0$ to $t$. It can be seen that $\eta$ is differentiable between level crossings, since if $\tau_k \le \tau(t) < \tau_{k+1}$, we have

$$\eta'(t) = \sum_{j=1}^{k} \big[\kappa(\tau_j) - \kappa(\tau_{j-1})\big]\,\Big[a_1 b_1 s_j(t)^{a_1 - 1} + a_2 b_2 s_j(t)^{a_2 - 1}\Big]. \tag{8}$$
Thus when the exponents $a_1$ and $a_2$ are each less than 1, $\eta'(t)$ is decreasing over any period with a fixed load level, in accord with property (ii).
A specific sequence of load levels needs to be chosen for computation. These serve as the incremental thresholds over which additional degradation contributions are added into the model. For example, if $\tau_j = 3000$ psi and $\tau_{j+1} = 3020$ psi, then any load in the interval $[3000, 3020)$ psi would contribute the same amount to $\eta(t)$ in this model as a load of exactly 3000 psi. Naturally, the range of loads may be discretized as finely as desired to faithfully reproduce the stress history, at the cost of additional computation time. In our demonstration we use an equally-spaced sequence of levels with intervals of 20 psi. An artifact of the discrete load levels in the model is that if the load profile has periods of continuous increase, the resulting $\eta(t)$ becomes jagged as the load passes the different thresholds, rather than increasing smoothly with the load. In this case a line segment can be used to interpolate between the time points at which successive load thresholds are reached, serving as an acceptable approximation.
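The construction of the shape function from a discretized, piecewise-constant load profile can be sketched as follows. All names and values here are illustrative assumptions rather than the paper's fitted model: the load-level term `kappa` is taken to vanish at or below the threshold, the DOL term `g` mixes two power-law growth rates as described above, and the numerical parameter values are placeholders, not those of Table 1.

```python
import numpy as np

# Illustrative (assumed) parameter values, not the fitted ones
tau0 = 413.0                        # population stress threshold (psi)
a1, b1, a2, b2 = 0.02, 0.001, 0.4, 0.007

def g(s):
    """DOL term mixing two power-law growth rates (s in hours)."""
    return b1 * s ** a1 + b2 * s ** a2

def kappa(tau):
    """Load-level term; zero at or below the threshold tau0 (assumed form)."""
    return max(tau / tau0 - 1.0, 0.0)

def eta_from_profile(t, load_times, load_levels, spacing=20.0):
    """Evaluate eta(t) for a piecewise-constant load profile, where
    load_levels[i] applies from load_times[i] until the next entry (or t)."""
    grid = np.arange(tau0 + spacing, max(load_levels) + spacing, spacing)
    total, prev_kappa = 0.0, 0.0
    for tau_j in grid:
        # s_j: total time within [0, t] during which the load exceeded tau_j
        s_j = 0.0
        for i, level in enumerate(load_levels):
            start = load_times[i]
            end = load_times[i + 1] if i + 1 < len(load_times) else t
            if level > tau_j and start < t:
                s_j += min(end, t) - start
        jump = kappa(tau_j) - prev_kappa       # incremental 'jump' for this level
        prev_kappa = kappa(tau_j)
        if s_j > 0.0:                          # level reached: add its contribution
            total += jump * (1.0 + g(s_j))
    return total

# Constant 3000 psi load sustained for 100 hours
print(eta_from_profile(100.0, [0.0], [3000.0]))
```

For a constant load, the per-level jumps telescope, so the sum collapses to a single load-level term times the DOL factor, up to the 20 psi discretization.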
4 A Bayesian analysis of degradation
This section presents a Bayesian analysis of data from an accelerated testing experiment designed to explore the duration of load effect.
4.1 The data
The real data we subsequently analyze come from the DOL experiment reported in Foschi and Barrett (1982). The data set consists of a total of 637 pieces of visually graded 2x6 Western Hemlock, divided for testing under three different load profiles (all time units are in hours unless otherwise indicated):

198 pieces were assigned a load profile in which the load was increased linearly until reaching 3000 psi, and then held at that constant level for 4 years. Pieces that did not fail by the end of the 4-year period, when the test was truncated, had their failure times censored.

300 pieces were assigned a similar load profile, now with the load increased linearly until reaching a constant level of 4500 psi, held for 1 year. Pieces that did not fail by the end of the 1-year period, when the test was truncated, had their failure times censored.

139 pieces were assigned a load profile in which the load was increased linearly until failure.
In the DOL literature, profiles 1 and 2 are known as 'constant load' tests, while profile 3 is known as a 'ramp load' test. These are so-called 'accelerated' testing schemes that were originally designed to help elucidate the long-term DOL effect using tests of relatively short duration (Barrett and Foschi, 1978b).
Each piece that failed during the test had its failure time recorded. Pieces that did not fail during the test duration had their censoring times recorded (i.e., 4 years for group 1 and 1 year for group 2). No covariates for individual specimens were recorded in the data.
4.2 Fitting the degradation model
We now perform an illustrative analysis of these accelerated testing data based on the model developed, using the techniques of Bayesian inference. Let $\boldsymbol{\theta}$ denote the vector of parameters to be inferred, which consists of the five parameters associated with the model for $\eta(t)$ along with the gamma process scale parameter $\beta$, namely $\boldsymbol{\theta} = (\tau_0, a_1, b_1, a_2, b_2, \beta)$. Let $p(\boldsymbol{\theta})$ denote the joint prior distribution on $\boldsymbol{\theta}$. Then using the likelihood in Equation (5), the posterior distribution of $\boldsymbol{\theta}$ based on an independent sample of $n$ test specimens with recorded failure times $t_1, \ldots, t_n$ is given by

$$p(\boldsymbol{\theta} \mid t_1, \ldots, t_n) \propto p(\boldsymbol{\theta}) \prod_{i=1}^{n} f_T(t_i \mid \eta_i, \beta), \tag{9}$$

where $\eta_i$ denotes evaluating Equation (7) for $\eta$ at time $t_i$ according to the load profile associated with specimen $i$. For some specimens the actual failure times are not observed, as the test ended after a specified duration without the specimen failing. The likelihood contribution for those specimens is replaced by the corresponding survivor function, namely $P(T > t_c)$ computed by Equation (4), where $t_c$ is the truncation time.
Equation (9) can thus accommodate all the test data to be analyzed under the different loading profiles employed in the experiment. Importantly, we emphasize that it is assumed that the same set of parameters can model the degradation of the population under any loading scenario. That assumption, which implies that the parameters of a fitted model can then be used with any load profile of interest, has been fundamental to much of the previous work with ADMs involving the probabilistic assessment of long-term lumber reliability. An example follows in Section 5.
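A sketch of how the censored likelihood can be assembled. For simplicity, the density is obtained here by a central finite difference of the survivor function (4) rather than the closed form (5); the shape functions passed in would, in practice, come from an implementation of Equation (7), and the toy specimens below are assumptions for illustration.

```python
import numpy as np
from scipy.special import gammainc

def log_likelihood(eta_funcs, times, censored, beta, h=1e-4):
    """Censored log-likelihood. eta_funcs[i] evaluates eta(t) under specimen
    i's load profile; times[i] is its failure (or censoring) time; and
    censored[i] is True when the test ended without the specimen failing."""
    ll = 0.0
    for eta_i, t_i, cens in zip(eta_funcs, times, censored):
        surv = lambda t: gammainc(eta_i(t), 1.0 / beta)   # P(T > t)
        if cens:
            ll += np.log(surv(t_i))            # survivor-function contribution
        else:
            # density f(t) = -dS/dt via a central finite difference
            dens = (surv(t_i - h) - surv(t_i + h)) / (2.0 * h)
            ll += np.log(dens)
    return ll

# Two toy specimens under a hypothetical shape function: one failure, one censored
eta_const = lambda t: 0.5 * t
print(log_likelihood([eta_const, eta_const], [1.0, 2.0], [False, True], beta=1.0))
```

The same function evaluated at a proposed parameter vector, plus the log prior, gives the unnormalized log posterior of Equation (9).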
To proceed with the analysis, we use vague independent normal priors for each of the parameters in $\boldsymbol{\theta}$, along with the restriction $a_1 < a_2$. As the form of the posterior is intractable for direct sampling, we employ Markov chain Monte Carlo (MCMC) techniques to obtain sample draws from it. To obtain reasonable starting values for the MCMC, we first used Nelder-Mead iterations to optimize the posterior. Then, to improve convergence and the efficiency of posterior exploration via MCMC, we used parallel tempering (Swendsen and Wang, 1986) distributed over 120 compute cores, with each core running an MCMC chain using simple Metropolis-Hastings iterations and temperatures geometrically spaced from 1 to 20. Swaps between chains were proposed every five iterations. The first 5,000 iterations were discarded as burn-in, and the following 15,000 iterations from the chain representing the target posterior distribution constitute our final samples.
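The parallel-tempering scheme can be sketched on a toy bimodal target standing in for the posterior (9). The ladder below uses far fewer chains than the 120 used in the actual analysis, and all tuning constants are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

def log_post(x):
    """Toy bimodal log density standing in for the posterior of Equation (9)."""
    return np.logaddexp(-0.5 * (x - 3.0) ** 2, -0.5 * (x + 3.0) ** 2)

def parallel_tempering(n_iter, temps, step=1.0, swap_every=5):
    n_chains = len(temps)
    x = np.zeros(n_chains)
    cold = []
    for it in range(n_iter):
        # one Metropolis-Hastings update per tempered chain
        for k in range(n_chains):
            prop = x[k] + step * rng.normal()
            if np.log(rng.uniform()) < (log_post(prop) - log_post(x[k])) / temps[k]:
                x[k] = prop
        # periodically propose swaps between adjacent temperatures
        if it % swap_every == 0:
            for k in range(n_chains - 1):
                log_ratio = ((1.0 / temps[k] - 1.0 / temps[k + 1])
                             * (log_post(x[k + 1]) - log_post(x[k])))
                if np.log(rng.uniform()) < log_ratio:
                    x[k], x[k + 1] = x[k + 1], x[k]
        cold.append(x[0])   # keep the chain targeting the actual posterior
    return np.array(cold)

temps = np.geomspace(1.0, 20.0, 6)                  # geometric temperature ladder
samples = parallel_tempering(20000, temps)[5000:]   # discard burn-in
print(samples.mean())
```

The hotter chains move freely across regions that would trap a single Metropolis-Hastings chain, and the periodic swaps propagate those moves down to the cold chain, which is the one retained.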
Summaries of the posterior samples of the parameters are shown in Table 1. A few observations can be made. First, there is a clear distinction between the powers $a_1$ and $a_2$, with posterior means of 0.019 and 0.40 respectively, indicating that a single power law does not adequately explain the observed degradation over time. Second, there is only weak evidence for a stress threshold below which no population degradation occurs; the MCMC samples yield a posterior mean for the threshold level of 413 psi and a highly uncertain 95% posterior interval, so that a very low threshold is plausible. Third, the highest uncertainty is in the parameter $b_2$, whose central 95% posterior probability interval spans roughly two orders of magnitude. This indicates that the true degradation behaviour over longer time durations (i.e., a year or more) is highly uncertain from these data alone, the two constant load tests having been truncated at 1 and 4 years.
Parameter    50%        2.5%       97.5%      Posterior mean
             0.019      0.012      0.027      0.019
             0.00729    0.00071    0.03732    0.01026
             0.39       0.25       0.60       0.40
             0.00088    0.00071    0.00108    0.00088
             0.388      0.041      0.584      0.359
             0.21       0.16       0.26       0.21
The proposed model fits the data for the three test scenarios well, as can be seen in the plots in Figure 1. The cumulative distribution functions (CDFs) computed from the sampled parameter vectors largely capture the empirical distributions. The 95% posterior bands, shown in grey, are also tight over the time ranges in which failures are observed. Beyond the test truncation times, namely 4 years for the 3000 psi constant load group and 1 year for the 4500 psi constant load group, the uncertainty increases substantially, as seen in the width of the posterior intervals. Hence projections of degradation over the long term, say 30 or 50 years, based on these data alone would likewise have very high variability.
5 Reliability analysis: an illustrative example
We now turn to applying the fitted model to an example of a predictive scenario, such as those analyzed in reliability assessments. Foschi et al. (1989) use stochastic processes to characterize load profiles on individual lumber members over the lifetime of a wood structure, and an adapted example of a heavier-than-typical 50-year load profile for a residential dwelling unit is shown in the left panel of Figure 2. This profile is a piecewise-constant function obtained by summing different component loads. Intuitively, the total load at any given time includes the constant dead weight of the structure, along with the load from occupancy, which varies by resident. In addition, the 'spikes' correspond to various short-term loads that are expected to occur periodically in homes.
Using the parameters from the fitted model, we may compute the $\eta(t)$ corresponding to this load profile using Equation (7). The solid black curve shows $\eta(t)$ computed for this 50-year period using the sampled parameter vector with the highest posterior density. It can be seen that $\eta(t)$ increases rapidly the first time the load exceeds a new threshold, for example, at time 2 years (load 1675 psi) and 15 years (load 2050 psi). Subsequent loadings translate into more modest degradation increases over time, as expected from the DOL effect; for example, the second time the load exceeds 2000 psi, at time 48 years, its effect on $\eta(t)$ is much diminished. As before, the grey area represents 95% posterior bands based on the MCMC samples.
Ultimately, the probability of failure by the end of the 50-year period is of primary interest. This is determined by the value of $\eta(t)$ at 50 years, along with the scale parameter $\beta$ of the gamma process, according to Equation (4). We obtain a posterior mean for the probability of failure of 0.090, along with a central 95% posterior interval computed from the MCMC samples.
The reliability calculations based on the gamma process model are fast and simple, compared with the ADM approach, which requires numerically solving an ODE for a large number of simulated pieces to estimate the probability of failure. To compare results, the approach of Foschi et al. (1989), based on the Canadian ADM and their parameter estimates from these same data, yields a 50-year failure probability of 0.015 for this load profile. Thus there is a large discrepancy between the long-term predictions from the different approaches, even though both are able to fit the empirical data quite well. However, Foschi's approach does not provide for the construction of confidence intervals to assess uncertainty. We comment on this issue further in the discussion section.
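The contrast in computational effort can be seen in code: under the gamma process, the failure probability for each posterior draw is a single incomplete-gamma evaluation. The draws below are placeholders standing in for actual MCMC output of the 50-year shape value and scale, not real estimates.

```python
import numpy as np
from scipy.special import gammaincc   # regularized upper incomplete gamma

def failure_probability(eta_t, beta):
    """P(T <= t) = P(Y(t) >= 1), with Y(t) ~ Gamma(shape=eta_t, scale=beta)."""
    return gammaincc(eta_t, 1.0 / beta)

# Placeholder posterior draws of (eta(50 years), beta), not real MCMC output
draws = [(0.8, 0.25), (1.1, 0.20), (0.9, 0.30)]
probs = np.array([failure_probability(e, b) for e, b in draws])
print(probs.mean())                        # posterior mean failure probability
print(np.quantile(probs, [0.025, 0.975]))  # central 95% posterior interval
```

Mapping every retained posterior draw through this function yields the full posterior distribution of the failure probability, from which the mean and interval quoted above are computed.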
6 Predicting the residual life of lumber in service
As a further application of the fitted Bayesian model, we may use the MCMC samples to compute the posterior probability distributions of the residual life for pieces that have not failed up to a given time $t_0$. This requires knowledge of $\eta(t)$, which in our model is computed from the load profile and the fitted parameters, as well as a characterization of the expected future loads for $t > t_0$. Letting $T$ denote the random variable for the failure time, of interest is the distribution of $T - t_0$ given $T > t_0$, which represents the remaining lifetime. It has survivor function

$$P(T > t_0 + u \mid T > t_0) = \frac{P(T > t_0 + u)}{P(T > t_0)}, \qquad u \ge 0,$$

which may be computed using Equation (4).
To illustrate, we use the two constant-load scenarios in the experimental data, where specimens were held at load levels of 3000 psi and 4500 psi for 4 years and 1 year respectively. Consider the distribution of the remaining lifetime of the surviving specimens, if these constant load levels were maintained indefinitely. These survivor functions are shown, for up to 100 more years, in Figure 3, with posterior uncertainty shown by the grey bands. The distributions have very long right tails, corresponding to the strongest members of the population, which can carry these load levels almost indefinitely. As such, the mean residual lifetime is not very meaningful. Instead, quantities such as the time until 50% of the survivors fail, namely the median of these distributions, may be of interest. Using the MCMC samples, we calculate the 95% posterior interval of this median to be 22 to 333 years under the 3000 psi load, with a much narrower interval under 4500 psi. Evidently there is much higher uncertainty associated with these distributions at the lower load level.
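The residual-life computation reduces to a ratio of survivor functions and a one-dimensional root-find for the median. The shape function below is an assumed stand-in for the fitted $\eta(t)$ under a sustained constant load, with illustrative parameter values.

```python
import numpy as np
from scipy.optimize import brentq
from scipy.special import gammainc

def residual_survival(u, t0, eta, beta):
    """P(T > t0 + u | T > t0) for the gamma-process failure time."""
    surv = lambda t: gammainc(eta(t), 1.0 / beta)   # P(T > t)
    return surv(t0 + u) / surv(t0)

def median_residual_life(t0, eta, beta, upper=1e7):
    """The time u by which half of the survivors at t0 have failed."""
    return brentq(lambda u: residual_survival(u, t0, eta, beta) - 0.5,
                  1e-6, upper)

eta = lambda t: 0.05 * t ** 0.4   # hypothetical shape under a constant load (t in hours)
beta = 0.21
t0 = 4 * 8766.0                   # survived four years, in hours
print(median_residual_life(t0, eta, beta) / 8766.0)  # median residual life, years
```

Repeating the root-find for each posterior draw of the parameters produces the posterior distribution of the median residual life, from which the intervals quoted above follow.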
7 Discussion and conclusions
In the analysis of the experimental data, we found that the effect on degradation of time under a constant load, as modeled in the shape parameter, does not follow a single power law. This is evident from the plots in Figure 1: with a simple power law, the CDF would be approximately linear as a function of log-time during the constant load period, whereas the empirical CDF increases quite nonlinearly with time on the log scale. This led us to posit adding a second power term to the model, yielding $g(t) = b_1 t^{a_1} + b_2 t^{a_2}$ with $a_1 < a_2$. This form provides a good fit to the data, albeit with wide posterior intervals for the parameters governing the longer-term behaviour. That in turn translates into the high uncertainty that we find associated with using tests of 1- and 4-year durations to predict reliability and residual lifetime over much longer periods, such as 50 years. Larger tests, or tests over longer periods, would be necessary to reduce this variability.
In the work of Foschi et al. (1989), a crucial parameter in the Canadian ADM used for reliability analysis is the 'stress threshold' $\sigma_0$. In that model it is hypothesized that an individual piece of lumber does not accumulate damage when the load is below $\sigma_0 \tau_s$, where $\tau_s$ is the strength of that piece as measured in a short-term ramp load test. That work reported an estimate for the population mean of $\sigma_0$ of 0.533; based on that estimate, along with a population mean short-term strength of 6900 psi, most pieces do not eventually fail under the load levels seen in the residential example of Figure 2. However, in a subsequent reanalysis of that model based on the same data, the population mean of $\sigma_0$ was found to be highly uncertain, and a strong Bayesian prior was needed to stabilize its estimate (Yang et al., 2017); in fact, a much lower population mean for $\sigma_0$ can still fit the empirical data well by adjusting the other ADM parameters. Hence, when such parameter uncertainty is accounted for, the ADM approach would likewise yield wide prediction intervals. It may well be that the estimate of 0.533 reflects some other, not explicitly reported, prior knowledge about the behaviour of lumber, e.g. how many wood structures have survived the test of 50 or 100 years. However, in the current application no information concerning that issue was available. In the context of the gamma process approach, such information could easily be incorporated into the priors for Bayesian analysis, to set more realistic constraints on the rate of degradation over longer periods.
We would further note that $\sigma_0$, as a piece-level parameter in the ADM, does not have a direct relationship with our estimated damage threshold of 413 psi for the population. In the ADM, the population mean of $\sigma_0$ determines the load below which the average piece in the population is undamaged; however, the realization of $\sigma_0$ cannot be assessed for any individual piece, since it is unobservable. In contrast, the 413 psi population threshold in our model represents the stress level below which all members of the population are undamaged. Nonetheless, as discussed above, both approaches show little evidence of a high damage threshold when analyzing the Hemlock data alone, once uncertainty is considered. Specialized proof-loading tests (e.g., Woeste et al., 2007) may instead be more useful if estimating the damage threshold is of primary interest.
Another point of comparison between the ADM and our proposed approach lies in the number of parameters to be estimated. Fitting the Canadian ADM requires estimating 10 population parameters (the five log-normal means and variances from which the random effects in Equation (2) are drawn for specific pieces of lumber), some of which do not have a clear physical interpretation. As found in Yang et al. (2017), a number of different sets of these population parameters can lead to essentially the same likelihood, suggesting that while the Canadian ADM can fit the empirical data well, it may be overparametrized, leading to worse predictive performance due to inflated uncertainty about the individual parameters. Our model fits the empirical data well with four fewer parameters (six), and it is simpler to see that the resulting uncertainty in prediction stems primarily from the uncertainty in estimating the degradation rate over longer periods from accelerated testing data.
It can be said that the results of applying the accumulated damage modelling approach, along with its predecessor, the empirical model of Wood et al. (1960), have laid a foundation for incorporating long-term stress effects into the calculation of design values that has stood the test of time. So why a critical review of these models now? The answer lies in the need to apply the methods to a new generation of forest products, such as strand-based wood composites (Wang et al., 2012a,b), that are also susceptible to DOL effects. Since these new applications do not automatically inherit the ADM’s record of success, prudence suggests a re-evaluation of the approach in light of the limitations described in the Introduction, one that takes full advantage of the computational and statistical methods now available. Because engineered wood composites have much lower short-term strength variability than lumber, the size of the DOL effect (and its estimation) plays a more significant role in determining appropriate safety factors for these materials.
The above considerations led the authors to explore the alternative to the ADM presented in this paper, which was found to overcome many of the difficulties with the ADM approach described above. The gamma process model is simpler to interpret, has fewer parameters, separates external (population) and internal (individual piece) sources of variability, and lends itself well to standard statistical assessments of uncertainty. The degradation approach also led to a number of new findings, as summarized in the Introduction. In particular, a key finding of our analysis is that accelerated testing data yield poor predictions of the long-term future of a piece of lumber in service: the credibility bands for the median time to failure are very wide, particularly when the sustained load level for the test is low. This suggests that much larger accelerated tests are needed to ensure reliable predictions.
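The sensitivity of failure-time predictions to the degradation rate can be sketched with a minimal Monte Carlo simulation of a stationary gamma process (all numerical values below are illustrative assumptions, not the fitted estimates from our analysis). Damage grows by independent gamma-distributed increments and a piece fails when cumulative damage first crosses a fixed threshold; evaluating the median failure time over a modest range of rate values shows how posterior uncertainty in the rate translates into very wide intervals for long-term predictions:

```python
import numpy as np

rng = np.random.default_rng(1)

def median_failure_time(rate, scale, threshold, dt=1.0, t_max=2000.0, n_paths=1000):
    """Monte Carlo median first-passage time of a stationary gamma process.

    Degradation accumulates via independent Gamma(rate * dt, scale)
    increments; a path fails when cumulative degradation crosses
    `threshold`. Paths that never cross by t_max are recorded as t_max.
    """
    steps = int(t_max / dt)
    # Increments for all paths at once: shape (n_paths, steps).
    inc = rng.gamma(shape=rate * dt, scale=scale, size=(n_paths, steps))
    path = np.cumsum(inc, axis=1)
    crossed = path >= threshold
    # Time of the first crossing for each path (t_max if it never crosses).
    first = np.where(crossed.any(axis=1),
                     (crossed.argmax(axis=1) + 1) * dt,
                     t_max)
    return float(np.median(first))

# Illustrative uncertainty in the degradation rate: even a modest range
# of rate values produces very different median failure times.
for rate in (0.02, 0.05, 0.10):
    print(rate, median_failure_time(rate, scale=1.0, threshold=10.0))
```

Since the mean degradation at time t is roughly rate × scale × t, a fivefold uncertainty in the rate spreads the median failure time over roughly a fivefold range, mirroring the wide credibility bands we observe when low-load behaviour is extrapolated from accelerated tests.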
Acknowledgements
The work reported in this manuscript was partially supported by FPInnovations and a CRD grant from the Natural Sciences and Engineering Research Council of Canada. The data analysed in this paper were provided by FPInnovations. We are greatly indebted to Conroy Lum and Erol Karacabeyli from FPInnovations for their extensive advice during the conduct of the research reported herein.
References
 Barrett, J. and Foschi, R. (1978a). Duration of load and probability of failure in wood. Part I: Modelling creep rupture. Canadian Journal of Civil Engineering, 5(4):505–514.
 Barrett, J. and Foschi, R. (1978b). Duration of load and probability of failure in wood. Part II: Constant, ramp, and cyclic loadings. Canadian Journal of Civil Engineering, 5(4):515–532.
 Ellingwood, B. and Rosowsky, D. (1991). Duration of load effects in LRFD for wood construction. Journal of Structural Engineering, 117(2):584–599.
 Foschi, R. O. (1984). Reliability of wood structural systems. Journal of Structural Engineering, 110(12):2995–3013.
 Foschi, R. O. and Barrett, J. D. (1982). Load-duration effects in western hemlock lumber. Journal of the Structural Division, 108:1494–1510.
 Foschi, R. O., Folz, B., and Yao, F. (1989). Reliability-based design of wood structures. Number 34. Dept. of Civil Engineering, University of British Columbia.
 Gerhards, C. C. (1979). Time-related effects on wood strength: A linear cumulative damage theory. Wood Science, 11:139–144.
 Haupt, H. (1867). General theory of bridge construction. Appleton, New York.
 Hoffmeyer, P. and Sørensen, J. D. (2007). Duration of load revisited. Wood Science and Technology, 41(8):687–711.
 Karacabeyli, E. and Soltis, L. A. (1991). State of the art report on duration of load research for lumber in North America. In Proceedings of the 1991 International Timber Engineering Conference. London, United Kingdom.
 Lawless, J. and Crowder, M. (2004). Covariates and random effects in a gamma process model with application to degradation and failure. Lifetime Data Analysis, 10(3):213–227.
 Paroissin, C. and Salami, A. (2014). Failure time of non homogeneous gamma process. Communications in Statistics - Theory and Methods, 43(15):3148–3161.
 Rosowsky, D. V. and Bulleit, W. M. (2002). Another look at load duration effects in wood. Journal of Structural Engineering, 128(6):824–828.
 Swendsen, R. H. and Wang, J.-S. (1986). Replica Monte Carlo simulation of spin-glasses. Physical Review Letters, 57(21):2607.
 Wang, J. B., Foschi, R. O., and Lam, F. (2012a). Duration-of-load and creep effects in strand-based wood composite: a creep-rupture model. Wood Science and Technology, 46(1–3):375–391.
 Wang, J. B., Lam, F., and Foschi, R. O. (2012b). Duration-of-load and creep effects in strand-based wood composite: experimental research. Wood Science and Technology, 46(1–3):361–373.
 Woeste, F., Green, D., Tarbell, K., and Marin, L. (2007). Proof loading to assure lumber strength. Wood and Fiber Science, 19(3):283–297.
 Wong, S. W. and Zidek, J. V. (2016). Dimensional and statistical foundations for accumulated damage models. arXiv preprint arXiv:1708.03018.
 Wood, L. W. et al. (1960). Relation of strength of wood to duration of load. Madison, Wis.: US Dept. of Agriculture, Forest Service, Forest Products Laboratory.
 Yang, C.H., Zidek, J. V., and Wong, S. W. (2017). Bayesian analysis of accumulated damage models in lumber reliability. arXiv preprint arXiv:1706.04643.
 Zhai, Y. (2011). Dynamic duration of load models. Master’s thesis, University of British Columbia, Department of Statistics.
 Zhai, Y., Pirvu, C., Heckman, N., Lum, C., Wu, L., and Zidek, J. V. (2012). A review of dynamic duration of load models for lumber strength. Technical report, TR 270, Department of Statistics, University of British Columbia.