This chapter concerns a number of R tools that are extensions or accessories to the material we have discussed so far. That they are treated here at the end of the book does not mean they are unimportant. My main concern, however, has been the machinery for finding estimates of nonlinear parameters by optimizing functions. A number of the tools in this chapter address other aspects of statistical estimation that illuminate the data or models in other ways.

Because maximum likelihood estimation is such a common task in computational statistics, several tools and packages exist for carrying out various forms of ML estimation.

`mle` in the **stats4** package (part of the base R distribution, R Core Team (2013), though it appears that one needs to load it with `require(stats4)`) is intended for minimizing a function `minuslogl` using a method chosen from `optim()`. This tool appears to have fallen into disuse, possibly because it seems to be rather fragile. However, it does compute the solution for our Hobbs maximum likelihood example introduced in Chapter 12, and we include an illustration of the use of fixed parameters (masks).

```r
require(stats4, quietly = TRUE)
lhobbs.res <- function(xl, y) {
  # log scaled Hobbs weeds problem -- residual
  # base parameters on log(x)
  x <- exp(xl)
  if (abs(12 * x[3]) > 50) { # check computability
    rbad <- rep(.Machine$double.xmax, length(x))
    return(rbad)
  }
  if (length(x) != 3) stop("hobbs.res -- parameter ...
```
