Parameter estimates

In this section, we are going to discuss some of the algorithms used for parameter estimation.

Maximum likelihood estimation

Maximum likelihood estimation (MLE) is a method for estimating a model's parameters by choosing the values that make the observed data most probable under that model.

Now let us find the parameter estimates of the probability density function of a normal distribution.
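For a sample y1, ..., yn from a normal distribution with mean mu and standard deviation sigma, the log likelihood is

\[
\ell(\mu, \sigma) = \sum_{i=1}^{n} \log f(y_i \mid \mu, \sigma)
= -\frac{n}{2}\log(2\pi) - n\log\sigma - \frac{1}{2\sigma^2}\sum_{i=1}^{n}(y_i - \mu)^2 .
\]

MLE picks the values of mu and sigma that maximize this quantity, or equivalently minimize the negative log likelihood, which is exactly what we will compute in R below.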

Let us first generate a sample of normally distributed random values, which can be done by executing the following code:

> set.seed(100) 
> NO_values <- 100 
> Y <- rnorm(NO_values, mean = 5, sd = 1) 
> mean(Y) 

This gives 5.002913.

> sd(Y) 

This gives 1.02071.

Now let us make a function for the negative log likelihood:

> LogL <- function(mu, sigma) {
+      A <- dnorm(Y, mu, sigma)   # density of each observation under N(mu, sigma)
+      -sum(log(A))               # negative log likelihood
+  }

Now let us apply the function mle to estimate the parameters mu and sigma.
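The call itself is not shown in this excerpt, but a minimal sketch, assuming the mle function from the stats4 package (part of base R) and illustrative starting values of 2 for both parameters, looks like this:

> library(stats4)                                    # provides mle()
> fit <- mle(LogL, start = list(mu = 2, sigma = 2))  # minimizes the negative log likelihood
> summary(fit)                                       # estimates and their standard errors

Because the optimizer is free to try negative values of sigma, dnorm can return NaN and trigger warnings; passing method = "L-BFGS-B" together with a lower bound on sigma (for example lower = c(-Inf, 0.0001)) keeps the search in the valid region.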
