mle                     package:stats4                     R Documentation

Maximum Likelihood Estimation

Description:

     Estimate parameters by the method of maximum likelihood.

Usage:

     mle(minuslogl, start = formals(minuslogl), method = "BFGS",
         fixed = list(), ...)

Arguments:

minuslogl: Function to calculate negative log-likelihood.

    start: Named list.  Initial values for optimizer.

   method: Optimization method to use.  See 'optim'.

    fixed: Named list.  Parameter values to keep fixed during
           optimization.

      ...: Further arguments to pass to 'optim'.

Details:

     The 'optim' optimizer is used to find the minimum of the negative
     log-likelihood.  An approximate covariance matrix for the
     parameters is obtained by inverting the Hessian matrix at the
     optimum.

Value:

     An object of class 'mle-class'.

Note:

     Be careful to note that the argument is -log L (not -2 log L).
     It is for the user to ensure that the likelihood is correct, and
     that asymptotic likelihood inference is valid.

See Also:

     'mle-class'

Examples:

     x <- 0:10
     y <- c(26, 17, 13, 12, 20, 5, 9, 8, 5, 4, 8)
     ## This needs a constrained parameter space: most methods will accept NA
     ll <- function(ymax=15, xhalf=6)
         if(ymax > 0 && xhalf > 0)
             -sum(stats::dpois(y, lambda=ymax/(1+x/xhalf), log=TRUE))
         else NA
     (fit <- mle(ll))
     mle(ll, fixed=list(xhalf=6))

     ## alternative using bounds on optimization
     ll2 <- function(ymax=15, xhalf=6)
         -sum(stats::dpois(y, lambda=ymax/(1+x/xhalf), log=TRUE))
     mle(ll2, method="L-BFGS-B", lower=rep(0, 2))

     summary(fit)
     logLik(fit)
     vcov(fit)
     plot(profile(fit), absVal=FALSE)
     confint(fit)

     ## use bounded optimization
     ## the lower bounds are really > 0, but we use >=0 to stress-test profiling
     (fit1 <- mle(ll, method="L-BFGS-B", lower=c(0, 0)))
     plot(profile(fit1), absVal=FALSE)

     ## a better parametrization:
     ll2 <- function(lymax=log(15), lxhalf=log(6))
         -sum(stats::dpois(y, lambda=exp(lymax)/(1+x/exp(lxhalf)), log=TRUE))
     (fit2 <- mle(ll2))
     plot(profile(fit2), absVal=FALSE)
     exp(confint(fit2))
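
     ## A minimal sketch of the covariance computation described in Details,
     ## assuming 'll2' (the log-scale parametrization) and 'fit2' from above.
     ## 'nll' is a hypothetical wrapper introduced here so the negative
     ## log-likelihood takes a single parameter vector, as 'optimHess' expects;
     ## mle() itself obtains the Hessian from 'optim' in the same spirit.
     nll <- function(p) ll2(p[1], p[2])
     hess <- stats::optimHess(coef(fit2), nll)  # numerical Hessian of -log L at the estimates
     solve(hess)                                # approximate covariance matrix; compare vcov(fit2)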