This presentation explores the derivation and implementation of the Newton-Raphson update, starting from the first and second derivatives of the log-likelihood function. We discuss the advantages of Fisher scoring as an alternative maximization method, highlighting its lower computational cost and ease of implementation. Performance is evaluated by estimating the standard errors of the maximum likelihood estimates (MLEs), taken as the square roots of the diagonal terms of the inverse of the Fisher information matrix.
Assignment One Presentation, Question 2.5 (a)-(d), by Hao Ding
Derive the Newton-Raphson Update
• Starting point
• First derivative of the log-likelihood
• Second derivative of the log-likelihood
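The three ingredients on this slide combine into the standard update. In generic notation (the assignment's specific likelihood is not reproduced here), with log-likelihood ℓ and parameter θ, the derivation sets the derivative of the second-order Taylor expansion of ℓ to zero and solves, giving:

```latex
\theta^{(t+1)} \;=\; \theta^{(t)} \;-\; \left[\ell''\bigl(\theta^{(t)}\bigr)\right]^{-1} \ell'\bigl(\theta^{(t)}\bigr)
```

In the multiparameter case ℓ′ is the score vector and ℓ″ the Hessian matrix of the log-likelihood.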
Newton-Raphson: Implementation
• Update step in R: theta <- theta - solve(gpp) %*% gp, where gp is the score (first derivative) and gpp is the Hessian (second derivative) of the log-likelihood
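The slide's one-line R step can be expanded into a full iteration. Since the actual likelihood from Question 2.5 is not shown, the sketch below uses a Cauchy location model with made-up data, translated into Python/NumPy for illustration; `gp` and `gpp` mirror the R objects of the same names.

```python
import numpy as np

# Hypothetical data; the assignment's actual dataset is not reproduced here.
x = np.array([-1.2, 0.5, 0.3, 2.1, -0.4, 0.8, 1.5, -0.9])

def gp(theta):
    # Score: first derivative of the Cauchy location log-likelihood
    u = x - theta
    return np.sum(2.0 * u / (1.0 + u**2))

def gpp(theta):
    # Second derivative (Hessian); a scalar here, a matrix in general
    u = x - theta
    return np.sum(2.0 * (u**2 - 1.0) / (1.0 + u**2) ** 2)

def newton_raphson(theta, tol=1e-8, max_iter=100):
    for _ in range(max_iter):
        # Scalar analogue of the slide's R step: solve(gpp) %*% gp
        step = gp(theta) / gpp(theta)
        theta -= step
        if abs(step) < tol:
            break
    return theta

theta_hat = newton_raphson(np.median(x))  # start from a robust initial value
```

Starting from the median matters: the Cauchy score has multiple roots, and Newton-Raphson can diverge from a poor starting point.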
Fisher Scoring
• Compute the Fisher information
• Use the Fisher information to replace the second derivative (Hessian) in the Newton-Raphson update
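A sketch of the replacement, again using a hypothetical Cauchy location example (not the assignment's likelihood): for that model the expected Fisher information is the constant I(θ) = n/2, so no second derivative need be computed or inverted at each step.

```python
import numpy as np

# Hypothetical data for illustration only.
x = np.array([-1.2, 0.5, 0.3, 2.1, -0.4, 0.8, 1.5, -0.9])

def gp(theta):
    # Score: first derivative of the Cauchy location log-likelihood
    u = x - theta
    return np.sum(2.0 * u / (1.0 + u**2))

def fisher_scoring(theta, tol=1e-8, max_iter=200):
    # Expected Fisher information for the Cauchy location model: I(theta) = n/2
    info = len(x) / 2.0
    for _ in range(max_iter):
        step = gp(theta) / info  # I(theta) stands in for the negative Hessian
        theta += step            # "+" because the Hessian is -I on average
        if abs(step) < tol:
            break
    return theta

theta_hat = fisher_scoring(np.median(x))
```

The per-iteration cost drops because the Hessian is never evaluated or inverted, at the price of slower (linear rather than quadratic) convergence near the optimum.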
Newton-Raphson vs. Fisher Scoring
• Implementation cost: Fisher scoring is easier to implement and has a lower computational cost per iteration
• Performance
Estimate Standard Errors of the MLEs
• Evaluate the Fisher information at the MLEs
• The standard errors are the square roots of the diagonal terms of the inverse of the Fisher information matrix
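A minimal sketch of this step, continuing the hypothetical scalar Cauchy example (n and the n/2 information formula are assumptions of that example, not the assignment's model):

```python
import numpy as np

n = 8                  # sample size of the hypothetical data
info = n / 2.0         # expected Fisher information evaluated at the MLE
var_hat = 1.0 / info   # scalar case of the diagonal of the inverse information
se = np.sqrt(var_hat)  # standard error = square root of the estimated variance

# Multiparameter case, with I_hat the Fisher information matrix at the MLEs:
# se = np.sqrt(np.diag(np.linalg.inv(I_hat)))
```

The square root is essential: the diagonal of the inverse Fisher information estimates the variances of the MLEs, and standard errors are their square roots.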