Question

Mean and standard deviation in normal distribution and log-normal distribution

Below is my code:

set.seed(1)

par(mfrow = c(1, 2))
lognorm.gen <- function(mu, sigma) {
  # draw normal samples, then exponentiate to get lognormal samples
  ns <- rnorm(1000, mu, sigma)
  ns <- exp(ns)

  # density-scaled histogram with the theoretical lognormal density overlaid
  hist(ns, probability = TRUE,
       main = expression(paste("Sample Density Curve, ", mu, ", ", sigma)))
  y <- seq(0, 15, length.out = 100)
  lines(y, dlnorm(y, mu, sigma))
}

lognorm.gen(0, 0.25)


I generated samples from a normal distribution and then transformed them into a lognormal distribution. Since I used mu and sigma as the parameters in rnorm(), I expected I would need exp(mu) and exp(sigma) in dlnorm(). However, with those parameters the density line and the histogram are far apart, whereas passing mu and sigma directly to dlnorm() fits the line to the histogram well. So I am wondering why I shouldn't use exp(mu) in this case?

Answer

Please read ?dlnorm:

 dlnorm(x, meanlog = 0, sdlog = 1, log = FALSE)
 plnorm(q, meanlog = 0, sdlog = 1, lower.tail = TRUE, log.p = FALSE)
 qlnorm(p, meanlog = 0, sdlog = 1, lower.tail = TRUE, log.p = FALSE)
 rlnorm(n, meanlog = 0, sdlog = 1)

meanlog, sdlog: mean and standard deviation of the distribution on the
      log scale with default values of ‘0’ and ‘1’ respectively.

The mean and standard deviation are specified on the log scale. If X ~ Normal(mu, sigma), then exp(X) ~ Lognormal(meanlog = mu, sdlog = sigma); the parameters themselves are not transformed. That is why you still need the same mu and sigma you used in rnorm(), not exp(mu) and exp(sigma).
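
As a quick numeric check, here is a minimal sketch (reusing mu = 0 and sigma = 0.25 from the question) showing that meanlog and sdlog describe the log of the data, and that the mean of the lognormal itself is exp(mu + sigma^2/2) rather than exp(mu):

set.seed(1)
mu <- 0; sigma <- 0.25
x <- exp(rnorm(1e5, mu, sigma))  # lognormal sample, built the same way as in the question
mean(log(x)); sd(log(x))         # close to mu and sigma: the log-scale parameters
mean(x)                          # close to exp(mu + sigma^2/2), not exp(mu)
exp(mu + sigma^2 / 2)            # theoretical lognormal mean

So exp(mu) is not even the mean of the lognormal distribution; passing it as meanlog would describe a different lognormal entirely, which is why the overlaid curve missed your histogram.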