I am trying to use LASSO for variable selection and attempted an implementation in R using the glmnet package:
set.seed(1)
library(zoo)     # index() on the zoo return series needs the zoo package
library(glmnet)

# Response: the returns whose index matches the first date in beta.df
return <- matrix(ret.ff.zoo[which(index(ret.ff.zoo) == beta.df$date[1]), ])

# Predictors: the non-date columns of beta.df for that date, one column per factor
data <- matrix(unlist(beta.df[which(beta.df$date == beta.df$date[1]), ][, -1]), ncol = num.factors)
dimnames(data)[[2]] <- names(beta.df)[-1]

# Cross-validated LASSO fit; coef() reports the selected coefficients
model <- cv.glmnet(data, return, standardize = TRUE)
coef(model)
Running this twice, however, gives different coefficients each time:

> coef(model)
15 x 1 sparse Matrix of class "dgCMatrix"
                      1
(Intercept) 0.009159452
VAL         .
EQ          .
EFF         .
SIZE        0.018479078
MOM         .
FSCR        .
MSCR        .
SY          .
URP         .
UMP         .
UNIF        .
OIL         .
DEI         .
PROD        .
> coef(model)
15 x 1 sparse Matrix of class "dgCMatrix"
                      1
(Intercept) 0.008031915
VAL         .
EQ          .
EFF         .
SIZE        0.021250778
MOM         .
FSCR        .
MSCR        .
SY          .
URP         .
UMP         .
UNIF        .
OIL         .
DEI         .
PROD        .
cv.glmnet assigns the cross-validation folds at random, so the selected penalty, model$lambda.1se (the value coef() uses by default for a cv.glmnet object), changes from one run to the next.
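To see where the randomness enters, here is a minimal sketch (using simulated stand-in data, since ret.ff.zoo and beta.df are not available here) that refits the same model twice without seeding and compares the selected penalties:

library(glmnet)

# Simulated stand-ins for the `data` and `return` objects above
# (assumption: any numeric x/y of compatible shape shows the same behaviour).
x <- matrix(rnorm(100 * 14), nrow = 100, ncol = 14)
y <- rnorm(100)

fit1 <- cv.glmnet(x, y, standardize = TRUE)
fit2 <- cv.glmnet(x, y, standardize = TRUE)

# Different random fold assignments usually give different lambda.1se,
# and therefore different coef() output.
c(fit1$lambda.1se, fit2$lambda.1se)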
The model isn't deterministic; run set.seed(1) immediately before your model fit to produce deterministic results.
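Here is a minimal sketch of two ways to make the result reproducible, again on simulated stand-in data: re-seed the RNG immediately before each cv.glmnet() call, or fix the fold assignment once via the foldid argument so the cross-validation split never changes:

library(glmnet)

set.seed(42)                     # only for generating the stand-in data
x <- matrix(rnorm(100 * 14), nrow = 100, ncol = 14)
y <- rnorm(100)

# Option 1: seed the RNG right before every fit.
set.seed(1)
fit1 <- cv.glmnet(x, y, standardize = TRUE)
set.seed(1)
fit2 <- cv.glmnet(x, y, standardize = TRUE)
identical(fit1$lambda.1se, fit2$lambda.1se)   # TRUE

# Option 2: generate the fold labels once and reuse them.
set.seed(1)
foldid <- sample(rep(1:10, length.out = nrow(x)))
fit3 <- cv.glmnet(x, y, standardize = TRUE, foldid = foldid)
fit4 <- cv.glmnet(x, y, standardize = TRUE, foldid = foldid)
identical(fit3$lambda.1se, fit4$lambda.1se)   # TRUE

Fixing foldid has the advantage that the fit stays reproducible even if other code changes the RNG state between calls.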