
ElasticNet Example

This page mirrors examples/elastic_net_example.py.

1 Fit A Regularized Linear Model

import numpy as np
from pprint import pprint

from crabbymetrics import ElasticNet

np.set_printoptions(precision=4, suppress=True)
rng = np.random.default_rng(1)
n = 600
k = 6
beta = np.array([2.0, -1.5, 0.0, 0.0, 0.8, -0.3])
intercept = -0.4

x = rng.normal(size=(n, k))
y = intercept + x @ beta + rng.normal(scale=0.7, size=n)

model = ElasticNet(penalty=0.1, l1_ratio=0.5)
model.fit(x, y)

print("true intercept:", intercept)
print("true coef:", beta)
pprint(model.summary())

Output:

true intercept: -0.4
true coef: [ 2.  -1.5  0.   0.   0.8 -0.3]
{'coef': array([ 1.849 , -1.3812,  0.    ,  0.    ,  0.6863, -0.2843]),
 'coef_se': array([0.0313, 0.0314, 0.0305, 0.0291, 0.0329, 0.03  ]),
 'intercept': -0.2741791532039864,
 'intercept_se': 0.031205312308178204}
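The solver behind ElasticNet is not shown on this page. For intuition, here is a minimal coordinate-descent sketch in plain NumPy — a generic illustration assuming the 1/(2n)-scaled objective above, not crabbymetrics' actual implementation:

```python
import numpy as np


def soft_threshold(z, t):
    """Proximal operator of the L1 norm: shrink z toward zero by t."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)


def elastic_net_cd(x, y, penalty, l1_ratio, n_iter=500):
    """Coordinate descent for
    (1/2n)||y - b0 - X b||^2
      + penalty * (l1_ratio * ||b||_1 + (1 - l1_ratio)/2 * ||b||_2^2).
    Generic sketch; conventions are assumed, not taken from crabbymetrics.
    """
    n, k = x.shape
    b = np.zeros(k)
    b0 = y.mean()
    col_sq = (x ** 2).mean(axis=0)  # (1/n) * x_j' x_j for each column j
    for _ in range(n_iter):
        b0 = (y - x @ b).mean()  # intercept is left unpenalized
        for j in range(k):
            # partial residual with feature j's contribution added back
            r_j = y - b0 - x @ b + x[:, j] * b[j]
            rho = x[:, j] @ r_j / n
            # soft-threshold for the L1 part, extra denominator for the L2 part
            b[j] = soft_threshold(rho, penalty * l1_ratio) / (
                col_sq[j] + penalty * (1.0 - l1_ratio)
            )
    return b0, b
```

On data simulated as above, this sketch produces a similar sparse fit: the two truly-zero coefficients are shrunk to (or very near) zero, and the nonzero ones are biased slightly toward zero by the penalty, matching the pattern in the summary output.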