Cross-fit ridge nuisance estimation for partially linear DML and binary-treatment AIPW
This page covers the two cross-fit estimators in crabbymetrics:
PartiallyLinearDML for a scalar continuous treatment
AIPW for a binary-treatment average treatment effect
Both use ridge regressions as nuisance learners. If a penalty grid is passed, each nuisance fit selects its own penalty within each training fold before predicting on the held-out fold.
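The per-fold penalty selection can be sketched as follows. This is an illustrative reimplementation in plain NumPy, not crabbymetrics' internals: the function names (`ridge_fit`, `cross_fit_predict`), the closed-form ridge solve, and the single inner validation split are all assumptions made for the sketch.

```python
import numpy as np


def ridge_fit(X, y, alpha):
    # Closed-form ridge coefficients: (X'X + alpha * I)^{-1} X'y.
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(p), X.T @ y)


def cross_fit_predict(X, y, alphas, n_folds=5, seed=0):
    # Out-of-fold ridge predictions. Each training fold selects its own
    # penalty from `alphas` on an inner validation split, then refits on
    # the whole training fold before predicting on the held-out fold.
    n = X.shape[0]
    rng = np.random.default_rng(seed)
    fold_id = rng.permutation(n) % n_folds
    preds = np.empty(n)
    for k in range(n_folds):
        train = fold_id != k
        Xtr, ytr = X[train], y[train]
        # Inner split inside the training fold for penalty selection.
        inner = rng.permutation(Xtr.shape[0]) < Xtr.shape[0] // 2
        errs = [
            np.mean((ytr[~inner] - Xtr[~inner] @ ridge_fit(Xtr[inner], ytr[inner], a)) ** 2)
            for a in alphas
        ]
        best = alphas[int(np.argmin(errs))]
        # Refit on the full training fold, predict only the held-out fold.
        preds[fold_id == k] = X[fold_id == k] @ ridge_fit(Xtr, ytr, best)
    return preds
```

The key invariant is that no observation's prediction ever uses a model trained on that observation, which is what makes the downstream scores orthogonal.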
For AIPW, the outcome regressions and the propensity model \(\hat \pi(X)\) are all cross-fit ridge nuisance models. The implementation clips \(\hat \pi(X)\) away from \(0\) and \(1\) to stabilize the finite-sample weights.
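The role of the clipping can be sketched with the standard AIPW score. This is a generic illustration, not the library's code: the function name, the default threshold `eps=0.01`, and the plug-in standard error are assumptions for the sketch.

```python
import numpy as np


def aipw_ate(y, d, mu0, mu1, pi, eps=0.01):
    # AIPW score for the ATE given cross-fit nuisance predictions:
    # `mu0`, `mu1` are outcome-model predictions under D=0 and D=1,
    # `pi` is the estimated propensity. Clipping pi to [eps, 1 - eps]
    # bounds the inverse-propensity weights 1/pi and 1/(1 - pi).
    pi = np.clip(pi, eps, 1.0 - eps)
    psi = (
        mu1 - mu0
        + d * (y - mu1) / pi
        - (1.0 - d) * (y - mu0) / (1.0 - pi)
    )
    ate = psi.mean()
    se = psi.std(ddof=1) / np.sqrt(len(psi))
    return ate, se
```

Without the clip, a single \(\hat \pi(X_i)\) near \(0\) or \(1\) can dominate the average through an enormous weight; the clip trades a small bias for bounded influence of any one observation.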
from html import escape

import numpy as np
from IPython.display import HTML, display

import crabbymetrics as cm


def html_table(headers, rows):
    parts = [
        "<table>",
        "<thead>",
        "<tr>",
        *[f"<th>{escape(str(header))}</th>" for header in headers],
        "</tr>",
        "</thead>",
        "<tbody>",
    ]
    for row in rows:
        parts.append("<tr>")
        for cell in row:
            parts.append(f"<td>{escape(str(cell))}</td>")
        parts.append("</tr>")
    parts.extend(["</tbody>", "</table>"])
    return "".join(parts)
PartiallyLinearDML is the right estimator when the target is a scalar coefficient on a continuous treatment inside a partially linear model.
AIPW is the right estimator when the target is a binary-treatment ATE.
Both are intentionally narrow: the nuisance learner is ridge, the folds are explicit, and the reported standard errors come from the orthogonal influence function implied by the final score.
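For the partially linear case, the final stage and its influence-function standard error can be sketched as below. The residual-on-residual regression and the sandwich-style variance are the standard DML construction; the function name and argument names (`l_hat` for the cross-fit prediction of \(E[Y \mid X]\), `m_hat` for \(E[D \mid X]\)) are illustrative, not the library's API.

```python
import numpy as np


def pl_dml_theta(y, d, l_hat, m_hat):
    # Final-stage partially linear DML: regress the outcome residual on
    # the treatment residual, then report the standard error implied by
    # the orthogonal score psi_i = v_i * (u_i - v_i * theta).
    u = y - l_hat          # outcome residual  Y - E[Y|X]
    v = d - m_hat          # treatment residual D - E[D|X]
    theta = (v @ u) / (v @ v)
    psi = v * (u - v * theta)   # orthogonal score at the estimate
    J = np.mean(v ** 2)         # Jacobian of the score in theta
    n = len(y)
    se = np.sqrt(np.mean(psi ** 2) / J ** 2 / n)
    return theta, se
```

Because the score is orthogonal, first-order errors in the ridge nuisances do not move the estimate, which is what justifies the plug-in standard error.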