Automated doubly robust estimation
Design goals:
Many important ideas and methods in robust statistics and causal inference can themselves be understood in terms of “efficient influence functions,” mathematical objects akin to directional derivatives on probability distributions. Surprisingly, despite their frequent usage in definitions and proofs, they are rarely used directly as computational tools.
In this module, we aim to implement general algorithms for efficiently computing influence functions and to use these in turn to build general versions of doubly-robust estimation and other robust estimation approaches that compose with ChiRho’s counterfactual semantics and Pyro’s rich library of scalable inference algorithms.
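To make the connection between influence functions and doubly robust estimation concrete, here is a minimal sketch of the classic augmented IPW (AIPW) estimator of the average treatment effect, whose summand is exactly the plug-in estimate plus the efficient influence function evaluated at fitted nuisance models. This is an illustrative sketch using only numpy, not the module's actual API; the function name `aipw_ate` and its arguments are hypothetical, and the nuisance estimates are assumed to be supplied by the caller.

```python
import numpy as np

def aipw_ate(y, t, e_hat, mu0_hat, mu1_hat):
    """Augmented IPW (doubly robust) estimate of the ATE.

    Each summand is the outcome-model plug-in (mu1_hat - mu0_hat)
    plus the efficient-influence-function correction terms. The
    estimate is consistent if either the propensity model e_hat
    or the outcome models mu0_hat/mu1_hat is correctly specified,
    which is the "doubly robust" property.
    """
    eif = (mu1_hat - mu0_hat
           + t * (y - mu1_hat) / e_hat
           - (1 - t) * (y - mu0_hat) / (1 - e_hat))
    return eif.mean()

# Hypothetical usage on synthetic data with a known effect of 2.0:
rng = np.random.default_rng(0)
n = 20_000
x = rng.normal(size=n)
e = 1.0 / (1.0 + np.exp(-x))          # true propensity score
t = rng.binomial(1, e)
y = x + 2.0 * t + 0.1 * rng.normal(size=n)
estimate = aipw_ate(y, t, e_hat=e, mu0_hat=x, mu1_hat=x + 2.0)
```

In this idealized example both nuisances are exact, so the estimate recovers the true effect; the point of the correction terms is that the estimate degrades gracefully when one of them is misspecified.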
Non-goals:
Making doubly robust estimators work well in practice will likely require more sophisticated data splitting approaches than the standard of a single, non-overlapping train/test split, but building general machinery for data splitting is out of scope for this module.
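For reference, the "standard" splitting scheme mentioned above, along with the K-fold cross-fitting variant that more sophisticated approaches generalize, can be sketched in a few lines. These helpers (`single_split`, `k_folds`) are hypothetical illustrations of index bookkeeping only, not part of this module:

```python
import numpy as np

def single_split(n, train_frac=0.5, seed=0):
    """A single non-overlapping train/test split: fit nuisance
    models on the train indices, evaluate the influence function
    on the held-out test indices."""
    idx = np.random.default_rng(seed).permutation(n)
    cut = int(n * train_frac)
    return idx[:cut], idx[cut:]

def k_folds(n, k=5, seed=0):
    """K disjoint folds for cross-fitting: each fold is held out
    once while nuisances are fit on the remaining folds, and the
    per-fold estimates are averaged."""
    idx = np.random.default_rng(seed).permutation(n)
    return np.array_split(idx, k)
```

General machinery would need to coordinate such splits with nuisance fitting and estimation, which is the part deemed out of scope here.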
Also out of scope for now is extending our implementation of efficient influence functions to represent so-called “empirical influence functions,” closely related quantities that are used elsewhere in machine learning and statistics. The relationship between efficient and empirical influence functions is discussed briefly in an appendix.
Notes:
This milestone is closed; no open issues remain.