Simulation and theoretical studies have led to an increasing recognition that efficient generalized method of moments (GMM) estimators for models specified through moment restrictions may be severely biased for the sample sizes typically encountered in applications. This bias is associated with the particular estimators of the Jacobian and efficient metric terms implicit in efficient GMM estimation. Inference procedures associated with efficient GMM may similarly be expected to deviate from the nominal normal or chi-square distributions provided by asymptotic theory. Recently, a number of other estimators that are asymptotically equivalent to efficient GMM have been suggested; empirical likelihood, exponential tilting and continuous updating estimators are particular examples.
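To fix ideas, the moment condition model and the efficient two-step GMM estimator referred to above may be sketched as follows; the notation ($g$, $z_i$, $\beta$, $\tilde{\beta}$, $\hat{\Omega}$) is introduced here purely for illustration and is not taken from the text:
\[
E[g(z,\beta_0)] = 0, \qquad \hat{g}(\beta) = \frac{1}{n}\sum_{i=1}^{n} g(z_i,\beta),
\]
\[
\hat{\beta}_{\mathrm{GMM}} = \arg\min_{\beta}\ \hat{g}(\beta)'\,\hat{\Omega}(\tilde{\beta})^{-1}\,\hat{g}(\beta),
\qquad
\hat{\Omega}(\tilde{\beta}) = \frac{1}{n}\sum_{i=1}^{n} g(z_i,\tilde{\beta})\,g(z_i,\tilde{\beta})',
\]
where $\tilde{\beta}$ is a preliminary first-step estimator. The sample Jacobian $\partial\hat{g}(\beta)/\partial\beta'$ and the weight matrix $\hat{\Omega}(\tilde{\beta})^{-1}$ are the estimated Jacobian and efficient metric terms whose estimation is associated with the finite sample bias described above.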

This masterclass provides a treatment of the class of generalized empirical likelihood (GEL) methods for moment condition models. The GEL class includes empirical likelihood, exponential tilting and continuous updating as special cases, as well as estimators based on the Cressie-Read power divergence family of discrepancies. GEL offers attractive one-step efficient estimators that are asymptotically equivalent to those based on efficient two-step or iterated GMM but, unlike GMM, do not require explicit calculation or estimation of the efficient metric. GEL estimators are also less prone to bias, because the factors that lead to bias in efficient GMM are either partially or completely absent. This property is especially attractive when the number of moment restrictions is large.
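As a brief sketch in the same illustrative notation, with the carrier function $\rho$ and auxiliary vector $\lambda$ likewise introduced only for exposition, a GEL estimator solves the saddle point problem
\[
\hat{\beta}_{\mathrm{GEL}} = \arg\min_{\beta}\ \sup_{\lambda}\ \sum_{i=1}^{n} \rho\bigl(\lambda' g(z_i,\beta)\bigr),
\]
where $\rho$ is concave with the normalization $\rho'(0) = \rho''(0) = -1$ and $\lambda$ is restricted so that each argument lies in the domain of $\rho$. The choices $\rho(v) = \ln(1-v)$, $\rho(v) = -\exp(v)$ and $\rho(v) = -(1+v)^2/2$ correspond to empirical likelihood, exponential tilting and continuous updating respectively, while the power carriers $\rho_{\gamma}(v) = -(1+\gamma v)^{(\gamma+1)/\gamma}/(\gamma+1)$ deliver the estimators associated with the Cressie-Read divergence family. No preliminary estimator or explicit estimate of the efficient metric enters the problem; the auxiliary parameter $\lambda$ plays that role implicitly.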