We consider two approaches to estimation. The first approach extends the model by adding the observed initial values as an extra regressor, which allows consistent estimates to be obtained by error-components GLS. This estimator is shown to be equivalent to the optimal GMM estimator for the normal homoskedastic error-components model. The second approach considers a mild restriction on the initial condition process under which lagged differences of the dependent variable can be used to construct linear moment conditions in the levels equations. The complete set of moment conditions can then be exploited by a linear GMM estimator in a system of first-differenced and levels equations, rendering the non-linear moment conditions redundant for estimation. This estimator is strictly more efficient than non-linear GMM when the additional restriction is valid. Monte Carlo simulations are reported that demonstrate the dramatic improvement in performance of the proposed estimators compared to the usual first-differenced GMM estimator, especially for high values of the autoregressive parameter.
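For concreteness, the moment conditions referred to above can be sketched for the standard AR(1) panel data model with unobserved heterogeneity; the notation below is an illustrative assumption about the setup, not taken verbatim from the abstract:

```latex
% Model (assumed): y_{it} = \alpha y_{i,t-1} + \eta_i + v_{it}
%
% First-differenced GMM uses lagged levels as instruments for the
% differenced equations:
\mathrm{E}\!\left[ y_{i,t-s}\,\Delta v_{it} \right] = 0,
  \qquad t = 3,\dots,T; \ s \ge 2.
%
% Under the mild restriction on the initial condition process
% (mean stationarity), lagged first differences are valid
% instruments for the levels equations:
\mathrm{E}\!\left[ \Delta y_{i,t-1}\,(\eta_i + v_{it}) \right] = 0,
  \qquad t = 3,\dots,T.
%
% The system GMM estimator stacks both sets of linear moment
% conditions across the differenced and levels equations.
```

When the additional restriction holds, the levels moments carry information about \(\alpha\) that the differenced moments lose as \(\alpha\) approaches one, which is consistent with the simulation evidence described for highly persistent series.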