Stein's estimator for the mean of a multivariate normal distribution, which has uniformly lower mean squared error than the sample mean, and several of its generalizations are presented briefly in an empirical Bayes context and applied to three examples with real data. These estimators perform much better than the classical estimators. The first application predicts final 1970 batting averages for 14 major league players from their early-season performance. The predictions from Stein's estimator are more accurate than those from the maximum likelihood estimator for every batter. Next, toxoplasmosis prevalence rates are estimated for 36 El Salvador cities; the generalization of Stein's estimator used in this situation is substantially better than the usual estimator. Finally, in 51 situations a computer simulation is used to estimate the size of Pearson's chi-square test for comparing binomial means; Stein's estimator and its multivariate generalizations are approximately twice as efficient as the maximum likelihood estimator.
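The shrinkage idea behind these results can be illustrated with a minimal sketch of the James-Stein estimator that shrinks each coordinate toward the grand mean. This is an assumed simplification (unit observation variance, no variance-stabilizing transformation of the raw averages as used in the paper); the simulated true means and the seed are hypothetical choices for the demonstration.

```python
import numpy as np

def james_stein(z, sigma2=1.0):
    """Shrink each component of z toward the grand mean.

    z : 1-D array of independent observations z_i ~ N(theta_i, sigma2).
    Returns the positive-part James-Stein estimate of the mean vector.
    """
    k = len(z)
    zbar = z.mean()
    s = np.sum((z - zbar) ** 2)
    # Shrinkage factor; the max(0, .) "positive part" guards against
    # overshrinking when s is small.
    c = max(0.0, 1.0 - (k - 3) * sigma2 / s)
    return zbar + c * (z - zbar)

# Simulation: compare average squared-error loss with the MLE (the raw
# observations themselves) over many repetitions.
rng = np.random.default_rng(0)
k, reps = 14, 2000                     # 14 coordinates, echoing the batting example
theta = rng.normal(0.0, 0.5, size=k)   # hypothetical true means
mle_loss = js_loss = 0.0
for _ in range(reps):
    z = theta + rng.normal(size=k)     # z_i ~ N(theta_i, 1)
    mle_loss += np.sum((z - theta) ** 2)
    js_loss += np.sum((james_stein(z) - theta) ** 2)
print(mle_loss / reps, js_loss / reps)
```

In this simulation the James-Stein total risk comes out below the MLE's risk of about k = 14, in line with the uniform dominance result: the component estimates are individually biased toward the grand mean, but the total squared error is smaller.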