Expectation propagation is a technique from computer science for overcoming tractability obstacles in inference for Bayesian graphical models. We show that the same principles can be used to provide fast approximate *frequentist* statistical inference. Focussing on the binary mixed model setting, we show that the resulting inference is usually very accurate, and that likelihood-type theory is a plausible avenue for investigating questions such as consistency and asymptotic efficiency. The approach also scales well to very large data sets. This research represents joint work with Peter Hall, Alan Huang, Iain Johnstone, John Ormerod and James Yu.