We propose using model reparametrization to improve variational Bayes inference for a class of models whose variables can be classified as global (common across observations) or local (observation-specific). Posterior dependence between the local and global variables is reduced by applying an invertible affine transformation to the local variables. The functional form of this transformation is deduced by approximating the posterior distribution of each local variable, conditional on the global variables, by a Gaussian distribution via a second-order Taylor expansion. Variational Bayes inference for the reparametrized model is then obtained using stochastic approximation. Our approach can be readily extended to large datasets via a divide-and-recombine strategy. Using generalized linear mixed models, we demonstrate that reparametrized variational Bayes yields improvements in both accuracy and convergence rate over state-of-the-art Gaussian variational approximation methods.
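The affine transformation described above can be sketched as follows. This is an illustrative toy example, not the paper's implementation: we assume a Poisson random-intercept model in which each local variable is a scalar random effect, find its conditional mode and curvature by Newton iterations (the Laplace / second-order Taylor step), and use them to standardize the local variable.

```python
import numpy as np

def local_mode_and_precision(y_i, eta0, sigma2, iters=50):
    """Laplace step for one scalar local variable b_i in a hypothetical
    Poisson random-intercept model:
        y_ij ~ Poisson(exp(eta0 + b_i)),  b_i ~ N(0, sigma2),
    where eta0 and sigma2 play the role of the global variables.
    Newton iterations locate the conditional mode m_i(theta); the negated
    second derivative of the log conditional gives the precision H_i(theta)."""
    b = 0.0
    for _ in range(iters):
        # d/db log p(b | theta, y_i) and its second derivative
        grad = np.sum(y_i) - len(y_i) * np.exp(eta0 + b) - b / sigma2
        hess = -len(y_i) * np.exp(eta0 + b) - 1.0 / sigma2
        b -= grad / hess  # Newton update on a concave objective
    return b, -hess  # mode m_i, precision H_i > 0

def reparametrize(b, y_i, eta0, sigma2):
    """Invertible affine map b -> sqrt(H_i) * (b - m_i). Under the Gaussian
    approximation from the Taylor expansion, the transformed local variable
    is approximately N(0, 1) regardless of the globals, which is what
    weakens the local-global posterior dependence."""
    m, H = local_mode_and_precision(y_i, eta0, sigma2)
    return np.sqrt(H) * (b - m)
```

Variational Bayes is then run on the transformed local variables, with the mode and precision recomputed as functions of the current global values.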