We consider the problem of estimating functions in a distributed and nonparametric Gaussian regression framework, where the unknown map is modeled as a Gaussian random field whose kernel encodes expected properties such as smoothness. We assume that agents with limited computational and communication capabilities collect M noisy function measurements at input locations independently drawn from a known probability density. Collaboration is then needed to obtain a common, shared estimate. When the number of measurements M is large, computing the minimum variance estimate in a distributed fashion is difficult, since it requires first exchanging all the measurements and then inverting an M × M matrix. A common approach circumvents this problem by searching for a suboptimal solution within a subspace spanned by a finite number of kernel eigenfunctions. In this paper we analyze this classical distributed estimator and derive a rigorous probabilistic bound on its statistical performance, which returns crucial information on the number of measurements and eigenfunctions needed to achieve a desired level of estimation accuracy.
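The contrast between the full minimum variance estimate (an M × M solve) and the eigenfunction-subspace approximation (an E × E solve, E ≪ M) can be sketched as follows. This is an illustrative example, not the paper's method: the kernel, its eigenexpansion (eigenfunctions `phi`, eigenvalues `lam`), the truncation levels `J` and `E`, and the noise level `sigma` are all assumptions chosen so the script is self-contained.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative kernel given through its eigenexpansion on [0, 1] under the
# uniform input density: eigenfunctions phi_j(x) = sqrt(2) sin(j*pi*x),
# eigenvalues lam_j = j^{-2}.  (An assumption, not taken from the paper.)
J = 200                                  # truncation used to synthesize the field
lam = 1.0 / np.arange(1, J + 1) ** 2

def phi(x, n):
    """First n kernel eigenfunctions evaluated at points x -> (len(x), n)."""
    j = np.arange(1, n + 1)
    return np.sqrt(2.0) * np.sin(np.pi * np.outer(x, j))

# Agents collect M noisy measurements of f ~ GP(0, K) at random inputs.
M, sigma = 300, 0.1
x = rng.uniform(0.0, 1.0, M)
c_true = rng.normal(0.0, np.sqrt(lam))   # Karhunen-Loeve coefficients of f
y = phi(x, J) @ c_true + sigma * rng.normal(size=M)

# Minimum variance estimate: requires forming and inverting the full
# M x M kernel matrix, hence all measurements must be exchanged.
K = (phi(x, J) * lam) @ phi(x, J).T
alpha = np.linalg.solve(K + sigma**2 * np.eye(M), y)

# Reduced-rank estimate in the span of the first E eigenfunctions:
# only an E x E system, cheap to solve and to communicate.
E = 20
Phi_E = phi(x, E)
A = Phi_E.T @ Phi_E + sigma**2 * np.diag(1.0 / lam[:E])
c_hat = np.linalg.solve(A, Phi_E.T @ y)

# Compare both estimates against the true field on a test grid.
xt = np.linspace(0.0, 1.0, 400)
f_true = phi(xt, J) @ c_true
f_full = (phi(xt, J) * lam) @ phi(x, J).T @ alpha
f_red = phi(xt, E) @ c_hat
rmse_full = np.sqrt(np.mean((f_full - f_true) ** 2))
rmse_red = np.sqrt(np.mean((f_red - f_true) ** 2))
print("RMSE full  :", rmse_full)
print("RMSE reduced:", rmse_red)
```

The gap between the two errors shrinks as E grows, at the price of a larger system to solve and exchange; quantifying this trade-off between M, E, and accuracy is precisely what the bound in the paper addresses.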