Generative models of probability density are common in descriptive modeling of data, i.e., in summarizing large data sets and exploring the dependencies between their variables. Mixture models are a class of generative models particularly suitable for clustering. If each data sample is assumed to stem from one of a set of sources that generate data independently, it is desirable to model the data with a mixture of ``experts'' corresponding to the sources. The experts cannot learn autonomously, however, since each of them needs to know the contributions of the others. In this paper we introduce a mixture model of autonomous experts that approximates the plain mixture density model by a set of experts that learn autonomously. Each expert stores simplified representations of the other experts and uses these representations to judge the others' contributions; the representations are updated intermittently. During learning, the modeling task is automatically divided and conquered among the experts.
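The scheme described above can be illustrated with a minimal sketch, not the paper's actual algorithm: a one-dimensional Gaussian mixture in which each expert runs EM-style updates on its own parameters, computing responsibilities against frozen snapshots of the other experts rather than their live parameters, and the snapshots are refreshed only intermittently. All names, the synthetic data, and the refresh interval are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic data from two independent Gaussian sources (assumption: 1-D for illustration)
data = np.concatenate([rng.normal(-2.0, 0.5, 200), rng.normal(3.0, 1.0, 200)])

def gauss(x, mu, sigma):
    """Gaussian density, used as each expert's component model."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Each expert holds its own live parameters (mean, std, mixing weight).
experts = [dict(mu=-1.0, sigma=1.0, pi=0.5), dict(mu=1.0, sigma=1.0, pi=0.5)]
# Simplified representations: frozen parameter snapshots of all experts.
snapshots = [dict(e) for e in experts]

for step in range(100):
    for i, e in enumerate(experts):
        # Expert i judges the others' contributions from stored snapshots only,
        # so it needs no live communication with the other experts.
        own = e["pi"] * gauss(data, e["mu"], e["sigma"])
        others = sum(s["pi"] * gauss(data, s["mu"], s["sigma"])
                     for j, s in enumerate(snapshots) if j != i)
        r = own / (own + others + 1e-12)  # responsibility of expert i
        # Autonomous EM-style update of the expert's own parameters.
        e["mu"] = np.sum(r * data) / np.sum(r)
        e["sigma"] = np.sqrt(np.sum(r * (data - e["mu"]) ** 2) / np.sum(r)) + 1e-6
        e["pi"] = np.mean(r)
    # Intermittent update of the stored representations.
    if step % 10 == 9:
        snapshots = [dict(e) for e in experts]

print([round(e["mu"], 2) for e in experts])
```

In this sketch the two experts end up dividing the data between themselves, each converging toward one of the two generating sources, which mirrors the divide-and-conquer behavior the abstract describes.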