Abstract:
An improved estimate of the rate of convergence for nonlinear Markov chains is established. Such processes are nonlinear in the sense of the distribution law; that is, the transition probabilities depend on both the current state and the current probability distribution of the process. The obtained result generalizes existing convergence results by taking the estimate over several steps rather than a single one. An example is provided in which the new result works well, while the existing one is inapplicable.
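For orientation, a minimal sketch of the setting in standard notation (the kernel $P_\mu$, the state space $(E, \mathcal{E})$, and the marginal laws $\mu_n$ below are illustrative symbols, not taken from the paper): the distribution flow of a nonlinear Markov chain satisfies
\[
  \mu_{n+1}(A) \;=\; \int_{E} P_{\mu_n}(x, A)\, \mu_n(dx), \qquad A \in \mathcal{E},
\]
where the one-step kernel $P_\mu(x, A)$ depends on the current state $x$ and on the current distribution $\mu$, so the evolution of $(\mu_n)$ is nonlinear in the measure argument.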