Using overflows to control parallel and distributed computations

Free access

An abstract algebraic notion of overflow is introduced and applied to problems of parallel and distributed computing. Its applicability is illustrated by several examples. This work is the second in a series devoted to memory-free computations and locality. (In Russian.)

ID: 14336128 Short address: https://sciup.org/14336128

Editorial note