to perform the matrix multiplications that form the fundamental computation of neural networks. Wires connect those elements to one another, and to memory, in a grid. Systolic arrays are so named ...
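To make that dataflow concrete, here is a minimal cycle-by-cycle simulation, in Python, of one common variant (an output-stationary array): operands stream in from the left and top edges, hop one processing element per cycle, and every element does one multiply-accumulate per cycle. The grid layout, the skewed input timing, and the function name are illustrative assumptions, not a description of any particular chip.

```python
import numpy as np

def systolic_matmul(A, B):
    """Cycle-by-cycle simulation of an output-stationary systolic array.

    One processing element (PE) sits at each grid position (i, j) and
    accumulates one entry of C = A @ B. Operands from A stream in at the
    left edge and move one PE to the right per cycle; operands from B
    stream in at the top edge and move one PE down per cycle. Every cycle,
    each PE multiplies the pair of values passing through it and adds the
    product to its local accumulator.
    """
    n, K = A.shape
    K2, m = B.shape
    assert K == K2, "inner dimensions must match"

    acc = np.zeros((n, m))     # per-PE accumulators (the eventual result C)
    a_reg = np.zeros((n, m))   # A operand currently sitting in each PE
    b_reg = np.zeros((n, m))   # B operand currently sitting in each PE

    # Row i of A enters i cycles late and column j of B enters j cycles late,
    # so A[i, k] and B[k, j] meet inside PE (i, j) at cycle i + j + k.
    for t in range(n + m + K - 2):
        a_reg = np.roll(a_reg, 1, axis=1)   # shift right (wrapped column is overwritten below)
        b_reg = np.roll(b_reg, 1, axis=0)   # shift down  (wrapped row is overwritten below)
        for i in range(n):                  # feed the left edge with the skewed A stream
            a_reg[i, 0] = A[i, t - i] if 0 <= t - i < K else 0.0
        for j in range(m):                  # feed the top edge with the skewed B stream
            b_reg[0, j] = B[t - j, j] if 0 <= t - j < K else 0.0
        acc += a_reg * b_reg                # one multiply-accumulate per PE per cycle
    return acc

A = np.random.rand(3, 4)
B = np.random.rand(4, 5)
print(np.allclose(systolic_matmul(A, B), A @ B))   # True
```

Note how no operand is ever fetched twice: each value is reused by every processing element it passes through, which is the whole point of the design.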
The Titans architecture complements attention layers with neural memory modules that select the bits of information worth saving for the long term.
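As a rough sketch of that idea, the toy module below stores key-value associations in a single matrix and updates it at test time with a surprise-weighted gradient step plus a small forgetting factor. The linear memory and the fixed learning-rate, momentum, and decay constants are simplifying assumptions for illustration; the published Titans memory is a deeper network with learned, data-dependent gates.

```python
import numpy as np

class NeuralMemory:
    """Toy linear sketch of a surprise-driven long-term memory module.

    The memory is a single matrix M mapping keys to values. At test time it
    is nudged toward each incoming (key, value) pair with a gradient step on
    the squared retrieval error, filtered through a momentum term (the
    accumulated "surprise") and a small forgetting factor. The constants
    below are illustrative choices, not values from the Titans paper.
    """

    def __init__(self, dim, lr=0.1, momentum=0.9, decay=0.01):
        self.M = np.zeros((dim, dim))   # memory parameters
        self.S = np.zeros((dim, dim))   # accumulated surprise (momentum buffer)
        self.lr, self.momentum, self.decay = lr, momentum, decay

    def update(self, key, value):
        """Write the association key -> value, weighted by how surprising it is."""
        err = self.M @ key - value                # retrieval error for this pair
        grad = np.outer(err, key)                 # gradient of 0.5 * ||M k - v||^2 w.r.t. M
        self.S = self.momentum * self.S - self.lr * grad
        self.M = (1.0 - self.decay) * self.M + self.S   # forget a little, then write

    def read(self, key):
        """Retrieve the value currently associated with key."""
        return self.M @ key

mem = NeuralMemory(dim=8)
k, v = np.random.randn(8), np.random.randn(8)
print("retrieval error before:", round(float(np.linalg.norm(mem.read(k) - v)), 3))
for _ in range(200):            # repeated exposure consolidates the association
    mem.update(k, v)
print("retrieval error after: ", round(float(np.linalg.norm(mem.read(k) - v)), 3))
```

The key property this sketch preserves is that pairs producing a large prediction error get written into memory strongly, while unsurprising ones barely change it and old content slowly decays away.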