Daily News
Apple AI Updates: April 24, 2026
1. Apple Research Introduces ParaRNN, a Parallel-Trainable Nonlinear RNN
Apple’s ML research team published ParaRNN, a nonlinear recurrent architecture that can be trained in parallel at transformer-style scales; the paper frames this as the first time nonlinear RNNs have been made practical to train at billion-parameter sizes. The work is of particular interest as a Transformer alternative for long-context regimes, where recurrence keeps per-token state constant and compute linear in sequence length, in contrast to attention’s quadratic cost. Apple has been steadily pushing RNN-adjacent architectures and efficient on-device inference research, and ParaRNN is the most ambitious scaling result in that thread so far. Source
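To give a flavor of why nonlinear RNNs resist parallel training, and how a parallel solver can still recover the sequential result, here is a minimal NumPy sketch. It is not ParaRNN’s actual algorithm (the paper solves the recurrence with Newton-style iterations); this is a simplified Jacobi fixed-point illustration of the same idea, with a toy recurrence h_t = tanh(W h_{t-1} + x_t) invented for the example:

```python
import numpy as np

rng = np.random.default_rng(0)
T, d = 64, 8
# Small weights keep the toy dynamics well-behaved.
W = rng.normal(scale=0.3 / np.sqrt(d), size=(d, d))
x = rng.normal(size=(T, d))

def sequential(x):
    """Standard RNN rollout: inherently serial in t."""
    h = np.zeros(d)
    out = []
    for t in range(T):
        h = np.tanh(W @ h + x[t])
        out.append(h)
    return np.stack(out)

def parallel_fixed_point(x, iters=T):
    """Treat all T states as unknowns of h = f(h) and iterate jointly.

    Each sweep updates every timestep at once (parallel across t).
    Correct values propagate at least one step per sweep, so T sweeps
    are guaranteed exact; contractive dynamics converge much faster,
    and ParaRNN's Newton approach converges faster still.
    """
    h = np.zeros((T, d))
    for _ in range(iters):
        prev = np.vstack([np.zeros((1, d)), h[:-1]])  # h_{t-1} for all t
        h = np.tanh(prev @ W.T + x)
    return h

h_seq = sequential(x)
h_par = parallel_fixed_point(x)
assert np.allclose(h_seq, h_par)
```

The key property is that each sweep is a batched operation over all timesteps, so on parallel hardware the wall-clock cost is set by the number of sweeps, not the sequence length.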