Infinite Memory Transformer: Attending to Arbitrarily Long Contexts Without Increasing Computation Burden
Source: syncedreview.com
Researchers from Instituto de Telecomunicações, DeepMind, the Institute of Systems and Robotics, Instituto Superior Técnico, and Unbabel propose the "∞-former" — a transformer model with an unbounded long-term memory (LTM) that can attend to arbitrarily long contexts without increasing its computation budget.