Google Trains a 540B Parameter Language Model With Pathways, Achieving ‘Breakthrough Performance’ | Synced

Source: Synced | AI Technology & Industry Review

A Google Research team further explores the scaling approach to improving language modelling, leveraging the new Pathways distributed ML system to train the Pathways Language Model (PaLM), a 540 billion parameter autoregressive transformer that achieves state-of-the-art few-shot performance.