Meet ALBERT: a new ‘Lite BERT’ from Google & Toyota with State of the Art NLP performance and 18x fewer parameters.

Less Wright
7 min read · Sep 28, 2019

TL;DR = your previous NLP models are parameter-inefficient and kind of obsolete. Have a great day.

[*Updated November 6 with ALBERT 2.0 and the official source code release]

Google Research and Toyota Technological Institute have jointly released a new paper that introduces the world to what is arguably BERT's successor: a much smaller, smarter Lite BERT called ALBERT. (“ALBERT: A Lite BERT for Self-supervised Learning of Language Representations”.)
