Digital transformation is exciting, and as a solution provider, we get it. It is no wonder we idealize the digital era; it is almost like the American Dream: those who believed in it believed that anything was possible.
In 2018, Google published BERT, a large-scale language model built on bidirectional, transformer-based pre-training. It set new state-of-the-art results on 11 Natural Language Processing benchmarks, bringing great excitement to the NLP field.
Very quickly, BERT spread like wildfire through the research community, and derivative research work began to emerge.
While the shockwaves BERT created have yet to subside, yet another brand-new model emerged today.