XLNet — A new pre-training method outperforming BERT on 20 tasks

Posted by Max Xu on 20 June 2019

In 2018, Google published BERT, a bidirectional, transformer-based pre-training method for large-scale language models, which broke 11 state-of-the-art records in Natural Language Processing and generated great excitement across the NLP field.

BERT quickly spread like wildfire through the research community, and derivative research work began to emerge.

While the shockwaves BERT created have yet to settle, a brand new model emerged today.

Read More

Is Your Company Ready to Deploy Virtual Assistants (Chatbots)?

Posted by Lee Hwee Teo on 15 May 2019

Synopsis

Enterprise chatbots are commonly known as virtual assistants and are essentially automated virtual employees. They make use of conversational Artificial Intelligence (AI), including Natural Language Processing (NLP), which has advanced tremendously in the last two years. This article details the most common use cases and roadblocks in deploying enterprise chatbots.

Read More

How Should Bots Speak to Humans?

Posted by Spencer Yang on 10 November 2016

BotSpeak database — 1 human, 200+ bots, 892 bot dialogs

TL;DR — Designing great bot dialog is tough but crucial for increasing engagement. We talked to over 200 Facebook Messenger bots and extracted their conversation dialogs into a reference database: BotSpeak. In evaluating the bots' output, we took a purely data-driven approach to lexical improvements and offer anecdotal advice on better presenting bot conversations.

How do you feel about your conversations with bots so far? Are they engaging, disappointing, or interesting?

Read More