XLNet is a generalized autoregressive pretraining method for language understanding. Unlike standard autoregressive (AR) language models, which factorize a sequence in a fixed order and therefore condition only on past (or only on future) context, XLNet maximizes the expected log-likelihood of a sequence over all possible permutations of the factorization order, so every position can learn from bidirectional context. This combines the strengths of AR language modeling and autoencoding (AE) approaches such as BERT while avoiding their respective limitations: the unidirectional context of AR models and the pretrain-finetune discrepancy introduced by AE masking. XLNet is commonly used for tasks like text classification, question answering, and natural language inference.
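As a rough illustration of that objective, the sketch below enumerates all factorization orders of a toy three-token sequence and averages the autoregressive log-likelihood under each order. The function `cond_logprob` is a hypothetical placeholder standing in for a trained model's conditional log-probability; in practice XLNet approximates the expectation by sampling permutations rather than enumerating them, and implements the order-dependent conditioning with two-stream self-attention.

```python
import itertools
import math

# Hypothetical placeholder for a real model's log p(token | context).
# Here it just returns a uniform log-probability over a 10,000-word vocabulary.
def cond_logprob(token, context):
    return -math.log(10_000)

x = ["The", "cat", "sat"]   # toy sequence
T = len(x)

# Expected log-likelihood over all T! factorization orders:
# E_{z ~ Z_T} [ sum_t log p(x_{z_t} | x_{z_<t}) ]
perms = list(itertools.permutations(range(T)))
total = 0.0
for z in perms:
    log_likelihood = 0.0
    for t in range(T):
        target = x[z[t]]
        context = [x[i] for i in z[:t]]  # tokens that come earlier in this order
        log_likelihood += cond_logprob(target, context)
    total += log_likelihood

expected_ll = total / len(perms)
print(f"Expected log-likelihood over {len(perms)} orders: {expected_ll:.2f}")
```

Because every token eventually appears as a prediction target with every possible subset of the other tokens as context, the model sees bidirectional context without ever inserting artificial mask tokens.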