GPT-2

Generative Pre-trained Transformer 2, commonly known by its abbreviated form GPT-2, is an unsupervised transformer language model and the successor to GPT. GPT-2 was first announced in February 2019, with only limited demonstrative versions initially released to the public. The full version of GPT-2 was not immediately released out of concern over potential misuse, including applications for writing fake news.[51] Some experts expressed skepticism that GPT-2 posed a significant threat. The Allen Institute for Artificial Intelligence responded to GPT-2 with a tool to detect "neural fake news".[52] Other researchers, such as Jeremy Howard, warned of "the technology to totally fill Twitter, email, and the web up with reasonable-sounding, context-appropriate prose, which would drown out all other speech and be impossible to filter".[53] In November 2019, OpenAI released the complete version of the GPT-2 language model.[54] Several websites host interactive demonstrations of different instances of GPT-2 and other transformer models.
