(2022-07-01) Transformers: The Rise and Rise of Hugging Face

Ruchin Kulkarni: Transformers: The rise and rise of Hugging Face. “The original agreement on the European Economic Area was signed in August 1992.” Let’s try translating that sentence to French. This is an example that researchers at Google Brain used to illustrate the magic of the “transformer” model in a 2017 paper titled “Attention Is All You Need” - the greatest thing to happen to Natural Language Processing/Understanding (NLP/U) since sliced bread.

Understanding the underlying meaning and context of an input is a pretty powerful thing, and is the secret sauce powering all the cutting-edge NLP-powered tools you’re likely to encounter today.

Today, if you want to leverage the power of transformers and don’t have the processing power of Google to build them from scratch (we’ll pause while you check), you can - thanks to three Europeans who open-sourced the transformers library (and many others too, btw) and are well on their way to democratizing machine learning: Clem Delangue, Julien Chaumond, and Thomas Wolf, co-founders of 🤗 Hugging Face.
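In fact, the paper’s example sentence from the top of this piece now translates in a few lines. A minimal sketch, assuming the publicly hosted Helsinki-NLP/opus-mt-en-fr checkpoint (any compatible English-to-French model would do):

```python
# Minimal sketch: translate the paper's example sentence with the
# transformers pipeline API. The checkpoint name is an illustrative,
# publicly hosted English-to-French model, not the only option.
from transformers import pipeline

translator = pipeline("translation_en_to_fr", model="Helsinki-NLP/opus-mt-en-fr")

result = translator(
    "The original agreement on the European Economic Area was signed in August 1992."
)
print(result[0]["translation_text"])
# e.g. "L'accord initial sur l'Espace économique européen a été signé en août 1992."
```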

Bit by the ML bug, Clem Delangue’s work on a collaborative note-taking app idea connected him with a fellow entrepreneur building a collaborative e-book reader - Julien Chaumond.

The duo met with Chaumond’s friend from college, Thomas Wolf, who was by then active in ML research, and together they set out to build an “open-domain conversational AI” - the sort of chatbot that features in the movie “Her.”

Julien says that the chatbot was an excuse for the early team to dive into state-of-the-art NLP and the bleeding-edge research of the time. It was an early runaway success.

Accuracy improvements in the Hugging Face bot’s responses didn’t seem to correlate with growth or retention.

Around two years later, the “Attention Is All You Need” paper marked the beginning of the age of transformers. Hugging Face, which had already released parts of the powerful library powering its chatbot as an open-source project on GitHub, open-sourced the hot new thing in NLP and made it available to the community.

On May 7, 2022, they raised $100 million in Series C funding at a $2B valuation.

Since pivoting away from the chatbot, Hugging Face has been on a mission to advance and democratize artificial intelligence through open source and open science.

With roughly 100,000 pre-trained machine-learning models and just under 10,000 datasets currently hosted on the platform, Hugging Face enables the community and 10,000+ companies, including Grammarly and Chegg, to build their own NLP capabilities.
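In practice, “building on the platform” usually means pulling a checkpoint and a dataset by name. A hedged sketch, using the publicly hosted bert-base-uncased model and the imdb dataset as illustrative stand-ins:

```python
# Sketch: fetch a pre-trained model and a public dataset from the Hub by name.
# "bert-base-uncased" and "imdb" are illustrative public examples, not the
# only choices.
from transformers import AutoTokenizer, AutoModelForSequenceClassification
from datasets import load_dataset

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased")

dataset = load_dataset("imdb", split="train")
print(dataset[0]["text"][:100])  # peek at the first training example
```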

Even in the early days of Hugging Face, the founders were quick to notice that the community of people interested in “large language models applied to text” was densely connected. In open-sourcing their early libraries, they stumbled upon a handful of super-users in the community.

Hugging Face aspires to build the #1 community for machine learning.

Hugging Face taps into some key community dynamics that drive engagement and growth. Chief among them is the Hugging Face Hub. The team started building the Hub when they saw the need for a platform where users of the transformers and datasets libraries could easily share their models and datasets.
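The sharing workflow the Hub enables is roughly a one-liner per artifact. A sketch, assuming you have a Hugging Face account and are logged in; “my-finetuned-model” is a hypothetical repository name:

```python
# Sketch: publish a local model and tokenizer to the Hub so others can load
# them by name. Requires authentication (e.g. `huggingface-cli login`);
# "my-finetuned-model" is a hypothetical repository name.
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model = AutoModelForSequenceClassification.from_pretrained("distilbert-base-uncased")
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")

model.push_to_hub("my-finetuned-model")      # creates <your-username>/my-finetuned-model
tokenizer.push_to_hub("my-finetuned-model")
```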

CEO and co-founder Clem Delangue believes that with open-source models, Hugging Face can harness the power of a community to do things differently - “deliver a 1000 times more value” than a proprietary tool.

For Hugging Face, monetization is still an early play - they launched their paid offerings just last year and already count 1,000+ companies as customers, including Intel, eBay, Pfizer, and Roche.

Advances in transfer learning mean that transformers are effective not just in NLP, but in other domains as well.

Transformers evolving into a general-purpose architecture for speech, computer vision, and even protein structure prediction stands Hugging Face in good stead - at the intersection of overlapping domains.
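That stretch across modalities is already visible in the same pipeline API. A sketch, assuming the publicly hosted google/vit-base-patch16-224 Vision Transformer checkpoint and a placeholder image path:

```python
# Sketch: the pipeline API driving a Vision Transformer instead of a
# language model. "path/to/photo.jpg" is a placeholder; a URL or PIL
# image works too.
from transformers import pipeline

classifier = pipeline("image-classification", model="google/vit-base-patch16-224")
predictions = classifier("path/to/photo.jpg")
print(predictions[0])  # e.g. {'label': 'Egyptian cat', 'score': 0.98}
```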

