
Julien Simon

Originally published at julsimon.Medium

Talk @ Databricks Data and AI Summit 2022: "Machine Learning Hyper-Productivity with Transformers and Hugging Face"

According to the latest State of AI report, “transformers have emerged as a general-purpose architecture for ML. Not just for Natural Language Processing, but also Speech, Computer Vision or even protein structure prediction.” Indeed, the Transformer architecture has proven highly effective across a wide variety of Machine Learning tasks. But how can we keep up with the frantic pace of innovation? Do we really need expert skills to leverage these state-of-the-art models? Or is there a shorter path to creating business value in less time?

In this code-level talk, we’ll gradually build and deploy a demo involving several Transformer models. Along the way, you’ll learn about the portfolio of open-source and commercial Hugging Face solutions, and how they can help you become hyper-productive and deliver high-quality Machine Learning solutions faster than ever before.
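To give you a taste of the shorter path the talk explores, here is a minimal sketch (not code from the talk itself) that runs a pretrained Transformer with the Hugging Face `pipeline` API; the example sentence is purely illustrative, and the default model is whatever the library selects for the task.

```python
# Minimal sketch: inference with a pretrained Transformer via the pipeline API.
# Requires: pip install transformers torch
from transformers import pipeline

# Load a default pretrained sentiment-analysis model (downloaded on first use).
classifier = pipeline("sentiment-analysis")

# Run inference on an illustrative sentence.
print(classifier("Transformers make state-of-the-art ML surprisingly accessible."))
# Example output: [{'label': 'POSITIVE', 'score': 0.99...}]
```

A few lines like these are all it takes to get a state-of-the-art model running, which is exactly the kind of productivity gain the talk walks through at code level.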

New to Transformers?
