
Heiko Hotz for AWS


NLP@AWS Newsletter 02/2022


Hello world. This is the monthly AWS Natural Language Processing (NLP) newsletter covering everything related to NLP at AWS. Feel free to leave comments and share it on your social networks.

NLP@AWS Customer Success Story

Measuring customer sentiment in call centres is a huge challenge, especially for large organisations. Accurate call transcripts can help unlock insights such as sentiment, trending issues, and agent effectiveness at resolving calls in call centres.

Wix.com expanded visibility into customer conversation sentiment by using Amazon Transcribe, a speech-to-text service, to develop a sentiment analysis system that can effectively determine how users feel throughout an interaction with customer care agents.

Learn more about this AWS customer success story in this blog post: https://aws.amazon.com/blogs/machine-learning/how-wix-empowers-customer-care-with-ai-capabilities-using-amazon-transcribe/
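If you want to experiment with a similar pipeline yourself, here is a minimal sketch of the general idea: transcribe a call recording with Amazon Transcribe and run the transcript through Amazon Comprehend for sentiment. The bucket name, object key, and job name are placeholders, and this is an illustration of the services involved rather than the architecture Wix built.

```python
import json
import time
import urllib.request

import boto3

# Placeholder inputs -- replace with your own bucket, recording, and job name
BUCKET = "my-call-recordings-bucket"
RECORDING_KEY = "calls/example-call.wav"
JOB_NAME = "example-call-transcription"

transcribe = boto3.client("transcribe")
comprehend = boto3.client("comprehend")

# 1. Kick off an asynchronous transcription job for the call recording
transcribe.start_transcription_job(
    TranscriptionJobName=JOB_NAME,
    Media={"MediaFileUri": f"s3://{BUCKET}/{RECORDING_KEY}"},
    MediaFormat="wav",
    LanguageCode="en-US",
)

# 2. Poll until the job finishes (an event-driven trigger is nicer in production)
while True:
    job = transcribe.get_transcription_job(TranscriptionJobName=JOB_NAME)
    status = job["TranscriptionJob"]["TranscriptionJobStatus"]
    if status in ("COMPLETED", "FAILED"):
        break
    time.sleep(10)

# 3. Fetch the transcript and run sentiment analysis on it
transcript_uri = job["TranscriptionJob"]["Transcript"]["TranscriptFileUri"]
with urllib.request.urlopen(transcript_uri) as response:
    results = json.loads(response.read())
transcript_text = results["results"]["transcripts"][0]["transcript"]

# Comprehend's sync API accepts up to 5,000 bytes per request
sentiment = comprehend.detect_sentiment(Text=transcript_text[:5000], LanguageCode="en")
print(sentiment["Sentiment"], sentiment["SentimentScore"])
```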


AWS AI Language Services

How to approach conversation design with Amazon Lex
In this blog post, you will learn how to draft an interaction model to deliver natural conversational experiences, and how to test and tune your application.
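Once you have drafted an interaction model, you can exercise it programmatically. The snippet below is a minimal sketch that sends a test utterance to a Lex V2 bot through the runtime API; the bot ID, alias ID, and utterance are placeholders for your own bot.

```python
import boto3

# Placeholder identifiers -- replace with your own Lex V2 bot details
BOT_ID = "EXAMPLEBOTID"
BOT_ALIAS_ID = "TSTALIASID"
LOCALE_ID = "en_US"

lex_runtime = boto3.client("lexv2-runtime")

# Send a single test utterance and inspect how the bot interprets it
response = lex_runtime.recognize_text(
    botId=BOT_ID,
    botAliasId=BOT_ALIAS_ID,
    localeId=LOCALE_ID,
    sessionId="conversation-design-test-1",
    text="I'd like to book a hotel room for next Friday",
)

# Print what the bot says back to the user
for message in response.get("messages", []):
    print("Bot:", message["content"])

# The session state shows which intent matched and which slots were filled
intent = response["sessionState"]["intent"]
print("Matched intent:", intent["name"], "| slots:", intent.get("slots"))
```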


NLP on Amazon SageMaker

NLP models can be an extremely effective tool for extracting information from unstructured text data. When the data that is used for inference (production data) differs from the data used during model training, we encounter a phenomenon known as data drift. When data drift occurs, the model is no longer relevant to the data in production and likely performs worse than expected. It’s important to continuously monitor the inference data and compare it to the data used during training.
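SageMaker Model Monitor is one way to automate that comparison: capture requests from an endpoint, compute a baseline from the training data, and schedule recurring drift checks. The sketch below outlines that flow under the assumption that the endpoint was deployed with data capture enabled; the S3 paths, endpoint name, and schedule are placeholders.

```python
from sagemaker import get_execution_role
from sagemaker.model_monitor import CronExpressionGenerator, DefaultModelMonitor
from sagemaker.model_monitor.dataset_format import DatasetFormat

# Placeholder values -- replace with your own role, endpoint, and S3 locations
role = get_execution_role()
ENDPOINT_NAME = "my-nlp-endpoint"  # endpoint deployed with data capture enabled
BASELINE_DATA = "s3://my-bucket/train/baseline.csv"
BASELINE_OUTPUT = "s3://my-bucket/monitoring/baseline"
MONITOR_OUTPUT = "s3://my-bucket/monitoring/reports"

monitor = DefaultModelMonitor(
    role=role,
    instance_count=1,
    instance_type="ml.m5.xlarge",
    volume_size_in_gb=20,
    max_runtime_in_seconds=3600,
)

# 1. Compute statistics and constraints from the training data
monitor.suggest_baseline(
    baseline_dataset=BASELINE_DATA,
    dataset_format=DatasetFormat.csv(header=True),
    output_s3_uri=BASELINE_OUTPUT,
)

# 2. Compare captured inference data against the baseline every hour
monitor.create_monitoring_schedule(
    monitor_schedule_name="nlp-data-drift-monitor",
    endpoint_input=ENDPOINT_NAME,
    output_s3_uri=MONITOR_OUTPUT,
    statistics=monitor.baseline_statistics(),
    constraints=monitor.suggested_constraints(),
    schedule_cron_expression=CronExpressionGenerator.hourly(),
)
```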


Hugging Face has been working closely with SageMaker to deliver ready-to-use Deep Learning Containers (DLCs) that make training and deploying the latest Transformers models easier and faster than ever. Because features such as SageMaker Data Parallel (SMDP), SageMaker Model Parallel (SMMP), and S3 pipe mode are integrated into the containers, using them drastically reduces the time it takes companies to create Transformers-based ML solutions such as question answering, text and image generation, search optimization, customer support automation, conversational interfaces, semantic search, and document analysis.

In this post, we focus on the deep integration of SageMaker distributed libraries with Hugging Face, which enables data scientists to accelerate training and fine-tuning of Transformers models from days to hours, all in SageMaker.
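As a rough illustration of what that integration looks like in code, here is a sketch of launching a fine-tuning job with the Hugging Face DLC and SageMaker Data Parallel enabled. The training script, hyperparameters, data locations, and framework versions are assumptions you would adapt to your own use case.

```python
from sagemaker import get_execution_role
from sagemaker.huggingface import HuggingFace

role = get_execution_role()

# Enable the SageMaker Data Parallel (SMDP) library for distributed training
distribution = {"smdistributed": {"dataparallel": {"enabled": True}}}

# Placeholder hyperparameters for a fine-tuning script you provide (train.py)
hyperparameters = {
    "model_name_or_path": "distilbert-base-uncased",
    "epochs": 3,
    "train_batch_size": 32,
}

huggingface_estimator = HuggingFace(
    entry_point="train.py",          # your training script
    source_dir="./scripts",          # directory containing the script
    instance_type="ml.p3.16xlarge",  # SMDP requires multi-GPU instances
    instance_count=2,
    role=role,
    transformers_version="4.17",     # example DLC versions -- check the docs
    pytorch_version="1.10",
    py_version="py38",
    hyperparameters=hyperparameters,
    distribution=distribution,
)

# Start training; channel names map to S3 locations your script reads from
huggingface_estimator.fit(
    {"train": "s3://my-bucket/train", "test": "s3://my-bucket/test"}
)
```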


NLP@AWS Community Content

In October and November, AWS & Hugging Face held a workshop series on “Enterprise-Scale NLP with Hugging Face & Amazon SageMaker”. The workshop series consisted of three parts and covered:

  1. Getting Started with Amazon SageMaker: Training your first NLP Transformer model with Hugging Face and deploying it
  2. Going Production: Deploying, Scaling & Monitoring Hugging Face Transformer models with Amazon SageMaker
  3. MLOps: End-to-End Hugging Face Transformers with the Hub & SageMaker Pipelines

The workshops have been recorded and the resources are available on GitHub, so you can now work through the whole series on your own to enhance your Hugging Face Transformers skills with Amazon SageMaker.

YouTube Playlist: Hugging Face SageMaker Playlist
GitHub Repository: huggingface-sagemaker-workshop-series
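If you want a quick taste of part 2 before diving into the videos, the sketch below deploys a model from the Hugging Face Hub to a SageMaker endpoint with the Hugging Face inference DLC. The model ID, framework versions, and instance type are assumptions for illustration, not the exact workshop code.

```python
from sagemaker import get_execution_role
from sagemaker.huggingface import HuggingFaceModel

role = get_execution_role()

# Pull a model straight from the Hugging Face Hub (placeholder model and task)
hub_config = {
    "HF_MODEL_ID": "distilbert-base-uncased-finetuned-sst-2-english",
    "HF_TASK": "text-classification",
}

huggingface_model = HuggingFaceModel(
    env=hub_config,
    role=role,
    transformers_version="4.17",  # example DLC versions -- check the docs
    pytorch_version="1.10",
    py_version="py38",
)

# Deploy to a real-time endpoint and send a test request
predictor = huggingface_model.deploy(
    initial_instance_count=1,
    instance_type="ml.m5.xlarge",
)

print(predictor.predict({"inputs": "I love using SageMaker with Hugging Face!"}))

# Clean up the endpoint when you are done
predictor.delete_endpoint()
```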


Stay in touch with NLP on AWS

Our contact: aws-nlp@amazon.com
Email us to (1) tell us about your awesome NLP-on-AWS projects, (2) let us know which post in the newsletter helped your NLP journey, or (3) suggest other topics you would like us to cover in the newsletter. Talk to you soon.
