Day 4: Step-by-step Guide to Language Service in Microsoft Azure AI Services - 2

Now that we have created the Language service resource in the last post, in this post we are going to try out the Language service in the Visual Studio Code IDE.

If you don't already have Visual Studio Code and Git installed, install them before starting with these posts.


Integrating Azure AI-Language Functionality

You'll develop your text analytics app using Visual Studio Code. The code files for your app have been provided in a GitHub repo.

Tip: If you have already cloned the mslearn-ai-language repo, open it in Visual Studio Code. Otherwise, follow these steps to clone it to your development environment.

  1. Open Git Bash and type the following command: git clone https://github.com/MicrosoftLearning/mslearn-ai-language

Check out the project GitHub Repo

  2. When the repository has been cloned, open the folder in Visual Studio Code.

  3. In Visual Studio Code, in the Explorer pane, browse to the Labfiles/01-analyze-text folder, expand the CSharp or Python folder depending on your language preference, and then expand the text-analysis folder it contains. Each folder contains the language-specific files for an app into which you're going to integrate Azure AI-Language text analytics functionality.

  4. Right-click the text-analysis folder containing your code files, open an integrated terminal, and enter the following command:

C#:

dotnet add package Azure.AI.TextAnalytics --version 5.3.0
Python:

pip install azure-ai-textanalytics==5.3.0
pip install python-dotenv

  5. In the Explorer pane, in the text-analysis folder, open the configuration file for your preferred language:
  • C#: appsettings.json
  • Python: .env

  6. Update the configuration values to include the endpoint and a key from the Azure Language resource you created (available on the Keys and Endpoint page for your Azure AI-Language resource in the Azure portal).

  7. Save the configuration file. A quick way to sanity-check these values from Python is shown below.
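Optionally, before running the full app, you can verify that the configuration values load correctly. The following is a minimal Python sketch, assuming the .env file uses setting names such as AI_SERVICE_ENDPOINT and AI_SERVICE_KEY; check the names already present in your copy of the file and adjust if they differ.

Python:

# Quick sanity check: confirm the .env values are picked up
import os
from dotenv import load_dotenv

load_dotenv()
ai_endpoint = os.getenv('AI_SERVICE_ENDPOINT')  # assumed setting name; match your .env file
ai_key = os.getenv('AI_SERVICE_KEY')            # assumed setting name; match your .env file
print('Endpoint configured:', bool(ai_endpoint))
print('Key configured:', bool(ai_key))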


Note that the text-analysis folder contains a code file for the client application:

C#: Program.cs
Python: text-analysis.py

Open the code file and at the top, under the existing namespace references, find the comment Import namespaces. Then, under this comment, add the following language-specific code to import the namespaces you will need to use the Text Analytics SDK:

C#: Program.cs

// import namespaces
using Azure;
using Azure.AI.TextAnalytics;

Python: text-analysis.py

# import namespaces
from azure.core.credentials import AzureKeyCredential
from azure.ai.textanalytics import TextAnalyticsClient


In the Main function, note that code to load the Azure AI Language service endpoint and key from the configuration file has already been provided. Then find the comment Create client using endpoint and key, and add the following code to create a client for the Text Analysis API:

C#: Program.cs

// Create client using endpoint and key
AzureKeyCredential credentials = new AzureKeyCredential(aiSvcKey);
Uri endpoint = new Uri(aiSvcEndpoint);
TextAnalyticsClient aiClient = new TextAnalyticsClient(endpoint, credentials);

Python: text-analysis.py

# Create client using endpoint and key
credential = AzureKeyCredential(ai_key)
ai_client = TextAnalyticsClient(endpoint=ai_endpoint, credential=credential)

Save your changes, return to the integrated terminal for the text-analysis folder, and enter the following command to run the program:

C#: dotnet run
Python: python text-analysis.py

Observe the output; the code should run without error, displaying the contents of each review text file in the reviews folder. The application successfully creates a client for the Text Analytics API but doesn't make use of it yet. We'll fix that in the next procedure.


Add code to detect language

Now that you have created a client for the API, let's use it to detect the language in which each review is written.

In the Main function for your program, find the comment Get language. Then, under this comment, add the code necessary to detect the language in each review document:

C#: Program.cs

// Get language
DetectedLanguage detectedLanguage = aiClient.DetectLanguage(text);
Console.WriteLine($"\nLanguage: {detectedLanguage.Name}");

Python: text-analysis.py

# Get language
detectedLanguage = ai_client.detect_language(documents=[text])[0]
print('\nLanguage: {}'.format(detectedLanguage.primary_language.name))

Note: In this example, each review is analyzed individually, resulting in a separate call to the service for each file. An alternative approach is to create a collection of documents and pass them to the service in a single call. In both approaches, the response from the service consists of a collection of documents, which is why in the Python code above the index of the first (and only) document in the response ([0]) is specified.
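To illustrate the batched approach described in the note, you could collect several texts into a list and make a single call. This is a minimal Python sketch that reuses the ai_client created earlier; the review strings are hypothetical.

Python:

# Batched approach: send several documents to the service in one request
reviews = ["The hotel was wonderful.", "El servicio fue terrible."]  # hypothetical review texts
results = ai_client.detect_language(documents=reviews)
for review, result in zip(reviews, results):
    if not result.is_error:
        print('{} -> {}'.format(review, result.primary_language.name))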

Save your changes. Then return to the integrated terminal for the text-analysis folder, and re-run the program.

Observe the output, noting that this time the language for each review is identified.


Add code to evaluate sentiment

Sentiment analysis is a commonly used technique to classify text as positive or negative (or possibly neutral or mixed). It's often used to analyze social media posts, product reviews, and other items where the sentiment of the text may provide useful insights.

In the Main function for your program, find the comment Get sentiment. Then, under this comment, add the code necessary to detect the sentiment of each review document:

C#: Program.cs

// Get sentiment
DocumentSentiment sentimentAnalysis = aiClient.AnalyzeSentiment(text);
Console.WriteLine($"\nSentiment: {sentimentAnalysis.Sentiment}");

Python: text-analysis.py

# Get sentiment
sentimentAnalysis = ai_client.analyze_sentiment(documents=[text])[0]
print("\nSentiment: {}".format(sentimentAnalysis.sentiment))

Save your changes. Then return to the integrated terminal for the text-analysis folder, and re-run the program.

Observe the output, noting that the sentiment of the reviews is detected.
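If you want more detail than the overall label, the response also exposes per-class confidence scores and per-sentence sentiment. The following is a minimal Python sketch, assuming the same ai_client and text variables used in the step above.

Python:

# Inspect overall confidence scores and per-sentence sentiment
result = ai_client.analyze_sentiment(documents=[text])[0]
scores = result.confidence_scores
print('Positive: {:.2f}, Neutral: {:.2f}, Negative: {:.2f}'.format(
    scores.positive, scores.neutral, scores.negative))
for sentence in result.sentences:
    print('\t"{}" -> {}'.format(sentence.text, sentence.sentiment))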


Add code to identify key phrases

It can be useful to identify key phrases in a body of text to help determine the main topics that it discusses.

In the Main function for your program, find the comment Get key phrases. Then, under this comment, add the code necessary to detect the key phrases in each review document:

C#: Program.cs

// Get key phrases
KeyPhraseCollection phrases = aiClient.ExtractKeyPhrases(text);
if (phrases.Count > 0)
{
    Console.WriteLine("\nKey Phrases:");
    foreach(string phrase in phrases)
    {
        Console.WriteLine($"\t{phrase}");
    }
}

Python: text-analysis.py

# Get key phrases
phrases = ai_client.extract_key_phrases(documents=[text])[0].key_phrases
if len(phrases) > 0:
    print("\nKey Phrases:")
    for phrase in phrases:
        print('\t{}'.format(phrase))

Save your changes. Then return to the integrated terminal for the text-analysis folder, and re-run the program.

Observe the output, noting that each document contains key phrases that give some insights into what the review is about.

Add code to extract entities

Often, documents or other bodies of text mention people, places, time periods, or other entities. The Text Analytics API can detect multiple categories (and subcategories) of entity in your text.

In the Main function for your program, find the comment Get entities. Then, under this comment, add the code necessary to identify entities that are mentioned in each review:

C#: Program.cs

// Get entities
CategorizedEntityCollection entities = aiClient.RecognizeEntities(text);
if (entities.Count > 0)
{
    Console.WriteLine("\nEntities:");
    foreach(CategorizedEntity entity in entities)
    {
        Console.WriteLine($"\t{entity.Text} ({entity.Category})");
    }
}

Python: text-analysis.py

# Get entities
entities = ai_client.recognize_entities(documents=[text])[0].entities
if len(entities) > 0:
    print("\nEntities")
    for entity in entities:
        print('\t{} ({})'.format(entity.text, entity.category))

Save your changes. Then return to the integrated terminal for the text-analysis folder, and re-run the program.

Observe the output, noting the entities that have been detected in the text.
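Each recognized entity also carries a confidence score and, for some categories, a subcategory. If you want to surface that extra detail, here is a small Python sketch along the same lines as the code above, assuming the same ai_client and text variables.

Python:

# Show subcategory and confidence score for each recognized entity
entities = ai_client.recognize_entities(documents=[text])[0].entities
for entity in entities:
    # entity.subcategory may be None for categories without subcategories
    print('\t{} ({}/{}) confidence: {:.2f}'.format(
        entity.text, entity.category, entity.subcategory, entity.confidence_score))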

Add code to extract linked entities

In addition to categorized entities, the Text Analytics API can detect entities for which there are known links to data sources, such as Wikipedia.

In the Main function for your program, find the comment Get linked entities. Then, under this comment, add the code necessary to identify linked entities that are mentioned in each review:

C#: Program.cs

// Get linked entities
LinkedEntityCollection linkedEntities = aiClient.RecognizeLinkedEntities(text);
if (linkedEntities.Count > 0)
{
    Console.WriteLine("\nLinks:");
    foreach(LinkedEntity linkedEntity in linkedEntities)
    {
        Console.WriteLine($"\t{linkedEntity.Name} ({linkedEntity.Url})");
    }
}

Python: text-analysis.py

# Get linked entities
entities = ai_client.recognize_linked_entities(documents=[text])[0].entities
if len(entities) > 0:
    print("\nLinks")
    for linked_entity in entities:
        print('\t{} ({})'.format(linked_entity.name, linked_entity.url))

Save your changes. Then return to the integrated terminal for the text-analysis folder, and re-run the program.

Observe the output, noting the linked entities that are identified.
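Each linked entity also includes its data source (such as Wikipedia) and the specific matches found in the text, each with its own confidence score. The following is a short Python sketch, again assuming the same ai_client and text variables.

Python:

# Show the data source and matched text spans for each linked entity
linked = ai_client.recognize_linked_entities(documents=[text])[0].entities
for linked_entity in linked:
    print('\t{} [{}]'.format(linked_entity.name, linked_entity.data_source))
    for match in linked_entity.matches:
        print('\t\t"{}" confidence: {:.2f}'.format(match.text, match.confidence_score))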
