
Ayaka Hara


Ingest data from Azure Table Storage into Data Explorer

LINQ allows us to query multiple entities from Azure Table Storage. However, a single query with the LINQ Take operator returns at most 1,000 entities (MS doc), so you may need to write extra code to retrieve everything you want.

This blog illustrates how to ingest data from Table Storage into Data Explorer via Data Factory to prepare to query large numbers of entities with Kusto.

Step 1: Create Azure Data Explorer

  1. Create an Azure Data Explorer cluster
  2. Go to the resource and click "Create database"
  3. Create Azure Data Explorer Database (e.g. Database name: loadtest)
  4. Click the database (e.g. "loadtest") and select "Query"
  5. Click "Ingest new data"
  6. Create a table in the database and ingest the data. Note: the table name must not include a dash ("-")
  7. Click "Edit schema"
  8. Select "Ignore the first record"
  9. Make sure all data types are correct
  10. Click "Start ingestion". Note: copy the mapping name, as you will need it later in Data Factory
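Behind the scenes, the ingestion wizard issues Kusto management commands to create the table and the mapping. A hand-written sketch of the equivalent commands (the table name, columns, and mapping name below are assumptions for illustration):

```
// Create the destination table (columns are illustrative)
.create table loadtest_results (Timestamp: datetime, TestName: string, Duration: real)

// Register a CSV ingestion mapping; note its name for the Data Factory Sink settings
.create table loadtest_results ingestion csv mapping "loadtest_results_mapping" '[{"column":"Timestamp","Properties":{"Ordinal":"0"}},{"column":"TestName","Properties":{"Ordinal":"1"}},{"column":"Duration","Properties":{"Ordinal":"2"}}]'
```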

Step 2: Create Azure Data Factory

  1. Create Azure Data Factory

Step 3: Prepare Azure Active Directory

  1. Go to Azure Active Directory
  2. Click "App registrations" and register an application
  3. Go to "Certificates & secrets" and add a client secret. Note: don't forget to copy the secret value; it is displayed only once
  4. Go to Azure Data Explorer and click "Permissions"
  5. Add the service principal you just created
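Adding the permission can also be done with a Kusto management command instead of the portal. A sketch, assuming the database is named loadtest and using placeholder IDs; the Data Factory service principal needs at least ingest rights on the target database:

```
// Grant the Data Factory service principal ingest rights on the database
// (replace <application-id> and <tenant-id> with your app registration's values)
.add database loadtest ingestors ('aadapp=<application-id>;<tenant-id>') 'sp-adf-ayhara-loadtest'
```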

Step 4: Set Azure Data Factory to copy data from Azure Table Storage

Step 4-1: Create a base pipeline on Azure Data Factory

  1. Create Data Factory (You can select "Configure Git later")
  2. Go to resource
  3. Click "Author & Monitor"
  4. Click "Author (pencil icon)"
  5. Click "Add new resource (plus icon)" and select "Pipeline"
  6. Click "Move & transform" and drag "Copy data" to the right pane
  7. Set the General properties in the bottom pane

Step 4-2: Set up input data (Source) from Table Storage

  1. Click "Add new resource (plus icon)" again and select "Dataset"
  2. Search "Azure Table Storage" and click "Continue"
  3. Click "New" in the bottom pane and set the linked service (Table Storage)
    • Select the table storage account from your Azure subscription, or enter the settings manually
    • Click "Test connection"
    • Click "Create"
  4. Select the table you want to copy from the pull-down list and update the dataset name in the right pane
  5. Go back to the pipeline settings and select the dataset in the "Source" section in the bottom pane

Step 4-3: Set output data (Sink) to Data Explorer

  1. Click "Add new resource (plus icon)" again and select "Dataset"
  2. Search "Azure Data Explorer" and click "Continue"
  3. Click "New" in the bottom pane and set the linked service (Data Explorer)
    • Select the Data Explorer cluster from your Azure subscription, or enter the settings manually
    • Enter your service principal ID (e.g. the Application (client) ID of "sp-adf-ayhara-loadtest") and the client secret you copied earlier
    • Select Data Explorer database
    • Click "Test connection"
    • Click "Create"
  4. Select the destination table from the pull-down list and update the dataset name in the right pane
  5. Go back to the pipeline settings and select the dataset, table, and ingestion mapping name in the "Sink" section in the bottom pane

Step 4-4: Set mapping

  1. Click "Import schemas"
  2. Check that the mapping is correct
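If the imported schema doesn't line up, you can double-check the mapping that was registered on the Data Explorer side (table name assumed to be loadtest_results):

```
// List the CSV ingestion mappings defined on the table,
// including the mapping name expected in the Sink settings
.show table loadtest_results ingestion csv mappings
```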

Step 5: Ingest data from Table Storage into Data Explorer database

  1. Click "Debug"
  2. Click "Details" (glasses icon) if you want to see the progress
  3. Wait until the data is copied successfully
  4. Go to Data Explorer and try running a query
  5. Back to Data Factory and click "Validate All"
  6. Click "Publish all" if there are no errors!
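As a quick sanity check when querying in Data Explorer, something like the following can confirm the copy worked (the table name is an assumption):

```
// Confirm the row count matches the number of entities in Table Storage
loadtest_results
| count

// Peek at a few rows to verify values landed in the right columns
loadtest_results
| take 10
```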

Now, you're ready to query data ingested from Azure Table Storage with Kusto.

Next step - Query table data in Azure Data Explorer with Kusto to analyse load test results
