ServiceNow report data to Airtable
I manage a platform product roadmap, and the majority of the work lives in ServiceNow. ServiceNow began as a help desk ticketing tool, but it appears to be positioning itself as a general business workflow engine. As far as I know I do not have API access, but I can generate reports and download them as CSVs. There is an Agile software development module in ServiceNow, but I don't have the time or the interest to learn how to use it. In the meantime we use a flexible board that places cards onto our Visual Task Board based on assignment group, which lets the team work in a Kanban-style flow. Unfortunately, the burden is on me to generate metrics and charts based on their progress.
ServiceNow reporting interface
ServiceNow has a decent reporting interface that makes it easy to find data in tables and connect data structures together. I query the vtb_card table and pull specific columns for my team's assignment group: Task.Number, Lane, Task.Short Description, Task.State, and Label 1 through Label 7. The most annoying thing about the labels is that they are specific to a particular board and do not retain their names, but it is not terrible to work around. From there, right-click a column header and export the report to CSV.
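One thing worth noting: in the raw export the column names come through lowercased with spaces replaced by underscores, which is what the Python further down expects. The header row ends up looking roughly like this (the order depends on how you arranged the report):

task.number,lane,task.short_description,task.state,label_1,label_2,label_3,label_4,label_5,label_6,label_7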
Preprocessing the data before uploading to Airtable
The first thing I had to do with the exported CSV was convert it to UTF-8; the default encoding was not UTF-8, and I used PowerShell to handle that. I also use the Google Cloud Tools for PowerShell module to upload the converted CSV to Cloud Storage, where it triggers a serverless function that processes the upload into Airtable. The PowerShell script is kicked off by a plain batch script with the following command. There's probably a way to do all of this in Python, but PowerShell was easy enough.
PowerShell -NoProfile -ExecutionPolicy Bypass -Command "& './upload_backlog.ps1'"
# upload_backlog.ps1
$srcRoot = "dailybacklogreport\"
$utf8 = New-Object System.Text.UTF8Encoding $false

# Re-encode every exported CSV as UTF-8 without a byte order mark.
Get-ChildItem $srcRoot\* -Recurse -Include *.csv | ForEach-Object {
    $content = $_ | Get-Content -Raw
    Set-Content -Value $utf8.GetBytes($content) -Encoding Byte -Path $_.FullName
}

# Push the report to Cloud Storage, then clean up the local copies.
$fileList = Get-ChildItem -Path $srcRoot -File -Force -Recurse
$file = $fileList | Select-Object -First 1
New-GcsObject -Bucket "alanmbarrdatasyncing" -File $file.FullName -ObjectName "platform_engineering/backlog.csv" -Force
Remove-Item -Path "$srcRoot\*" -Include "*.csv" -Force
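For what it's worth, a rough Python equivalent of that re-encode-and-upload step might look something like the sketch below. It is untested and makes a few assumptions: the google-cloud-storage client library, the same bucket and object names as above, and that the export comes out as Windows-1252.

# Hypothetical Python stand-in for upload_backlog.ps1: re-encode the newest
# ServiceNow export as UTF-8 and push it to Cloud Storage.
from pathlib import Path
from google.cloud import storage  # pip install google-cloud-storage

SRC_ROOT = Path("dailybacklogreport")
BUCKET = "alanmbarrdatasyncing"
OBJECT_NAME = "platform_engineering/backlog.csv"

def upload_backlog():
    csv_files = sorted(SRC_ROOT.rglob("*.csv"))
    if not csv_files:
        return
    src = csv_files[0]
    # Assumption: the ServiceNow export is Windows-1252; rewrite it as UTF-8.
    text = src.read_text(encoding="cp1252", errors="replace")
    src.write_text(text, encoding="utf-8")
    storage.Client().bucket(BUCKET).blob(OBJECT_NAME).upload_from_filename(str(src))
    for f in csv_files:
        f.unlink()

if __name__ == "__main__":
    upload_backlog()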
Once uploaded, the file triggers the Google Cloud Function on object creation. I use pandas to do the data filtering: I only want to send new records or records that changed, because of Airtable limits. What is missing from the script is keeping the labels in sync over time. It's possible, but I haven't found it worth the effort to write a custom check.
The requirements.txt for the Cloud Function needs to contain the libraries used for data processing:
airtable-python-wrapper==0.12.0
pandas==0.24.2
gcsfs==0.2.3
def parse_report(data):
    import pandas as pd

    def filterColumn(row):
        # Keep only the label names that were actually set on this card.
        return list(filter(lambda value: value is not None, row.values))

    startWithPandas = pd.read_csv('gs://' + data['bucket'] + '/' + data['name'])
    # The label columns export as booleans; map each one back to the name it
    # carries on our board, then collapse them into a single Labels list.
    startWithPandas["label_1"] = startWithPandas["label_1"].apply(lambda x: "Defect" if x == True else None)
    startWithPandas["label_2"] = startWithPandas["label_2"].apply(lambda x: "High Priority" if x == True else None)
    startWithPandas["label_3"] = startWithPandas["label_3"].apply(lambda x: "Feature" if x == True else None)
    startWithPandas["label_4"] = startWithPandas["label_4"].apply(lambda x: "Idea" if x == True else None)
    startWithPandas["label_5"] = startWithPandas["label_5"].apply(lambda x: "Request" if x == True else None)
    startWithPandas["label_6"] = startWithPandas["label_6"].apply(lambda x: "Story" if x == True else None)
    startWithPandas["label_7"] = startWithPandas["label_7"].apply(lambda x: "Low Priority" if x == True else None)
    label_columns = ["label_1", "label_2", "label_3", "label_4", "label_5", "label_6", "label_7"]
    startWithPandas["Labels"] = startWithPandas[label_columns].apply(filterColumn, axis=1)
    return startWithPandas.drop(columns=label_columns)
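Collapsing the labels into a list also happens to match what Airtable's API accepts for a multiple-select field, which is an array of option names, so the Labels column can be passed straight through when inserting or updating records.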
def parse_airtable_data():
    from airtable import Airtable
    import os, pandas as pd

    base_key = 'app5beEUuaFQatE2T'
    table_name = 'Features & Requests'
    airtable = Airtable(base_key, table_name, api_key=os.environ['AIRTABLE_KEY'])
    # Pull down only the fields we compare against the ServiceNow export.
    res = airtable.get_all(fields=["task.number", "task.state", "task.short_description", "lane"])
    data = []
    for r in res:
        f = r["fields"]
        # Airtable omits empty fields from the response, so use .get().
        data.append({
            "task.number": f.get("task.number"),
            "task.state": f.get("task.state"),
            "lane": f.get("lane"),
            "task.short_description": f.get("task.short_description"),
            "recordId": r.get("id"),
        })
    airtable_data = pd.DataFrame(data)
    return airtable_data
def process_differences(servicenow, airtable_data):
    # Outer-join the two frames on the fields we care about; the indicator
    # column marks rows that only exist on one side (i.e. new or changed).
    m = servicenow.merge(airtable_data, on=["task.number", "task.state", "task.short_description", "lane"],
                         how='outer', suffixes=['', '_'], indicator=True)
    differences = m[m["_merge"] != "both"]
    return differences
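To make the indicator behavior concrete, here is a toy run with made-up task numbers and record ID (Labels left out for brevity). A card whose state or lane changed shows up in the diff twice, once from each side, while a brand-new card shows up only once; that is what upload_to_airtable below keys off of.

import pandas as pd

servicenow = pd.DataFrame([
    {"task.number": "TASK0001", "task.state": "Work in Progress",
     "task.short_description": "Fix login", "lane": "Doing"},
    {"task.number": "TASK0002", "task.state": "Open",
     "task.short_description": "New widget", "lane": "Backlog"},
])
airtable_data = pd.DataFrame([
    {"task.number": "TASK0001", "task.state": "Open",
     "task.short_description": "Fix login", "lane": "Backlog",
     "recordId": "recXXXXXXXXXXXXXX"},
])

diff = process_differences(servicenow, airtable_data)
# TASK0001 shows up twice (left_only and right_only) -> it becomes an update;
# TASK0002 shows up once (left_only) -> it becomes an insert.
print(diff[["task.number", "_merge", "recordId"]])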
def upload_to_airtable(differences):
    import os
    from airtable import Airtable

    base_key = 'app5beEUuaFQatE2T'
    table_name = 'Features & Requests'
    airtable = Airtable(base_key, table_name, api_key=os.environ['AIRTABLE_KEY'])
    unique_ids = set(differences["task.number"].values)
    for item in unique_ids:
        row = differences[differences["task.number"] == item]
        if len(row) == 2:
            # The task exists in Airtable but changed: take the recordId from
            # the Airtable side and the new field values from the ServiceNow side.
            recordId = row[row["recordId"].notnull()]["recordId"].values[0]
            f = row[~row["recordId"].notnull()]
            fields = f.to_dict(orient='records')[0]
            del fields["recordId"]
            del fields["_merge"]
            if len(fields.get("Labels")) == 1 and fields.get("Labels")[0] == "":
                del fields["Labels"]
            airtable.update(recordId, fields=fields)
        else:
            # The task only exists in ServiceNow: insert it as a new record.
            fields = row.to_dict(orient='records')[0]
            del fields["recordId"]
            del fields["_merge"]
            if isinstance(fields.get("Labels"), list):
                fields["Labels"] = list(filter(lambda x: x != "", fields["Labels"]))
            else:
                del fields["Labels"]
            airtable.insert(fields)
def upload_airtable_data(event, context):
    """Triggered by a change to a Cloud Storage bucket.
    Args:
        event (dict): Event payload.
        context (google.cloud.functions.Context): Metadata for the event.
    """
    file = event
    print(f"Processing file: {file['name']}.")
    if file['name'] and file['name'].startswith('platform_engineering/'):
        servicenow_frame = parse_report(file)
        airtable_frame = parse_airtable_data()
        diff = process_differences(servicenow_frame, airtable_frame)
        upload_to_airtable(diff)
    print("Processing Complete.")