Conducting an SEO competitor analysis can provide valuable insights into how your search performance compares to your competitors'. The process also uncovers where your competitors' traffic comes from and, provided it isn't already too late, helps you determine your SEO priorities and strategy.
Final result of SERP Heatmap - Presidential Elections in Slovakia 2024
This will be the result of your effort while playing around with a few lines of Python code.
Let's start
In this article, I'll explore how you can leverage the Python library Advertools (from Elias Dabbas) to enhance your competitive analysis visualizations. As a reward, you'll end up with an interactive SERP heatmap showing coverage, appearance and average ranking for the competitors you want to observe. Let's start by setting up the environment.
How to install Jupyter Lab
For this tutorial, I focus on setting up the environment on macOS (OSX), using Jupyter Lab as the working environment. You can skip this part if you are familiar with Google Colab and head straight to the coding section.
Let's start with a simple installation in your Terminal. Head to Applications > Utilities, look for Terminal.app, open it and enter this simple command to start the installation (it assumes you already have Homebrew installed).
brew install jupyterlab
If you are using Linux, you can use the same command or install Jupyter Lab in whichever way suits your platform, as described on the official installation page.
How to start Jupyter Lab
Everything needed is now installed on your system. Now enter this command into the Terminal.
jupyter lab
A tab with http://localhost:8888 will open in your default browser.
Let's start writing the code
Head to File > New > Notebook. A new tab will open for you. First things first, you need to install the libraries we'll be working with.
Installing libraries
These libraries will be installed inside Jupyter Lab, and there is a tiny difference compared to standard Terminal commands: you have to prefix each command with an exclamation mark (!).
!pip install advertools
!pip install adviz
In your Jupyter Lab environment it will look like this. To run a command, stay in the active field that contains your code and hit Shift + Enter.
To enter the rest of the code, hit the + icon at the top to add a new empty field for the next bit of code.
Importing packages
What has been installed needs to be imported, so we'll import pandas, advertools, adviz and plotly. Enter this code into the new field.
import pandas as pd
import advertools as adv
import adviz
import plotly.express as px
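As an optional sanity check (my addition, not part of the original walkthrough), you can confirm the imports worked by printing the package versions; pandas and advertools both expose a __version__ attribute.
# Optional: confirm the imports succeeded and note the versions used
print("pandas", pd.__version__)
print("advertools", adv.__version__)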
Setting up Custom search engine and API key
You have to set up a Custom Search Engine ID (known as cx in the code) so you can query your preferred search engine via the API, later passing it the list of keywords you would like to see results for. Visit the Programmable Search Engine homepage; under What to search? select ○ Search the entire web, under Search settings select ○ Image search and ○ SafeSearch, tick the reCAPTCHA and press Create.
From the whole script you are given, you have to copy just the part after cx=.
<script async src="https://cse.google.com/cse.js?cx=YOUR-SEARCH-ENGINE-ID">
</script>
<div class="gcse-search"></div>
To get a Custom Search API key, head here and press Get a Key. The value you copy is referred to as key in the script. In the end, this is what goes into the Jupyter Lab field.
cx = "YOUR-SEARCH-ENGINE-ID"
key = "YOUR-CUSTOM-SEARCH-API-KEY"
Check keywords via API
In this part, we are going to query the API via Advertools (imported as adv) with the help of serp_goog. I created the variable dataz where the results will be stored; you can name it whatever you want. The value gl=["sk"] defines which Google you want to query, in this case Google.sk.
In your final code, you can add more keywords. For the purposes of this example, I chose just a few; in my final code, I entered quite a lot of keyword variations and presidential candidates' names. Feel free to experiment.
dataz = adv.serp_goog(cx=cx,
                      key=key,
                      gl=["sk"],
                      q=["prezidentské voľby preukaz",
                         "prezidetske volby preukaz",
                         "prezidentské voľby voličský preukaz",
                         "prezidentské voľby kandidáti",
                         "voľby 2024",
                         "prezidentské voľby",
                         "prezidentské voľby slovensko",
                         "prezidentské voľby 2024",
                         "prezidentské voľby slovensko 2024",
                         ])
When you run this field with Shift + Enter, the querying process will start. You'll have to wait a moment.
To see what else you can set up for serp_goog, enter adv.serp_goog? (with the ? question mark) into an empty Jupyter Lab field.
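Once the query finishes, you may want a quick look at what came back before drawing anything. The sketch below assumes, as serp_goog normally does, that the results arrive as a pandas DataFrame with columns such as searchTerms, rank and displayLink; adjust the column names if your output differs. Saving to CSV also spares your daily API quota on repeated runs. The filename is just an example.
# Peek at the results: one row per search result (assumed column names)
dataz[["searchTerms", "rank", "displayLink", "title"]].head(10)

# Optional: save the raw results so you don't have to re-query the API
dataz.to_csv("serp_results_2024.csv", index=False)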
Draw SERP Heatmap
This code is straightforward. It can be made more robust, and via fig you can modify almost every value you will see in the SERP Heatmap, but for this case I kept it as simple as possible. With fig.layout.title we define the title of the heatmap. I pass the variable dataz to serp_heatmap, and num_domains defines the number of domains to show. That's all we have to set up to draw the SERP Heatmap.
fig = adviz.serp_heatmap(dataz,
                         num_domains=10)
fig.layout.title = "SERP Heatmap - Presidential Elections 2024 in Google.sk"
fig
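Since serp_heatmap builds on Plotly (which is also why fig.layout.title works above), you can, as an optional extra, export the interactive chart to a standalone HTML file to share outside Jupyter Lab; the filename below is just an example.
# Optional: save the interactive heatmap as a standalone HTML file
fig.write_html("serp_heatmap_presidential_2024.html")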
Interactive result of SERP Heatmap - Presidential Elections
As shown at the beginning, you should now see the result of the SERP Heatmap in the new field of Jupyter Lab. It's fully interactive, so you can hover over the circles to see which keywords were triggered at which position for each domain, as shown in this Interactive SERP Heatmap.
Unstable results of SERP Heatmap
Results can vary based on the date you run your analysis, because most websites keep producing content and Google's algorithms reflect those changes, recalculating coverage and position for each domain/website (most often frequently updated news websites) almost constantly. For example, before the first round of the presidential election in Slovakia, there were lots of results from domains like Wikipedia.
How to modify SERP Heatmap results based on top-level domain
If you want to narrow the results in the SERP Heatmap to domains from a specific country, in other words restrict search results to documents originating in a particular country, you may use boolean operators in the cr parameter's value. The code for domains originating just from Slovakia is cr=["countrySK"] (place it in the code where you specify the keywords). Google Search then determines the country of a document by analyzing the top-level domain (TLD) of the document's URL and the geographic location of the web server's IP address.
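As a sketch of where that parameter goes, the re-query below simply adds cr=["countrySK"] to the earlier serp_goog call (the keyword list is shortened here for brevity); the variable names dataz_sk and fig_sk are my own choices, not anything required by the libraries.
# Re-query, restricting results to documents Google attributes to Slovakia
dataz_sk = adv.serp_goog(cx=cx,
                         key=key,
                         gl=["sk"],
                         cr=["countrySK"],
                         q=["prezidentské voľby 2024",
                            "prezidentské voľby kandidáti"])

# Draw the heatmap for the country-restricted results
fig_sk = adviz.serp_heatmap(dataz_sk, num_domains=10)
fig_sk.layout.title = "SERP Heatmap - Slovak domains only (cr=countrySK)"
fig_sk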
Thanks to Steven Lelham for the cover image from Unsplash.