I recently worked with a cloud migration client who was at the very beginning of their discovery phase and wanted to jump straight to “which database platforms should I be using in the cloud?” A tall ask, you might say, but by following the three steps below they were able to discover and analyze all of their database servers in just two weeks.
Stepping back for a moment, let’s review “the why” - Why is database analysis important to perform at the earliest stages of a cloud migration?
Why Assess Databases for Cloud?
Increasingly, enterprises are coming around to the view that the cloud is more than just another datacenter. They frequently cite new drivers such as unlocking new business capabilities, becoming more agile, and leveraging their data to become more competitive.
The application-centric and data-led migration initiatives we see today are the new normal, and gone are the days of infrastructure-led, lift-and-shift-only migrations. 😅
As such, if you are setting a strategy and planning migrations, you need answers to questions like (a) which technologies will we use, (b) what skill sets do we need to develop or hire, and (c) how difficult will it be? Those answers are provided by a comprehensive database and application assessment.
We perform these assessments rapidly using purpose-built discovery and assessment tools, and in this post I will focus on three steps you can take today to answer these questions.
Step 1 - Discovery
nmap -sV -p1433,1521-1525,3306,5432,7474,27017 192.168.2.2/24 -oX databases-on-my-network.xml
tidal sync nmap databases-on-my-network.xml
Leveraging the well-known and free Nmap, together with Tidal Tools (docs), we can find all the servers on our network that are listening on default database ports. You can, of course, tweak the port list for your environment if your Ops teams use other ports in addition to these (see the example after the port list below).
In the example above these are:
- TCP/1433 - Microsoft SQL Server
- TCP/1521-1525 - Oracle Database
- TCP/3306 - MySQL
- TCP/5432 - PostgreSQL
- TCP/7474 - Neo4j
- TCP/27017 - MongoDB
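For example, if your environment also runs engines like IBM Db2, Redis, Cassandra, or Neo4j over Bolt, you could extend the port list with their common defaults. The sketch below is illustrative only - the extra ports are typical defaults for those engines, and the subnet is the same example range used above; adjust both to match your network:
# Illustrative scan with additional common default ports:
# 6379 (Redis), 7687 (Neo4j Bolt), 9042 (Cassandra), 50000 (IBM Db2)
nmap -sV -p1433,1521-1525,3306,5432,6379,7474,7687,9042,27017,50000 192.168.2.2/24 -oX databases-on-my-network.xml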
Running this nmap command is fast and produces a portable XML file containing the results.
Note: Enterprise networks commonly have firewalls and IPS devices that either block or report on this type of discovery scan. We recommend working collaboratively with your IT Operations and Security teams to determine which networks to scan, and from which node in the network to run the scan.
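If your Security team prefers a slower, narrowly scoped scan, Nmap’s timing templates and exclusion options can help. The subnet and excluded hosts below are purely hypothetical placeholders - substitute whatever your teams agree on:
# Polite-paced scan (-T2) of an approved subnet, skipping hosts flagged as sensitive (example addresses only)
nmap -sV -T2 -p1433,1521-1525,3306,5432,7474,27017 10.10.20.0/24 --exclude 10.10.20.5,10.10.20.6 -oX approved-subnet-databases.xml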
Once you have your XML file(s) of results, uploading them to the Tidal platform is a quick way to bolster your migration inventory with these newly discovered database servers:
tidal sync nmap databases-on-my-network.xml
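If you scanned several networks and ended up with multiple XML files, a simple shell loop over the same command keeps the sync step quick (the scans/ directory name is just an example):
# Sync each scan result file in turn (directory name is illustrative)
for f in scans/*.xml; do
  tidal sync nmap "$f"
done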
Step 2 - Analysis
tidal analyze db databases.yml
With your inventory complete, you can now obtain credentials to query each of these databases. We recommend discovery tools that require only read-only access, such as the user permissions described under Getting Started in this guide.
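The exact permissions Tidal Tools needs are documented in the guide linked above; purely as an illustration of the read-only principle, a PostgreSQL assessment login might be created along these lines (the host name, database name, role name, and password are all hypothetical):
# Hypothetical read-only role for assessment on PostgreSQL - follow the linked guide for the real requirements
psql -h prod-db.example.com -U postgres -c "CREATE ROLE assess_ro LOGIN PASSWORD 'change-me';"
psql -h prod-db.example.com -U postgres -d appdb -c "GRANT CONNECT ON DATABASE appdb TO assess_ro;"
psql -h prod-db.example.com -U postgres -d appdb -c "GRANT USAGE ON SCHEMA public TO assess_ro; GRANT SELECT ON ALL TABLES IN SCHEMA public TO assess_ro;"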
It is worth noting that you need to analyze the databases you actually intend to migrate - that is, your production databases. While it is tempting to claim success after analyzing development or test servers, the reality is that production databases often have (a) different features enabled, and almost always have (b) separate application dependencies and (c) different performance requirements.
Step 3 - Plan
Review the analysis results in your Tidal Migrations workspace and determine, for each database, whether the cloud-native technologies you would like to adopt will be easy or difficult to migrate to. You’ll uncover features in your Oracle and SQL Server databases that require extra work to Replatform to alternatives like PostgreSQL and MySQL, as well as the number of application dependencies, database sizes, and I/O performance requirements.
Having these data points at your fingertips for one database is very useful for the migration teams in later stages, allowing them to migrate with certainty and not waste time in endless meetings, or chasing down precious DBA resources (who has enough of them anyway?). Getting this information in aggregate across all your databases early on: priceless. Now we can plan effectively, based on our databases and our actual usage requirements.
Conclusion
As you can see, even at the earliest stages of cloud migration discovery, migration teams can quickly capture data-driven insights not only from their applications and server infrastructure but also from their databases. Having this comprehensive assessment available early on informs migration strategy, business case formation, and migration team staffing requirements. More data upfront means fewer surprises during migration execution and far better outcomes.
Written By:
David Colebatch, Chief Migration Hacker, Tidal Migrations