
zakwillis


A newish automation configuration tool ripe for release - collaborators?

About this post and me

Normally I consult inside large enterprises. I own a limited company in the UK, through which I am currently building a property platform. This post covers software built during that process. If you are a good developer, entrepreneurially minded and want to connect - do it.

The need for configuration management and deployment management in dotnet

A seriously overlooked aspect of the SDLC (Software Development Life Cycle) is the overhead every software project suffers when managing configuration and deployment. In fact, there are three main challenges in moving software from development environments to production and executing those applications.

  • Configuration management.
  • Deployment.
  • Execution.

A big chunk of my career was spent in the financial sector. What sets finance apart from other sectors is the sheer number of processes running and the use of workload automation/batch automation solutions: Control-M, Autosys, Dollar Universe. A key advantage of these tools is that they give the enterprise oversight of highly complex data processing, organised into sub-domains or areas of processing. In simple terms, we have batches running for different areas of the business such as risk, collateral management, position management, product control and so on.

What I find astounding is that, when working in sectors other than finance, relatively large organisations have often not heard of batch automation tools. Instead, we find task scheduling software, job-running frameworks and database schedulers (SQL Server Agent or Oracle Scheduler). Worse, I have seen clients develop their own substandard solutions.

Framing the problem with process management (orchestration)

From development, through testing, to production we want to dramatically reduce the time spent organising and configuring applications and data processing. 

DevOps to the rescue?

Development Operations (DevOps) seeks to smooth out many of the time-consuming tasks facing software delivery teams. It covers a range of techniques including containerisation, workload automation, configuration management, continuous integration, test automation and release management, to name a few.

The concepts sitting under DevOps are completely valid, but even where they exist within an enterprise they are often too heavy-handed and too time-consuming.

Setting objectives

  • Applications should be able to run in parallel if required (a sketch of this follows the list).
  • The learning curve for running jobs in sequence should be low (testers and analysts shouldn't have to learn complicated scheduling software).
  • Configuration should not be complicated to manage.
  • Configuration should be discoverable from higher environments.
  • Batches should be able to discover other executables and applications. 
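
To make the first objective concrete, here is a minimal sketch - not IRPA's implementation - of running two deployed executables in parallel using nothing more than the .NET Process class and tasks. The paths and arguments are hypothetical.

```csharp
using System;
using System.Diagnostics;
using System.Threading.Tasks;

// Minimal sketch only - not IRPA itself. Runs two executables in parallel
// and waits for both to finish; paths and arguments are hypothetical.
class ParallelRunner
{
    static Task RunAsync(string exePath, string args)
    {
        return Task.Run(() =>
        {
            var startInfo = new ProcessStartInfo
            {
                FileName = exePath,
                Arguments = args,
                UseShellExecute = false
            };
            using (var process = Process.Start(startInfo))
            {
                process.WaitForExit();
                Console.WriteLine($"{exePath} exited with code {process.ExitCode}");
            }
        });
    }

    static async Task Main()
    {
        await Task.WhenAll(
            RunAsync(@"C:\apps\RiskLoader\RiskLoader.exe", "--env TEST"),
            RunAsync(@"C:\apps\PositionFeed\PositionFeed.exe", "--env TEST"));
    }
}
```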

The specific problem I was trying to solve

During my time developing my data architecture for the findigl property platform, the same problem kept reappearing. How does a developer manage testing and deployment, running multiple instances of applications, without using a host of batch automation and continuous integration solutions?

As this work involved just me, I couldn't justify the time spent configuring Jenkins or TeamCity. Getting a licence for Dollar Universe or Control-M was out of the question - setting up and managing batch scheduling software often requires a dedicated team within the enterprise. I did look at quartz.net and hangfire.io - both have their use cases, but I didn't want to write any code at all to deploy and execute jobs.

The final challenge was to reduce complexity. I find, as a developer, the hardest thing is context switching. Maintaining mental notes of processes and configuration shouldn't consume too much of my brain power. It is not simply a case of developing software, but of piecing together the many steps and processes needed to achieve the outcome.

It became clear that all process automation runs off applications stored in different folders using different configuration parameters.

Info Rhino Process Automation (IRPA) to the rescue

Once it became clear that every process was always a call to an executable, and that the executable always ran with different execution parameters, everything made sense.
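
As a rough illustration of that insight - and not IRPA's actual model - the whole problem collapses into a very small data structure: an executable path, its arguments and a working folder, with a batch being an ordered list of such steps. The class and batch names below are my own hypothetical ones.

```csharp
using System.Collections.Generic;

// A sketch of the idea only - not IRPA's actual model. Every step reduces to
// an executable path, its arguments and a working folder; a batch is just an
// ordered list of such steps.
public class JobStep
{
    public string ExecutablePath { get; set; }
    public string Arguments { get; set; }
    public string WorkingFolder { get; set; }
}

public class Batch
{
    public string Name { get; set; }
    public List<JobStep> Steps { get; set; } = new List<JobStep>();
}

public static class ExampleBatch
{
    // A hypothetical scrape-and-load batch expressed with the model above.
    public static Batch Build()
    {
        return new Batch
        {
            Name = "ScrapeAndLoad",
            Steps =
            {
                new JobStep { ExecutablePath = @"C:\apps\Scraper\Scraper.exe", Arguments = "--source listings", WorkingFolder = @"C:\apps\Scraper" },
                new JobStep { ExecutablePath = @"C:\apps\DbLoader\DbLoader.exe", Arguments = "--target staging", WorkingFolder = @"C:\apps\DbLoader" }
            }
        };
    }
}
```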

Clearer still, why was I maintaining similar configuration across applications? Some sites have draconian release management procedures, including run books.

IRPA does the following:

  • Permits basic batch definition.
  • Discovers executables. 
  • Executes batches.
  • Seamlessly identifies configuration across different environments and backs it up (sketched after this list).
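
The last point is easier to see with a sketch. The following illustrates the concept only - the per-environment file naming convention is hypothetical and this is not IRPA's actual mechanism: resolve the configuration file for the requested environment, then take a timestamped backup before anything overwrites it.

```csharp
using System;
using System.IO;

// A sketch of the concept only - not IRPA's mechanism. Resolve the config
// file for the requested environment (hypothetical naming convention) and
// take a timestamped backup before anything overwrites it.
class ConfigResolver
{
    static string ResolveConfig(string appFolder, string environment)
    {
        var candidate = Path.Combine(appFolder, $"settings.{environment}.json");
        return File.Exists(candidate)
            ? candidate
            : Path.Combine(appFolder, "settings.json"); // fall back to the default
    }

    static string BackupConfig(string configPath, string backupFolder)
    {
        Directory.CreateDirectory(backupFolder);
        var target = Path.Combine(
            backupFolder,
            $"{DateTime.UtcNow:yyyyMMddHHmmss}_{Path.GetFileName(configPath)}");
        File.Copy(configPath, target);
        return target;
    }

    static void Main()
    {
        var configPath = ResolveConfig(@"C:\apps\Scraper", "PROD");
        Console.WriteLine($"Backed up to {BackupConfig(configPath, @"C:\backups\config")}");
    }
}
```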

A real use case for IRPA: managing a scraping batch ETL pipeline from website to database to website

Anybody who has worked with scraping data and loading it into a structured data store knows the pain involved. One of my batches has to perform millions of searches and clean the data, which can take days to process. What arose was hundreds of executions performing different tasks. The result was a mess: hundreds of instances of applications and batch files, each with its own configuration. The process became virtually unmanageable, despite a fairly simple flow of tasks - only seven unique applications were involved.

So I set about creating an application which could take care of deployment, reusing the existing batch execution file. From these simple steps, I could now:

  • Backup production and test configuration. 
  • Identify production applications and save those. 
  • Redeploy hundreds of applications from build within minutes. 
  • Discover jobs by pattern matching, reducing the amount of maintenance (see the sketch below).
  • Start execution.
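
The pattern-matching discovery step can be illustrated with a short sketch. This is not IRPA's implementation - the deployment root and the "IR." naming convention are hypothetical - but the idea is that newly deployed executables are picked up without hand-editing a job list.

```csharp
using System;
using System.IO;

// A sketch of the discovery idea only - not IRPA's implementation. Finds
// every deployed executable under a root folder that matches a (hypothetical)
// "IR." naming convention, so new deployments are picked up automatically.
class JobDiscovery
{
    static void Main()
    {
        var deployRoot = @"C:\apps"; // hypothetical deployment root
        foreach (var exe in Directory.EnumerateFiles(deployRoot, "*.exe", SearchOption.AllDirectories))
        {
            if (Path.GetFileName(exe).StartsWith("IR.", StringComparison.OrdinalIgnoreCase))
            {
                Console.WriteLine($"Discovered job candidate: {exe}");
            }
        }
    }
}
```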

In essence, there are three applications which tie IRPA together:

  • Job generator. 
  • Job deployer.
  • Job executor. 

These three applications allow operators with a relatively low technical skill set to create their own batches.

Where does IRPA fit into the SDLC?

Many smaller companies could use IRPA alone, but we don't see it replacing:

  • Continuous Integration. 
  • Batch automation software. 
  • Release Management. 

Whilst IRPA can do those things, we see it dramatically reducing time to delivery, reducing errors, improving testing and speeding up feedback.

Think of IRPA as a glove that fits on top of your software application architecture.

Next steps for IRPA?

So far, Info Rhino has been testing and using IRPA for over a year. We have some enhancements we feel are needed and are looking to:

  • Add enhancements to the job executor for stopping and rerunning batches. 
  • Add reporting to facilitate better oversight of jobs and processes.
  • Include front ends for managing the configuration and batches.
  • Add documentation and standards to the suite to make it easier to use. 

We don't feel that much more is needed in terms of features, but we have work to do to make the product more intuitive and user-friendly.

Can I buy IRPA from Info Rhino?

Currently, IRPA is featured on our website.

We consider the product to be at beta and are planning an official launch on a new site. Please do get in touch.

Can I help develop IRPA? Is it open-source? 

We haven't open-sourced it yet. We definitely want to hear from developers and potential partners who work with C#/SQL Server or dotnet core.

What platforms does IRPA run on? 

Currently, it runs on the .NET Framework only (Windows). There are plans to move it to dotnet core.

Conclusion on Info Rhino's process automation suite 

My personal opinion is that it is incredible - but am I just saying that because it is mine? Of course not.

The time it saves understanding a set of applications, their configuration and batch processing is immense. 

A key goal is reducing the time spent on non-essential tasks. Any company involved in enterprise software development will benefit from IRPA. 
