
Frank Font


Best Executable Expert Knowledge in Apps Now!

Decades ago I learned a practical software development technique that turns business and other subject matter experts directly into developers and maintainers of expert knowledge.

I still use this technique today. It helps finish complex software implementations fast.

An example from long ago: 1990s

I was the lead developer and designer of a financial planning software application that a large international pharmaceutical company wanted to give to each of its 10,000 employees. This was so long ago that the app was going to be distributed on a hard-cased little disk. This was the early 1990s.

A professional financial planner had been working with the pharmaceutical firm to model the retirement planning options of its employees. A lot of expert planning work and fine tuning went into the financial formulas. And when I was brought in to start designing and building the graphical application we would be delivering, the fine tuning was still happening.

Everyone was eager for me and my team to get started converting the financial models into program code. I asked what format the models were currently in. They said Excel.

I said no to re-implementing the models.

Instead I found a way to read the spreadsheets in real time from the program code we eventually delivered. The financial planner was tweaking his formulas and delivering updated spreadsheets right up to the last minute of the software’s development.

He did not have to go through a programmer to make formula updates. Instead of delays and human-brokered translations, he remained the developer and maintainer of his expert contribution to the application.
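
A rough modern sketch of the same idea: the application loads the planner’s workbook at startup instead of hard-coding the tuned numbers. The file name, sheet name, and cell addresses here are hypothetical.

```python
# Minimal sketch: the app reads the expert's workbook at runtime rather than
# re-implementing the model in program code.
# Assumes openpyxl is installed; the workbook name and cell layout are invented.
from openpyxl import load_workbook

def load_planner_assumptions(path="retirement_model.xlsx"):
    # data_only=True returns the values Excel last calculated for formula cells
    wb = load_workbook(path, data_only=True)
    ws = wb["Assumptions"]
    return {
        "inflation_rate": ws["B2"].value,
        "expected_return": ws["B3"].value,
        "retirement_age": ws["B4"].value,
    }

assumptions = load_planner_assumptions()
print(assumptions)  # the planner updates the workbook; the app simply re-reads it
```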

The application was a success, delivered on time. No bugs. We got awards.

My boss had successfully taken the same approach with a commercially profitable expatriate management application some years before. A financial expert owned and delivered the financial models. No programmer blocking the way to updates.

An IRS example from some time ago: early 2000s

The IRS was trying to modernize its systems, some over 30 years old at that point, and I was brought in as an XML expert. They needed experts in this markup technique because the modernization effort was modeling tax form processing rules as markup instead of bespoke program code. Seemed like a step in a direction I liked.

I arrived on the project as the last addition to a 7-person team of XML markup professionals, part of a ramp-up of bodies being thrown at an implementation effort that was not progressing to the finish line fast enough. There was tension.

The entire team was placed in a room of a secure building and asked to divide and conquer the conversion of all of that year's federal tax form laws into XML models for processing in the new system.

The laws themselves were being translated by a separate team of analysts into requirement statements, one tax form at a time.

The XML team would then manually convert the translated rules into markup.

On my arrival, the team had been working on this problem for months. It was slow going.

The team leader and one of the members walked me through the process of modeling the rules as the XML structure needed by the new solution’s processing engine. Tedious process to manually mark this up.

They assigned me several tax forms. Everyone had some tax forms.

I spent two days doing this manually. At the end of each day, everyone reported their progress. There was at best incremental advancement through a form each day. I of course had only the smallest of progress to report.

At the end of the third day I reported I was done with all my forms and was ready for more. The team leader gave me a look as if I might be incompetent, perhaps a bad hire, and she offered to sit with me after the meeting to see what I had done.

She sat with me and found I was done. In less than a week I had completed what team members had been working on for months.

How did I do this? The requirements were being given to us as Excel spreadsheets. I spent my third day on the job writing a program that read spreadsheet-encoded requirements and generated the XML markup I had learned about in the days before.

By writing that program to convert the requirements directly into usable markup I had connected the tax-law experts directly to the processing system. Teams of people to manually translate their insights were not needed.
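
A toy version of that kind of generator, assuming each spreadsheet row holds one rule. The column layout, element names, and file names are invented for illustration and are not the IRS schema.

```python
# Sketch: convert spreadsheet-encoded rules into XML markup for a processing engine.
# Assumes openpyxl is installed; column layout and element names are invented.
import xml.etree.ElementTree as ET
from openpyxl import load_workbook

def rules_to_xml(xlsx_path, xml_path):
    ws = load_workbook(xlsx_path, data_only=True).active
    root = ET.Element("form-rules")
    # Skip the header row; expect three columns: field, condition, error message
    for field, condition, message in ws.iter_rows(min_row=2, max_col=3, values_only=True):
        rule = ET.SubElement(root, "rule", field=str(field))
        ET.SubElement(rule, "condition").text = str(condition)
        ET.SubElement(rule, "error").text = str(message)
    ET.ElementTree(root).write(xml_path, encoding="utf-8", xml_declaration=True)

rules_to_xml("form_1040_rules.xlsx", "form_1040_rules.xml")
```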

The approach was adopted project-wide for other efforts. The project was e-services, and it was the first IRS modernization project in 30 years not to end in failure. We received awards and I received awards.

A Medical Example from the early 2000s

A well-known medical professional association, internationally known for its registries of medical outcomes, was too understaffed to develop and deploy a new registry on a tight schedule.

Medical registries of the type they were and are known for are internet-connected systems where hospitals and medical providers contribute patient outcome information useful for studies into what works and what does not. Information that can help people live better and/or longer lives. Cool.

I was trained in their existing process. Very heavy on database stored procedure code for data validation. Validation is critical for registries like this.

Stored procedures are very programmer dependent. Hmm.

So I proposed a design that pulled the data quality validation rules out of the programmers’ hands and into the data quality experts’ hands. In this case, XML markup that non-programmers could read and maintain. I knew just who to add to the project: a colleague from the IRS who already knew the power of this approach.
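
To make the idea concrete, here is a minimal sketch of expert-owned markup and a generic checker that reads it. The element names, rule types, and field names are illustrative, not the association's actual schema.

```python
# Sketch: data-quality rules live in XML that non-programmers can read and revise;
# one generic routine applies whatever rules the file contains.
# The markup and field names below are invented for illustration.
import xml.etree.ElementTree as ET

RULES_XML = """
<registry-rules>
  <rule field="age" type="range" min="0" max="120" message="Age out of range"/>
  <rule field="procedure_code" type="required" message="Procedure code is required"/>
</registry-rules>
"""

def validate(record, rules_xml=RULES_XML):
    errors = []
    for rule in ET.fromstring(rules_xml).findall("rule"):
        value = record.get(rule.get("field"))
        kind = rule.get("type")
        if kind == "required" and value in (None, ""):
            errors.append(rule.get("message"))
        elif kind == "range" and value is not None:
            if not (float(rule.get("min")) <= float(value) <= float(rule.get("max"))):
                errors.append(rule.get("message"))
    return errors

print(validate({"age": 150, "procedure_code": ""}))  # both rules fire
```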

We completed and released the registry to the medical community on time. And I was told that, for the first time in the organization's history, it was a registry release that was bug-free from day one.

Letting experts do their thing directly speeds up delivery and improves the likelihood of expert level accuracy.

Example from early twenty-teens

A client was redesigning a high-profile, big-data-consuming financial reporting application. This redesign would use Hadoop MapReduce to massively parallelize the information processing using open-source frameworks and cloud infrastructure.

As a senior software development consultant I was tasked with the validation stage of data intake. Data was coming in, as if from a firehose, from many financial institutions on a daily basis.

I inherited the beginnings of a high-performance validation program. The validation rules were complicated and subject to periodic revision by organization lawyers and analysts.

Completing this validation program was going to be stressful, and would probably result in a requirement that one or more programmers periodically update it to keep up with input from subject matter experts.

A bespoke high-performance program that needed regular business-expert guided program updates did not smell like it would lead to sustainable success.

So I moved the validation rules out of program code and into an XML markup that non-programmers could validate on sight, and with some practice, update and revise independently.

The validation rules were successfully applied to the incoming flood of big data by a high-performance program that was generated from those rules. So, instead of investing my time into writing one high-performance validation program, I wrote a program that read the rules and generated the source code of a high-performance program on demand.
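
A toy sketch of that generator idea, emitting plain Python rather than the high-performance code the real system produced. The rule format, field names, and thresholds are hypothetical.

```python
# Sketch: instead of hand-writing the validator, generate its source from expert-owned rules.
# The rules and the emitted module are invented for illustration.
RULES = [
    {"field": "amount", "op": ">=", "value": 0, "message": "amount must be non-negative"},
    {"field": "currency", "op": "in", "value": ["USD", "EUR"], "message": "unsupported currency"},
]

def generate_validator_source(rules):
    lines = ["def validate(record):", "    errors = []"]
    for r in rules:
        if r["op"] == "in":
            check = f"record[{r['field']!r}] not in {r['value']!r}"
        else:
            check = f"not (record[{r['field']!r}] {r['op']} {r['value']!r})"
        lines.append(f"    if {check}:")
        lines.append(f"        errors.append({r['message']!r})")
    lines.append("    return errors")
    return "\n".join(lines)

source = generate_validator_source(RULES)
namespace = {}
exec(source, namespace)  # in practice the generated source is written out, reviewed, and deployed
print(namespace["validate"]({"amount": -5, "currency": "GBP"}))
```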

Trusted high-performance validation programs, custom generated from the validation rules managed directly by business experts, were produced whenever needed.

This concept and approach worked so well the client adapted it into other systems and, I’ve been told, uses a refined descendant of it in its premier processing platform today.

One more medical domain example: mid twenty-teens

I had the privileged opportunity to be the lead developer and technical architect of an award-winning, open-sourced Veterans Administration radiology workflow application. From the start, it was clear this would be a challenging undertaking.

The application would need to integrate with the existing Electronic Health Records (EHR) system of the VA while converting an existing paper-based process into a new practical web-based application.

The EHR at the VA has its own specialized domain language, so we hired experts in that domain to help us navigate the connectivity to our application’s dedicated data layer. Those experts coded up that part themselves, with agreed touch-points that I and other developers could build on in our application.

Then there was the question of how to implement protocol and timely patient exam validation rules: rules that would examine real-time patient information and preventively detect dangerous patient conditions, such as a contrast agent being prescribed to a patient presenting with poor kidney function. There was an opportunity to improve outcomes and, perhaps in the heat of a moment, save a life.

Was I going to translate these medical rules into bespoke program code? Like before, of course I would not. So for this application, I proposed and then implemented an expert system module with an interface where doctors could directly examine, vet, and update the important rules leading to warnings and alerts. Again, programmers were not required to keep the quality and accuracy of these settings where they needed to be.
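
Purely as illustration, here is a sketch of the kind of declarative alert rule a clinician could own and a generic checker that applies it. The thresholds, field names, and wording are hypothetical, not clinical guidance and not the actual VA rules.

```python
# Illustrative sketch: clinician-owned alert rules applied to patient data by a generic checker.
# Thresholds, field names, and alert text are hypothetical (not clinical guidance).
ALERT_RULES = [
    {
        "name": "contrast-vs-renal-function",
        "conditions": [("ordered_contrast", "==", True), ("egfr", "<", 30)],
        "alert": "Contrast ordered for a patient with poor kidney function; confirm with radiologist.",
    },
]

OPS = {"==": lambda a, b: a == b,
       "<": lambda a, b: a is not None and a < b}

def check_alerts(patient, rules=ALERT_RULES):
    alerts = []
    for rule in rules:
        if all(OPS[op](patient.get(field), value) for field, op, value in rule["conditions"]):
            alerts.append(rule["alert"])
    return alerts

print(check_alerts({"ordered_contrast": True, "egfr": 24}))  # the rule fires
```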

The system was delivered on-time and on-budget and passed all quality reviews for release into the medical domain.

The Generalized Technique

What do the above examples all have in common? One thing is that they all had sophisticated domain-expert knowledge baked into them. Another is that they were delivered on time. Another is that they were recognized as being high-quality and innovative. People were impressed.

And the most critical thing they all share, the thing that this article is about, is that the domain experts contributed directly. Their knowledge was not translated into obscure program code that only other programmers could update and maintain. Instead, and this is the critical part, the ability to maintain and update domain-specific expert knowledge was put in the hands of the domain experts.

Decision guide for when to invest in non-programmer expert rule creation and maintenance

Only useful when interacting with domain experts?

No, this technique is useful for any situation where you expect significant or critical functionality to evolve over time.

Don’t build programs where you know you will need programmers to make periodic business-driven updates to program code. Instead, build programs where the volatile stuff is carved out into user configurable content. And keep this stuff under version control.

I like using JSON these days for capturing business-dependent behaviors. It works well. You can use it on day one in a POC and roll it into production, keeping experts engaged in owning the complicated stuff through the whole process.
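
A minimal sketch of the shape this takes: the volatile behavior lives in JSON that the experts own and version, and the program only loads and applies it. The rule names and values are made up.

```python
# Sketch: volatile business behavior lives in a version-controlled JSON file owned by experts;
# the program only loads and applies it. The rule names and values are invented.
import json

BUSINESS_RULES_JSON = """
{
  "discount_tiers": [
    {"min_total": 1000, "discount": 0.10},
    {"min_total": 500, "discount": 0.05}
  ],
  "free_shipping_over": 75
}
"""

rules = json.loads(BUSINESS_RULES_JSON)  # in production: load business_rules.json from version control

def price_order(total):
    discount = next((t["discount"] for t in rules["discount_tiers"] if total >= t["min_total"]), 0.0)
    return {"total": total * (1 - discount), "free_shipping": total >= rules["free_shipping_over"]}

print(price_order(1200))  # experts change the JSON; this code does not change
```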

Programmers don’t need to stand between experts and the applications that need their insights.

For applications with evolving rules and complex or nuanced logic, enlist the experts into managing that logic directly.

This is a good way to deliver sophisticated solutions faster than many people think is possible. It’s fun to surprise people in this way. Have this kind of fun.
