Aleksander Parchomenko

Posted on • Originally published at aleksander-parkhomenko.Medium

About “intuitive” programming

Intro

Recently, when my daughter asked me to help her with her chemistry homework, she gave me the Mendeleev periodic table she uses in school. The last and heaviest element on it was Oganesson (Og in the picture below), with atomic number 118. The curious thing for me was that over 15 years ago, during my PhD thesis preparation, the official name Darmstadtium (Ds, atomic number 110) was still in the process of being accepted.


Fig.1 - Mendeleev periodic table

Darmstadtium

One part of my PhD dissertation was the theoretical calculation of the lifetimes of Ds-271 and Ds-269 and a theoretical description of the Darmstadtium decay chains. Technically, a synthesized super-heavy nucleus is observed through its "trace": it successively emits α-particles, sometimes neutrons or gamma rays, and ends up as two known stable heavy nuclei. By observing the number of alphas (a Helium nucleus is very stable because it consists of two protons and two neutrons that are strongly bound), their energies, and the time differences between the registrations of alphas, neutrons and gammas in the cyclotron detector, one can deduce that such a nucleus (in our case, Ds) has been synthesized and that we actually saw its decay. That was months of experimental work by a number of researchers.


Fig.2 - Ds α-decay chain

Our work was theoretical: calculating the parameters of the decay (alpha and gamma energies, nuclear half-lives) and comparing them with the experimental ones.
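To illustrate the kind of quantity involved, here is a minimal sketch (in Java, not the original .NET code, and with a purely hypothetical decay constant) of the textbook relation between a nucleus' decay constant λ and its half-life, T½ = ln(2)/λ, together with the surviving fraction N(t)/N₀ = exp(−λt):

```java
public class HalfLife {
    // Half-life from the decay constant lambda; the result is in the
    // same time units as 1/lambda.
    static double halfLife(double lambda) {
        return Math.log(2) / lambda;
    }

    // Fraction of nuclei that survive after time t: N(t)/N0 = exp(-lambda * t).
    static double survivingFraction(double lambda, double t) {
        return Math.exp(-lambda * t);
    }

    public static void main(String[] args) {
        double lambda = 0.5; // hypothetical decay constant, in 1/ms
        System.out.printf("T1/2 = %.4f ms%n", halfLife(lambda));
        // After exactly one half-life, half of the nuclei remain.
        System.out.printf("surviving fraction after T1/2 = %.4f%n",
                survivingFraction(lambda, halfLife(lambda)));
    }
}
```

In practice the hard part is computing λ itself from the nuclear model; the relation above only converts it into the half-life that is compared with experiment.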

Approach

The way we obtained the decay chain was by comparing the probability of the nucleus falling to its ground state or a low-energy state with the probability of α-decay to a state with the same quantum characteristics. These probabilities were also tested, compared and parameterized against experimentally measured data.
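The selection rule described above can be sketched as follows. This is a hypothetical toy model, not the dissertation's actual parameterization: at each step it simply compares two placeholder probabilities and follows the more likely branch.

```java
import java.util.ArrayList;
import java.util.List;

public class DecayPathSketch {
    // For each step of the chain, compare the probability of an internal
    // transition toward the ground state (pInternal) with the probability
    // of alpha-decay to a state with the same quantum numbers (pAlpha),
    // and record the more likely branch. Probabilities are placeholders.
    static List<String> decayPath(double[] pInternal, double[] pAlpha) {
        List<String> path = new ArrayList<>();
        for (int step = 0; step < pInternal.length; step++) {
            path.add(pAlpha[step] > pInternal[step]
                    ? "alpha-decay"
                    : "internal transition");
        }
        return path;
    }

    public static void main(String[] args) {
        // Hypothetical two-step chain.
        double[] pInternal = {0.2, 0.7};
        double[] pAlpha    = {0.8, 0.3};
        System.out.println(decayPath(pInternal, pAlpha));
    }
}
```

The real calculation, of course, derives those probabilities from nuclear structure and tunes them against measured data rather than hard-coding them.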

I decided to calculate the decay path using the .NET Framework. At that time it was, I think, .NET 1.1, maybe .NET 2.0.

From the perspective of years of programming experience, from .NET Framework 2.0 to .NET 6 and Java 16, in an era of Cloud Native architecture and k8s, it wasn't beautiful code following design patterns or coding standards. It was never analysed by Sonar for tech debt, reviewed in Pull Requests, or refactored; it wasn't even versioned! It was written intuitively, just the way I felt the problem I was trying to solve.

And yes, it simply worked, and it still works. The results were very close to the experimental ones. In research, this fact often decides that a theory is a good candidate to live for a pretty long time.

Now, looking back after taking part in various software projects with different technologies, management cultures, terms and requirements, I observe a similar problem: the thin line between working code and beautiful code.

Of course I like SOLID, clean, tested, "self-documented", readable, well-formatted code. We have powerful tools and processes to achieve this. But more and more often, developers spend their time on "refactoring" rather than on improving the product's quality.

That's why, in my opinion, a manager or tech leader with an engineering background, or at least a person who understands technical problems and shares a common language with developers, is a good option.

And do not forget about intuitive coding. At every level of experience, it is our feeling for the problem! In a few weeks our code will need to be refactored anyway! :)
