DEV Community

Discussion on: LINUX KERNEL: Researchers from University of Minnesota had no bad intentions- lift ban

Paulo Renato

Put it this way - would you want someone toying with the code running the electrical grid, maybe a nuclear power plant? How about your local water treatment plant? Would this be OK? I should think not.

Wouldn't you want to be sure that the code running this critical infrastructure, and the process that leads to it, is tested for security vulnerabilities at the stage where code is merged into the mainline?

The increasing number of software supply-chain attacks in recent years clearly shows that all processes need to be scrutinized for ways they can be exploited, so that they can be fixed and patched.

So, in my opinion, putting the Linux patch-merging process to the test is a good and necessary thing, so that everyone can learn from the mistakes and the whole process becomes better and more tamper-proof.

I am neither defending the researchers nor accusing them, because I don't know enough, but I think this type of research needs to be done; maybe they just didn't choose the best way of doing it.

Scott Simontis

I am all for exposing security flaws, but ethics are key when you do security research. There should have been advance notification to the maintainers that an information security project was going to contribute potentially lethal code.

This also underscores a key point: open-sourcing software does not make it secure. Very few people are qualified to do security reviews on a codebase, and without their expertise, one cannot say code is secure just because it has passed public scrutiny.

Chapman

Without a doubt - the approval process needs scrutiny. That's something even the dev team has openly admitted. But there's a right way and a wrong way to do it. You don't go testing how to defuse a live nuke by just clipping random wires and hoping for a good outcome. The research team should have had complete approval from senior leadership who knew exactly what they were doing beforehand.

Paulo Renato • Edited

"But there's a right way and a wrong way to do it. You don't go testing how to defuse a live nuke by just clipping random wires and hoping for a good outcome."

These are not exactly comparable things.

If the code gets merged into the mainline branch, it is not released to the public immediately (or it shouldn't be); the researchers would then reveal that the review process had failed to catch it, and the commit would be reverted.

"The research team should have had complete approval from senior leadership who knew exactly what they were doing beforehand."

In the past I worked in a factory where a leak could kill everyone within a radius of up to x kilometers, depending on the wind and the leak.

Emergency exercises were carried out to test the responsiveness of the local authorities and of all employees in the factory, and everyone involved knew in advance that they would occur, so the outcome was always excellent: the leak was always contained in around 20 minutes with the help of the fire department of the nearest city.

They would set a time for the exercise, station police at every intersection between the fire station and the factory, and have all employees involved in the emergency response already at their battle stations, so the exercise was always a tremendous success.

In real life, if a leak occurs, the police will not be at all the intersections, or at any of them; the firefighters will take twice as long to leave the station and reach the factory; and the employees will have to stop their current task, run to the nearest protective equipment, suit up, and then go fight the leak, assuming they aren't killed by the leak before they have time to put on the mask.

So yes, I agree with tests being carried out without the maintainers' knowledge, because that's the only true way to test their resistance against a supply-chain attack; anything else is just theoretical and may not reveal the issues their process has.

If their process releases the Linux kernel to the public as soon as a merge request is approved, then the process is flawed and easier to exploit. Merged code must go through a staging phase before it can reach the public.