As a software engineer, my passion lies in crafting not just functional but elegantly simple code. Amidst the chaos of complex systems, lines, and algorithms, simplicity is like a breath of fresh air. It allows me to cut through the noise, to see the patterns and structures more clearly. To me, simplicity isn't just a preference; it's a necessity. It makes my code more maintainable, understandable, and reliable.
"Simplicity is the ultimate sophistication." - Leonardo da Vinci
The first time I came across Clean Code, I was captivated; it aligns perfectly with my preference for organized environments where I can think clearly. I always strive to tidy up my workspace before starting any task, as it helps me gain clarity. Even in my children's education, I'm drawn to the Montessori method for its elegant environment. I've observed how children thrive in organized spaces, which foster a sense of calm and boost self-esteem.
Clean Code represents a set of principles and practices that prioritize simplicity, clarity, and maintainability in software development. It emphasizes writing code that is easy to understand not only by the author but also by fellow developers who may encounter it in the future. We follow a set of rules and principles designed to promote simplicity, clarity, and maintainability. For instance, meaningful naming, minimizing duplications, IOC (Inversion of Control), SOLID principles, and KISS (Keep It Simple, Stupid).
In this post, I want to share some useful tips for crafting code, drawn from my own experience, and offer my way of thinking, hopefully to your delight. We'll look at examples from the world of online shopping: products, orders, coupons, and more.
Function Structure
"Thinking is the hardest work there is, which is probably the reason why so few engage in it." - Henry Ford
As simple as it looks, composing functions can be a turning point for creating neat and beautiful functions. When writing new functions with clear input and output, I often begin by typing out hypothetical function calls that eventually lead to the desired outcome.
For this example, we will implement an applyCoupons function. Each coupon comes with its own set of rules (such as minimum purchase thresholds or specific product requirements). The task at hand is to determine the best coupon to apply to a given shopping cart.
Here's a basic initial implementation:
```javascript
export function applyCoupons(cart, coupons) {
  let resultCart = cart;
  let resultCoupon = null;
  for (const coupon of coupons) {
    const cartAfterDiscount = applyDiscount(cart, coupon);
    if (cartAfterDiscount.price < resultCart.price) {
      resultCart = cartAfterDiscount;
      resultCoupon = coupon;
    }
  }
  return [resultCart, resultCoupon];
}
```
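To see the flow end to end, here is a runnable sketch that repeats the function above together with a stub applyDiscount. The cart shape (a numeric price) and the flat-discount coupon shape (discount, minPurchase) are assumptions for illustration only, not the real domain model.

```javascript
// Hypothetical shapes for illustration: a cart has a numeric `price`;
// a coupon carries a flat `discount` plus a `minPurchase` rule.
function applyDiscount(cart, coupon) {
  // A coupon whose minimum-purchase rule is not met leaves the cart unchanged.
  if (cart.price < coupon.minPurchase) return cart;
  return { ...cart, price: cart.price - coupon.discount };
}

function applyCoupons(cart, coupons) {
  let resultCart = cart;
  let resultCoupon = null;
  for (const coupon of coupons) {
    const cartAfterDiscount = applyDiscount(cart, coupon);
    if (cartAfterDiscount.price < resultCart.price) {
      resultCart = cartAfterDiscount;
      resultCoupon = coupon;
    }
  }
  return [resultCart, resultCoupon];
}

const cart = { price: 100 };
const coupons = [
  { id: 'SAVE5', discount: 5, minPurchase: 0 },
  { id: 'SAVE20', discount: 20, minPurchase: 50 },
  { id: 'SAVE50', discount: 50, minPurchase: 200 }, // rule not met
];
const [best, coupon] = applyCoupons(cart, coupons);
// best.price === 80, coupon.id === 'SAVE20'
```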
Let's examine how the function might appear by typing out hypothetical function calls:
```javascript
export function applyCoupons(cart, coupons) {
  const carts = coupons.map((coupon) => applyDiscount(cart, coupon));
  return chooseBestCart(zip(carts, coupons));
}
```
Creating hypothetical function calls:
- Helps break down the problem into smaller, simpler tasks - it's easier to solve small problems. If the function is well-defined, readers don't have to check the function itself, reducing mental strain.
- These smaller tasks can be combined into other functions, as long as they're general enough. I'd say reuse isn't a concern at first; it becomes relevant only for later uses.
- It prompts you to think of a clear sequence that moves forward only - stopping spaghetti code that declares and updates variables all over the place.
The biggest challenge that might come up is the temptation to dive straight into coding to avoid mental strain. Let me tackle that in the next tip.
The Importance of Refactoring
"Change is the only constant in life." - Heraclitus
I first came across the term refactor while practicing Test-Driven Development (TDD). In TDD, the process involves writing a failing test, then writing the code to pass the test, and finally refactoring the code. Over time and through experience, I realized that the refactoring step is, actually, the most important part of this process. The cycle of failing tests transitioning to passing tests allows us to refactor without the fear of breaking, a power that is nothing short of remarkable.
When composing a new function or changing an existing one, I always begin and end the process with refactoring. Why? Because I aim to start with a clean slate and ensure the space remains clean afterward.
To illustrate the concept, let's consider the initial version of applyCoupons and introduce some complexity. For instance, suppose we aim to apply all coupons, starting with the one with the most discount. Our approach would be first to get the best coupon to apply, then update the cart and filter the coupon out of the coupon list.
```javascript
import { range } from 'lodash';

export function applyCoupons(cart, coupons) {
  const [cartWithDiscount] = range(coupons.length).reduce(
    ([processedCart, couponsToApply]) => {
      let bestCoupon;
      let cartWithDiscount = processedCart;
      for (const coupon of couponsToApply) {
        const cartAfterDiscount = applyDiscount(processedCart, coupon);
        if (cartAfterDiscount.total < cartWithDiscount.total) {
          bestCoupon = coupon;
          cartWithDiscount = cartAfterDiscount;
        }
      }
      // Return the already-discounted cart; this also avoids calling
      // applyDiscount with an undefined coupon when no coupon helps.
      return [
        cartWithDiscount,
        couponsToApply.filter((c) => c !== bestCoupon),
      ];
    },
    [cart, coupons]
  );
  return cartWithDiscount;
}
```
You may notice that I'm using the reduce function, even though it may seem more complex than just a for loop. However, using reduce allows us to work with the same variables without needing to update anything outside of the current scope. Now, we can proceed to refactor further by extracting a getBestCoupon function and trimming it down:
```javascript
export function applyCoupons(cart, coupons) {
  const [cartWithDiscount] = range(coupons.length).reduce(
    ([processedCart, couponsToApply]) => {
      const bestCoupon = getBestCoupon(processedCart, couponsToApply);
      return [
        applyDiscount(processedCart, bestCoupon),
        couponsToApply.filter((c) => c !== bestCoupon),
      ];
    },
    [cart, coupons]
  );
  return cartWithDiscount;
}
```
I believe the result could have further refactoring, possibly using a more functional approach.
```javascript
export function applyCoupons(cart, coupons) {
  const [cartWithDiscount] = range(coupons.length).reduce(
    chain(applyBestCoupon, filterCoupon),
    [cart, coupons]
  );
  return cartWithDiscount;
}
```
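The chain combinator here is hypothetical, not a lodash function. One way to sketch it, in the spirit of Ramda's juxt, is a function that runs each step on the same accumulator and collects the results into the next accumulator. The toy steps below work over a simplified [total, coupons] pair just to show the wiring:

```javascript
// Hypothetical combinator: each step sees the whole accumulator and
// produces one slot of the next accumulator.
const chain = (...steps) => (acc) => steps.map((step) => step(acc));

// Toy steps over a [total, coupons] accumulator, for illustration only.
const applyBestCoupon = ([total, coupons]) => total - Math.max(...coupons, 0);
const filterCoupon = ([total, coupons]) => {
  const best = Math.max(...coupons);
  return coupons.filter((c) => c !== best);
};

const step = chain(applyBestCoupon, filterCoupon);
const [total, remaining] = step([100, [5, 20]]);
// total === 80, remaining is [5]
```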
After refactoring, we can see that the code becomes more readable and adaptable to future modifications. I would argue in favor of refactoring even in the absence of automated test coverage: the code becomes easier to comprehend, and potential bugs can be identified during code review. Just do it wisely and add some manual tests if needed; it is worth the effort.
Cognitive Complexity
"Simplicity is the key to brilliance." - Bruce Lee
Cognitive complexity is the level of mental effort required to understand a piece of code. It focuses on how challenging it is for a human programmer to comprehend and reason about the code. We take various factors into account:
- Nesting depth: The level of nesting within control structures such as loops and conditionals.
- Cyclomatic complexity: The number of independent paths through a piece of code.
- Block length: The number of lines or statements in a block.
- Code duplication: Repeated logic that must be read (and changed) in several places.
- Variable naming and documentation: The clarity and descriptive nature of variable names.
To simplify code and lower cognitive complexity, we often refactor by extracting blocks of code into separate functions (reducing block length), using early returns when the exit conditions are clear (reducing cyclomatic complexity), reusing existing code, and giving variables and functions meaningful names.
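As a small illustration of early returns, here is a hypothetical canApplyCoupon check written both ways; the cart and coupon fields are assumptions for the example:

```javascript
// Nested version: every rule adds a level of indentation.
function canApplyCouponNested(cart, coupon) {
  let result = false;
  if (coupon.active) {
    if (cart.total >= coupon.minPurchase) {
      if (!cart.appliedCoupons.includes(coupon.id)) {
        result = true;
      }
    }
  }
  return result;
}

// Early-return version: each exit condition is stated once, at the top.
function canApplyCoupon(cart, coupon) {
  if (!coupon.active) return false;
  if (cart.total < coupon.minPurchase) return false;
  if (cart.appliedCoupons.includes(coupon.id)) return false;
  return true;
}

const cart = { total: 100, appliedCoupons: ['WELCOME'] };
const coupon = { id: 'SAVE20', active: true, minPurchase: 50 };
// Both versions agree: the coupon can be applied.
```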
One point I want to dwell on is avoiding control structures, such as if and for blocks, where possible. These structures create nested blocks that run based on certain conditions, which can hurt code readability and maintenance. For instance, consider filtering products by name.
Here is the basic, initial implementation of products filter:
```javascript
function filterProducts(products, name) {
  const result = [];
  for (const product of products) {
    if (contains(product.name, name)) {
      result.push(product);
    }
  }
  return result;
}
```
Consider the following code instead:
```javascript
function filterProducts(products, name) {
  return products.filter(productContains({ name }));
}
```
The second version is shorter and easier to understand. One point I want to highlight here is that we don't always have to perform all tasks within the same loop: two O(n) passes are still O(n) overall. I often prefer to execute certain operations in separate loops rather than overcomplicating a single loop for the sake of performance. However, this approach isn't always applicable, and there may be cases where performing tasks in the same loop is necessary.
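To make the separate-loops point concrete, here is a toy comparison (the product data is made up): one loop that filters and sums at once versus two clear passes that produce the same result.

```javascript
const products = [
  { name: 'Blue Shirt', price: 30 },
  { name: 'Red Scarf', price: 15 },
  { name: 'White Shirt', price: 25 },
];

// One loop doing everything at once:
let totalSingle = 0;
for (const product of products) {
  if (product.name.toLowerCase().includes('shirt')) {
    totalSingle += product.price;
  }
}

// Two clear passes; still O(n) overall:
const totalSeparate = products
  .filter((p) => p.name.toLowerCase().includes('shirt'))
  .reduce((sum, p) => sum + p.price, 0);
// totalSingle and totalSeparate are both 55
```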
Another approach to minimize the use of control structures is by storing the logic in a configuration object and accessing it from there. For instance, let's consider the scenario of ordering products from various providers:
```javascript
async function placeOrder(order) {
  let result;
  if (order.provider === 'provider1') {
    result = await orderFromProvider1(order);
  } else if (order.provider === 'provider2') {
    result = await orderFromProvider2(order);
  } else if (order.provider === 'provider3') {
    result = await orderFromProvider3(order);
  } else {
    throw new Error('Unknown provider');
  }
  return result;
}
```
And we can do the same by holding the logic in a configuration object:
```javascript
const providers = {
  provider1: orderFromProvider1,
  provider2: orderFromProvider2,
  provider3: orderFromProvider3,
};

function placeOrder(order) {
  const orderFromProvider = providers[order.provider];
  if (!orderFromProvider) {
    throw new Error('Unknown provider');
  }
  return orderFromProvider(order);
}
```
Clearly, this is a simplified example, but I argue that it's more maintainable because we can easily extend the configuration object and make changes in one place. Implementing a typing system to specify the input and output of all functions can be incredibly advantageous. Furthermore, if the selection criteria become more complex, a factory function can build the correct function or the appropriate class instance.
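A minimal sketch of such a factory, with hypothetical handlers standing in for the real provider calls: the selection criteria live in one place, and callers only ever see the function that was chosen for them.

```javascript
// Hypothetical handlers standing in for the real provider calls.
const orderFromProvider1 = (order) => ({ ...order, handledBy: 'provider1' });
const orderFromProvider2 = (order) => ({ ...order, handledBy: 'provider2' });
const orderFromExpressDesk = (order) => ({ ...order, handledBy: 'express-desk' });

// Factory: richer criteria than a plain key lookup, kept in one place.
function createOrderHandler(order) {
  if (order.express) return orderFromExpressDesk;
  const handlers = {
    provider1: orderFromProvider1,
    provider2: orderFromProvider2,
  };
  const handler = handlers[order.provider];
  if (!handler) {
    throw new Error('Unknown provider');
  }
  return handler;
}

const order = { id: 42, provider: 'provider2', express: false };
const placed = createOrderHandler(order)(order);
// placed.handledBy === 'provider2'
```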
Conclusion
By prioritizing simplicity, we can enhance the maintainability, understandability, and reliability of our codebase. From function composition to minimizing control structures, this post offered practical insights and tips for crafting clean and elegant code, and underscored the significance of adhering to clean code principles, such as meaningful naming and minimizing complexity, to build robust and scalable software solutions.
As a passionate advocate for simplicity, I could talk all day about it. I have plenty more topics in mind, such as YAGNI (You Ain't Gonna Need It), SOLID principles, system design, and dependency management. However, let's keep it simple as suggested in the title.