
DevOps Learning ~ Day 1

Starting a new journey is always an exciting endeavour, and today marked the beginning of my exploration into the world of DevOps. I learned the fundamentals and was introduced to a plethora of concepts that lay the foundation for this intriguing field.


The Need for DevOps

At the core of my day 1 learning, I came to understand why DevOps has become an integral part of modern-day software development. DevOps isn't just a buzzword. It's a methodology that bridges the gap between development and operations, ensuring seamless collaboration and continuous integration. As I pondered the challenges faced during traditional software development, the reasons for adopting DevOps became clear: faster releases, improved collaboration, and enhanced quality assurance.

The Dance of the Software Development Life Cycle (SDLC)


One of the first concepts I encountered was the Software Development Life Cycle (SDLC). It's the backbone of any software project, comprising phases from idea conception to deployment. DevOps steps onto the stage at crucial points within the SDLC, particularly in building, testing, and deploying software. This real-time collaboration ensures that development and operations teams work in harmony, leading to smoother and more efficient processes.

Within the SDLC, building, testing, and deploying are the phases a DevOps engineer works with most; the sketch below illustrates the idea.
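
To make that concrete, here is a toy sketch of what a build-test-deploy step might look like as a shell script. The commands inside each step (make build, ./deploy.sh) are placeholders for whatever a real project actually uses, not part of any specific tool.

```bash
#!/usr/bin/env bash
# Toy pipeline sketch: build, then test, then deploy.
# `make build`, `make test`, and `./deploy.sh` are placeholders,
# not commands from any specific project.
set -euo pipefail   # abort the pipeline on the first failing step

echo "Building..."
make build

echo "Testing..."
make test

echo "Deploying..."
./deploy.sh staging
```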

Unveiling Virtual Machines and Cloud Computing

As I progressed in my learning, I found myself unveiling the mysteries of virtual machines (VMs) and cloud computing. VMs are like digital versions of physical computers, allowing a user to run multiple operating systems on a single physical machine. Cloud computing, on the other hand, has revolutionised the way we deploy and manage systems. It offers scalability, flexibility, and the power to spin up resources as needed.
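
As a hands-on illustration, here is a minimal sketch of creating a local VM from the command line, assuming the Multipass tool is installed (Vagrant or VirtualBox would work just as well; exact flags may vary slightly between Multipass versions):

```bash
# Launch a small Ubuntu VM on your own machine
multipass launch --name dev-vm --memory 2G --disk 10G

# Open a shell inside the VM, as if it were a separate computer
multipass shell dev-vm

# List running VMs, then clean up when finished
multipass list
multipass delete dev-vm && multipass purge
```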

Basics of Cloud Computing:

  • Flexibility and Scalability: Cloud computing provides the ability to adjust resources based on demand. Physical servers require large up-front investments and can be challenging to scale and manage; cloud services allow users to provision additional resources or reduce them as required.
  • Cost Efficiency: Cloud services operate on a pay-as-you-go model, meaning you only pay for the resources that you use. This eliminates the need for large capital investments in hardware, reducing financial risks. Additionally, you avoid costs associated with maintaining and upgrading physical servers.
  • Resource Management: Cloud providers manage the underlying infrastructure, including hardware maintenance, updates, and security patches. This allows businesses to focus on their core competencies without dedicating significant resources, time, and labour to server management.
  • Global Accessibility: Cloud services offer global accessibility. Users can access their applications and data from anywhere in the world with an internet connection. This is a significant advantage over traditional setups that require physical presence in data centers.
  • Rapid Deployment: Setting up physical servers can take weeks due to procurement, setup, and configuration. Cloud resources can be provisioned within minutes, enabling rapid deployment of applications and services (see the sketch after this list).
  • Global Reach: Cloud providers operate data centers around the world. This allows businesses to deliver content and services with low latency to users across different geographical locations.
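
To illustrate the pay-as-you-go and rapid-deployment points, here is a sketch using the AWS CLI. The AMI ID, key pair name, and instance ID below are placeholders, not real values:

```bash
# Provision a small virtual server in minutes (placeholder IDs throughout)
aws ec2 run-instances \
  --image-id ami-0123456789abcdef0 \
  --instance-type t2.micro \
  --key-name my-key-pair \
  --count 1

# Terminate it when done; with pay-as-you-go, billing stops here
aws ec2 terminate-instances --instance-ids i-0123456789abcdef0
```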

The Role of a DevOps Engineer

To start with the basics:

i. Automation

ii. Monitoring and Alerting

iii. Security and Compliance

iv. Version Control

v. CI/CD

vi. Documentation

Understanding the role of a DevOps engineer was a pivotal moment in my day 1 journey. A DevOps engineer isn't just a coder or an operations guru; they are the bridge between development and operations. They orchestrate processes, automate tasks, and champion a culture of collaboration. Their expertise lies in tools that streamline the entire development lifecycle, from code creation to deployment.
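
For example, version control (item iv above) in its simplest form is just a handful of git commands; a minimal sketch, where deploy.sh is a placeholder file:

```bash
git init                           # start tracking a project
git add deploy.sh                  # stage a change (deploy.sh is a placeholder)
git commit -m "Add deploy script"  # record it with a message
git log --oneline                  # review the history
```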

As I wrap up my reflections on day 1, I'm excited to have embarked on this DevOps journey. From the essentials of why DevOps matters to the foundational understanding of SDLC, virtual machines, cloud computing, and the pivotal role of a DevOps engineer, I can already see the immense potential this field holds.

Stay tuned as I dive deeper into the DevOps universe, exploring the tools, practices, and technologies that make this methodology a game-changer in modern software development.


Basic Commands of Linux

Explore the essential Linux commands and file system navigation basics from day 1.

`whoami` Displays the current user's username.

`pwd` Prints the current working directory (i.e., the path to the directory you're in).

`ls` Lists files and directories in the current directory.

`ls -a` Lists all files and directories, including hidden ones (those starting with `.`).

`ls -la` Lists all files and directories in a detailed, long format, including hidden ones.

`mkdir folderName` Creates a new directory with the specified name.

`cd folderName` Changes the current directory to the specified folder.

`cd ..` Moves up one directory level.

`cd` Changes to the user's home directory.

`touch fileName.txt` Creates a new empty file with the specified name.

`echo "Hello world"` Prints the text "Hello world" to the terminal.

`mv fileName.txt folderName2` Moves the file fileName.txt into folderName2.

`nano` Opens the Nano text editor to create or edit files.

`cat fileName.txt` Displays the content of the file fileName.txt in the terminal.

`history` Displays a list of recently executed commands.

`rm fileName.txt` Deletes the file fileName.txt.
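
To see how these commands fit together, here is a short sample session (the `>` redirection and the `-r` flag on `rm` go slightly beyond the list above):

```bash
mkdir demo                        # create a directory
cd demo                           # move into it
touch notes.txt                   # create an empty file
echo "Hello world" > notes.txt    # redirect the text into the file
cat notes.txt                     # prints: Hello world
cd ..                             # go back up one level
rm -r demo                        # remove the directory and its contents
```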

System Management:

`sudo apt install nodejs` Installs the Node.js package using the APT package manager with superuser privileges.

`sudo apt update` Updates the package list and information for available software packages.

Note: It is recommended to use apt interactively on the terminal and apt-get in bash scripts. apt is the newer tool and is constantly evolving, which can sometimes break backward compatibility, whereas apt-get has a stable interface that is guaranteed to stay backward compatible, making it the safer choice for scripts.
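
A minimal sketch of what that looks like in a script, using apt-get for its stable interface:

```bash
#!/usr/bin/env bash
set -euo pipefail

sudo apt-get update
sudo apt-get install -y nodejs   # -y answers prompts automatically for unattended runs
```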

`man` Displays the manual page for the specified topic, providing detailed information about commands and concepts.


Until next time :) Connect with me here :)
