
Sahil Kashyap (He/Him)

Exploring the companies that have shifted from traditional computing to cloud computing

Before jumping into the topic, let’s first start with the basics, from definitions, advantages, and disadvantages to types, models, and applications, before getting to the main agenda.

So, what is cloud computing, as a proper definition?
Cloud computing is the delivery of computing services including servers, storage, databases, networking, software, analytics, and intelligence over the Internet ("The Cloud") to offer faster innovation, flexible resources, and economies of scale.

Now let’s discuss it in layman’s terms: how does it actually work, and what are its advantages and disadvantages?
The cloud is nothing but services offered over the internet: you can rent servers, VMs and many other resources virtually, without spending any CapEx or paying any upfront cost. It’s as easy as purchasing a Netflix subscription, except you are charged only for the resources you actually use, which is known as the “Pay As You Go” model. With it, you can store your data, deploy a website or application, or even spin up a VM sized to your own requirements; a minimal sketch of what that looks like through a cloud SDK is shown below. After that, let’s discuss how it actually works in real life, from a company’s point of view.
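To make the “Pay As You Go” idea concrete, here is a minimal sketch of renting a virtual machine on demand. It assumes an AWS account and the boto3 SDK; the AMI ID, region and instance type are placeholders, and the same idea applies to any other provider.

```python
import boto3

# Connect to the EC2 service in one region (placeholder region)
ec2 = boto3.client("ec2", region_name="us-east-1")

# "Rent" one small virtual server; billing starts when it launches
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder machine image
    InstanceType="t3.micro",          # small, inexpensive instance size
    MinCount=1,
    MaxCount=1,
)
instance_id = response["Instances"][0]["InstanceId"]
print(f"Launched {instance_id}")

# Stop the instance when it is no longer needed, so compute charges stop too
ec2.stop_instances(InstanceIds=[instance_id])
```

There is no hardware to buy or maintain; the provider meters usage and bills only for the time the instance actually runs.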

Rather than owning their own computing infrastructure or data centers, companies can rent and access anything from applications to storage from a cloud service provider (e.g. AWS, Azure, GCP). The benefits of using cloud-computing services are many, but the ones that matter most are that firms can avoid the upfront cost and complexity of owning and maintaining their own IT infrastructure, and instead simply pay for what they use, when they use it. In turn, providers of cloud-computing services can benefit from significant economies of scale by delivering the same services to a wide range of customers. Cloud providers also offer disaster recovery, which is a very important feature of the cloud: data isn’t stored at just one data center or in only one region; multiple copies of the same data are kept across multiple regions so that it remains easily accessible and backed up.
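As a rough illustration of that multi-region backup idea, the sketch below writes the same object to two S3 buckets in different AWS regions, assuming boto3 and two hypothetical bucket names. In practice this is usually automated with replication rules rather than explicit copies, but the principle is the same.

```python
import boto3

data = b"critical business record"

# Primary copy in one region
boto3.client("s3", region_name="us-east-1").put_object(
    Bucket="example-primary-bucket",   # hypothetical bucket name
    Key="records/001.json",
    Body=data,
)

# Backup copy in a geographically separate region
boto3.client("s3", region_name="eu-west-1").put_object(
    Bucket="example-backup-bucket",    # hypothetical bucket name
    Key="records/001.json",
    Body=data,
)
```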

While there are many benefits, the cloud also comes at some price: data security concerns, data loss or theft, data leakage, account or service hijacking, insecure interfaces and APIs, denial-of-service attacks, technology vulnerabilities (especially in shared environments), cloud downtime and many more.

Well, what is the history of cloud computing?
Cloud computing as a term has been around since the early 2000s, but the concept of computing as a service has been around for much, much longer: as far back as the 1960s, computer bureaus would allow companies to rent time on a mainframe rather than have to buy one themselves.

These 'time-sharing' services were largely overtaken by the rise of the PC, which made owning a computer much more affordable, and then in turn by the rise of corporate data centers where companies would store vast amounts of data.

But the concept of renting access to computing power has resurfaced again and again – in the application service providers, utility computing, and grid computing of the late 1990s and early 2000s. This was followed by cloud computing, which really took hold with the emergence of software as a service and hyperscale cloud-computing providers such as Amazon Web Services.

How important is the cloud?
Building the infrastructure to support cloud computing now accounts for a significant chunk of all IT spending, while spending on traditional, in-house IT slides as computing workloads continue to move to the cloud, whether that is public cloud services offered by vendors or private clouds built by enterprises themselves. Indeed, it's increasingly clear that when it comes to enterprise computing platforms, like it or not, the cloud has won.

Tech analyst Gartner predicts that as much as half of spending across application software, infrastructure software, business process services and system infrastructure markets will have shifted to the cloud by 2025, up from 41% in 2022. It estimates that almost two-thirds of spending on application software will be via cloud computing, up from 57.7% in 2022.

That's a shift that only gained momentum in 2020 and 2021 as businesses accelerated their digital transformation plans during the pandemic. The lockdowns throughout the pandemic showed companies how important it was to be able to access their computing infrastructure, applications and data from wherever their staff were working – and not just from an office.

Gartner said that demand for integration capabilities, agile work processes and composable architecture will drive the continued shift to the cloud.

The scale of cloud spending continues to rise. For the full year 2021, tech analyst IDC expects cloud infrastructure spending to have grown 8.3% compared to 2020 to $71.8 billion, while non-cloud infrastructure is expected to grow just 1.9% to $58.4 billion. Long term, the analyst expects spending on compute and storage cloud infrastructure to see a compound annual growth rate of 12.4% over the 2020-2025 period, reaching $118.8 billion in 2025, and it will account for 67.0% of total compute and storage infrastructure spend. Spending on non-cloud infrastructure will be relatively flat in comparison and reach $58.6 billion in 2025.

All predictions around cloud-computing spending are pointing in the same direction, even if the details are slightly different. The momentum they are describing is the same: tech analyst Canalys reports that worldwide cloud infrastructure services expenditure topped $50 billion in a quarter for the first time in Q4 2021. For the full year, it puts cloud infrastructure services spending growth at 35%, reaching $191.7 billion.

Canalys argues that there is already a new growth opportunity for cloud on the horizon, in the form of augmented and virtual reality and the metaverse. "This will be a significant driver for both cloud services spend and infrastructure deployment over the next decade. In many ways, the metaverse will resemble the internet today, with enhanced capabilities and an amplified compute consumption rate," the analyst said.

Now let us discuss the companies that have shifted from traditional ways of computing to cloud computing.

Let’s first start with the super popular one - Netflix

The journey to the cloud at Netflix began in August 2008, when they experienced a major database corruption and for three days could not ship DVDs to their members. That is when they realized they had to move away from vertically scaled single points of failure, like relational databases in their data center, towards highly reliable, horizontally scalable, distributed systems in the cloud. They chose Amazon Web Services (AWS) as their cloud provider because it offered the greatest scale and the broadest set of services and features. The majority of their systems, including all customer-facing services, had been migrated to the cloud prior to 2015. Since then, they took the time necessary to figure out a secure and durable cloud path for their billing infrastructure as well as all aspects of their customer and employee data management. In early January 2016, after seven years of diligent effort, they finally completed their cloud migration and shut down the last remaining data center bits used by their streaming service.

Moving to the cloud has brought Netflix a number of benefits. They have eight times as many streaming members as they did in 2008, and those members are much more engaged, with overall viewing growing by three orders of magnitude in eight years.

The Netflix product itself has continued to evolve rapidly, incorporating many new resource-hungry features and relying on ever-growing volumes of data. Supporting such rapid growth would have been extremely difficult out of their own data centers; they simply could not have racked the servers fast enough. Elasticity of the cloud allows them to add thousands of virtual servers and petabytes of storage within minutes, making such an expansion possible. On January 6, 2016, Netflix expanded its service to over 130 new countries, becoming a truly global Internet TV network. Leveraging multiple AWS cloud regions, spread all over the world, enables them to dynamically shift around and expand their global infrastructure capacity, creating a better and more enjoyable streaming experience for Netflix members wherever they are.
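As a rough sketch of that elasticity, the snippet below raises the desired capacity of an existing EC2 Auto Scaling group so that AWS adds servers within minutes, and could just as easily shrink it again. It assumes boto3 and a hypothetical group name; it illustrates the general mechanism, not Netflix’s actual tooling.

```python
import boto3

autoscaling = boto3.client("autoscaling", region_name="us-east-1")

# Scale out ahead of an expected spike in demand; the group launches
# new instances automatically until the desired capacity is reached
autoscaling.set_desired_capacity(
    AutoScalingGroupName="example-streaming-fleet",  # hypothetical group name
    DesiredCapacity=200,
    HonorCooldown=False,
)
```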

They rely on the cloud for all of their scalable computing and storage needs: their business logic, distributed databases and big data processing/analytics, recommendations, transcoding, and hundreds of other functions that make up the Netflix application. Video is delivered through Netflix Open Connect, their content delivery network that is distributed globally to efficiently deliver their bits to members’ devices.

The cloud also allowed them to significantly increase their service availability. There were a number of outages in their data centers, and while they have hit some inevitable rough patches in the cloud, especially in the earlier days of cloud migration, they saw a steady increase in their overall availability, nearing their desired goal of four nines of service uptime. Failures are unavoidable in any large scale distributed system, including a cloud-based one. However, the cloud allows one to build highly reliable services out of fundamentally unreliable but redundant components. By incorporating the principles of redundancy and graceful degradation in their architecture, and being disciplined about regular production drills using the Simian Army, it is possible to survive failures in the cloud infrastructure and within their own systems without impacting the member experience.
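To give a feel for that kind of production drill, here is a small sketch in the spirit of the Simian Army (not Netflix’s actual tool): it picks one running instance from a hypothetical opt-in tag group and terminates it, so the team can verify that the service degrades gracefully. It assumes boto3 and should only ever be run against infrastructure that is designed to survive it.

```python
import random
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Find running instances that have opted in to chaos testing
reservations = ec2.describe_instances(
    Filters=[
        {"Name": "tag:chaos-target", "Values": ["true"]},      # hypothetical tag
        {"Name": "instance-state-name", "Values": ["running"]},
    ]
)["Reservations"]

instances = [i["InstanceId"] for r in reservations for i in r["Instances"]]

if instances:
    victim = random.choice(instances)
    print(f"Terminating {victim} to exercise failure handling")
    ec2.terminate_instances(InstanceIds=[victim])
```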

Cost reduction was not the main reason they decided to move to the cloud. However, their cloud costs per streaming start ended up being a fraction of those in the data center, a welcome side benefit. This is possible due to the elasticity of the cloud, enabling them to continuously optimize instance type mix and to grow and shrink their footprint near-instantaneously without the need to maintain large capacity buffers. They can also benefit from the economies of scale that are only possible in a large cloud ecosystem.

Given the obvious benefits of the cloud, why did it take them a full seven years to complete the migration? The truth is, moving to the cloud was a lot of hard work, and they had to make a number of difficult choices along the way. Arguably, the easiest way to move to the cloud is to forklift all of the systems, unchanged, out of the data center and drop them in AWS. But in doing so, you end up moving all the problems and limitations of the data center along with them. Instead, they chose the cloud-native approach, rebuilding virtually all of their technology and fundamentally changing the way they operate the company. Architecturally, they migrated from a monolithic app to hundreds of microservices, and denormalized their data model, using NoSQL databases. Budget approvals, centralized release coordination and multi-week hardware provisioning cycles made way for continuous delivery, with engineering teams making independent decisions using self-service tools in a loosely coupled DevOps environment, helping accelerate innovation. Many new systems had to be built, and new skills learned. It took time and effort to transform Netflix into a cloud-native company, but it put them in a much better position to continue to grow and become a global TV network.
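As a small, hedged illustration of what a denormalized NoSQL record can look like (the table name and attributes below are hypothetical, not Netflix’s actual schema), the item duplicates fields from related records so a service can read everything it needs in a single request instead of performing joins.

```python
import boto3

# Hypothetical DynamoDB table holding viewing history
table = boto3.resource("dynamodb", region_name="us-east-1").Table("example-viewing-history")

table.put_item(
    Item={
        "member_id": "12345",
        "title_id": "tt0000001",
        # Fields copied (denormalized) from member and title records,
        # so no join is needed when reading a viewing event
        "title_name": "Example Title",
        "member_plan": "premium",
        "watched_at": "2016-01-06T20:15:00Z",
    }
)
```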

Netflix streaming technology has come a long way over the past few years, and it feels great to finally not be constrained by the limitations they've previously faced. As the cloud is still quite new to many of them in the industry, there are many questions to answer and problems to solve. Through initiatives such as Netflix Open Source, they hope to continue collaborating with great technology minds out there and together address all of these challenges.

The second one is - General Electric

General Electric (GE) began its digital transformation in 2014, but it chose Amazon Web Services (AWS) as its preferred provider three years later, relying on the service to host more than 2,000 cloud-based apps and services.

“Adopting a cloud-first strategy with AWS is helping our IT teams get out of the business of building and running data centers and refocus our resources on innovation as we undergo one of the largest and most important transformations in GE’s history,” said Chris Drumgoole, Chief Technology Officer and Corporate Vice President of General Electric.

GE experimented with the notion of developing its own industrial cloud a few years before switching to AWS but decided against it. It chose to concentrate on other areas of its company while entrusting cloud infrastructure to AWS.

GE, the 123-year-old staple of the global industrial sector, is going all in on the cloud. The company plans to migrate 9,000 applications to public IaaS over the next three years. It is reducing its data centers from more than 30 to the single digits. But for a company with $117 billion in annual revenue; tens of thousands of apps; hundreds of thousands of servers; petabytes of storage and networks in hundreds of countries around the world, migrating to the cloud isn’t as easy as lifting and shifting.

Some of GE’s core industries – energy, health care and finance – are heavily regulated by protocols that were written for a different era. “They assume the construct of a client-server world,” Drumgoole says. “The regulations are written in a way that assumes there’s a server, a hypervisor and a physical data center that you control. Fundamentally, those don’t apply in the cloud world.”

The whole point of the public cloud is that vendors, such as AWS, provide those components as a service to customers. “The constructs of the regulations haven’t taken into consideration the advances in technology,” he adds. Through the ONUG working group, Drumgoole is hoping that a single, more modern nomenclature across the various providers, one that regulators can use, could be developed for the next generation of policies.

Embracing a cloud-first mentality across the organization required adjustments internally, too. Drumgoole arrived at GE two years ago to find the traditional angst between software developers and infrastructure operators. Devs can’t get the infrastructure they need; ops folks don’t know what the software teams need. Cloud seemed like the natural answer to this problem.

GE invested in building tools, and in creating systems and processes for managing the cloud and ensuring regulatory compliance. When GE’s IT team introduced the cloud services, some of those software developers and ops teams didn’t want to use them. “Some of the legacy, single-technology developers struggled with deploying and moving apps when we took away the support envelope of a traditional infrastructure team,” he says, adding that the challenge has largely been overcome, though it required a shift in mindset.

Last but not least - Xerox

Migrating to a cloud-first strategy to meet the challenges of the modern-day B2B business
Over the last 20 years Xerox has adapted its offering considerably. Having started life inventing the copier, laser printer and Ethernet, Xerox more recently launched its service business, helping organizations in the insurance, healthcare, government and retail sectors with their IT and other back-office challenges. Its key focus at present is print technology, with a lot of its work centered around 3D printing and metallic print.

Xerox’s SMART Centre website was created for salespeople, both internal and external, to access all of the company’s sales content and collateral to help them on their selling journey. The site was designed to make the salesperson’s journey easier and more streamlined, and it sees up to 7,000 unique visitors per month.

Project objective
Xerox has worked with Optimizely for many years, using its Ektron Content Management System for its SMART Centre website. However, the UK&I team saw an opportunity to transform their digital operations by moving over to the cloud. This would not only save on server costs but also improve the online experience for colleagues, sales staff and distribution resellers.
Transforming the printing industry one page at a time
Xerox migrated to Optimizely’s Digital Experience Platform, which provides a cloud-first approach to engagement: high availability and performance, easy connectivity with other cloud services and existing systems, the ability to manage spikes in customer demand, and a platform that is ready to seamlessly adopt the latest technology updates. The new platform helped adapt the site to the changing needs of the salesperson, from what was previously a library of downloadable content to a site with more functionality, different types of views and more calls to action.

Results
The key reason for the migration was to save on server costs, as having everything hosted on local Xerox servers while the company was still using Ektron was costly. The move saved the printing company a considerable amount of money on these costs alone.
Other benefits have included performance improvements in terms of site speed and load times.
Xerox shaved two seconds off its average page load times using the cloud service, and has also noticed fewer instances where the site has experienced downtime, which was common when it was still using its local servers.
The functionality of the site has also improved since the migration. Using the advanced insights enabled on the platform, Xerox was able to create a connection point for most of the company’s sales tools in a tools marketplace, allowing users to access them in an app-store format. This has been taken further with AI, which provides recommendations for sales tools that users might want to use depending on which part of the sales cycle they are currently in.

Future plans
Xerox’s future plans are centered around content. The company produces a lot of content daily, which is also translated, and a lot of analysis is currently underway to see whether producing so much content is justified.
Because of the in-depth insights offered on the new platform, Xerox is able to capture usage information to tell who is reading the content and how users are interacting with the platform.
