Data Center Evolution and History
Contents: What is a data center? · 1960-2000 · 2000-2022 · Today and beyond · The first data center in history · Looking forward to the future
What is a data center?
Data centers are facilities that provide companies with networked computers, storage systems, and computing infrastructure to assemble, process, store, and disseminate large amounts of data. The applications, services, and data contained within a data center are typically critical to a business’s daily operations, making it an important asset.
Enterprise data centers increasingly integrate cloud computing resources alongside on-premises resources, together with facilities for securing and protecting both. Cloud computing has blurred the line between an enterprise’s own data centers and the data centers of cloud providers.
Historically, data centers can be traced back to the huge computer rooms of the 1940s, typified by the room that housed ENIAC, one of the earliest electronic computers.
Early computer systems were complex to operate and maintain, and so they required special operating environments. In order to connect all the components, numerous cables were necessary, and many methods were devised to accommodate and organize them, such as equipment racks, raised floors, and cable trays (installed overhead or under the elevated floor).
A single mainframe consumed a great deal of power and had to be cooled to avoid overheating. Security also grew in importance as computers became more expensive and were often used for military purposes, so basic guidelines were developed for controlling access to the computer room.
During the boom of the microcomputer industry, especially in the 1980s, computers were deployed everywhere, often with little or no consideration of operating requirements. However, as information technology (IT) operations became more complex, organizations began to realize the importance of controlling IT resources.
As a result of low-cost networking equipment and the introduction of new standards for structured cabling, it was possible for the servers to be located in specific rooms inside the company using a hierarchical design. Around this time, the term “data center” started to gain popularity as applied to specially designed computer rooms.
As a result of the dot-com bubble between 1997 and 2000, data centers boomed. To deploy systems and establish an online presence, companies needed fast Internet connectivity and non-stop operation. For many smaller companies, such equipment was not feasible.
The Internet data center (IDC) emerged as a large facility providing enhanced capabilities such as crossover backup: “In the event a Bell Atlantic line was cut, we could move to … to minimize the time an outage would last.”
Data centers hosted in the cloud are referred to as cloud data centers (CDCs), an attractive option because building and maintaining one’s own data center is usually expensive. In recent years, the distinction between these terms has almost completely disappeared, and such facilities are now simply called “data centers.”
1960-2000
Computing transitioned from vacuum tubes to transistors during the 1950s and 1960s, and transistors occupied far less space. Even so, the mainframe computers of the day were large enough to fill entire rooms.
In this era, the government operated many data centers, but companies were also beginning to invest in computers. A computer cost about $5 million at the time, or nearly $40 million in today’s currency. Because of the high price tag, several organizations decided to rent computers instead, at around $17,000 per month ($131,000 in today’s currency).
Computing advanced dramatically in the 1970s. In 1971, Intel began selling microprocessors commercially, and in 1973 Xerox introduced the Alto, a personal computer with a graphical user interface that foreshadowed the rise of the PC. Furthermore, Chase Manhattan Bank installed the world’s first commercial local area network, ARCnet.
ARCnet was capable of connecting up to 255 computers. As companies of all types became more interested in air-cooled computing designs that could be housed in offices, the demand for mainframes housed in separate data centers decreased.
Personal computers took over the computing landscape in the 1980s. Some large technology companies, including IBM, continued to build data centers to house supercomputers, but computing largely moved from back rooms to desktops.
The pendulum swung the other way again in the 1990s. With the advent of the client-server computing model, organizations once more began setting up special computer rooms to house their servers and networking equipment. With the growth of the Internet, the term “data center” came into common use, and some companies began building large data centers to house all their computing equipment.
2000-2022
The early 2000s saw a boom in data center construction due to the dot-com bubble. Every organization suddenly needed a website, which meant they needed a Web server. The emergence of hosting companies and co-location facilities led to thousands of servers being housed in data centers. Power consumption and cooling of data centers became more important issues.
In response to the trend toward cloud computing, some organizations have reduced the number of servers at their own data centers and consolidated into fewer facilities. In parallel, the major public cloud providers have built enormous, energy-efficient data centers.
VMware ESX was launched in 2001. It was among the first server virtualization products to use a bare-metal hypervisor, which runs directly on the server hardware without an additional underlying operating system.
In 2002, Amazon launched Amazon Web Services, which later grew to include cloud-based services such as storage, computation, and Amazon Mechanical Turk.
In 2006, Amazon Web Services began offering IT infrastructure services to businesses in the form of web services, a model now commonly known as cloud computing.
The modular data center was introduced by Sun Microsystems in 2007 and reshaped corporate computing in the process.
From 2008 to 2011, enterprises focused on the power efficiency, cooling technologies, and facility management of data centers.
In 2011, Facebook launched the Open Compute Project, an initiative to share best practices and specifications for developing energy-efficient and economical data centers.
Google invested USD 7.35 billion in capital expenditures in 2013 to build a global network of data centers.
Telcordia GR-3160, NEBS Requirements for Telecommunications Data Center Equipment and Spaces, standardized requirements for equipment and spaces in telecommunications data centers.
With the prevalence of edge data centers in 2019, a distributed computing paradigm has emerged, changing industry dynamics.
Today, data flows across multiple data centers, the edge, and public and private clouds, and a data center can communicate across all of these sites, both on-premises and in the cloud.
Today and beyond
Modern data centers are moving away from an infrastructure, hardware, and software ownership model toward a subscription-based and capacity-on-demand model.
To meet application demands, especially those of the cloud, today’s data centers need to match cloud capabilities. Thanks to consolidation, cost control, and cloud support, the entire data center industry is now changing.
Cloud computing combined with today’s data centers allows IT decisions about how resources are accessed to be made on a case-by-case basis, while the data centers themselves remain separate and untouched.
The first data center in history
The first data centers were constructed to house the first-ever electronic computer, the Electronic Numerical Integrator and Computer, or ENIAC. ENIAC was developed in 1946 for the US Army’s Ballistic Research Laboratory to calculate artillery firing tables. The vacuum tube-based machine occupied 1,800 square feet of floor space, consumed 150 kW of power, and delivered only 0.05 MIPS of computing power.
Early computer systems like ENIAC were complicated to maintain and operate, and because they served military purposes, they also demanded strict security. To accommodate all the equipment and implement the necessary security measures, special computer rooms were required, with racks, cable trays, cooling mechanisms, and access restrictions.
However, the term “data center” was not first used until the 1990s, when IT operations began expanding and inexpensive networking equipment became available. The possibility arose of housing all of a company’s servers on its own premises, and these specialized computer rooms became known within organizations as data centers.
During the dot-com bubble of the late 1990s, companies that relied on the internet and needed to maintain a constant online presence required larger facilities to house huge amounts of networking equipment. Data center construction became popular at this time, and the facilities began to resemble the ones described above.
Looking forward to the future
As cloud computing, the Internet of Things, cyber-physical systems, and artificial intelligence continue to flourish, data centers will become the core of the digital economy.
Today’s colocation facilities leverage all the innovations that have been hard-won over the last half-century in terms of connectivity, sustainability, efficiency, and resilience. It is therefore not surprising that business is booming: according to Research and Markets, the colocation market was projected to reach 55.31 billion US dollars by the end of 2021.
Further change is inevitable. Although no one knows exactly what the future holds, state-of-the-art colocation facilities can help organizations prepare for it.
External resource: siliconangle