What is grid computing?
Grid computing is a distributed computing architecture that combines computing resources from different administrative domains to reach one main goal. With grid computing, the computers in the network work on a task together and thus act as a virtual supercomputer.
A grid usually works on a variety of tasks within a network, but it can also be dedicated to a specific application. It is designed to solve problems too big for any single supercomputer while remaining flexible enough to handle a wide variety of smaller problems. Computing grids provide a multi-user infrastructure that accommodates the intermittent demands of large-scale information processing.
A grid is built from parallel nodes that form a computer cluster, typically running Linux or other free software. The cluster can vary in size from a small network of workstations to multiple large networks. The technology is used for a variety of applications, for example mathematical, scientific, or educational tasks spread across several computing resources. It is widely used in structural analysis, web services such as ATM banking, back-office infrastructure, and scientific or market research.
The idea of grid computing was first established in the early 1990s by Carl Kesselman, Ian Foster, and Steve Tuecke. They developed the Globus Toolkit, which provided components for data storage management, data processing, and intensive computation management.
Grid computing consists of applications for computationally intensive problems connected in a parallel networking environment. It links each participating computer and combines their resources to run a single computationally intensive application.
Grids draw on a variety of resources built on different software and hardware structures, programming languages, and frameworks, either on a private network or over open standards with specific guidelines, to achieve a common goal.
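The core idea above can be sketched in a few lines: split one large job into independent work units and farm them out to workers in parallel, then combine the partial results. This is only an illustrative sketch; a local process pool stands in for the grid's networked nodes, whereas a real grid would dispatch units to remote machines through middleware such as the Globus Toolkit.

```python
# Minimal sketch of the grid idea: partition a big job into independent
# work units, run them in parallel, and combine the partial results.
# A local process pool simulates the grid's nodes on one machine.
from multiprocessing import Pool

def work_unit(chunk):
    """One independent piece of the overall job (here: summing squares)."""
    return sum(x * x for x in chunk)

def run_on_grid(data, n_workers=4):
    # Partition the job into roughly equal work units, one per worker.
    size = (len(data) + n_workers - 1) // n_workers
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with Pool(n_workers) as pool:
        partials = pool.map(work_unit, chunks)  # run units in parallel
    return sum(partials)                        # combine partial results

if __name__ == "__main__":
    # Same answer as computing it on a single machine, just split up.
    print(run_on_grid(list(range(1000))))
```

The essential property, as in a real grid, is that the work units are independent: no unit needs another unit's result, so they can run on machines that share nothing but a network.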
Grid operations are generally divided into two categories:
Data grid: A system that handles large distributed data sets used for data management and controlled sharing among users. It creates virtual environments that support dispersed, organized research. The Southern California Earthquake Center is an example of a data grid; it uses a middleware system that creates a digital library, a distributed file system, and a continuing archive.
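The bookkeeping at the heart of a data grid is a replica catalog: a mapping from one logical file name to the physical copies dispersed across sites, so users can share data without knowing where it physically lives. The sketch below is illustrative only; the class, site names, and paths are made up for the example, not part of any real data-grid middleware.

```python
# Illustrative replica catalog for a data grid: it maps a logical file
# name to every physical copy registered across sites. Site names and
# paths below are invented for the example.
class ReplicaCatalog:
    def __init__(self):
        self._replicas = {}  # logical name -> set of physical URLs

    def register(self, logical_name, physical_url):
        """Record one more physical copy of a logical file."""
        self._replicas.setdefault(logical_name, set()).add(physical_url)

    def locate(self, logical_name):
        """Return all known physical copies of a logical file."""
        return sorted(self._replicas.get(logical_name, set()))

catalog = ReplicaCatalog()
catalog.register("quake/waveforms-2001.dat", "gsiftp://siteA/data/w2001.dat")
catalog.register("quake/waveforms-2001.dat", "gsiftp://siteB/archive/w2001.dat")
```

A user asks for `quake/waveforms-2001.dat` and the catalog answers with both physical locations; the middleware can then pick the nearest or least-loaded copy.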
CPU scavenging grid: A cycle-scavenging system that moves projects from one PC to another as needed. A well-known CPU scavenging grid is SETI@home, the search for extraterrestrial intelligence, which has involved more than three million computers.
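Cycle scavenging can be pictured as idle machines pulling work units from a central queue whenever they have spare capacity. The sketch below is a toy model: the node names and the `is_idle` check are stand-ins, where a real system would poll actual machine load and ship tasks over the network.

```python
# Toy sketch of cycle scavenging: queued work units are handed only to
# nodes that currently report themselves idle. Node names and the
# is_idle predicate are illustrative stand-ins.
from queue import Queue

def scavenge(task_queue, nodes, is_idle):
    """Drain the task queue using only nodes with spare cycles."""
    results = {}
    while not task_queue.empty():
        if not any(is_idle(n) for n in nodes):
            break  # no spare cycles anywhere right now
        for node in nodes:
            if task_queue.empty():
                break
            if is_idle(node):  # only borrow idle capacity
                task = task_queue.get()
                results.setdefault(node, []).append(task())
    return results

tasks = Queue()
for n in range(5):
    tasks.put(lambda n=n: n * n)  # five small work units

# Pretend "desk-1" is busy and "desk-2" is idle.
out = scavenge(tasks, ["desk-1", "desk-2"],
               is_idle=lambda node: node == "desk-2")
```

Because only the idle machine accepts work, all five units end up on `desk-2`, mirroring how a scavenging grid shifts projects toward whichever PCs have capacity at the moment.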
Grid computing is standardized by the Global Grid Forum and implemented by the Globus Alliance in the Globus Toolkit, the de facto standard for grid middleware, which includes various application components.
The grid architecture applies the protocols defined by the Global Grid Forum, which include:
- Grid security infrastructure
- Monitoring and discovery service
- Grid resource allocation and management protocol
- Global access to secondary storage (GASS) and GridFTP