Traditionally, a typical data protection architecture consists of a server whose only purpose is to receive data from the endpoints. This server is responsible for pulling data from the endpoints or receiving the data they send. It may also deduplicate and compress that data and update the file and media catalog. All of these responsibilities made dedicating a server to the task the best practice. But is this decade-old way of doing things still the best practice?
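As a rough illustration, the core of such a dedicated server reduces to a receive-deduplicate-compress-catalog loop. The sketch below is hypothetical (all names are illustrative) and only meant to show how those responsibilities concentrate in one machine:

```python
import gzip
import hashlib

# Hypothetical sketch of the classic pipeline: every endpoint funnels its
# data into this one process, which deduplicates, compresses, and records
# each file in the backup catalog. All names here are illustrative.

class BackupServer:
    def __init__(self):
        self.store = {}    # content hash -> compressed bytes
        self.catalog = []  # (endpoint, filename, content hash) records

    def ingest(self, endpoint: str, filename: str, data: bytes) -> None:
        digest = hashlib.sha256(data).hexdigest()
        if digest not in self.store:          # deduplicate: keep one copy
            self.store[digest] = gzip.compress(data)
        self.catalog.append((endpoint, filename, digest))  # update catalog

server = BackupServer()
server.ingest("web01", "/etc/hosts", b"127.0.0.1 localhost\n")
server.ingest("web02", "/etc/hosts", b"127.0.0.1 localhost\n")  # stored once
```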
In the past ten years, many things have changed. Back then, the data center needed a high-end system to handle all the responsibilities placed on the backup server. In addition, because computing power was very limited, applications were dedicated to single servers to ensure they got the performance they required. Now most mid-range servers provide enough power to drive the backup process, with plenty of compute left over for applications, and with virtualization, multiple applications can be stacked on each physical host.
The dedicated backup server also has shortcomings. First, companies have to buy a high-end server just to back up data, a job that in most cases runs only once a day. Second, the backup server becomes a bottleneck: dozens or even hundreds of systems can send data to it at the same time, but all of that data must funnel into a single system, making the backup server an unusual point of concentration for both the network and compute.
Another challenge for a dedicated backup server is scale. What does the organization do if the server runs out of network or compute resources? It has to upgrade to an even bigger system, and it is not uncommon for a backup server to exhaust both its network and its compute capacity.
Direct backup
Those who want to modernize the data center may consider another option: backing up directly to the cloud. Direct backup means that a physical server, or even a virtual machine, sends its data straight to a cloud-based backup repository. Going directly to cloud backup eliminates the enterprise's concerns about scaling compute and network: when a new server is added, it brings its own resources, so those resources essentially scale on their own.
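For illustration only, a direct-backup agent can be a small script that every server runs itself, so each new server brings its own compute and bandwidth to the job. This sketch assumes an S3-compatible object store reached through boto3; the bucket name, key layout, and function name are all made up for the example:

```python
import hashlib

import boto3  # assumes an S3-compatible object store; pip install boto3

s3 = boto3.client("s3")
BUCKET = "backup-repository"  # hypothetical bucket name

def backup_file(hostname: str, path: str) -> None:
    """Upload one file straight to the cloud repository, no middle server."""
    with open(path, "rb") as f:
        data = f.read()
    digest = hashlib.sha256(data).hexdigest()
    # Content-addressed object key: identical content maps to the same
    # object, so re-uploading an unchanged file just overwrites a copy.
    s3.put_object(Bucket=BUCKET, Key=f"chunks/{digest}", Body=data)
    # A tiny per-host catalog entry pointing at the content.
    s3.put_object(Bucket=BUCKET,
                  Key=f"catalog/{hostname}{path}",
                  Body=digest.encode())
```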
The concern with direct backup is the potential impact on application performance, but in modern, compute-rich data centers, processing power is far less of a worry than it used to be. The other problem is management: the enterprise needs to consider how to manage the backups of all these separate components and how to retain ownership of the protected data.
Solving these problems requires a new, cloud-hosted software architecture that can centrally manage thousands of endpoints and consolidate their data into one repository. Cloud computing is an ideal fit for this role. The endpoints can perform their own deduplication and compression, efficiently sending only new data segments directly to the cloud. The cloud-hosted software is, in essence, the orchestration and management engine for all the endpoints it protects. It should also provide global deduplication to control cloud storage costs.
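A minimal sketch of that endpoint-side flow, assuming fixed-size segments, SHA-256 fingerprints, and a repository index that can answer "which of these hashes are new to you?" The InMemoryRepo class below is a stand-in for the real cloud service:

```python
import gzip
import hashlib

CHUNK_SIZE = 4 * 1024 * 1024  # 4 MiB fixed-size segments (illustrative)

class InMemoryRepo:
    """Stand-in for the cloud repository's global dedup index and chunk store."""
    def __init__(self):
        self.chunks = {}

    def missing(self, digests):
        return [d for d in digests if d not in self.chunks]

    def upload(self, digest, blob):
        self.chunks[digest] = blob

def backup_stream(repo, data: bytes) -> list:
    """Deduplicate and compress on the endpoint; send only new segments."""
    segments = [data[i:i + CHUNK_SIZE] for i in range(0, len(data), CHUNK_SIZE)]
    digests = [hashlib.sha256(s).hexdigest() for s in segments]
    to_send = set(repo.missing(digests))  # one index round trip to the cloud
    for digest, segment in zip(digests, segments):
        if digest in to_send:
            repo.upload(digest, gzip.compress(segment))  # only new data travels
    return digests  # ordered hash list: the recipe to reassemble the stream

repo = InMemoryRepo()
backup_stream(repo, b"x" * (10 * 1024 * 1024))
backup_stream(repo, b"x" * (10 * 1024 * 1024))  # second run uploads nothing
```

Because the index check runs against the repository's global store, a segment already uploaded by any endpoint never travels again; that is the global deduplication that keeps cloud storage costs under control.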
Considering the amount of data being protected, users' expectations for that data, and the computing power available to drive the process, the classic backup architecture built around a dedicated backup server needs to change. Rather than a single dedicated server, the backup software needs to be more distributed. One way to achieve this is direct backup, in which the application server sends its data straight to the backup device or target.