Sunday, June 3, 2018

Datacenter migration: from DevOps to DataOps

If we were to list the major trends currently affecting enterprise data centers, most technical people and technology investors would probably agree on a core set. That list would include technologies such as cloud computing, containers and virtualization, microservices, machine learning and data science, flash memory, edge computing, NVMe, and GPUs. These technologies are essential to driving the digital transformation of enterprise organizations.
But the harder question is: what happens next? What emerging technologies or shifts in these trends are expected to be the next big thing in the data center industry? How will they affect the hardware and software markets?
At present, an emerging trend has begun to gain wide attention in large enterprises: a practice called DataOps. The new term derives from another, better-known concept, DevOps, which emerged about a decade ago with the goal of integrating software development ("Dev") and IT operations ("Ops"). Although DataOps shares some of DevOps's goals, it speaks more directly to some of the major changes we see in the data center industry today.
The emergence of DevOps
Let's start with DevOps. As early as 2008, when the concept was first proposed, it described an IT practice that aims to maximize the automation and repeatability of building and deploying applications. The idea is that if software developers and operations professionals work closely together, applications can be built and deployed faster and more cheaply. The goals of the practice include improving agility, achieving faster time to market, and enabling continuous application delivery.
Companies such as VMware, Docker, Puppet, and Chef have all ridden the DevOps wave.
The DevOps bubble burst
Despite the early frenzy and excitement among software developers, DevOps has stalled. According to a 2017 study, DevOps has not fully delivered on its initial promise. Of the 2,197 enterprise IT executives surveyed, only 17% of respondents ranked DevOps as having a significant strategic influence on their organizations, a percentage far lower than big data (41%) and public cloud infrastructure as a service (39%). One explanation offered by respondents is that the DevOps approach does not adequately account for data-intensive applications.
The rise of data
If there is one trend affecting enterprises in almost every industry, it is without doubt the growing emphasis on using data analytics to drive business value. According to a study by IDC, there will be 44 zettabytes of data worldwide by 2020, compared with just 3 exabytes in 1986. Whether the aim is improving customer experience, increasing operational efficiency, or creating new sources of revenue, data gives enterprises across industries leverage to strengthen their competitive advantage in the market.
Why is data so important
If the use of data has become a disruptive part of the business models enterprises adopt today, then how to manage and deploy data-intensive applications becomes a core concern of enterprise IT practice. Unlike the lightweight applications that the DevOps approach focuses on, data-intensive applications bring a new set of considerations as enterprise organizations begin to grapple with them.
Data management practices involve the entire application life cycle. For example, developing data science and machine learning applications requires large amounts of training data. The operations team then deploys a different set of applications. For performance reasons, data-intensive applications need to consider data locality, which means that processes should be deployed near the location where the data is continuously generated. In addition, whenever data is used by different teams within an enterprise organization, access to it must be controlled and managed by strict IT security policies.
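To make those two considerations concrete, here is a minimal, hypothetical sketch in Python: a tiny catalog records where each dataset physically lives, a policy table says which teams may read it, and a job is scheduled onto the node holding the data only after the policy check passes. The catalog, policy table, and scheduler below are illustrative stand-ins, not any particular platform's API.

```python
# Hypothetical illustration of two DataOps concerns described above:
# (1) placing compute near the data (data locality) and
# (2) gating access behind an explicit security policy.

from dataclasses import dataclass

@dataclass
class Dataset:
    name: str
    node: str          # node (or rack/site) where the data physically lives
    owner_team: str

# Hypothetical policy: which teams may read which datasets.
ACCESS_POLICY = {
    "clickstream": {"data-science", "analytics"},
    "payroll": {"finance"},
}

CATALOG = {
    "clickstream": Dataset("clickstream", node="node-17", owner_team="analytics"),
}

def schedule_job(dataset_name: str, team: str) -> str:
    """Return the node a job should run on, enforcing the access policy first."""
    if team not in ACCESS_POLICY.get(dataset_name, set()):
        raise PermissionError(f"team '{team}' may not read '{dataset_name}'")
    ds = CATALOG[dataset_name]
    # Data locality: run the process where the data is stored/generated,
    # instead of shipping large volumes of data across the network.
    return ds.node

if __name__ == "__main__":
    print(schedule_job("clickstream", "data-science"))  # -> node-17
```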
DataOps is for data-driven applications
These new data-centric concerns have stimulated the data center industry's need for practices that go beyond the constraints of DevOps. In short, DataOps is a flexible and agile way to develop and deploy data-intensive applications. It is driven largely by the growth of machine learning and data science teams within the enterprise, and it requires close cooperation among software developers and architects, security and governance professionals, data scientists, data engineers, and operations staff. DataOps is a people-and-process model that aims to improve repeatability, productivity, agility, and self-service while enabling continuous deployment of data science models.
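As a sketch of what continuous deployment of data science models can look like in practice, the following hypothetical Python example retrains a model on fresh data, evaluates it, and promotes it only if it clears an accuracy gate. The dataset, threshold, and output path are placeholders, not a reference to any specific product.

```python
# Hypothetical "retrain, evaluate, promote" loop for a data science model.
import joblib
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

MODEL_OUTPUT = "churn_current.joblib"   # stand-in for a model registry entry
MIN_ACCURACY = 0.95                     # promotion gate

def retrain_and_maybe_promote() -> None:
    # Stand-in for fresh training data pulled from the data platform.
    X, y = load_breast_cancer(return_X_y=True)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    model = LogisticRegression(max_iter=5000).fit(X_tr, y_tr)
    accuracy = model.score(X_te, y_te)

    if accuracy >= MIN_ACCURACY:
        joblib.dump(model, MODEL_OUTPUT)   # "deploy": publish the new model
        print(f"promoted new model, accuracy={accuracy:.3f}")
    else:
        print(f"kept previous model, accuracy={accuracy:.3f} is below the gate")

if __name__ == "__main__":
    retrain_and_maybe_promote()
```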
In our work with some large enterprises, some of which employ thousands of data scientists, I have noticed corresponding changes in the infrastructure, platforms, and tools they use to support DataOps. Although some of the tools used to support DevOps practices (such as containers and virtualization) remain at the core of DataOps, there are other, newer technologies whose adoption may point to the market winners of the next decade.
First, at the tool level, the DataOps practice requires a data science platform that supports the relevant languages and frameworks, such as Python, R, data science notebooks, and GitHub. In addition, a strong solution should help enforce strict data access and governance policies at every stage of the process. Data-as-a-service or self-service data marketplace tools are critical.
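Below is a hypothetical sketch of what self-service, data-as-a-service access might look like from a data scientist's notebook. The endpoint, token, and dataset name are invented for illustration; the only assumption is that a governed data service applies the access policy on the server side and returns the data in a tabular format such as CSV.

```python
# Hypothetical self-service data access from a notebook.
import io

import pandas as pd
import requests

DATA_SERVICE = "https://data.example.internal/api/v1"   # invented endpoint

def fetch_dataset(name: str, token: str) -> pd.DataFrame:
    """Request a dataset; the service enforces access policy server-side."""
    resp = requests.get(
        f"{DATA_SERVICE}/datasets/{name}",
        headers={"Authorization": f"Bearer {token}"},
        timeout=30,
    )
    resp.raise_for_status()                     # e.g. 403 if policy denies access
    return pd.read_csv(io.StringIO(resp.text))  # assume the service returns CSV

# Example usage inside a notebook (token and dataset name are placeholders):
# df = fetch_dataset("clickstream_daily", token="...")
# df.describe()
```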
At the platform level, DataOps requires a unified data architecture that manages and provides access to large amounts of data, including traditional structured data as well as newer unstructured and streaming data sets. With a global data architecture, data can be managed across physical locations and processed by a wide range of compute engines, including containerized processes. Finally, the platform chosen to support data-intensive applications must be optimized for data locality.
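As an illustration of such a global data architecture, the following hypothetical snippet has two different compute engines read the same logical path: a local pandas process and a containerized Spark job. The mount point and file name are invented; the only assumption is that some data fabric exposes a single namespace to both engines.

```python
# Hypothetical "global namespace": one logical path, multiple engines.
GLOBAL_PATH = "/mnt/datafabric/sales/2018/events.parquet"   # invented mount point

# Engine 1: lightweight exploration on a single node.
import pandas as pd
df = pd.read_parquet(GLOBAL_PATH)
print(df.head())

# Engine 2: a containerized Spark job reads the very same path,
# e.g. launched with `docker run ... spark-submit sales_agg.py`:
#
# from pyspark.sql import SparkSession
# spark = SparkSession.builder.appName("sales-agg").getOrCreate()
# events = spark.read.parquet(GLOBAL_PATH)
# events.groupBy("region").count().show()
```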
The next generation of market winners
As a veteran and student of the software industry, I know that the only constant in this industry is change. While no one has a crystal ball to predict the future, I think it is safe to say that the data centers of the next decade will look different from those of the past decade. DataOps is a trend worth watching. As enterprise adoption of these practices becomes more common, I predict we will see a corresponding shift in the technology market. The winners will be the companies that provide the tools and platforms that make it easier to develop and deploy data-intensive applications.
