Data center modernization, big data analytics and the Internet of Things are helping organizations build smarter storage infrastructure. Cheaper, more powerful CPUs are driving built-in intelligence into every layer of the data storage stack.
In storage, for example, spare computing power can be used to deploy agile software-defined storage (for example, HPE StoreVirtual), to move to hyper-converged architectures (for example, HyperGrid, Nutanix, Pivot3, SimpliVity), or to intelligently redistribute storage functions between the application server host and the disks to optimize I/O.
However, all of this built-in intelligence has a drawback: it can reduce people's visibility into the data storage infrastructure and into any IT change, whether that change comes from patches and upgrades, user growth, subtle errors or component failures. In other words, dynamic optimization powered by cheap, powerful processors makes it increasingly difficult to determine what is actually happening in the infrastructure.
Until there is a truly autonomous data center, people cannot simply rely on low-level components to do the right thing without knowing the details. Just as public cloud computing has not eliminated the need for in-house expertise, IT may find this intelligence a double-edged sword: while smarter storage infrastructure helps with configuration, optimization, capacity planning and troubleshooting, it can also leave people blind to, or fooled by, what the infrastructure is actively doing on their behalf.
Despite all these potential downsides, given the choice, most people would still want a smarter, more autonomous IT world, even at some risk of the AI getting out of hand.
It's all about data
Remember when analysis was an offline process? Capture some data to a file, open Excel, SAS or another desktop tool, and receive a recommendation a few weeks later. Today, that approach is too slow and too simplistic.
Given the speed and agility of today's applications and users, not to mention larger data streams and flexible cloud brokering, people need insight and faster answers than ever before. Smart management starts with a large amount of reliable data; today's infrastructure produces more of it every day (in fact, with the rise of the Internet of Things, people will soon be drowning in new data), and all of this information must be processed and managed.
Storage arrays, for example, have long produced insightful data, but historically it took vendor-specific, complex and expensive storage resource management applications to take full advantage of it. Fortunately, several developments now enable smarter IT systems management and better, faster use of that data:
Data processing. As the Internet of Things grows, storage components are generating more and more detailed data, and this growth calls for big data analytics techniques within IT itself. IT administrators would do well to spend some time learning Python and Spark.
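Before reaching for Spark, the core idea can be sketched in plain Python: aggregate raw telemetry into per-device summaries. The record shape and metric names below are illustrative assumptions, not any vendor's actual log format.

```python
import json
from collections import defaultdict

# Hypothetical per-array telemetry, shaped like the JSON-lines logs a
# storage component might emit (array names and fields are made up).
RAW_LOGS = """\
{"array": "array-01", "metric": "read_latency_ms", "value": 2.1}
{"array": "array-01", "metric": "read_latency_ms", "value": 2.7}
{"array": "array-02", "metric": "read_latency_ms", "value": 9.4}
"""

def average_by_array(raw: str) -> dict:
    """Aggregate average read latency per array from JSON-lines telemetry."""
    totals, counts = defaultdict(float), defaultdict(int)
    for line in raw.splitlines():
        rec = json.loads(line)
        totals[rec["array"]] += rec["value"]
        counts[rec["array"]] += 1
    return {a: totals[a] / counts[a] for a in totals}

print(average_by_array(RAW_LOGS))  # per-array mean latency
```

The same group-and-aggregate pattern maps directly onto a Spark job once the data outgrows a single machine.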
Consumable APIs. Modern storage platforms now offer easy-to-use RESTful (representational state transfer) APIs, allowing anyone (with permission) to pull critical data directly into almost any type of third-party analysis tool. Standard APIs also enable and enhance third-party systems management integration platforms such as OpenDataSource.
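The payoff of a REST API is that its JSON responses are trivial to post-process in any language. The sketch below works against a canned payload standing in for what a hypothetical GET /api/v1/volumes endpoint might return; the endpoint path and field names are assumptions, not a specific vendor's schema.

```python
import json

# Canned response imitating a storage array's REST API (illustrative only).
CANNED_RESPONSE = json.dumps({
    "volumes": [
        {"name": "vol-a", "capacity_gb": 500, "used_gb": 420},
        {"name": "vol-b", "capacity_gb": 1000, "used_gb": 150},
    ]
})

def volumes_over_threshold(payload: str, pct: float) -> list:
    """Return names of volumes whose utilization exceeds pct (0-1)."""
    data = json.loads(payload)
    return [v["name"] for v in data["volumes"]
            if v["used_gb"] / v["capacity_gb"] > pct]

print(volumes_over_threshold(CANNED_RESPONSE, 0.8))  # only vol-a is >80% full
```

In practice the payload would come from an authenticated HTTP GET; the parsing and thresholding logic stays the same.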
Call-home support. Most storage providers now build call-home functionality into their arrays, enabling them to send detailed machine logs back to the vendor for daily processing. Vendors can then use big data tools to aggregate that data, providing proactive support and insight to customers as well as better product management and marketing. Call-home capability is also available as a service from vendors such as Glassbeam, which can also help provide a customer portal that, as a value-add, gives IT end users direct insight into usage and performance.
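The vendor-side value of call-home data comes from aggregating events across the whole installed fleet to spot patterns no single customer would see. A minimal sketch, with model names and event codes invented for illustration:

```python
from collections import Counter

# Hypothetical call-home events collected from many customers' arrays.
EVENTS = [
    ("ModelX", "DISK_PREDICTIVE_FAIL"),
    ("ModelX", "DISK_PREDICTIVE_FAIL"),
    ("ModelY", "FAN_DEGRADED"),
    ("ModelX", "FAN_DEGRADED"),
]

def events_per_model(events) -> Counter:
    """Count events by (model, event code) so a vendor can spot
    fleet-wide patterns and open proactive support cases."""
    return Counter(events)

fleet = events_per_model(EVENTS)
print(fleet.most_common(1))  # the most frequent (model, event) pair
```

A spike in one (model, event) pair across the fleet is exactly the kind of signal that triggers the proactive support the article describes.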
Visualization. Big data for IT benefits from many excellent visualization tools often used by enterprise business intelligence applications (such as Tableau), so IT itself can now build business-friendly dashboards and reports. At the same time, clean and accessible open source visualization libraries such as d3.js have made it easier for many vendors to create custom product dashboards and shareable widgets.
Next-generation intelligence. Some vendors are doing genuinely smart things beyond visualization. It is no longer enough to surface product-specific key performance indicators; advanced products (for example, VMware vRealize Operations, Tintri, PernixData/Nutanix) turn this detailed data into operational intelligence. As a first step, today's vendors can intelligently roll up low-level data streams into expert "models" of health, capacity or risk. Some models generate linear projections of a platform-specific "score." Truly advanced modeling can take future workload growth and storage infrastructure upgrade plans into account, and can predict nonlinear performance based on analysis of queuing behavior.
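Why does queuing analysis matter for these models? Because response time grows nonlinearly with utilization, a straight-line projection badly underestimates latency at high load. A minimal sketch using the textbook M/M/1 queue formula (a simplification; real tools use far richer models):

```python
def mm1_response_time(service_ms: float, utilization: float) -> float:
    """Mean response time of an M/M/1 queue: R = S / (1 - U).
    Latency grows nonlinearly as utilization approaches 100%."""
    if not 0 <= utilization < 1:
        raise ValueError("utilization must be in [0, 1)")
    return service_ms / (1.0 - utilization)

# Doubling load from 40% to 80% busy triples response time, and 95% busy
# is an order of magnitude worse -- a linear projection misses all of this.
for u in (0.4, 0.8, 0.95):
    print(f"{u:.0%} busy -> {mm1_response_time(2.0, u):.1f} ms")
```

This is exactly the behavior that makes nonlinear performance prediction valuable when planning workload growth.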
Intelligent machines
With the spread of big data analytics and Internet of Things applications, there are sure to be exciting new developments in the smarter data storage infrastructure space.
For example, people are only seeing the beginning of machine learning applied to systems management. Watch for machine learning to drive smarter optimization, to be offered as software-as-a-service analytics, to be embedded in client consoles for dynamic operations and in dashboards and portals for strategic planning intelligence, and to be pushed down into the equipment itself to help devices become more autonomous.
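Even a simple statistical model illustrates the kind of telemetry-driven anomaly detection such management tools perform. This z-score sketch is a toy stand-in (real products use much richer models); the latency samples are invented for illustration.

```python
import statistics

def anomalies(samples, threshold=3.0):
    """Flag samples more than `threshold` population standard deviations
    from the mean -- a toy version of the anomaly detection a systems
    management tool might run over its telemetry streams."""
    mean = statistics.fmean(samples)
    stdev = statistics.pstdev(samples)
    if stdev == 0:
        return []
    return [x for x in samples if abs(x - mean) / stdev > threshold]

latencies = [2.0, 2.1, 1.9, 2.2, 2.0, 15.0]  # hypothetical ms readings
print(anomalies(latencies, threshold=2.0))  # flags the 15.0 ms outlier
```

Flagging an outlier is the easy part; the "next-generation" step is correlating it with configuration changes and workload trends to suggest, or take, corrective action.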
No one should be surprised if, soon after cars begin to drive themselves, storage arrays begin to tell people how they would like to process data. One day, organizations may even administer an enterprise-oriented IT intelligence test to a new storage array to see whether it is ready to run the data center.