
Effectively managing the enterprise data deluge

PCHF IT Feeds

PCHF Tech News
PCHF Bot
Jan 10, 2015
Over recent years, emerging technologies such as 5G, the Internet of Things (IoT) and artificial intelligence (AI) have generated significant excitement across many industries. This is largely because they have the potential to dramatically change the way in which information or data is stored and used, enhancing transparency and security, while also improving productivity and insights. However, these technologies also have the potential to generate extreme amounts of data, which will require storage facilities to match, with an emphasis on low latency and high capacity.

About the author


Davide Villa is the EMEAI Business Development Director at Western Digital.


It’s difficult to predict the future, but one fairly safe prediction is that the amount of data produced will continue to expand exponentially in the coming decade. IDC recently predicted that more than 59 zettabytes (ZB) of data will be created, captured, copied, and consumed in the world this year. With this huge explosion in data already happening, how can companies across all industries prepare and optimize their storage systems?
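To get an intuitive feel for the scale of IDC's figure, a quick back-of-the-envelope calculation helps. The snippet below is purely illustrative; the world-population figure is an assumed round number, not part of the IDC forecast.

```python
# Rough scale check of IDC's ~59 zettabyte figure.
# A zettabyte (ZB) is 10**21 bytes.
ZB = 10**21
total_bytes = 59 * ZB

# Assumed round figure for illustration only.
world_population = 8 * 10**9

per_person_bytes = total_bytes / world_population
print(f"{per_person_bytes / 10**12:.1f} TB per person")  # 7.4 TB per person
```

Even split evenly across every person on the planet, a year's data works out to several terabytes each, which is why capacity planning has become a board-level concern.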

Industry-wide data explosion


Organisations across multiple industries now have troves of raw data that require powerful and sophisticated analytics tools to allow them to gain insights that can improve operational performance and create new market opportunities. The businesses that are able to harness these capabilities effectively will be able to create significant value and differentiate themselves, while others will find themselves increasingly at a disadvantage.

The financial sector is one vertical that is constantly inundated with enormous amounts of data, ranging from banking transactions to analyst projections and stock prices. Another is manufacturing, where automation, IoT and analytics are being applied to production lines, scaling output from hundreds of units per hour to thousands or even millions.

As businesses across all sectors contend with the perpetual growth of data, they need to rethink how data is captured, preserved, accessed and transformed. Whether it’s high-performance AI, hyper-scale always-on systems or even personal gaming, many legacy storage systems experience lower performance, higher latencies, and poor quality of service when confronted with some of the new challenges of fast data.

The NVMe™ solution to big data


To address these challenges, many businesses are turning to Non-Volatile Memory Express (NVMe), a protocol designed from the ground up for highly demanding, compute-intensive enterprise, cloud computing and edge data ecosystems. NVMe offers a number of features that change what businesses can do with data. These include:

1. Increased performance


The first flash-based SSDs leveraged legacy SATA/SAS physical interfaces, protocols, and form factors, none of which was designed for high-speed flash media. PCI Express (PCIe) was the next logical storage interface, but early PCIe SSDs relied on proprietary firmware, which made system scaling particularly challenging. NVMe emerged in response to these challenges, offering significantly higher performance and lower latencies than the legacy SAS and SATA protocols.
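One concrete source of that performance gap is command parallelism. SATA's Native Command Queuing (NCQ) allows a single queue of up to 32 outstanding commands, while the NVMe specification allows up to 65,535 I/O queues, each up to 65,535 commands deep. A trivial calculation shows the difference in theoretical outstanding I/O:

```python
# Theoretical maximum outstanding commands per device under each protocol.
# SATA NCQ: one queue, up to 32 commands.
sata_outstanding = 1 * 32

# NVMe (per the NVM Express specification): up to 65,535 I/O queues,
# each with up to 65,535 entries.
nvme_outstanding = 65_535 * 65_535

print(sata_outstanding)   # 32
print(nvme_outstanding)   # 4294836225
```

Real devices expose far fewer queues than the specification's ceiling, but the queue-per-core design is what lets NVMe keep modern multi-core CPUs and flash media busy at the same time.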

2. Easy to deploy


NVMe storage can also be deployed without a specialized networking build-out: NVMe over Fabrics (NVMe-oF) runs over existing Ethernet or Fibre Channel infrastructure. This matters because it means no changes are required to the application.

3. Benefits for the bottom line


Conventional protocols consume many CPU cycles to make data available to business applications, and these wasted compute cycles cost businesses real money. IT infrastructure budgets aren’t growing at the pace of data, and businesses are under tremendous pressure to maximize returns on infrastructure – both in storage and compute. Because NVMe can handle rigorous application workloads with a smaller infrastructure footprint, organisations can reduce total cost of ownership and accelerate top-line business growth.

Data solutions


We are just starting to scratch the surface of the data revolution. New discoveries coming from IoT, machine learning and new applications are transforming the value of data, and organisations need to re-think their storage solutions in order to efficiently handle these new technologies.

NVMe offers enterprise features that have simply not existed before, opening up a new paradigm for businesses to design and build their applications – with higher performance, lower latencies and at a fraction of the cost. The adoption of NVMe will be a vital step for any organisation that wishes to prepare for the big data revolution of the 2020s and beyond.

