[Image: Dino O./Shutterstock.com. Caption: "This will definitely not satisfy your company's data-storage needs."]

How much data storage does any organization need? That's a tough question for any CIO or IT administrator, and one made that much harder by many businesses' newfound emphasis on data analytics. How much storage does an IT department need to provision when there's an expectation of keeping and analyzing mountains of information from customers, social networks, retail transactions, and more?

Hence the rise of companies such as Cleversafe, which just announced a series of Intel Xeon processor-based storage arrays capable of capturing data at a rate of 1 terabyte per second. At that rate, the system, which offers a 10-exabyte storage capacity, can ingest the equivalent of 20 million 3MB photographs per minute, or 1.728 million 50GB high-definition videos per day (a quick arithmetic check of those figures appears at the end of this article).

"Petabyte and Exabyte scale data sets simply exceed the boundaries of normal data processing capabilities," Terri McClure, senior analyst at the Enterprise Strategy Group, wrote in an Aug. 14 statement tied to the Cleversafe announcement. "With traditional storage systems there are too many potential performance bottlenecks to capture data flow at such volumes."

Cleversafe's models will hit the market at an as-yet-unannounced point this fall.

Wrestling with data at enormous scale has become a preoccupation of many IT vendors (in addition to the companies that need to handle those enormous data loads, of course). It has led to the creation of initiatives such as IBM's G2 project, which centers on building systems that identify relationships between new and existing data in real time; in theory, that would speed up decisions about new data while reducing the overall amount of data that needs to be processed and stored. Offerings such as SAP's High-Performance Analytic Appliance (HANA), an in-memory database technology, also give organizations the ability to shift performance-sensitive analytic application workloads onto systems that crunch data at an extremely fast rate.

But while IT vendors are becoming much better at designing systems for storing and processing large amounts of data, there's a growing shortage of analysts and programmers capable of building tools that can drill into that data in new and insightful ways.

"The enterprise data warehouse was always a pie in the sky to begin with," Howard Dresner, chief research officer for Dresner Advisory Services, told SlashBI in July. "Governance, meanwhile, is already suffering, and nobody is coupling investments in data scientists with governance."
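As a footnote to the throughput figures above, here is a minimal back-of-the-envelope sketch in Python. It assumes decimal units (1 TB = 10^12 bytes), which is what makes the arithmetic come out to the quoted numbers; the 3MB photo and 50GB video sizes are the article's own examples, and the fill-time line is a derived figure, not one from the announcement.

    # Sanity check of the throughput equivalences quoted above.
    # Assumption: decimal units (1 TB = 10**12 bytes), per the quoted figures.

    MB, GB, TB, EB = 10**6, 10**9, 10**12, 10**18

    rate = 1 * TB  # claimed capture rate, in bytes per second

    photos_per_minute = rate * 60 // (3 * MB)          # -> 20,000,000
    videos_per_day = rate * 60 * 60 * 24 // (50 * GB)  # -> 1,728,000
    days_to_fill = 10 * EB / rate / 86_400             # time to fill 10 exabytes

    print(f"{photos_per_minute:,} 3MB photos per minute")
    print(f"{videos_per_day:,} 50GB videos per day")
    print(f"{days_to_fill:.1f} days of continuous capture to fill 10 exabytes")

Both quoted equivalences check out under those assumptions, and the last line shows that even at a sustained 1 terabyte per second, filling the full 10-exabyte capacity would take roughly 116 days.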