[caption id="attachment_1506" align="aligncenter" width="618" caption="IBM's Big Data initiatives are designed to crunch data like it's never been crunched before."] [/caption] IBM’s new Smarter Computing initiative, which includes several improvements to the company’s data-processing and storage technologies, is designed to boost its presence in Big Data. It could also help the company achieve the industry’s long-held goal of exascale computing: processing power exceeding one million trillion floating-point calculations per second. Government agencies want exascale computing by the end of the decade, and IT vendors such as Intel are hard at work on systems capable of delivering that sort of performance.

IBM’s newly announced products include the IBM Platform Symphony family, a grid manager for large data workloads and analytics; the IBM System x Intelligent Cluster integrated with IBM Platform HPC software, intended to simplify cluster deployment and deliver results faster; and the addition of IBM Platform Cluster Manager and IBM Platform LSF to the High Performance Computing (HPC) Cloud portfolio. On top of that, IBM unveiled improvements to the IBM General Parallel File System, which will process workloads faster, and a new edition of its iDataPlex system, the IBM System x iDataPlex dx360 M4. Under the umbrella of a “Smarter Storage” initiative, it also introduced a wave of enhancements to storage products such as IBM System Storage SAN Volume Controller (SVC), a storage virtualization system, and IBM Storwize V7000.

With these additions, IBM clearly wants a much larger piece of the Big Data market, particularly as it applies to heavy-duty analytics and technical programming, also known as the technical computing market. Research firm IDC has predicted that market will expand from $20.3 billion in 2012 to almost $29.2 billion by 2016.
According to one analyst, IBM could have a larger goal in mind with its new wave of products. “IBM believes that to achieve its goal of exascale computing, it has to add intelligence to its systems, storage, networking equipment and a whole host (pun intended) of software,” Dan Kusnetzky, an analyst with the Kusnetzky Group, wrote in a June 6 research note.

IBM also used its announcement to emphasize the integration of products from its Platform Computing acquisition into the portfolio, a fact not lost on Kusnetzky. “Platform’s customers have been using Symphony and LSF to support technical computing, modeling for commercial workloads and Big Data applications for years,” he wrote. “Unfortunately, very little of that success has shown up on the industry radar screen.” IBM closed its acquisition of Platform Computing in January 2012; the vendor specialized in cluster- and cloud-management software and environments for high-performance computing (HPC) applications.

IBM evidently means to change that under-the-radar status, Kusnetzky added: “Now that IBM’s Platform Computing group has made it possible for users of MapReduce to manage their efforts, it is very likely that we’re going to hear more about Platform tools.”

Image: Principal/Shutterstock.com