[Image caption: Hopefully this piece of abstract art will convey the total chaos of the Web over the next few years.]

The number of Internet users worldwide will explode to 48 percent of the world's population by 2017, according to Cisco's annual Visual Networking Index (VNI), which forecasts and analyzes Internet network growth. That's 3.6 billion Internet users relying on 18 billion global network connections (up from 12 billion connections in 2012) and generating 3 trillion Internet video minutes per month, the equivalent of roughly 6 million years of video.

The average Internet-enabled household will generate 74.5 gigabytes of data per month from 7.1 devices/connections (up from 4.7 devices/connections in 2012). By 2017, annual global IP traffic will hit 1.4 zettabytes, which is more traffic than the Internet generated between 1984 and 2012 combined. (A zettabyte is the equivalent of a trillion gigabytes.)

Clearly there's an insatiable demand for bandwidth that will only increase in the years ahead. Cisco's Visual Networking Index 2013 doesn't differ greatly from last year's edition, which capped its range at 2016 and predicted that global IP traffic would reach 1.3 zettabytes by then.

For a firm like Cisco, which sells all sorts of networking equipment and software, a rapid expansion in network traffic could translate into more customers needing to purchase infrastructure and services. That's a win, so long as software and hardware innovation continues in ways that meet customer demands. But for analysts and companies that deal in Big Data, a growing Internet presents some very real challenges. There's already plenty of data in the world; the key is finding tools and processes that can collect, store, and analyze all that information in meaningful ways.
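The scale figures above are easy to sanity-check. A quick sketch of the unit conversions (the input numbers come from the article; Cisco's "6 million years" is a rounding of roughly 5.7 million):

```python
# Sanity-check the scale figures quoted from Cisco's VNI forecast.

MINUTES_PER_YEAR = 60 * 24 * 365  # 525,600 minutes in a (non-leap) year

# 3 trillion video minutes per month, expressed in years of video.
video_minutes_per_month = 3e12
video_years = video_minutes_per_month / MINUTES_PER_YEAR
print(f"{video_years / 1e6:.1f} million years of video per month")

# A zettabyte is 10**21 bytes; a gigabyte is 10**9 bytes,
# so one zettabyte is a trillion (10**12) gigabytes.
gb_per_zettabyte = 10**21 // 10**9
print(f"1 ZB = {gb_per_zettabyte:,} GB")
```

Running it shows about 5.7 million years of video per month, consistent with the forecast's rounded figure.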
Firms with a lot of unstructured data to manage under considerable time and resource constraints, such as Facebook and Netflix, have turned to open-source frameworks such as Apache Hadoop, which can run data applications on large hardware clusters. The data deluge has also driven those firms to create new technologies: to deal with the tidal wave of data sloshing through Facebook's infrastructure, for example, the social network's engineers built a new scheduling framework known as "Corona," which offers better data-cluster utilization, fairer scheduling, and reduced job latency.

But new technologies don't stay new for very long. Whether managing an inflow of data (e.g., Facebook, Google), serving massive amounts of it (Netflix, Amazon, Apple), or slicing and dicing it to serve up ads or narrow down customer demographics (pretty much everyone), the job is about to get a lot harder for analysts and engineers in the years to come.

Image: agsandrew/Shutterstock.com