How big is “big data”?

[Image: Compuserve for DOS]

The internet is constantly changing. According to Cisco, North America by itself will produce over 1.1 ZB (!) of information per year by 2016.

I can remember being a kid, even before AOL was big, sitting at a computer that weighed over 20 lbs and ran DOS 3.0 and Windows v1.1. I would surf the forums on Compuserve, using up my dad's 5-hour allotment for the month on the first day of the new billing cycle. Then AOL came out, and my dad got that as well as Compuserve, still running on that 286 machine.

Then the internet started becoming popular, so AOL and Compuserve added the ability to browse the web and began offering unlimited access. Of course, being on DOS, I wasn't able to get onto the internet; that was only for people who could run Windows 3.1 or higher. So when I went over to a friend's house that had a Windows-based machine, we would browse the internet that way.

If you can remember back that far, you'll recall that AOL had roughly 30 million members at its peak. That's more than the estimated 2011 populations of New York City and Los Angeles combined. AOL was by far the largest ISP in the world at the time. In 2000, according to Internet World Stats, there were 360,985,492 internet users worldwide, meaning AOL's membership was roughly 8% of the global total. This was well before Facebook. Well before Twitter. eBay, if you can believe it, was only a few years old. By comparison, today there are an estimated 2,405,518,376 internet users. So in 12 years (it's only the beginning of 2013, so there isn't data for this year yet), internet use has jumped 566.4%.
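
For anyone who wants to double-check that growth figure, here's a minimal Python sketch of the arithmetic, using the two user counts quoted above:

```python
# Worldwide internet users: 2000 vs. year-end 2012
# (figures quoted above, attributed to Internet World Stats).
users_2000 = 360_985_492
users_2012 = 2_405_518_376

# Percentage growth over the 12-year span.
growth_pct = (users_2012 - users_2000) / users_2000 * 100
print(f"Growth since 2000: {growth_pct:.1f}%")  # Growth since 2000: 566.4%
```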

Getting to the point, looking at the history gives us a little perspective on where we're headed. According to Google's CEO, Eric Schmidt, back in 2010: every two days, we create as much information on the internet as we did from the beginning of civilization up to 2003. Speaking in technical terms, according to the video included in this post from ZeroLag, North America produced 261 exabytes of data in 2011. That's equivalent to 267,264 petabytes, or 273,678,336 terabytes, or 280,246,616,064 gigabytes. That is a LOT of information. The video goes on to estimate that by 2016, North America will produce around 1.1 zettabytes (about 1,126 exabytes) of data per year. Again, that is North America by itself, not the entire world. I'd be curious to see how Cisco compiled their figures, but I'm even more curious how we're going to securely and reliably store all of this data going forward. Oh, and don't even get me started on the power and backup requirements!
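
If you want to reproduce those conversions yourself, here's a quick Python sketch; note that it assumes the binary (base-1024) multipliers, which is what the figures above work out to:

```python
# Unit conversions using binary multipliers:
# 1 EB = 1,024 PB; 1 PB = 1,024 TB; 1 TB = 1,024 GB.
exabytes = 261

petabytes = exabytes * 1024
terabytes = petabytes * 1024
gigabytes = terabytes * 1024

print(f"{exabytes} EB = {petabytes:,} PB")     # 261 EB = 267,264 PB
print(f"{petabytes:,} PB = {terabytes:,} TB")  # 267,264 PB = 273,678,336 TB
print(f"{terabytes:,} TB = {gigabytes:,} GB")  # 273,678,336 TB = 280,246,616,064 GB
```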

What are your thoughts?