Uri Friedman has put together a brief history of Big Data on ForeignPolicy.org. Here are a few highlights from the timeline: In 1997, “NASA researchers Michael Cox and David Ellsworth use the term ‘big data’ for the first time to describe a familiar challenge in the 1990s: supercomputers generating massive amounts of information — in Cox and Ellsworth’s case, simulations of airflow around aircraft — that cannot be processed and visualized. ‘[D]ata sets are generally quite large, taxing the capacities of main memory, local disk, and even remote disk,’ they write. ‘We call this the problem of big data.’”

In 2002, “After the 9/11 attacks, the U.S. government, which has already dabbled in mining large volumes of data to thwart terrorism, escalates these efforts. Former national security advisor John Poindexter leads a Defense Department effort to fuse existing government data sets into a ‘grand database’ that sifts through communications, criminal, educational, financial, medical, and travel records to identify suspicious individuals. Congress shutters the program a year later due to civil liberties concerns, though components of the initiative are simply shifted to other agencies.”

See the full timeline here.

Image: Courtesy Flickr/mrflip