Small Data

by noospheer

Big data is big buzz these days. Essentially, there's a lot of data out there and it's a massive, attic-style mess: crap is strewn all over the place!

Companies in this space love to talk about peta-, exa-, zetta- or yottabytes (1 petabyte = 1 million gigabytes). Yet the entirety of English Wikipedia is roughly 200 GB. Seemingly forgotten in the big data world is the process of normalization, in which a data set is shrunk by removing redundancy without losing any of the information's fidelity.
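As a toy illustration of what normalization buys you (a hypothetical sketch, not noospheer's actual schema): factoring a repeated attribute out of a flat table makes it smaller while keeping every bit of information recoverable.

```python
# Hypothetical example: normalizing a flat event log by factoring out
# the repeated user details into a separate table.
flat = [
    {"user": "alice", "email": "alice@example.com", "page": "/home"},
    {"user": "alice", "email": "alice@example.com", "page": "/docs"},
    {"user": "bob",   "email": "bob@example.com",   "page": "/home"},
]

# Pull the repeated (user, email) pairs into their own table...
users = {row["user"]: row["email"] for row in flat}
# ...and keep only a compact reference in the event table.
events = [(row["user"], row["page"]) for row in flat]

# Same information, fewer repeated bytes.
print(users)   # {'alice': 'alice@example.com', 'bob': 'bob@example.com'}
print(events)  # [('alice', '/home'), ('alice', '/docs'), ('bob', '/home')]
```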

By combining simple record deduplication with a generic spatial schema that works for both tabular and graph data, noospheer aims to make big data a lot smaller, permitting far greater informational diversity and complexity than is currently achieved. A sketch of the deduplication step follows below.
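Here is a minimal sketch of record deduplication, assuming records arrive as dictionaries; this is an illustration, not noospheer's implementation. Each record is hashed in a canonical form and only the first occurrence is kept.

```python
import hashlib
import json

def dedupe(records):
    """Yield each distinct record once, keyed by a hash of its canonical JSON form."""
    seen = set()
    for record in records:
        # Canonicalize so that key order doesn't create spurious "new" records.
        key = hashlib.sha256(
            json.dumps(record, sort_keys=True).encode("utf-8")
        ).hexdigest()
        if key not in seen:
            seen.add(key)
            yield record

records = [
    {"id": 1, "title": "Big data"},
    {"title": "Big data", "id": 1},   # duplicate, just a different key order
    {"id": 2, "title": "Small data"},
]
print(list(dedupe(records)))  # only two distinct records remain
```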

You can download the Wikipedia dump here.
