What you think is big data isn't what the bloke next to you thinks is big

Enter NIST with a standards process to define bigness and assist interoperability

America's National Institute of Standards and Technology (NIST) has added another monster document to its big data frameworks, this time covering interoperability.

The seven-volume, 340-plus-page tome is designed to set the direction for America's input into international standards, NIST says in its canned announcement.

NIST digital data advisor Wo Chang says the aim is “to develop a reference architecture that is vendor-neutral, and technology- and infrastructure-agnostic to enable data scientists to perform analytics processing for their given data sources without worrying about the underlying computing environment.”

The ballooning accumulation of data is on the group's mind: “The rate at which data volumes, speeds and complexity are growing is outpacing scientific and technological advances in data analytics, management, transport and more”, the announcement notes.

User confusion is another problem, NIST reckons, with people unsure about the characteristics that demand the word “big” in front of the word “data”, and about how big data differs from traditional data environments and applications.

Other key questions the request for comment identifies are “How do Big Data systems integrate into our current data systems? What are the central scientific, technological and standardisation challenges that need to be addressed to accelerate the deployment of robust Big Data solutions?”

The NIST Big Data working group, and the documents that are open for comment until May 22, are here. ®
