Topic: Data Infrastructure

Latest Headlines

Variety, not volume, biggest big data challenge in 2015

Many in the industry have said since the outset that the term "big data" is unfortunate and that sometime soon we'd stop using it. Are we there yet? If we aren't, we likely will be soon because the focus this year will increasingly be on mastering data variety rather than just coping with data size.

Big data benchmark test results for 1, 3 and 10 terabyte scale factors released

Raghunath Nambiar, chairman of the Transaction Processing Performance Council (TPC) big data benchmark committee and a Cisco Systems distinguished engineer, announced in a blog post last week the release of big data benchmark test results for 1, 3 and 10 terabyte scale factors. 

UK government could use 'big data' to select services, pre-fill forms on behalf of citizens; will US follow suit?

According to a new report, the U.K. government could increase the usefulness of its big data work by using it to pre-fill service request forms for citizens, tailor services to individuals and groups of citizens (i.e., personalization), and better monitor the quality and delivery of services at the local and individual level.

Stoppers and cogs in the big data wheel

Quite a bit of attention is given to improving data management and analysis at the data scientist level. But I submit for your consideration today an often overlooked area that needs our utmost attention: the cogs and wedges in both data entry and action execution.

451 Research names '6 Cs' of IT disruption for 2015

For 2015, 451 Research has an interesting list of havoc headed our way, the "6 Cs": containers, convergence, cloud security, closets, crowd workers, and coexistence.

Global IT management survey finds common data center issues, frequency of occurrence

Kelton Research, at the behest of TeamQuest, a global IT service optimization provider, conducted a survey of over four hundred IT professionals in 10 countries in an effort to peg common data center issues and their frequency of occurrence. Here's what they found.

IU SciPass in beta, aims to securely speed large data transfers

Like many institutions, Indiana University was frustrated using Science DMZ to optimize large data transfers: for all the benefits of its security features, those same features significantly slow the transfers. SciPass, software now in beta, is IU's pass at finding a better, faster way to handle data transfers.

Spotlight: Net neutrality now a political football, outcome could affect cost of data transfers

No doubt you heard that President Obama urged the FCC to protect net neutrality this week. But the conversation in the capital has degraded, according to an eWeek report. While the President does not have...

Spotlight: Hadoop glossary defines big data terms

The guide defines 20 big data/Hadoop terms. 

Fujitsu bridges the petabyte divide

Vendors are hard at work developing technologies that will help us make the leap to extreme data. Now Fujitsu has announced that it has achieved "unlimited scalability" with its Storage ETERNUS CD10000 architecture.