Topic: Data Management and Migration

Latest Headlines

Ford 'retools company around big data'

While naysayers and worrywarts continue to wring their hands over "too few big data implementations," visionaries are already moving beyond big data projects and reshaping their entire companies around it. The latest to do so is Ford, which is now "retooling the company around big data."

Novetta makes connection between unstructured Hadoop data and critical enterprise data

Novetta says that since 9/11, it has developed, deployed and tested Novetta Entity Analytics (NEA) within the federal government. Now it has announced it is making NEA available for commercial use. The product focuses on data variety more than data volume.

Logicalis US offers encrypted storage option

Logicalis U.S. is heavily pushing flash storage in its cloud offerings and has now added the option of encrypted storage across its suite of products.

Spotlight: The Internet of Things has 4 big data problems

In a post on O'Reilly Radar, Alistair Croll points out four large and looming big data problems in the Internet of Things. One of them is "datamandering," or data sprawl.

UPDATE: More details on the 1010data data store launch

I reported earlier this week on the impending 1010data Facts data store launch, and I promised in that post that I would report back with more details. Yesterday I interviewed Sandy Steier, CEO and co-founder of 1010data, via email, and here is what he revealed about this very interesting product.

Variety, not volume, biggest big data challenge in 2015

Many in the industry have said since the outset that the term "big data" is unfortunate and that sometime soon we'd stop using it. Are we there yet? If we aren't, we likely will be soon because the focus this year will increasingly be on mastering data variety rather than just coping with data size.

Big data benchmark test results for 1, 3 and 10 terabyte scale factors released

Raghunath Nambiar, chairman of the Transaction Processing Performance Council (TPC) big data benchmark committee and a Cisco Systems distinguished engineer, announced in a blog post last week the release of big data benchmark test results for 1, 3 and 10 terabyte scale factors. 

2015: The year data scientists lose their sex appeal?

Despite increasing advancements in big data tools, data flotsam is still obstructing our view of business realities and insights. Meanwhile, business heads and users are impatiently grabbing for the tools to do analysis themselves, which, if done improperly, will only add chaos to the already frenzied mix. So what does all this point to in 2015?

Big data lessons from CES: IoT deluge, analysis bottlenecks and consumer rejects

CES 2015 showed us that the notion of personal privacy no longer exists in mainstream product development. The focus is keenly on the consumer not as customer but as product. And big data users learned that a flood of useless minutiae from the IoT is headed our way to clog our pipes and create bottlenecks in analysis. So now what?

SEC launches pilot program for investor analysis, public company financial statement data comparisons

The Securities and Exchange Commission, or SEC, launched a pilot program last week designed to expedite investor analysis and comparisons of public company financial statement data.