Today the Illinois Technology Association (ITA) launched the ITA Internet of Things Council, a cross-disciplinary effort to drive the growth and use of Internet of Things technologies in Chicago and the Midwest.
Ryft unveiled the Ryft ONE this week, a commercial-grade 1U platform that, the company claims, can analyze up to 48 terabytes of data at 10 gigabytes per second or faster.
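To put those vendor figures in perspective, here is a rough back-of-envelope sketch (the 48 TB and 10 GB/s numbers come from the announcement; the calculation itself is illustrative, not Ryft's):

```python
# Back-of-envelope: how long would one full pass over the platform's
# stated capacity take at the claimed analysis rate?
capacity_tb = 48         # stated platform capacity, terabytes
throughput_gbps = 10     # claimed analysis rate, gigabytes per second

capacity_gb = capacity_tb * 1000          # decimal units: 1 TB = 1000 GB
seconds = capacity_gb / throughput_gbps   # time for a full-capacity scan
print(f"Full 48 TB scan at 10 GB/s: {seconds:.0f} s (~{seconds / 60:.0f} minutes)")
```

In other words, at the claimed rate a complete scan of a fully loaded box would take about 80 minutes.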
IBM recently announced two new all-flash enterprise storage arrays for big data. IBM says its FlashSystem technology integrates Micron's MLC flash chip technology to take advantage of the chip's density and cost benefits.
A survey conducted by SnapLogic and TechValidate uncovered which big data tools are at the top of most IT leaders' shopping lists and revealed the most common barriers to realizing ROI once the shopping is done.
Speaking of the deluge of data and the Internet of Things, wireless networks will soon be overtaxed under that strain too. The New Jersey Institute of Technology has produced an infographic that outlines the differences between 3G and 4G LTE and predicts the arrival of 5G networks. Trust me, you want to see this!
Lots of attention is paid to the promise of data from the Internet of Things, but we can only get that data if broadband can deliver it all and if Things have access to the Internet no matter where they are placed.
This morning StackIQ announced the immediate availability of StackIQ Boss 5, a server automation platform "built specifically for big data and private cloud infrastructures."
Many in the industry have said since the outset that the term "big data" is unfortunate and that sometime soon we'd stop using it. Are we there yet? If we aren't, we likely will be soon because the focus this year will increasingly be on mastering data variety rather than just coping with data size.
Raghunath Nambiar, chairman of the Transaction Processing Performance Council (TPC) big data benchmark committee and a Cisco Systems distinguished engineer, announced in a blog post last week the release of big data benchmark test results for 1, 3 and 10 terabyte scale factors.
According to a new report, the U.K. government could increase the usefulness of its big data work by using it to pre-fill service request forms for citizens, to tailor services to individuals and groups of citizens (a practice known as personalization), and to better monitor the quality and delivery of services at the local and individual level.