My first stop in finding patterns relevant to the perfect big data storage architecture was Quantum Corp.
FORBES: Ten Properties of the Perfect Big Data Storage Architecture
In essence, the big data storage fetish is a form of denial.
As I pointed out in Curing the Big Data Storage Fetish, much of the money currently going into big data is paying for storage.
FORBES: Ten Properties of the Perfect Big Data Storage Architecture
But in many ways, the two are becoming one and the same: cloud resources are needed to support big data storage and projects, and big data is a huge business case for moving to the cloud.
FORBES: Cloud and Big Data, Together: A Huge Springboard to Innovation
All the big storage companies are developing products to play in the big data space because of the storage capacity growth as well as the growth in support services for big data.
Big data results in big storage and big business opportunities in all storage applications.
My specialties include storage technologies applied to Big Data analytics (primary), data management, virtualization, and analysis of storage infrastructure acquisition alternatives.
This transaction allows us to increase our focus on these technologies and to further expand on our commitment to drive growth from our core supercomputing business and our new initiatives in mid-range supercomputing, storage and big data.
FORBES: Cray Shares Rally On Super Q1, Strong Full Year Guidance
Too many organizations think they can manage Big Data by throwing increasing amounts of storage at the problem.
Quantum introduced its Lattus family of wide-area storage solutions for big data.
Whether it is storage system architectures or storage devices enabling big data applications, the growth of content is increasing the number of large data sets that enterprises must work with.
The product includes new levels of security, data protection, interoperability and system performance to extend EMC Isilon storage to a broader range of big data applications.
The Big Data servers can fit two petabytes of storage capacity per rack.
Many companies incorporate LSI storage solutions to accelerate their big data applications.
The deal, expected to be finalized late this year, would allow EMC to provide complete storage infrastructure for so-called big data.
First, Web innovators, such as Facebook, Google and Yahoo, have developed a massively scalable storage and compute architecture to manage Big Data: Hadoop, which parallelizes large data sets across low-cost commodity hardware, enabling easy scaling and dramatically reducing the cost of petabyte environments.
Exponential growth in digital information and greater adoption of newer technologies like cloud computing, big data and virtualization have contributed to growth in external storage systems, and EMC will be the biggest beneficiary with its stronghold in SAN and NAS markets as well as continued growth in unified storage.
With the explosion of data resulting from mobile devices, Internet services, social media and business applications, corporate, cloud and big data customers are constantly looking for ways to improve their storage infrastructure costs and their bottom line.
ENGADGET: HGST develops helium-filled, high-capacity hard drives: no, they won't float away
The bank turned to RainStor, which can compress data by up to 97 percent, providing big savings on storage.
FORBES: Managing Financial Data Growth with 30-40X Compression
Donatelli is right, inasmuch as big data centers and the cloud are creating worlds where networking, storage and servers are so tightly coupled as to be in some ways indistinguishable from each other.
Hadoop ushered in the era of the big data database, architected to distribute processing across servers instead of within storage.
FORBES: The 12 Days Of Data Trends: An IT Exec's Outlook For 2013
When we hear about big data "solutions," they tend to focus on computational approaches -- storage, search and processing -- with human intuition largely absent.
However, he warned against focusing disproportionately on the eight areas of excellence outlined by the Chancellor in November ("big data" computing, synthetic biology, regenerative medicine, agricultural science, energy storage, space, robotics and advanced materials including nanotechnology).
They also offer big data analytics, high performance computing, web and collaboration applications, and archiving and storage.
But cheap storage and computing resources are now available in the cloud, so analysing Big Data is no longer such a financial challenge.
By reducing the data footprint, virtualizing the reuse and storage of the data and centralizing the management of the data set, Big Data is ultimately transformed into small data and managed like virtual data.
Another big trend, she says, is the growth in storage of financial records, scientific data, medical images and other records that do not necessarily need to be archived, but may need to be called up quickly.
Two big disruptions have upended the enterprise data center and infrastructure stack for large companies: virtualization and flash storage.
Sun's announcement marks the information technology industry's latest step toward "utility computing," a grid system in which companies pipe in data processing and storage from faraway server farms, says Nicholas Carr, author of The Big Switch, a book on utility computing published earlier this week.
But as the cloud resolves issues of data storage and management, and mobile and social technologies allow for easy and immediate information sharing, Big Data now can be crunched and shared throughout an organization, leading to better decision-making and outcomes for all sorts of enterprises.