data storage

By: NetApp     Published Date: Dec 13, 2013
FlexPod Select with Hadoop delivers enterprise-class Hadoop with validated, pre-configured components for fast deployment, higher reliability, and smoother integration with existing applications and infrastructure. These technical reference architectures optimize storage, networking, and servers with the Cloudera and Hortonworks distributions of Hadoop. Leverage FlexPod Select with Hadoop to store, manage, process, and perform advanced analytics on your multi-structured data. Tuning parameters, optimization techniques, and other Hadoop cluster guidelines are provided; a minimal configuration sketch follows this entry.
Tags : flexpod with hadoop, enterprise data, storage infrastructure, massive amounts of data
     NetApp
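The tuning guidance such reference architectures provide usually comes down to a handful of cluster configuration properties. A minimal sketch of rendering such properties into a Hadoop-style configuration file, assuming standard YARN/MapReduce/HDFS property names; the values shown are placeholders, not FlexPod-validated numbers:

```python
# Illustrative only: writes a minimal Hadoop-style configuration file.
# Property names are standard YARN/MapReduce/HDFS settings; the values
# are placeholders, NOT FlexPod-validated tuning numbers.
import xml.etree.ElementTree as ET

tuning = {
    "yarn.nodemanager.resource.memory-mb": "98304",  # RAM offered to YARN per node
    "mapreduce.map.memory.mb": "4096",               # container size per map task
    "mapreduce.reduce.memory.mb": "8192",            # container size per reduce task
    "dfs.blocksize": "268435456",                    # 256 MB HDFS blocks for large files
    "dfs.replication": "3",                          # standard HDFS replication factor
}

config = ET.Element("configuration")
for name, value in tuning.items():
    prop = ET.SubElement(config, "property")
    ET.SubElement(prop, "name").text = name
    ET.SubElement(prop, "value").text = value

ET.ElementTree(config).write("cluster-tuning.xml", encoding="utf-8", xml_declaration=True)
```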
By: NetApp     Published Date: -
Enterprise data is growing rapidly, reaching multiple petabytes or even billions of files for many organizations. To maximize the business value of this data, enterprises need a storage infrastructure to store, manage, and retrieve a massive amount of data. This ebook shows you how to address large content repository challenges with object storage. You'll learn how to address long-term retention policies effectively, find and retrieve content quickly from long-term repositories, and use object storage efficiently; a minimal access sketch follows this entry.
Tags : object storage, storage infrastructure
     NetApp
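Object repositories of this kind are typically addressed through an S3-compatible HTTP API rather than a file system, which is what makes keyed retrieval from billions of objects practical. A minimal sketch using boto3, assuming a hypothetical endpoint, bucket name, and retention metadata keys; consult your object store's documentation for its actual API surface:

```python
# Minimal sketch of long-term retention on an S3-compatible object store.
# The endpoint, bucket name, and metadata keys here are hypothetical.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://objectstore.example.com",  # hypothetical endpoint
    aws_access_key_id="ACCESS_KEY",
    aws_secret_access_key="SECRET_KEY",
)

# Store a document with retention metadata attached to the object itself,
# so policy information travels with the data rather than a side database.
s3.put_object(
    Bucket="long-term-repository",
    Key="contracts/2013/agreement-0001.pdf",
    Body=open("agreement-0001.pdf", "rb"),
    Metadata={"retention-years": "7", "department": "legal"},
)

# Retrieval is a keyed lookup -- no directory tree to walk.
obj = s3.get_object(Bucket="long-term-repository",
                    Key="contracts/2013/agreement-0001.pdf")
print(obj["Metadata"])
```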
By: NetApp     Published Date: Dec 14, 2013
Read how the NetApp Distributed Content Repository Solution is an efficient, risk-reducing active archive solution. Based on customer data, Forrester created a composite organization and concluded that the NetApp Distributed Content Repository delivered a three-year ROI of 47% with a payback period of 1.3 months. The key benefits are reduced risk of losing unregulated archived data, denser storage, storage solution efficiency, and compliance for regulated data. The study also gives readers a framework to do their own financial impact evaluation; the underlying arithmetic is sketched after this entry. Source: The Total Economic Impact Of The NetApp Distributed Content Repository Solution (StorageGRID On E-Series), a commissioned study conducted by Forrester Consulting on behalf of NetApp, March 2013.
Tags : forrester tei
     NetApp
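The headline figures follow from standard TEI arithmetic. A worked example with invented dollar inputs (the study's actual inputs are not reproduced here), chosen so the formulas land on the reported 47% ROI and 1.3-month payback:

```python
# Worked TEI-style arithmetic with hypothetical inputs; only the formulas
# are standard, and the dollar figures below are invented for illustration.
total_benefits = 1_470_000.0   # risk-adjusted, present-value benefits over 3 years
total_costs    = 1_000_000.0   # risk-adjusted, present-value costs over 3 years
monthly_benefit = 41_000.0     # average benefit realized per month (hypothetical)
initial_cost    = 53_000.0     # up-front investment (hypothetical)

roi = (total_benefits - total_costs) / total_costs
payback_months = initial_cost / monthly_benefit

print(f"ROI: {roi:.0%}")                        # -> ROI: 47%
print(f"Payback: {payback_months:.1f} months")  # -> Payback: 1.3 months
```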
By: NetApp     Published Date: Dec 18, 2013
IT managers report that their two most significant challenges in managing unstructured data at multiple locations are keeping pace with data growth and improving data protection. Learn how the NetApp Distributed Content Repository provides advanced data protection and system recovery capabilities that enable multiple data centers and remote offices to maintain access to data through hardware and software faults. Key benefits:
- Continuous access to file data, maintaining data redundancy with no administrator intervention needed.
- Easy integration and deployment into a distributed environment, providing transparent, centrally managed content storage.
- Secure multi-tenancy using security partitions.
- Effectively infinite, on-demand capacity with fast access to files and objects in the cloud.
- Secure, robust data protection techniques that enable data to persist beyond the life of the storage it resides on.
Tags : 
     NetApp
By: NetApp     Published Date: Dec 19, 2013
SAP HANA enables real-time access to mission-critical business data and thus revolutionizes the way existing information can be used to address ever-changing business requirements. This whitepaper describes both the business and technical benefits of implementing the Cisco UCS with NetApp Storage for SAP HANA solution.
Tags : 
     NetApp
By: IBM     Published Date: Sep 02, 2014
This book examines data storage and management challenges and explains software-defined storage, an innovative solution for high-performance, cost-effective storage using the IBM General Parallel File System (GPFS).
Tags : ibm, software storage for dummies
     IBM
By: IBM     Published Date: Sep 02, 2014
Life Sciences organizations need IT infrastructures that are dynamic, scalable, and easy to deploy and manage, with simplified provisioning, high performance, and high utilization, and that can exploit both data-intensive and server-intensive workloads, including Hadoop MapReduce (a minimal MapReduce sketch follows this entry). Solutions must scale in both processing and storage to serve the institution long-term. Data has a life cycle, and making it usable for mainstream analyses and applications is an important aspect of system design. This presentation describes these IT requirements and how Technical Computing solutions from IBM and Platform Computing address them to deliver greater ROI and accelerated time to results for Life Sciences.
Tags : 
     IBM
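MapReduce workloads like those mentioned reduce to a map step that emits key-value pairs and a reduce step that aggregates them. A self-contained sketch, assuming a hypothetical record layout with a gene ID in the first tab-separated field; on a real cluster the same mapper/reducer logic would run under Hadoop Streaming or a framework API:

```python
# Self-contained map/shuffle/reduce sketch. The record layout (gene ID in
# the first tab-separated field) is hypothetical; on a real cluster this
# mapper/reducer pair would run under Hadoop Streaming.
from collections import defaultdict

def mapper(line):
    """Emit (gene_id, 1) for each input record."""
    gene_id = line.split("\t")[0]
    yield gene_id, 1

def reducer(key, values):
    """Sum the counts for one gene ID."""
    return key, sum(values)

def run(records):
    # Shuffle: group mapper output by key, as the framework would.
    groups = defaultdict(list)
    for record in records:
        for key, value in mapper(record):
            groups[key].append(value)
    return dict(reducer(k, v) for k, v in groups.items())

sample = ["BRCA1\tread-001", "TP53\tread-002", "BRCA1\tread-003"]
print(run(sample))  # -> {'BRCA1': 2, 'TP53': 1}
```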
By: IBM     Published Date: Sep 02, 2014
Learn how to manage storage growth, cost, and complexity while increasing storage performance and data availability with IBM Software Defined Storage solutions, including the IBM General Parallel File System (GPFS).
Tags : ibm, storage for dummies
     IBM
By: IBM     Published Date: May 20, 2015
Join IBM and Nuance Communications Inc. to learn how Nuance uses IBM Elastic Storage to improve the power of its voice recognition applications by managing storage growth, cost, and complexity while increasing performance and data availability. View the webcast to learn how you can:
- Lower data management costs through policy-driven automation and tiered storage management (a toy policy sketch follows this entry)
- Manage and increase storage agility through software-defined storage
- Remove data-related bottlenecks to deliver application performance
Tags : 
     IBM
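Policy-driven tiering is, conceptually, a rule that migrates idle data to a cheaper tier. A toy sketch of that logic under hypothetical mount points; this is not the IBM Elastic Storage / GPFS policy engine, which expresses such rules in its own SQL-like policy language and moves data transparently beneath the namespace:

```python
# Toy age-based tiering policy. Illustrates the idea only; mount points
# and the 90-day threshold are hypothetical assumptions.
import os
import shutil
import time

FAST_TIER = "/mnt/fast"          # hypothetical hot-tier mount point
ARCHIVE_TIER = "/mnt/archive"    # hypothetical cold-tier mount point
AGE_THRESHOLD = 90 * 24 * 3600   # migrate files idle for 90+ days

def migrate_cold_files():
    now = time.time()
    for name in os.listdir(FAST_TIER):
        path = os.path.join(FAST_TIER, name)
        # Compare last-access time against the policy threshold.
        if os.path.isfile(path) and now - os.path.getatime(path) > AGE_THRESHOLD:
            shutil.move(path, os.path.join(ARCHIVE_TIER, name))
            print(f"migrated {name} to archive tier")

if __name__ == "__main__":
    migrate_cold_files()
```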
By: IBM     Published Date: May 20, 2015
There is a lot of hype around the potential of big data, and organizations hope to achieve new innovations in products and services, with big data and analytics driving more concrete insights about their customers and their own business operations. To meet these challenges, IBM has introduced IBM® Spectrum Scale™. The Spectrum Scale storage platform grew from GPFS, which entered the market in 1998, and IBM has clearly invested significant development in maturing the platform. Spectrum Scale addresses the key requirements of big data storage: extreme scalability for growth, reduced overhead of data movement, easy accessibility, geographic location independence, and advanced storage functionality. Read the paper to learn more!
Tags : 
     IBM
By: IBM     Published Date: Sep 16, 2015
The IBM Spectrum Scale solution delivered up to 11x better throughput than EMC Isilon for Spectrum Protect (TSM) workloads. Using published data, Edison compared a solution built on EMC® Isilon® against an IBM® Spectrum Scale™ solution. (IBM Spectrum Scale was formerly IBM® General Parallel File System™ or IBM® GPFS™, also known by the code name Elastic Storage.) For both solutions, IBM® Spectrum Protect™ (formerly IBM Tivoli® Storage Manager or IBM® TSM®) was used as a common workload performing backups to the target storage systems evaluated.
Tags : 
     IBM
By: IBM     Published Date: Sep 16, 2015
Six criteria for evaluating high-performance cloud service providers: Engineering, scientific, analytics, big data, and research workloads place extraordinary demands on technical and high-performance computing (HPC) infrastructure. Supporting these workloads can be especially challenging for organizations that have unpredictable spikes in resource demand or need access to additional compute or storage resources for a project or to support a growing business. Software Defined Infrastructure (SDI) enables organizations to deliver HPC services in the most efficient way possible, optimizing resource utilization to accelerate time to results and reduce costs. SDI is the foundation for a fully integrated environment, optimizing compute, storage, and networking infrastructure to quickly adapt to changing business requirements, and dynamically managing workloads and data.
Tags : 
     IBM
By: TIBCO     Published Date: Nov 09, 2015
As one of the most exciting and widely adopted open-source projects, Apache Spark's in-memory clusters are driving new opportunities for application development as well as increased consumption of IT infrastructure. Apache Spark is now the most active Apache project, with more than 600 contributions made in the last 12 months by more than 200 organizations. A new survey conducted by Databricks of 1,417 IT professionals working with Apache Spark finds that high-performance analytics applications that can work with big data are driving a large proportion of that demand. Apache Spark is now being used to aggregate multiple types of data in memory, rather than only pulling data from Hadoop (a PySpark sketch of this pattern follows this entry). For solution providers, the Apache Spark technology stack is a significant player because it is one of the core technologies used to modernize data warehouses, a huge segment of the IT industry that accounts for multiple billions in revenue. Spark also holds much promise for the future, particularly in combination with data lakes, a storage repository for vast amounts of raw data.
Tags : 
     TIBCO
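The in-memory aggregation pattern the survey describes might look like the following in PySpark. The file paths and column names are hypothetical, and a local Spark runtime with the `pyspark` package installed is assumed:

```python
# Hedged PySpark sketch: aggregate two differently formatted sources in
# memory. Paths and column names ("user", "amount") are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("multi-source-aggregation").getOrCreate()

# Two sources with different formats, unified into DataFrames.
clicks = spark.read.json("clickstream.json")
orders = spark.read.csv("orders.csv", header=True, inferSchema=True)

combined = clicks.select("user", "amount").unionByName(
    orders.select("user", "amount")
)

# cache() pins the combined dataset in memory so repeated aggregations
# do not re-read the source files -- the in-memory advantage over
# pulling data from disk-resident Hadoop each time.
combined.cache()

totals = combined.groupBy("user").agg(F.sum("amount").alias("total_spend"))
totals.show()
```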
By: Storiant     Published Date: Mar 16, 2015
Read this new IDC report about how today's enterprise datacenters are dealing with challenges far more demanding than ever before. Foremost is the exponential growth of data, most of it unstructured. Big data and analytics implementations are also quickly becoming a strategic priority in many enterprises, demanding online access to more data, retained for longer periods of time. Legacy storage solutions with fixed design characteristics and a cost structure that doesn't scale are proving ill-suited for these new needs. This Technology Spotlight examines the issues that are driving organizations to replace older archive and backup-and-restore systems with business continuity and always-available solutions that can scale to handle extreme data growth while leveraging a cloud-based pricing model. The report also looks at the role of Storiant and its long-term storage services solution in the strategically important long-term storage market.
Tags : storiant, big data, analytics implementations, cloud-based pricing model, long-term storage services solution, long-term storage market
     Storiant
By: Storiant     Published Date: May 11, 2015
Emerging storage vendors offer data center managers and storage administrators new antidotes for their storage challenges. This research details five companies that provide innovative storage capabilities via new architecture and deployment methods, and looks back at two past Cool Vendors.
Tags : gartner report, cool vendors in storage, storiant, storage vendors, data center managers, storage challenges
     Storiant
By: EMC     Published Date: Jun 13, 2016
A data lake can meet the storage needs of your modern data center. Check out the top 10 reasons your organization should adopt scale-out data lake storage for Hadoop analytics on EMC Isilon.
Tags : 
     EMC
By: EMC     Published Date: Jun 13, 2016
EMC® Isilon® is a simple and scalable platform for building a scale-out data lake. Consolidate storage silos, improve storage utilization, reduce costs, and gain a future-proofed platform to run today's and tomorrow's workloads.
Tags : 
     EMC
By: EMC     Published Date: Jun 13, 2016
IDC believes that EMC Isilon is indeed an easy-to-operate, highly scalable, and efficient Enterprise Data Lake Platform. IDC validated that a shared storage model based on the data lake can in fact provide enterprise-grade service levels while performing better than dedicated commodity off-the-shelf storage for Hadoop workloads.
Tags : 
     EMC
By: Data Direct Networks     Published Date: Apr 08, 2014
DataDirect Networks (DDN), the largest privately held provider of high-performance storage, has a large and growing presence in HPC markets. HPC users identify DDN as their storage provider more than any other storage-focused company, with twice the mentions of EMC and more than twice the mentions of NetApp, Hitachi Data Systems, or Panasas.(5) DDN's strength in HPC is anchored by its Storage Fusion Architecture (SFA), winner of the HPCwire Editor's Choice Award for "Best HPC Storage Product or Technology" in each of the past three years. The DDN SFA12KX combines SATA, SAS, and solid-state disks (SSDs) in an environment that can be tailored to balance throughput and capacity; a back-of-the-envelope sizing sketch follows this entry.
Tags : 
     Data Direct Networks
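Balancing throughput against capacity in a mixed SATA/SAS/SSD pool is ultimately a sizing exercise. A back-of-the-envelope sketch using generic per-drive assumptions, not DDN specifications:

```python
# Back-of-the-envelope mixed-media sizing. Per-drive figures are generic
# assumptions for illustration, not DDN SFA12KX specifications.
DRIVES = {
    #        (MB/s per drive, TB per drive)
    "ssd":  (500, 0.8),
    "sas":  (200, 1.2),
    "sata": (130, 4.0),
}

def size_pool(counts):
    """Return (aggregate GB/s, raw TB) for a drive mix like {'ssd': 24, ...}."""
    throughput = sum(DRIVES[kind][0] * n for kind, n in counts.items()) / 1000.0
    capacity = sum(DRIVES[kind][1] * n for kind, n in counts.items())
    return throughput, capacity

# A throughput-leaning mix versus a capacity-leaning mix, 120 drives each.
for mix in ({"ssd": 40, "sas": 40, "sata": 40},
            {"ssd": 8, "sas": 32, "sata": 80}):
    gbps, tb = size_pool(mix)
    print(f"{mix}: ~{gbps:.1f} GB/s, ~{tb:.0f} TB raw")
```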
By: Intel     Published Date: Aug 06, 2014
Around the world and across all industries, high-performance computing is being used to solve today's most important and demanding problems. More than ever, storage solutions that deliver high sustained throughput are vital for powering HPC and big data workloads. Intel® Enterprise Edition for Lustre* software unleashes the performance and scalability of the Lustre parallel file system for enterprises and organizations, both large and small. It allows users and workloads that need large-scale, high-bandwidth storage to tap into the power and scalability of Lustre, with the simplified installation, configuration, and monitoring features of Intel® Manager for Lustre* software, a management solution purpose-built for the Lustre file system. Intel® Enterprise Edition for Lustre* software includes proven support from the Lustre experts at Intel, including worldwide 24x7 technical support. *Other names and brands may be claimed as the property of others.
Tags : 
     Intel
By: Intel     Published Date: Aug 06, 2014
Designing a large-scale, high-performance data storage system presents significant challenges. This paper describes a step-by-step approach to designing such a system and presents an iterative methodology that applies at both the component level and the system level. A detailed case study using this methodology to design a Lustre* storage system is presented; a simplified sizing loop is sketched after this entry. *Other names and brands may be claimed as the property of others.
Tags : 
     Intel
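One way to picture such an iterative methodology is a loop that adds storage building blocks until both the throughput target and the capacity target are met. A simplified sketch with invented per-OSS figures, not values from the paper:

```python
# Sketch of an iterative sizing loop: add object storage servers (OSS)
# until the design meets both targets. Per-OSS figures are invented
# placeholders, not values from the Intel paper.
OSS_THROUGHPUT_GBPS = 5.0   # assumed sustained GB/s per OSS building block
OSS_CAPACITY_TB = 240.0     # assumed usable TB per OSS building block

def design(target_gbps, target_tb):
    oss_count = 0
    # Iterate at the component level until system-level targets are met.
    while (oss_count * OSS_THROUGHPUT_GBPS < target_gbps
           or oss_count * OSS_CAPACITY_TB < target_tb):
        oss_count += 1
    return oss_count

n = design(target_gbps=60.0, target_tb=8000.0)
print(f"{n} OSS nodes -> {n * OSS_THROUGHPUT_GBPS:.0f} GB/s, "
      f"{n * OSS_CAPACITY_TB:.0f} TB")
```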
By: Intel     Published Date: Aug 06, 2014
Powering big data workloads with Intel® Enterprise Edition for Lustre* software. The Intel® portfolio for high-performance computing provides the following technology solutions:
- Compute: The Intel® Xeon processor E7 family provides a leap forward for every discipline that depends on HPC, with industry-leading performance and improved performance per watt. Add Intel® Xeon Phi coprocessors to your clusters and workstations to increase performance for highly parallel applications and code segments. Each coprocessor can add over a teraflops of performance and is compatible with software written for the Intel® Xeon processor E7 family. You don't need to rewrite code or master new development tools.
- Storage: High-performance, highly scalable storage solutions with Intel® Lustre and Intel® Xeon processor E7-based storage systems for centralized storage. Reliable and responsive local storage with Intel® Solid State Drives.
- Networking: Intel® True Scale Fabric and networking technologies, built for HPC to deliver fast message rates and low latency.
- Software and tools: A broad range of software and tools to optimize and parallelize your software and clusters.
Further, Intel® Enterprise Edition for Lustre* software is backed by Intel, the recognized technical support provider for Lustre, and includes 24/7 service level agreement (SLA) coverage.
Tags : 
     Intel
By: Intel     Published Date: Sep 16, 2014
In this guide, we take a look at what Lustre on AWS infrastructure delivers for a broad community of business and commercial organizations struggling with the challenge of big data and demanding storage growth.
Tags : intel, lustre, big data solutions in the cloud
     Intel
By: Dell and Intel®     Published Date: Aug 24, 2015
To extract value from an ever-growing onslaught of data, your organization needs next-generation data management, integration, storage, and processing systems that allow you to collect, manage, store, and analyze data quickly, efficiently, and cost-effectively. That's the case with Dell | Cloudera® Apache™ Hadoop® solutions for big data. These solutions provide end-to-end scalable infrastructure, leveraging open source technologies, to allow you to simultaneously store and process large datasets in a distributed environment for data mining and analysis, on both structured and unstructured data, and to do it all affordably.
Tags : 
     Dell and Intel®
By: Dell APAC     Published Date: May 30, 2019
When it comes to effectively and efficiently protecting growing volumes of data, midsized organizations face unique challenges. That is because they live in a world of constraints that are both operational and budgetary in nature. Cloud disaster recovery offers new options for these organizations—they can optimize their data protection economics by integrating on-premises protection solutions with cloud-based backup and recovery methods. Dell EMC’s cloud-ready solutions, particularly its Integrated Data Protection Appliances with native cloud extension capabilities, along with its Data Protection Software working in conjunction with its Data Domain backup storage appliances, provide cloud disaster recovery with flexible features. These solutions enhance operational efficiency and provide midsized organizations with clear economic and operational benefits.
Tags : 
     Dell APAC