
Results 1 - 25 of 13182
By: NetApp     Published Date: Dec 13, 2013
Interested in running a Hadoop proof of concept on enterprise-class storage? Download this solutions guide for a technical overview of building Hadoop on NetApp E-Series storage. NetApp Open Solution for Hadoop delivers big analytics with pre-engineered, compatible, and supported solutions based on high-quality storage platforms, so you reduce the cost, schedule, and risk of do-it-yourself systems and relieve the skills gap most organizations have with Hadoop. See how ongoing operational and maintenance costs can be reduced with a highly available and scalable Hadoop solution.
Tags : open solutions, hadoop solutions guide
     NetApp
By: NetApp     Published Date: -
Enterprise data is growing rapidly - reaching multiple petabytes of data or even billions of files for many organizations. To maximize the business value of this data, enterprises need a storage infrastructure to store, manage, and retrieve a massive amount of data. This ebook shows you how to address large content repository challenges with object storage. You'll learn how to effectively address long-term retention policies, find and retrieve content quickly from long-term repositories, and use object storage efficiently.
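As a rough sketch of the object storage model the ebook covers, the snippet below writes and reads a key-addressed object that carries its own metadata, using boto3 against a generic S3-compatible store; the endpoint, bucket, key, and retention metadata are hypothetical illustrations, not NetApp specifics.

```python
# Minimal object storage sketch with boto3 against an S3-compatible store.
# Endpoint, bucket, and key names are hypothetical placeholders.
import boto3

s3 = boto3.client("s3", endpoint_url="https://objectstore.example.com")

# Store an object with custom metadata that a retention policy could act on.
with open("q4-summary.pdf", "rb") as f:
    s3.put_object(
        Bucket="archive",
        Key="reports/2013/q4-summary.pdf",
        Body=f,
        Metadata={"retain-until": "2020-12-31", "department": "finance"},
    )

# Retrieve it later directly by key -- no directory tree to walk.
obj = s3.get_object(Bucket="archive", Key="reports/2013/q4-summary.pdf")
data = obj["Body"].read()
```

Because every object is addressed by a flat key and carries its metadata with it, finding and retrieving a single item stays a constant-cost lookup even in a repository holding billions of files.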
Tags : object storage, storage infrastructure
     NetApp
By: NetApp     Published Date: Dec 14, 2013
Read how the NetApp Distributed Content Repository is an efficient, risk-reducing active archive solution. Based on customer data, Forrester created a composite organization and concluded that the NetApp Distributed Content Repository delivered a three-year ROI of 47% with a payback period of 1.3 months. The key benefits are reduced risk of losing unregulated archived data, denser storage, storage solution efficiency, and compliance for regulated data. The study also provides readers with a framework to do their own financial impact evaluation. Source: The Total Economic Impact Of The NetApp Distributed Content Repository Solution (StorageGRID On E-Series), a commissioned study conducted by Forrester Consulting on behalf of NetApp, March 2013.
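For readers building their own financial impact evaluation, the arithmetic behind ROI and payback figures like these is straightforward; the sketch below uses hypothetical cash flows, not Forrester's actual model inputs.

```python
# Generic ROI and payback arithmetic of the kind used in a TEI-style study.
# All numbers are hypothetical placeholders, not Forrester's inputs.
initial_cost = 100_000      # up-front investment
annual_benefit = 120_000    # yearly benefit net of ongoing costs
years = 3

total_benefit = annual_benefit * years
roi = (total_benefit - initial_cost) / initial_cost    # net benefit / cost
payback_months = initial_cost / (annual_benefit / 12)  # months to break even

print(f"ROI over {years} years: {roi:.0%}")       # -> ROI over 3 years: 260%
print(f"Payback: {payback_months:.1f} months")    # -> Payback: 10.0 months
```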
Tags : forrester tei
     NetApp
By: NetApp     Published Date: Dec 18, 2013
IT managers have indicated that their two most significant challenges in managing unstructured data at multiple locations are keeping pace with data growth and improving data protection. Learn how the NetApp Distributed Content Repository provides advanced data protection and system recovery capabilities that enable multiple data centers and remote offices to maintain access to data through hardware and software faults. Key benefits:
- Continuous access to file data while maintaining data redundancy, with no administrator intervention needed
- Easy integration and deployment into a distributed environment, providing transparent, centrally managed content storage
- Secure multi-tenancy using security partitions
- Effectively infinite, on-demand capacity with fast access to files and objects in the cloud
- Secure, robust data protection techniques that enable data to persist beyond the life of the storage it resides on
Tags : 
     NetApp
By: IBM     Published Date: Sep 02, 2014
Advanced analytics strategies yield the greatest benefits in terms of improving patient and business outcomes when applied across the entire healthcare ecosystem. But the challenge of collaborating across organizational boundaries in order to share information and insights is daunting to many stakeholders. In this worldwide survey of 555 healthcare providers, payers and life sciences organizations, you will learn the importance of implementing collaborative analytics strategies that:
- Manage, integrate and interpret data generated at all stages of the healthcare value chain
- Achieve the right balance of skills in order to translate data into actionable insights
- Focus on executive sponsorship and enterprise-wide adoption, with metrics to measure and track success
Position yourself to harness data, create and share insights, make informed decisions, and improve the performance of the entire healthcare ecosystem in which you operate.
Tags : ibm, analytics acrosss ecosystem
     IBM
By: IBM     Published Date: Sep 02, 2014
Life Sciences organizations need to build IT infrastructures that are dynamic, scalable, and easy to deploy and manage, with simplified provisioning, high performance, and high utilization, able to exploit both data-intensive and server-intensive workloads, including Hadoop MapReduce. Solutions must scale in both processing and storage to serve the institution long-term. Data has a life cycle, and making it usable for mainstream analyses and applications is an important aspect of system design. This presentation describes IT requirements and how Technical Computing solutions from IBM and Platform Computing address these challenges and deliver greater ROI and accelerated time to results for Life Sciences.
Tags : 
     IBM
By: IBM     Published Date: Sep 02, 2014
Research teams using next-generation sequencing (NGS) technologies face the daunting challenge of supporting compute-intensive analysis methods against petabytes of data while simultaneously keeping pace with rapidly evolving algorithmic best practices. NGS users can now solve these challenges by deploying the Accelrys Enterprise Platform (AEP) and the NGS Collection on optimized systems from IBM. Learn how you can benefit from the turnkey IBM Application Ready Solution for Accelrys with supporting benchmark data.
Tags : ibm, accelrys, turnkey ngs solution
     IBM
By: IBM     Published Date: Sep 02, 2014
With tougher regulations and continuing market volatility, financial firms are moving to active risk management with a focus on counterparty risk. Firms are revamping their risk and trading practices from top to bottom. They are adopting new risk models and frameworks that support a holistic view of risk. Banks recognize that technology is critical for this transformation, and are adding state-of-the-art enterprise risk management solutions, high performance data and grid management software, and fast hardware. Join IBM Algorithmics and IBM Platform Computing to gain insights on this trend and on technologies for enabling active "real-time" risk management.
Tags : 
     IBM
By: IBM     Published Date: Sep 02, 2014
In today’s stringent financial services regulatory environment, with exponential growth of data and dynamic business requirements, risk analytics has become integral to businesses. IBM Algorithmics provides sophisticated analyses for a wide range of economic scenarios that better quantify risk for multiple departments within a firm, or across the enterprise. With Algorithmics, firms have a better handle on their financial exposure and credit risks before they finalize real-time transactions. But this requires the performance and agility of a scalable infrastructure, which can drive up IT risk and complexity. The IBM Application Ready Solution for Algorithmics provides an agile, reliable and high-performance infrastructure to deliver trusted risk insights for sustained growth and profitability. This integrated offering with a validated reference architecture delivers the right risk insights at the right time while lowering the total cost of ownership.
Tags : ibm, it risk, financial risk analytics
     IBM
By: IBM     Published Date: May 20, 2015
In this white paper, we look at various cloud models, and assess their suitability to solve IT challenges. We provide recommendations on what to look for in a cloud provider. Finally, we take a look at the IBM Cloud portfolio.
Tags : 
     IBM
By: IBM     Published Date: May 20, 2015
Join IBM and Nuance Communications Inc. to learn how Nuance uses IBM Elastic Storage to improve the power of their voice recognition applications by managing storage growth, cost and complexity while increasing performance and data availability. View the webcast to learn how you can:
- Lower data management costs through policy-driven automation and tiered storage management
- Manage and increase storage agility through software defined storage
- Remove data-related bottlenecks to deliver application performance
Tags : 
     IBM
By: IBM     Published Date: May 20, 2015
According to our global study of more than 800 cloud decision makers and users, organizations are becoming increasingly focused on the business value cloud provides. Cloud is integral to mobile, social and analytics initiatives - and to the big data management challenge that often comes with them - and it helps power the entire suite of game-changing technologies. Enterprises can aim higher when these deployments are riding on the cloud. Mobile, analytics and social implementations can be bigger, bolder and drive greater impact when backed by scalable infrastructure. In addition to scale, cloud can provide integration, gluing the individual technologies into more cohesive solutions. Learn how companies are gaining a competitive advantage with cloud computing.
Tags : 
     IBM
By: IBM     Published Date: May 20, 2015
There is a lot of hype around the potential of big data, and organizations hope to achieve new innovations in products and services, with big data and analytics driving more concrete insights about their customers and their own business operations. To meet these challenges, IBM has introduced IBM® Spectrum Scale™. The new IBM Spectrum Scale storage platform grew from GPFS, which entered the market in 1998; clearly, IBM has put significant work into maturing the platform. Spectrum Scale addresses the key requirements of big data storage: extreme scalability for growth, reduced overhead of data movement, easy accessibility, geographic location independence and advanced storage functionality. Read the paper to learn more!
Tags : 
     IBM
By: IBM     Published Date: Sep 16, 2015
6 criteria for evaluating high-performance cloud services providers

Engineering, scientific, analytics, big data and research workloads place extraordinary demands on technical and high-performance computing (HPC) infrastructure. Supporting these workloads can be especially challenging for organizations that have unpredictable spikes in resource demand, or need access to additional compute or storage resources for a project or to support a growing business. Software Defined Infrastructure (SDI) enables organizations to deliver HPC services in the most efficient way possible, optimizing resource utilization to accelerate time to results and reduce costs. SDI is the foundation for a fully integrated environment, optimizing compute, storage and networking infrastructure to quickly adapt to changing business requirements, while dynamically managing workloads and data.
Tags : 
     IBM
By: Splice Machine     Published Date: Feb 13, 2014
Hadoop: Moving Beyond the Big Data Hype

Let’s face it: there is a lot of hype surrounding Big Data and Hadoop, the de facto Big Data technology platform. Companies want to mine and act on massive data sets, or Big Data, to unlock insights that can help them improve operational efficiency, delight customers, and leapfrog their competition. Hadoop has become popular for storing massive data sets because it can distribute them across inexpensive commodity servers. Hadoop is fundamentally a file system (HDFS, the Hadoop Distributed File System) with a specialized programming model (MapReduce) to process the data in the files. Big Data has not lived up to expectations so far, partly because of limitations of Hadoop as a technology.
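To make the HDFS-plus-MapReduce description concrete, here is a minimal word-count pair in the Hadoop Streaming style (an illustration, not taken from the guide): the mapper emits (word, 1) pairs, Hadoop sorts them by key, and the reducer sums the counts for each word.

```python
# mapper.py -- emit one (word, 1) pair per word read from stdin
import sys

for line in sys.stdin:
    for word in line.split():
        print(f"{word}\t1")
```

```python
# reducer.py -- input arrives sorted by key, so equal words are adjacent
import sys

current, count = None, 0
for line in sys.stdin:
    word, n = line.rstrip("\n").split("\t")
    if word != current:
        if current is not None:
            print(f"{current}\t{count}")
        current, count = word, 0
    count += int(n)
if current is not None:
    print(f"{current}\t{count}")
```

On a cluster the pair would be submitted via the Hadoop Streaming jar (roughly: hadoop jar hadoop-streaming-*.jar -mapper mapper.py -reducer reducer.py -input ... -output ...); the same logic also runs locally as `cat input.txt | python mapper.py | sort | python reducer.py`.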
Tags : sql-on-hadoop® evaluation guide, splice machine, hadoop
     Splice Machine
By: TIBCO     Published Date: Nov 09, 2015
As one of the most exciting and widely adopted open-source projects, Apache Spark in-memory clusters are driving new opportunities for application development as well as increased demand for IT infrastructure. Apache Spark is now the most active Apache project, with more than 600 contributions made in the last 12 months by more than 200 organizations. A new survey of 1,417 IT professionals working with Apache Spark, conducted by Databricks, finds that high-performance analytics applications that can work with big data are driving a large proportion of that demand. Apache Spark is now being used to aggregate multiple types of data in memory, versus only pulling data from Hadoop. For solution providers, the Apache Spark technology stack is a significant player because it’s one of the core technologies used to modernize data warehouses, a huge segment of the IT industry that accounts for multiple billions in revenue. Spark also holds much promise for the future with data lakes, storage repositories that hold large amounts of raw data.
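As a small illustration of the in-memory aggregation pattern the survey describes (a sketch, not from the survey itself), the PySpark job below joins a semi-structured source with a structured one and caches the result in cluster memory before aggregating; the paths and column names are hypothetical.

```python
# Sketch: aggregating two data sources in memory with PySpark.
# File paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("inmem-agg").getOrCreate()

clicks = spark.read.json("hdfs:///data/clickstream/")  # semi-structured events
orders = spark.read.parquet("hdfs:///data/orders/")    # structured records

# Join once, then pin the hot working set in cluster memory.
joined = clicks.join(orders, "user_id").cache()

daily = joined.groupBy("order_date").count().orderBy("order_date")
daily.show()
```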
Tags : 
     TIBCO
By: Storiant     Published Date: Mar 16, 2015
Read this new IDC Report about how today's enterprise datacenters are dealing with new challenges that are far more demanding than ever before. Foremost is the exponential growth of data, most of it unstructured. Big data and analytics implementations are also quickly becoming a strategic priority in many enterprises, demanding online access to more data, which is retained for longer periods of time. Legacy storage solutions with fixed design characteristics and a cost structure that doesn't scale are proving ill-suited for these new needs. This Technology Spotlight examines the issues that are driving organizations to replace older archive and backup-and-restore systems with business continuity and always-available solutions that can scale to handle extreme data growth while leveraging a cloud-based pricing model. The report also looks at the role of Storiant and its long-term storage services solution in the strategically important long-term storage market.
Tags : storiant, big data, analytics implementations, cloud-based pricing model, long-term storage services solution, long-term storage market
     Storiant
By: RYFT     Published Date: Apr 03, 2015
The new Ryft ONE platform is a scalable 1U device that addresses a major need in the fast-growing market for advanced analytics — avoiding I/O bottlenecks that can seriously impede analytics performance on today's hyperscale cluster systems. The Ryft ONE platform is designed for easy integration into existing cluster and other server environments, where it functions as a dedicated, high-performance analytics engine. IDC believes that the new Ryft ONE platform is well positioned to exploit the rapid growth we predict for the high-performance data analysis market.
Tags : ryft, ryft one platform, 1u device, advanced analytics, avoiding i/o bottlenecks, idc
     RYFT
By: Impetus     Published Date: Feb 04, 2016
This white paper explores strategies to leverage the steady flow of new, advanced real-time streaming data analytics (RTSA) application development technologies. It defines a thoughtful approach to capitalize on the window of opportunity to benefit from the power of real-time decision making now, and still be able to move to new and emerging technologies as they become enterprise ready.
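The core pattern in most RTSA applications, whatever engine ultimately runs it, is a windowed computation over an unbounded stream. The toy sketch below keeps a five-second sliding window in plain Python; production engines generalize this same idea with distribution and fault tolerance.

```python
# Toy sliding-window aggregation over an unbounded event stream.
# Real RTSA engines distribute this same pattern across a cluster.
import random
import time
from collections import deque

WINDOW_SECONDS = 5.0

def event_stream():
    """Simulated unbounded source of (timestamp, value) events."""
    while True:
        time.sleep(0.2)  # pace the simulated arrivals
        yield time.time(), random.gauss(100.0, 15.0)

window = deque()
for ts, value in event_stream():
    window.append((ts, value))
    while window and window[0][0] < ts - WINDOW_SECONDS:
        window.popleft()  # evict events older than the window
    avg = sum(v for _, v in window) / len(window)
    print(f"avg over last {WINDOW_SECONDS:.0f}s: {avg:.1f}")
```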
Tags : 
     Impetus
By: RedPoint     Published Date: Sep 22, 2014
Enterprises can gain serious traction by taking advantage of the scalability, processing power and lower costs that Hadoop 2.0/YARN offers. YARN closes the functionality gap by opening Hadoop to mature enterprise-class data management capabilities. With a lot of data quality functionality left outside of Hadoop 1, and a lot of data inside HDFS originating outside the enterprise, the quality of the data residing in the Hadoop cluster is sometimes as stinky as elephant dung. Some of the topics discussed in this paper include:
- The key features, benefits and limitations of Hadoop 1.0
- The benefit of performing data standardization, identity resolution, and master data management inside of Hadoop
- The transformative power of Hadoop 2.0 and its impact on the speed and cost of accessing, cleansing and delivering high-quality enterprise data
Download this illuminating white paper about what YARN really means to the world of big data management.
Tags : 
     RedPoint
By: RedPoint     Published Date: Sep 22, 2014
The emergence of YARN for the Hadoop 2.0 platform has opened the door to new tools and applications that promise to allow more companies to reap the benefits of big data in ways never before possible, with outcomes possibly never imagined. By separating the problem of cluster resource management from the data processing function, YARN offers a world beyond MapReduce: less encumbered by complex programming protocols, faster, and at a lower cost. Some of the topics discussed in this paper include:
- Why is YARN important for realizing the power of Hadoop for data integration, quality and management?
- Benchmark results of MapReduce vs. Pig vs. visual “data flow” design tools
- The 3 key features of YARN that solve the complex problems that prohibit businesses from gaining maximum benefit from Hadoop
Download this paper to learn why the power of Hadoop 2.0 lies in enabling applications to run inside Hadoop, without the constraints of MapReduce.
Tags : 
     RedPoint
By: RedPoint     Published Date: Nov 10, 2014
Adoption of Hadoop by data-driven organizations is exploding. Hadoop’s potential cost effectiveness and facility for accepting unstructured data is making it central to modern, “Big Data” architectures. The advancements in Hadoop 2.0 increase the technology’s promise to an even greater extent. But with these opportunities also come challenges and adoption hurdles that make getting the most out of Hadoop easier said than done. Read on as we review some Hadoop basics, highlight some of the adoption challenges that exist and explain how RedPoint Data Management for Hadoop helps organizations accelerate their work with Hadoop.
Tags : 
     RedPoint
By: GridGain     Published Date: Sep 24, 2014
In-memory computing (IMC) is an emerging field of importance in the big data industry. It is a quickly evolving technology, seen by many as an effective way to address the proverbial 3 V’s of big data—volume, velocity, and variety. Big data requires ever more powerful means to process and analyze growing stores of data, being collected at more rapid rates, and with increasing diversity in the types of data being sought—both structured and unstructured. In-memory computing’s rapid rise in the marketplace has the big data community on alert. In fact, Gartner picked in-memory computing as one of the Top Ten Strategic Initiatives.
Tags : gridgain, in memory computing, big data industry, 3 v's of big data
     GridGain
By: TIBCO     Published Date: Sep 02, 2014
In this Guide you will learn how predictive analytics helps your organization predict with confidence what will happen next, so that you can make smarter decisions and improve business outcomes. It is important to adopt a predictive analytics solution that meets the specific needs of different users and skill sets, from beginners, to experienced analysts, to data scientists.
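As a generic illustration of the train-then-score workflow behind predictive analytics (a sketch using scikit-learn, not TIBCO's own tooling), the snippet below fits a model on historical outcomes and returns probability estimates - the "predict with confidence" part - for new records; the synthetic dataset stands in for real business data.

```python
# Generic predictive-analytics sketch: fit on history, score new records.
# The synthetic dataset is a stand-in for real historical business data.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

# Probability estimates, not just labels -- predictions with confidence.
print(f"holdout accuracy: {model.score(X_test, y_test):.2f}")
print("class probabilities for five new records:")
print(model.predict_proba(X_test[:5]))
```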
Tags : 
     TIBCO
By: EMC     Published Date: Jun 13, 2016
EMC Isilon CloudPools software provides policy-based automated tiering that lets you seamlessly integrate with the cloud as an additional storage tier for the Isilon cluster at your data center.
Tags : 
     EMC

Add White Papers

To get your white papers featured in the insideBIGDATA White Paper Library, contact: Kevin@insideHPC.com