data management

Results 1 - 25 of 2406
By: IBM     Published Date: Sep 02, 2014
This book examines data storage and management challenges and explains software-defined storage, an innovative solution for high-performance, cost-effective storage using the IBM General Parallel File System (GPFS).
Tags : ibm, software storage for dummies
     IBM
By: IBM     Published Date: Sep 02, 2014
Life Sciences organizations need to build IT infrastructures that are dynamic, scalable, and easy to deploy and manage, with simplified provisioning, high performance and high utilization, and the ability to exploit both data-intensive and server-intensive workloads, including Hadoop MapReduce. Solutions must scale, in terms of both processing and storage, in order to better serve the institution long-term. Managing the life cycle of data, and making it usable for mainstream analyses and applications, is an important aspect of system design. This presentation will describe IT requirements and how Technical Computing solutions from IBM and Platform Computing address these challenges and deliver greater ROI and accelerated time to results for Life Sciences.
Tags : 
     IBM
By: IBM     Published Date: Sep 02, 2014
With tougher regulations and continuing market volatility, financial firms are moving to active risk management with a focus on counterparty risk. Firms are revamping their risk and trading practices from top to bottom. They are adopting new risk models and frameworks that support a holistic view of risk. Banks recognize that technology is critical for this transformation, and are adding state-of-the-art enterprise risk management solutions, high performance data and grid management software, and fast hardware. Join IBM Algorithmics and IBM Platform Computing to gain insights on this trend and on technologies for enabling active "real-time" risk management.
Tags : 
     IBM
By: IBM     Published Date: May 20, 2015
Join IBM and Nuance Communications Inc. to learn how Nuance uses IBM Elastic Storage to improve the power of their voice recognition applications by managing storage growth, cost and complexity while increasing performance and data availability. View the webcast to learn how you can:
· Lower data management costs through policy-driven automation and tiered storage management
· Manage and increase storage agility through software-defined storage
· Remove data-related bottlenecks to deliver application performance
Tags : 
     IBM
By: IBM     Published Date: May 20, 2015
Every day, the world creates 2.5 quintillion bytes of data, and businesses are realizing tangible results from investments in big data analytics. IBM Spectrum Scale (GPFS) offers an enterprise-class alternative to the Hadoop Distributed File System (HDFS) for building big data platforms and provides a range of enterprise-class data management features. Spectrum Scale can be deployed independently or with IBM’s big data platform, consisting of IBM InfoSphere® BigInsights™ and IBM Platform™ Symphony. This document describes best practices for deploying Spectrum Scale in such environments to help ensure optimal performance and reliability.
Tags : 
     IBM
By: IBM     Published Date: May 20, 2015
According to our global study of more than 800 cloud decision makers, users are becoming increasingly focused on the business value cloud provides. Cloud is integral to mobile, social and analytics initiatives, and to the big data management challenge that often comes with them; it helps power the entire suite of game-changing technologies. Enterprises can aim higher when these deployments are riding on the cloud. Mobile, analytics and social implementations can be bigger, bolder and drive greater impact when backed by scalable infrastructure. In addition to scale, cloud can provide integration, gluing the individual technologies into more cohesive solutions. Learn how companies are gaining a competitive advantage with cloud computing.
Tags : 
     IBM
By: IBM     Published Date: Sep 16, 2015
A fast, simple, scalable and complete storage solution for today’s data-intensive enterprise. IBM Spectrum Scale is used extensively across industries worldwide. Spectrum Scale simplifies data management with integrated tools designed to help organizations manage petabytes of data and billions of files—as well as control the cost of managing these ever-growing data volumes.
Tags : 
     IBM
By: RedPoint     Published Date: Sep 22, 2014
Enterprises can gain serious traction by taking advantage of the scalability, processing power and lower costs that Hadoop 2.0/YARN offers. YARN closes the functionality gap by opening Hadoop to mature enterprise-class data management capabilities. With a lot of data quality functionality left outside of Hadoop 1, and a lot of data inside HDFS originating outside the enterprise, the quality of the data residing in the Hadoop cluster is sometimes as stinky as elephant dung. Some of the topics discussed in this paper include:
• The key features, benefits and limitations of Hadoop 1.0
• The benefits of performing data standardization, identity resolution and master data management inside of Hadoop
• The transformative power of Hadoop 2.0 and its impact on the speed and cost of accessing, cleansing and delivering high-quality enterprise data
Download this illuminating white paper about what YARN really means to the world of big data management.
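To make the terms in this abstract concrete: "data standardization" normalizes messy field values so equivalent records compare equal, and "identity resolution" groups records that refer to the same real-world entity. The sketch below illustrates both ideas in plain Python; it is not RedPoint or Hadoop code, and all record fields and names are invented for illustration.

```python
import re

def standardize(record):
    """Normalize free-form name and phone fields so duplicates compare equal."""
    name = re.sub(r"\s+", " ", record["name"].strip().lower())
    phone = re.sub(r"\D", "", record["phone"])  # keep digits only
    return {"name": name, "phone": phone}

def resolve_identities(records):
    """Group records whose standardized (name, phone) keys match."""
    groups = {}
    for rec in records:
        key = tuple(standardize(rec).values())
        groups.setdefault(key, []).append(rec)
    return groups

raw = [
    {"name": "Ada  Lovelace", "phone": "(555) 010-1234"},
    {"name": "ada lovelace",  "phone": "555.010.1234"},
    {"name": "Alan Turing",   "phone": "555-010-9999"},
]
groups = resolve_identities(raw)
print(len(groups))  # → 2 (the two Lovelace records collapse into one identity)
```

In a Hadoop setting the same standardize-then-group logic would run in parallel across the cluster, with the grouping key acting as the shuffle key.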
Tags : 
     RedPoint
By: RedPoint     Published Date: Sep 22, 2014
The emergence of YARN for the Hadoop 2.0 platform has opened the door to new tools and applications that promise to allow more companies to reap the benefits of big data in ways never before possible, with outcomes perhaps never imagined. By separating cluster resource management from the data processing function, YARN offers a world beyond MapReduce: less encumbered by complex programming protocols, faster, and at a lower cost. Some of the topics discussed in this paper include:
• Why YARN is important for realizing the power of Hadoop for data integration, quality and management
• Benchmark results of MapReduce vs. Pig vs. visual “data flow” design tools
• The 3 key features of YARN that solve the complex problems that prohibit businesses from gaining maximum benefit from Hadoop
Download this paper to learn why the power of Hadoop 2.0 lies in enabling applications to run inside Hadoop, without the constraints of MapReduce.
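The separation this abstract describes, a cluster-wide ResourceManager that hands out generic containers while per-application logic decides what runs in them, can be caricatured in a few lines. This is a purely illustrative toy, not the Hadoop YARN API; the class and function names are invented.

```python
class ResourceManager:
    """Toy cluster scheduler: tracks free slots, knows nothing about workloads."""
    def __init__(self, slots):
        self.free = slots

    def allocate(self, wanted):
        granted = min(wanted, self.free)
        self.free -= granted
        return granted

def run_app(rm, name, tasks, task_fn):
    """Per-application logic (the 'ApplicationMaster' role in YARN terms):
    request containers, then run its own processing function in them."""
    granted = rm.allocate(tasks)
    return [task_fn(f"{name}-task{i}") for i in range(granted)]

rm = ResourceManager(slots=4)
# Two different engines share one cluster; neither is baked into the scheduler.
out1 = run_app(rm, "wordcount", 3, lambda t: f"mapreduce:{t}")
out2 = run_app(rm, "sql-query", 3, lambda t: f"other-engine:{t}")  # only 1 slot left
print(len(out1), len(out2))  # → 3 1
```

The point of the sketch is that `ResourceManager` never mentions MapReduce: any processing function can claim containers, which is exactly the generality YARN added over Hadoop 1.0.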
Tags : 
     RedPoint
By: RedPoint     Published Date: Nov 10, 2014
Adoption of Hadoop by data-driven organizations is exploding. Hadoop’s potential cost effectiveness and facility for accepting unstructured data is making it central to modern, “Big Data” architectures. The advancements in Hadoop 2.0 increase the technology’s promise to an even greater extent. But with these opportunities also come challenges and adoption hurdles that make getting the most out of Hadoop easier said than done. Read on as we review some Hadoop basics, highlight some of the adoption challenges that exist and explain how RedPoint Data Management for Hadoop helps organizations accelerate their work with Hadoop.
Tags : 
     RedPoint
By: IBM     Published Date: Nov 14, 2014
Every day, the world creates 2.5 quintillion bytes of data, and businesses are realizing tangible results from investments in big data analytics. IBM Spectrum Scale (GPFS) offers an enterprise-class alternative to the Hadoop Distributed File System (HDFS) for building big data platforms and provides a range of enterprise-class data management features. Spectrum Scale can be deployed independently or with IBM’s big data platform, consisting of IBM InfoSphere® BigInsights™ and IBM Platform™ Symphony. This document describes best practices for deploying Spectrum Scale in such environments to help ensure optimal performance and reliability.
Tags : 
     IBM
By: IBM     Published Date: Nov 14, 2014
View this series of short webcasts to learn how IBM Platform Computing products can help you ‘maximize the agility of your distributed computing environment’ by improving operational efficiency, simplifying the user experience, optimizing application usage and license sharing, addressing spikes in infrastructure demand, and reducing data management costs.
Tags : 
     IBM
By: IBM     Published Date: Feb 13, 2015
To quickly and economically meet new and peak demands, Platform LSF (SaaS) and Platform Symphony (SaaS) workload management as well as Elastic Storage on Cloud data management software can be delivered as a service, complete with SoftLayer cloud infrastructure and 24x7 support for technical computing and service-oriented workloads. Watch this demonstration to learn how the IBM Platform Computing Cloud Service can be used to simplify and accelerate financial risk management using IBM Algorithmics.
Tags : 
     IBM
By: Intel     Published Date: Aug 06, 2014
Around the world and across all industries, high-performance computing is being used to solve today’s most important and demanding problems. More than ever, storage solutions that deliver high sustained throughput are vital for powering HPC and Big Data workloads. Intel® Enterprise Edition for Lustre* software unleashes the performance and scalability of the Lustre parallel file system for enterprises and organizations, both large and small. It allows users and workloads that need large-scale, high-bandwidth storage to tap into the power and scalability of Lustre, but with the simplified installation, configuration, and monitoring features of Intel® Manager for Lustre* software, a management solution purpose-built for the Lustre file system. Intel® Enterprise Edition for Lustre* software includes proven support from the Lustre experts at Intel, including worldwide 24x7 technical support. *Other names and brands may be claimed as the property of others.
Tags : 
     Intel
By: Dell and Intel®     Published Date: Aug 24, 2015
To extract value from an ever-growing onslaught of data, your organization needs next-generation data management, integration, storage and processing systems that allow you to collect, manage, store and analyze data quickly, efficiently and cost-effectively. That’s the case with Dell | Cloudera® Apache™ Hadoop® solutions for big data. These solutions provide end-to-end scalable infrastructure, leveraging open source technologies, to allow you to simultaneously store and process large datasets in a distributed environment for data mining and analysis, on both structured and unstructured data, and to do it all in an affordable manner.
Tags : 
     Dell and Intel®
By: Adaptive Computing     Published Date: Feb 27, 2014
Big data applications represent a fast-growing category of high-value applications that are increasingly employed by business and technical computing users. However, they have exposed an inconvenient dichotomy in the way resources are utilized in data centers. Conventional enterprise and web-based applications can be executed efficiently in virtualized server environments, where resource management and scheduling is generally confined to a single server. By contrast, data-intensive analytics and technical simulations demand large aggregated resources, necessitating intelligent scheduling and resource management that spans a computer cluster, cloud, or entire data center. Although these tools exist in isolation, they are not available in a general-purpose framework that allows them to interoperate easily and automatically within existing IT infrastructure.
Tags : 
     Adaptive Computing
By: Aberdeen     Published Date: Jun 17, 2011
Download this paper to learn the top strategies leading executives are using to take full advantage of the insight they receive from their business intelligence (BI) systems - and turn that insight into a competitive weapon.
Tags : aberdeen, michael lock, data-driven decisions, business intelligence, public sector, analytics, federal, state, governmental, decisions, data management
     Aberdeen
By: Rubrik EMEA     Published Date: Apr 15, 2019
Ransomware is not going away. This makes it imperative for businesses across all industries to adopt a data management strategy of multi-layered security, easy automation, and quick recovery. To learn more about Rubrik and how it can fit into your ransomware protection strategy while simplifying data protection across your entire datacenter, visit www.rubrik.com. As the leading next-generation data protection solution, Rubrik deploys as a plug-and-play appliance in less than an hour and has been adopted across all verticals and organization sizes, including Fortune 50 companies.
Tags : data, backup, recovery, backups, solution, security, encryption, cyber, privacy, infrastructure
     Rubrik EMEA
By: Rubrik EMEA     Published Date: Apr 15, 2019
Backup and recovery needs a radical rethink. When today’s incumbent solutions were designed over a decade ago, IT environments were exploding, heterogeneity was increasing, and backup was the protection of last resort. The goal was to provide a low-cost insurance policy for data and to support this increasingly complex multi-tier, heterogeneous environment. The answer was to patch together backup and recovery solutions under a common vendor management framework and to minimize costs by moving data across the infrastructure or media.
Tags : data, backup, recovery, virtualization, solutions, cloud, architectures
     Rubrik EMEA
By: Schneider Electric     Published Date: Mar 28, 2019
Attracting Investors Webinar: With more than $18 billion in M&A activity in the first half of last year alone, the colocation industry is riding the bubble of rapid growth. Colocation data center providers are being evaluated by a wide range of investors with varying experience and perspectives. Understanding the evaluation criteria is a critical competency for attracting the right type of investor and investment to your colocation business. Steve Wallage, Managing Director of Broad Group Consulting, has led more than 30 due diligence projects and will discuss specific areas of focus, including assessment of financials, management, customers, business plan, competitive positioning, and future strategy and exit. By attending this presentation colocation providers will:
• Hear how investors are assessing colocation providers
• Understand different types of investor strategy and positioning
• Explore actual case studies – success stories as well as examples where investors walked away
• Walk away with a greater understanding of how to attract not only investment, but the right type of investor to propel their business growth
Tags : investors, schneider electric, - colocation provider, attracting investors, colo data center
     Schneider Electric
By: Intel     Published Date: Apr 11, 2019
The Internet of Things (IoT) unleashes valuable business insights through data that’s gathered at every level of a retail organization. With IoT and data analytics, retailers now have the capability to gather insight into customer behavior, offer more personalized experiences, achieve better inventory accuracy, create greater supply chain efficiencies, and much more. But with data comes great risk. A recent report by security firm Thales and 451 Research found that 43 percent of retailers have experienced a data breach in the past year, with a third reporting more than one breach.1 Intel® technology-based gateways and Asavie, a provider of next-gen enterprise mobility management and IoT connectivity solutions, offer a secure connectivity solution that minimizes the effort and cost required to protect businesses from cybersecurity attacks. In addition, the Intel/Asavie IoT solution provides retailers with a solid basis on which to build their smart, connected projects.
Tags : 
     Intel
By: VMware     Published Date: Apr 09, 2019
Lower your data center costs with VMware vSAN. In this infographic, we’ll show you how to reduce CAPEX and OPEX with a pay-as-you-grow model that enables rapid application deployment and easy ongoing management.
Tags : 
     VMware
By: Gigamon     Published Date: Mar 26, 2019
Download “How to Strengthen Security While Optimizing Network Performance” to see how next-generation network packet brokers (NGNPBs) mitigate security tool sprawl, simplify IT management and improve network availability. NGNPBs help align network and security teams by creating a single view of network infrastructure and data management. See why you should shift your infrastructure strategy toward NGNPBs to boost efficiency and reduce complexity. Learn more by downloading this new research now.
Tags : 
     Gigamon
By: Illusive Networks     Published Date: Apr 10, 2019
How well-equipped is your organization to stop malicious attackers once they’re inside your network? According to this study of over 600 IT security professionals, almost two-thirds of respondents lack efficient capabilities to detect and investigate “stealth” attackers before serious damage occurs. Download the report to learn the primary obstacles to better threat detection and incident response, how well organizations are hardening their environments against lateral movement, and how cybersecurity budgets are changing to address the reality that attackers will get in.
Tags : risk management, it security, ponemon institute, ponemon, cybersecurity, research report, cyber attack, data breach, apt, targeted attacks, threat management, cyber crime, cyber risk, illusive networks
     Illusive Networks
By: Datastax     Published Date: Apr 08, 2019
For decades, organizations relied on traditional relational database management systems (RDBMS) to organize, store, and analyze their data. But then Facebook came along, and an RDBMS was suddenly not quite enough. The social giant needed a powerful database solution for its Inbox Search feature, and Apache Cassandra—a distributed NoSQL database—was born. Released as an open source project in July 2008, Cassandra—named after the mythological prophet who famously put a curse on an oracle—became an Apache Incubator project in March 2009. It graduated to a top-level project in February 2010.
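The "distributed" part of Cassandra's design rests on placing data on a token ring: each key is hashed to a position on the ring, and the node owning that arc stores it. The sketch below illustrates the idea in self-contained Python; it is not Cassandra code (Cassandra uses Murmur3 partitioning and virtual nodes, whereas this toy uses MD5 and one token per node).

```python
import bisect
import hashlib

def token(key):
    """Hash a key onto a 0..2^32 ring (MD5 here purely for brevity)."""
    return int(hashlib.md5(key.encode()).hexdigest(), 16) % 2**32

class Ring:
    """Minimal consistent-hash ring: each node owns the arc ending at its token."""
    def __init__(self, nodes):
        self.tokens = sorted((token(n), n) for n in nodes)

    def owner(self, key):
        positions = [t for t, _ in self.tokens]
        # First node at or past the key's token; wrap around the ring if needed.
        i = bisect.bisect(positions, token(key)) % len(self.tokens)
        return self.tokens[i][1]

ring = Ring(["node-a", "node-b", "node-c"])
placements = {k: ring.owner(k) for k in ["user:1", "user:2", "user:3"]}
```

Because placement is computed from the key alone, any node can route a request without a central lookup table, which is what lets this style of database scale out horizontally.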
Tags : 
     Datastax