database performance

By: NetApp     Published Date: Feb 19, 2015
NetApp Flash Pool is a storage cache option within the NetApp Virtual Storage Tier product family, available for NetApp FAS storage systems. A Flash Pool configures solid state drives (SSDs) and hard disk drives (HDDs) into a single storage pool, known as an “aggregate” in NetApp parlance, with the SSDs providing a fast response time cache for volumes that are provisioned on the Flash Pool aggregate. In this lab evaluation, NetApp commissioned Demartek to evaluate the effectiveness of Flash Pool with different types and numbers of hard disk drives using an online transaction processing (OLTP) database workload, and to evaluate the performance of Flash Pool in a clustered Data ONTAP environment during a cluster storage node failover scenario. In the report, you’ll discover how Demartek test engineers documented a 283% gain in IOPS and a reduction in latency by a factor of 66x after incorporating NetApp Flash Pool technology.
Tags : 
     NetApp
By: Vormetric, Inc.     Published Date: Jan 11, 2016
SAP has reviewed and qualified Vormetric’s Transparent Encryption as suitable for use in SAP HANA solution environments. Vormetric provides a proven approach to securing SAP data that meets rigorous security, data governance and compliance requirements. Vormetric Data Security can be quickly deployed to secure data while requiring no change to SAP, the underlying database or hardware infrastructure. This approach enables enterprises to meet data governance requirements with a rigorous separation of duties. Whether you are securing an existing SAP deployment or upgrading to a new version, Vormetric delivers a proven approach to quickly secure SAP data while ensuring SAP continues to operate at optimal performance.
Tags : sap hana, data security, encryption, virtustream, cloud security, encryption keys, big data, compliance, advanced persistent threats, apt, insider threat, security, data science, data storage
     Vormetric, Inc.
By: IBM     Published Date: Jul 06, 2017
Effectively using and managing information has become critical to driving growth in areas such as pursuing new business opportunities, attracting and retaining customers, and streamlining operations. In the era of big data, you must accommodate a rapidly increasing volume, variety and velocity of data while extracting actionable business insight from that data, faster than ever before. These needs create a daunting array of workload challenges and place tremendous demands on your underlying IT infrastructure and database systems. In many cases, these systems are no longer up to the task—so it’s time to make a decision. Do you use more staff to keep up with the fixes, patches, add-ons and continual tuning required to make your existing systems meet performance goals, or move to a new database solution so you can assign your staff to new, innovative projects that move your business forward?
Tags : database, growth, big data, it infrastructure, information management
     IBM
By: Group M_IBM Q1'18     Published Date: Dec 19, 2017
As organizations develop next-generation applications for the digital era, many are using cognitive computing ushered in by IBM Watson® technology. Cognitive applications can learn and react to customer preferences, and then use that information to support capabilities such as confidence-weighted outcomes with data transparency, systematic learning and natural language processing. To make the most of these next-generation applications, you need a next-generation database. It must handle a massive volume of data while delivering high performance to support real-time analytics. At the same time, it must provide data availability for demanding applications, scalability for growth and flexibility for responding to changes.
Tags : database, applications, data availability, cognitive applications
     Group M_IBM Q1'18
By: IBM     Published Date: Apr 03, 2018
Can your database systems handle data growth and keep up with performance requirements? Here are six reasons to change.
Tags : database systems, data management, hybrid data
     IBM
By: Scalebase     Published Date: Feb 19, 2013
This white paper examines how to scale MySQL databases to handle more users, more connections and more data without re-writing apps or re-architecting the database.
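A common way to scale reads in this fashion, without rewriting applications, is transparent read/write splitting: a routing layer sends writes to the primary and distributes reads across replicas. The sketch below illustrates only the routing decision; the connection names are hypothetical placeholders, not any specific product's API.

```python
# Minimal sketch of read/write splitting for MySQL scale-out.
# PRIMARY and REPLICAS stand in for real connection handles.
import itertools

PRIMARY = "primary"                                      # hypothetical primary connection
REPLICAS = itertools.cycle(["replica-1", "replica-2"])   # round-robin over read replicas

def route(statement: str) -> str:
    """Send read-only statements to a replica, everything else to the primary."""
    first_word = statement.lstrip().split(None, 1)[0].upper()
    if first_word in ("SELECT", "SHOW", "EXPLAIN"):
        return next(REPLICAS)    # reads: any replica can serve them
    return PRIMARY               # writes and DDL: must hit the primary
```

Because the application still issues ordinary SQL, the split can be added at a proxy layer with no application changes, which is the core idea behind this class of scale-out products.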
Tags : mysql, databases, users, data distribution, improve, performance
     Scalebase
By: NetApp     Published Date: Sep 18, 2014
The NetApp flash portfolio is capable of solving database performance and I/O latency problems encountered by many database deployments. The majority of databases have a random I/O workload that creates performance problems for spinning media, but is well-suited for today’s flash technologies. NetApp has a diverse enterprise-class flash portfolio consisting of flash in the storage controller (Flash Cache™ intelligent caching), flash within the disk shelves (Flash Pool™ intelligent caching), and all-flash arrays (EF-Series and All-flash FAS). This portfolio can be used to solve complex database performance requirements at multiple levels within a customer’s Oracle environment. This document reviews Oracle database observations and results when implementing flash technologies offered within the NetApp flash portfolio.
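The caching idea behind products like Flash Cache and Flash Pool is that a small, fast flash tier absorbs repeated random reads so the slow spinning disks are touched far less often. The toy LRU block cache below illustrates that principle only; it is not NetApp's actual caching algorithm, and all names are illustrative.

```python
from collections import OrderedDict

class FlashCacheSketch:
    """Toy LRU block cache: hot blocks are served from 'flash',
    misses fall through to the slow 'disk' dictionary."""

    def __init__(self, capacity_blocks: int):
        self.capacity = capacity_blocks
        self.cache = OrderedDict()   # block_id -> data, ordered by recency
        self.hits = self.misses = 0

    def read(self, block_id, disk):
        if block_id in self.cache:
            self.hits += 1
            self.cache.move_to_end(block_id)      # mark as most recently used
            return self.cache[block_id]
        self.misses += 1
        data = disk[block_id]                     # slow HDD read on a miss
        self.cache[block_id] = data
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)        # evict least recently used
        return data
```

For a random OLTP workload with a hot working set that fits in flash, most reads become cache hits, which is why the random-I/O workloads described above benefit so strongly from flash tiers.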
Tags : database performance, database deployment, flash technology, enterprise techology
     NetApp
By: NetApp     Published Date: Sep 22, 2014
The NetApp EF-Series of all-flash arrays is designed specifically for database-driven environments demanding maximum performance, reliability, and availability. This ESG Lab Report documents the real-world performance, reliability, availability, and serviceability of NetApp EF-Series flash arrays in Oracle database environments. A combination of hands-on testing by ESG Lab and audited in-house performance testing executed by NetApp were used to create this report. In this report, you’ll learn how ESG validated NetApp’s EF-550 flash array performance of over 400,000 IOPS with sub-millisecond latency, while maintaining six-nines (99.9999%) availability.
Tags : flash arrays, performance-driven databases, enterprise storage, database environment, serviceability, real world performance
     NetApp
By: NetApp     Published Date: Sep 22, 2014
NetApp Flash Pool is a storage cache option within the NetApp Virtual Storage Tier product family, available for NetApp FAS storage systems. A Flash Pool configures solid state drives (SSDs) and hard disk drives (HDDs) into a single storage pool, known as an “aggregate” in NetApp parlance, with the SSDs providing a fast response time cache for volumes that are provisioned on the Flash Pool aggregate. In this lab evaluation, NetApp commissioned Demartek to evaluate the effectiveness of Flash Pool with different types and numbers of hard disk drives using an online transaction processing (OLTP) database workload, and to evaluate the performance of Flash Pool in a clustered Data ONTAP environment during a cluster storage node failover scenario. In the report, you’ll discover how Demartek test engineers documented a 283% gain in IOPS and a reduction in latency by a factor of 66x after incorporating NetApp Flash Pool technology.
Tags : flash pool, fas storage systems, ssd, online transaction processing, cluster storage
     NetApp
By: MarkLogic     Published Date: Mar 13, 2015
Big Data has been in the spotlight recently, as businesses seek to leverage their untapped information resources and win big on the promise of big data. However, the problem with big data initiatives is that organizations try to use existing information management practices and legacy relational database technologies, which often collapse under the sheer weight of the data. In this paper, MarkLogic explains how a new approach is needed to handle the volume, velocity, and variety of big data because the current relational model that has been the status quo is not working. Learn about the NoSQL paradigm shift, and why NoSQL is gaining significant market traction because it solves the fundamental challenges of big data, achieving better performance, scalability, and flexibility. Learn how MarkLogic’s customers are reimagining their data to: - Make the world more secure - Provide access to valuable information - Create new revenue streams - Gain insights to increase market share - Reduce b
Tags : enterprise, nosql, relational, databases, data storage, management system, application, scalable
     MarkLogic
By: NetApp     Published Date: Jun 01, 2017
Research on how service providers are using flash storage to create innovative, next generation storage services.
Tags : netapp, database performance, flash storage, data management, cost challenges
     NetApp
By: NetApp     Published Date: Jun 01, 2017
The implications of getting your next-generation data center strategy wrong can be fatal for a cloud and hosting business. Given these high stakes, the Fueled by NetApp Consulting team has seen service providers test the waters with many different go-to-market strategies.
Tags : netapp, database performance, flash storage, data management, cost challenges
     NetApp
By: IBM     Published Date: May 23, 2017
Flexible deployment options and licensing models help take the challenges out of change. As you move toward the cloud, you're likely planning or managing a mixed environment of on-premises and on-cloud applications. To help you succeed in this transition, you need a transformative, mixed-workload database that can handle a massive volume of data while delivering high performance, data availability and the flexibility to adapt and respond to business changes.
Tags : cloud applications, mobile optimization, web-based applications, data availability, ibm, db2
     IBM
By: IBM     Published Date: Jun 08, 2017
This paper presents a cost/benefit case for two leading enterprise database contenders -- IBM DB2 11.1 for Linux, UNIX, and Windows (DB2 11.1 LUW) and Oracle Database 12c -- with regard to delivering effective security capabilities, high-performance OLTP capacity and throughput, and efficient systems configuration and management automation. Comparisons are of database installations in the telecommunications, healthcare, and consumer banking industries. For OLTP workloads in these environments, three-year costs average 32 percent less for use of DB2 11.1 compared to Oracle 12c.
Tags : ibm, linux, windows, telecommunications, healthcare, oracle database
     IBM
By: IBM     Published Date: Jul 26, 2017
This paper presents a cost/benefit case for two leading enterprise database contenders -- IBM DB2 11.1 for Linux, UNIX, and Windows (DB2 11.1 LUW) and Oracle Database 12c -- with regard to delivering effective security capabilities, high-performance OLTP capacity and throughput, and efficient systems configuration and management automation. Comparisons are of database installations in the telecommunications, healthcare, and consumer banking industries. For OLTP workloads in these environments, three-year costs average 32 percent less for use of DB2 11.1 compared to Oracle 12c.
Tags : ibm, enterprise data, windows, linux, telecommunications, healthcare, consumer banking
     IBM
By: IBM     Published Date: Sep 28, 2017
This paper presents a cost/benefit case for two leading enterprise database contenders -- IBM DB2 11.1 for Linux, UNIX, and Windows (DB2 11.1 LUW) and Oracle Database 12c -- with regard to delivering effective security capabilities, high-performance OLTP capacity and throughput, and efficient systems configuration and management automation. Comparisons are of database installations in the telecommunications, healthcare, and consumer banking industries. For OLTP workloads in these environments, three-year costs average 32 percent less for use of DB2 11.1 compared to Oracle 12c.
Tags : ibm, enterprise database, oltp, telecommunications, healthcare, consumer banking
     IBM
By: IBM     Published Date: Nov 08, 2017
Flexible deployment options and licensing models help take the challenges out of change. As you move toward the cloud, you're likely planning or managing a mixed environment of on-premises and on-cloud applications. To help you succeed in this transition, you need a transformative, mixed-workload database that can handle a massive volume of data while delivering high performance, data availability and the flexibility to adapt and respond to business changes.
Tags : ibm, cloud, cloud computing, database, ibm db2
     IBM
By: IBM     Published Date: Nov 08, 2017
Flexible deployment options and licensing models help take the challenges out of change. As you move toward the cloud, you're likely planning or managing a mixed environment of on-premises and on-cloud applications. To help you succeed in this transition, you need a transformative, mixed-workload database that can handle a massive volume of data while delivering high performance, data availability and the flexibility to adapt and respond to business changes.
Tags : ibm db2, cloud, on-cloud applications, mixed-workload database
     IBM
By: Datastax     Published Date: May 20, 2019
Apache Cassandra™ comes with the typical benefits of any NoSQL database, and much more. When enterprises need something easily scalable and ready for today’s hybrid cloud environments, it’s hard to find a database better suited for the job than Cassandra. From performance to availability to hybrid cloud readiness, this ebook explains the five main benefits of Cassandra.
Tags : 
     Datastax
By: Oracle + Dyn     Published Date: Jun 29, 2017
Every user’s first interaction with your website begins with a series of DNS queries. The Domain Name System (DNS) is a distributed internet database that maps human-readable names to IP addresses, ensuring users reach the correct online asset (website, application, etc.) efficiently. Knowing the complexities and best practices of this layer of your online infrastructure will help your organization build redundancy, improve end-user performance and establish a top-notch DR plan. Download this guide to DNS top terms and actionable concepts, including: • Anycast vs. Unicast networks • CNAME • DDoS and Hijacking • Load Balancing and GSLB
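One of the terms the guide covers, CNAME, is an alias record: resolving a name may require following a chain of CNAMEs until an A record finally yields an IP address. The toy resolver below illustrates that chasing process against a made-up record set; the names and addresses are hypothetical (192.0.2.0/24 is the documentation range), and this is not how a production resolver is implemented.

```python
# Toy illustration of CNAME chasing during DNS resolution.
# ZONE is an invented record set: name -> (record type, value).
ZONE = {
    "www.example.com":   ("CNAME", "cdn.example.net"),
    "cdn.example.net":   ("CNAME", "edge1.example.net"),
    "edge1.example.net": ("A", "192.0.2.10"),
}

def resolve(name: str, max_chain: int = 8) -> str:
    """Follow CNAME records until an A record yields an IP address."""
    for _ in range(max_chain):      # bound the chain to guard against loops
        rtype, value = ZONE[name]
        if rtype == "A":
            return value
        name = value                # CNAME: restart the lookup at the target
    raise RuntimeError("CNAME chain too long")
```

Each CNAME hop can mean an extra query, which is one reason DNS design choices like these directly affect the end-user performance the guide discusses.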
Tags : 
     Oracle + Dyn
By: Group M_IBM Q1'18     Published Date: Jan 23, 2018
Flexible deployment options and licensing models help take the challenges out of change. As you move toward the cloud, you're likely planning or managing a mixed environment of on-premises and on-cloud applications. To help you succeed in this transition, you need a transformative, mixed-workload database that can handle a massive volume of data while delivering high performance, data availability and the flexibility to adapt and respond to business changes.
Tags : cloud applications, database, data volume, data availability
     Group M_IBM Q1'18
By: Infinidat EMEA     Published Date: May 14, 2019
Databases represent the backbone of most organizations, and Oracle databases in particular have become the mainstream data repository for most mission-critical environments. Some of the largest companies and organizations in the world rely on Oracle databases to store their most important data. The biggest challenge organizations face with Oracle databases is maintaining them at optimum performance and reliability without breaking the bank. This paper discusses the storage capabilities customers should consider when choosing storage to support an Oracle database environment.
Tags : 
     Infinidat EMEA
By: Oracle + Dyn     Published Date: Jun 27, 2017
Every user’s first interaction with your website begins with a series of DNS queries. The Domain Name System (DNS) is a distributed internet database that maps human-readable names to IP addresses, ensuring users reach the correct online asset (website, application, etc.) efficiently. Knowing the complexities and best practices of this layer of your online infrastructure will help your organization build redundancy, improve end-user performance and establish a top-notch DR plan. Download this guide to DNS top terms and actionable concepts, including: • Anycast vs. Unicast networks • CNAME • DDoS and Hijacking • Load Balancing and GSLB Learn more!
Tags : 
     Oracle + Dyn
By: Oracle     Published Date: May 03, 2017
Traditional backup systems fail to meet the needs of modern organisations by focusing on backup, not recovery. They treat databases as generic files to be copied, rather than as transactional workloads with specific data integrity, consistency, performance, and availability requirements.
Tags : 
     Oracle
By: Dynatrace     Published Date: Apr 16, 2018
Based on a global survey of 800 CIOs, this report examines the challenges organizations face when working within complex, cloud-centric ecosystems. Technology is at the heart of every organization today. Now more than ever, society expects the services we use to be innovative and faultless, prompting the creation of hyper-complex IT ecosystems. Relying on physical databases and third-party cloud service providers, businesses are finding it increasingly difficult to monitor application performance, ensure positive experiences, and succeed in this new environment.
Tags : 
     Dynatrace