
By: NetApp     Published Date: Dec 14, 2013
Read how the NetApp Distributed Content Repository Solution is an efficient, risk-reducing active archive solution. Based on customer data, Forrester created a composite organization and concluded that the NetApp Distributed Content Repository delivered a three-year ROI of 47% with a payback period of 1.3 months. The key benefits are reduced risk of losing unregulated archived data, denser storage, storage solution efficiency, and compliance for regulated data. The study also provides readers with a framework to conduct their own financial impact evaluation. Source: The Total Economic Impact Of The NetApp Distributed Content Repository Solution (StorageGRID On E-Series), a commissioned study conducted by Forrester Consulting on behalf of NetApp, March 2013.
Tags : forrester tei
     NetApp
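The ROI and payback-period figures cited above follow standard definitions. As a sketch with invented numbers (not Forrester's actual model or inputs), the arithmetic looks like:

```python
# Hypothetical illustration of the two TEI-style metrics cited above.
# All dollar amounts are invented for the example.

def roi(total_benefits, total_costs):
    """ROI = (benefits - costs) / costs, usually quoted as a percentage."""
    return (total_benefits - total_costs) / total_costs * 100

def payback_months(initial_cost, monthly_benefit):
    """Months until cumulative benefits cover the initial outlay."""
    return initial_cost / monthly_benefit

print(round(roi(1_470_000, 1_000_000)))            # 47
print(round(payback_months(130_000, 100_000), 1))  # 1.3
```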
By: NetApp     Published Date: Dec 18, 2013
IT managers have indicated that their two most significant challenges in managing unstructured data across multiple locations are keeping pace with data growth and improving data protection. Learn how the NetApp Distributed Content Repository provides advanced data protection and system recovery capabilities that enable multiple data centers and remote offices to maintain access to data through hardware and software faults. Key benefits are:
- Continuous access to file data, maintaining data redundancy with no administrator intervention needed
- Easy integration and deployment into a distributed environment, providing transparent, centrally managed content storage
- Secure multi-tenancy using security partitions
- Effectively infinite, on-demand capacity with fast access to files and objects in the cloud
- Secure, robust data protection techniques that enable data to persist beyond the life of the storage it resides on
Tags : 
     NetApp
By: Storiant     Published Date: Mar 16, 2015
Read this new IDC Report about how today's enterprise datacenters are dealing with new challenges that are far more demanding than ever before. Foremost is the exponential growth of data, most of it unstructured. Big data and analytics implementations are also quickly becoming a strategic priority in many enterprises, demanding online access to more data, retained for longer periods of time. Legacy storage solutions with fixed design characteristics and a cost structure that doesn't scale are proving ill-suited for these new needs. This Technology Spotlight examines the issues that are driving organizations to replace older archive and backup-and-restore systems with business continuity and always-available solutions that can scale to handle extreme data growth while leveraging a cloud-based pricing model. The report also looks at the role of Storiant and its long-term storage services solution in the strategically important long-term storage market.
Tags : storiant, big data, analytics implementations, cloud-based pricing model, long-term storage services solution, long-term storage market
     Storiant
By: RedPoint     Published Date: Sep 22, 2014
The emergence of YARN for the Hadoop 2.0 platform has opened the door to new tools and applications that promise to let more companies reap the benefits of big data in ways never before possible. By separating cluster resource management from the data processing function, YARN offers a world beyond MapReduce: less encumbered by complex programming protocols, faster, and at a lower cost. Topics discussed in this paper include:
• Why YARN is important for realizing the power of Hadoop for data integration, quality, and management
• Benchmark results of MapReduce vs. Pig vs. visual “data flow” design tools
• The three key features of YARN that solve the complex problems preventing businesses from gaining maximum benefit from Hadoop
Download this paper to learn why the power of Hadoop 2.0 lies in enabling applications to run inside Hadoop, without the constraints of MapReduce.
Tags : 
     RedPoint
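For context, the fixed map → shuffle → reduce pipeline that pre-YARN Hadoop imposed on every job can be sketched in a few lines. This is a toy stand-in for illustration only, not Hadoop's or RedPoint's actual code:

```python
from collections import defaultdict

def map_phase(lines):
    """Emit (word, 1) pairs, as a mapper would."""
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def shuffle(pairs):
    """Group values by key, as the framework's shuffle/sort step does."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Sum counts per word, as a reducer would."""
    return {word: sum(counts) for word, counts in groups.items()}

lines = ["the quick brown fox", "the lazy dog"]
counts = reduce_phase(shuffle(map_phase(lines)))
print(counts["the"])  # 2
```

YARN's contribution is that an application no longer has to be expressible in this rigid three-stage shape to run on the cluster; it can request resources directly.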
By: RedPoint     Published Date: Nov 10, 2014
Adoption of Hadoop by data-driven organizations is exploding. Hadoop’s potential cost effectiveness and facility for accepting unstructured data is making it central to modern, “Big Data” architectures. The advancements in Hadoop 2.0 increase the technology’s promise to an even greater extent. But with these opportunities also come challenges and adoption hurdles that make getting the most out of Hadoop easier said than done. Read on as we review some Hadoop basics, highlight some of the adoption challenges that exist and explain how RedPoint Data Management for Hadoop helps organizations accelerate their work with Hadoop.
Tags : 
     RedPoint
By: TIBCO     Published Date: Sep 02, 2014
In this Guide you will learn how predictive analytics helps your organization predict with confidence what will happen next so that you can make smarter decisions and improve business outcomes. It is important to adopt a predictive analytics solution that meets the specific needs of different users and skill sets from beginners, to experienced analysts, to data scientists.
Tags : 
     TIBCO
By: Data Direct Networks     Published Date: Apr 08, 2014
DataDirect Networks (DDN), the largest privately held provider of high-performance storage, has a large and growing presence in HPC markets. HPC users identify DDN as their storage provider more than any other storage-focused company, with twice the mentions of EMC, and more than twice the mentions of NetApp, Hitachi Data Systems, or Panasas.(5) DDN’s strength in HPC is anchored by its Storage Fusion Architecture (SFA), winner of the HPCwire Editor’s Choice Award for “Best HPC Storage Product or Technology” in each of the past three years. The DDN SFA12KX combines SATA, SAS, and solid-state disks (SSDs) for an environment that can be tailored to a balance of throughput and capacity.
Tags : 
     Data Direct Networks
By: IBM     Published Date: Nov 14, 2014
Platform Symphony is an enterprise-class server platform that delivers low-latency, scaled-out MapReduce workloads. It supports multiple applications running concurrently so that organizations can increase utilization across all resources resulting in a high return on investment.
Tags : 
     IBM
By: IBM     Published Date: Feb 13, 2015
IBM® has created a proprietary implementation of the open-source Hadoop MapReduce run-time that leverages the IBM Platform™ Symphony distributed computing middleware while maintaining application-level compatibility with Apache Hadoop.
Tags : 
     IBM
By: MapR Technologies     Published Date: Sep 04, 2013
Enterprises are faced with new requirements for data. We now have big data that is different from the structured, cleansed corporate data repositories of the past. Before, we had to plan out structured queries. In the Hadoop world, we don’t have to sort data according to a predetermined schema when we collect it. We can store data as it arrives and decide what to do with it later. Today, there are different ways to analyze data collected in Hadoop—but which one is the best way forward?
Tags : 
     MapR Technologies
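The store-now, decide-later approach described above is often called schema-on-read: raw records are kept as-is, and structure is imposed only when a question is asked. A minimal sketch, with invented field names ("event", "ts") that are not from the paper:

```python
import json

# Raw records stored exactly as they arrived; no upfront schema.
raw_store = [
    '{"event": "click", "ts": 1}',
    '{"event": "view", "ts": 2, "page": "/home"}',  # extra field: fine
    '{"event": "click", "ts": 3}',
]

def query(records, predicate):
    """Parse and filter at read time, deciding structure only now."""
    for line in records:
        rec = json.loads(line)
        if predicate(rec):
            yield rec

clicks = list(query(raw_store, lambda r: r.get("event") == "click"))
print(len(clicks))  # 2
```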
By: Dell and Intel®     Published Date: Aug 24, 2015
Business need: Merkle needed a scalable, cost-effective way to capture and analyze large amounts of structured and unstructured consumer data for use in developing better marketing campaigns for clients.
Solution: The company deployed a Dell and Hadoop™ cluster based on Dell and Intel® technologies to support a new big data insight solution that gives clients a unified view of customer data.
Benefits:
• Partnership with Dell and Intel® leads to new big data solution
• Cluster supports the Foundational Marketing Platform, a new data insight solution
• Merkle can find patterns in big data and create analytical models that anticipate consumer behavior
• Organization cuts costs by 60 percent and boosts processing speeds by 10 times
• Solution provides scalability and enables innovation
Tags : 
     Dell and Intel®
By: Red Hat     Published Date: Sep 09, 2018
As enterprises work to balance IT innovation with efficiency—and increasingly adopt cloud-first strategies—agility is key. In this IDC InfoBrief, sponsored by Red Hat, learn 4 key capabilities supporting cloud adoption and agile integration approaches that offer faster, more flexible workflows.
Tags : 
     Red Hat
By: Red Hat     Published Date: Sep 09, 2018
As applications and services become more central to business strategy, and as distributed methodologies like agile and DevOps change the way teams operate, it is critical for IT leaders to find a way to integrate their backend systems, legacy systems, and teams in an agile, adaptable way. This e-book details an architecture called agile integration, consisting of three technology pillars—distributed integration, containers, and APIs—to deliver flexibility, scalability, and reusability.
Tags : 
     Red Hat
By: Red Hat     Published Date: Sep 09, 2018
This assessment shows that enterprises adopt Red Hat Fuse because they believe a community-based open source approach to modernizing their integration infrastructure delivers strong ROI. For these organizations, Fuse was part of a larger digital transformation initiative and was also used to modernize integration. IDC interviewed organizations using Fuse to integrate important business applications across their heterogeneous IT environments. These Red Hat customers reported that Fuse has enabled them to complete substantially more integrations at a higher quality level, thereby supporting their efforts to deliver timely and functional applications and digital services. Efficiencies in application integration with Fuse have generated significant value for study participants, which IDC quantifies at an average of $75,453 per application integrated per year ($985,600 per organization). They have attained this value by:
» Enabling more efficient and effective…
Tags : 
     Red Hat
By: Hewlett Packard Enterprise     Published Date: Jul 18, 2018
Principled Technologies executed four typical deployment and management scenarios using both HPE Synergy and Cisco UCS. They found that HPE Synergy saved 71.5 minutes and 86 steps, and used four fewer tools, compared to Cisco UCS. In a hypothetical 200-node datacenter, that adds up to 9 work weeks, or just over 2 months, saved on routine tasks.
Tags : 
     Hewlett Packard Enterprise
By: Zendesk     Published Date: Jun 29, 2018
The start of a new chapter of your business, whether you're moving upmarket or adding products and features, is a great time to scale your customer service operations in a smart way. We know customers prefer self-service, via a knowledge base, if one is available. A Gartner report estimates that CIOs can reduce customer support costs by 25% or more when proper knowledge management discipline is in place. If you've been on the sidelines waiting to take the self-service leap, this white paper will prove that you and your agents already partake in the activities needed to offer great self-service, every single day.
Tags : 
     Zendesk
By: Hewlett Packard Enterprise     Published Date: Jul 18, 2018
In an innovation-powered economy, ideas need to travel at the speed of thought. Yet even as our ability to communicate across companies and time zones grows rapidly, people remain frustrated by downtime and unanticipated delays across the increasingly complex grid of cloud-based infrastructure, data networks, storage systems, and servers that power our work.
Tags : 
     Hewlett Packard Enterprise
By: Amazon Web Services     Published Date: Jul 25, 2018
What is a Data Lake? Today’s organizations are tasked with managing multiple data types, coming from a wide variety of sources. Faced with massive volumes and heterogeneous types of data, organizations are finding that, to deliver insights in a timely manner, they need a data storage and analytics solution that offers more agility and flexibility than traditional data management systems. Data Lakes are a new and increasingly popular way to store and analyze data that addresses many of these challenges. A Data Lake allows an organization to store all of its data, structured and unstructured, in one centralized repository. Since data can be stored as-is, there is no need to convert it to a predefined schema, and you no longer need to know what questions you want to ask of your data beforehand. Download to find out more now.
Tags : 
     Amazon Web Services
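As a toy illustration of the single-repository idea described above (a dict stands in for a real object store such as Amazon S3, and the keys and data are invented):

```python
import csv
import io

lake = {}  # stand-in for one centralized object store

# Structured data lands as-is...
lake["sales/2018/07/orders.csv"] = "order_id,amount\n1,10.00\n2,24.50\n"
# ...alongside unstructured data, in the same repository.
lake["support/tickets/1042.txt"] = "Customer reports login timeout on mobile."

def total_sales(key):
    """Apply structure (CSV parsing) only at analysis time."""
    rows = csv.DictReader(io.StringIO(lake[key]))
    return sum(float(r["amount"]) for r in rows)

print(total_sales("sales/2018/07/orders.csv"))  # 34.5
```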
By: Amazon Web Services     Published Date: Jul 25, 2018
As easy as it is to get swept up by the hype surrounding big data, it’s just as easy for organizations to become discouraged by the challenges they encounter while implementing a big data initiative. Concerns regarding big data skill sets (and the lack thereof), security, the unpredictability of data, unsustainable costs, and the need to make a business case can bring a big data initiative to a screeching halt. However, given big data’s power to transform business, it’s critical that organizations overcome these challenges and realize the value of big data. Download now to find out more.
Tags : 
     Amazon Web Services
By: Amazon Web Services     Published Date: Jul 25, 2018
Defining the Data Lake “Big data” is an idea as much as a particular methodology or technology, yet it’s an idea that is enabling powerful insights, faster and better decisions, and even business transformations across many industries. In general, big data can be characterized as an approach to extracting insights from very large quantities of structured and unstructured data from varied sources at a speed that is immediate (enough) for the particular analytics use case.
Tags : 
     Amazon Web Services
By: Avanade DACH     Published Date: Aug 01, 2018
The better the data, the better the artificial intelligence. Do you want to understand your customers and their behavior better? Offer them a tailored customer experience? Or identify new business opportunities? It may not always be obvious, but data is the foundation of any well-functioning AI. In our guide, we show you in six steps how to organize your data in innovative ways, creating the optimal foundation for getting the most out of artificial intelligence, cognitive computing, and machine learning in the future.
Tags : 
     Avanade  DACH
By: Dell Server     Published Date: Aug 08, 2018
Transforming IT to meet the emerging requirements of a rapidly advancing digital economy is a priority for most companies today. Market economies and quickly evolving digital interactions are driving new and increasing demands on IT infrastructure for organizations of all kinds – from small businesses to enterprises to public institutions. IT requirements to support a variety of digital use paradigms (personal devices, IoT, VMs, VDI) are changing quickly, and organizations need to respond in order to be competitive in this evolving digital world. The latest generation of PowerEdge servers powered by Intel® Xeon® Platinum processors can deliver differentiated business agility over older-generation infrastructure. IDC asserted that updating servers can help businesses deploy applications up to 22 percent faster and improve application performance up to 29 percent over outdated infrastructure. Intel Inside®. New Possibilities Outside.
Tags : 
     Dell Server
By: CA Technologies EMEA     Published Date: Sep 07, 2018
As the application economy drives companies to roll out applications more quickly, companies are seeing testing in a new light. Once considered a speed bump on the DevOps fast track, new tools and testing methodologies are emerging to bring testing up to speed. In this ebook, we’ll explore some of the challenges on the road to continuous testing, along with new approaches that will help you adopt next-gen testing practices that offer the ability to test early, often and automatically.
Tags : continuous delivery, application delivery, testing, test data management
     CA Technologies EMEA
By: CA Technologies EMEA     Published Date: Sep 12, 2018
To compete successfully in today’s economy, companies from all industries require the ability to deliver software faster, with higher quality, and with reduced risk and costs. This is only possible with a modern software factory that can deliver quality software continuously. Yet for most enterprises, testing has not kept pace with modern development methodologies. A new approach to software testing is required: continuous testing. In the first session in a series, join product management leadership to gain in-depth insights on how, by shifting testing left and automating all aspects of test case generation and execution, continuous testing enables you to deliver quality software faster than ever.
Recorded Feb 5, 2018 | 49 mins | Presented by Steve Feloney, VP Product Management, CA Technologies
Tags : continuous delivery, application delivery, testing, test data management
     CA Technologies EMEA
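As a small illustration of automated test-case generation, one common pattern is deriving executable tests from a data table rather than writing each by hand. This is a generic Python sketch, not CA's tooling:

```python
import unittest

def discount(price, pct):
    """Function under test: apply a percentage discount."""
    if not 0 <= pct <= 100:
        raise ValueError("pct out of range")
    return round(price * (1 - pct / 100), 2)

class TestDiscount(unittest.TestCase):
    pass

# Generate one test method per (price, pct, expected) row.
CASES = [(100.0, 0, 100.0), (100.0, 25, 75.0), (200.0, 50, 100.0)]
for i, (price, pct, expected) in enumerate(CASES):
    def make(price=price, pct=pct, expected=expected):
        def test(self):
            self.assertEqual(discount(price, pct), expected)
        return test
    setattr(TestDiscount, f"test_case_{i}", make())
```

Adding a case is then a one-line data change, which is what makes "test early, often and automatically" practical at scale.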
By: CA Technologies EMEA     Published Date: Sep 11, 2018
The advent of cloud computing and software-defined data center architectures for modern application delivery has made networking more sensitive than ever before. Applications in the digital age require networks that can expand and contract dynamically based on consumer demand. Enterprises are implementing software-defined networking (SDN) to deliver the automation required by these new environments, but the dynamic nature of SDN makes network management and monitoring fundamentally more challenging. Network infrastructure teams need monitoring tools that can provide visibility into these new and constantly changing networks. This white paper explores the importance of SDN monitoring and examines a leading example of a solution, CA Performance Management with CA Virtual Network Assurance integration.
Tags : 
     CA Technologies EMEA
Get your white papers featured in the insideBIGDATA White Paper Library contact: Kevin@insideHPC.com