
By: NetApp     Published Date: -
Enterprise data is growing rapidly, reaching multiple petabytes of data or even billions of files for many organizations. To maximize the business value of this data, enterprises need a storage infrastructure to store, manage, and retrieve a massive amount of data. This ebook shows you how to address large content repository challenges with object storage. You'll learn how to effectively address long-term retention policies, find and retrieve content quickly from long-term repositories, and use object storage efficiently.
Tags : object storage, storage infrastructure
     NetApp
By: NetApp     Published Date: Dec 14, 2013
Read how the NetApp Distributed Content Repository Solution is an efficient and risk-reducing active archive solution. Based on customer data, Forrester created a composite organization and concluded that the NetApp Distributed Content Repository delivered a three-year ROI of 47% with a payback period of 1.3 months. The key benefits are reduced risk of losing unregulated archived data, denser storage, storage solution efficiency, and compliance for regulated data. The study also provides readers with a framework to do their own financial impact evaluation. Source: The Total Economic Impact Of The NetApp Distributed Content Repository Solution (StorageGRID On E-Series), a commissioned study conducted by Forrester Consulting on behalf of NetApp, March 2013.
Tags : forrester tei
     NetApp
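For readers who want to run the study's framework against their own numbers, here is a minimal sketch of the underlying ROI and payback arithmetic. All input figures are hypothetical placeholders chosen only to illustrate the calculation; they are not the Forrester study's inputs.

```python
# Illustrative TEI-style ROI and payback arithmetic. All figures below are
# hypothetical placeholders, not values taken from the Forrester study.
three_year_benefits = 1_470_000   # total quantified benefits over three years (hypothetical)
three_year_costs    = 1_000_000   # total costs over three years (hypothetical)
initial_investment  = 53_000      # up-front portion of those costs (hypothetical)

roi = (three_year_benefits - three_year_costs) / three_year_costs
monthly_benefit = three_year_benefits / 36
payback_months = initial_investment / monthly_benefit

print(f"Three-year ROI: {roi:.0%}")                    # 47% with these placeholder inputs
print(f"Payback period: {payback_months:.1f} months")  # ~1.3 months with these placeholder inputs
```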
By: NetApp     Published Date: Dec 18, 2013
IT managers have indicated that their two most significant challenges in managing unstructured data at multiple locations are keeping pace with data growth and improving data protection. Learn how the NetApp Distributed Content Repository provides advanced data protection and system recovery capabilities that enable multiple data centers and remote offices to maintain access to data through hardware and software faults. Key benefits are:
- continuous access to file data while maintaining data redundancy, with no administrator intervention needed
- easy integration and deployment into a distributed environment, providing transparent, centrally managed content storage
- secure multi-tenancy using security partitions
- effectively infinite, on-demand capacity with fast access to files and objects in the cloud
- secure, robust data protection techniques that enable data to persist beyond the life of the storage it resides on
Tags : 
     NetApp
By: IBM     Published Date: Sep 02, 2014
Advanced analytics strategies yield the greatest benefits in terms of improving patient and business outcomes when applied across the entire healthcare ecosystem. But the challenge of collaborating across organizational boundaries in order to share information and insights is daunting to many stakeholders. In this worldwide survey of 555 healthcare providers, payers, and life sciences organizations, you will learn the importance of implementing collaborative analytics strategies that:
- manage, integrate, and interpret data generated at all stages of the healthcare value chain
- achieve the right balance of skills in order to translate data into actionable insights
- focus on executive sponsorship and enterprise-wide adoption, with metrics to measure and track success
Position yourself to harness data, create and share insights, make informed decisions, and improve the performance of the entire healthcare ecosystem in which you operate.
Tags : ibm, analytics across ecosystem
     IBM
By: IBM     Published Date: Sep 02, 2014
In today’s stringent financial services regulatory environment, with exponential growth of data and dynamic business requirements, risk analytics has become integral to businesses. IBM Algorithmics provides sophisticated analyses for a wide range of economic scenarios that better quantify risk for multiple departments within a firm, or across the enterprise. With Algorithmics, firms have a better handle on their financial exposure and credit risks before they finalize real-time transactions. But this requires the performance and agility of a scalable infrastructure, which drives up IT risk and complexity. The IBM Application Ready Solution for Algorithmics provides an agile, reliable, and high-performance infrastructure to deliver trusted risk insights for sustained growth and profitability. This integrated offering with a validated reference architecture delivers the right risk insights at the right time while lowering the total cost of ownership.
Tags : ibm, it risk, financial risk analytics
     IBM
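As a rough, generic illustration of the kind of scenario-based risk quantification described above (a textbook Monte Carlo sketch, not IBM Algorithmics' actual methodology, with all portfolio and market parameters invented for the example), one can estimate a one-day 99% Value at Risk by simulating many economic scenarios:

```python
# Toy Monte Carlo Value-at-Risk: simulate many market scenarios and read off the
# loss exceeded in only 1% of them. Generic illustration only; not Algorithmics.
import random

portfolio_value   = 10_000_000   # hypothetical portfolio value, in dollars
mean_daily_return = 0.0002       # assumed drift
daily_volatility  = 0.012        # assumed volatility
num_scenarios     = 100_000

losses = []
for _ in range(num_scenarios):
    scenario_return = random.gauss(mean_daily_return, daily_volatility)
    losses.append(-portfolio_value * scenario_return)   # positive value = loss

losses.sort()
var_99 = losses[int(0.99 * num_scenarios)]               # 99th-percentile loss
print(f"1-day 99% VaR: ${var_99:,.0f}")
```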
By: IBM     Published Date: Sep 16, 2015
6 criteria for evaluating a high-performance cloud services provider
Engineering, scientific, analytics, big data, and research workloads place extraordinary demands on technical and high-performance computing (HPC) infrastructure. Supporting these workloads can be especially challenging for organizations that have unpredictable spikes in resource demand, or need access to additional compute or storage resources for a project or to support a growing business. Software Defined Infrastructure (SDI) enables organizations to deliver HPC services in the most efficient way possible, optimizing resource utilization to accelerate time to results and reduce costs. SDI is the foundation for a fully integrated environment, optimizing compute, storage, and networking infrastructure to quickly adapt to changing business requirements, and dynamically managing workloads and data, transforming a s
Tags : 
     IBM
By: RYFT     Published Date: Apr 03, 2015
The new Ryft ONE platform is a scalable 1U device that addresses a major need in the fast-growing market for advanced analytics — avoiding I/O bottlenecks that can seriously impede analytics performance on today's hyperscale cluster systems. The Ryft ONE platform is designed for easy integration into existing cluster and other server environments, where it functions as a dedicated, high-performance analytics engine. IDC believes that the new Ryft ONE platform is well positioned to exploit the rapid growth we predict for the high-performance data analysis market.
Tags : ryft, ryft one platform, 1u device, advanced analytics, avoiding i/o bottlenecks, idc
     RYFT
By: RedPoint     Published Date: Sep 22, 2014
The emergence of YARN for the Hadoop 2.0 platform has opened the door to new tools and applications that promise to allow more companies to reap the benefits of big data in ways never before possible, with outcomes possibly never imagined. By separating the problem of cluster resource management from the data processing function, YARN offers a world beyond MapReduce: less encumbered by complex programming protocols, faster, and at a lower cost. Some of the topics discussed in this paper include:
• Why is YARN important for realizing the power of Hadoop for data integration, quality, and management?
• Benchmark results of MapReduce vs. Pig vs. visual “data flow” design tools
• The 3 key features of YARN that solve the complex problems that prohibit businesses from gaining maximum benefit from Hadoop
Download this paper to learn why the power of Hadoop 2.0 lies in enabling applications to run inside Hadoop, without the constraints of MapReduce.
Tags : 
     RedPoint
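To make the "complex programming protocols" point concrete, here is a minimal in-memory sketch of the map, shuffle, and reduce steps that the MapReduce model forces developers to express. It is a conceptual illustration in plain Python with made-up input, not Hadoop's actual API; YARN's contribution, as described above, is that applications running on Hadoop 2.0 no longer have to be squeezed into this shape.

```python
# Conceptual map -> shuffle -> reduce word count, entirely in memory.
# Mimics the MapReduce programming model; does not use Hadoop itself.
from collections import defaultdict

documents = ["big data on hadoop", "hadoop yarn schedules resources", "big clusters"]

# Map phase: emit (key, value) pairs.
mapped = [(word, 1) for doc in documents for word in doc.split()]

# Shuffle phase: group values by key.
grouped = defaultdict(list)
for key, value in mapped:
    grouped[key].append(value)

# Reduce phase: aggregate each key's values.
counts = {key: sum(values) for key, values in grouped.items()}
print(counts)   # e.g. {'big': 2, 'data': 1, 'hadoop': 2, ...}
```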
By: EMC     Published Date: Jun 13, 2016
The EMC IsilonSD product family combines the power of Isilon scale-out NAS with the economy of software-defined storage. IsilonSD Edge is purpose-built to address the needs associated with growing unstructured data in enterprise edge locations, including remote and branch offices.
Tags : 
     EMC
By: MapR Technologies     Published Date: Sep 04, 2013
Enterprises are faced with new requirements for data. We now have big data that is different from the structured, cleansed corporate data repositories of the past. Before, we had to plan out structured queries. In the Hadoop world, we don’t have to sort data according to a predetermined schema when we collect it. We can store data as it arrives and decide what to do with it later. Today, there are different ways to analyze data collected in Hadoop—but which one is the best way forward?
Tags : 
     MapR Technologies
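A minimal sketch of the schema-on-read idea described above: raw records are stored as they arrive, and a structure is imposed only when a particular analysis needs it. Plain Python and in-memory JSON stand in here purely for illustration; in a real deployment the records would be files in HDFS read by a Hadoop-ecosystem engine, and all record contents below are invented.

```python
# Schema-on-read, illustrated: store heterogeneous raw records now,
# decide how to interpret them later.
import json

raw_records = [
    '{"user": "a17", "action": "click", "ts": 1378252800}',
    '{"user": "b42", "action": "purchase", "amount": 19.99, "ts": 1378252860}',
    '{"sensor": "t-9", "reading": 21.4, "ts": 1378252920}',   # different shape, stored anyway
]

# Later, one analysis imposes a "user activity" schema and simply skips records
# that don't fit -- no up-front sorting or schema enforcement at collection time.
events = []
for line in raw_records:
    record = json.loads(line)
    if "user" in record and "action" in record:
        events.append((record["user"], record["action"], record["ts"]))

print(events)
```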
By: Intel     Published Date: Aug 06, 2014
Purpose-built for use with the dynamic computing resources available from Amazon Web Services™, the Intel Lustre* solution provides the fast, massively scalable storage software needed to accelerate performance, even on complex workloads. Intel is a driving force behind the development of Lustre, and committed to providing fast, scalable, and cost-effective storage with added support and manageability. Intel® Enterprise Edition for Lustre* software is the ideal foundation. *Other names and brands may be claimed as the property of others.
Tags : 
     Intel
By: Intel     Published Date: Aug 06, 2014
Around the world and across all industries, high-performance computing is being used to solve today’s most important and demanding problems. More than ever, storage solutions that deliver high sustained throughput are vital for powering HPC and Big Data workloads. Intel® Enterprise Edition for Lustre* software unleashes the performance and scalability of the Lustre parallel file system for enterprises and organizations, both large and small. It allows users and workloads that need large-scale, high-bandwidth storage to tap into the power and scalability of Lustre, but with the simplified installation, configuration, and monitoring features of Intel® Manager for Lustre* software, a management solution purpose-built for the Lustre file system. Intel® Enterprise Edition for Lustre* software includes proven support from the Lustre experts at Intel, including worldwide 24x7 technical support. *Other names and brands may be claimed as the property of others.
Tags : 
     Intel
By: Dell and Intel®     Published Date: Sep 06, 2015
In conclusion, the retail experience has changed dramatically in recent years as there has been a power shift over to consumers. Shoppers can easily find and compare products from an array of devices, even while walking through a store. They can share their opinions about retailers and products through social media and influence other prospective customers. To compete in this new multi-channel environment, we’ve seen in this guide how retailers have to adopt new and innovative strategies to attract and retain customers. Big data technologies, specifically Hadoop, enable retailers to connect with customers through multiple channels at an entirely new level by harnessing the vast volumes of new data available today. Hadoop helps retailers store, transform, integrate and analyze a wide variety of online and offline customer data—POS transactions, e-commerce transactions, clickstream data, email, social media, sensor data and call center records—all in one central repository. Retailers can
Tags : 
     Dell and Intel®
By: snowflake     Published Date: Jun 09, 2016
Today’s data, and how that data is used, have changed dramatically in the past few years. Data now comes from everywhere—not just enterprise applications, but also websites, log files, social media, sensors, web services, and more. Organizations want to make that data available to all of their analysts as quickly as possible, not limit access to only a few highly skilled data scientists. However, these efforts are quickly frustrated by the limitations of current data warehouse technologies. These systems simply were not built to handle the diversity of today’s data and analytics. They are based on decades-old architectures designed for a different world, a world where data was limited, users of data were few, and all processing was done in on-premises datacenters.
Tags : 
     snowflake
By: BitStew     Published Date: May 26, 2016
The heaviest lift for an industrial enterprise is data integration, the Achilles’ heel of the Industrial Internet of Things (IIoT). Companies are now recognizing the enormous challenge involved in supporting Big Data strategies that can handle the data that is generated by information systems, operational systems, and the extensive networks of old and new sensors. To compound these issues, business leaders are expecting data to be captured, analyzed, and used in near real time to optimize business processes, drive efficiency, and improve profitability. However, integrating this vast amount of dissimilar data into a unified data strategy can be overwhelming for even the largest organizations. Download this white paper, by Bit Stew’s Mike Varney, to learn why a big data solution will not get the job done. Learn how to leverage machine intelligence with a purpose-built IIoT platform to solve the data integration problem.
Tags : 
     BitStew
By: Adaptive Computing     Published Date: Feb 27, 2014
Big data applications represent a fast-growing category of high-value applications that are increasingly employed by business and technical computing users. However, they have exposed an inconvenient dichotomy in the way resources are utilized in data centers. Conventional enterprise and web-based applications can be executed efficiently in virtualized server environments, where resource management and scheduling is generally confined to a single server. By contrast, data-intensive analytics and technical simulations demand large aggregated resources, necessitating intelligent scheduling and resource management that spans a computer cluster, cloud, or entire data center. Although these tools exist in isolation, they are not available in a general-purpose framework that allows them to interoperate easily and automatically within existing IT infrastructure.
Tags : 
     Adaptive Computing
By: Hewlett Packard Enterprise     Published Date: Jul 29, 2019
"Learn how to manage a software defined infrastructure SDI deployment and understand the benefit of composable versus traditional infrastructure. The white paper will also cover the definition of an SDI as well as what an IT organization should look for in an SDI solutions provider. Explore these questions in this white paper with a critical look at HPE’s software-defined and CI offering, HPE Synergy."
Tags : 
     Hewlett Packard Enterprise
By: VMware     Published Date: Sep 12, 2019
You’ve heard the stories: a large Internet company exposing all three billion of its customer accounts; a major hotel chain compromising five hundred million customer records; and one of the big-three credit reporting agencies exposing more than 143 million records, leading to a 25 percent loss in value and a $439 million hit. At the time, all of these companies had security mechanisms in place. They had trained professionals on the job. They had invested heavily in protection. But the reality is that no amount of investment in preventative technologies can fully eliminate the threat of savvy attackers, malicious insiders, or inadvertent victims of phishing. Breaches are rising, and so are their costs. In 2018, the average cost of a data breach rose 6.4 percent to $3.86 million, and the cost of a “mega breach,” defined as losing 1 million to 50 million records, carried especially punishing price tags between $40 million and $350 million.2 Despite increasing investment in security
Tags : 
     VMware
By: Dell EMEA     Published Date: Sep 09, 2019
When it comes to longevity, no one can compete with Dell. In addition to providing the capacity, manageability, and security features that IT departments choose, our computers are also designed for longer lifecycles, which in turn reduces waste. It is no surprise that they have been successful in the market for so long. But enough about the past; let's talk about the new, innovative features. The Latitude 7400 2-in-1 uses Dell's new ExpressSign-in technology, which detects the user's presence, wakes the system in about a second, and enables sign-in via facial recognition with Windows Hello. Users can simply sit down at their desk and start working, with no need for keyboard shortcuts to switch users or even to touch the power button. In fact, it is the first PC in the world to use a proximity sensor with Intel® Context Se
Tags : 
     Dell EMEA
By: Dell SB     Published Date: Jul 31, 2019
Take back control of your infrastructure projects and build the business case using your own data with Dell EMC’s Live Optics. We can help create a defensible proposal, unique to your environment, using your Live Optics data. Take the guesswork out of your projects and build a new infrastructure based on your actual needs. When can we set up a call to discuss?
Tags : 
     Dell SB
By: F5 Networks Singapore Pte Ltd     Published Date: Sep 09, 2019
Have you ever wished for an army of clones to do all your thankless tasks and chores? Well, that fantasy is becoming a reality—at least on the Internet. And while they may not be actual clones, bots have begun doing lots of digital dirty work. Managing your relationship with bots—good and bad—has become an inherent part of doing business in a connected world. With more than half of online traffic initiated by autonomous programs, it’s clear that bots are a driving force of technological change, and they’re here to stay.¹ As bot technology, machine learning, and AI continue to evolve, so will the threats they pose. And while some bots are good, many are malicious—and the cybercriminals behind them are targeting your apps. Preparing your organization to deal with the impact of bots on your business is essential to developing a sustainable strategy that will enable you to grow as you adapt to the new bot-enabled world.
Tags : 
     F5 Networks Singapore Pte Ltd
By: SAP EMEA Global     Published Date: Sep 13, 2019
Questions posed by: SAP SuccessFactors. Answers by: Lisa Rowan, Research Vice President, HR, Talent, and Learning Strategies.
Tags : 
     SAP EMEA Global
By: F5 Networks Singapore Pte Ltd     Published Date: Sep 19, 2019
"Security analysts have a tougher job than ever. New vulnerabilities and security attacks used to be a monthly occurrence, but now they make the headlines almost every day. It’s become much more difficult to effectively monitor and protect all the data passing through your systems. Automated attacks from bad bots that mimic human behavior have raised the stakes, allowing criminals to have machines do the work for them. Not only that, these bots leave an overwhelming number of alert bells, false positives, and inherent stress in their wake for security practitioners to sift through. Today, you really need a significant edge when combating automated threats launched from all parts of the world. Where to start? With spending less time investigating all that noise in your logs."
Tags : 
     F5 Networks Singapore Pte Ltd
By: F5 Networks Singapore Pte Ltd     Published Date: Sep 19, 2019
"The fast pace of innovation demanded by today’s digital businesses challenges traditional processes for the deployment and governance of application delivery and supporting infrastructure. To address the increased pace of change, many organizations are transforming by adopting DevOps: a set of practices which employs continuous integration processes, breaking down the silos between development and operations teams. As cycle times accelerate, and development teams adopt more Agile delivery methodologies, the traditional model for application security can be a drag on the speed and agility inherent in a continuous integration process. This creates a natural friction. Security teams can be perceived as slowing down or blocking delivery. At the same time, however, the apps are exposed to significant threats. The goal of continuous integration is to deliver more frequent releases with more new capabilities to market, faster. It’s all about speed."
Tags : 
     F5 Networks Singapore Pte Ltd
By: F5 Networks Singapore Pte Ltd     Published Date: Sep 19, 2019
"Safeguarding the identity of users and managing the level of access they have to critical business applications could be the biggest security challenge organizations face in today’s assumed- breach world. Over 6,500 publicly disclosed data breaches occurred in 2018 alone, exposing over 5 billion records—a large majority of which included usernames and passwords.1 This wasn’t new to 2018 though, as evidenced by the existence of an online, searchable database of 8 billion username and password combinations that have been stolen over the years (https://haveibeenpwned.com/), keeping in mind there are only 4.3 billion people worldwide that have internet access. These credentials aren’t stolen just for fun—they are the leading attack type for causing a data breach. And the driving force behind the majority of credential attacks are bots—malicious ones—because they enable cybercriminals to achieve scale. That’s why prioritizing secure access and bot protection needs to be part of every organ
Tags : 
     F5 Networks Singapore Pte Ltd
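As one small, hedged illustration of what "prioritizing secure access and bot protection" can mean in practice, the sketch below throttles repeated failed logins from a single source within a time window, which blunts the scale advantage that bots give credential attacks. This is a generic, simplified pattern with invented constants and a documentation-range IP address; it is not a description of F5's products or of any specific mitigation named in the paper.

```python
# Generic failed-login throttling: after too many failures from one source
# within a time window, further attempts are rejected. Illustrative pattern
# only -- not a description of F5's products.
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 300
MAX_FAILURES = 5
failures = defaultdict(deque)      # source address -> timestamps of recent failures

def allow_attempt(source_ip, now=None):
    now = time.time() if now is None else now
    recent = failures[source_ip]
    while recent and now - recent[0] > WINDOW_SECONDS:
        recent.popleft()           # forget failures outside the window
    return len(recent) < MAX_FAILURES

def record_failure(source_ip, now=None):
    failures[source_ip].append(time.time() if now is None else now)

# A bot hammering a single source address is cut off after MAX_FAILURES tries:
for attempt in range(8):
    if allow_attempt("203.0.113.7"):
        record_failure("203.0.113.7")   # pretend every guess failed
    else:
        print(f"attempt {attempt}: blocked")
```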