
By: NetApp     Published Date: Dec 13, 2013
Interested in running a Hadoop proof of concept on enterprise-class storage? Download this solutions guide for a technical overview of building Hadoop on NetApp E-Series storage. The NetApp Open Solution for Hadoop delivers big analytics with pre-engineered, compatible, and supported solutions built on high-quality storage platforms, so you reduce the cost, schedule, and risk of do-it-yourself systems and relieve the skills gap most organizations have with Hadoop. See how ongoing operational and maintenance costs can be reduced with a highly available and scalable Hadoop solution.
Tags : open solutions, hadoop solutions guide
     NetApp
By: NetApp     Published Date: Dec 19, 2013
SAP HANA enables real-time access to mission-critical business data and thus revolutionizes the way existing information can be used to address ever-changing business requirements. This white paper describes both the business and technical benefits of implementing the Cisco UCS with NetApp Storage for SAP HANA solution.
Tags : 
     NetApp
By: IBM     Published Date: Sep 02, 2014
Life Sciences organizations need to build IT infrastructures that are dynamic, scalable, and easy to deploy and manage, with simplified provisioning, high performance, and high utilization, and that can exploit both data-intensive and server-intensive workloads, including Hadoop MapReduce. Solutions must scale in both processing and storage to serve the institution over the long term. Managing data across its life cycle, and making it usable for mainstream analyses and applications, is an important aspect of system design. This presentation describes these IT requirements and explains how Technical Computing solutions from IBM and Platform Computing address the challenges, delivering greater ROI and accelerated time to results for Life Sciences.
Tags : 
     IBM
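The presentation above treats Hadoop MapReduce as one of the workloads such infrastructures must exploit. As a point of reference only (not material from the IBM presentation), the sketch below shows the classic word-count pattern written for Hadoop Streaming in Python, where the same script acts as mapper or reducer depending on its argument; the input paths and the streaming invocation in the comment are illustrative.

```python
#!/usr/bin/env python3
# Minimal Hadoop Streaming word-count sketch (illustrative, not from the IBM material).
# Example invocation (paths are placeholders):
#   hadoop jar hadoop-streaming.jar \
#     -files wordcount.py \
#     -mapper "wordcount.py map" -reducer "wordcount.py reduce" \
#     -input /data/in -output /data/out
import sys

def mapper():
    # Emit "word<TAB>1" for every token read from standard input.
    for line in sys.stdin:
        for word in line.split():
            print(f"{word}\t1")

def reducer():
    # Hadoop sorts mapper output by key, so counts for a word arrive contiguously.
    current, total = None, 0
    for line in sys.stdin:
        word, count = line.rstrip("\n").split("\t", 1)
        if word != current:
            if current is not None:
                print(f"{current}\t{total}")
            current, total = word, 0
        total += int(count)
    if current is not None:
        print(f"{current}\t{total}")

if __name__ == "__main__":
    mapper() if len(sys.argv) > 1 and sys.argv[1] == "map" else reducer()
```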
By: IBM     Published Date: Sep 02, 2014
With tougher regulations and continuing market volatility, financial firms are moving to active risk management with a focus on counterparty risk. Firms are revamping their risk and trading practices from top to bottom. They are adopting new risk models and frameworks that support a holistic view of risk. Banks recognize that technology is critical for this transformation, and are adding state-of-the-art enterprise risk management solutions, high performance data and grid management software, and fast hardware. Join IBM Algorithmics and IBM Platform Computing to gain insights on this trend and on technologies for enabling active "real-time" risk management.
Tags : 
     IBM
By: IBM     Published Date: Sep 02, 2014
Learn how to manage storage growth, cost and complexity while increasing storage performance and data availability with IBM Software Defined Storage solutions, including the IBM General Parallel File System (GPFS).
Tags : ibm, storage for dummies
     IBM
By: IBM     Published Date: May 20, 2015
According to our global study of more than 800 cloud decision makers and users, organizations are becoming increasingly focused on the business value cloud provides. Cloud is integral to mobile, social and analytics initiatives, and to the big data management challenge that often comes with them, and it helps power the entire suite of game-changing technologies. Enterprises can aim higher when these deployments ride on the cloud. Mobile, analytics and social implementations can be bigger, bolder and drive greater impact when backed by scalable infrastructure. In addition to scale, cloud can provide integration, gluing the individual technologies into more cohesive solutions. Learn how companies are gaining a competitive advantage with cloud computing.
Tags : 
     IBM
By: IBM     Published Date: Sep 16, 2015
The IBM Spectrum Scale solution delivered up to 11x better throughput than EMC Isilon for Spectrum Protect (TSM) workloads. Using published data, Edison compared a solution built on EMC® Isilon® against an IBM® Spectrum Scale™ solution. (IBM Spectrum Scale was formerly IBM® General Parallel File System™ or IBM® GPFS™, also known by the code name Elastic Storage.) For both solutions, IBM® Spectrum Protect™ (formerly IBM Tivoli® Storage Manager or IBM® TSM®) is used as a common workload performing the backups to the target storage systems evaluated.
Tags : 
     IBM
By: IBM     Published Date: Sep 16, 2015
Building applications for handling big data requires laser-like focus on solutions that allow you to deliver scalable, reliable and flexible infrastructure for fast-growing analytics environments. This paper provides 6 best practices for selecting the “right” infrastructure—one that is optimized for performance, flexibility and long-term value.
Tags : 
     IBM
By: Storiant     Published Date: Mar 16, 2015
Read this new IDC report about how today's enterprise datacenters are dealing with new challenges that are far more demanding than ever before. Foremost is the exponential growth of data, most of it unstructured. Big data and analytics implementations are also quickly becoming a strategic priority in many enterprises, demanding online access to more data, retained for longer periods of time. Legacy storage solutions with fixed design characteristics and a cost structure that doesn't scale are proving ill-suited to these new needs. This Technology Spotlight examines the issues that are driving organizations to replace older archive and backup-and-restore systems with business continuity and always-available solutions that can scale to handle extreme data growth while leveraging a cloud-based pricing model. The report also looks at the role of Storiant and its long-term storage services solution in the strategically important long-term storage market.
Tags : storiant, big data, analytics implementations, cloud-based pricing model, long-term storage services solution, long-term storage market
     Storiant
By: Data Direct Networks     Published Date: Apr 08, 2014
DataDirect Networks (DDN), the largest privately held provider of high-performance storage, has a large and growing presence in HPC markets. HPC users identify DDN as their storage provider more than any other storage-focused company, with twice the mentions of EMC and more than twice the mentions of NetApp, Hitachi Data Systems, or Panasas.(5) DDN’s strength in HPC is anchored by its Storage Fusion Architecture (SFA), winner of the HPCwire Editor’s Choice Award for “Best HPC Storage Product or Technology” in each of the past three years. The DDN SFA12KX combines SATA, SAS, and solid-state disks (SSDs) for an environment that can be tailored to a balance of throughput and capacity.
Tags : 
     Data Direct Networks
By: IBM     Published Date: Nov 14, 2014
Join Gartner, Inc. and IBM Platform Computing for an informative webinar where you will learn how to combine best-of-breed analytic solutions to provide a low-latency, shared big data infrastructure. This helps government IT departments analyze massive amounts of data, improve security, detect fraud, make faster decisions, and save cost by optimizing and sharing their existing infrastructure.
Tags : 
     IBM
By: Intel     Published Date: Aug 06, 2014
Around the world and across all industries, high-performance computing is being used to solve today’s most important and demanding problems. More than ever, storage solutions that deliver high sustained throughput are vital for powering HPC and Big Data workloads. Intel® Enterprise Edition for Lustre* software unleashes the performance and scalability of the Lustre parallel file system for enterprises and organizations, both large and small. It allows users and workloads that need large-scale, high-bandwidth storage to tap into the power and scalability of Lustre, but with the simplified installation, configuration, and monitoring features of Intel® Manager for Lustre* software, a management solution purpose-built for the Lustre file system. Intel® Enterprise Edition for Lustre* software includes proven support from the Lustre experts at Intel, including worldwide 24x7 technical support. *Other names and brands may be claimed as the property of others.
Tags : 
     Intel
By: Intel     Published Date: Aug 06, 2014
Powering Big Data Workloads with Intel® Enterprise Edition for Lustre* software. The Intel® portfolio for high-performance computing provides the following technology solutions:
• Compute - The Intel® Xeon processor E7 family provides a leap forward for every discipline that depends on HPC, with industry-leading performance and improved performance per watt. Add Intel® Xeon Phi coprocessors to your clusters and workstations to increase performance for highly parallel applications and code segments. Each coprocessor can add over a teraflops of performance and is compatible with software written for the Intel® Xeon processor E7 family. You don’t need to rewrite code or master new development tools.
• Storage - High-performance, highly scalable storage solutions with Intel® Lustre and Intel® Xeon processor E7 based storage systems for centralized storage. Reliable and responsive local storage with Intel® Solid State Drives.
• Networking - Intel® True Scale Fabric and Networking technologies, built for HPC to deliver fast message rates and low latency.
• Software and Tools - A broad range of software and tools to optimize and parallelize your software and clusters.
Further, Intel® Enterprise Edition for Lustre* software is backed by Intel, the recognized technical support provider for Lustre, and includes 24/7 service level agreement (SLA) coverage.
Tags : 
     Intel
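Both Intel entries above present Lustre as a parallel file system for large-scale, high-bandwidth storage. Purely as an illustration of how an application directory might be striped across multiple object storage targets (OSTs) before heavy writes, the sketch below calls the standard Lustre client utility lfs setstripe from Python; the mount point, stripe count, and file size are assumptions, not values taken from the Intel papers.

```python
import os
import subprocess

# Hypothetical Lustre client mount point and striping policy (assumptions).
LUSTRE_DIR = "/mnt/lustre/results"
STRIPE_COUNT = 4            # stripe new files across 4 OSTs
CHUNK = 8 * 1024 * 1024     # write in 8 MiB chunks

os.makedirs(LUSTRE_DIR, exist_ok=True)

# 'lfs setstripe -c N <dir>' makes files created in <dir> stripe over N OSTs.
subprocess.run(["lfs", "setstripe", "-c", str(STRIPE_COUNT), LUSTRE_DIR], check=True)

# Write ~1 GiB so the striped I/O is spread across the OSTs.
with open(os.path.join(LUSTRE_DIR, "sample.dat"), "wb") as f:
    for _ in range(128):    # 128 * 8 MiB = 1 GiB
        f.write(os.urandom(CHUNK))
```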
By: Intel     Published Date: Sep 16, 2014
In this guide, we take a look at what Lustre on AWS infrastructure delivers for a broad community of business and commercial organizations struggling with the challenge of big data and demanding storage growth.
Tags : intel, lustre, big data solutions in the cloud
     Intel
By: Dell and Intel®     Published Date: Aug 24, 2015
Many enterprises are embracing Hadoop because of the unique business benefits it provides. But, until now, this rapidly evolving big data technology hadn’t always met enterprise security needs. In order to protect big data today, organizations must have solutions that address four key areas: authentication, authorization, audit and lineage, and compliant data protection.
Tags : 
     Dell and Intel®
By: Dell and Intel®     Published Date: Aug 24, 2015
In today’s digitally driven world, the success of a business is increasingly tied to its ability to extract value from data. Exploiting the untapped value of your data is now the pathway to success. By putting data-driven decision making at the heart of the business, your organization can harness a wealth of information to gain an unparalleled competitive advantage. In a future-ready enterprise, you must make a fundamental shift from a focus on technology to a strategic business focus. Data-driven insights can guide everything from the formulation of top-level corporate strategies to connected devices that monitor and enable immediate critical decisions, to the creation of personalized customer interactions. Data is the foundation for enabling business transformation and innovation.
Tags : 
     Dell and Intel®
By: Dell and Intel®     Published Date: Aug 24, 2015
Organizations working to gather insights from vast volumes of varied data types understand that they need more than traditional, structured systems and tools. This paper discusses how the many Dell | Cloudera Hadoop solutions help organizations of all sizes, and with a variety of needs and use cases, tackle their big data requirements.
Tags : 
     Dell and Intel®
By: Dell and Intel®     Published Date: Aug 24, 2015
To extract value from an ever-growing onslaught of data, your organization needs next-generation data management, integration, storage and processing systems that allow you to collect, manage, store and analyze data quickly, efficiently and cost-effectively. That’s the case with Dell | Cloudera® Apache™ Hadoop® solutions for big data. These solutions provide end-to-end scalable infrastructure, leveraging open source technologies, to allow you to simultaneously store and process large datasets in a distributed environment for data mining and analysis, on both structured and unstructured data, and to do it all in an affordable manner.
Tags : 
     Dell and Intel®
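The Dell | Cloudera® entries above describe storing and processing large structured and unstructured datasets in a distributed environment. As one hedged illustration of what that looks like in practice (PySpark is our choice here; the papers do not prescribe a specific API), the sketch below reads raw log lines from HDFS, filters server errors, and aggregates them per day; the path and log layout are placeholders.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, count, to_date

# Placeholder HDFS location; on a real cluster this would point at actual data.
LOG_PATH = "hdfs:///data/web/access_logs/*.log"

spark = SparkSession.builder.appName("log-summary-sketch").getOrCreate()

# Read unstructured text and keep only lines containing an HTTP 500 status.
lines = spark.read.text(LOG_PATH)
errors = lines.filter(col("value").contains(" 500 "))

# Assume the first whitespace-separated token is an ISO date (illustrative layout).
per_day = (
    errors.selectExpr("split(value, ' ')[0] AS ts")
          .select(to_date(col("ts")).alias("day"))
          .groupBy("day")
          .agg(count("*").alias("errors"))
)

per_day.orderBy("day").show()
spark.stop()
```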
By: snowflake     Published Date: Jun 09, 2016
THE CHALLENGE: DATA SOLUTIONS CAN’T KEEP PACE WITH DATA NEEDS
Organizations are increasingly dependent on different types of data to make successful business decisions. But as the volume, rate, and types of data expand and become less predictable, conventional data warehouses cannot consume all this data effectively. Big data solutions like Hadoop increase the complexity of the environment and generally lack the performance of traditional data warehouses. This makes it difficult, expensive, and time-consuming to manage all the systems and the data.
Tags : 
     snowflake
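For readers unfamiliar with how a cloud data warehouse of this kind is consumed, the snippet below is a minimal sketch using the publicly available snowflake-connector-python package; the account, credentials, warehouse, and table are placeholders, not details from the Snowflake paper.

```python
import snowflake.connector

# Placeholder connection details; substitute your own account and credentials.
conn = snowflake.connector.connect(
    account="my_account",
    user="analyst",
    password="********",
    warehouse="ANALYTICS_WH",
    database="SALES",
    schema="PUBLIC",
)

try:
    cur = conn.cursor()
    # Compute scales independently of storage, so the same SQL interface
    # serves ad hoc queries like this hypothetical 30-day summary.
    cur.execute(
        "SELECT region, COUNT(*) AS orders "
        "FROM orders "
        "WHERE order_date >= DATEADD(day, -30, CURRENT_DATE) "
        "GROUP BY region ORDER BY orders DESC"
    )
    for region, orders in cur.fetchall():
        print(region, orders)
finally:
    conn.close()
```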
By: Hewlett Packard Enterprise     Published Date: Jul 19, 2018
We’re in the era of hybrid cloud, an integrated combination of public and private cloud environments. Most companies moving into the public cloud today are making strategic decisions about which applications should go to the cloud and which should stay on-premises. Get acquainted with hybrid cloud management strategies and solutions, and learn what critical components must be addressed as you plan your hybrid cloud environment.
Tags : private cloud, hybrid cloud
     Hewlett Packard Enterprise
By: Akamai Technologies     Published Date: Apr 25, 2018
Keeping your data safe requires forward-thinking approaches to cybersecurity. Learn how you can augment your existing on-premise infrastructure with security measures in the cloud for a more robust web security posture. Download this guide to learn:
• Why the cloud is critical for web security
• How real-world DDoS attacks are testing the limits of on-site solutions
• The questions some vendors don’t want you to ask
Tags : cloud, security, cyber, web, ddos
     Akamai Technologies
By: Butler Technologies     Published Date: Jul 02, 2018
Increasingly complex networks require more than a one-size-fits-all approach to ensuring adequate performance and data integrity. In addition to garden-variety performance issues such as slow applications, increased bandwidth requirements, and lack of visibility into cloud resources, there is also the strong likelihood of a malicious attack. While many security solutions, such as firewalls and intrusion detection systems (IDS), work to prevent security incidents, none are 100 percent effective. However, there are proactive measures that any IT team can implement now to help ensure that a successful breach is found quickly and effectively remediated, and that evidential data is available in the event of civil and/or criminal proceedings.
Tags : 
     Butler Technologies
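One of the proactive measures the entry above alludes to, preserving packet-level evidence for later forensics, can be prototyped with the open-source scapy library. The sketch below is our illustration only (not a Butler Technologies product feature): it captures a short burst of traffic on an interface and writes it to a pcap file that analysis tools can replay; the interface name and capture filter are assumptions.

```python
from scapy.all import sniff, wrpcap

# Illustrative settings; pick the interface and BPF filter for your own network.
INTERFACE = "eth0"
CAPTURE_FILTER = "tcp port 80 or tcp port 443"

# Capture 60 seconds of matching traffic (requires packet-capture privileges).
packets = sniff(iface=INTERFACE, filter=CAPTURE_FILTER, timeout=60)

# Persist the capture so it can be retained as evidential data and replayed later.
wrpcap("evidence_capture.pcap", packets)
print(f"Wrote {len(packets)} packets to evidence_capture.pcap")
```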
By: Butler Technologies     Published Date: Jul 02, 2018
As enterprises become more distributed and migrate applications to the cloud, they will need to diversify their network performance management solutions. In particular, the network operations team will need to complement its packet-based monitoring tools with active test monitoring solutions to enhance visibility into the cloud and help enterprises scale monitoring gracefully and cost-effectively.
Tags : 
     Butler Technologies
By: CA Technologies_Business_Automation     Published Date: Jun 29, 2018
Gartner’s Continuous Delivery Automation Magic Quadrant (MQ) analyzes the current market solutions and their effectiveness in responding to the demands of the modern business. The MQ rates each tool on its ‘ability to execute’ and its ‘completeness of vision.’ When assessing CA Continuous Delivery Automation, Gartner highlights its ability to ‘provide scalability, resilience, security and enterprise management connectivity.’ The analysis also goes much deeper, covering the strengths, weaknesses and various nuances of all the automation products on the market.
Tags : 
     CA Technologies_Business_Automation
By: CA Technologies_Business_Automation     Published Date: Jun 29, 2018
In the 26-criteria evaluation of continuous delivery and release automation (CDRA) providers, we identified the 15 most significant — Atlassian, CA Technologies, Chef Software, Clarive, CloudBees, Electric Cloud, Flexagon, Hewlett Packard Enterprise (HPE), IBM, Micro Focus, Microsoft, Puppet, Red Hat, VMware, and XebiaLabs — and researched, analyzed, and scored them. We focused on core features, including modeling, deploying, managing, governing, and visualizing pipelines, and on each vendor’s ability to match a strategy to these features. This report helps infrastructure and operations (I&O) professionals make the right choice when looking for CDRA solutions for their development and operations (DevOps) automation.
Tags : 
     CA Technologies_Business_Automation