
By: NetApp     Published Date: Dec 18, 2013
IT managers have indicated that their two most significant challenges in managing unstructured data at multiple locations are keeping pace with data growth and improving data protection. Learn how the NetApp Distributed Content Repository provides advanced data protection and system recovery capabilities that enable multiple data centers and remote offices to maintain access to data through hardware and software faults. Key benefits:
- Continuous access to file data while maintaining data redundancy, with no administrator intervention needed.
- Easy integration and deployment into a distributed environment, providing transparent, centrally managed content storage.
- Secure multi-tenancy using security partitions.
- Effectively infinite, on-demand capacity with fast access to files and objects in the cloud.
- Secure, robust data protection techniques that enable data to persist beyond the life of the storage it resides on.
Tags : 
     NetApp
By: IBM     Published Date: Sep 02, 2014
In today’s stringent financial services regulatory environment, with exponential growth of data and dynamic business requirements, risk analytics has become integral to businesses. IBM Algorithmics provides sophisticated analyses for a wide range of economic scenarios that better quantify risk for multiple departments within a firm, or across the enterprise. With Algorithmics, firms have a better handle on their financial exposure and credit risks before they finalize real-time transactions. But this requires the performance and agility of a scalable infrastructure, which can drive up IT risk and complexity. The IBM Application Ready Solution for Algorithmics provides an agile, reliable and high-performance infrastructure to deliver trusted risk insights for sustained growth and profitability. This integrated offering, with a validated reference architecture, delivers the right risk insights at the right time while lowering the total cost of ownership.
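To make scenario-based risk quantification concrete, the following is a minimal Monte Carlo value-at-risk sketch in Python. The portfolio size, return model and confidence level are hypothetical assumptions chosen for illustration; this is not the Algorithmics methodology.

# Illustrative Monte Carlo value-at-risk (VaR) sketch.
# All figures are hypothetical; this does not reproduce Algorithmics.
import numpy as np

rng = np.random.default_rng(seed=42)

portfolio_value = 10_000_000   # hypothetical $10M portfolio
n_scenarios = 100_000          # number of simulated economic scenarios
mu, sigma = 0.0005, 0.02       # assumed daily return mean and volatility

# Simulate one-day returns under a normal model (a simplifying assumption;
# production risk engines use far richer scenario generation).
pnl = portfolio_value * rng.normal(mu, sigma, n_scenarios)

# 99% one-day VaR: the loss exceeded in only 1% of scenarios.
var_99 = -np.percentile(pnl, 1)
print(f"99% one-day VaR: ${var_99:,.0f}")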
Tags : ibm, it risk, financial risk analytics
     IBM
By: IBM     Published Date: May 20, 2015
Every day, the world creates 2.5 quintillion bytes of data and businesses are realizing tangible results from investments in big data analytics. IBM Spectrum Scale (GPFS) offers an enterprise class alternative to Hadoop Distributed File System (HDFS) for building big data platforms and provides a range of enterprise-class data management features. Spectrum Scale can be deployed independently or with IBM’s big data platform, consisting of IBM InfoSphere® BigInsights™ and IBM Platform™ Symphony. This document describes best practices for deploying Spectrum Scale in such environments to help ensure optimal performance and reliability.
Tags : 
     IBM
By: IBM     Published Date: Sep 16, 2015
Six criteria for evaluating high-performance cloud services providers
Engineering, scientific, analytics, big data and research workloads place extraordinary demands on technical and high-performance computing (HPC) infrastructure. Supporting these workloads can be especially challenging for organizations that have unpredictable spikes in resource demand, or that need access to additional compute or storage resources for a project or to support a growing business. Software Defined Infrastructure (SDI) enables organizations to deliver HPC services in the most efficient way possible, optimizing resource utilization to accelerate time to results and reduce costs. SDI is the foundation for a fully integrated environment, optimizing compute, storage and networking infrastructure to adapt quickly to changing business requirements while dynamically managing workloads and data.
Tags : 
     IBM
By: IBM     Published Date: Sep 16, 2015
Building applications for handling big data requires laser-like focus on solutions that allow you to deliver scalable, reliable and flexible infrastructure for fast-growing analytics environments. This paper provides 6 best practices for selecting the “right” infrastructure—one that is optimized for performance, flexibility and long-term value.
Tags : 
     IBM
By: RYFT     Published Date: Apr 03, 2015
The new Ryft ONE platform is a scalable 1U device that addresses a major need in the fast-growing market for advanced analytics — avoiding I/O bottlenecks that can seriously impede analytics performance on today's hyperscale cluster systems. The Ryft ONE platform is designed for easy integration into existing cluster and other server environments, where it functions as a dedicated, high-performance analytics engine. IDC believes that the new Ryft ONE platform is well positioned to exploit the rapid growth we predict for the high-performance data analysis market.
Tags : ryft, ryft one platform, 1u device, advanced analytics, avoiding i/o bottlenecks, idc
     RYFT
By: Data Direct Networks     Published Date: Apr 08, 2014
DataDirect Networks (DDN), the largest privately held provider of high-performance storage, has a large and growing presence in HPC markets. HPC users identify DDN as their storage provider more than any other storage-focused company, with twice the mentions of EMC, and more than twice the mentions of NetApp, Hitachi Data Systems, or Panasas.(5) DDN’s strength in HPC is anchored by its Storage Fusion Architecture (SFA), winner of the HPCwire Editor’s Choice Award for “Best HPC Storage Product or Technology” in each of the past three years. The DDN SFA12KX combines SATA, SAS, and solid-state disks (SSDs) for an environment that can be tailored to a balance of throughput and capacity.
Tags : 
     Data Direct Networks
By: IBM     Published Date: Nov 14, 2014
Every day, the world creates 2.5 quintillion bytes of data and businesses are realizing tangible results from investments in big data analytics. IBM Spectrum Scale (GPFS) offers an enterprise class alternative to Hadoop Distributed File System (HDFS) for building big data platforms and provides a range of enterprise-class data management features. Spectrum Scale can be deployed independently or with IBM’s big data platform, consisting of IBM InfoSphere® BigInsights™ and IBM Platform™ Symphony. This document describes best practices for deploying Spectrum Scale in such environments to help ensure optimal performance and reliability.
Tags : 
     IBM
By: IBM     Published Date: Nov 14, 2014
Platform HPC enables HPC customers to side-step many of the overhead, cost and support issues that often plague open-source environments, and to deploy powerful, easy-to-use clusters.
Tags : 
     IBM
By: IBM     Published Date: Nov 14, 2014
View this series of short webcasts to learn how IBM Platform Computing products can help you maximize the agility of your distributed computing environment by improving operational efficiency, simplifying the user experience, optimizing application usage and license sharing, addressing spikes in infrastructure demand, and reducing data management costs.
Tags : 
     IBM
By: IBM     Published Date: Feb 13, 2015
Software defined storage is enterprise-class storage that uses standard hardware, with all the important storage and management functions performed in intelligent software. Software defined storage delivers automated, policy-driven, application-aware storage services through orchestration of the underlying storage infrastructure in support of an overall software defined environment.
Tags : 
     IBM
By: Dell and Intel®     Published Date: Aug 24, 2015
New technologies help decision makers gain insights from all types of data - from traditional databases to high-visibility social media sources. Big data initiatives must ensure data is cost-effectively managed, shared by systems across the enterprise, and quickly and securely made available for analysis and action by line-of-business teams. In this article, learn how Dell, working with Intel®, helps IT leaders overcome the challenges of IT and business alignment, resource constraints and siloed environments through a comprehensive big data portfolio based on choice and flexibility, redefined economics and connected intelligence.
Tags : 
     Dell and Intel®
By: Dell and Intel®     Published Date: Aug 24, 2015
To extract value from an ever-growing onslaught of data, your organization needs next-generation data management, integration, storage and processing systems that allow you to collect, manage, store and analyze data quickly, efficiently and cost-effectively. That’s the case with Dell | Cloudera® Apache™ Hadoop® solutions for big data. These solutions provide end-to-end scalable infrastructure, leveraging open source technologies, to allow you to simultaneously store and process large datasets in a distributed environment for data mining and analysis, on both structured and unstructured data, and to do it all in an affordable manner.
Tags : 
     Dell and Intel®
By: Dell and Intel®     Published Date: Sep 06, 2015
The retail experience has changed dramatically in recent years as power has shifted to consumers. Shoppers can easily find and compare products from an array of devices, even while walking through a store, and they can share opinions about retailers and products through social media, influencing other prospective customers. To compete in this new multi-channel environment, this guide shows how retailers have to adopt new and innovative strategies to attract and retain customers. Big data technologies, specifically Hadoop, enable retailers to connect with customers through multiple channels at an entirely new level by harnessing the vast volumes of new data available today. Hadoop helps retailers store, transform, integrate and analyze a wide variety of online and offline customer data—POS transactions, e-commerce transactions, clickstream data, email, social media, sensor data and call center records—all in one central repository.
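As an illustration of the central-repository idea, the following is a minimal PySpark sketch that lands two of the channels named above, POS transactions and clickstream data, in one store and joins them per customer. The paths, schemas and column names are hypothetical assumptions, not details of the Dell | Cloudera solution.

# Illustrative PySpark sketch: combining offline and online retail data
# in one repository. Paths and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("retail-repository").getOrCreate()

# Hypothetical sources: in-store POS transactions and web clickstream logs.
pos = spark.read.csv("hdfs:///retail/pos_transactions", header=True, inferSchema=True)
clicks = spark.read.json("hdfs:///retail/clickstream")

# Join offline and online activity to build one view per customer.
activity = (
    pos.groupBy("customer_id").agg(F.sum("amount").alias("store_spend"))
       .join(clicks.groupBy("customer_id").agg(F.count("*").alias("page_views")),
             on="customer_id", how="outer")
)

activity.write.mode("overwrite").parquet("hdfs:///retail/customer_360")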
Tags : 
     Dell and Intel®
By: snowflake     Published Date: Jun 09, 2016
THE CHALLENGE: DATA SOLUTIONS CAN’T KEEP PACE WITH DATA NEEDS
Organizations are increasingly dependent on different types of data to make successful business decisions. But as the volume, rate, and types of data expand and become less predictable, conventional data warehouses cannot consume all this data effectively. Big data solutions like Hadoop increase the complexity of the environment and generally lack the performance of traditional data warehouses. This makes it difficult, expensive, and time-consuming to manage all the systems and the data.
Tags : 
     snowflake
By: Revolution Analytics     Published Date: May 09, 2014
As the primary facilitator of data science and big data, machine learning has garnered much interest from a broad range of industries as a way to increase the value of enterprise data assets. Through techniques of supervised and unsupervised statistical learning, organizations can make important predictions and discover previously unknown knowledge to provide actionable business intelligence. In this guide, we’ll examine the principles underlying machine learning based on the R statistical environment. We’ll explore machine learning with R from the open source R perspective as well as the more robust commercial perspective using Revolution R Enterprise (RRE) for big data deployments.
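To illustrate the supervised-learning idea the guide describes (fitting a model on labeled data, then predicting labels for unseen data), here is a minimal sketch. It uses Python with scikit-learn rather than the R environment the guide covers; the dataset and model choice are illustrative assumptions, not the guide's own examples.

# Illustrative supervised-learning sketch in Python with scikit-learn.
# (The guide itself uses R; this shows the same train/predict idea.)
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# Fit a model on labeled data, then predict labels for unseen data.
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)
print("test accuracy:", accuracy_score(y_test, model.predict(X_test)))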
Tags : revolution analytics, data science, big data, machine learning
     Revolution Analytics
By: Adaptive Computing     Published Date: Feb 27, 2014
Big data applications represent a fast-growing category of high-value applications that are increasingly employed by business and technical computing users. However, they have exposed an inconvenient dichotomy in the way resources are utilized in data centers. Conventional enterprise and web-based applications can be executed efficiently in virtualized server environments, where resource management and scheduling is generally confined to a single server. By contrast, data-intensive analytics and technical simulations demand large aggregated resources, necessitating intelligent scheduling and resource management that spans a computer cluster, cloud, or entire data center. Although these tools exist in isolation, they are not available in a general-purpose framework that allows them to interoperate easily and automatically within existing IT infrastructure.
Tags : 
     Adaptive Computing
By: Rubrik EMEA     Published Date: Apr 15, 2019
Backup and recovery needs a radical rethink. When today’s incumbent solutions were designed over a decade ago, IT environments were exploding, heterogeneity was increasing, and backup was the protection of last resort. The goal was to provide a low-cost insurance policy for data and to support this increasingly complex, multi-tier, heterogeneous environment. The answer was to patch together backup and recovery solutions under a common vendor management framework and to minimize costs by moving data across the infrastructure or media.
Tags : data, backup, recovery, virtualization, solutions, cloud, architectures
     Rubrik EMEA
By: CloudHealth by VMware     Published Date: Apr 15, 2019
Organisations moving to AWS seek improved performance, increased innovation, and a faster time to market—but the road to cloud maturity, and ultimately cloud success, proves both challenging and expensive. Learn to accelerate your AWS cloud journey with:
- A checklist for determining if you have clear visibility into your AWS environment
- Expert tips for developing proper cloud security best practices
- Real examples of financial, performance, and security management policies for automating your cloud ecosystem
Looking to optimise your AWS cloud infrastructure? Use this eBook to regain control over your visibility and cost management, security and compliance, and governance and automation.
Tags : 
     CloudHealth by VMware
By: CloudHealth by VMware     Published Date: Apr 15, 2019
Govern Your AWS Environment with Automated Policies
If you manage a dynamic cloud environment, you already know that managing your assets and dealing with continuous changes is time-consuming. Read this eBook to learn the best practices for policies that you must put in place to reduce the time it takes to optimize and manage your infrastructure.
Tags : 
     CloudHealth by VMware
By: CloudHealth by VMware     Published Date: Apr 15, 2019
Choosing Azure revolutionises your environment’s agility, simplicity, and innovation, but have you achieved the cost savings you expected? Discover 10 ways you can reduce your spend in Azure, including:
- Terminate zombie assets
- Delete aged snapshots
- Rightsize virtual machines
- Rightsize SQL databases
Read 10 Best Practices for Reducing Spend in Azure to learn key strategies for optimising cloud spend and saving 10-20% on your monthly Azure costs.
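As a sketch of what an automated "delete aged snapshots" policy checks, the following Python fragment flags snapshots that have outlived a retention window. It operates on exported snapshot metadata and is purely illustrative; it does not call Azure APIs, and the names and retention period are assumptions.

# Illustrative "aged snapshot" policy check over exported metadata.
# Hypothetical inventory and retention period; no Azure API calls are made.
from datetime import datetime, timedelta, timezone

MAX_AGE_DAYS = 90  # hypothetical retention policy

snapshots = [
    {"name": "vm1-snap-2018-12-01", "created": datetime(2018, 12, 1, tzinfo=timezone.utc)},
    {"name": "vm2-snap-2019-04-01", "created": datetime(2019, 4, 1, tzinfo=timezone.utc)},
]

cutoff = datetime.now(timezone.utc) - timedelta(days=MAX_AGE_DAYS)
aged = [s["name"] for s in snapshots if s["created"] < cutoff]
print("snapshots past retention:", aged)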
Tags : 
     CloudHealth by VMware
By: Intel     Published Date: Apr 11, 2019
To remain competitive, manufacturers must focus on achieving new growth while driving down costs. Key to achieving this is greater flexibility and a dramatic upturn in operational efficiency across the manufacturing process. One area ripe for improvement is intralogistics transportation. Many manufacturers still rely on automated guided vehicles (AGVs) to undertake repetitive transport tasks; but, rigid in nature, they do not support today’s demand-driven, dynamic manufacturing environments. Intelligent autonomous mobile robots (AMRs), like SEIT* from Milvus Robotics, offer a viable and cost-effective alternative. This solution brief describes how to solve business challenges through investment in innovative technologies.
Tags : 
     Intel
By: Intel     Published Date: Apr 15, 2019
With foot traffic falling and online shopping options growing, retailers must find new ways to “digitize” and understand real-world behavioral data—such as in-store browsing patterns, staff attentiveness, and specific product interest—in the same way that online retail utilizes big data to optimize online experiences. They must also find innovative ways to keep customers engaged with their brands, especially in expensive brick-and-mortar locations. In this environment, managing labor costs is critical, as these costs are second only to real estate. Assigning and enabling sales associates cost-effectively is key to profitability. Retailers have an opportunity to meet these challenges by putting new data and Internet of Things (IoT) technologies to work.
Tags : 
     Intel
By: Gigamon     Published Date: Mar 26, 2019
Read “Security at the Speed of Your Network” to learn why organizations are using an architectural approach to improve security posture and reduce costs. Enable security tools to work more efficiently across physical, virtual and cloud environments. Deploy a diverse set of security solutions as a centralized security tool farm. You can secure more data on faster networks with no compromise between security, performance and cost. Read now.
Tags : 
     Gigamon
By: Gigamon     Published Date: Mar 26, 2019
Read this ESG research brief to learn how to improve visibility across distributed computing environments without increasing staff or tools. Download “The Importance of a Common Distributed Data Services Layer” to learn why ensuring visibility in a distributed network environment is crucial, and how a common distributed data services layer for collecting, processing and sharing data ensures that network operations teams are able to provide better digital experiences, mitigate risk and respond to changing environments.
Tags : 
     Gigamon