IT Infrastructure

Results 1 - 25 of 2649. Sort Results By: Published Date | Title | Company Name
By: NetApp     Published Date: Dec 13, 2013
FlexPod Select with Hadoop delivers enterprise-class Hadoop with validated, pre-configured components for fast deployment, higher reliability and smoother integration with existing applications and infrastructure. These technical reference architectures optimize storage, networking, and servers with the Cloudera and Hortonworks distributions of Hadoop. Leverage FlexPod Select with Hadoop to help store, manage, process and perform advanced analytics on your multi-structured data. Tuning parameters, optimization techniques and other Hadoop cluster guidelines are provided.
Tags : flexpod with hadoop, enterprise data, storage infrastructure, massive amounts of data
     NetApp
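The tuning guidance the FlexPod abstract mentions is easier to picture with a concrete sketch. The following illustrates common rule-of-thumb YARN/MapReduce memory sizing for a worker node; the formula and values are a generic assumption, not the validated FlexPod reference settings.

```python
# Rough sketch of per-node Hadoop memory sizing, based on common
# rule-of-thumb guidance (NOT the validated FlexPod reference values).

def container_settings(node_ram_gb, cores, reserved_gb=8):
    """Suggest YARN container memory settings for one worker node."""
    usable_mb = (node_ram_gb - reserved_gb) * 1024  # leave RAM for OS/daemons
    containers = min(2 * cores, usable_mb // 2048)  # at least 2 GB per container
    per_container_mb = usable_mb // containers
    return {
        "yarn.nodemanager.resource.memory-mb": usable_mb,
        "yarn.scheduler.maximum-allocation-mb": per_container_mb,
        "mapreduce.map.memory.mb": per_container_mb,
        "mapreduce.reduce.memory.mb": 2 * per_container_mb,
    }

print(container_settings(node_ram_gb=128, cores=16))
```

For a 128 GB, 16-core node this suggests 32 containers of 3840 MB each; a validated reference architecture would refine such numbers per workload.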
By: IBM     Published Date: Sep 02, 2014
Life Sciences organizations need to build IT infrastructures that are dynamic, scalable, and easy to deploy and manage, with simplified provisioning, high performance and high utilization, and that can exploit both data-intensive and server-intensive workloads, including Hadoop MapReduce. Solutions must scale, in terms of both processing and storage, in order to better serve the institution long-term. Life cycle management of data, and making it usable for mainstream analyses and applications, is an important aspect of system design. This presentation describes IT requirements and how Technical Computing solutions from IBM and Platform Computing address these challenges and deliver greater ROI and accelerated time to results for Life Sciences.
Tags : 
     IBM
By: IBM     Published Date: Sep 02, 2014
Whether in high-performance computing, Big Data or analytics, information technology has become an essential tool in today’s hyper-competitive business landscape. Organizations are increasingly being challenged to do more with less and this is fundamentally impacting the way that IT infrastructure is deployed and managed. In this short e-book, learn the top ten ways that IBM Platform Computing customers are using technologies like IBM Platform LSF and IBM Platform Symphony to help obtain results faster, share resources more efficiently, and improve the overall cost-effectiveness of their global IT infrastructure.
Tags : ibm, ibm platform computing, save money
     IBM
By: IBM     Published Date: Sep 02, 2014
In today’s stringent financial services regulatory environment, with exponential growth of data and dynamic business requirements, risk analytics has become integral to businesses. IBM Algorithmics provides sophisticated analyses for a wide range of economic scenarios that better quantify risk for multiple departments within a firm, or across the enterprise. With Algorithmics, firms have a better handle on their financial exposure and credit risks before they finalize real-time transactions. But this requires the performance and agility of a scalable infrastructure, which drives up IT risk and complexity. The IBM Application Ready Solution for Algorithmics provides an agile, reliable and high-performance infrastructure to deliver trusted risk insights for sustained growth and profitability. This integrated offering with a validated reference architecture delivers the right risk insights at the right time while lowering the total cost of ownership.
Tags : ibm, it risk, financial risk analytics
     IBM
By: IBM     Published Date: May 20, 2015
Whether in high-performance computing, Big Data or analytics, information technology has become an essential tool in today’s hyper-competitive business landscape. Organizations are increasingly being challenged to do more with less and this is fundamentally impacting the way that IT infrastructure is deployed and managed. In this short e-book, learn the top ten ways that IBM Platform Computing customers are using technologies like IBM Platform LSF and IBM Platform Symphony to help obtain results faster, share resources more efficiently, and improve the overall cost-effectiveness of their global IT infrastructure.
Tags : 
     IBM
By: IBM     Published Date: May 20, 2015
IBM Platform Symphony: accelerate big data analytics. This demonstration highlights the benefits and features of IBM Platform Symphony to accelerate big data analytics by maximizing distributed system performance, fully utilizing computing resources and effectively harnessing the power of Hadoop.
Tags : 
     IBM
By: IBM     Published Date: May 20, 2015
According to our global study of more than 800 cloud decision makers and users, organizations are becoming increasingly focused on the business value cloud provides. Cloud is integral to mobile, social and analytics initiatives, and to the big data management challenge that often comes with them, and it helps power the entire suite of game-changing technologies. Enterprises can aim higher when these deployments are riding on the cloud. Mobile, analytics and social implementations can be bigger, bolder and drive greater impact when backed by scalable infrastructure. In addition to scale, cloud can provide integration, gluing the individual technologies into more cohesive solutions. Learn how companies are gaining a competitive advantage with cloud computing.
Tags : 
     IBM
By: IBM     Published Date: Sep 16, 2015
6 criteria for evaluating high-performance cloud services providers. Engineering, scientific, analytics, big data and research workloads place extraordinary demands on technical and high-performance computing (HPC) infrastructure. Supporting these workloads can be especially challenging for organizations that have unpredictable spikes in resource demand, or need access to additional compute or storage resources for a project or to support a growing business. Software Defined Infrastructure (SDI) enables organizations to deliver HPC services in the most efficient way possible, optimizing resource utilization to accelerate time to results and reduce costs. SDI is the foundation for a fully integrated environment, optimizing compute, storage and networking infrastructure to quickly adapt to changing business requirements, and dynamically managing workloads and data.
Tags : 
     IBM
By: TIBCO     Published Date: Nov 09, 2015
As one of the most exciting and widely adopted open-source projects, Apache Spark in-memory clusters are driving new opportunities for application development as well as increased demand for IT infrastructure. Apache Spark is now the most active Apache project, with more than 600 contributions made in the last 12 months by more than 200 organizations. A new survey conducted by Databricks of 1,417 IT professionals working with Apache Spark finds that high-performance analytics applications that can work with big data are driving a large proportion of that demand. Apache Spark is now being used to aggregate multiple types of data in-memory, versus only pulling data from Hadoop. For solution providers, the Apache Spark technology stack is a significant player because it’s one of the core technologies used to modernize data warehouses, a huge segment of the IT industry that accounts for multiple billions in revenue. Spark also holds much promise for the future with data lakes.
Tags : 
     TIBCO
By: IBM     Published Date: Nov 14, 2014
Join Gartner, Inc. and IBM Platform Computing for an informative webinar where you will learn how to combine best-of-breed analytic solutions to provide a low-latency, shared big data infrastructure. This helps government IT departments analyze massive amounts of data, improve security, detect fraud, make faster decisions and save cost by optimizing and sharing existing infrastructure.
Tags : 
     IBM
By: IBM     Published Date: Nov 14, 2014
View this series of short webcasts to learn how IBM Platform Computing products can help you maximize the agility of your distributed computing environment by improving operational efficiency, simplifying the user experience, optimizing application usage and license sharing, addressing spikes in infrastructure demand and reducing data management costs.
Tags : 
     IBM
By: IBM     Published Date: Feb 13, 2015
This demonstration shows how an organization using IBM Platform Computing workload managers can easily and securely tap resources in the IBM SoftLayer public cloud to handle periods of peak demand and reduce total IT infrastructure costs.
Tags : 
     IBM
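The peak-demand idea in the demonstration above boils down to adding cloud capacity only for work that overflows on-premises slots. A minimal sketch of that decision, with hypothetical thresholds and names rather than Platform Computing's actual scheduler API:

```python
# Hedged sketch of a burst-to-cloud decision: provision public-cloud nodes
# only for jobs exceeding on-premises capacity. The job/slot model here is
# an invented illustration, not IBM Platform LSF's real interface.

def nodes_to_burst(queued_jobs, onprem_slots, jobs_per_node=4):
    """Return how many cloud nodes to add for jobs beyond on-prem capacity."""
    overflow = max(0, queued_jobs - onprem_slots)
    return -(-overflow // jobs_per_node)  # ceiling division

print(nodes_to_burst(queued_jobs=120, onprem_slots=100))  # → 5
print(nodes_to_burst(queued_jobs=50, onprem_slots=100))   # → 0
```

Releasing the nodes once the queue drains is what keeps total infrastructure cost down in this model.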
By: IBM     Published Date: Feb 13, 2015
To quickly and economically meet new and peak demands, Platform LSF (SaaS) and Platform Symphony (SaaS) workload management as well as Elastic Storage on Cloud data management software can be delivered as a service, complete with SoftLayer cloud infrastructure and 24x7 support for technical computing and service-oriented workloads. Watch this demonstration to learn how the IBM Platform Computing Cloud Service can be used to simplify and accelerate financial risk management using IBM Algorithmics.
Tags : 
     IBM
By: IBM     Published Date: Feb 13, 2015
Software defined storage is enterprise-class storage that uses standard hardware, with all the important storage and management functions performed in intelligent software. Software defined storage delivers automated, policy-driven, application-aware storage services through orchestration of the underlying storage infrastructure in support of an overall software defined environment.
Tags : 
     IBM
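The policy-driven behavior described in the abstract above can be sketched in a few lines. The tiers and match rules below are invented examples to show the idea, not IBM's actual policy engine:

```python
# Minimal sketch of policy-driven placement, the core idea behind
# software-defined storage: volumes are placed on tiers by matching
# attributes against rules. Tiers and rules are illustrative only.

POLICIES = [
    {"match": {"app": "database"}, "tier": "ssd"},
    {"match": {"archive": True}, "tier": "tape"},
]

def place(volume, default_tier="hdd"):
    """Pick a storage tier by matching volume attributes against policies."""
    for policy in POLICIES:
        if all(volume.get(k) == v for k, v in policy["match"].items()):
            return policy["tier"]
    return default_tier

print(place({"app": "database", "size_gb": 500}))  # → ssd
print(place({"app": "web"}))                       # → hdd
```

A real software-defined storage stack evaluates far richer policies (performance, locality, protection level), but the orchestration pattern is the same.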
By: Dell and Intel®     Published Date: Aug 24, 2015
To extract value from an ever-growing onslaught of data, your organization needs next-generation data management, integration, storage and processing systems that allow you to collect, manage, store and analyze data quickly, efficiently and cost-effectively. That’s the case with Dell | Cloudera® Apache™ Hadoop® solutions for big data. These solutions provide end-to-end scalable infrastructure, leveraging open source technologies, to allow you to simultaneously store and process large datasets in a distributed environment for data mining and analysis, on both structured and unstructured data, and to do it all in an affordable manner.
Tags : 
     Dell and Intel®
By: Solix     Published Date: Aug 03, 2015
Every CIO wants to know whether their infrastructure will handle it when data growth reaches 40 zettabytes by 2020. When data sets become too large, application performance slows and infrastructure struggles to keep up. Data growth drives increased cost and complexity everywhere, including power consumption, data center space, performance and availability. To find out more, download the Gartner study now.
Tags : 
     Solix
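The 40-zettabyte projection above invites a quick sanity check on the growth rate it implies. A back-of-envelope sketch, assuming a roughly 4.4 ZB starting point for 2013 (a commonly cited industry estimate, not a figure from the Solix abstract):

```python
# Rough growth-rate arithmetic for the projection cited above.
# The 4.4 ZB 2013 baseline is an assumption, not from the abstract.
start_zb, end_zb, years = 4.4, 40.0, 7   # 2013 -> 2020
cagr = (end_zb / start_zb) ** (1 / years) - 1
print(f"implied growth: {cagr:.0%} per year")  # → implied growth: 37% per year
```

Sustained compound growth near 40% per year is what makes the cost and complexity concerns in the abstract concrete.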
By: Adaptive Computing     Published Date: Feb 27, 2014
Big data applications represent a fast-growing category of high-value applications that are increasingly employed by business and technical computing users. However, they have exposed an inconvenient dichotomy in the way resources are utilized in data centers. Conventional enterprise and web-based applications can be executed efficiently in virtualized server environments, where resource management and scheduling is generally confined to a single server. By contrast, data-intensive analytics and technical simulations demand large aggregated resources, necessitating intelligent scheduling and resource management that spans a computer cluster, cloud, or entire data center. Although these tools exist in isolation, they are not available in a general-purpose framework that allows them to interoperate easily and automatically within existing IT infrastructure.
Tags : 
     Adaptive Computing
By: Akamai Technologies     Published Date: Jun 14, 2018
Businesses continue to evolve as digital technologies reshape industries. The workforce is mobile, and speed and efficiency are imperative, necessitating dynamic, cloud-based infrastructures and connectivity, as well as unhindered, secure application access: from anywhere, on any device, at any time. Leaders must remove hurdles to progress, but new business initiatives and processes increase the attack surface, potentially putting the company at risk.
Tags : digital technology, cloud, security, connectivity, authenticate
     Akamai Technologies
By: Akamai Technologies     Published Date: Jun 14, 2018
IT complexity has rapidly grown with more applications, users, and infrastructure needed to service the enterprise. Traditional remote access models weren’t built for the business models of today and are unable to keep up with the pace of change. Read this paper to understand how remote access can be redefined to remove complexity, meet end-user expectations and mitigate network security risks.
Tags : security, multi-factor authentication, network security risks
     Akamai Technologies
By: Akamai Technologies     Published Date: Jun 14, 2018
High-profile cyber attacks seem to occur almost daily in recent years. Clearly security threats are persistent and growing. While many organizations have adopted a defense-in-depth strategy, utilizing anti-virus protection, firewalls, intruder prevention systems, sandboxing, and secure web gateways, most IT departments still fail to explicitly protect the Domain Name System (DNS). This oversight leaves a massive gap in network defenses. But this infrastructure doesn’t have to be a vulnerability. Solutions that protect recursive DNS (rDNS) can serve as a simple and effective security control point for end users and devices on your network. Read this white paper to learn more about how rDNS is putting your enterprise at risk, why you need a security checkpoint at this infrastructural layer, and how rDNS security solutions can help. Then read 5 Reasons Enterprises Need a New Access Model to learn about the fundamental changes enterprises need to make when providing access to their private applications.
Tags : rdns, dns, anti-virus, security, network defense
     Akamai Technologies
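The rDNS control point described above comes down to checking each requested name against threat intelligence before answering the query. A minimal sketch of that check, with an invented blocklist rather than a real threat feed or vendor API:

```python
# Sketch of DNS-layer filtering: a recursive resolver consults a blocklist
# before resolving a name. The domains below are illustrative placeholders.

BLOCKED_DOMAINS = {"malware.example", "phish.example"}

def is_blocked(hostname):
    """Block a name if it or any parent domain is on the blocklist."""
    labels = hostname.lower().rstrip(".").split(".")
    return any(".".join(labels[i:]) in BLOCKED_DOMAINS for i in range(len(labels)))

print(is_blocked("cdn.malware.example"))  # → True (parent domain blocked)
print(is_blocked("example.org"))          # → False
```

Matching parent domains, not just exact names, is what lets the DNS layer catch malware that rotates hostnames under a known bad domain.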
By: Hewlett Packard Enterprise     Published Date: Jul 19, 2018
The Enterprise Strategy Group put HPE SimpliVity hyperconverged infrastructure through its paces in a series of hands-on performance and stress tests simulating real-world configurations and load conditions. Read this report to learn how an HPE SimpliVity infrastructure:
- Ran nine workloads simultaneously, with an average response time across all applications of 3.9ms
- Showed little to no performance impact as a virtual controller, drive, and node were systematically failed on a two-node cluster
Tags : hyperconverged, infrastructure, simplivity
     Hewlett Packard Enterprise
By: Akamai Technologies     Published Date: Apr 25, 2018
Keeping your data safe requires forward-thinking approaches to cybersecurity. Learn how you can augment your existing on-premise infrastructure with security measures in the cloud for a more robust web security posture. Download this guide to learn:
- Why the cloud is critical for web security
- How real-world DDoS attacks are testing the limits of on-site solutions
- The questions some vendors don’t want you to ask
Tags : cloud, security, cyber, cloud, web, ddos
     Akamai Technologies
By: Tenable     Published Date: Aug 07, 2018
When it comes to IT infrastructure, it’s fair to say the perimeter has left the premises. Whether it’s discovering short-lived assets (e.g., containers), assessing cloud environments or maintaining web application security, today’s attack surface presents a growing challenge to CISOs looking to understand and reduce their cyber risk. To combat this issue, a discipline called Cyber Exposure is emerging to help organizations manage and measure this risk. This ebook provides insights on how CISOs are addressing the modern attack surface.
Tags : cyber exposure, iot, vulnerability management, cloud security, mobile security, container security
     Tenable
By: CA Technologies_Business_Automation     Published Date: Jun 29, 2018
In the 26-criteria evaluation of continuous delivery and release automation (CDRA) providers, we identified the 15 most significant — Atlassian, CA Technologies, Chef Software, Clarive, CloudBees, Electric Cloud, Flexagon, Hewlett Packard Enterprise (HPE), IBM, Micro Focus, Microsoft, Puppet, Red Hat, VMware, and XebiaLabs — and researched, analyzed, and scored them. We focused on core features, including modeling, deploying, managing, governing, and visualizing pipelines, and on each vendor’s ability to match a strategy to these features. This report helps infrastructure and operations (I&O) professionals make the right choice when looking for CDRA solutions for their development and operations (DevOps) automation.
Tags : 
     CA Technologies_Business_Automation
By: CA Technologies_Business_Automation     Published Date: Jun 29, 2018
Challenge: Enterprises today face the major challenge of how to fully orchestrate the apps that define their business, and automate the IT processes underpinning them, when much of the infrastructure used to run them is outsourced to cloud providers.
Opportunity: Some cloud service providers offer their own orchestration tools, and each on-premises tool has automation capabilities. But while we own our apps, providers are interchangeable depending on what they can offer and for what price. We should be able to switch between cloud providers and between cloud, on-premises and hybrid infrastructure as and when the business requires, with minimal effort and without losing any control. Sometimes we might want to use more than one provider at the same time, leveraging the advantages of each provider simultaneously.
Benefits: What is needed is an orchestration layer that remains constant while cloud services come and go; one that enterprises own along with their core apps.
Tags : 
     CA Technologies_Business_Automation
Get your white papers featured in the insideBIGDATA White Paper Library contact: Kevin@insideHPC.com