application performance

By: IBM     Published Date: Sep 02, 2014
In today’s stringent financial services regulatory environment, with exponential data growth and dynamic business requirements, risk analytics has become integral to businesses. IBM Algorithmics provides sophisticated analyses for a wide range of economic scenarios that better quantify risk for multiple departments within a firm, or across the enterprise. With Algorithmics, firms have a better handle on their financial exposure and credit risks before they finalize real-time transactions. But this requires the performance and agility of a scalable infrastructure, which can drive up IT risk and complexity. The IBM Application Ready Solution for Algorithmics provides an agile, reliable and high-performance infrastructure to deliver trusted risk insights for sustained growth and profitability. This integrated offering, with a validated reference architecture, delivers the right risk insights at the right time while lowering total cost of ownership.
Tags : ibm, it risk, financial risk analytics
     IBM
By: IBM     Published Date: May 20, 2015
Join IBM and Nuance Communications Inc. to learn how Nuance uses IBM Elastic Storage to improve the power of their voice recognition applications by managing storage growth, cost and complexity while increasing performance and data availability. View the webcast to learn how you can:
· Lower data management costs through policy-driven automation and tiered storage management
· Manage and increase storage agility through software-defined storage
· Remove data-related bottlenecks to deliver application performance
Tags : 
     IBM
By: IBM     Published Date: Sep 16, 2015
Building applications for handling big data requires laser-like focus on solutions that allow you to deliver scalable, reliable and flexible infrastructure for fast-growing analytics environments. This paper provides 6 best practices for selecting the “right” infrastructure—one that is optimized for performance, flexibility and long-term value.
Tags : 
     IBM
By: TIBCO     Published Date: Nov 09, 2015
As one of the most exciting and widely adopted open-source projects, Apache Spark in-memory clusters are driving new opportunities for application development as well as increased uptake of IT infrastructure. Apache Spark is now the most active Apache project, with more than 600 contributions made in the last 12 months by more than 200 organizations. A new Databricks survey of 1,417 IT professionals working with Apache Spark finds that high-performance analytics applications that can work with big data are driving a large proportion of that demand. Apache Spark is now being used to aggregate multiple types of data in-memory, rather than only pulling data from Hadoop. For solution providers, the Apache Spark technology stack is a significant player because it is one of the core technologies used to modernize data warehouses, a huge segment of the IT industry that accounts for multiple billions in revenue. Spark holds much promise for the future—with data lakes—a storage repo
Tags : 
     TIBCO
By: IBM     Published Date: Feb 13, 2015
Organizations of all sizes need help building clusters and grids to support compute- and data-intensive application workloads. Read how the Hartree Centre is building several high-performance computing clusters to support a variety of research projects.
Tags : 
     IBM
By: Intel     Published Date: Aug 06, 2014
Powering Big Data Workloads with Intel® Enterprise Edition for Lustre* software
The Intel® portfolio for high-performance computing provides the following technology solutions:
• Compute - The Intel® Xeon processor E7 family provides a leap forward for every discipline that depends on HPC, with industry-leading performance and improved performance per watt. Add Intel® Xeon Phi coprocessors to your clusters and workstations to increase performance for highly parallel applications and code segments. Each coprocessor can add over a teraflop of performance and is compatible with software written for the Intel® Xeon processor E7 family. You don’t need to rewrite code or master new development tools.
• Storage - High-performance, highly scalable storage solutions with Intel® Lustre and Intel® Xeon processor E7 based storage systems for centralized storage. Reliable and responsive local storage with Intel® Solid State Drives.
• Networking - Intel® True Scale Fabric and Networking technologies, built for HPC to deliver fast message rates and low latency.
• Software and Tools - A broad range of software and tools to optimize and parallelize your software and clusters.
Further, Intel Enterprise Edition for Lustre software is backed by Intel, the recognized technical support provider for Lustre, and includes 24/7 service level agreement (SLA) coverage.
Tags : 
     Intel
By: Kx Systems     Published Date: Jan 16, 2015
Kdb+ is a column-based relational database with extensive in-memory capabilities, developed and marketed by Kx Systems. Like all such products, it is especially powerful when it comes to supporting queries and analytics. However, unlike other products in this domain, kdb+ is particularly good (both in terms of performance and functionality) at processing, manipulating and analysing data (especially numeric data) in real time, alongside the analysis of historical data. Moreover, it has extensive capabilities for supporting time-series data. For these reasons Kx Systems has historically targeted the financial industry for trading analytics and black-box trading based on real-time and historic data, as well as real-time risk assessment; applications which are particularly demanding in their performance requirements. The company has had significant success in this market, with over a hundred major financial institutions and hedge funds deploying its technology. In this paper, however, we wa
Tags : kx systems, kdb+, relational database
     Kx Systems
By: Solix     Published Date: Aug 03, 2015
Every CIO wants to know whether their infrastructure will cope when data growth reaches 40 zettabytes by 2020. When data sets become too large, application performance slows and infrastructure struggles to keep up. Data growth drives increased cost and complexity everywhere, including power consumption, data center space, performance and availability. To find out more, download the Gartner study now.
Tags : 
     Solix
By: Hewlett Packard Enterprise     Published Date: Apr 20, 2018
In an innovation-powered economy, ideas need to travel at the speed of thought. Yet even as our ability to communicate across companies and time zones grows rapidly, people remain frustrated by downtime and unanticipated delays across the increasingly complex grid of cloud-based infrastructure, data networks, storage systems, and servers that power our work.
Tags : 
     Hewlett Packard Enterprise
By: Hewlett Packard Enterprise     Published Date: May 04, 2018
Managing infrastructure has always brought with it frustration, headaches and wasted time. That’s because IT professionals have to spend their days, nights and weekends dealing with problems that disrupt their applications and organization, while manually tuning their infrastructure. And the challenges increase as the number of applications, and reliance on infrastructure, continues to grow. Luckily, there is a better way. HPE InfoSight is artificial intelligence (AI) that predicts and prevents problems across the infrastructure stack and ensures optimal performance and efficient resource use.
Tags : 
     Hewlett Packard Enterprise
By: Butler Technologies     Published Date: Jul 02, 2018
As enterprises become more distributed and migrate applications to the cloud, they will need to diversify their network performance management solutions. In particular, the network operations team will need to complement its packet-based monitoring tools with active test monitoring solutions to enhance visibility into the cloud and help enterprises scale monitoring gracefully and cost-effectively.
Tags : 
     Butler Technologies
By: Turbonomic     Published Date: Jul 05, 2018
The hybrid cloud has been heralded as a promising IT operational model enabling enterprises to maintain security and control over the infrastructure on which their applications run. At the same time, it promises to maximize ROI from their local data center and leverage public cloud infrastructure for an occasional demand spike. Public clouds are relatively new in the IT landscape and their adoption has accelerated over the last few years with multiple vendors now offering solutions as well as improved on-ramps for workloads to ease the adoption of a hybrid cloud model. With these advances and the ability to choose between a local data center and multiple public cloud offerings, one fundamental question must still be answered: What, when and where to run workloads to assure performance while maximizing efficiency? In this whitepaper, we explore some of the players in Infrastructure-as-a-Service (IaaS) and hybrid cloud, the challenges surrounding effective implementation, and how to iden
Tags : 
     Turbonomic
By: Turbonomic     Published Date: Jul 05, 2018
Organizations are adopting cloud computing to accelerate service delivery. Some try to deliver cloud economies of scale in their private data centers with the mantra “automate everything,” a philosophy often simpler in theory than practice. Others have opted to leverage public cloud resources for the added benefit of the pay-as-you-go model, but are finding it difficult to keep costs in check. Regardless of approach, cloud technology poses the same challenge IT has faced for decades: how to assure application performance while minimizing costs.
Tags : 
     Turbonomic
By: Workday France     Published Date: Apr 30, 2018
Change is never easy, especially when it involves business applications. IT leaders know, however, that moving to the cloud brings gains in efficiency and speed, along with significant cost savings. But what makes cloud applications so different from those built on a traditional or hybrid architecture? Read this white paper from CIO.com to explore the different cloud architectures, how they influence application capabilities and performance, and what this means for migrating from your current systems.
Tags : 
     Workday France
By: Coyote Point Systems     Published Date: Aug 25, 2010
Application Delivery Controllers understand applications and optimize server performance - offloading compute-intensive tasks that prevent servers from quickly delivering applications. Learn how ADCs have taken over where load balancers left off.
Tags : coyote point, systems, slbs, server load balancers, adc, adcs, ssl, load balancing, load balancer, application delivery, application delivery controller, application delivery network, ssl acceleration, ssl offloading, server offload, server offloading, server acceleration, content switch, content switching
     Coyote Point Systems
By: ASG Software Solutions     Published Date: May 27, 2009
Application management requires visibility from multiple vantage points within the IT enterprise, combined with a centralized information store that pulls the technology pieces of the application puzzle into a coherent whole.
Tags : asg, cmdb, bsm, itil, configuration management, metadata, metacmdb, lob, sdm, service dependency mapping, ecommerce, bpm, workflow, itsm, critical application, cms
     ASG Software Solutions
By: ASG Software Solutions     Published Date: Apr 20, 2010
This paper can help you achieve successful legacy modernization projects. It presents practical steps for starting application modernization projects and describes the benefits of three high payback strategies. It also reviews the criteria for evaluating a variety of modernization tools.
Tags : asg, application modernization, clarity consulting, apo, application performance optimization, business intelligence
     ASG Software Solutions
By: ASG Software Solutions     Published Date: May 05, 2010
Read this topical and informative white paper from EMA Research and see how you can attain peace of mind that your end user's application performance will exceed expectations.
Tags : asg, end user, service assurance, ema, it infrastructure, service management
     ASG Software Solutions
By: Cisco EMEA Tier 3 ABM     Published Date: Mar 05, 2018
The competitive advantages and value of BDA are now widely acknowledged and have led to the shifting of focus at many firms from “if and when” to “where and how.” With BDA applications requiring more from IT infrastructures and lines of business demanding higher-quality insights in less time, choosing the right infrastructure platform for Big Data applications represents a core component of maximizing value. This IDC study considered the experiences of firms using Cisco UCS as an infrastructure platform for their BDA applications. The study found that Cisco UCS contributed to the strong value the firms are achieving with their business operations through scalability, performance, time to market, and cost effectiveness. As a result, these firms directly attributed business benefits to the manner in which Cisco UCS is deployed in the infrastructure.
Tags : big data, analytics, cisco, value, business, enterprise
     Cisco EMEA Tier 3 ABM
By: Hewlett Packard Enterprise     Published Date: May 11, 2018
The tipping point has arrived. Large enterprises are planning their next-generation datacenters around flash-based storage, and for good reason. Flash arrays provide read and write performance that is orders of magnitude faster than spinning media at a total cost of ownership that is on par with disk and will soon be lower. The benefits not only include improved application performance, but more consistent performance, lower latency, reduced storage footprint, streamlined storage administration, and lower operating costs. These advantages are too beneficial to your business to ignore. That’s why flash is becoming the standard for new storage investments.
Tags : 
     Hewlett Packard Enterprise
By: Hewlett Packard Enterprise     Published Date: Aug 15, 2016
The increasing demands of application and database workloads, growing numbers of virtual machines, and more powerful processors are driving demand for ever-faster storage systems. Increasingly, IT organizations are turning to solid-state storage to meet these demands, with hybrid and all-flash arrays taking the place of traditional disk storage for high performance workloads. Download this white paper to learn how you can get the most from your storage environment.
Tags : 
     Hewlett Packard Enterprise
By: Hewlett Packard Enterprise     Published Date: Feb 05, 2018
As businesses plunge into the digital future, no asset will have a greater impact on success than data. The ability to collect, harness, analyze, protect, and manage data will determine which businesses disrupt their industries, and which are disrupted; which businesses thrive, and which disappear. But traditional storage solutions are not designed to optimally handle such a critical business asset. Instead, businesses need to adopt an all-flash data center. In their new role as strategic business enablers, IT leaders have the responsibility to ensure that their businesses are protected, by investing in flexible, future-proof flash storage solutions. The right flash solution can deliver on critical business needs for agility, rapid growth, speed-to-market, data protection, application performance, and cost-effectiveness—while minimizing the maintenance and administration burden.
Tags : data, storage, decision makers, hpe
     Hewlett Packard Enterprise
By: Hewlett Packard Enterprise     Published Date: Feb 05, 2018
Applications are the engines that drive today’s digital businesses. When the infrastructure that powers those applications is difficult to administer, or fails, businesses and their IT organizations are severely impacted. Traditionally, IT assumed much of the responsibility to ensure availability and performance. In the digital era, however, the industry needs to evolve and reset the requirements on vendors.
Tags : financial, optimization, hpe, predictive, analytics
     Hewlett Packard Enterprise
By: Hewlett Packard Enterprise     Published Date: Mar 26, 2018
Business users expect immediate access to data, all the time and without interruption. But reality does not always meet expectations. IT leaders must constantly perform intricate forensic work to unravel the maze of issues that impact data delivery to applications. This performance gap between the data and the application creates a bottleneck that impacts productivity and ultimately damages a business’ ability to operate effectively. We term this the “app-data gap.”
Tags : 
     Hewlett Packard Enterprise
By: Akamai Technologies     Published Date: May 05, 2017
Web application and DDoS attacks hit enterprises without warning or reason. Most Distributed Denial of Service (DDoS) attacks require little skill to launch, as attackers can simply rent resources from DDoS-for-hire sites at a low cost. DDoS attacks typically result in:
• Operational disruption
• Loss of confidential data
• Lost user productivity
• Reputational harm
• Damage to partner and customer relations
• Lost revenue
Depending on your industry, that could add up to tens of thousands of dollars in damage, and in some cases millions. Only 2% of organizations said their web applications had not been compromised in the past 12 months; 98% said they had.
Tags : ddos, technical support, data security, application security
     Akamai Technologies

Get your white papers featured in the insideBIGDATA White Paper Library contact: Kevin@insideHPC.com