
By: NetApp     Published Date: Dec 13, 2013
FlexPod Select with Hadoop delivers enterprise-class Hadoop with validated, pre-configured components for fast deployment, higher reliability and smoother integration with existing applications and infrastructure. These technical reference architectures optimize storage, networking, and servers with the Cloudera and Hortonworks distributions of Hadoop. Leverage FlexPod Select with Hadoop to help store, manage, process and perform advanced analytics on your multi-structured data. Tuning parameters and optimization techniques, among other Hadoop cluster guidelines, are provided.
Tags : flexpod with hadoop, enterprise data, storage infrastructure, massive amounts of data
     NetApp
By: IBM     Published Date: Sep 02, 2014
Life Sciences organizations need to build IT infrastructures that are dynamic, scalable, and easy to deploy and manage, with simplified provisioning, high performance and high utilization, and the ability to exploit both data-intensive and server-intensive workloads, including Hadoop MapReduce. Solutions must scale, in terms of both processing and storage, to better serve the institution long-term. Managing the life cycle of data and making it usable for mainstream analyses and applications is an important aspect of system design. This presentation describes these IT requirements and how Technical Computing solutions from IBM and Platform Computing address the challenges and deliver greater ROI and accelerated time to results for Life Sciences.
Tags : 
     IBM
By: IBM     Published Date: Sep 02, 2014
Research teams using next-generation sequencing (NGS) technologies face the daunting challenge of supporting compute-intensive analysis methods against petabytes of data while simultaneously keeping pace with rapidly evolving algorithmic best practices. NGS users can now solve these challenges by deploying the Accelrys Enterprise Platform (AEP) and the NGS Collection on optimized systems from IBM. Learn how you can benefit from the turnkey IBM Application Ready Solution for Accelrys with supporting benchmark data.
Tags : ibm, accelrys, turnkey ngs solution
     IBM
By: IBM     Published Date: Sep 02, 2014
In today’s stringent financial services regulatory environment, with exponential data growth and dynamic business requirements, risk analytics has become integral to businesses. IBM Algorithmics provides sophisticated analyses for a wide range of economic scenarios that better quantify risk for multiple departments within a firm, or across the enterprise. With Algorithmics, firms have a better handle on their financial exposure and credit risks before they finalize real-time transactions. But this requires the performance and agility of a scalable infrastructure, which can drive up IT risk and complexity. The IBM Application Ready Solution for Algorithmics provides an agile, reliable and high-performance infrastructure to deliver trusted risk insights for sustained growth and profitability. This integrated offering with a validated reference architecture delivers the right risk insights at the right time while lowering total cost of ownership.
Tags : ibm, it risk, financial risk analytics
     IBM
By: IBM     Published Date: May 20, 2015
Join IBM and Nuance Communications Inc. to learn how Nuance uses IBM Elastic Storage to improve the power of their voice recognition applications by managing storage growth, cost and complexity while increasing performance and data availability. View the webcast to learn how you can:
· Lower data management costs through policy-driven automation and tiered storage management
· Manage and increase storage agility through software-defined storage
· Remove data-related bottlenecks to deliver application performance
Tags : 
     IBM
By: IBM     Published Date: Sep 16, 2015
Building applications for handling big data requires laser-like focus on solutions that allow you to deliver scalable, reliable and flexible infrastructure for fast-growing analytics environments. This paper provides 6 best practices for selecting the “right” infrastructure—one that is optimized for performance, flexibility and long-term value.
Tags : 
     IBM
By: TIBCO     Published Date: Nov 09, 2015
As one of the most exciting and widely adopted open-source projects, Apache Spark in-memory clusters are driving new opportunities for application development as well as increased uptake of IT infrastructure. Apache Spark is now the most active Apache project, with more than 600 contributions made in the last 12 months by more than 200 organizations. A new survey of 1,417 IT professionals working with Apache Spark, conducted by Databricks, finds that high-performance analytics applications that can work with big data are driving a large proportion of that demand. Apache Spark is now being used to aggregate multiple types of data in-memory rather than only pulling data from Hadoop (a brief sketch of this in-memory aggregation style follows this entry). For solution providers, the Apache Spark technology stack is a significant player because it is one of the core technologies used to modernize data warehouses, a huge segment of the IT industry that accounts for multiple billions in revenue. Spark holds much promise for the future—with data lakes—a storage repo
Tags : 
     TIBCO
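As mentioned in the entry above, Spark's appeal for analytics applications comes largely from caching data in memory and aggregating it there. The sketch below is a minimal, assumption-laden illustration, not material from the survey: it presumes a local PySpark installation and a hypothetical newline-delimited JSON file events.json with "source" and "bytes" fields.

```python
# Minimal sketch of Spark-style in-memory aggregation. "events.json"
# and its fields are hypothetical, invented for this illustration.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("inmem-agg").getOrCreate()

# Load semi-structured data once and cache it in memory, so repeated
# analytic queries avoid re-reading from disk or HDFS.
events = spark.read.json("events.json").cache()

# One in-memory pass aggregates events from heterogeneous sources.
summary = (events.groupBy("source")
                 .agg(F.count("*").alias("events"),
                      F.sum("bytes").alias("total_bytes")))
summary.show()
spark.stop()
```

The .cache() call is the operative piece: subsequent queries against the same DataFrame are served from cluster memory rather than by pulling the data from Hadoop again.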
By: Impetus     Published Date: Feb 04, 2016
This white paper explores strategies to leverage the steady flow of new, advanced real-time streaming data analytics (RTSA) application development technologies. It defines a thoughtful approach to capitalize on the window of opportunity to benefit from the power of real-time decision making now, and still be able to move to new and emerging technologies as they become enterprise ready.
Tags : 
     Impetus
By: Impetus     Published Date: Mar 15, 2016
Streaming analytics platforms give businesses a way to extract strategic value from data-in-motion, much as traditional analytics tools operate on data-at-rest. Instead of historical analysis, the goal of streaming analytics is to enable near real-time decision making by letting companies inspect, correlate and analyze data even as it flows into applications and databases from numerous different sources. Streaming analytics allows companies to do event processing against massive volumes of data streaming into the enterprise at high velocity; a minimal sketch of this windowed event-processing idea follows this entry.
Tags : impetus, guide to stream analytics, real time streaming analytics, streaming analytics, real time analytics, big data analytics
     Impetus
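As a vendor-neutral illustration of the event-processing idea described above, here is a minimal Python sketch of tumbling-window aggregation over an event stream. The window length and the synthetic (timestamp, key) events are invented for the example; real platforms add distribution, fault tolerance and out-of-order handling on top of this core loop.

```python
# Toy tumbling-window event processor: counts events per key as they
# arrive, emitting a result each time the window closes.
from collections import Counter

WINDOW_SECONDS = 5  # hypothetical window length

def process_stream(events):
    """events: iterable of (timestamp, key) pairs in arrival order."""
    window_start = None
    counts = Counter()
    for ts, key in events:
        if window_start is None:
            window_start = ts
        if ts - window_start >= WINDOW_SECONDS:
            # Window closed: emit a near-real-time result, then reset.
            yield window_start, dict(counts)
            window_start, counts = ts, Counter()
        counts[key] += 1
    if counts:
        yield window_start, dict(counts)

# Synthetic stream of (timestamp, event-type) pairs for demonstration.
stream = [(t, "login" if t % 3 else "purchase") for t in range(12)]
for start, result in process_stream(stream):
    print(f"window starting {start}: {result}")
```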
By: RedPoint     Published Date: Sep 22, 2014
The emergence of YARN for the Hadoop 2.0 platform has opened the door to new tools and applications that promise to allow more companies to reap the benefits of big data in ways never before possible, with outcomes possibly never imagined. By separating the problem of cluster resource management from the data processing function, YARN offers a world beyond MapReduce: less encumbered by complex programming protocols, faster, and at a lower cost. Some of the topics discussed in this paper include:
• Why is YARN important for realizing the power of Hadoop for data integration, quality and management?
• Benchmark results of MapReduce vs. Pig vs. visual “data flow” design tools
• The 3 key features of YARN that solve the complex problems that prohibit businesses from gaining maximum benefit from Hadoop.
Download this paper to learn why the power of Hadoop 2.0 lies in enabling applications to run inside Hadoop, without the constraints of MapReduce.
Tags : 
     RedPoint
By: MEMSQL     Published Date: Apr 12, 2016
The pace of data is not slowing. Today’s applications are built with infinite data sets in mind. As these real-time applications become the norm and batch processing becomes a relic of the past, digital enterprises will implement memory-optimized, distributed data systems to simplify Lambda Architectures for real-time data processing and exploration (a toy sketch of that simplification follows this entry).
Tags : 
     MEMSQL
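To make the Lambda-simplification claim above concrete, the toy sketch below contrasts a classic Lambda read, which must merge a precomputed batch view with a speed-layer view, against a single memory-optimized store. All page-view numbers are invented for illustration.

```python
# Toy contrast between a Lambda Architecture query and a single
# memory-optimized store. Data is fabricated for the example.
from collections import Counter

batch_view = Counter({"page_a": 1000, "page_b": 750})  # precomputed nightly
speed_view = Counter({"page_a": 12, "page_c": 4})      # events since that batch

def lambda_query(page: str) -> int:
    # Classic Lambda: every read merges the batch and speed layers.
    return batch_view[page] + speed_view[page]

# Memory-optimized alternative: one live, continuously updated table,
# so the merge logic (and the dual pipeline behind it) disappears.
unified_store = batch_view + speed_view

assert lambda_query("page_a") == unified_store["page_a"] == 1012
print(unified_store["page_c"])  # 4: real-time data with no merge step
```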
By: IBM     Published Date: Nov 14, 2014
Platform Symphony is an enterprise-class server platform that delivers low-latency, scaled-out MapReduce workloads. It supports multiple applications running concurrently, so organizations can increase utilization across all resources, resulting in a high return on investment.
Tags : 
     IBM
By: IBM     Published Date: Nov 14, 2014
View this series of short webcasts to learn how IBM Platform Computing products can help you maximize the agility of your distributed computing environment by improving operational efficiency, simplifying the user experience, optimizing application usage and license sharing, addressing spikes in infrastructure demand, and reducing data management costs.
Tags : 
     IBM
By: IBM     Published Date: Feb 13, 2015
Software-defined storage is enterprise-class storage that uses standard hardware, with all the important storage and management functions performed in intelligent software. Software-defined storage delivers automated, policy-driven, application-aware storage services through orchestration of the underlying storage infrastructure in support of an overall software-defined environment; a small sketch of policy-driven placement follows this entry.
Tags : 
     IBM
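As a rough sketch of what "policy-driven, application-aware" placement can mean in practice: software inspects the application's class and picks a tier on standard hardware. The policy table, application classes and replica counts below are hypothetical and do not reflect any IBM product API.

```python
# Hypothetical policy table and placement function, for illustration only.
POLICIES = {
    "database":  {"tier": "ssd",           "replicas": 3},
    "analytics": {"tier": "hybrid",        "replicas": 2},
    "archive":   {"tier": "capacity-disk", "replicas": 2},
}

def place_volume(app_class: str, size_gb: int) -> dict:
    # "Application-aware": the workload class drives the choice;
    # "policy-driven": the choice lives in software, not in the hardware.
    policy = POLICIES.get(app_class, POLICIES["archive"])
    return {"size_gb": size_gb, **policy}

print(place_volume("database", 500))
# {'size_gb': 500, 'tier': 'ssd', 'replicas': 3}
```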
By: IBM     Published Date: Feb 13, 2015
IBM® has created a proprietary implementation of the open-source Hadoop MapReduce run-time that leverages the IBM Platform™ Symphony distributed computing middleware while maintaining application-level compatibility with Apache Hadoop.
Tags : 
     IBM
By: IBM     Published Date: Feb 13, 2015
Organizations of all sizes need help building clusters and grids to support compute- and data-intensive application workloads. Read how the Hartree Centre is building several high-performance computing clusters to support a variety of research projects.
Tags : 
     IBM
By: Intel     Published Date: Aug 06, 2014
Powering Big Data Workloads with Intel® Enterprise Edition for Lustre* software. The Intel® portfolio for high-performance computing provides the following technology solutions:
• Compute - The Intel® Xeon processor E7 family provides a leap forward for every discipline that depends on HPC, with industry-leading performance and improved performance per watt. Add Intel® Xeon Phi coprocessors to your clusters and workstations to increase performance for highly parallel applications and code segments. Each coprocessor can add over a teraflop of performance and is compatible with software written for the Intel® Xeon processor E7 family. You don’t need to rewrite code or master new development tools.
• Storage - High-performance, highly scalable storage solutions with Intel® Lustre and Intel® Xeon processor E7 based storage systems for centralized storage, plus reliable and responsive local storage with Intel® Solid State Drives.
• Networking - Intel® True Scale Fabric and Networking technologies, built for HPC to deliver fast message rates and low latency.
• Software and Tools - A broad range of software and tools to optimize and parallelize your software and clusters.
Further, Intel Enterprise Edition for Lustre software is backed by Intel, the recognized technical support provider for Lustre, and includes 24/7 service level agreement (SLA) coverage.
Tags : 
     Intel
By: GridGain     Published Date: Mar 10, 2015
Software as a Service (SaaS) is a software distribution model in which applications are hosted by a vendor or service provider and made available to customers over the Internet. Instead of companies installing software on their own servers, known as the on-premises distribution model, application software providers host the software in the cloud and charge customers either according to the time they spend using the software or a flat monthly or annual fee (a toy comparison of the two charging models follows this entry). SaaS is becoming increasingly popular, and as the industry develops, more and more companies are dropping older business models in favor of this rapidly evolving methodology.
Tags : gridgain, saas, saas perfomance and scalability, in memory computing, data fabric, paas for saas, data grid, real-time streaming, hadoop
     GridGain
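The two charging models described above, usage-based versus subscription, reduce to simple arithmetic; the rates in the sketch below are invented for illustration.

```python
# Toy comparison of SaaS charging models with invented rates:
# $2.00 per hour of use vs. a $99 flat monthly fee.
def usage_based_cost(hours_used: float, rate_per_hour: float) -> float:
    return hours_used * rate_per_hour

def subscription_cost(months: int, monthly_fee: float) -> float:
    return months * monthly_fee

# A team that uses the application 40 hours in a month:
print(usage_based_cost(40, 2.00))   # 80.0  (usage pricing is cheaper here)
print(subscription_cost(1, 99.00))  # 99.0
```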
By: Kx Systems     Published Date: Jan 16, 2015
Kdb+ is a column-based relational database with extensive in-memory capabilities, developed and marketed by Kx Systems. Like all such products, it is especially powerful when it comes to supporting queries and analytics. However, unlike other products in this domain, kdb+ is particularly good, in terms of both performance and functionality, at processing, manipulating and analysing data (especially numeric data) in real time, alongside the analysis of historical data. Moreover, it has extensive capabilities for supporting time-series data (a language-neutral sketch of this columnar, vectorized style follows this entry). For these reasons Kx Systems has historically targeted the financial industry for trading analytics and black-box trading based on real-time and historic data, as well as real-time risk assessment; applications which are particularly demanding in their performance requirements. The company has had significant success in this market, with over a hundred major financial institutions and hedge funds deploying its technology. In this paper, however, we wa
Tags : kx systems, kdb+, relational database
     Kx Systems
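Kdb+ itself is queried in q, so the sketch below is only a language-neutral illustration of the columnar, vectorized style the entry describes: whole-column operations on in-memory numeric arrays, here computing a volume-weighted average price (VWAP) per symbol. The trade data is hypothetical.

```python
# Columnar, in-memory time-series analytics in NumPy, illustrating the
# style (not the syntax) of kdb+/q. Trade data is invented.
import numpy as np

# Columns of a trade table held entirely in memory.
symbol = np.array(["AAPL", "IBM", "AAPL", "IBM", "AAPL"])
price  = np.array([190.1, 142.3, 190.4, 142.0, 190.2])
size   = np.array([100,   200,   300,   150,   250])

# Vectorized per-symbol VWAP: operate on whole columns at once rather
# than iterating row by row, which is the key to columnar performance.
for sym in np.unique(symbol):
    mask = symbol == sym
    vwap = np.average(price[mask], weights=size[mask])
    print(f"{sym}: VWAP = {vwap:.2f}")
```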
By: Solix     Published Date: Aug 03, 2015
Every CIO wants to know whether their infrastructure will cope when data growth reaches 40 zettabytes by 2020. When data sets become too large, application performance slows and infrastructure struggles to keep up. Data growth drives increased cost and complexity everywhere, including power consumption, data center space, performance and availability. To find out more, download the Gartner study now.
Tags : 
     Solix
By: snowflake     Published Date: Jun 09, 2016
Today’s data, and how that data is used, have changed dramatically in the past few years. Data now comes from everywhere—not just enterprise applications, but also websites, log files, social media, sensors, web services, and more. Organizations want to make that data available to all of their analysts as quickly as possible, not limit access to only a few highly skilled data scientists. However, these efforts are quickly frustrated by the limitations of current data warehouse technologies. These systems simply were not built to handle the diversity of today’s data and analytics. They are based on decades-old architectures designed for a different world, a world where data was limited, users of data were few, and all processing was done in on-premises datacenters.
Tags : 
     snowflake
By: Adaptive Computing     Published Date: Feb 27, 2014
Big data applications represent a fast-growing category of high-value applications that are increasingly employed by business and technical computing users. However, they have exposed an inconvenient dichotomy in the way resources are utilized in data centers. Conventional enterprise and web-based applications can be executed efficiently in virtualized server environments, where resource management and scheduling is generally confined to a single server. By contrast, data-intensive analytics and technical simulations demand large aggregated resources, necessitating intelligent scheduling and resource management that spans a computer cluster, cloud, or entire data center. Although these tools exist in isolation, they are not available in a general-purpose framework that allows them to interoperate easily and automatically within existing IT infrastructure.
Tags : 
     Adaptive Computing
By: Akamai Technologies     Published Date: Oct 02, 2018
Ponemon Institute’s Asia-Pacific report details the prevalence of and consequences associated with web application attacks and denial of service (DoS) attacks. More than 500 IT and IT security professionals in Asia-Pacific shared the experiences of their organizations with these types of cyberattacks. The report provides a clear breakdown of specific costs by category for web application attacks and DoS. You’ll also see the security technologies the organizations are using to try to stop DDoS attacks and web application attacks, the rated effectiveness of each technology, and the barriers organizations in Asia-Pacific face in achieving effective protection.
Tags : 
     Akamai Technologies
By: Akamai Technologies     Published Date: Oct 02, 2018
Independent technology research firm Forrester evaluated web application firewall (WAF) vendors and published the results in The Forrester Wave™: Web Application Firewalls, Q2 2018. Akamai Technologies emerged as one of the leaders after a comprehensive evaluation on 33 criteria. The report states that security pros require a WAF that will automatically protect web applications, stay ahead of zero-day attacks and protect new application formats including APIs and serverless architectures. The report also reveals detailed findings for the 10 most significant WAF vendors. Akamai’s Kona Site Defender was the top scorer in the zero-day attacks criterion and one of the select vendors rated a Leader, the highest-ranking level in the report.
Tags : 
     Akamai Technologies