applications

Results 1 - 25 of 3788
By: NetApp     Published Date: Dec 13, 2013
FlexPod Select with Hadoop delivers enterprise-class Hadoop with validated, pre-configured components for fast deployment, higher reliability, and smoother integration with existing applications and infrastructure. These technical reference architectures optimize storage, networking, and servers with the Cloudera and Hortonworks distributions of Hadoop. Leverage FlexPod Select with Hadoop to help store, manage, process, and perform advanced analytics on your multi-structured data. Tuning parameters, optimization techniques, and other Hadoop cluster guidelines are also provided.
Tags : flexpod with hadoop, enterprise data, storage infrastructure, massive amounts of data
     NetApp
By: IBM     Published Date: Sep 02, 2014
Life Sciences organizations need to build IT infrastructures that are dynamic, scalable, and easy to deploy and manage, with simplified provisioning, high performance, and high utilization, and that can exploit both data-intensive and server-intensive workloads, including Hadoop MapReduce. Solutions must scale, in terms of both processing and storage, to better serve the institution long-term. Managing data across its life cycle and making it usable for mainstream analyses and applications is an important aspect of system design. This presentation describes IT requirements and how Technical Computing solutions from IBM and Platform Computing address these challenges and deliver greater ROI and accelerated time to results for Life Sciences.
Tags : 
     IBM
By: IBM     Published Date: May 20, 2015
Join IBM and Nuance Communications Inc. to learn how Nuance uses IBM Elastic Storage to improve the power of its voice recognition applications by managing storage growth, cost, and complexity while increasing performance and data availability. View the webcast to learn how you can:
· Lower data management costs through policy-driven automation and tiered storage management (a minimal tiering sketch follows this entry)
· Manage and increase storage agility through software-defined storage
· Remove data-related bottlenecks to deliver application performance
Tags : 
     IBM
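The tiering idea in the entry above can be made concrete. Below is a minimal Python sketch of policy-driven tiered storage, demoting files that have not been accessed recently from a fast tier to a cheaper one; the directory names and 30-day threshold are assumptions for illustration, and IBM Elastic Storage expresses such policies in its own rule language rather than in Python.

```python
import os
import shutil
import time

# Hypothetical tiers and threshold: these names are assumptions,
# not part of IBM Elastic Storage itself.
HOT_TIER = "/data/hot"
COLD_TIER = "/data/cold"
MAX_AGE_SECONDS = 30 * 24 * 3600  # demote files untouched for 30 days

def demote_cold_files():
    """Move files not accessed recently from the hot tier to the cold tier."""
    now = time.time()
    for name in os.listdir(HOT_TIER):
        path = os.path.join(HOT_TIER, name)
        if os.path.isfile(path) and now - os.path.getatime(path) > MAX_AGE_SECONDS:
            shutil.move(path, os.path.join(COLD_TIER, name))

if __name__ == "__main__":
    demote_cold_files()
```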
By: IBM     Published Date: Sep 16, 2015
Building applications for handling big data requires laser-like focus on solutions that allow you to deliver scalable, reliable and flexible infrastructure for fast-growing analytics environments. This paper provides 6 best practices for selecting the “right” infrastructure—one that is optimized for performance, flexibility and long-term value.
Tags : 
     IBM
By: TIBCO     Published Date: Nov 09, 2015
Apache Spark is one of the most exciting and widely adopted open-source projects, and Spark in-memory clusters are driving new opportunities for application development as well as increased demand for IT infrastructure. Spark is now the most active Apache project, with more than 600 contributions made in the last 12 months by more than 200 organizations. A new Databricks survey of 1,417 IT professionals working with Apache Spark finds that high-performance analytics applications that can work with big data are driving a large proportion of that demand. Spark is now being used to aggregate multiple types of data in memory, rather than only pulling data from Hadoop (a minimal aggregation sketch follows this entry). For solution providers, the Spark technology stack is significant because it is one of the core technologies used to modernize data warehouses, a segment of the IT industry that accounts for billions in revenue. Spark also holds much promise for the future, particularly around data lakes as a storage repository.
Tags : 
     TIBCO
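As a hedged illustration of the in-memory aggregation described above, here is a minimal PySpark sketch that joins two differently formatted sources and aggregates the result in memory; the file names and columns are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("multi-source-agg").getOrCreate()

# Hypothetical inputs: a JSON event feed and a CSV reference table.
events = spark.read.json("events.json")  # columns: user_id, amount
users = spark.read.csv("users.csv", header=True, inferSchema=True)  # columns: user_id, region

# Join the two sources and keep the result cached in memory across actions.
joined = events.join(users, "user_id").cache()

# Aggregate entirely in memory, with no separate MapReduce stage involved.
totals = joined.groupBy("region").agg(F.sum("amount").alias("total_amount"))
totals.show()
```

Caching the joined DataFrame keeps subsequent aggregations in memory rather than re-reading the underlying files.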
By: Impetus     Published Date: Mar 15, 2016
Streaming analytics platforms give businesses a way to extract strategic value from data in motion, much as traditional analytics tools operate on data at rest. Instead of historical analysis, the goal of streaming analytics is to enable near real-time decision making by letting companies inspect, correlate, and analyze data even as it flows into applications and databases from numerous sources. Streaming analytics allows companies to do event processing against massive volumes of data streaming into the enterprise at high velocity (a minimal windowed-processing sketch follows this entry).
Tags : impetus, guide to stream analytics, real time streaming analytics, streaming analytics, real time analytics, big data analytics
     Impetus
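To make "analyzing data even as it flows" concrete, here is a minimal, dependency-free Python sketch of sliding-window event processing; the five-second window and the numeric events are assumptions for illustration, not any particular vendor's API.

```python
import time
from collections import deque

WINDOW_SECONDS = 5.0  # assumed sliding-window length

class SlidingWindowAverage:
    """Maintain a running average over events seen in the last few seconds."""

    def __init__(self, window=WINDOW_SECONDS):
        self.window = window
        self.events = deque()  # (timestamp, value) pairs

    def add(self, value, now=None):
        now = time.time() if now is None else now
        self.events.append((now, value))
        # Evict events that have slid out of the window.
        while self.events and now - self.events[0][0] > self.window:
            self.events.popleft()

    def average(self):
        if not self.events:
            return None
        return sum(v for _, v in self.events) / len(self.events)

# Usage: feed values in as they arrive, query the aggregate at any time.
w = SlidingWindowAverage()
for v in (10, 12, 11):
    w.add(v)
print(w.average())  # 11.0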
By: RedPoint     Published Date: Sep 22, 2014
The emergence of YARN for the Hadoop 2.0 platform has opened the door to new tools and applications that promise to let more companies reap the benefits of big data in ways never before possible, with outcomes never imagined. By separating cluster resource management from the data processing function, YARN offers a world beyond MapReduce: less encumbered by complex programming protocols, faster, and at a lower cost. Topics discussed in this paper include:
• Why is YARN important for realizing the power of Hadoop for data integration, quality, and management?
• Benchmark results of MapReduce vs. Pig vs. visual "data flow" design tools
• The three key features of YARN that solve the complex problems preventing businesses from gaining maximum benefit from Hadoop
Download this paper to learn why the power of Hadoop 2.0 lies in enabling applications to run inside Hadoop, without the constraints of MapReduce.
Tags : 
     RedPoint
By: MEMSQL     Published Date: Apr 12, 2016
The pace of data is not slowing. Applications of today are built with infinite data sets in mind. As these real-time applications become the norm, and batch processing becomes a relic of the past, digital enterprises will implement memory-optimized, distributed data systems to simplify Lambda Architectures for real-time data processing and exploration.
Tags : 
     MEMSQL
By: IBM     Published Date: Nov 14, 2014
Platform Symphony is an enterprise-class server platform that runs low-latency, scaled-out MapReduce workloads. It supports multiple applications running concurrently so that organizations can increase utilization across all resources, resulting in a high return on investment.
Tags : 
     IBM
By: Intel     Published Date: Aug 06, 2014
Powering Big Data Workloads with Intel® Enterprise Edition for Lustre* software. The Intel® portfolio for high-performance computing provides the following technology solutions:
• Compute: The Intel® Xeon processor E7 family provides a leap forward for every discipline that depends on HPC, with industry-leading performance and improved performance per watt. Add Intel® Xeon Phi coprocessors to your clusters and workstations to increase performance for highly parallel applications and code segments. Each coprocessor can add over a teraflop of performance and is compatible with software written for the Intel® Xeon processor E7 family, so you don't need to rewrite code or master new development tools.
• Storage: High-performance, highly scalable storage solutions with Intel® Enterprise Edition for Lustre software and Intel® Xeon processor E7 based storage systems for centralized storage, plus reliable and responsive local storage with Intel® Solid State Drives.
• Networking: Intel® True Scale Fabric and networking technologies, built for HPC to deliver fast message rates and low latency.
• Software and tools: A broad range of software and tools to optimize and parallelize your software and clusters.
Further, Intel Enterprise Edition for Lustre software is backed by Intel, the recognized technical support provider for Lustre, and includes 24/7 service level agreement (SLA) coverage.
Tags : 
     Intel
By: GridGain     Published Date: Mar 10, 2015
Software as a Service (SaaS) is a software distribution model in which applications are hosted by a vendor or service provider and made available to customers over the Internet. Instead of companies installing software on their own servers, known as the on-premises distribution model, application software providers host the software in the cloud and charge customers according to the time they spend using the software, or based on a monthly or annual fee. SaaS is becoming increasingly popular, and as the industry develops, more and more companies are dropping older business models in favor of this rapidly evolving methodology.
Tags : gridgain, saas, saas perfomance and scalability, in memory computing, data fabric, paas for saas, data grid, real-time streaming, hadoop
     GridGain
By: Kx Systems     Published Date: Jan 16, 2015
Kdb+ is a column-based relational database with extensive in-memory capabilities, developed and marketed by Kx Systems. Like all such products, it is especially powerful when it comes to supporting queries and analytics. However, unlike other products in this domain, kdb+ is particularly good, in terms of both performance and functionality, at processing, manipulating, and analysing data (especially numeric data) in real time, alongside the analysis of historical data. Moreover, it has extensive capabilities for supporting time-series data (an as-of-join sketch follows this entry). For these reasons Kx Systems has historically targeted the financial industry for trading analytics and black-box trading based on real-time and historic data, as well as real-time risk assessment, applications which are particularly demanding in their performance requirements. The company has had significant success in this market, with over a hundred major financial institutions and hedge funds deploying its technology. In this paper, however, we wa…
Tags : kx systems, kdb+, relational database
     Kx Systems
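kdb+ itself is programmed in q, which is beyond the scope of this listing, but its signature time-series operation, the as-of join (kdb+'s aj), can be approximated in Python with pandas.merge_asof; the trade and quote tables below are hypothetical.

```python
import pandas as pd

# Hypothetical trade and quote tables, each sorted by timestamp.
trades = pd.DataFrame({
    "time": pd.to_datetime(["09:30:01", "09:30:05", "09:30:09"]),
    "sym": ["AAPL", "AAPL", "AAPL"],
    "price": [189.10, 189.25, 189.05],
})
quotes = pd.DataFrame({
    "time": pd.to_datetime(["09:30:00", "09:30:04", "09:30:08"]),
    "sym": ["AAPL", "AAPL", "AAPL"],
    "bid": [189.00, 189.20, 189.00],
})

# As-of join: attach to each trade the most recent quote at or before it,
# analogous to kdb+'s aj operator.
enriched = pd.merge_asof(trades, quotes, on="time", by="sym")
print(enriched)
```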
By: snowflake     Published Date: Jun 09, 2016
Today’s data, and how that data is used, have changed dramatically in the past few years. Data now comes from everywhere—not just enterprise applications, but also websites, log files, social media, sensors, web services, and more. Organizations want to make that data available to all of their analysts as quickly as possible, not limit access to only a few highly skilled data scientists. However, these efforts are quickly frustrated by the limitations of current data warehouse technologies. These systems simply were not built to handle the diversity of today’s data and analytics. They are based on decades-old architectures designed for a different world, a world where data was limited, users of data were few, and all processing was done in on-premises datacenters.
Tags : 
     snowflake
By: Adaptive Computing     Published Date: Feb 27, 2014
Big data applications represent a fast-growing category of high-value applications that are increasingly employed by business and technical computing users. However, they have exposed an inconvenient dichotomy in the way resources are utilized in data centers. Conventional enterprise and web-based applications can be executed efficiently in virtualized server environments, where resource management and scheduling is generally confined to a single server. By contrast, data-intensive analytics and technical simulations demand large aggregated resources, necessitating intelligent scheduling and resource management that spans a computer cluster, cloud, or entire data center. Although these tools exist in isolation, they are not available in a general-purpose framework that allows them to interoperate easily and automatically within existing IT infrastructure.
Tags : 
     Adaptive Computing
By: Hewlett Packard Enterprise     Published Date: Jan 31, 2019
Most companies moving into the public cloud today are making strategic decisions about which applications should go to the cloud and which should stay on-premises. Get acquainted with hybrid cloud management strategies and solutions, and learn what critical components must be addressed as you plan your hybrid cloud environment.
Tags : 
     Hewlett Packard Enterprise
By: Hewlett Packard Enterprise     Published Date: Jan 31, 2019
The bar for success is rising in higher education.  University leaders and IT administrators are aware of the compelling benefits of digital transformation overall—and artificial intelligence (AI) in particular. AI can amplify human capabilities by using machine learning, or deep learning, to convert the fast-growing and plentiful sources of data about all aspects of a university into actionable insights that drive better decisions. But when planning a transformational strategy, these leaders must prioritize operational continuity. It’s critical to protect the everyday activities of learning, research, and administration that rely on the IT infrastructure to consistently deliver data to its applications.
Tags : 
     Hewlett Packard Enterprise
By: TIBCO Software EMEA     Published Date: Jan 17, 2019
In an industry driven to deliver alpha, where might financial services firms find opportunities when investing in application innovation? The answer is data. Every financial services firm understands the importance of data. More is better. Sooner is better. Accessing it, understanding it, and taking advantage of it before the competition is better. That’s how data delivers alpha.
Tags : virtualization, data, client, firms, application, liquidity, access, management, applications
     TIBCO Software EMEA
By: TIBCO Software GmbH     Published Date: Jan 15, 2019
Enterprises use data virtualization software such as TIBCO® Data Virtualization to reduce data bottlenecks so more insights can be delivered for better business outcomes. For developers, data virtualization allows applications to access and use data without needing to know its technical details, such as how it is formatted or where it is physically located (a sketch of the pattern follows this entry). It also helps developers rapidly create reusable data services that access and transform data and deliver data analytics, with even heavy-lifting reads completed quickly, securely, and with high performance. These data services can then be coalesced into a common data layer that can support a wide range of analytics and application use cases. Data engineers and analytics development teams are big users of data virtualization, with Gartner predicting over 50% of these teams will adopt the technology by 202…
Tags : 
     TIBCO Software GmbH
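The developer-facing idea above, accessing data without knowing its format or location, can be sketched as a tiny logical-to-physical mapping layer. This is only an illustration of the pattern, not TIBCO Data Virtualization's actual API; the source names and files are hypothetical.

```python
import csv
import json

def _read_csv(path):
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def _read_json(path):
    with open(path) as f:
        return json.load(f)

# The "virtual layer": logical dataset names mapped to physical sources
# and format-specific readers. Applications never see these details.
SOURCES = {
    "customers": ("customers.csv", _read_csv),  # hypothetical files
    "orders": ("orders.json", _read_json),
}

def get_data(logical_name):
    """Applications request a logical dataset; format and location stay hidden."""
    path, reader = SOURCES[logical_name]
    return reader(path)

# An application consumes "customers" without knowing it lives in a CSV file.
rows = get_data("customers")
```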
By: Intapp     Published Date: Jan 09, 2019
Intapp Time provides superior business intelligence that changes your firm's fundamental relationship with time. This unified suite of applications gives timekeepers access to time data and capture wherever they are: in the office, on a mobile device, online and offline. It is user-centric, offering a completely automated option while fully supporting hands-on tracking, contemporaneous or reconstructionist. Intapp Time helps your business mine time data to reveal new sources of revenue, inform staff decisions, increase project efficiency, and reduce time leakage.
Tags : business, business intelligence, time, tax, time for tax, intapp, applications, time data, automation, reporting, timekeeping, audit, accounting, consulting, professional services, active time capture, passive time capture, time tracking
     Intapp
By: Dell SB     Published Date: Jan 24, 2019
Dell Precision delivers versatile designs, top performance, and reliability to conquer the industry's most demanding applications. From award-winning filmmakers and animators to architects and engineers, our expansive portfolio enables you to customize the workstation for your creative expertise.
Tags : 
     Dell SB
By: Rackspace     Published Date: Feb 01, 2019
Rackspace Quick Start for Google Cloud Platform helps enterprises expedite their migration to Google Cloud using proven design, automation, and migration methodologies, all executed by Rackspace experts who have deployed more than a million applications into the cloud. By partnering with your company's cross-functional leaders, our professional adoption team will fast-track your journey to the cloud, typically moving your first application(s) to the cloud within the first few weeks of the program. An annual review includes assistance with a disaster recovery (DR) simulation, an audit of patch levels, and upleveling of the deployment tools to ensure they align with infrastructure that may have evolved since deployment.
Tags : 
     Rackspace
By: Rackspace     Published Date: Feb 01, 2019
Whether you're already a Google customer or simply getting started with the public cloud, Google Cloud Platform (GCP) is an affordable, reliable, innovative, and intuitive cloud solution. Rackspace can help you accelerate innovation and cost savings by taking over the intensive day-to-day operations of GCP, letting you focus on achieving your core business objectives while optimizing the performance of your applications. Rackspace works with customers to identify the scope and criticality of their applications and determine the service level that best addresses their needs. To discover how, download this whitepaper today.
Tags : 
     Rackspace
By: Lenovo - APAC     Published Date: Feb 11, 2019
Asian ICT infrastructure investment is exploding as businesses review and modernise their data-centre architectures to keep up with the service demands of a growing and increasingly sophisticated population. Demand for cloud services, particularly to support big-data analytics initiatives, is driving this trend. Frost & Sullivan, for example, believes the Asia-Pacific cloud computing market will grow at 28.4 percent annually through 2022. Despite this growth, many businesses are also rapidly realising that public cloud is not the best solution for every need, as it does not always offer the same level of visibility, performance, and control as on-premises infrastructure. This reality is pushing many companies towards the middle ground of hybrid IT, in which applications and infrastructure are distributed across public cloud and self-managed data-centre infrastructure. Read about medical company Mutoh and how it took advantage of the latest technology.
Tags : lenovodcg, nutanix, hyperconvergedinfrastructure, hci
     Lenovo - APAC
By: Stratoscale     Published Date: Feb 01, 2019
This eBook offers a practical, hands-on guide to analyzing and mitigating the risks of migrating to PostgreSQL. With the ongoing shift towards open-source database solutions, it's no surprise that PostgreSQL is the fastest-growing database. While it's tempting to simply compare the licensing costs of proprietary systems against those of open source, that is a misleading and incorrect approach to evaluating the potential return on investment of a database technology migration. A key decision criterion for adopting any technology is whether it can support requirements for existing applications while also fitting into longer-term strategies and needs. The first section of this eBook provides a detailed analysis of all aspects of migrating from legacy and commercial solutions to PostgreSQL (a small data-validation sketch follows this entry):
• Schema and code migration
• Data migration
• Application code migration
• Testing and evaluation
Tags : 
     Stratoscale
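One concrete slice of the data-migration step above is verifying that row counts match after the move. Here is a minimal sketch using psycopg2; the connection strings and table names are placeholders, not values from the eBook.

```python
import psycopg2

# Placeholder connection strings and table list; substitute your own.
SOURCE_DSN = "host=legacy-db dbname=app user=migrator"
TARGET_DSN = "host=pg-db dbname=app user=migrator"
TABLES = ["customers", "orders"]

def row_count(dsn, table):
    """Return the row count of a table on the given database."""
    with psycopg2.connect(dsn) as conn:
        with conn.cursor() as cur:
            # Table names come from our own trusted list above,
            # so interpolation is acceptable in this sketch.
            cur.execute(f"SELECT count(*) FROM {table}")
            return cur.fetchone()[0]

for table in TABLES:
    src, tgt = row_count(SOURCE_DSN, table), row_count(TARGET_DSN, table)
    status = "OK" if src == tgt else "MISMATCH"
    print(f"{table}: source={src} target={tgt} {status}")
```

Row counts are only a first check; a fuller validation would also compare checksums or sampled rows per table.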
By: Oracle     Published Date: Jan 08, 2019
No matter what your organization size, emerging technologies will impact you. Choosing the right path is a critical decision. Oracle can help automotive companies with a powerful combination of cloud technology and business applications.
Tags : 
     Oracle