applications

Results 1 - 25 of 4012
By: NetApp     Published Date: Dec 13, 2013
FlexPod Select with Hadoop delivers enterprise-class Hadoop with validated, pre-configured components for fast deployment, higher reliability, and smoother integration with existing applications and infrastructure. These technical reference architectures optimize storage, networking, and servers with the Cloudera and Hortonworks distributions of Hadoop. Leverage FlexPod Select with Hadoop to help store, manage, process, and perform advanced analytics on your multi-structured data. Tuning parameters, optimization techniques, and other Hadoop cluster guidelines are also provided.
Tags : flexpod with hadoop, enterprise data, storage infrastructure, massive amounts of data
     NetApp
By: IBM     Published Date: Sep 02, 2014
Life Sciences organizations need to build IT infrastructures that are dynamic, scalable, and easy to deploy and manage, with simplified provisioning, high performance, and high utilization, and that can exploit both data-intensive and server-intensive workloads, including Hadoop MapReduce. Solutions must scale in terms of both processing and storage in order to better serve the institution long term. Managing the life cycle of data, and making that data usable for mainstream analyses and applications, is an important aspect of system design. This presentation describes these IT requirements and how Technical Computing solutions from IBM and Platform Computing address the challenges, delivering greater ROI and accelerated time to results for Life Sciences.
Tags : 
     IBM
By: IBM     Published Date: May 20, 2015
Join IBM and Nuance Communications Inc. to learn how Nuance uses IBM Elastic Storage to improve the power of its voice recognition applications by managing storage growth, cost, and complexity while increasing performance and data availability. View the webcast to learn how you can:
· Lower data management costs through policy-driven automation and tiered storage management
· Manage and increase storage agility through software-defined storage
· Remove data-related bottlenecks to deliver application performance
Tags : 
     IBM
By: IBM     Published Date: Sep 16, 2015
Building applications for handling big data requires laser-like focus on solutions that allow you to deliver scalable, reliable and flexible infrastructure for fast-growing analytics environments. This paper provides 6 best practices for selecting the “right” infrastructure—one that is optimized for performance, flexibility and long-term value.
Tags : 
     IBM
By: TIBCO     Published Date: Nov 09, 2015
As one of the most exciting and widely adopted open-source projects, Apache Spark and its in-memory clusters are driving new opportunities for application development as well as increased uptake of IT infrastructure. Apache Spark is now the most active Apache project, with more than 600 contributions made in the last 12 months by more than 200 organizations. A new survey of 1,417 IT professionals working with Apache Spark, conducted by Databricks, finds that high-performance analytics applications that can work with big data are driving a large proportion of that demand. Apache Spark is now being used to aggregate multiple types of data in memory, versus only pulling data from Hadoop. For solution providers, the Apache Spark technology stack is a significant player because it is one of the core technologies used to modernize data warehouses, a huge segment of the IT industry that accounts for multiple billions in revenue. Spark holds much promise for the future, with data lakes—a storage repo
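The in-memory, keyed aggregation that the survey says is displacing Hadoop-only pipelines can be sketched in plain Python. This is a conceptual illustration with invented sample data, not Spark API code: it shows a reduce-by-key over two heterogeneous sources held in memory, then a join of the aggregates.

```python
from collections import defaultdict

# Records from two hypothetical sources, already loaded in memory
clickstream = [("user1", 3), ("user2", 1), ("user1", 2)]   # page views
transactions = [("user1", 40.0), ("user2", 15.5)]          # purchase amounts

def aggregate_by_key(pairs):
    # Reduce-by-key: combine all values that share a key, entirely in memory
    totals = defaultdict(float)
    for key, value in pairs:
        totals[key] += value
    return dict(totals)

views = aggregate_by_key(clickstream)
spend = aggregate_by_key(transactions)

# Join the two aggregates into one per-user profile
profile = {u: {"views": views.get(u, 0.0), "spend": spend.get(u, 0.0)}
           for u in views}
print(profile["user1"])  # {'views': 5.0, 'spend': 40.0}
```

In Spark the same shape would be expressed with `reduceByKey` and `join` over cached datasets; the point here is only the pattern of aggregating multiple data types in memory.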
Tags : 
     TIBCO
By: Impetus     Published Date: Mar 15, 2016
Streaming analytics platforms give businesses a way to extract strategic value from data in motion, much as traditional analytics tools operate on data at rest. Instead of historical analysis, the goal of streaming analytics is to enable near real-time decision making by letting companies inspect, correlate, and analyze data even as it flows into applications and databases from numerous different sources. Streaming analytics allows companies to do event processing against massive volumes of data streaming into the enterprise at high velocity.
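The event-processing pattern described above, analyzing data as it arrives rather than after it lands, often reduces to maintaining state over a sliding time window. A minimal Python sketch, assuming simple in-memory events rather than a production streaming engine:

```python
from collections import deque
from datetime import datetime, timedelta

class SlidingWindowAverage:
    """Rolling average over events seen in the last `window_seconds` seconds."""
    def __init__(self, window_seconds=60):
        self.window = timedelta(seconds=window_seconds)
        self.events = deque()  # (timestamp, value) pairs in arrival order
        self.total = 0.0

    def add(self, ts, value):
        self.events.append((ts, value))
        self.total += value
        self._evict(ts)

    def _evict(self, now):
        # Drop events that have fallen out of the window
        while self.events and now - self.events[0][0] > self.window:
            _, old = self.events.popleft()
            self.total -= old

    def average(self):
        return self.total / len(self.events) if self.events else 0.0

# Simulated stream of sensor readings arriving over time
start = datetime(2016, 3, 15, 12, 0, 0)
win = SlidingWindowAverage(window_seconds=60)
win.add(start, 10.0)
win.add(start + timedelta(seconds=30), 20.0)
win.add(start + timedelta(seconds=90), 30.0)  # evicts the first reading
print(win.average())  # 25.0
```

A real platform distributes this state across a cluster and handles out-of-order arrival, but the inspect-as-it-flows idea is the same.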
Tags : impetus, guide to stream analytics, real time streaming analytics, streaming analytics, real time analytics, big data analytics
     Impetus
By: RedPoint     Published Date: Sep 22, 2014
The emergence of YARN for the Hadoop 2.0 platform has opened the door to new tools and applications that promise to let more companies reap the benefits of big data in ways never before possible, with outcomes never before imagined. By separating cluster resource management from the data processing function, YARN offers a world beyond MapReduce: less encumbered by complex programming protocols, faster, and at a lower cost. Some of the topics discussed in this paper include:
• Why is YARN important for realizing the power of Hadoop for data integration, quality, and management?
• Benchmark results of MapReduce vs. Pig vs. visual “data flow” design tools
• The 3 key features of YARN that solve the complex problems preventing businesses from gaining maximum benefit from Hadoop
Download this paper to learn why the power of Hadoop 2.0 lies in enabling applications to run inside Hadoop, without the constraints of MapReduce.
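YARN's separation of cluster resource management from the data-processing function leaves MapReduce as just one processing model among many. The map/shuffle/reduce pattern itself can be sketched in plain Python, purely as a conceptual illustration (this is not Hadoop code):

```python
from collections import defaultdict
from itertools import chain

def map_phase(document):
    # Map: emit (word, 1) for every word in one input split
    return [(word.lower(), 1) for word in document.split()]

def shuffle(pairs):
    # Shuffle: group intermediate values by key
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: aggregate the grouped values per key
    return {key: sum(values) for key, values in groups.items()}

splits = ["the quick brown fox", "the lazy dog", "the fox"]
intermediate = chain.from_iterable(map_phase(s) for s in splits)
counts = reduce_phase(shuffle(intermediate))
print(counts["the"])  # 3
print(counts["fox"])  # 2
```

On a cluster, YARN's job is everything this sketch omits: deciding which nodes run the map and reduce tasks and how much memory and CPU each gets, independently of the processing model.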
Tags : 
     RedPoint
By: MEMSQL     Published Date: Apr 12, 2016
The pace of data is not slowing. Applications of today are built with infinite data sets in mind. As these real-time applications become the norm, and batch processing becomes a relic of the past, digital enterprises will implement memory-optimized, distributed data systems to simplify Lambda Architectures for real-time data processing and exploration.
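A Lambda Architecture of the kind the paper says memory-optimized systems can simplify merges a periodically recomputed batch view with an incremental real-time view at query time. A toy sketch, with hypothetical in-memory dictionaries standing in for the batch and speed layers:

```python
# Hypothetical stores: batch_view is recomputed periodically from the full
# data set; speed_view holds only events that arrived since the last batch run.
batch_view = {"page_a": 1000, "page_b": 750}   # precomputed page-view counts
speed_view = {"page_a": 12, "page_c": 4}       # recent, incremental counts

def query(page):
    # Serving layer: merge batch and real-time views to answer a query
    return batch_view.get(page, 0) + speed_view.get(page, 0)

print(query("page_a"))  # 1012 (batch total plus recent arrivals)
print(query("page_c"))  # 4 (seen only since the last batch run)
```

The complexity the paper refers to is maintaining both layers and keeping them consistent; a single memory-optimized, distributed store aims to make the merge step unnecessary.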
Tags : 
     MEMSQL
By: IBM     Published Date: Nov 14, 2014
Platform Symphony is an enterprise-class server platform that runs low-latency, scaled-out MapReduce workloads. It supports multiple applications running concurrently, so organizations can increase utilization across all resources, resulting in a high return on investment.
Tags : 
     IBM
By: Intel     Published Date: Aug 06, 2014
Powering Big Data Workloads with Intel® Enterprise Edition for Lustre* software. The Intel® portfolio for high-performance computing provides the following technology solutions:
• Compute: The Intel® Xeon processor E7 family provides a leap forward for every discipline that depends on HPC, with industry-leading performance and improved performance per watt. Add Intel® Xeon Phi coprocessors to your clusters and workstations to increase performance for highly parallel applications and code segments. Each coprocessor can add over a teraflop of performance and is compatible with software written for the Intel® Xeon processor E7 family. You don’t need to rewrite code or master new development tools.
• Storage: High-performance, highly scalable storage solutions with Intel® Enterprise Edition for Lustre* software and Intel® Xeon processor E7 based storage systems for centralized storage, plus reliable and responsive local storage with Intel® Solid State Drives.
• Networking: Intel® True Scale Fabric and Networking technologies, built for HPC to deliver fast message rates and low latency.
• Software and Tools: A broad range of software and tools to optimize and parallelize your software and clusters.
Further, Intel® Enterprise Edition for Lustre* software is backed by Intel, the recognized technical support provider for Lustre*, and includes 24/7 service level agreement (SLA) coverage.
Tags : 
     Intel
By: GridGain     Published Date: Mar 10, 2015
Software as a Service (SaaS) is a software distribution model in which applications are hosted by a vendor or service provider and made available to customers over the Internet. Instead of companies installing software on their own servers, known as the on-premises distribution model, application software providers host the software in the cloud and charge customers according to the time they spend using the software, or based on a monthly or annual fee. SaaS is becoming increasingly popular, and as the industry develops, more and more companies are dropping older business models in favor of this rapidly evolving methodology.
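The usage-based versus flat-fee billing contrast described above amounts to a simple calculation. A hypothetical illustration (the rates and fees here are invented, not any vendor's actual pricing):

```python
def saas_monthly_charge(hours_used, hourly_rate=0.50, flat_fee=None):
    """Hypothetical SaaS bill: a flat monthly fee if one is set, else pay-per-use."""
    if flat_fee is not None:
        return flat_fee
    return hours_used * hourly_rate

# Pay-per-use customer: 120 hours at $0.50/hour
print(saas_monthly_charge(120))                 # 60.0
# Subscription customer: fixed monthly fee regardless of usage
print(saas_monthly_charge(120, flat_fee=99.0))  # 99.0
```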
Tags : gridgain, saas, saas perfomance and scalability, in memory computing, data fabric, paas for saas, data grid, real-time streaming
     GridGain
By: Kx Systems     Published Date: Jan 16, 2015
Kdb+ is a column-based relational database with extensive in-memory capabilities, developed and marketed by Kx Systems. Like all such products, it is especially powerful when it comes to supporting queries and analytics. However, unlike other products in this domain, kdb+ is particularly good (in terms of both performance and functionality) at processing, manipulating, and analysing data (especially numeric data) in real time, alongside the analysis of historical data. Moreover, it has extensive capabilities for supporting time-series data. For these reasons Kx Systems has historically targeted the financial industry for trading analytics and black-box trading based on real-time and historic data, as well as real-time risk assessment; applications which are particularly demanding in their performance requirements. The company has had significant success in this market, with over a hundred major financial institutions and hedge funds deploying its technology. In this paper, however, we wa
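kdb+ itself is queried in the q language; the column-oriented layout that makes its time-series analytics fast can nonetheless be illustrated in Python, where each column is a separate contiguous sequence and a query touches only the columns it needs. A toy sketch with invented trade data:

```python
# A toy column store: each column is its own contiguous list, as in kdb+ tables
trades = {
    "time":  [9.00, 9.01, 9.02, 9.03, 9.04],            # hours, for simplicity
    "sym":   ["AAPL", "IBM", "AAPL", "IBM", "AAPL"],
    "price": [150.0, 130.0, 151.0, 129.0, 152.0],
}

def average_price(table, symbol):
    # Scan only the two columns the query needs; rows for other columns
    # are never touched, which is the payoff of column-wise storage
    prices = [p for s, p in zip(table["sym"], table["price"]) if s == symbol]
    return sum(prices) / len(prices)

print(average_price(trades, "AAPL"))  # 151.0
```

A row store would have to read every field of every row to answer the same query; over billions of ticks, skipping unused columns is where much of the performance comes from.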
Tags : kx systems, kdb+, relational database
     Kx Systems
By: snowflake     Published Date: Jun 09, 2016
Today’s data, and how that data is used, have changed dramatically in the past few years. Data now comes from everywhere: not just enterprise applications, but also websites, log files, social media, sensors, web services, and more. Organizations want to make that data available to all of their analysts as quickly as possible, not limit access to only a few highly skilled data scientists. However, these efforts are quickly frustrated by the limitations of current data warehouse technologies. These systems simply were not built to handle the diversity of today’s data and analytics. They are based on decades-old architectures designed for a different world, a world where data was limited, users of data were few, and all processing was done in on-premises datacenters.
Tags : 
     snowflake
By: Adaptive Computing     Published Date: Feb 27, 2014
Big data applications represent a fast-growing category of high-value applications increasingly employed by business and technical computing users. However, they have exposed an inconvenient dichotomy in the way resources are utilized in data centers. Conventional enterprise and web-based applications can be executed efficiently in virtualized server environments, where resource management and scheduling are generally confined to a single server. By contrast, data-intensive analytics and technical simulations demand large aggregated resources, necessitating intelligent scheduling and resource management that spans a computer cluster, cloud, or entire data center. Although these tools exist in isolation, they are not available in a general-purpose framework that allows them to interoperate easily and automatically within existing IT infrastructure.
Tags : 
     Adaptive Computing
By: Dell APAC     Published Date: May 30, 2019
To out-innovate and out-pace their competition, organizations must be on a consistent path to keep their infrastructure modern. IT is under constant pressure to deliver optimized infrastructure for new business initiatives and supporting applications all while trying to contain or even reduce costs.
Tags : 
     Dell APAC
By: IBM APAC     Published Date: Jul 19, 2019
AI applications, and especially deep learning systems, are extremely demanding and require powerful parallel processing capabilities. IDC research shows that, in terms of core capacity, a large gap between actual and required CPU capability will develop in the next several years. IDC sees the worldwide market for accelerated servers growing to $25.6 billion in 2022, a 31.6% CAGR. Indeed, this market is growing so fast that IDC forecasts that by 2021, 12% of worldwide server value will come from accelerated compute. Download this IDC report to find out why organizations like yours will need to make decisions about replacing existing general-purpose hardware or supplementing it with hardware dedicated to AI-specific processing tasks.
Tags : 
     IBM APAC
By: Entrust Datacard     Published Date: Jul 23, 2019
Security risks and breaches have become part of the daily landscape as companies and organizations of every size, in every vertical and industry, announce that they have been compromised. In 2016, reported security breaches were up 40%, and this year is on pace to surpass that steep rise. Over the past year alone, there have been high-profile breaches in the gaming, financial services, hospitality, food service, consumer packaged goods, and retail sectors. Many of those breaches occurred due to vulnerabilities in applications and on websites. For example, this past April the IRS announced a breach attributable to a tool designed to fetch data for the Free Application for Federal Student Aid (FAFSA) form.
Tags : 
     Entrust Datacard
By: Google     Published Date: Aug 05, 2019
Moving existing enterprise workloads to the cloud has always been challenging, as companies struggle to adapt and migrate applications to run in a cloud environment. IT managers must understand application dependencies, change drivers and networking configurations, and learn new management interfaces. This white paper provides a deeper understanding of Migrate for Compute Engine's unique technology and architecture for mass migrations into GCP, and explores how these capabilities improve current mass-migration practices.
Tags : cloud applications, cloud as a service, velostrata, migration, google cloud
     Google
By: Amazon Web Services EMEA     Published Date: Aug 02, 2019
Artificial intelligence is becoming a key component of business transformation. Virtually any business leader seeking to unlock value and develop new capabilities using technology is at some stage of the AI journey. For example, those at the leading edge have incorporated machine learning insights into business processes and are building functionality such as natural language processing and preventative maintenance diagnostics into their products. Others are experimenting with pilot projects or developing plans to get started.
Tags : data, technologies, learning, applications, technology, machine, cloud
     Amazon Web Services EMEA
By: Amazon Web Services EMEA     Published Date: Aug 02, 2019
Companies worldwide are undergoing digital transformations. By modernizing their applications, they can deliver better service to customers and keep pace in a competitive landscape. In many cases, AWS has helped companies modernize by implementing containers and initiating cultural shifts to streamline development. In this eBook, we discuss best practices in containerization and how you can get started today with containers on AWS.
Tags : containers, developers, application, aws
     Amazon Web Services EMEA
By: Automation Anywhere APAC     Published Date: Aug 15, 2019
The University of Melbourne deployed Automation Anywhere’s Robotic Process Automation (RPA) technology to reduce manual work and automate a range of administrative processes across student admissions, faculty administration, and supplier tracking. The deployed software bots now automate the entry of all data and attachments for new admission applications, and the university has slowly expanded its automation capabilities for staff across other faculties. This has allowed the University of Melbourne to increase the efficiency of critical business processes, boost staff engagement, and improve customer experience for its teachers and student body.
Tags : rpa, roi, digital workforce, customer story
     Automation Anywhere APAC
By: NetApp APAC     Published Date: Jul 04, 2019
This IDC study provides an evaluation of 10 vendors that sell all-flash arrays (AFAs) for dense mixed enterprise workload consolidation that includes at least some mission-critical applications. "All-flash arrays are dominating primary storage spend in the enterprise, driving over 80% of that revenue in 2017," said Eric Burgener, research director, Storage. "Today's leading AFAs offer all the performance, capacity scalability, enterprise-class functionality, and datacenter integration capabilities needed to support dense mixed enterprise workload consolidation. More and more IT shops are recognizing this and committing to 'all flash for primary storage' strategies."
Tags : 
     NetApp APAC
By: HERE Technologies     Published Date: Jul 11, 2019
Supply chain managers are increasingly leveraging location intelligence and location data to raise visibility throughout their whole logistics process and to optimize their delivery routes. Leveraging this data requires an ever-more-robust technology stack. As supply chain technology stacks become more complex, diverse, and defined by legacy system integrations, Application Programming Interfaces (APIs) are becoming essential to making stacks scale, allowing supply chain managers to better meet the demands of the new generation of consumers. Innovative location APIs provide supply chain stacks and applications with:
· Greater agility
· Contextual intelligence
· Real-time data implementation
· Speed
· Scale
Introducing new technology into an organization can sometimes be daunting. As one of the world’s leading location platforms, HERE shares insights and tips to streamline supply chain technology integration across the whole organization.
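Location APIs of the kind described build on basic geographic computations. As a generic illustration (this is not HERE's API), the haversine formula gives the great-circle distance between two coordinates, one of the building blocks behind route optimization and delivery visibility:

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0  # mean Earth radius

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

# Distance between two depots (approximate Berlin and Amsterdam coordinates)
d = haversine_km(52.52, 13.405, 52.37, 4.90)
print(f"{d:.0f} km")
```

A production routing API layers road networks, traffic, and constraints on top of such primitives; the point here is only what a distance query computes.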
Tags : here technologies, supply chain, mapping
     HERE Technologies
Get your white papers featured in the insideBIGDATA White Paper Library contact: Kevin@insideHPC.com