applications

Results 1 - 25 of 3849
By: NetApp     Published Date: Dec 13, 2013
FlexPod Select with Hadoop delivers enterprise-class Hadoop with validated, pre-configured components for fast deployment, higher reliability and smoother integration with existing applications and infrastructure. These technical reference architectures optimize storage, networking and servers with the Cloudera and Hortonworks distributions of Hadoop. Leverage FlexPod Select with Hadoop to help store, manage, process and perform advanced analytics on your multi-structured data. Tuning parameters, optimization techniques and other Hadoop cluster guidelines are also provided.
Tags : flexpod with hadoop, enterprise data, storage infrastructure, massive amounts of data
     NetApp
By: IBM     Published Date: Sep 02, 2014
Life Sciences organizations need to build IT infrastructures that are dynamic, scalable, and easy to deploy and manage, with simplified provisioning, high performance and high utilization, and the ability to exploit both data-intensive and server-intensive workloads, including Hadoop MapReduce. Solutions must scale, in terms of both processing and storage, in order to better serve the institution long-term. Managing the life cycle of data, and making it usable for mainstream analyses and applications, is an important aspect of system design. This presentation describes these IT requirements and how Technical Computing solutions from IBM and Platform Computing address these challenges and deliver greater ROI and accelerated time to results for Life Sciences.
Tags : 
     IBM
By: IBM     Published Date: May 20, 2015
Join IBM and Nuance Communications Inc. to learn how Nuance uses IBM Elastic Storage to improve the power of their voice recognition applications by managing storage growth, cost and complexity while increasing performance and data availability. View the webcast to learn how you can:
· Lower data management costs through policy-driven automation and tiered storage management
· Manage and increase storage agility through software-defined storage
· Remove data-related bottlenecks to deliver application performance
Tags : 
     IBM
By: IBM     Published Date: Sep 16, 2015
Building applications for handling big data requires laser-like focus on solutions that allow you to deliver scalable, reliable and flexible infrastructure for fast-growing analytics environments. This paper provides 6 best practices for selecting the “right” infrastructure—one that is optimized for performance, flexibility and long-term value.
Tags : 
     IBM
By: TIBCO     Published Date: Nov 09, 2015
As one of the most exciting and widely adopted open-source projects, Apache Spark and its in-memory clusters are driving new opportunities for application development as well as increased uptake of IT infrastructure. Apache Spark is now the most active Apache project, with more than 600 contributions made in the last 12 months by more than 200 organizations. A new survey of 1,417 IT professionals working with Apache Spark, conducted by Databricks, finds that high-performance analytics applications that can work with big data are driving a large proportion of that demand. Apache Spark is now being used to aggregate multiple types of data in memory, rather than only pulling data from Hadoop. For solution providers, the Apache Spark technology stack is a significant player because it is one of the core technologies used to modernize data warehouses, a huge segment of the IT industry that accounts for multiple billions in revenue. Spark holds much promise for the future, with data lakes, a storage repo
Tags : 
     TIBCO
By: Impetus     Published Date: Mar 15, 2016
Streaming analytics platforms give businesses a way to extract strategic value from data-in-motion, much as traditional analytics tools operate on data-at-rest. Instead of historical analysis, the goal of streaming analytics is to enable near-real-time decision making by letting companies inspect, correlate and analyze data even as it flows into applications and databases from numerous different sources. Streaming analytics allows companies to do event processing against massive volumes of data streaming into the enterprise at high velocity.
Tags : impetus, guide to stream analytics, real time streaming analytics, streaming analytics, real time analytics, big data analytics
     Impetus
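The pattern described in the Impetus abstract above, inspecting and aggregating events as they arrive rather than querying data at rest, can be sketched as a toy sliding-window aggregator. All names here are illustrative and not part of any streaming product:

```python
from collections import deque
from statistics import mean

class SlidingWindow:
    """Keep only the most recent `size` events and expose simple
    aggregates, so each incoming event is evaluated as it arrives
    instead of after it lands in a database."""
    def __init__(self, size):
        self.events = deque(maxlen=size)  # old events fall off automatically

    def ingest(self, value):
        self.events.append(value)
        return self.aggregate()

    def aggregate(self):
        return {"count": len(self.events), "mean": mean(self.events)}

window = SlidingWindow(size=3)
for reading in [10, 20, 30, 40]:
    stats = window.ingest(reading)
print(stats)  # the window now holds only the last 3 readings: 20, 30, 40
```

A production platform adds distribution, fault tolerance and richer operators (joins, pattern matching), but the core idea is the same: state is bounded and updated per event.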
By: RedPoint     Published Date: Sep 22, 2014
The emergence of YARN for the Hadoop 2.0 platform has opened the door to new tools and applications that promise to allow more companies to reap the benefits of big data in ways never before possible. By separating cluster resource management from the data processing function, YARN offers a world beyond MapReduce: less encumbered by complex programming protocols, faster, and at a lower cost. Topics discussed in this paper include:
• Why YARN is important for realizing the power of Hadoop for data integration, quality and management
• Benchmark results of MapReduce vs. Pig vs. visual “data flow” design tools
• The 3 key features of YARN that solve the complex problems that prevent businesses from gaining maximum benefit from Hadoop
Download this paper to learn why the power of Hadoop 2.0 lies in enabling applications to run inside Hadoop, without the constraints of MapReduce.
Tags : 
     RedPoint
By: MEMSQL     Published Date: Apr 12, 2016
The pace of data is not slowing. Applications of today are built with infinite data sets in mind. As these real-time applications become the norm, and batch processing becomes a relic of the past, digital enterprises will implement memory-optimized, distributed data systems to simplify Lambda Architectures for real-time data processing and exploration.
Tags : 
     MEMSQL
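The Lambda Architecture that the MemSQL abstract above says memory-optimized systems can simplify works by merging a periodically recomputed batch view with a continuously updated real-time (speed) view at query time. A minimal sketch, with all names and numbers invented:

```python
# Lambda-style serving layer: queries see fresh totals by combining a
# (possibly stale) batch view with a real-time delta. In a simplified,
# memory-optimized design, both roles collapse into one live store.
batch_view = {"clicks": 1000}   # recomputed periodically over all history
speed_view = {"clicks": 42}     # incremented per incoming event since the
                                # last batch run

def query(key):
    """Merge the two layers so results reflect events the batch job
    has not yet processed."""
    return batch_view.get(key, 0) + speed_view.get(key, 0)

print(query("clicks"))  # 1000 + 42 = 1042
```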
By: IBM     Published Date: Nov 14, 2014
Platform Symphony is an enterprise-class server platform that runs low-latency, scaled-out MapReduce workloads. It supports multiple applications running concurrently, so organizations can increase utilization across all resources, resulting in a high return on investment.
Tags : 
     IBM
By: Intel     Published Date: Aug 06, 2014
Powering Big Data Workloads with Intel® Enterprise Edition for Lustre* software
The Intel® portfolio for high-performance computing provides the following technology solutions:
• Compute - The Intel® Xeon processor E7 family provides a leap forward for every discipline that depends on HPC, with industry-leading performance and improved performance per watt. Add Intel® Xeon Phi coprocessors to your clusters and workstations to increase performance for highly parallel applications and code segments. Each coprocessor can add over a teraflop of performance and is compatible with software written for the Intel® Xeon processor E7 family. You don’t need to rewrite code or master new development tools.
• Storage - High-performance, highly scalable storage solutions with Intel® Lustre and Intel® Xeon processor E7 based storage systems for centralized storage. Reliable and responsive local storage with Intel® Solid State Drives.
• Networking - Intel® True Scale Fabric and Networking technologies, built for HPC to deliver fast message rates and low latency.
• Software and Tools - A broad range of software and tools to optimize and parallelize your software and clusters.
Further, Intel® Enterprise Edition for Lustre* software is backed by Intel, the recognized technical support provider for Lustre, and includes 24/7 service level agreement (SLA) coverage.
Tags : 
     Intel
By: GridGain     Published Date: Mar 10, 2015
Software as a Service (SaaS) is a software distribution model in which applications are hosted by a vendor or service provider and made available to customers over the Internet. Instead of companies installing software on their own servers, known as the on-premises distribution model, application software providers host the software in the cloud and charge customers according to the time they spend using the software, or based on a monthly or annual fee. SaaS is becoming increasingly popular, and as the industry develops, more and more companies are dropping older business models in favor of this rapidly evolving methodology.
Tags : gridgain, saas, saas perfomance and scalability, in memory computing, data fabric, paas for saas, data grid, real-time streaming, hadoop
     GridGain
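The two billing models in the GridGain abstract above, metered usage versus a flat monthly or annual fee, reduce to a simple cost comparison. A minimal sketch with invented rates (nothing here reflects any vendor's actual pricing):

```python
def usage_cost(hours_used, rate_per_hour):
    """Pay-per-use pricing: customers are billed only for time spent
    actually using the software."""
    return hours_used * rate_per_hour

def subscription_cost(months, monthly_fee):
    """Flat-fee pricing: a fixed monthly (or annual) charge regardless
    of usage."""
    return months * monthly_fee

# A team that uses the service 50 hours per month at $2/hour, for a year:
metered = usage_cost(hours_used=50 * 12, rate_per_hour=2.0)   # 1200.0
flat = subscription_cost(months=12, monthly_fee=120.0)        # 1440.0
print("metered is cheaper" if metered < flat else "flat is cheaper")
```

The crossover point (here, 60 hours per month) is what makes metered SaaS attractive to light users and flat fees attractive to heavy ones.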
By: Kx Systems     Published Date: Jan 16, 2015
Kdb+ is a column-based relational database with extensive in-memory capabilities, developed and marketed by Kx Systems. Like all such products, it is especially powerful when it comes to supporting queries and analytics. However, unlike other products in this domain, kdb+ is particularly good (in terms of both performance and functionality) at processing, manipulating and analysing data (especially numeric data) in real time, alongside the analysis of historical data. Moreover, it has extensive capabilities for supporting time-series data. For these reasons Kx Systems has historically targeted the financial industry for trading analytics and black-box trading based on real-time and historic data, as well as real-time risk assessment; applications which are particularly demanding in their performance requirements. The company has had significant success in this market, with over a hundred major financial institutions and hedge funds deploying its technology. In this paper, however, we wa
Tags : kx systems, kdb+, relational database
     Kx Systems
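Column orientation, where each field lives in its own contiguous array so an analytic query touches only the columns it needs, is the property the Kx Systems entry above credits for kdb+'s speed. A toy sketch in Python (illustrative only; kdb+ itself is queried in q, and none of these names come from the product):

```python
# A toy column store: each field is a separate list rather than rows of
# records. A per-symbol average price scans just two columns and never
# materializes whole rows.
trades = {
    "time":  [9.30, 9.31, 9.32, 9.33],
    "sym":   ["AAPL", "AAPL", "MSFT", "AAPL"],
    "price": [150.0, 151.0, 300.0, 152.0],
}

def avg_price(table, sym):
    """Average trade price for one symbol, reading only the 'sym' and
    'price' columns."""
    prices = [p for s, p in zip(table["sym"], table["price"]) if s == sym]
    return sum(prices) / len(prices)

print(avg_price(trades, "AAPL"))  # (150 + 151 + 152) / 3 = 151.0
```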
By: snowflake     Published Date: Jun 09, 2016
Today’s data, and how that data is used, have changed dramatically in the past few years. Data now comes from everywhere—not just enterprise applications, but also websites, log files, social media, sensors, web services, and more. Organizations want to make that data available to all of their analysts as quickly as possible, not limit access to only a few highly-skilled data scientists. However, these efforts are quickly frustrated by the limitations of current data warehouse technologies. These systems simply were not built to handle the diversity of today’s data and analytics. They are based on decades-old architectures designed for a different world, a world where data was limited, users of data were few, and all processing was done in on-premises datacenters.
Tags : 
     snowflake
By: Adaptive Computing     Published Date: Feb 27, 2014
Big data applications represent a fast-growing category of high-value applications that are increasingly employed by business and technical computing users. However, they have exposed an inconvenient dichotomy in the way resources are utilized in data centers. Conventional enterprise and web-based applications can be executed efficiently in virtualized server environments, where resource management and scheduling is generally confined to a single server. By contrast, data-intensive analytics and technical simulations demand large aggregated resources, necessitating intelligent scheduling and resource management that spans a computer cluster, cloud, or entire data center. Although these tools exist in isolation, they are not available in a general-purpose framework that allows them to interoperate easily and automatically within existing IT infrastructure.
Tags : 
     Adaptive Computing
By: Cisco EMEA     Published Date: Mar 08, 2019
And then imagine processing power strong enough to make sense of all this data in every language and in every dimension. Unless you’ve achieved that digital data nirvana (and you haven’t told the rest of us), you’re going to have some unknowns in your world. In the world of security, unknown threats exist outside the enterprise in the form of malicious actors, state-sponsored attacks and malware that moves fast and destroys everything it touches. The unknown exists inside the enterprise in the form of insider threat from rogue employees or careless contractors – which was deemed by 24% of our survey respondents to pose the most serious risk to their organizations. The unknown exists in the form of new devices, new cloud applications, and new data. The unknown is what keeps CISOs, what keeps you, up at night – and we know because we asked you.
Tags : 
     Cisco EMEA
By: Cisco     Published Date: Mar 22, 2019
The Secure Data Center is a place in the network (PIN) where a company centralizes data and performs services for business. Data centers contain hundreds to thousands of physical and virtual servers that are segmented by applications, zones, and other methods. This guide addresses data center business flows and the security used to defend them. The Secure Data Center is one of the six places in the network within SAFE. SAFE is a holistic approach in which Secure PINs model the physical infrastructure and Secure Domains represent the operational aspects of a network.
Tags : 
     Cisco
By: Cisco     Published Date: Mar 22, 2019
Cisco ACI, the industry-leading software-defined networking solution, facilitates application agility and data center automation. With ACI Anywhere, enable scalable multicloud networks with a consistent policy model, and gain the flexibility to move applications seamlessly to any location or any cloud while maintaining security and high availability.
Tags : 
     Cisco
By: Cisco EMEA     Published Date: Mar 26, 2019
Imagine if you could see deep into the future. And way back into the past, both at the same time. Imagine having visibility of everything that had ever happened and everything that was ever going to happen, everywhere, all at once. And then imagine processing power strong enough to make sense of all this data in every language and in every dimension. Unless you’ve achieved that digital data nirvana (and you haven’t told the rest of us), you’re going to have some unknowns in your world. In the world of security, unknown threats exist outside the enterprise in the form of malicious actors, state-sponsored attacks and malware that moves fast and destroys everything it touches. The unknown exists inside the enterprise in the form of insider threat from rogue employees or careless contractors – which was deemed by 24% of our survey respondents to pose the most serious risk to their organizations. The unknown exists in the form of new devices, new cloud applications, and new data. The unk
Tags : 
     Cisco EMEA
By: Dell EMC     Published Date: Feb 14, 2019
Artificial intelligence (AI) is a transformative technology that will change the way organizations interact and will add intelligence to many products and services through new insights currently hidden in vast pools of data. In 2017 alone, venture capitalists invested more than $11.7 billion in the top 100 Artificial Intelligence startups, according to CB Insights1, and the breadth of Artificial Intelligence applications continues to grow. While human-like intelligence will remain the stuff of novels and movies for the near future, most organizations can and should explore practical Artificial Intelligence projects. This technology has the potential to:
• Improve productivity of internal applications
• Increase revenue through enhanced customer interaction and improved customer acquisition
• Reduce costs by optimizing operations
• Enhance products and services with "smart" functionality such as vision and voice interaction and control
This paper provided by Dell and Intel® gives executi
Tags : 
     Dell EMC
By: Dell EMC     Published Date: Feb 14, 2019
Technology is quickly moving to the forefront as organizations undertake digital and IT transformation projects that enable strategic differentiation in a world where users are leveraging applications and data in new ways. The reality is, most organizations were not born digital but instead have legacy business processes, applications, and infrastructure that require modernization and automation. Read this executive summary from Dell and Intel® to learn why businesses must embark on IT transformation to modernize and automate their legacy infrastructure and prime themselves to achieve their digital business goals. Intel Inside®. Powerful Productivity Outside.
Tags : 
     Dell EMC
By: Gigamon EMEA     Published Date: Apr 10, 2019
Upgrading your network doesn’t have to be a big headache. Get the Securosis report Scaling Network Security and scale security controls and policy without starting over. Discover your options for improving security architecture on your terms, using existing infrastructure and intelligently applying security controls at scale without major overhauls. With this approach, your network protection can evolve with applications, attackers and technology—even in today’s demanding 100Gbps network environment.
Tags : 
     Gigamon EMEA
By: HERE Technologies     Published Date: Mar 25, 2019
In the future, people won’t necessarily want to own cars, but they’ll need personal mobility. The mobility ecosystem can support a range of different services, but in the near-term, many of these nascent businesses face a range of operational challenges as they bid to grow and become profitable. This ebook explores the fundamental needs of new mobility providers as they target business improvements, looks at their considerations as they forge key technology partnerships, and shows how HERE's Auto Mobility Operations solution can help mobility services meet those requirements. This ebook will help you understand how Auto Mobility Operations helps: • Enable the creation and integration of key location features as applications for mobile operating systems through use of the HERE Mobile SDK • Manage and efficiently operate fleet assets with HERE Location Services, providing fresh, high-quality and global location-based data • Create a frictionless and compelling UX - with APIs, mapping
Tags : auto, mapping, location data
     HERE Technologies
By: Illusive Networks     Published Date: Apr 10, 2019
APTs can be particularly harmful to financial service organizations, raising the need for early detection of malicious intruders. This white paper describes three use cases that illustrate how Illusive’s technology provides a nimble, easy-to-manage solution that guards the integrity of SWIFT services, defends legacy, custom, or “untouchable” applications and systems, and helps manage cyber risk during periods of disruptive business change.
Tags : cyber security, deception technology, endpoint security, cyber security, threat management, threat protection, illusive networks, endpoint protection, lateral movement, financial services, advanced threat protection, apt, targeted attacks, network security, enterprise security
     Illusive Networks
By: Forcepoint     Published Date: Mar 14, 2019
Once again, Forcepoint has been named a Top Player in this year’s report, citing strengths such as integration with Forcepoint CASB to extend DLP policies into cloud applications and more. Read the report to learn more.
Tags : data protection, data security, analyst report, dlp, cloud apps, data loss prevention, data leakage prevention, cybersecurity, cyber security, casb, radicati, radicati market quadrant
     Forcepoint
By: Epicor     Published Date: Apr 12, 2019
As a business leader of a manufacturing company, you may be experiencing the challenges of competing in a digitally transformed marketplace. In our increasingly digital world, your ability to grow depends on how your business leverages the latest best-practice technology. Is your enterprise resource planning (ERP) system keeping pace? In this Forrester Research report—presented by Epicor Software—you’ll discover the latest insights on software as a service (SaaS) business solutions. SaaS business applications will evolve in the cloud to new levels of value in: Intelligence Flexibility Connectivity User experience The report also provides predictions and recommendations that will help guide your manufacturing business with its strategic growth initiatives. Download the report to learn more, and discover why you should change the way you think about cloud.
Tags : epicor erp, saas, applications
     Epicor

To get your white papers featured in the insideBIGDATA White Paper Library, contact: Kevin@insideHPC.com