analytic large data

Results 1 - 25 of 35
By: TIBCO     Published Date: Nov 09, 2015
Apache Spark, one of the most exciting and widely adopted open-source projects, is driving new opportunities for application development as well as increased demand for IT infrastructure through its in-memory clusters. Apache Spark is now the most active Apache project, with more than 600 contributions made in the last 12 months by more than 200 organizations. A new survey of 1,417 IT professionals working with Apache Spark, conducted by Databricks, finds that high-performance analytics applications that can work with big data are driving a large proportion of that demand. Apache Spark is now being used to aggregate multiple types of data in memory rather than only pulling data from Hadoop. For solution providers, the Apache Spark technology stack is a significant player because it is one of the core technologies used to modernize data warehouses, a huge segment of the IT industry that accounts for multiple billions in revenue. Spark also holds much promise for the future with data lakes, a storage repository
Tags : 
     TIBCO
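The abstract's point about aggregating multiple data types in memory, rather than repeatedly pulling from Hadoop storage, can be sketched in plain Python. This is a toy illustration of the pattern, not Spark's API; the record fields and source names are hypothetical.

```python
from collections import defaultdict

# Hypothetical toy records standing in for the multiple data types
# (clickstream logs, transactions) a Spark job would keep cached in memory.
logs = [{"user": "a", "clicks": 3}, {"user": "b", "clicks": 1}]
transactions = [{"user": "a", "spend": 20.0}, {"user": "b", "spend": 5.0}]

def aggregate_in_memory(*sources):
    """Merge records from several in-memory sources by user key,
    mimicking how cached datasets are joined without a round trip
    to disk-based storage."""
    merged = defaultdict(dict)
    for source in sources:
        for record in source:
            user = record["user"]
            for key, value in record.items():
                if key != "user":
                    merged[user][key] = merged[user].get(key, 0) + value
    return dict(merged)

print(aggregate_in_memory(logs, transactions))
```

In Spark itself this would be a join over cached DataFrames; the sketch only shows why keeping both sources resident in memory avoids re-reading Hadoop for each aggregation.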
By: Adaptive Computing     Published Date: Feb 27, 2014
Big data applications represent a fast-growing category of high-value applications that are increasingly employed by business and technical computing users. However, they have exposed an inconvenient dichotomy in the way resources are utilized in data centers. Conventional enterprise and web-based applications can be executed efficiently in virtualized server environments, where resource management and scheduling is generally confined to a single server. By contrast, data-intensive analytics and technical simulations demand large aggregated resources, necessitating intelligent scheduling and resource management that spans a computer cluster, cloud, or entire data center. Although these tools exist in isolation, they are not available in a general-purpose framework that allows them to interoperate easily and automatically within existing IT infrastructure.
Tags : 
     Adaptive Computing
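The cluster-wide scheduling the brief contrasts with single-server virtualization can be illustrated with a toy placement function. This is a minimal sketch under assumed semantics (free-core counts per node, largest-free-first placement), not a real scheduler.

```python
def schedule(job_cores, nodes):
    """Place a job on the node with the most free cores, a toy stand-in
    for resource management that spans a whole cluster rather than one
    server. Mutates the shared capacity map; returns the chosen node,
    or None if no node can fit the job."""
    candidates = [(name, free) for name, free in nodes.items() if free >= job_cores]
    if not candidates:
        return None  # job must wait; a real scheduler would queue it
    name, _ = max(candidates, key=lambda item: item[1])
    nodes[name] -= job_cores  # reserve capacity in the aggregated pool
    return name

# Hypothetical cluster state: free cores per node.
nodes = {"node1": 16, "node2": 4, "node3": 32}
print(schedule(24, nodes))  # only node3 has 24 free cores
```

A real framework would also weigh memory, network locality, and job priority; the point is that the decision requires a view of the whole resource pool.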
By: Infinidat EMEA     Published Date: May 14, 2019
Big Data and analytics workloads represent a new frontier for organizations. Data is being collected from sources that did not exist 10 years ago. Mobile phone data, machine-generated data, and website interaction data are all being collected and analyzed. In addition, as IT budgets are already under pressure, Big Data footprints are getting larger and posing a huge storage challenge. This paper provides information on the issues that Big Data applications pose for storage systems and how choosing the correct storage infrastructure can streamline and consolidate Big Data and analytics applications without breaking the bank.
Tags : 
     Infinidat EMEA
By: Zaloni     Published Date: Apr 24, 2019
Why your data catalog won’t deliver significant ROI. According to Gartner, organizations that provide access to a curated catalog of internal and external data assets will derive twice as much business value from their analytics investments by 2020 as those that do not. That’s a ringing endorsement of data catalogs, and a growing number of enterprises seem to agree. In fact, the global data catalog market is expected to grow from US$210.0 million in 2017 to US$620.0 million by 2022, at a Compound Annual Growth Rate (CAGR) of 24.2%. Why such large and intensifying demand for data catalogs? The primary driver is that many organizations are working to modernize their data platforms with data lakes, cloud-based data warehouses, advanced analytics, and various SaaS applications in order to grow profitable digital initiatives. To support these digital initiatives and other business imperatives, organizations need more reliable, faster access to their data. However, modernizing data platforms
Tags : 
     Zaloni
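The "curated catalog" idea the abstract leans on can be made concrete with a minimal sketch of what a catalog record holds and how analysts query it. The fields, asset names, and zones below are hypothetical, chosen only to illustrate the lookup pattern.

```python
from dataclasses import dataclass, field

@dataclass
class CatalogEntry:
    """A minimal, hypothetical record of the metadata a curated
    data catalog keeps per asset."""
    name: str
    source: str            # e.g. a data lake zone or a SaaS application
    owner: str
    tags: list = field(default_factory=list)

catalog = [
    CatalogEntry("web_clicks", "data-lake/raw", "marketing", ["clickstream"]),
    CatalogEntry("orders", "cloud-dw/sales", "finance", ["transactions"]),
]

def find(entries, tag):
    """Look up curated assets by tag so users can locate reliable data fast."""
    return [e.name for e in entries if tag in e.tags]

print(find(catalog, "transactions"))
```

Real catalogs add lineage, quality scores, and access policies on top of this basic name-to-metadata mapping.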
By: SAP     Published Date: May 18, 2014
Leading companies and technology providers are rethinking the fundamental model of analytics, and the contours of a new paradigm are emerging. The new generation of analytics goes beyond Big Data (information that is too large and complex to manipulate without robust software) and beyond the traditional, narrow approach of analytics, which was restricted to analyzing customer and financial data collected from interactions on social media. Today companies are embracing the social revolution, using real-time technologies to unlock deep insights about customers and others, enabling better-informed decisions and richer collaboration in real time.
Tags : sap, big data, real time data, in memory technology, data warehousing, analytics, big data analytics, data management
     SAP
By: Oracle     Published Date: Nov 28, 2017
Today’s leading-edge organizations differentiate themselves through analytics, furthering their competitive advantage by extracting value from all their data sources. Other companies are looking to become data-driven through the modernization of their data management deployments. These strategies come with challenges, such as managing large and growing volumes of data. Today’s digital world is already creating data at an explosive rate, and the next wave is on the horizon, driven by the emergence of IoT data sources. The physical data warehouses of the past were great for collecting data from across the enterprise for analysis, but the storage and compute resources needed to support them cannot keep pace with this explosive growth. In addition, the manual, cumbersome tasks of patching, updating, and upgrading pose risks to data due to human error. To reduce risk, cost, complexity, and time to value, many organizations are taking their data warehouses to the cloud. Whether hosted locally
Tags : 
     Oracle
By: SAS     Published Date: Jan 17, 2018
A picture is worth a thousand words – especially when you are trying to find relationships and understand your data – which could include thousands or even millions of variables. To create meaningful visuals of your data, there are some basic tips and techniques you should consider. Data size and composition play an important role when selecting graphs to represent your data. This paper, filled with graphics and explanations, discusses some of the basic issues concerning data visualization and provides suggestions for addressing those issues. From there, it moves on to the topic of big data and discusses those challenges and potential solutions as well. It also includes a section on SAS® Visual Analytics, software that was created especially for quickly visualizing very large amounts of data. Autocharting and "what does it mean" balloons can help even novice users create and interact with graphics that can help them understand and derive the most value from their data.
Tags : 
     SAS
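The paper's claim that data size and composition should drive graph selection can be expressed as a simple decision rule. This is a rough heuristic of my own for illustration, not SAS Visual Analytics' autocharting logic; the thresholds are assumptions.

```python
def suggest_chart(n_points, var_type):
    """Pick a chart family from data size and variable type, echoing the
    point that size and composition drive graph choice. Thresholds are
    illustrative, not authoritative."""
    if var_type == "categorical":
        return "bar chart"
    if n_points <= 1_000:
        return "scatter plot"          # small data: show every point
    if n_points <= 100_000:
        return "binned scatter / hexbin"  # medium data: bin to avoid overplotting
    return "density heat map"          # very large data: aggregate before plotting

print(suggest_chart(5_000_000, "numeric"))
```

Tools like autocharting apply the same idea with many more inputs (number of variables, their measurement levels, cardinality).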
By: Cognizant     Published Date: Oct 23, 2018
In the last few years, a wave of digital technologies changed the banking landscape: social/mobile altered the way banks engage with customers; analytics enabled hyper-personalized offerings by making sense of large datasets; and cloud technologies shifted the computing paradigm from CapEx to OpEx, enabling delivery of business processes as services from third-party platforms. Now a second wave of disruption is set to drive even more profound changes, including robotic process automation (RPA), AI, IoT instrumentation, blockchain distributed ledgers and shared infrastructure, and open banking platforms controlled by application programming interfaces (APIs). As these technologies become commercialized and demand increases for digitally enabled services, we will see unprecedented disruption as non-traditional banks and fintechs rush into all segments of the banking space. This whitepaper examines key considerations for banks as they explore value in the emerging Digital 2.0 world.
Tags : cognizant, banking, digital
     Cognizant
By: IBM     Published Date: Mar 29, 2017
One of the biggest changes facing organizations making purchasing and deployment decisions about analytic databases — including relational data warehouses — is whether to opt for a cloud solution. A couple of years ago, only a few organizations selected such cloud analytic databases. Today, according to a 2016 IDC survey, 56% of large and midsize organizations in the United States have at least one data warehouse or mart deployed in the cloud.
Tags : cloud, analytics, data, organization, ibm
     IBM
By: IBM     Published Date: Oct 27, 2016
IBM Analytics for Apache Spark for Bluemix provides Apache Spark, an open-source cluster computing framework with in-memory processing that can run analytic applications up to 100 times faster than other technologies on the market today. Optimized for extremely fast, large-scale data processing, it lets you easily perform big data analysis from one application.
Tags : ibm, apache spark, bluemix, analytics, data science
     IBM
By: IBM     Published Date: Jan 30, 2017
Analytics has permeated virtually every department within an organization. It’s no longer a ‘nice to have’; it’s an organizational imperative. HR in particular collects a wealth of data, from recruiting applications to employee surveys and performance management, and it sits in systems that remain largely untapped. This data can help drive strategic decisions about your workforce. Analytic tools have historically been difficult to use and required heavy IT lifting to get the most out of them. What if an analytics system learned, and continued to learn, as it experienced new information, new scenarios, and new responses? This is referred to as cognitive computing, and it is key to providing an analytics system that is easy to use yet extremely powerful.
Tags : ibm, talent analytics, cognitive computing, analytics, engagement
     IBM
By: Group M_IBM Q418     Published Date: Oct 15, 2018
The enterprise data warehouse (EDW) has been the cornerstone of enterprise data strategies for over 20 years. EDW systems have traditionally been built on relatively costly hardware infrastructures. But ever-growing data volume and increasingly complex processing have raised the cost of EDW software and hardware licenses while impacting the performance needed for analytic insights. Organizations can now use EDW offloading and optimization techniques to reduce the costs of storing, processing, and analyzing large volumes of data. Getting data governance right is critical to your business success. That means ensuring your data is clean, of excellent quality, and of verifiable lineage. Such governance principles can be applied in Hadoop-like environments. Hadoop is designed to store, process, and analyze large volumes of data at significantly lower cost than a data warehouse. But to get the return on investment, you must infuse data governance processes as part of offloading.
Tags : 
     Group M_IBM Q418
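The abstract's requirement that offloading carry governance with it, in particular verifiable lineage, can be sketched as a function that never moves a table without writing a lineage record. The storage targets, field names, and fixed timestamp are hypothetical, chosen only to keep the example self-contained.

```python
import datetime

def offload_table(table, target, lineage_log):
    """Sketch of governed offloading: record where a warehouse table
    moved, from where, and when, so its lineage stays verifiable.
    The timestamp is fixed here to keep the example deterministic."""
    entry = {
        "table": table,
        "from": "edw",
        "to": target,
        "at": datetime.datetime(2018, 10, 15).isoformat(),
    }
    lineage_log.append(entry)  # governance: the move is never untracked
    return entry

log = []
offload_table("sales_history", "hadoop://cold-zone", log)
print(log[0]["to"])
```

A production pipeline would also capture schema versions, data-quality checks, and the identity of the process performing the move.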
By: Group M_IBM Q119     Published Date: Mar 04, 2019
One of the biggest changes facing organizations making purchasing and deployment decisions about analytic databases — including relational data warehouses — is whether to opt for a cloud solution. A couple of years ago, only a few organizations selected such cloud analytic databases. Today, according to a 2016 IDC survey, 56% of large and midsize organizations in the United States have at least one data warehouse or mart deployed in the cloud.
Tags : 
     Group M_IBM Q119
By: Group M_IBM Q3'19     Published Date: Jun 27, 2019
The enterprise data warehouse (EDW) has been the cornerstone of enterprise data strategies for over 20 years. EDW systems have traditionally been built on relatively costly hardware infrastructures. But ever-growing data volume and increasingly complex processing have raised the cost of EDW software and hardware licenses while impacting the performance needed for analytic insights. Organizations can now use EDW offloading and optimization techniques to reduce the costs of storing, processing, and analyzing large volumes of data.
Tags : 
     Group M_IBM Q3'19
By: ParAccel     Published Date: Dec 16, 2010
This solution brief explains how Fidelity Information Services (FIS) executives realized that they needed an analytics database solution that could keep up with additional fraud complexity as well as much larger sets of data to improve detection rates.
Tags : paraccel, analytic database, financial fraud analytics, fidelity information services
     ParAccel
By: Amazon Web Services     Published Date: Sep 05, 2018
Today’s businesses generate staggering amounts of data, and learning to get the most value from that data is paramount to success. Just as Amazon Web Services (AWS) has transformed IT infrastructure into something that can be delivered on demand, scalably, quickly, and cost-effectively, Amazon Redshift is doing the same for data warehousing and big data analytics. Amazon Redshift offers a massively parallel columnar data store that can be spun up in just a few minutes to deal with billions of rows of data at a cost of just a few cents an hour. Organizations choose Amazon Redshift for its affordability, flexibility, and powerful feature set:
• Enterprise-class relational database query and management system
• Supports client connections with many types of applications, including business intelligence (BI), reporting, data, and analytics tools
• Executes analytic queries to retrieve, compare, and evaluate large amounts of data in multiple-stage operations
Tags : 
     Amazon Web Services
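The "multiple-stage operations" in the feature list above refers to analytic SQL that aggregates and then compares results. A small SQLite database can stand in for the shape of such a query; this is an illustration of the query pattern only, not Redshift itself, and the table and data are made up.

```python
import sqlite3

# Tiny in-memory database standing in for a warehouse fact table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("east", 100.0), ("east", 50.0), ("west", 70.0)],
)

# Stage 1 aggregates per region; stage 2 compares each region to the total.
rows = conn.execute(
    """
    WITH per_region AS (
        SELECT region, SUM(amount) AS total FROM sales GROUP BY region
    )
    SELECT region,
           total,
           ROUND(100.0 * total / (SELECT SUM(total) FROM per_region), 1)
    FROM per_region
    ORDER BY region
    """
).fetchall()
print(rows)
```

On Redshift the same two-stage shape runs in parallel across a columnar, distributed store; the SQL structure is what carries over.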
By: AWS     Published Date: Jun 20, 2018
Data and analytics have become an indispensable part of gaining and keeping a competitive edge. But many legacy data warehouses introduce a new challenge for organizations trying to manage large data sets: only a fraction of their data is ever made available for analysis. We call this the “dark data” problem: companies know there is value in the data they collected, but their existing data warehouse is too complex, too slow, and just too expensive to use. A modern data warehouse is designed to support rapid data growth and interactive analytics over a variety of relational, non-relational, and streaming data types through a single, easy-to-use interface. It provides a common architectural platform for applying new big data technologies to existing data warehouse methods, thereby enabling organizations to derive deeper business insights. Key elements of a modern data warehouse:
• Data ingestion: take advantage of relational, non-relational, and streaming data sources
• Federated queries
Tags : 
     AWS
By: FICO EMEA     Published Date: Jan 25, 2019
Communications service providers (CSPs) have long recognized the potential of data analytics. Yet their early efforts to pull actionable intelligence from the oceans of data they have access to were largely unsuccessful. Many tried a ‘big bang’ approach, building a central repository without knowing what they wanted to do with the data in it. The arrival of artificial intelligence (AI) — its machine learning subset in particular — has changed their thinking and approach. For this Quick Insights report, we surveyed 64 professionals from CSPs around the world who are applying, leveraging, and/or planning to deploy advanced analytics in some capacity at various points across the customer lifecycle.
Tags : analytics, artificial intelligence, customer lifecycle, insights, telecom credit lifecycle, customer acquisition, optimisation
     FICO EMEA
By: IBM     Published Date: Aug 08, 2014
Big data and analytics help insurance companies identify the next best action for customers. With the right solutions, companies can extract, integrate and analyze a large volume and variety of data, from call-center notes and voice recordings to web chats, telematics and social media.
Tags : big data, analytics, insurance, customer service, solutions
     IBM
By: Vertica     Published Date: Feb 23, 2010
Ovum presents a deep-dive technology audit of the Vertica Analytic Database, which is designed specifically for storing and querying large datasets.
Tags : ovum, vertica, analytical databases, dbms, technology audit, mpp, rdbms
     Vertica
By: IBM     Published Date: May 07, 2013
This book brings a practitioner’s view to Big Data analytics. Download this ebook for a practical viewpoint on leveraging analytics for Big Data. The book also draws on material from a large number of workshops and interviews with business and IT leaders. Learn more about Big Data and business analytics through IBM’s latest market-leading solutions, and register for the complimentary virtual event on June 11 hosted by IBM Business Analytics.
Tags : practical viewpoint, analytics, big data, interviews, business
     IBM
By: AWS     Published Date: Nov 28, 2018
Financial institutions run on data: collecting it, analyzing it, delivering meaningful insights, and taking action in real time. As data volumes increase, organizations demand a scalable analytics platform that can meet the needs of data scientists and business users alike. However, managing an on-premises analytics environment for a large and diverse user base can become time-consuming, costly, and unwieldy. Tableau Server on Amazon Web Services (AWS) is helping major Financial Services organizations shift data visualization and analytics workloads to the cloud. The result is fewer hours spent on manual work and more time to ask deeper questions and launch new data analyses, with easily-scalable support for large numbers of users. In this webinar, you’ll hear how one major asset management company made the shift to cloud data visualization with Tableau Server on AWS. Discover lessons learned, best practices tailored to Financial Services organizations, and starting tactics for scalable analytics on the cloud.
Tags : 
     AWS
To get your white papers featured in the insideBIGDATA White Paper Library, contact: Kevin@insideHPC.com