By: NetApp     Published Date: Dec 18, 2013
IT managers have indicated that their two most significant challenges in managing unstructured data at multiple locations are keeping pace with data growth and improving data protection. Learn how the NetApp Distributed Content Repository provides advanced data protection and system recovery capabilities that enable multiple data centers and remote offices to maintain access to data through hardware and software faults. Key benefits:
- Continuous access to file data while maintaining data redundancy, with no administrator intervention needed
- Easy integration and deployment into a distributed environment, providing transparent, centrally managed content storage
- Secure multi-tenancy using security partitions
- Effectively infinite, on-demand capacity with fast access to files and objects in the cloud
- Secure, robust data protection techniques that enable data to persist beyond the life of the storage it resides on
Tags : 
     NetApp
By: IBM     Published Date: Sep 16, 2015
The impact of the 2008 financial crisis affects not only Sell Side firms, the usual focus of discussions, but also Buy Side and Insurance firms. The Dodd-Frank Act targets all systemically important firms. This study, conducted in partnership with Waters Technology, an Incisive Media publication, focuses on these firms. The report finds that they face intense pressure on multiple fronts, including stricter liquidity and solvency risk regulations and changing market conditions. Like Sell Side firms, they require more innovative business models and analytics to meet these challenges and to deliver growth and performance.
Tags : 
     IBM
By: TIBCO     Published Date: Nov 09, 2015
As one of the most exciting and widely adopted open-source projects, Apache Spark in-memory clusters are driving new opportunities for application development as well as increased demand for IT infrastructure. Apache Spark is now the most active Apache project, with more than 600 contributions made in the last 12 months by more than 200 organizations. A new survey of 1,417 IT professionals working with Apache Spark, conducted by Databricks, finds that high-performance analytics applications that can work with big data are driving a large proportion of that demand. Apache Spark is now being used to aggregate multiple types of data in-memory rather than only pulling data from Hadoop. For solution providers, the Apache Spark technology stack is a significant player because it is one of the core technologies used to modernize data warehouses, a huge segment of the IT industry that accounts for multiple billions in revenue. Spark also holds much promise for the future with data lakes, a storage repository.
Tags : 
     TIBCO
By: Impetus     Published Date: Mar 15, 2016
Streaming analytics platforms provide businesses with a method for extracting strategic value from data-in-motion in a manner similar to how traditional analytics tools operate on data-at-rest. Instead of historical analysis, the goal of streaming analytics is to enable near real-time decision making by letting companies inspect, correlate, and analyze data even as it flows into applications and databases from numerous different sources. Streaming analytics allows companies to do event processing against massive volumes of data streaming into the enterprise at high velocity.
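The core idea of event processing on data-in-motion can be sketched with a tumbling-window aggregation: events are bucketed into fixed time windows and summarized per key as they arrive. This is a minimal illustration in plain Python, not any specific vendor's streaming engine; the event format and window size are assumptions for the example.

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds):
    """Bucket (timestamp, key) events into fixed windows and count per key.

    A real streaming engine would do this incrementally as events arrive;
    here we process a small in-memory list to show the windowing logic.
    """
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        bucket = ts - (ts % window_seconds)  # start of the window this event falls in
        windows[bucket][key] += 1
    return {w: dict(counts) for w, counts in windows.items()}

events = [(1, "login"), (3, "click"), (7, "click"), (12, "login")]
print(tumbling_window_counts(events, 10))
# events at t=1,3,7 land in window 0; the event at t=12 lands in window 10
```

A production system would add out-of-order handling and watermarking, but the bucketing step above is the heart of windowed stream aggregation.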
Tags : impetus, guide to stream analytics, real time streaming analytics, streaming analytics, real time analytics, big data analytics
     Impetus
By: Dell and Intel®     Published Date: Jun 18, 2015
The rapid evolution of big data technology in the past few years has changed forever the pursuit of scientific exploration and discovery. Along with traditional experiment and theory, computational modeling and simulation is a third paradigm for science. Its value lies in exploring areas of science in which physical experimentation is unfeasible and insights cannot be revealed analytically, such as in climate modeling, seismology, and galaxy formation. More recently, big data has been called the “fourth paradigm” of science. Big data can be observed, in a real sense, by computers processing it and often by humans reviewing visualizations created from it. In the past, humans had to reduce the data, often using techniques of statistical sampling, to be able to make sense of it. Now, new big data processing techniques will help us make sense of it without traditional reduction.
Tags : 
     Dell and Intel®
By: RedPoint     Published Date: Sep 22, 2014
Enterprises can gain serious traction by taking advantage of the scalability, processing power, and lower costs that Hadoop 2.0/YARN offers. YARN closes the functionality gap by opening Hadoop to mature enterprise-class data management capabilities. With a lot of data quality functionality left outside of Hadoop 1.0, and a lot of data inside HDFS originating outside the enterprise, the quality of the data residing in the Hadoop cluster is sometimes as stinky as elephant dung. Some of the topics discussed in this paper include:
• The key features, benefits, and limitations of Hadoop 1.0
• The benefit of performing data standardization, identity resolution, and master data management inside of Hadoop
• The transformative power of Hadoop 2.0 and its impact on the speed and cost of accessing, cleansing, and delivering high-quality enterprise data
Download this illuminating white paper about what YARN really means to the world of big data management.
Tags : 
     RedPoint
By: RedPoint     Published Date: Sep 22, 2014
The emergence of YARN for the Hadoop 2.0 platform has opened the door to new tools and applications that promise to let more companies reap the benefits of big data in ways never before possible, with outcomes perhaps never imagined. By separating cluster resource management from the data processing function, YARN offers a world beyond MapReduce: less encumbered by complex programming protocols, faster, and at a lower cost. Some of the topics discussed in this paper include:
• Why YARN is important for realizing the power of Hadoop for data integration, quality, and management
• Benchmark results of MapReduce vs. Pig vs. visual “data flow” design tools
• The 3 key features of YARN that solve the complex problems keeping businesses from gaining maximum benefit from Hadoop
Download this paper to learn why the power of Hadoop 2.0 lies in enabling applications to run inside Hadoop, without the constraints of MapReduce.
Tags : 
     RedPoint
By: GridGain     Published Date: Sep 24, 2014
In-memory computing (IMC) is an emerging field of importance in the big data industry. It is a quickly evolving technology, seen by many as an effective way to address the proverbial 3 V's of big data: volume, velocity, and variety. Big data requires ever more powerful means to process and analyze growing stores of data, collected at more rapid rates and with increasing diversity in the types of data being sought, both structured and unstructured. In-memory computing's rapid rise in the marketplace has the big data community on alert. In fact, Gartner picked in-memory computing as one of its Top Ten Strategic Initiatives.
Tags : gridgain, in memory computing, big data industry, 3v's of big data-volume
     GridGain
By: TIBCO     Published Date: Sep 02, 2014
In this Guide you will learn how predictive analytics helps your organization predict with confidence what will happen next, so that you can make smarter decisions and improve business outcomes. It is important to adopt a predictive analytics solution that meets the specific needs of different users and skill sets, from beginners to experienced analysts to data scientists.
Tags : 
     TIBCO
By: IBM     Published Date: Nov 14, 2014
Platform HPC enables HPC customers to sidestep many of the overhead, cost, and support issues that often plague open-source environments, and to deploy powerful, easy-to-use clusters.
Tags : 
     IBM
By: Intel     Published Date: Sep 16, 2014
In this Guide, we take a look at what Lustre on AWS infrastructure delivers for a broad community of business and commercial organizations struggling with the challenge of big data and demanding storage growth.
Tags : intel, lustre, big data solutions in the cloud
     Intel
By: Dell and Intel®     Published Date: Sep 17, 2014
According to the 2014 IDG Enterprise Big Data research report, companies are intensifying their efforts to derive value through big data initiatives, with nearly half (49%) of respondents already implementing big data projects or planning to do so. Further, organizations are seeing exponential growth in the amount of data managed, with an expected increase of 76% within the next 12-18 months. With growth there are opportunities as well as challenges. Among those facing the big data challenge are finance executives, as this extraordinary growth presents a unique opportunity to leverage data assets like never before. * Intel and the Intel logo are trademarks of Intel Corporation in the U.S. and/or other countries.
Tags : dell and intel, big data for finance
     Dell and Intel®
By: Dell and Intel®     Published Date: Apr 02, 2015
In this Guide we have delivered the case for the benefits of big data technology applied to the needs of the manufacturing industry. In demonstrating the value of big data, we included:
• An overview of how manufacturing can benefit from the big data technology stack
• A high-level view of common big data pain points for manufacturers
• A detailed analysis of big data technology for manufacturers
• A view of how manufacturers are going about big data adoption
• A proven case study with Omneo
• Dell PowerEdge servers with Intel® Xeon® processors
Tags : dell, intel, big data, manufacturing, technology stack, pain points, big data adoption, omneo
     Dell and Intel®
By: Dell and Intel®     Published Date: Sep 06, 2015
In conclusion, the retail experience has changed dramatically in recent years as power has shifted to consumers. Shoppers can easily find and compare products from an array of devices, even while walking through a store. They can share their opinions about retailers and products through social media and influence other prospective customers. To compete in this new multi-channel environment, we've seen in this guide how retailers have to adopt new and innovative strategies to attract and retain customers. Big data technologies, specifically Hadoop, enable retailers to connect with customers through multiple channels at an entirely new level by harnessing the vast volumes of new data available today. Hadoop helps retailers store, transform, integrate, and analyze a wide variety of online and offline customer data: POS transactions, e-commerce transactions, clickstream data, email, social media, sensor data, and call center records, all in one central repository.
Tags : 
     Dell and Intel®
By: GridGain     Published Date: Mar 10, 2015
Software as a Service (SaaS) is a software distribution model in which applications are hosted by a vendor or service provider and made available to customers over the Internet. Instead of companies installing software on their own servers, known as the on-premises distribution model, application software providers host the software in the cloud and charge customers according to the time they spend using the software, or based on a monthly or annual fee. SaaS is becoming increasingly popular, and as the industry develops, more and more companies are dropping older business models in favor of this rapidly evolving methodology.
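The two pricing models described above, pay-per-use versus a flat subscription fee, can be compared with a few lines of arithmetic. The rates below are invented purely for illustration; real SaaS pricing varies widely by vendor.

```python
# Hypothetical rates, chosen only to make the comparison concrete.
HOURLY_RATE = 0.60   # charge per hour of use (pay-per-use model)
MONTHLY_FEE = 79.0   # flat monthly subscription fee

def cheaper_plan(hours_per_month):
    """Return which pricing model costs less for a given monthly usage."""
    usage_cost = hours_per_month * HOURLY_RATE
    if usage_cost < MONTHLY_FEE:
        return ("usage", usage_cost)
    return ("subscription", MONTHLY_FEE)

print(cheaper_plan(50))   # light user: pay-per-use wins -> ('usage', 30.0)
print(cheaper_plan(200))  # heavy user: flat fee wins -> ('subscription', 79.0)
```

The break-even point (here, about 132 hours per month) is exactly the kind of calculation customers weigh when choosing between usage-based and subscription SaaS pricing.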
Tags : gridgain, saas, saas perfomance and scalability, in memory computing, data fabric, paas for saas, data grid, real-time streaming, hadoop
     GridGain
By: Kx Systems     Published Date: Jan 16, 2015
Kdb+ is a column-based relational database with extensive in-memory capabilities, developed and marketed by Kx Systems. Like all such products, it is especially powerful when it comes to supporting queries and analytics. However, unlike other products in this domain, kdb+ is particularly good, both in terms of performance and functionality, at processing, manipulating, and analysing data (especially numeric data) in real-time, alongside the analysis of historical data. Moreover, it has extensive capabilities for supporting time-series data. For these reasons Kx Systems has historically targeted the financial industry for trading analytics and black box trading based on real-time and historic data, as well as real-time risk assessment; applications which are particularly demanding in their performance requirements. The company has had significant success in this market, with over a hundred major financial institutions and hedge funds deploying its technology.
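The column-based layout described above is what makes analytics like these fast: a query touches only the columns it needs, scanned contiguously. Here is a minimal sketch of the idea in plain Python, not kdb+/q itself; the table, symbols, and prices are made up, and a real column store would use typed, contiguous arrays rather than Python lists.

```python
# A "table" as a dict of parallel lists: one list per column.
# This mirrors the columnar layout of kdb+ conceptually, not its implementation.
trades = {
    "sym":   ["AAPL", "MSFT", "AAPL", "MSFT"],
    "price": [189.5, 410.2, 190.1, 409.8],
    "size":  [100, 50, 200, 75],
}

def vwap(table, symbol):
    """Volume-weighted average price: a classic trading-analytics aggregate.

    Only three columns are scanned; rows not matching the symbol are skipped.
    """
    total_value = total_volume = 0.0
    for s, p, q in zip(table["sym"], table["price"], table["size"]):
        if s == symbol:
            total_value += p * q
            total_volume += q
    return total_value / total_volume

print(round(vwap(trades, "AAPL"), 2))  # (189.5*100 + 190.1*200) / 300 = 189.9
```

In kdb+ the same aggregate is a one-line q expression over columns, which is precisely why the columnar model suits time-series trading analytics.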
Tags : kx systems, kdb+, relational database
     Kx Systems
By: snowflake     Published Date: Jun 09, 2016
Data and the way that data is used have changed, but data warehousing has not. Today’s premises-based data warehouses are based on technology that is, at its core, two decades old. To meet the demands and opportunities of today, data warehouses have to fundamentally change.
Tags : 
     snowflake
By: Hewlett Packard Enterprise     Published Date: Jan 31, 2019
This book helps you understand both sides of the hybrid IT equation and how HPE can help your organization transform its IT operations and save time and money in the process. I delve into the worlds of security, economics, and operations to show you new ways to support your business workloads.
Tags : 
     Hewlett Packard Enterprise
By: Hewlett Packard Enterprise     Published Date: Jan 31, 2019
With the maturing of the all-flash array (AFA) market, the established market leaders in this space are turning their attention to ways to differentiate themselves from their competition beyond product functionality alone. Consciously designing and driving a better customer experience (CX) is a strategy being pursued by many of these vendors. This white paper defines cloud-based predictive analytics, discusses the evolving storage requirements that are driving their use, and takes a look at how these platforms are being used to drive incremental value for public sector organizations in the areas of performance, availability, management, recovery, and information technology (IT) infrastructure planning.
Tags : 
     Hewlett Packard Enterprise
By: Workplace by Facebook     Published Date: Dec 21, 2018
Technology is creating new expectations for openness within the workplace, especially amongst a younger generation raised on mobile, messaging, and instant access to information. Download this infographic now to see how Workplace by Facebook creates a seamless experience inside the workplace through openness and transparency.
Tags : 
     Workplace by Facebook
By: Workplace by Facebook     Published Date: Dec 21, 2018
We believe people change organizations. So, we built Workplace by Facebook to empower them. Our mission is to unlock human potential by giving the world a place to work together. We do it by combining next-generation technology and easy-to-use features to transform communications, culture, and workflows inside organizations of all shapes, sizes, and industries. Industries like retail. In this playbook, we'll explore the new consumer expectations shaping the future of retail in North America. You'll discover why great brand experiences for your customers start with great work experiences for your employees. Download this whitepaper now to learn more about the benefits of customer-centric collaboration tools like Workplace by Facebook. Then you'll be ready to take the next step on your digital transformation journey.
Tags : 
     Workplace by Facebook
By: TIBCO Software EMEA     Published Date: Jan 17, 2019
Are you considering data virtualization for your organization today? In this paper you will learn 10 core truths about data virtualization and gain essential knowledge for overcoming analytic data bottlenecks and driving better outcomes.
Tags : virtualization, data, analytics, datasets, software, access, integration, projects, tools, scalability
     TIBCO Software EMEA
By: TIBCO Software GmbH     Published Date: Jan 15, 2019
Are you considering data virtualization for your organization today? If so, this whitepaper synthesizes the 10 things you need to know as you commence your data virtualization journey.
Tags : 
     TIBCO Software GmbH
By: TIBCO Software GmbH     Published Date: Jan 15, 2019
Enterprises use data virtualization software such as TIBCO® Data Virtualization to reduce data bottlenecks so that more insights can be delivered for better business outcomes. For developers, data virtualization allows applications to access and use data without needing to know its technical details, such as how it is formatted or where it is physically located. It also helps them rapidly create reusable data services that access and transform data and deliver data analytics, with even heavy-lifting reads completed quickly, securely, and with high performance. These data services can then be coalesced into a common data layer that can support a wide range of analytic and application use cases. Data engineers and analytics development teams are big data virtualization users, with Gartner predicting over 50% of these teams adopting the technology by 202
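The common data layer described above can be sketched as a thin abstraction: consumers issue queries against one interface, while per-source adapters hide each source's format and location. This is an illustrative toy in plain Python, not the TIBCO Data Virtualization API; all class and method names here are invented for the example.

```python
class CsvSource:
    """Adapter for CSV text; the consumer never sees the raw format."""
    def __init__(self, text):
        self._text = text

    def rows(self):
        header, *lines = self._text.strip().splitlines()
        cols = header.split(",")
        return [dict(zip(cols, line.split(","))) for line in lines]

class DictSource:
    """Adapter for data already held as Python records (e.g. from an API)."""
    def __init__(self, records):
        self._records = records

    def rows(self):
        return list(self._records)

class VirtualLayer:
    """One query interface over heterogeneous sources: the 'virtual' part."""
    def __init__(self, **sources):
        self._sources = sources

    def query(self, name, predicate=lambda r: True):
        return [r for r in self._sources[name].rows() if predicate(r)]

layer = VirtualLayer(
    orders=CsvSource("id,region\n1,EU\n2,US"),
    users=DictSource([{"id": "7", "region": "EU"}]),
)
print(layer.query("orders", lambda r: r["region"] == "EU"))
# the caller neither knows nor cares that 'orders' came from CSV
```

A real data virtualization platform adds query pushdown, caching, and security on top, but the reusable-service idea, one logical layer over many physical sources, is the same.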
Tags : 
     TIBCO Software GmbH
By: Zendesk     Published Date: Jan 04, 2019
In the global customer support software market, Zendesk is once again recognized as a Leader in the 2018 Gartner Magic Quadrant for the CRM Customer Engagement Center. Every year, Gartner conducts a thorough analysis of service providers in the customer service and support application sector. The Gartner Magic Quadrant for the CRM Customer Engagement Center report offers valuable insights for business leaders seeking technology solutions for interacting with and engaging their customers. Once again, Zendesk appears in the 2018 report in the Leader quadrant, which we consider a reflection of the success of our 125,000 customers, including companies such as Airbnb, Tesco, and the University of Tennessee. The past year included a number of significant milestones for us, including the launch of enhanced artificial intelligence capabilities for self-service, with a run rate that surpassed US$500 million.
Tags : 
     Zendesk