By: NetApp     Published Date: Dec 18, 2013
IT managers have indicated that their two most significant challenges in managing unstructured data at multiple locations are keeping pace with data growth and improving data protection. Learn how the NetApp Distributed Content Repository provides advanced data protection and system recovery capabilities that enable multiple data centers and remote offices to maintain access to data through hardware and software faults. Key benefits are:
- Continuous access to file data while maintaining data redundancy, with no administrator intervention needed.
- Easy integration and deployment into a distributed environment, providing transparent, centrally managed content storage.
- Secure multi-tenancy using security partitions.
- Effectively infinite, on-demand capacity with fast access to files and objects in the cloud.
- Secure, robust data protection techniques that enable data to persist beyond the life of the storage it resides on.
Tags : 
     NetApp
By: IBM     Published Date: Sep 16, 2015
The impact of the 2008 financial crisis affects not only Sell Side firms, the usual focus of discussion, but also Buy Side and Insurance firms; the Dodd-Frank Act targets all systemically important firms. This study, conducted in partnership with Waters Technology, an Incisive Media publication, focuses on these firms. The report finds that they are facing intense pressure on multiple fronts, including stricter liquidity and solvency risk regulations and changing market conditions. Like Sell Side firms, they require more innovative business models and analytics to meet these challenges and to deliver growth and performance.
Tags : 
     IBM
By: TIBCO     Published Date: Nov 09, 2015
As one of the most exciting and widely adopted open-source projects, Apache Spark's in-memory clusters are driving new opportunities for application development as well as increased demand for IT infrastructure. Apache Spark is now the most active Apache project, with more than 600 contributions made in the last 12 months by more than 200 organizations. A new survey of 1,417 IT professionals working with Apache Spark, conducted by Databricks, finds that high-performance analytics applications that can work with big data are driving a large proportion of that demand. Apache Spark is now being used to aggregate multiple types of data in memory, rather than only pulling data from Hadoop. For solution providers, the Apache Spark technology stack is a significant player because it's one of the core technologies used to modernize data warehouses, a huge segment of the IT industry that accounts for multiple billions in revenue. Spark holds much promise for the future—with data lakes—a storage repo
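As a loose illustration of that in-memory aggregation pattern (a sketch, not code from the survey), the following assumes a PySpark environment; the file paths and column names are placeholders:

```python
# Minimal PySpark sketch: aggregating two data sources in memory.
# Paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("in-memory-aggregation").getOrCreate()

# Load structured and semi-structured sources (locations are illustrative).
orders = spark.read.parquet("hdfs:///data/orders")            # e.g., from Hadoop
events = spark.read.json("s3a://example-bucket/clickstream")  # e.g., from object storage

# Cache both datasets so repeated queries hit memory rather than disk.
orders.cache()
events.cache()

# Join and aggregate across both types of data entirely in memory.
summary = (
    orders.join(events, "customer_id")
          .groupBy("customer_id")
          .agg(F.sum("order_total").alias("revenue"),
               F.count("event_id").alias("click_events"))
)
summary.show()
```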
Tags : 
     TIBCO
By: Impetus     Published Date: Mar 15, 2016
Streaming analytics platforms give businesses a way to extract strategic value from data-in-motion, much as traditional analytics tools operate on data-at-rest. Instead of historical analysis, the goal of streaming analytics is to enable near real-time decision making by letting companies inspect, correlate and analyze data even as it flows into applications and databases from numerous different sources. Streaming analytics allows companies to do event processing against massive volumes of data streaming into the enterprise at high velocity.
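To make the data-in-motion idea concrete, here is a minimal, self-contained Python sketch of sliding-window event processing; the simulated feed, threshold, and window size are all hypothetical:

```python
# Minimal sketch of windowed event processing on a stream.
# Pure Python, no external dependencies; the event feed is simulated.
from collections import deque
import time

WINDOW_SECONDS = 60

def alert_on_spike(events, threshold=100):
    """Count events in a sliding time window and flag spikes as data flows in."""
    window = deque()  # (timestamp, event) pairs currently inside the window
    for ts, event in events:
        window.append((ts, event))
        # Evict events older than the window.
        while window and window[0][0] < ts - WINDOW_SECONDS:
            window.popleft()
        if len(window) > threshold:
            yield ts, len(window)  # near real-time decision point

# Simulated feed: in practice this would be a Kafka or socket consumer.
feed = ((time.time() + i * 0.1, {"id": i}) for i in range(500))
for ts, count in alert_on_spike(feed):
    print(f"spike at {ts:.0f}: {count} events in window")
    break
```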
Tags : impetus, guide to stream analytics, real time streaming analytics, streaming analytics, real time analytics, big data analytics
     Impetus
By: Dell and Intel®     Published Date: Jun 18, 2015
The rapid evolution of big data technology in the past few years has forever changed the pursuit of scientific exploration and discovery. Along with traditional experiment and theory, computational modeling and simulation is a third paradigm for science. Its value lies in exploring areas of science in which physical experimentation is infeasible and insights cannot be revealed analytically, such as climate modeling, seismology and galaxy formation. More recently, big data has been called the “fourth paradigm” of science. Big data can be observed, in a real sense, by computers processing it and often by humans reviewing visualizations created from it. In the past, humans had to reduce the data, often using techniques of statistical sampling, to be able to make sense of it. Now, new big data processing techniques will help us make sense of it without traditional reduction.
Tags : 
     Dell and Intel®
By: RedPoint     Published Date: Sep 22, 2014
Enterprises can gain serious traction by taking advantage of the scalability, processing power and lower costs that Hadoop 2.0/YARN offers. YARN closes the functionality gap by opening Hadoop to mature enterprise-class data management capabilities. With a lot of data quality functionality left outside of Hadoop 1, and a lot of data inside HDFS originating outside the enterprise, the quality of the data residing in the Hadoop cluster is sometimes as stinky as elephant dung. Some of the topics discussed in this paper include:
• The key features, benefits and limitations of Hadoop 1.0
• The benefit of performing data standardization, identity resolution, and master data management inside of Hadoop
• The transformative power of Hadoop 2.0 and its impact on the speed and cost of accessing, cleansing and delivering high-quality enterprise data
Download this illuminating white paper about what YARN really means to the world of big data management.
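As a rough illustration of what data standardization and identity resolution mean in practice (not RedPoint's implementation; the field names and matching rules below are made up):

```python
# Illustrative sketch of record standardization and simple identity resolution.
# Field names and matching rules are hypothetical.
import re

def standardize(record):
    """Normalize name and phone fields so equivalent records compare equal."""
    name = re.sub(r"\s+", " ", record["name"]).strip().lower()
    phone = re.sub(r"\D", "", record.get("phone", ""))  # keep digits only
    return {"name": name, "phone": phone}

def resolve_identities(records):
    """Group records sharing a normalized phone number under one master id."""
    masters = {}  # normalized phone -> master id
    resolved = []
    for rec in map(standardize, records):
        master_id = masters.setdefault(rec["phone"], len(masters))
        resolved.append({**rec, "master_id": master_id})
    return resolved

raw = [
    {"name": "Ada  Lovelace", "phone": "(555) 010-0001"},
    {"name": "ada lovelace",  "phone": "555-010-0001"},
]
print(resolve_identities(raw))  # both records resolve to master_id 0
```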
Tags : 
     RedPoint
By: RedPoint     Published Date: Sep 22, 2014
The emergence of YARN for the Hadoop 2.0 platform has opened the door to new tools and applications that promise to allow more companies to reap the benefits of big data in ways never before possible, with outcomes perhaps never imagined. By separating cluster resource management from the data processing function, YARN offers a world beyond MapReduce: less encumbered by complex programming protocols, faster, and at a lower cost. Some of the topics discussed in this paper include:
• Why is YARN important for realizing the power of Hadoop for data integration, quality and management?
• Benchmark results of MapReduce vs. Pig vs. visual “data flow” design tools
• The 3 key features of YARN that solve the complex problems keeping businesses from gaining maximum benefit from Hadoop
Download this paper to learn why the power of Hadoop 2.0 lies in enabling applications to run inside Hadoop, without the constraints of MapReduce; a toy contrast of the two programming styles follows.
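To illustrate what "less encumbered by complex programming protocols" means, this sketch contrasts a MapReduce-style word count with a direct data-flow version of the same job. It is a toy example in plain Python, not benchmark code from the paper:

```python
# Toy contrast: the MapReduce programming protocol vs. a direct
# "data flow" expression of the same word-count job.
from collections import defaultdict
from itertools import groupby
from operator import itemgetter

docs = ["big data on yarn", "yarn runs beyond mapreduce"]

# --- MapReduce style: explicit map, shuffle/sort, and reduce phases ---
def mapper(doc):
    for word in doc.split():
        yield word, 1

def reducer(word, counts):
    return word, sum(counts)

mapped = [kv for doc in docs for kv in mapper(doc)]
mapped.sort(key=itemgetter(0))  # the "shuffle/sort" step
mr_counts = dict(
    reducer(word, (count for _, count in group))
    for word, group in groupby(mapped, key=itemgetter(0))
)

# --- Data flow style: the same result without the protocol ceremony ---
df_counts = defaultdict(int)
for doc in docs:
    for word in doc.split():
        df_counts[word] += 1

assert mr_counts == dict(df_counts)
```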
Tags : 
     RedPoint
By: GridGain     Published Date: Sep 24, 2014
In-memory computing (IMC) is an emerging field of importance in the big data industry. It is a quickly evolving technology, seen by many as an effective way to address the proverbial 3 V’s of big data—volume, velocity, and variety. Big data requires ever more powerful means to process and analyze growing stores of data, being collected at more rapid rates, and with increasing diversity in the types of data being sought—both structured and unstructured. In-memory computing’s rapid rise in the marketplace has the big data community on alert. In fact, Gartner picked in-memory computing as one of the Top Ten Strategic Initiatives.
Tags : gridgain, in memory computing, big data industry, 3v's of big data-volume
     GridGain
By: TIBCO     Published Date: Sep 02, 2014
In this Guide you will learn how predictive analytics helps your organization predict with confidence what will happen next, so that you can make smarter decisions and improve business outcomes. It is important to adopt a predictive analytics solution that meets the specific needs of different users and skill sets, from beginners to experienced analysts to data scientists.
Tags : 
     TIBCO
By: IBM     Published Date: Nov 14, 2014
Platform HPC enables HPC customers to side-step many of the overhead cost and support issues that often plague open-source environments, enabling them to deploy powerful, easy-to-use clusters.
Tags : 
     IBM
By: Intel     Published Date: Sep 16, 2014
In this Guide, we take a look at what Lustre on AWS infrastructure delivers for a broad community of business and commercial organizations struggling with the challenge of big data and demanding storage growth.
Tags : intel, lustre, big data solutions in the cloud
     Intel
By: Dell and Intel®     Published Date: Sep 17, 2014
According to the 2014 IDG Enterprise Big Data research report, companies are intensifying their efforts to derive value through big data initiatives, with nearly half (49%) of respondents already implementing big data projects or planning to do so in the future. Further, organizations are seeing exponential growth in the amount of data managed, with an expected increase of 76% within the next 12-18 months. With growth there are opportunities as well as challenges. Among those facing the big data challenge are finance executives, as this extraordinary growth presents a unique opportunity to leverage data assets like never before. * Intel and the Intel logo are trademarks of Intel Corporation in the U.S. and/or other countries.
Tags : dell and intel, big data for finance
     Dell and Intel®
By: Dell and Intel®     Published Date: Apr 02, 2015
In this Guide we have delivered the case for the benefits of big data technology applied to the needs of the manufacturing industry. In demonstrating the value of big data, we included:
• An overview of how manufacturing can benefit from the big data technology stack
• A high-level view of common big data pain points for manufacturers
• A detailed analysis of big data technology for manufacturers
• A view as to how manufacturers are going about big data adoption
• A proven case study with Omneo
• Dell PowerEdge servers with Intel® Xeon® processors
Tags : dell, intel, big data, manufacturing, technology stack, pain points, big data adoption, omneo
     Dell and Intel®
By: Dell and Intel®     Published Date: Sep 06, 2015
In conclusion, the retail experience has changed dramatically in recent years as there has been a power shift over to consumers. Shoppers can easily find and compare products from an array of devices, even while walking through a store. They can share their opinions about retailers and products through social media and influence other prospective customers. To compete in this new multi-channel environment, we’ve seen in this guide how retailers have to adopt new and innovative strategies to attract and retain customers. Big data technologies, specifically Hadoop, enable retailers to connect with customers through multiple channels at an entirely new level by harnessing the vast volumes of new data available today. Hadoop helps retailers store, transform, integrate and analyze a wide variety of online and offline customer data—POS transactions, e-commerce transactions, clickstream data, email, social media, sensor data and call center records—all in one central repository. Retailers can
Tags : 
     Dell and Intel®
By: GridGain     Published Date: Mar 10, 2015
Software as a Service (SaaS) is a software distribution model in which applications are hosted by a vendor or service provider and made available to customers over the Internet. Instead of companies installing software on their own servers, known as the on-premises distribution model, application software providers host the software in the cloud and charge customers according to the time they spend using the software, or based on a monthly or annual fee. SaaS is becoming increasingly popular, and as the industry develops, more and more companies are dropping older business models in favor of this rapidly evolving methodology.
Tags : gridgain, saas, saas perfomance and scalability, in memory computing, data fabric, paas for saas, data grid, real-time streaming
     GridGain
By: Kx Systems     Published Date: Jan 16, 2015
Kdb+ is a column-based relational database with extensive in-memory capabilities, developed and marketed by Kx Systems. Like all such products, it is especially powerful when it comes to supporting queries and analytics. However, unlike other products in this domain, kdb+ is particularly good, in terms of both performance and functionality, at processing, manipulating and analysing data (especially numeric data) in real time, alongside the analysis of historical data. Moreover, it has extensive capabilities for supporting time-series data. For these reasons Kx Systems has historically targeted the financial industry for trading analytics and black-box trading based on real-time and historic data, as well as real-time risk assessment; applications which are particularly demanding in their performance requirements. The company has had significant success in this market, with over a hundred major financial institutions and hedge funds deploying its technology. In this paper, however, we wa
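kdb+ queries are written in its q language; as a rough analogy (in pandas, not q) to the kind of time-series operation kdb+ is optimized for, here is an as-of join matching each trade to the latest prior quote, with made-up data:

```python
# Rough pandas analogy to a time-series "as-of" join, the kind of operation
# kdb+/q is optimized for. Data and column names are made up; this is not q
# code and makes no claim about kdb+'s API.
import pandas as pd

trades = pd.DataFrame({
    "time": pd.to_datetime(["09:30:01", "09:30:05", "09:30:09"]),
    "sym": ["AAPL", "AAPL", "AAPL"],
    "price": [190.1, 190.3, 190.2],
})
quotes = pd.DataFrame({
    "time": pd.to_datetime(["09:30:00", "09:30:04", "09:30:08"]),
    "sym": ["AAPL", "AAPL", "AAPL"],
    "bid": [190.0, 190.2, 190.1],
})

# For each trade, pick the most recent quote at or before the trade time.
asof = pd.merge_asof(trades.sort_values("time"),
                     quotes.sort_values("time"),
                     on="time", by="sym")
print(asof[["time", "sym", "price", "bid"]])
```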
Tags : kx systems, kdb+, relational database
     Kx Systems
By: snowflake     Published Date: Jun 09, 2016
Data and the way that data is used have changed, but data warehousing has not. Today’s premises-based data warehouses are based on technology that is, at its core, two decades old. To meet the demands and opportunities of today, data warehouses have to fundamentally change.
Tags : 
     snowflake
By: Intel     Published Date: Jul 17, 2019
Today, business is conducted in a fast-paced, on-demand, globally dispersed environment. Maintaining a competitive edge requires cohesive real-time collaboration. Mobile workers, partners, vendors, suppliers, and even customers expect to be able to work together seamlessly, both inside and outside the firewall. But delivering tools for high-quality collaboration is often challenging, due to the need for businesses of all sizes to support a wide range of personal and business devices. Mixing disparate devices with complex conferencing solutions often leads to frustrated employees, while businesses face a labor-intensive, costly endeavor without a high ROI. In addition, collaboration is evolving, with businesses creating small "huddle" spaces and open workspaces that increase the demand for ubiquitous, effective conferencing. The Intel Unite® solution is a fast, simple, cost-efficient way to deliver a more secure, manageable, high-quality collaboration experience. Whether you select the
Tags : 
     Intel
By: Entrust Datacard     Published Date: Jul 23, 2019
Advanced key and certificate management enables the use of digital credentials even in the most demanding of security environments. Such solutions enable users, whether internal or external to the network, to benefit from both basic and enhanced capabilities in a consistent and secure manner. This document was created to assist organizations in selecting the best PKI solution to meet their business and security needs. It outlines key questions to consider during the selection process to ensure the aforementioned requirements are addressed. This is not intended to be an exhaustive list; it is meant as a starting place to assist you in your review process.
Tags : 
     Entrust Datacard
By: KPMG     Published Date: Sep 04, 2019
Key considerations when developing a strong and customer-friendly approach to intelligent authentication. Bank fraud is on the rise. In fact, according to a recent survey of 43 major banks around the world, it's not just the number of fraud cases that is going up; so, too, is the overall value of fraud. In large part, this increase in fraud is the result of identity theft scams. Rather than attempting some sort of high-stakes virtual bank heist for all the gold in the vault, most online thieves seem content simply stealing money from everyday customers' accounts when they aren't looking.
Tags : 
     KPMG
By: Intel     Published Date: Aug 26, 2019
Distributed cloud architectures will play an important role in Communication Service Providers’ network transformations. The distributed cloud will be made of numerous edge locations, from the customer premises to mobile base stations to central offices, each of which has its benefits and constraints. CSPs will need to consider attributes such as performance, availability, security, integration, and data and traffic volume to optimize the infrastructure for a myriad of workloads. Learn more about Heavy Reading’s survey results on CSPs adopting a distributed cloud strategy and delivering new services in this white paper.
Tags : 
     Intel
By: Google     Published Date: Aug 05, 2019
"Agile BI requires more than just agile dashboards. True agility means prototyping data models quickly so business users can continuously iterate on them. Application development and delivery professionals working on BI initiatives should consider adding DWA platforms to their BI toolbox. This Forrester report discusses how seven data warehouse automation vendors bring Agile options to all phases of BI/analytics application development. Read more to find out how these platforms help facilitate shorter development cycles."
Tags : data warehouse automation, forrester, agile solutions, business intelligence, analytics
     Google
By: Gigamon     Published Date: Jun 21, 2019
Organisations have invested heavily in cybersecurity tools, and yet more than five million data records are lost or stolen every day. The problem is not that today's cybersecurity tools are badly designed or missing features; the problem is that surging volumes of network traffic overwhelm security tools, causing administrators to use sampling or disable advanced features in order to preserve application performance. Security tools and IT staff also don't get all the data they need to detect and respond to outside attacks and insider incidents, because they face "blind spots" in data collection. Read this business brief to find out how to eliminate these blind spots.
Tags : 
     Gigamon
By: Automation Anywhere APAC     Published Date: Aug 15, 2019
In this InfoBrief, we discuss the broad impact of digitization and how organizations are using RPA technologies to drive business outcomes. We also share use cases and examples from different sectors, as well as key factors organizations need to consider when selecting a partner for RPA deployment. Since RPA is the latest buzzword, there is a lot of noise around the topic; we aim to filter out that noise, provide key insights, and offer essential guidance for successful RPA strategies.
Tags : analyst, artificial intelligence, intelligent automation, rpa, digital workforce
     Automation Anywhere APAC
By: Automation Anywhere APAC     Published Date: Aug 15, 2019
Global corporate enterprise AI practitioners are clearly still dealing with infrastructure issues related to talent and technology. End-to-end processes remain stubbornly carbon-based, and rule-based automation is not yet truly scaled globally across the majority of organizations.
Double the AI: The AI & Intelligent Automation Network members went from 21% having deployed Intelligent Enterprise solutions to over 44% in just one year's time.
Over 4/5 expect to deploy AI in under two years: The stated goal for deployment is just under 83% by the end of 2020. Considering that they've essentially got two years, and those ranks have doubled in one year, doubling again in two years is achievable. Incidentally, that same number was only 67% a year ago.
50% expect to be established, globally scaling or refining AI in under two years: Global corporate enterprise is in fact slowly but surely transforming into the intelligent enterprise of tomorrow. Having said that, it will be
Tags : analyst, artificial intelligence, intelligent automation, rpa
     Automation Anywhere APAC