By: NetApp     Published Date: Dec 18, 2013
IT managers report that their two most significant challenges in managing unstructured data across multiple locations are keeping pace with data growth and improving data protection. Learn how the NetApp Distributed Content Repository provides advanced data protection and system recovery capabilities that enable multiple data centers and remote offices to maintain access to data through hardware and software faults. Key benefits:
- Continuous access to file data, with data redundancy maintained and no administrator intervention needed
- Easy integration and deployment into a distributed environment, providing transparent, centrally managed content storage
- Secure multi-tenancy using security partitions
- Effectively infinite, on-demand capacity with fast access to files and objects in the cloud
- Secure, robust data protection techniques that enable data to persist beyond the life of the storage it resides on
Tags : 
     NetApp
By: IBM     Published Date: Sep 16, 2015
The impact of the 2008 financial crisis affects not only Sell Side firms, the usual focus of discussion, but also Buy Side and Insurance firms; the Dodd-Frank Act targets all systemically important firms. This study, conducted in partnership with Waters Technology, an Incisive Media publication, focuses on these firms. The report finds that they are facing intense pressure on multiple fronts, including stricter liquidity and solvency risk regulations and changing market conditions. Like Sell Side firms, they require more innovative business models and analytics to meet these challenges and to deliver growth and performance.
Tags : 
     IBM
By: TIBCO     Published Date: Nov 09, 2015
As one of the most exciting and widely adopted open-source projects, Apache Spark and its in-memory clusters are driving new opportunities for application development as well as increased consumption of IT infrastructure. Apache Spark is now the most active Apache project, with more than 600 contributions made in the last 12 months by more than 200 organizations. A new survey of 1,417 IT professionals working with Apache Spark, conducted by Databricks, finds that high-performance analytics applications that can work with big data are driving a large proportion of that demand. Apache Spark is now being used to aggregate multiple types of data in memory, rather than only pulling data from Hadoop. For solution providers, the Apache Spark technology stack is a significant player because it is one of the core technologies used to modernize data warehouses, a huge segment of the IT industry that accounts for multiple billions in revenue. Spark also holds much promise for the future with data lakes, a storage repository that holds raw data in its native format until it is needed.
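To make the in-memory aggregation pattern concrete, here is a minimal PySpark sketch; the file paths, schema and column names are illustrative assumptions, not anything taken from the survey.

```python
# Minimal PySpark sketch (hypothetical paths/schema) of the pattern the survey
# highlights: aggregating multiple types of data in memory rather than only
# pulling batch data out of Hadoop.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("in-memory-aggregation").getOrCreate()

# Two differently sourced datasets: clickstream events (JSON) and orders (CSV).
clicks = spark.read.json("hdfs:///data/clickstream/")  # path is illustrative
orders = spark.read.csv("hdfs:///data/orders/", header=True, inferSchema=True)

# Cache both in memory so repeated analytics avoid re-reading from disk.
clicks.cache()
orders.cache()

# Join and aggregate entirely in memory.
summary = (clicks.join(orders, "user_id")
                 .groupBy("user_id")
                 .agg(F.count("*").alias("events"),
                      F.sum("order_total").alias("revenue")))
summary.show()
```

Caching both DataFrames keeps repeated queries off disk, which is the property driving the demand the survey describes.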
Tags : 
     TIBCO
By: Impetus     Published Date: Mar 15, 2016
Streaming analytics platforms give businesses a way to extract strategic value from data in motion, much as traditional analytics tools operate on data at rest. Instead of historical analysis, the goal of streaming analytics is to enable near real-time decision making by letting companies inspect, correlate and analyze data even as it flows into applications and databases from numerous sources. Streaming analytics allows companies to do event processing against massive volumes of data streaming into the enterprise at high velocity.
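As a rough illustration of the inspect-as-it-flows idea (a toy sketch in plain Python, not Impetus's platform), the following keeps a sliding window over a simulated event stream and raises an alert without ever landing the data at rest; the event source and 60-second window are invented assumptions.

```python
# Toy streaming-analytics pattern: events are inspected and aggregated as they
# arrive, rather than queried at rest. Window size and source are illustrative.
import time
from collections import deque

WINDOW_SECONDS = 60
window = deque()  # (timestamp, value) pairs currently inside the sliding window

def on_event(timestamp, value):
    """Process one event in near real time."""
    window.append((timestamp, value))
    # Evict events that have fallen out of the time window.
    while window and window[0][0] < timestamp - WINDOW_SECONDS:
        window.popleft()
    # A trivial "analytic": alert when a value spikes above the windowed average.
    avg = sum(v for _, v in window) / len(window)
    if value > 3 * avg:
        print(f"anomaly at {timestamp}: value={value}, window avg={avg:.2f}")

# Simulated high-velocity source; in practice this would be Kafka, MQTT, etc.
for i in range(1000):
    on_event(time.time(), 100 if i == 500 else 1)
```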
Tags : impetus, guide to stream analytics, real time streaming analytics, streaming analytics, real time analytics, big data analytics
     Impetus
By: Dell and Intel®     Published Date: Jun 18, 2015
The rapid evolution of big data technology in the past few years has changed forever the pursuit of scientific exploration and discovery. Alongside traditional experiment and theory, computational modeling and simulation has become a third paradigm for science. Its value lies in exploring areas of science in which physical experimentation is unfeasible and insights cannot be revealed analytically, such as climate modeling, seismology and galaxy formation. More recently, big data has been called the “fourth paradigm” of science. Big data can be observed, in a real sense, by computers processing it and often by humans reviewing visualizations created from it. In the past, humans had to reduce the data, often using techniques of statistical sampling, to be able to make sense of it. Now, new big data processing techniques will help us make sense of it without traditional reduction.
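A small sketch of the contrast the passage draws, using synthetic numbers: the older approach estimates a statistic from a sample, while "fourth paradigm" processing touches every record.

```python
# Classic reduction via statistical sampling vs. processing the full dataset.
# The population here is synthetic; sizes and seed are arbitrary choices.
import random

random.seed(0)
population = [random.gauss(100.0, 15.0) for _ in range(1_000_000)]

sample = random.sample(population, 10_000)   # the traditional reduction step
estimate = sum(sample) / len(sample)
truth = sum(population) / len(population)    # what processing it all yields
print(f"sample estimate: {estimate:.2f}, full-data mean: {truth:.2f}")
```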
Tags : 
     Dell and Intel®
By: RedPoint     Published Date: Sep 22, 2014
Enterprises can gain serious traction by taking advantage of the scalability, processing power and lower costs that Hadoop 2.0/YARN offers. YARN closes the functionality gap by opening Hadoop to mature enterprise-class data management capabilities. With a lot of data quality functionality left outside of Hadoop 1.0, and a lot of data inside HDFS originating outside the enterprise, the quality of the data residing in the Hadoop cluster is sometimes as stinky as elephant dung. Some of the topics discussed in this paper include:
• The key features, benefits and limitations of Hadoop 1.0
• The benefits of performing data standardization, identity resolution and master data management inside of Hadoop
• The transformative power of Hadoop 2.0 and its impact on the speed and cost of accessing, cleansing and delivering high-quality enterprise data
Download this illuminating white paper about what YARN really means to the world of big data management.
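As a hint of what record standardization inside the cluster can look like, here is a hypothetical Hadoop Streaming mapper in Python; the tab-separated field layout and cleanup rules are invented for illustration and are not RedPoint's pipeline.

```python
#!/usr/bin/env python3
# Hypothetical Hadoop Streaming mapper: standardize raw customer records as
# they pass through the cluster. Field layout and rules are illustrative.
import sys

def standardize(line):
    name, phone, state = line.rstrip("\n").split("\t")
    name = " ".join(name.split()).title()                 # normalize spacing/case
    phone = "".join(ch for ch in phone if ch.isdigit())   # keep digits only
    state = state.strip().upper()[:2]                     # two-letter state code
    return "\t".join([name, phone, state])

for line in sys.stdin:
    try:
        print(standardize(line))
    except ValueError:
        pass  # a real job would route malformed records to a reject flow

```

A mapper like this would typically be launched with the hadoop-streaming jar against an HDFS input directory, so the cleansing runs where the data already lives.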
Tags : 
     RedPoint
By: RedPoint     Published Date: Sep 22, 2014
The emergence of YARN for the Hadoop 2.0 platform has opened the door to new tools and applications that promise to let more companies reap the benefits of big data in ways never before possible, with outcomes perhaps never imagined. By separating cluster resource management from the data processing function, YARN offers a world beyond MapReduce: less encumbered by complex programming protocols, faster, and at a lower cost. Some of the topics discussed in this paper include:
• Why YARN matters for realizing the power of Hadoop for data integration, quality and management
• Benchmark results of MapReduce vs. Pig vs. visual “data flow” design tools
• The three key features of YARN that solve the complex problems preventing businesses from gaining maximum benefit from Hadoop
Download this paper to learn why the power of Hadoop 2.0 lies in enabling applications to run inside Hadoop, without the constraints of MapReduce.
Tags : 
     RedPoint
By: GridGain     Published Date: Sep 24, 2014
In-memory computing (IMC) is an emerging field of importance in the big data industry. It is a quickly evolving technology, seen by many as an effective way to address the proverbial three V's of big data: volume, velocity and variety. Big data requires ever more powerful means to process and analyze growing stores of data, collected at ever more rapid rates and with increasing diversity in the types of data sought, both structured and unstructured. In-memory computing's rapid rise in the marketplace has the big data community on alert; in fact, Gartner picked in-memory computing as one of its Top Ten Strategic Initiatives.
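To ground the idea, here is a toy in-memory "grid" in Python that partitions data across memory-resident stores and ships a function to each partition. Real IMC products such as GridGain add distribution across machines, replication and SQL on top, but the disk-avoiding, compute-to-data principle sketched here is the same.

```python
# Toy in-memory data grid: data is partitioned across memory-resident stores
# and computation is sent to the data, avoiding disk I/O entirely.
from concurrent.futures import ThreadPoolExecutor

class InMemoryGrid:
    def __init__(self, partitions=4):
        self.partitions = [dict() for _ in range(partitions)]

    def put(self, key, value):
        # Hash-partition each entry, as distributed IMC platforms do.
        self.partitions[hash(key) % len(self.partitions)][key] = value

    def map_reduce(self, map_fn, reduce_fn):
        # Run map_fn on each partition in parallel, then fold the partials.
        with ThreadPoolExecutor() as pool:
            partials = list(pool.map(map_fn, self.partitions))
        result = partials[0]
        for p in partials[1:]:
            result = reduce_fn(result, p)
        return result

grid = InMemoryGrid()
for i in range(100_000):
    grid.put(i, i % 7)

total = grid.map_reduce(lambda part: sum(part.values()), lambda a, b: a + b)
print(total)
```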
Tags : gridgain, in memory computing, big data industry, 3v's of big data-volume
     GridGain
By: TIBCO     Published Date: Sep 02, 2014
In this Guide you will learn how predictive analytics helps your organization predict with confidence what will happen next, so that you can make smarter decisions and improve business outcomes. It is important to adopt a predictive analytics solution that meets the specific needs of different users and skill sets, from beginners to experienced analysts to data scientists.
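For readers at the beginner end of that spectrum, here is a minimal scikit-learn sketch of the train-on-history, predict-what-happens-next workflow; the data is synthetic and the "churn" framing is purely illustrative.

```python
# Minimal predictive-analytics workflow: fit a model on historical outcomes,
# then score it on held-out data. All data below is synthetic.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 4))                 # e.g., usage, tenure, spend, tickets
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # synthetic "will churn" label

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(random_state=0).fit(X_train, y_train)
print("holdout accuracy:", accuracy_score(y_test, model.predict(X_test)))
```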
Tags : 
     TIBCO
By: IBM     Published Date: Nov 14, 2014
Platform HPC enables HPC customers to sidestep many of the overhead cost and support issues that often plague open-source environments, and lets them deploy powerful, easy-to-use clusters.
Tags : 
     IBM
By: Intel     Published Date: Sep 16, 2014
In this Guide, we take a look at what Lustre on AWS infrastructure delivers for a broad community of business and commercial organizations struggling with the challenge of big data and demanding storage growth.
Tags : intel, lustre, big data solutions in the cloud
     Intel
By: Dell and Intel®     Published Date: Sep 17, 2014
According to the 2014 IDG Enterprise Big Data research report, companies are intensifying their efforts to derive value through big data initiatives, with nearly half (49%) of respondents already implementing big data projects or planning to do so. Further, organizations are seeing exponential growth in the amount of data managed, with an expected increase of 76% within the next 12-18 months. With growth come opportunities as well as challenges. Among those facing the big data challenge are finance executives, as this extraordinary growth presents a unique opportunity to leverage data assets like never before. * Intel and the Intel logo are trademarks of Intel Corporation in the U.S. and/or other countries.
Tags : dell and intel, big data for finance
     Dell and Intel®
By: Dell and Intel®     Published Date: Apr 02, 2015
In this Guide we have made the case for the benefits of big data technology applied to the needs of the manufacturing industry. In demonstrating the value of big data, we include:
• An overview of how manufacturing can benefit from the big data technology stack
• A high-level view of common big data pain points for manufacturers
• A detailed analysis of big data technology for manufacturers
• A view of how manufacturers are going about big data adoption
• A proven case study with Omneo
• Dell PowerEdge servers with Intel® Xeon® processors
Tags : dell, intel, big data, manufacturing, technology stack, pain points, big data adoption, omneo
     Dell and Intel®
By: Dell and Intel®     Published Date: Sep 06, 2015
In conclusion, the retail experience has changed dramatically in recent years as power has shifted to consumers. Shoppers can easily find and compare products from an array of devices, even while walking through a store, and they can share their opinions about retailers and products through social media, influencing other prospective customers. To compete in this new multi-channel environment, we have seen in this guide how retailers must adopt new and innovative strategies to attract and retain customers. Big data technologies, specifically Hadoop, enable retailers to connect with customers through multiple channels at an entirely new level by harnessing the vast volumes of new data available today. Hadoop helps retailers store, transform, integrate and analyze a wide variety of online and offline customer data, including POS transactions, e-commerce transactions, clickstream data, email, social media, sensor data and call center records, all in one central repository.
Tags : 
     Dell and Intel®
By: GridGain     Published Date: Mar 10, 2015
Software as a Service (SaaS) is a software distribution model in which applications are hosted by a vendor or service provider and made available to customers over the Internet. Instead of companies installing software on their own servers, known as the on-premises distribution model, application software providers host the software in the cloud and charge customers according to the time they spend using it, or a monthly or annual fee. SaaS is becoming increasingly popular, and as the industry develops, more and more companies are dropping older business models in favor of this rapidly evolving methodology.
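The two charging schemes described above can be made concrete with a toy calculation; the rates below are invented for illustration only.

```python
# Toy comparison of the two SaaS pricing schemes: metered usage time vs. a
# flat subscription fee. Both rates are assumptions, not any vendor's prices.
HOURLY_RATE = 0.75    # $ per metered hour (assumption)
MONTHLY_FEE = 99.00   # $ flat monthly subscription (assumption)

def monthly_charge(hours_used, plan="metered"):
    if plan == "metered":
        return hours_used * HOURLY_RATE   # pay for time actually used
    return MONTHLY_FEE                    # flat fee regardless of usage

# A customer using ~150 hours/month comes out ahead on the flat plan:
print(monthly_charge(150, "metered"))  # 112.5
print(monthly_charge(150, "flat"))     # 99.0
```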
Tags : gridgain, saas, saas perfomance and scalability, in memory computing, data fabric, paas for saas, data grid, real-time streaming, hadoop
     GridGain
By: Kx Systems     Published Date: Jan 16, 2015
Kdb+ is a column-based relational database with extensive in-memory capabilities, developed and marketed by Kx Systems. Like all such products, it is especially powerful when it comes to supporting queries and analytics. However, unlike other products in this domain, kdb+ is particularly good, both in terms of performance and functionality, at processing, manipulating and analysing data (especially numeric data) in real time, alongside the analysis of historical data. Moreover, it has extensive capabilities for supporting time-series data. For these reasons Kx Systems has historically targeted the financial industry for trading analytics and black-box trading based on real-time and historic data, as well as real-time risk assessment, applications which are particularly demanding in their performance requirements. The company has had significant success in this market, with over a hundred major financial institutions and hedge funds deploying its technology.
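Since q code is outside the scope of this catalog, here is a pandas analogy, with made-up trade and quote data, of the kind of columnar, time-ordered query kdb+ is known for: an as-of join that matches each trade to the latest quote at or before it, similar in shape to q's aj.

```python
# As-of join of trades against quotes over in-memory columns, the canonical
# time-series query in trading analytics. Data below is invented.
import pandas as pd

trades = pd.DataFrame({
    "time": pd.to_datetime(["2024-01-02 09:30:01",
                            "2024-01-02 09:30:05",
                            "2024-01-02 09:30:09"]),
    "sym": ["AAPL"] * 3,
    "price": [189.10, 189.25, 189.05],
})
quotes = pd.DataFrame({
    "time": pd.to_datetime(["2024-01-02 09:30:00",
                            "2024-01-02 09:30:04",
                            "2024-01-02 09:30:08"]),
    "sym": ["AAPL"] * 3,
    "bid": [189.00, 189.20, 189.00],
})

# Each trade picks up the most recent quote at or before its timestamp.
joined = pd.merge_asof(trades, quotes, on="time", by="sym")
print(joined)
```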
Tags : kx systems, kdb+, relational database
     Kx Systems
By: snowflake     Published Date: Jun 09, 2016
Data and the way that data is used have changed, but data warehousing has not. Today’s on-premises data warehouses are built on technology that is, at its core, two decades old. To meet the demands and opportunities of today, data warehouses have to fundamentally change.
Tags : 
     snowflake
By: Cisco EMEA     Published Date: Mar 26, 2019
Imagine if you could see deep into the future. And way back into the past, both at the same time. Imagine having visibility of everything that had ever happened and everything that was ever going to happen, everywhere, all at once. And then imagine processing power strong enough to make sense of all this data in every language and in every dimension. Unless you’ve achieved that digital data nirvana (and you haven’t told the rest of us), you’re going to have some unknowns in your world. In the world of security, unknown threats exist outside the enterprise in the form of malicious actors, state-sponsored attacks and malware that moves fast and destroys everything it touches. The unknown exists inside the enterprise in the form of insider threat from rogue employees or careless contractors – deemed by 24% of our survey respondents to pose the most serious risk to their organizations. The unknown exists in the form of new devices, new cloud applications, and new data. The unknown is what keeps CISOs, what keeps you, up at night – and we know because we asked you.
Tags : 
     Cisco EMEA
By: Dell EMC     Published Date: Feb 14, 2019
Disaster recovery and long-term retention (LTR) of data can be challenging for mid-sized organizations. Keeping a secondary site up for disaster recovery can be expensive, and dealing with tape for LTR can be slow and costly. IDPA DD4400 enables mid-sized organizations to take advantage of cloud efficiencies for data protection, with cloud disaster recovery and long-term retention. Download this summary from Dell and Intel® to learn more. Intel Inside®. Powerful Productivity Outside.
Tags : 
     Dell EMC
By: Dell EMC     Published Date: Feb 14, 2019
Today’s organizations must have IT solutions that can handle both current and emerging workloads. The modular design of PowerEdge MX, powered by Intel® Xeon® Scalable processors, meets that demand. Access this Dell brief to learn the five ways PowerEdge MX can help you scale, secure and simplify your IT. Intel, the Intel logo, Xeon, and Xeon Inside are trademarks of Intel Corporation or its subsidiaries in the U.S. and/or other countries.
Tags : 
     Dell EMC
By: Dell EMC     Published Date: Feb 14, 2019
Technology is quickly moving to the forefront as organizations undertake digital and IT transformation projects that enable strategic differentiation in a world where users are leveraging applications and data in new ways. The reality is, most organizations were not born digital but instead have legacy business processes, applications, and infrastructure that require modernization and automation. Read this executive summary from Dell and Intel® to learn why businesses must embark on IT transformation to modernize and automate their legacy infrastructure and prime themselves to achieve their digital business goals. Intel Inside®. Powerful Productivity Outside.
Tags : 
     Dell EMC
By: Dell EMC     Published Date: Feb 14, 2019
Isilon scale-out NAS delivers the analytics performance and extreme concurrency at scale needed to feed the most data-hungry analytic algorithms. Access this overview from Dell and Intel® to learn more. Intel Inside®. Powerful Productivity Outside.
Tags : 
     Dell EMC
By: Schneider Electric     Published Date: Mar 28, 2019
Implementing prefabricated modular data centers delivers well-understood benefits, including speed of deployment, predictability, scalability, and lifecycle cost. The process of deploying them – from designing the data center, to preparing the site, to procuring the equipment, to installation – is quite different from that of a traditional data center. This paper presents practical considerations, guidance, and results that a data center manager should expect from such a deployment.
Tags : predictability, scalability, life cycle cost, schneider electric, - prefabricated data center, modular data center, data center design, data center manager
     Schneider Electric
By: Schneider Electric     Published Date: Mar 28, 2019
To win the colocation race you need to be faster, reliable, innovative and efficient, all while making smarter design choices that ensure positive returns. Customers, from small enterprises to large Internet giants, are demanding 100% uptime and always-on connectivity, and colocation providers need to meet these expectations. The growing adoption of prefabricated data centers allows just that. With the undisputed benefits of prefab modules and building components (like speed and quality), colocation providers can manage their business today and deploy faster in the future. Chris Crosby, CEO of Compass Datacenters, is well known for his expertise in the data center industry. From its founding in 2012, Compass’ data center solutions have used prefabricated components like exterior walls and power centers to deliver brandable, dedicated facilities for colocation providers. Prefabrication is the central element of the company’s “Kit of Parts” methodology, which delivers customizable data center solutions from the core to the edge. By attending this webinar, colocation providers will:
• Understand the flexibility and value delivered via the use of prefabricated construction
• Hear the common misperceptions regarding prefabricated modules and data center components
• Learn how prefabricated solutions can provide more revenue generation capability than competing alternatives
• Know what key things to consider when evaluating prefabricated data center design
Tags : data centers, colocation provider, schneider electric, - modular data centers, prefabricated data center
     Schneider Electric