By: NetApp     Published Date: Dec 19, 2013
SAP HANA enables real-time access to mission-critical business data, revolutionizing the way existing information can be used to address ever-changing business requirements. This white paper describes both the business and technical benefits of implementing the Cisco UCS with NetApp Storage for SAP HANA solution.
Tags : 
     NetApp
By: IBM     Published Date: Sep 02, 2014
Life Sciences organizations need to build IT infrastructures that are dynamic, scalable, and easy to deploy and manage, with simplified provisioning, high performance, and high utilization, and that can exploit both data-intensive and server-intensive workloads, including Hadoop MapReduce. Solutions must scale, in terms of both processing and storage, in order to better serve the institution long-term. Managing the data life cycle and making data usable for mainstream analyses and applications are important aspects of system design. This presentation describes these IT requirements and how Technical Computing solutions from IBM and Platform Computing address the challenges, delivering greater ROI and accelerated time to results for Life Sciences.
Tags : 
     IBM
By: IBM     Published Date: Sep 02, 2014
With tougher regulations and continuing market volatility, financial firms are moving to active risk management with a focus on counterparty risk. Firms are revamping their risk and trading practices from top to bottom. They are adopting new risk models and frameworks that support a holistic view of risk. Banks recognize that technology is critical for this transformation, and are adding state-of-the-art enterprise risk management solutions, high performance data and grid management software, and fast hardware. Join IBM Algorithmics and IBM Platform Computing to gain insights on this trend and on technologies for enabling active "real-time" risk management.
Tags : 
     IBM
By: IBM     Published Date: Sep 02, 2014
In today’s stringent financial services regulatory environment, with exponential growth of data and dynamic business requirements, risk analytics has become integral to businesses. IBM Algorithmics provides sophisticated analyses for a wide range of economic scenarios that better quantify risk for multiple departments within a firm, or across the enterprise. With Algorithmics, firms have a better handle on their financial exposure and credit risks before they finalize real-time transactions. But this requires the performance and agility of a scalable infrastructure, which drives up IT risk and complexity. The IBM Application Ready Solution for Algorithmics provides an agile, reliable and high-performance infrastructure that delivers trusted risk insights for sustained growth and profitability. This integrated offering with a validated reference architecture delivers the right risk insights at the right time while lowering total cost of ownership.
Tags : ibm, it risk, financial risk analytics
     IBM
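As a rough illustration of the scenario-based risk computation described in the two IBM entries above (not IBM Algorithmics' actual methodology), the Python sketch below estimates one-day value-at-risk for a hypothetical portfolio by Monte Carlo simulation; the portfolio weights, drifts and volatilities are invented for the example.

```python
import random

def monte_carlo_var(weights, daily_drift, daily_vol,
                    n_scenarios=100_000, confidence=0.99, seed=7):
    """Estimate one-day value-at-risk (VaR) by scenario simulation.

    Each scenario draws an independent normal return per asset.
    Production risk engines also model correlations, fat tails and
    counterparty exposures; all of that is omitted here for brevity.
    """
    rng = random.Random(seed)
    losses = []
    for _ in range(n_scenarios):
        pnl = sum(w * rng.gauss(mu, sigma)
                  for w, mu, sigma in zip(weights, daily_drift, daily_vol))
        losses.append(-pnl)  # positive value = loss
    losses.sort()
    # VaR at the chosen confidence level is a quantile of the loss tail.
    return losses[int(confidence * n_scenarios) - 1]

if __name__ == "__main__":
    # Hypothetical three-asset portfolio; weights sum to 1.
    var99 = monte_carlo_var(weights=[0.5, 0.3, 0.2],
                            daily_drift=[0.0002, 0.0001, 0.0003],
                            daily_vol=[0.010, 0.020, 0.015])
    print(f"99% one-day VaR: {var99:.4%} of portfolio value")
```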
By: IBM     Published Date: Sep 16, 2015
6 criteria for evaluating a high-performance cloud services provider. Engineering, scientific, analytics, big data and research workloads place extraordinary demands on technical and high-performance computing (HPC) infrastructure. Supporting these workloads can be especially challenging for organizations that have unpredictable spikes in resource demand, or that need access to additional compute or storage resources for a project or to support a growing business. Software Defined Infrastructure (SDI) enables organizations to deliver HPC services in the most efficient way possible, optimizing resource utilization to accelerate time to results and reduce costs. SDI is the foundation for a fully integrated environment that optimizes compute, storage and networking infrastructure to adapt quickly to changing business requirements while dynamically managing workloads and data.
Tags : 
     IBM
By: Storiant     Published Date: Mar 16, 2015
Read this new IDC report about how today's enterprise datacenters are dealing with new challenges that are far more demanding than ever before. Foremost is the exponential growth of data, most of it unstructured. Big data and analytics implementations are also quickly becoming a strategic priority in many enterprises, demanding online access to more data, retained for longer periods of time. Legacy storage solutions with fixed design characteristics and a cost structure that doesn't scale are proving ill-suited to these new needs. This Technology Spotlight examines the issues that are driving organizations to replace older archive and backup-and-restore systems with business continuity and always-available solutions that can scale to handle extreme data growth while leveraging a cloud-based pricing model. The report also looks at the role of Storiant and its long-term storage services solution in the strategically important long-term storage market.
Tags : storiant, big data, analytics implementations, cloud-based pricing model, long-term storage services solution, long-term storage market
     Storiant
By: Impetus     Published Date: Feb 04, 2016
This paper explores the top seven must-have features in a Real-Time Streaming Analytics (RTSA) platform to help you choose one that meets the needs of your organization.
Tags : 
     Impetus
By: Impetus     Published Date: Feb 04, 2016
This white paper explores strategies for leveraging the steady flow of new, advanced real-time streaming analytics (RTSA) application development technologies. It defines a thoughtful approach to capitalizing on the power of real-time decision making now, while retaining the ability to move to new and emerging technologies as they become enterprise-ready.
Tags : 
     Impetus
By: Impetus     Published Date: Mar 15, 2016
Streaming analytics platforms give businesses a way to extract strategic value from data-in-motion, much as traditional analytics tools operate on data-at-rest. Instead of historical analysis, the goal of streaming analytics is to enable near real-time decision making by letting companies inspect, correlate and analyze data even as it flows into applications and databases from numerous different sources. Streaming analytics allows companies to do event processing against massive volumes of data streaming into the enterprise at high velocity.
Tags : impetus, guide to stream analytics, real time streaming analytics, streaming analytics, real time analytics, big data analytics
     Impetus
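As a generic illustration of the data-in-motion processing the Impetus entries above describe (not tied to any particular RTSA platform), the following Python sketch maintains per-key counts over a sliding time window as events arrive; the event shape and window length are assumptions made for the example.

```python
import time
from collections import defaultdict, deque

class SlidingWindowAggregator:
    """Count events per key over the trailing `window_seconds`.

    Events are inspected as they arrive (data-in-motion), so queries
    reflect near real-time state rather than the result of a batch
    job run later against data-at-rest.
    """

    def __init__(self, window_seconds=60.0):
        self.window = window_seconds
        self.events = defaultdict(deque)  # key -> timestamps, oldest first

    def observe(self, key, timestamp=None):
        ts = time.time() if timestamp is None else timestamp
        self.events[key].append(ts)
        self._evict(key, ts)

    def count(self, key, now=None):
        now = time.time() if now is None else now
        self._evict(key, now)
        return len(self.events[key])

    def _evict(self, key, now):
        q = self.events[key]
        while q and q[0] < now - self.window:
            q.popleft()

if __name__ == "__main__":
    agg = SlidingWindowAggregator(window_seconds=5.0)
    for i in range(10):                        # one event per second, t=0..9
        agg.observe("sensor-1", timestamp=float(i))
    # At t=12 only events with timestamps >= 7 remain (7, 8, 9).
    print(agg.count("sensor-1", now=12.0))     # -> 3
```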
By: RedPoint     Published Date: Sep 22, 2014
Enterprises can gain serious traction by taking advantage of the scalability, processing power and lower costs that Hadoop 2.0/YARN offers. YARN closes the functionality gap by opening Hadoop to mature enterprise-class data management capabilities. With a lot of data quality functionality left outside of Hadoop 1.0, and a lot of data inside HDFS originating outside the enterprise, the quality of the data residing in the Hadoop cluster is sometimes as stinky as elephant dung. Some of the topics discussed in this paper include:
• The key features, benefits and limitations of Hadoop 1.0
• The benefit of performing data standardization, identity resolution, and master data management inside of Hadoop
• The transformative power of Hadoop 2.0 and its impact on the speed and cost of accessing, cleansing and delivering high-quality enterprise data
Download this illuminating white paper about what YARN really means to the world of big data management.
Tags : 
     RedPoint
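The RedPoint paper above argues for doing data quality work inside Hadoop rather than exporting data for cleansing. The toy Python sketch below shows only the standardization step, the kind of record normalization a mapper in a YARN application might apply; it is not RedPoint's product or the Hadoop API, and the field names and rules are invented.

```python
import re

def standardize_record(raw):
    """Normalize one raw customer record into a canonical form.

    In a real YARN/MapReduce job this logic would run inside the
    mapper, so cleansing happens where the data lives (HDFS) and an
    expensive export/clean/re-import round trip is avoided.
    """
    name = " ".join(raw.get("name", "").split()).title()
    email = raw.get("email", "").strip().lower()
    # Keep digits only, then format 10-digit North American numbers.
    digits = re.sub(r"\D", "", raw.get("phone", ""))
    phone = (f"({digits[0:3]}) {digits[3:6]}-{digits[6:10]}"
             if len(digits) == 10 else digits)
    return {"name": name, "email": email, "phone": phone}

if __name__ == "__main__":
    raw_records = [
        {"name": "  jane   DOE ", "email": " Jane.Doe@EXAMPLE.com ",
         "phone": "555.867.5309"},
        {"name": "JOHN smith", "email": "JSMITH@example.COM",
         "phone": "(555) 123-4567"},
    ]
    for rec in map(standardize_record, raw_records):
        print(rec)
```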
By: MEMSQL     Published Date: Apr 12, 2016
The pace of data is not slowing. Today's applications are built with infinite data sets in mind. As these real-time applications become the norm and batch processing becomes a relic of the past, digital enterprises will implement memory-optimized, distributed data systems to simplify Lambda architectures for real-time data processing and exploration.
Tags : 
     MEMSQL
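For readers unfamiliar with the Lambda architecture the MemSQL entry above mentions, this minimal Python sketch shows the pattern's essence: queries merge a periodically recomputed batch view with a small real-time delta from the speed layer. It is a generic illustration, not MemSQL's implementation; collapsing these two layers into one memory-optimized store is exactly the simplification the entry refers to.

```python
from collections import Counter

class LambdaView:
    """Minimal Lambda-architecture pattern: batch view + speed layer.

    The batch layer periodically recomputes an authoritative view
    from the full append-only history; the speed layer covers only
    events that arrived since the last batch run. Every query must
    merge both, which is the operational complexity a single
    memory-optimized, distributed store aims to eliminate.
    """

    def __init__(self):
        self.master_log = []         # full immutable event history
        self.batch_view = Counter()  # recomputed in bulk, lags behind
        self.speed_view = Counter()  # incremental, covers recent events

    def ingest(self, event_key):
        self.master_log.append(event_key)
        self.speed_view[event_key] += 1  # visible immediately

    def run_batch(self):
        # Expensive full recompute (e.g., a nightly MapReduce job),
        # after which the speed layer is reset.
        self.batch_view = Counter(self.master_log)
        self.speed_view.clear()

    def query(self, event_key):
        return self.batch_view[event_key] + self.speed_view[event_key]

if __name__ == "__main__":
    view = LambdaView()
    for e in ["click", "click", "buy"]:
        view.ingest(e)
    view.run_batch()
    view.ingest("click")             # arrives after the batch run
    print(view.query("click"))       # -> 3 (2 from batch + 1 from speed)
```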
By: IBM     Published Date: Feb 13, 2015
IBM® has created a proprietary implementation of the open-source Hadoop MapReduce run-time that leverages the IBM Platform™ Symphony distributed computing middleware while maintaining application-level compatibility with Apache Hadoop.
Tags : 
     IBM
By: Dell and Intel®     Published Date: Aug 24, 2015
Business need: Merkle needed a scalable, cost-effective way to capture and analyze large amounts of structured and unstructured consumer data for use in developing better marketing campaigns for clients.
Solution: The company deployed a Dell Hadoop™ cluster based on Dell and Intel® technologies to support a new big data insight solution that gives clients a unified view of customer data.
Benefits:
• Partnership with Dell and Intel® leads to new big data solution
• Cluster supports the Foundational Marketing Platform, a new data insight solution
• Merkle can find patterns in big data and create analytical models that anticipate consumer behavior
• Organization cuts costs by 60 percent and boosts processing speeds by 10 times
• Solution provides scalability and enables innovation
Tags : 
     Dell and Intel®
By: GridGain     Published Date: Mar 10, 2015
Software as a Service (SaaS) is a software distribution model in which applications are hosted by a vendor or service provider and made available to customers over the Internet. Instead of companies installing software on their own servers, known as the on-premises distribution model, application software providers host the software in the cloud and charge customers according to the time they spend using the software, or based on a monthly or annual fee. SaaS is becoming increasingly popular, and as the industry develops, more and more companies are dropping older business models in favor of this rapidly evolving methodology.
Tags : gridgain, saas, saas perfomance and scalability, in memory computing, data fabric, paas for saas, data grid, real-time streaming, hadoop
     GridGain
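The GridGain entry above contrasts metered, usage-based SaaS billing with flat subscription fees. A trivial Python sketch of that comparison follows; the rates and usage figures are invented for illustration.

```python
def metered_charge(hours_used, rate_per_hour):
    """Usage-based SaaS billing: pay for the time actually used."""
    return hours_used * rate_per_hour

def subscription_charge(months, monthly_fee):
    """Flat-fee SaaS billing: fixed price regardless of usage."""
    return months * monthly_fee

if __name__ == "__main__":
    # Hypothetical numbers: 40 hours of use in one month.
    metered = metered_charge(hours_used=40, rate_per_hour=2.50)
    flat = subscription_charge(months=1, monthly_fee=79.00)
    cheaper = "metered" if metered < flat else "subscription"
    print(f"metered=${metered:.2f}  flat=${flat:.2f}  ->  {cheaper} is cheaper")
```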
By: Kx Systems     Published Date: Jan 16, 2015
Kdb+ is a column-based relational database with extensive in-memory capabilities, developed and marketed by Kx Systems. Like all such products, it is especially powerful when it comes to supporting queries and analytics. However, unlike other products in this domain, kdb+ is particularly good (both in terms of performance and functionality) at processing, manipulating and analysing data (especially numeric data) in real-time, alongside the analysis of historical data. Moreover, it has extensive capabilities for supporting time-series data. For these reasons Kx Systems has historically targeted the financial industry for trading analytics and black-box trading based on real-time and historic data, as well as real-time risk assessment; applications which are particularly demanding in their performance requirements. The company has had significant success in this market, with over a hundred major financial institutions and hedge funds deploying its technology.
Tags : kx systems, kdb+, relational database
     Kx Systems
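The kdb+ paper above emphasizes time-series analytics over real-time and historical data. kdb+ itself is programmed in q; as a language-neutral stand-in, the Python sketch below computes a trailing moving average over a stream of tick prices, the kind of windowed time-series query such systems are optimized for. The tick values are invented.

```python
from collections import deque

def rolling_mean(ticks, window):
    """Yield the trailing mean of the last `window` tick prices.

    This mimics a common time-series query (e.g., a moving average
    over trades); a column store like kdb+ evaluates such queries
    over billions of rows, which this toy generator does not attempt.
    """
    buf = deque(maxlen=window)
    for price in ticks:
        buf.append(price)          # oldest tick drops out automatically
        yield sum(buf) / len(buf)

if __name__ == "__main__":
    prices = [101.0, 101.5, 101.2, 102.0, 101.8, 102.4]
    for i, avg in enumerate(rolling_mean(prices, window=3)):
        print(f"tick {i}: 3-tick moving average = {avg:.3f}")
```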
By: snowflake     Published Date: Jun 09, 2016
THE CHALLENGE: DATA SOLUTIONS CAN’T KEEP PACE WITH DATA NEEDS. Organizations are increasingly dependent on different types of data to make successful business decisions. But as the volume, rate, and types of data expand and become less predictable, conventional data warehouses cannot consume all this data effectively. Big data solutions like Hadoop increase the complexity of the environment and generally lack the performance of traditional data warehouses. This makes it difficult, expensive, and time-consuming to manage all the systems and the data.
Tags : 
     snowflake
By: Cask     Published Date: Jun 28, 2016
A recent Gartner survey on Hadoop cited the two biggest challenges in working with Hadoop: “Skills gaps continue to be a major adoption inhibitor for 57% of respondents, while deciding how to get value from Hadoop was cited by 49% of respondents.” Cask is the company that makes building and deploying big data apps easy, allowing for 5 times faster time to value. To find out more, read about Cask Hydrator, a self-service, open source framework that lets data scientists easily develop and operate data pipelines using a graphical interface.
Tags : cask hydrator, hadoop, gartner survey, self-service data lakes
     Cask
By: BitStew     Published Date: May 26, 2016
The heaviest lift for an industrial enterprise is data integration, the Achilles’ heel of the Industrial Internet of Things (IIoT). Companies are now recognizing the enormous challenge involved in supporting big data strategies that can handle the data generated by information systems, operational systems and extensive networks of old and new sensors. To compound these issues, business leaders expect data to be captured, analyzed and used in near real-time to optimize business processes, drive efficiency and improve profitability. However, integrating this vast amount of dissimilar data into a unified data strategy can be overwhelming for even the largest organizations. Download this white paper, by Bit Stew’s Mike Varney, to learn why a big data solution will not get the job done. Learn how to leverage machine intelligence with a purpose-built IIoT platform to solve the data integration problem.
Tags : 
     BitStew
By: Dell SB     Published Date: Aug 27, 2019
Many small and midsize business owners think it only happens to other people and that their company is too small to be a target for hacking, ransomware attacks and other kinds of cybercrime. Others are aware of the importance of cybersecurity but feel they lack the resources to make it a priority. These are some of the reasons why as many as 90 percent of SMBs have not put protection in place for their own data or that of their customers.
Tags : 
     Dell SB
By: Hewlett Packard Enterprise     Published Date: Jul 29, 2019
Businesses of all sizes and industries have struggled to implement a hybrid IT environment that can meet their most critical needs without disrupting business. The challenge is especially acute for midsized firms, whose limited IT resources face pressure to adopt a never-ending torrent of technology innovations. In an effort to create a hybrid IT environment, many businesses end up with multiple siloed infrastructure deployment options that consume increasing amounts of management time, effort and budget, while never quite meeting business needs.
Tags : 
     Hewlett Packard Enterprise
By: Hewlett Packard Enterprise     Published Date: Jul 29, 2019
Learn about the HPE Intelligent Data Platform and the new IT realities it addresses. With digital transformation underway in many organizations, more dynamic business models are becoming the key to success. This means infrastructure modernization and the introduction of technologies such as solid-state storage, artificial intelligence and machine learning, software-defined infrastructure, and the cloud. At the same time, it means IT infrastructure management becomes much more complex. Enter HPE’s Intelligent Data Platform. With comprehensive coverage and AI/ML-driven real-time optimization that enables intelligent management of the entire data life cycle, the platform helps an organization get the most out of its IT resources while also meeting its evolving needs over time.
Tags : 
     Hewlett Packard Enterprise
By: VMware     Published Date: Sep 12, 2019
Whether a CIO is on a four-year trajectory or has plans to stay much longer, there’s a way to thrive in this role as business priorities continuously evolve. I advocate producing early wins, setting sights on specific goals at specific points in time, and delivering on those goals whether you plan to be at a company for the long term or expect to move on soon after passing the four-year mark.
Tags : 
     VMware
By: VMware     Published Date: Sep 12, 2019
You’ve heard the stories: a large Internet company exposing all three billion of its customer accounts; a major hotel chain compromising five hundred million customer records; and one of the big-three credit reporting agencies exposing more than 143 million records, leading to a 25 percent loss in value and a $439 million hit. At the time, all of these companies had security mechanisms in place. They had trained professionals on the job. They had invested heavily in protection. But the reality is that no amount of investment in preventative technologies can fully eliminate the threat of savvy attackers, malicious insiders, or inadvertent victims of phishing. Breaches are rising, and so are their costs. In 2018, the average cost of a data breach rose 6.4 percent to $3.86 million, and the cost of a “mega breach,” defined as losing 1 million to 50 million records, carried especially punishing price tags between $40 million and $350 million.
Tags : 
     VMware
By: VMware     Published Date: Sep 12, 2019
It’s time to acknowledge that the tech industry has failed our customers when it comes to cybersecurity and data protection.
Tags : 
     VMware
By: Dell EMEA     Published Date: Sep 09, 2019
When it comes to longevity, no one can match Dell. Besides providing the capacity, manageability and security features that IT departments choose, our computers are also engineered for longer life cycles, which means less waste. No wonder they have been succeeding in the market for so long. But enough about the past; let’s talk about innovative new features. The Latitude 7400 2-in-1 uses Dell’s new ExpressSign-in technology, which detects the user’s presence, wakes the system in about a second, and allows sign-in by facial recognition with Windows Hello. Users can simply sit down at their desk and start working, with no need for keyboard shortcuts to switch users or even to touch the power button. In fact, it is the first PC in the world to use a proximity sensor with Intel® Context Sensing Technology.
Tags : 
     Dell EMEA