By: NetApp     Published Date: Dec 19, 2013
SAP HANA enables real-time access to mission-critical business data, revolutionizing the way existing information can be used to address ever-changing business requirements. This white paper describes both the business and technical benefits of implementing the Cisco UCS with NetApp Storage for SAP HANA solution.
Tags : 
     NetApp
By: IBM     Published Date: Sep 02, 2014
Life Sciences organizations need to build IT infrastructures that are dynamic, scalable, and easy to deploy and manage, with simplified provisioning, high performance, and high utilization, and that can exploit both data-intensive and server-intensive workloads, including Hadoop MapReduce. Solutions must scale, in terms of both processing and storage, in order to serve the institution long-term. Managing the life cycle of data and making it usable for mainstream analyses and applications is an important aspect of system design. This presentation describes these IT requirements and how Technical Computing solutions from IBM and Platform Computing address the challenges and deliver greater ROI and accelerated time to results for Life Sciences.
Tags : 
     IBM
By: IBM     Published Date: Sep 02, 2014
With tougher regulations and continuing market volatility, financial firms are moving to active risk management with a focus on counterparty risk. Firms are revamping their risk and trading practices from top to bottom. They are adopting new risk models and frameworks that support a holistic view of risk. Banks recognize that technology is critical for this transformation, and are adding state-of-the-art enterprise risk management solutions, high performance data and grid management software, and fast hardware. Join IBM Algorithmics and IBM Platform Computing to gain insights on this trend and on technologies for enabling active "real-time" risk management.
Tags : 
     IBM
By: IBM     Published Date: Sep 02, 2014
In today’s stringent financial services regulatory environment, with exponential growth of data and dynamic business requirements, risk analytics has become integral to businesses. IBM Algorithmics provides sophisticated analyses for a wide range of economic scenarios that better quantify risk for multiple departments within a firm, or across the enterprise. With Algorithmics, firms have a better handle on their financial exposure and credit risks before they finalize real-time transactions. But this requires the performance and agility of a scalable infrastructure, which drives up IT risk and complexity. The IBM Application Ready Solution for Algorithmics provides an agile, reliable and high-performance infrastructure to deliver trusted risk insights for sustained growth and profitability. This integrated offering with a validated reference architecture delivers the right risk insights at the right time while lowering the total cost of ownership.
Tags : ibm, it risk, financial risk analytics
     IBM
By: IBM     Published Date: Sep 16, 2015
Six criteria for evaluating a high-performance cloud services provider: Engineering, scientific, analytics, big data and research workloads place extraordinary demands on technical and high-performance computing (HPC) infrastructure. Supporting these workloads can be especially challenging for organizations that have unpredictable spikes in resource demand, or need access to additional compute or storage resources for a project or to support a growing business. Software Defined Infrastructure (SDI) enables organizations to deliver HPC services in the most efficient way possible, optimizing resource utilization to accelerate time to results and reduce costs. SDI is the foundation for a fully integrated environment, optimizing compute, storage and networking infrastructure to quickly adapt to changing business requirements, and dynamically managing workloads and data.
Tags : 
     IBM
By: Storiant     Published Date: Mar 16, 2015
Read this new IDC report about how today's enterprise datacenters are dealing with new challenges that are far more demanding than ever before. Foremost is the exponential growth of data, most of it unstructured. Big data and analytics implementations are also quickly becoming a strategic priority in many enterprises, demanding online access to more data, which is retained for longer periods of time. Legacy storage solutions with fixed design characteristics and a cost structure that doesn't scale are proving to be ill-suited for these new needs. This Technology Spotlight examines the issues that are driving organizations to replace older archive and backup-and-restore systems with business continuity and always-available solutions that can scale to handle extreme data growth while leveraging a cloud-based pricing model. The report also looks at the role of Storiant and its long-term storage services solution in the strategically important long-term storage market.
Tags : storiant, big data, analytics implementations, cloudbased pricing model, long-term storage services solution, long-term storage market
     Storiant
By: Impetus     Published Date: Feb 04, 2016
This paper explores the top seven must-have features in a Real-Time Streaming Analytics (RTSA) platform in order to help you choose a platform that meets the needs of your organization.
Tags : 
     Impetus
By: Impetus     Published Date: Feb 04, 2016
This white paper explores strategies to leverage the steady flow of new, advanced real-time streaming data analytics (RTSA) application development technologies. It defines a thoughtful approach to capitalize on the window of opportunity to benefit from the power of real-time decision making now, and still be able to move to new and emerging technologies as they become enterprise ready.
Tags : 
     Impetus
By: Impetus     Published Date: Mar 15, 2016
Streaming analytics platforms provide businesses a method for extracting strategic value from data-in-motion in a manner similar to how traditional analytics tools operate on data-at-rest. Instead of historical analysis, the goal with streaming analytics is to enable near real-time decision making by letting companies inspect, correlate and analyze data even as it flows into applications and databases from numerous different sources. Streaming analytics allows companies to do event processing against massive volumes of data streaming into the enterprise at high velocity.
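To make the mechanics concrete, here is a minimal Python sketch of tumbling-window event counting over a stream. It illustrates the general technique only, not the Impetus platform, and the click events are invented.

```python
from collections import defaultdict

def tumbling_windows(events, window_seconds=5):
    """Group a time-ordered (timestamp, key) stream into fixed, non-overlapping windows."""
    window_start, counts = None, defaultdict(int)
    for ts, key in events:
        if window_start is None:
            window_start = ts
        while ts >= window_start + window_seconds:
            yield window_start, dict(counts)      # close and emit the current window
            window_start += window_seconds
            counts = defaultdict(int)
        counts[key] += 1
    if counts:
        yield window_start, dict(counts)          # emit the final partial window

# Hypothetical click events as (unix_timestamp, page) pairs.
events = [(0, "home"), (1, "home"), (2, "cart"), (6, "cart"), (7, "home")]
for start, agg in tumbling_windows(events):
    print(f"window [{start}, {start + 5}): {agg}")
```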
Tags : impetus, guide to stream analytics, real time streaming analytics, streaming analytics, real time analytics, big data analytics
     Impetus
By: RedPoint     Published Date: Sep 22, 2014
Enterprises can gain serious traction by taking advantage of the scalability, processing power and lower costs that Hadoop 2.0/YARN offers. YARN closes the functionality gap by opening Hadoop to mature enterprise-class data management capabilities. With a lot of data quality functionality left outside of Hadoop 1, and a lot of data inside HDFS originating outside the enterprise, the quality of the data residing in the Hadoop cluster is sometimes as stinky as elephant dung. Some of the topics discussed in this paper include:
• The key features, benefits and limitations of Hadoop 1.0
• The benefit of performing data standardization, identity resolution, and master data management inside of Hadoop
• The transformative power of Hadoop 2.0 and its impact on the speed and cost of accessing, cleansing and delivering high-quality enterprise data
Download this illuminating white paper about what YARN really means to the world of big data management.
Tags : 
     RedPoint
By: MEMSQL     Published Date: Apr 12, 2016
The pace of data is not slowing. Applications of today are built with infinite data sets in mind. As these real-time applications become the norm, and batch processing becomes a relic of the past, digital enterprises will implement memory-optimized, distributed data systems to simplify Lambda Architectures for real-time data processing and exploration.
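For readers unfamiliar with the term: a Lambda Architecture answers queries by merging a periodically recomputed batch view with an incrementally updated real-time view. The Python sketch below, using made-up page-view counts, shows the serving-layer merge step that memory-optimized systems promise to make unnecessary.

```python
# Hypothetical page-view counts. The batch layer covers data up to the last
# batch run; the speed layer covers everything that has arrived since.
batch_view = {"home": 10_000, "cart": 2_500}   # recomputed periodically
realtime_view = {"home": 42, "checkout": 7}    # updated incrementally

def query(page):
    """Serving layer: merge both views to answer over the full data set."""
    return batch_view.get(page, 0) + realtime_view.get(page, 0)

print(query("home"))      # 10042
print(query("checkout"))  # 7
```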
Tags : 
     MEMSQL
By: IBM     Published Date: Feb 13, 2015
IBM® has created a proprietary implementation of the open-source Hadoop MapReduce run-time that leverages the IBM Platform™ Symphony distributed computing middleware while maintaining application-level compatibility with Apache Hadoop.
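IBM’s run-time internals are proprietary, but the MapReduce programming model it stays compatible with is easy to sketch. The in-process Python word count below is illustrative only; a real cluster distributes the map, shuffle and reduce phases across many nodes.

```python
from collections import defaultdict

def map_phase(records):
    for line in records:              # map: emit (word, 1) for every word
        for word in line.split():
            yield word, 1

def shuffle(pairs):                   # shuffle: group values by key
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):             # reduce: aggregate each key's values
    return {word: sum(ones) for word, ones in groups.items()}

lines = ["big data big compute", "big data"]
print(reduce_phase(shuffle(map_phase(lines))))
# {'big': 3, 'data': 2, 'compute': 1}
```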
Tags : 
     IBM
By: Dell and Intel®     Published Date: Aug 24, 2015
Business need: Merkle needed a scalable, cost-effective way to capture and analyze large amounts of structured and unstructured consumer data for use in developing better marketing campaigns for clients.
Solution: The company deployed a Hadoop™ cluster based on Dell and Intel® technologies to support a new big data insight solution that gives clients a unified view of customer data.
Benefits:
• Partnership with Dell and Intel® leads to a new big data solution
• Cluster supports the Foundational Marketing Platform, a new data insight solution
• Merkle can find patterns in big data and create analytical models that anticipate consumer behavior
• Organization cuts costs by 60 percent and boosts processing speeds by 10 times
• Solution provides scalability and enables innovation
Tags : 
     Dell and Intel®
By: GridGain     Published Date: Mar 10, 2015
Software as a Service (SaaS) is a software distribution model in which applications are hosted by a vendor or service provider and made available to customers over the Internet. Instead of companies installing software on their own servers, known as the on-premises distribution model, application software providers host the software in the cloud and charge customers according to the time they spend using the software, or based on a monthly or annual fee. SaaS is becoming increasingly popular, and as the industry develops, more and more companies are dropping older business models in favor of this rapidly evolving methodology.
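The billing difference is simple arithmetic. A small Python sketch with made-up rates contrasts metered, usage-based charging with a flat subscription fee:

```python
def usage_cost(hours_used, rate_per_hour):
    """Metered SaaS billing: pay only for time actually used."""
    return hours_used * rate_per_hour

def subscription_cost(months, monthly_fee):
    """Flat-fee SaaS billing: a fixed charge per month."""
    return months * monthly_fee

# Illustrative rates: $0.50 per hour metered vs. $99 per month flat.
print(usage_cost(40, 0.50))          # light user: $20.00
print(usage_cost(400, 0.50))         # heavy user: $200.00
print(subscription_cost(1, 99.0))    # either user: $99.00
```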
Tags : gridgain, saas, saas perfomance and scalability, in memory computing, data fabric, paas for saas, data grid, real-time streaming
     GridGain
By: Kx Systems     Published Date: Jan 16, 2015
Kdb+ is a column-based relational database with extensive in-memory capabilities, developed and marketed by Kx Systems. Like all such products, it is especially powerful when it comes to supporting queries and analytics. However, unlike other products in this domain, kdb+ is particularly good (both in terms of performance and functionality) at processing, manipulating and analysing data (especially numeric data) in real time, alongside the analysis of historical data. Moreover, it has extensive capabilities for supporting time-series data. For these reasons Kx Systems has historically targeted the financial industry for trading analytics and black-box trading based on real-time and historic data, as well as real-time risk assessment; applications which are particularly demanding in their performance requirements. The company has had significant success in this market, with over a hundred major financial institutions and hedge funds deploying its technology.
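kdb+ itself is queried in the q language, but the flavor of the time-series analytics described here can be approximated in pandas. The sketch below builds one-minute trade bars (last price, total volume) from invented tick data; it is an analogy, not kdb+ code.

```python
import pandas as pd

# Hypothetical tick data: trade prices and sizes, indexed by timestamp.
ticks = pd.DataFrame({
    "time": pd.to_datetime(["09:30:01", "09:30:05", "09:31:02", "09:31:40"]),
    "price": [100.0, 100.5, 101.0, 100.8],
    "size": [200, 100, 300, 150],
}).set_index("time")

# One-minute bars: last traded price and total volume per interval,
# a staple query in trading analytics.
bars = ticks.resample("1min").agg({"price": "last", "size": "sum"})
print(bars)
```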
Tags : kx systems, kdb+, relational database
     Kx Systems
By: snowflake     Published Date: Jun 09, 2016
THE CHALLENGE: DATA SOLUTIONS CAN’T KEEP PACE WITH DATA NEEDS
Organizations are increasingly dependent on different types of data to make successful business decisions. But as the volume, rate, and types of data expand and become less predictable, conventional data warehouses cannot consume all this data effectively. Big data solutions like Hadoop increase the complexity of the environment and generally lack the performance of traditional data warehouses. This makes it difficult, expensive, and time-consuming to manage all the systems and the data.
Tags : 
     snowflake
By: Cask     Published Date: Jun 28, 2016
A recent Gartner survey on Hadoop cited the two biggest challenges in working with Hadoop: “Skills gaps continue to be a major adoption inhibitor for 57% of respondents, while deciding how to get value from Hadoop was cited by 49% of respondents.” Cask is the company that makes building and deploying big data apps easy, allowing for 5 times faster time to value. To find out more, read about Cask Hydrator, a self-service, open source framework that lets data scientists easily develop and operate data pipelines using a graphical interface.
Tags : cask hydrator, hadoop, gartner survey, self-service data lakes
     Cask
By: BitStew     Published Date: May 26, 2016
The heaviest lift for an industrial enterprise is data integration, the Achilles’ heel of the Industrial Internet of Things (IIoT). Companies are now recognizing the enormous challenge involved in supporting big data strategies that can handle the data generated by information systems, operational systems, and extensive networks of old and new sensors. To compound these issues, business leaders expect data to be captured, analyzed and used in near real time to optimize business processes, drive efficiency and improve profitability. However, integrating this vast amount of dissimilar data into a unified data strategy can be overwhelming for even the largest organizations. Download this white paper, by Bit Stew’s Mike Varney, to learn why a big data solution will not get the job done. Learn how to leverage machine intelligence with a purpose-built IIoT platform to solve the data integration problem.
Tags : 
     BitStew
By: Hewlett Packard Enterprise     Published Date: Jul 29, 2019
Learn about the HPE Intelligent Data Platform and the new IT realities it addresses. With digital transformation underway in many organizations, more dynamic business models are becoming the key to success. This means infrastructure modernization and the introduction of technologies such as solid state storage, artificial intelligence and machine learning, software-defined infrastructure, and the cloud. At the same time, it means IT infrastructure management becomes much more complex. Enter HPE’s Intelligent Data Platform. With comprehensive coverage and AI/ML-driven real-time optimization that enables intelligent management of the entire data life cycle, the HPE Intelligent Data Platform enables an organization to get the most out of its IT resources while also meeting its evolving needs over time.
Tags : 
     Hewlett Packard Enterprise
By: Hewlett Packard Enterprise     Published Date: Jul 29, 2019
Learn how small and mid-sized businesses (SMBs) that are leveraging hybrid cloud are seeing several critical benefits. Based on new Aberdeen research, these gains are even more vital for today’s organizations. With hybrid cloud, your SMB can reduce downtime, cost and risk while increasing flexibility and scalability.
Tags : 
     Hewlett Packard Enterprise
By: Hewlett Packard Enterprise     Published Date: Jul 25, 2019
Discover the transformational power of intelligent storage. A company’s competitive success depends greatly on how well it harnesses the value of its data. However, it can be challenging to keep pace with the increases in the scale of data being generated, as well as users’ demands for accessibility. Additionally, maximizing the value of digital information requires data to be leveraged at both the optimal time and the optimal place. With HPE’s intelligent storage portfolio, organizations can modernize their IT to better understand their customers, run more agile IT functions, and improve logistical operations.
Tags : 
     Hewlett Packard Enterprise
By: 3D Systems     Published Date: May 15, 2019
Selective laser sintering (SLS) technology is at the heart of a growing trend toward mass production of customized products, as well as the manufacture of functional prototypes. The right additive technologies, materials, and finishes are transforming manufacturing. Direct digital thermoplastic manufacturing delivers exceptional quality and opens the door to novel design parameters that would not be possible with injection molding. Thermoplastic additive manufacturing also avoids the long lead times and upfront investment of injection mold tooling. When you evaluate your finished parts along three dimensions (quality, time to market, and cost per cubic centimeter), industrial SLS offers better overall value in many situations.
Tags : 
     3D Systems
By: Intel     Published Date: Jul 17, 2019
Managing a large, diverse, and geographically dispersed fleet of client systems can be complex and time-consuming. With the increasing prevalence of smart, connected devices that are beginning to appear within the enterprise across industries, technology service organizations will face an explosive demand for a consistent approach to device management and security. Using Intel AMT, service organizations can take simple and effective steps to enable more manageable client systems. They can streamline operations and create a consistent approach to managing a broad spectrum of devices. Powerful platform capabilities can help service organizations meet user needs, minimize downtime, and safeguard the enterprise. Service organizations can draw upon available solution reference architectures, implementation guides, and readily available tools from Intel and others to successfully activate Intel AMT and begin to realize its major benefits.
Tags : 
     Intel
By: Intel     Published Date: Jul 17, 2019
The need for identity protection has never been stronger. Identity theft accounted for 74 percent of all data breaches in the first half of 2017, and costs associated with cybercrime are expected to reach $6 trillion annually by 2021. Any time an employee's username and password are compromised, your business is vulnerable. Eight-character passwords that changed every 90 days worked well a decade ago, but increasingly commonplace attack methods like password cracking, phishing, or screen scraping call for a new kind of protection.
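A back-of-the-envelope calculation shows why eight-character passwords no longer hold up; the guess rate below is an illustrative assumption for a GPU cracking rig, not a figure from the paper.

```python
import string

charset = len(string.ascii_letters + string.digits)   # 62 possible characters
guesses_per_second = 1e11                             # assumed cracking rate

for length in (8, 12, 16):
    keyspace = charset ** length                      # total candidate passwords
    years = keyspace / guesses_per_second / (3600 * 24 * 365)
    print(f"{length} chars: ~{years:.2g} years to exhaust the keyspace")
```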
Tags : 
     Intel