time data

Results 1 - 25 of 1077
By: NetApp     Published Date: Dec 19, 2013
SAP HANA enables real-time access to mission-critical business data and thus revolutionizes the way existing information can be used to address ever-changing business requirements. This whitepaper describes both the business and technical benefits of implementing the Cisco UCS with NetApp Storage for SAP HANA solution.
Tags : 
     NetApp
By: IBM     Published Date: Sep 16, 2015
6 criteria for evaluating a high-performance cloud services provider
Engineering, scientific, analytics, big data and research workloads place extraordinary demands on technical and high-performance computing (HPC) infrastructure. Supporting these workloads can be especially challenging for organizations that have unpredictable spikes in resource demand, or that need access to additional compute or storage resources for a project or to support a growing business. Software Defined Infrastructure (SDI) enables organizations to deliver HPC services in the most efficient way possible, optimizing resource utilization to accelerate time to results and reduce costs. SDI is the foundation for a fully integrated environment, optimizing compute, storage and networking infrastructure to quickly adapt to changing business requirements while dynamically managing workloads and data.
Tags : 
     IBM
By: Storiant     Published Date: Mar 16, 2015
Read this new IDC report about how today's enterprise datacenters are dealing with new challenges that are far more demanding than ever before. Foremost is the exponential growth of data, most of it unstructured data. Big data and analytics implementations are also quickly becoming a strategic priority in many enterprises, demanding online access to more data, which is retained for longer periods of time. Legacy storage solutions with fixed design characteristics and a cost structure that doesn't scale are proving to be ill-suited for these new needs. This Technology Spotlight examines the issues that are driving organizations to replace older archive and backup-and-restore systems with business continuity and always-available solutions that can scale to handle extreme data growth while leveraging a cloud-based pricing model. The report also looks at the role of Storiant and its long-term storage services solution in the strategically important long-term storage market.
Tags : storiant, big data, analytics implementations, cloud-based pricing model, long-term storage services solution, long-term storage market
     Storiant
By: Impetus     Published Date: Feb 04, 2016
This white paper explores strategies for leveraging the steady flow of new, advanced real-time streaming data analytics (RTSA) application development technologies. It defines a thoughtful approach to capitalize on the window of opportunity to benefit from the power of real-time decision making now, while still being able to move to new and emerging technologies as they become enterprise-ready.
Tags : 
     Impetus
By: Impetus     Published Date: Mar 15, 2016
Streaming analytics platforms provide businesses with a method for extracting strategic value from data-in-motion in a manner similar to how traditional analytics tools operate on data-at-rest. Instead of historical analysis, the goal of streaming analytics is to enable near real-time decision making by letting companies inspect, correlate and analyze data even as it flows into applications and databases from numerous different sources. Streaming analytics allows companies to do event processing against massive volumes of data streaming into the enterprise at high velocity. (A minimal illustrative sketch of windowed event processing follows this entry.)
Tags : impetus, guide to stream analytics, real time streaming analytics, streaming analytics, real time analytics, big data analytics
     Impetus
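To make the streaming-analytics description in the Impetus entry above concrete, here is a minimal sketch of windowed event processing in plain Python: events are inspected and aggregated as they arrive rather than after they come to rest. The Event shape, the tumbling-window size, and the alert threshold are illustrative assumptions and do not represent any vendor's API.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Event:
    source: str       # e.g. "pos" or "web"; illustrative field names
    timestamp: float  # seconds since epoch
    value: float

class TumblingWindowAggregator:
    """Aggregates events into fixed-size time windows as they flow in."""

    def __init__(self, window_seconds: float = 5.0, alert_threshold: float = 50.0):
        self.window_seconds = window_seconds
        self.alert_threshold = alert_threshold
        self.current_window = None        # index of the window currently open
        self.totals = defaultdict(float)  # per-source running sums for that window

    def process(self, event: Event) -> None:
        window = int(event.timestamp // self.window_seconds)
        if self.current_window is None:
            self.current_window = window
        if window != self.current_window:  # the previous window closed: emit and reset
            self._emit()
            self.current_window = window
        self.totals[event.source] += event.value
        # Near real-time decision: react while the data is still in motion.
        if self.totals[event.source] > self.alert_threshold:
            print(f"ALERT: {event.source} exceeded {self.alert_threshold} in window {window}")

    def _emit(self) -> None:
        print(f"window {self.current_window}: {dict(self.totals)}")
        self.totals.clear()

if __name__ == "__main__":
    agg = TumblingWindowAggregator()
    for second in range(20):  # a toy stream of point-of-sale events
        agg.process(Event(source="pos", timestamp=float(second), value=12.5))
```

A production streaming platform adds distribution, fault tolerance and out-of-order event handling on top of this basic loop, but the inspect-as-it-arrives pattern is the same.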
By: RedPoint     Published Date: Sep 22, 2014
Enterprises can gain serious traction by taking advantage of the scalability, processing power and lower costs that Hadoop 2.0/YARN offers. YARN closes the functionality gap by opening Hadoop to mature enterprise-class data management capabilities. With a lot of data quality functionality left outside of Hadoop 1.0, and a lot of data inside HDFS originating outside the enterprise, the quality of the data residing in the Hadoop cluster is sometimes as stinky as elephant dung. Some of the topics discussed in this paper include:
• The key features, benefits and limitations of Hadoop 1.0
• The benefits of performing data standardization, identity resolution and master data management inside of Hadoop
• The transformative power of Hadoop 2.0 and its impact on the speed and cost of accessing, cleansing and delivering high-quality enterprise data
Download this illuminating white paper about what YARN really means to the world of big data management. (A minimal illustrative sketch of identity resolution follows this entry.)
Tags : 
     RedPoint
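As a rough illustration of the data standardization and identity resolution topics listed in the RedPoint entry above, the sketch below normalizes records and groups those that resolve to the same key. The normalization rules and the choice of email as the match key are illustrative assumptions; real master data management pipelines use far richer matching logic.

```python
import re

def normalize(record):
    """Crude standardization: lowercase and strip everything but letters, digits and '@'."""
    return {key: re.sub(r"[^a-z0-9@]", "", str(value).lower()) for key, value in record.items()}

def resolve_identities(records, keys=("email",)):
    """Group records whose normalized key fields match (toy identity resolution)."""
    clusters = {}
    for record in records:
        norm = normalize(record)
        cluster_key = tuple(norm.get(k, "") for k in keys)
        clusters.setdefault(cluster_key, []).append(record)
    return clusters

people = [
    {"name": "Jane Doe",  "email": "JANE.DOE@example.com"},
    {"name": "J. Doe",    "email": " jane.doe@example.com"},
    {"name": "Bob Smith", "email": "bob@example.com"},
]

for key, matches in resolve_identities(people).items():
    print(key, "->", [m["name"] for m in matches])
# ('janedoe@examplecom',) -> ['Jane Doe', 'J. Doe']
# ('bob@examplecom',) -> ['Bob Smith']
```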
By: MEMSQL     Published Date: Apr 12, 2016
The pace of data is not slowing. Today's applications are built with infinite data sets in mind. As these real-time applications become the norm and batch processing becomes a relic of the past, digital enterprises will implement memory-optimized, distributed data systems to simplify Lambda Architectures for real-time data processing and exploration. (A minimal illustrative sketch of the Lambda Architecture's batch/speed split follows this entry.)
Tags : 
     MEMSQL
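The Lambda Architecture referenced in the MemSQL entry above keeps a periodically recomputed batch view of historical data next to a speed layer of recent events and merges the two at query time. Below is a minimal, hypothetical sketch of that merge in plain Python; the class and method names are invented for illustration and are not any product's API.

```python
from collections import defaultdict

class LambdaServingLayer:
    """Merges a precomputed batch view with a real-time speed layer at query time."""

    def __init__(self):
        self.batch_view = {}                # recomputed periodically from the full history
        self.speed_view = defaultdict(int)  # incremental counts since the last batch run

    def rebuild_batch(self, all_events):
        """Batch layer: slow, complete recomputation over the entire data set."""
        view = defaultdict(int)
        for key in all_events:
            view[key] += 1
        self.batch_view = dict(view)
        self.speed_view.clear()  # simplification: assume the batch now covers everything

    def ingest(self, key):
        """Speed layer: fast incremental update as each event arrives."""
        self.speed_view[key] += 1

    def query(self, key):
        """Serving layer: answer = batch result + real-time delta."""
        return self.batch_view.get(key, 0) + self.speed_view.get(key, 0)

layer = LambdaServingLayer()
layer.rebuild_batch(["page_a", "page_a", "page_b"])  # e.g. a nightly batch job
layer.ingest("page_a")                               # events streaming in afterwards
print(layer.query("page_a"))                         # 3 = 2 from batch + 1 from the speed layer
```

A memory-optimized distributed store that can serve both roles removes the need to keep these two code paths (and their data) consistent by hand, which is the simplification the entry refers to.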
By: Kx Systems     Published Date: Jan 16, 2015
Kdb+ is a column-based relational database with extensive in-memory capabilities, developed and marketed by Kx Systems. Like all such products, it is especially powerful when it comes to supporting queries and analytics. However, unlike other products in this domain, kdb+ is particularly good (both in terms of performance and functionality) at processing, manipulating and analysing data (especially numeric data) in real time, alongside the analysis of historical data. Moreover, it has extensive capabilities for supporting time-series data. For these reasons Kx Systems has historically targeted the financial industry for trading analytics and black-box trading based on real-time and historic data, as well as real-time risk assessment; applications which are particularly demanding in their performance requirements. The company has had significant success in this market, with over a hundred major financial institutions and hedge funds deploying its technology. (A minimal illustrative sketch of columnar time-series aggregation follows this entry.)
Tags : kx systems, kdb+, relational database
     Kx Systems
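The column orientation described in the Kx Systems entry above is what makes time-series aggregation fast: a query touches only the columns it needs instead of whole rows. The sketch below illustrates that idea in plain Python with columnar lists; it is a conceptual illustration only, the trade data is made up, and it does not use kdb+ or the q language.

```python
# Columnar layout: each field is its own array, so an aggregation reads only
# the columns it needs; this is the property that column stores exploit.
trades = {
    "time":  [9.0, 9.1, 9.2, 10.0, 10.1],           # hour of day, illustrative
    "sym":   ["AAPL", "AAPL", "IBM", "AAPL", "IBM"],
    "price": [150.0, 151.0, 120.0, 152.0, 121.0],
}

def avg_price_by_sym(columns, start, end):
    """Average trade price per symbol within a time window, scanning three columns only."""
    sums, counts = {}, {}
    for t, sym, price in zip(columns["time"], columns["sym"], columns["price"]):
        if start <= t < end:
            sums[sym] = sums.get(sym, 0.0) + price
            counts[sym] = counts.get(sym, 0) + 1
    return {sym: sums[sym] / counts[sym] for sym in sums}

print(avg_price_by_sym(trades, 9.0, 10.0))  # {'AAPL': 150.5, 'IBM': 120.0}
```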
By: snowflake     Published Date: Jun 09, 2016
THE CHALLENGE: DATA SOLUTIONS CAN’T KEEP PACE WITH DATA NEEDS
Organizations are increasingly dependent on different types of data to make successful business decisions. But as the volume, rate, and types of data expand and become less predictable, conventional data warehouses cannot consume all this data effectively. Big data solutions like Hadoop increase the complexity of the environment and generally lack the performance of traditional data warehouses. This makes it difficult, expensive, and time-consuming to manage all the systems and the data.
Tags : 
     snowflake
By: Cask     Published Date: Jun 28, 2016
A recent Gartner survey on Hadoop cited the two biggest challenges in working with Hadoop: “Skills gaps continue to be a major adoption inhibitor for 57% of respondents, while deciding how to get value from Hadoop was cited by 49% of respondents.” Cask is the company that makes building and deploying big data apps easy, allowing for 5 times faster time to value. To find out more, read about Cask Hydrator, a self-service, open source framework that lets data scientists easily develop and operate data pipelines using a graphical interface.
Tags : cask hydrator, hadoop, gartner survey, self-service data lakes
     Cask
By: BitStew     Published Date: May 26, 2016
The heaviest lift for an industrial enterprise is data integration, the Achilles’ heel of the Industrial Internet of Things (IIoT). Companies are now recognizing the enormous challenge involved in supporting Big Data strategies that can handle the data that is generated by information systems, operational systems and the extensive networks of old and new sensors. To compound these issues, business leaders are expecting data to be captured, analyzed and used in near real time to optimize business processes, drive efficiency and improve profitability. However, integrating this vast amount of dissimilar data into a unified data strategy can be overwhelming for even the largest organizations. Download this white paper, by Bit Stew’s Mike Varney, to learn why a big data solution will not get the job done. Learn how to leverage machine intelligence with a purpose-built IIoT platform to solve the data integration problem.
Tags : 
     BitStew
By: Lenovo UK     Published Date: May 13, 2019
It’s time to think bigger than incremental processing power and storage capacity upgrades. In 2019 IT modernisation offers much more, providing opportunities to closely align infrastructure with your most ambitious goals and build a future-facing foundation for success. Whatever your goals, this playbook gives you six versatile strategies that can help you plan a successful IT modernisation project, supported by examples from our industry-leading data centre portfolio.
Tags : 
     Lenovo UK
By: Hewlett Packard Enterprise     Published Date: Apr 26, 2019
Discover how HPE is responding to the massive growth in enterprise data with intelligent storage. Data helps enterprises find new ways to reach and serve customers to grow profitability, but only when it is available at the right place and the right time. The growing complexity of managing and securing data prevents businesses from gaining its full value. Hewlett Packard Enterprise delivers the world’s most intelligent storage for the hybrid cloud world by providing storage that is driven by artificial intelligence, built for the cloud, and delivered as a service.
Tags : 
     Hewlett Packard Enterprise
By: Cisco EMEA     Published Date: Mar 26, 2019
Imagine if you could see deep into the future. And way back into the past, both at the same time. Imagine having visibility of everything that had ever happened and everything that was ever going to happen, everywhere, all at once. And then imagine processing power strong enough to make sense of all this data in every language and in every dimension. Unless you’ve achieved that digital data nirvana (and you haven’t told the rest of us), you’re going to have some unknowns in your world. In the world of security, unknown threats exist outside the enterprise in the form of malicious actors, state-sponsored attacks and malware that moves fast and destroys everything it touches. The unknown exists inside the enterprise in the form of insider threat from rogue employees or careless contractors – which was deemed by 24% of our survey respondents to pose the most serious risk to their organizations. The unknown exists in the form of new devices, new cloud applications, and new data.
Tags : 
     Cisco EMEA
By: Cisco EMEA     Published Date: Mar 26, 2019
Most organizations have invested, and continue to invest, in people, processes, technology, and policies to meet customer privacy requirements and avoid significant fines and other penalties. In addition, data breaches continue to expose the personal information of millions of people, and organizations are concerned about the products they buy, services they use, people they employ, and with whom they partner and do business generally. As a result, customers are asking more questions during the buying cycle about how their data is captured, used, transferred, shared, stored, and destroyed. In last year’s study (Cisco 2018 Privacy Maturity Benchmark Study), Cisco introduced data and insights regarding how these privacy concerns were negatively impacting the buying cycle and timelines. This year’s research updates those findings and explores the benefits associated with privacy investment. Cisco’s Data Privacy Benchmark Study utilizes data from Cisco’s Annual Cybersecurity Benchmark Study.
Tags : 
     Cisco EMEA
By: Intapp     Published Date: May 10, 2019
The Cornerstone of Financial Control
Time equals money. Time plus data equals control. All professionals, whether in management, consulting, engineering, or accounting, must be confident that their value is reflected in their bottom line. One of the primary factors driving that compensation is the amount of time spent on a particular subject or client. But too often, front-line earners at those firms don’t provide the clean, data-rich timesheets needed to accurately gauge the effort required by each project.
Tags : business, business intelligence, time, tax, time for tax, intapp, applications, time data, automation, reporting, timekeeping, audit, accounting, consulting, professional services, active time capture, passive time capture, time tracking
     Intapp
By: Fidelis Cybersecurity     Published Date: May 15, 2019
When it comes to cybersecurity, you can only defend what you can see. Organizations continue to suffer breaches, oftentimes because they do not have continuous, real-time visibility of all their critical assets. With more data and applications moving to the cloud, IoT and other emerging technologies, the attack surface continues to expand, giving adversaries more blind spots to leverage. Watch a webinar with SANS where we examine how to:
• Discover, classify and profile assets and network communications
• Detect threats and decode content in real-time at wire speed
• Hunt for unknown threats via rich, indexable metadata
• Alter your terrain and attack surface with deception to slow down attackers
By knowing your cyber terrain and increasing the risk of detection and cost to the adversary, you can gain a decisive advantage.
Tags : 
     Fidelis Cybersecurity
By: Bluecore     Published Date: May 07, 2019
For retailers, driving second purchases from first-time buyers presents enormous revenue potential in both the short and long term. But getting those second purchases often proves difficult. So what does it take? Data from over 400 retailers reveals why the second purchase is the most important purchase, why it eludes so many retailers and how to use customer data to turn one-time buyers into two-time buyers.
Tags : 
     Bluecore
By: IBM APAC     Published Date: May 14, 2019
IBM PowerAI Enterprise helps make deep learning easier and faster for organizations by bringing together some of the most popular open source frameworks for deep learning with development and management tools in a single installable package. Designed to simplify end-to-end deep learning, PowerAI Enterprise allows enterprises to spend less time on data preparation, implementation and integration, and more time training neural networks for results. IBM PowerAI Enterprise version 1.1.2 includes the most popular deep learning frameworks in one installation:
- BVLC Caffe
- IBM Caffe
- TensorFlow
- PyTorch
- Keras (tensorflow-keras)
(A minimal illustrative tf.keras training sketch follows this entry.)
Tags : 
     IBM APAC
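Since the IBM entry above bundles TensorFlow and Keras, here is a minimal, generic tf.keras training sketch of the kind of workload such a package targets: fitting a small network to synthetic data. It assumes a standard TensorFlow installation and is not specific to PowerAI Enterprise.

```python
import numpy as np
import tensorflow as tf

# Tiny synthetic regression task: learn y = 2x + 1 from noisy samples.
x = np.random.rand(256, 1).astype("float32")
y = (2.0 * x + 1.0 + 0.05 * np.random.randn(256, 1)).astype("float32")

model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu", input_shape=(1,)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(x, y, epochs=200, batch_size=32, verbose=0)

print(model.predict(np.array([[0.5]], dtype="float32")))  # roughly [[2.0]] once trained
```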
By: Here Technologies     Published Date: Mar 29, 2019
Discover the four big trends in fleet management being powered by location services: trends that help you differentiate your solutions and enable transportation companies to overcome their logistical challenges and increase asset utilization. Discover what’s making the biggest impact, and how, by integrating some of these trends into your solutions, you can position yourself as the service provider of choice in fleet and transportation management. And find out how HERE is delivering features, from comprehensive mapping capabilities and real-time location data to truck-specific attributes, to help you do just that. Download the eBook now.
Tags : here technologies, transport and logistics
     Here Technologies
By: Here Technologies     Published Date: Apr 01, 2019
On-demand companies rely on fast, accurate and robust mapping and location technologies to provide their users with a superior experience. Find out how real-time, predictive and historical traffic data can be applied to traffic-enabled routing algorithms to influence route calculations and automatically plot multiple routes with waypoint sequencing. Discover how HERE can help you communicate updated ETAs and provide an optimized experience to your drivers and customers. (A minimal illustrative sketch of traffic-aware routing follows this entry.)
Tags : location data, transport & logistics, location services
     Here Technologies
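To illustrate the traffic-enabled routing idea in the HERE entry above, here is a minimal sketch of traffic-aware shortest-path search in Python: each road segment carries a free-flow travel time and a congestion multiplier, and Dijkstra's algorithm picks the route with the lowest expected travel time. The graph, multipliers and node names are invented for illustration and do not reflect HERE's APIs or data.

```python
import heapq

# Road graph: node -> list of (neighbor, free_flow_minutes, traffic_multiplier).
# Multipliers > 1.0 stand in for real-time or predictive congestion data.
roads = {
    "depot": [("a", 5, 1.0), ("b", 4, 2.5)],
    "a":     [("customer", 6, 1.2)],
    "b":     [("customer", 3, 1.0)],
}

def fastest_route(graph, start, goal):
    """Dijkstra over traffic-adjusted travel times; returns (minutes, path)."""
    queue = [(0.0, start, [start])]
    best = {}
    while queue:
        minutes, node, path = heapq.heappop(queue)
        if node == goal:
            return minutes, path
        if node in best and best[node] <= minutes:
            continue
        best[node] = minutes
        for neighbor, free_flow, multiplier in graph.get(node, []):
            cost = minutes + free_flow * multiplier
            heapq.heappush(queue, (cost, neighbor, path + [neighbor]))
    return float("inf"), []

minutes, path = fastest_route(roads, "depot", "customer")
print(f"ETA {minutes:.1f} min via {' -> '.join(path)}")  # the congested route via b is avoided
```

Swapping the static multipliers for live or predictive traffic feeds is what lets the same algorithm recompute ETAs and replan routes as conditions change.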
By: Group M_IBM Q2'19     Published Date: Apr 08, 2019
Empowering the Automotive Industry through Intelligent Orchestration
With the increasing complexity and volume of cyberattacks, organizations must have the capacity to adapt quickly and confidently under changing conditions. Accelerating incident response times to safeguard the organization's infrastructure and data is paramount. Achieving this requires a thoughtful plan: one that addresses the security ecosystem, incorporates security orchestration and automation, and provides adaptive workflows to empower security analysts. In the white paper "Six Steps for Building a Robust Incident Response Function," IBM Resilient provides a framework for security teams to build a strong incident response program and deliver organization-wide coordination and optimization to accomplish these goals.
Tags : 
     Group M_IBM Q2'19
By: Forcepoint     Published Date: May 14, 2019
Downtime Is Not an Option: High Availability Next Generation Firewall
Access to applications, data, and resources on the Internet is mission-critical for every organization. Downtime is unacceptable. Security for that network must also be highly available and must not cause performance degradation of the network. The increased workload of security devices as they analyze traffic and defend users from malicious attacks strains computing resources. The next generation of security solutions must build in high availability that can scale as the business changes. Download this whitepaper to find out how high availability is at the core of the Forcepoint NGFW (Next Generation Firewall).
Tags : 
     Forcepoint
By: Forcepoint     Published Date: May 14, 2019
Things are not as they used to be in the enterprise. Today’s employees are mobile, they’re storing and accessing data in cloud apps, and they work across disparate networks. While the present-day digital world has changed, the objective of data protection has not: you must still ensure the security of your critical data and intellectual property. However, the threat-centric security approach, with its static policies, forces decisions about cyber activity with no insight into the broader context. The result is a disproportionate number of flagged activities, overwhelming security teams who have no way to understand which ones are most worthy of investigation. Read Rethinking Data Security with a Risk-Adaptive Approach to learn how a human-centric, risk-adaptive approach can help your organization be more proactive in order to:
• Automate policy enforcement to deter data loss events
• Reduce the number of security alerts
• Cut down on incident investigation time
(A minimal illustrative sketch of risk-adaptive enforcement follows this entry.)
Tags : 
     Forcepoint
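As a rough illustration of the risk-adaptive approach described in the Forcepoint entry above, the sketch below scores a user's activity from a few signals and lets the enforcement action scale with the score instead of applying one static rule to everything. The signals, weights and thresholds are invented for illustration and are not Forcepoint's model.

```python
from dataclasses import dataclass

@dataclass
class Activity:
    user: str
    bytes_uploaded: int
    destination: str   # e.g. "corporate_share" or "personal_cloud" (illustrative)
    off_hours: bool

# Illustrative weights: how much each signal contributes to the risk score.
WEIGHTS = {"large_upload": 30, "personal_cloud": 40, "off_hours": 20}

def risk_score(activity: Activity) -> int:
    score = 0
    if activity.bytes_uploaded > 50_000_000:
        score += WEIGHTS["large_upload"]
    if activity.destination == "personal_cloud":
        score += WEIGHTS["personal_cloud"]
    if activity.off_hours:
        score += WEIGHTS["off_hours"]
    return score

def enforce(activity: Activity) -> str:
    """Risk-adaptive enforcement: the response escalates with the score."""
    score = risk_score(activity)
    if score >= 80:
        return "block and alert"          # only the riskiest activity raises an alert
    if score >= 40:
        return "allow but log for review"
    return "allow"

print(enforce(Activity("alice", 120_000_000, "personal_cloud", off_hours=True)))  # block and alert
```

Because most low-risk activity falls through to "allow", a scheme like this flags far fewer events than a static block-everything rule, which is the alert-reduction benefit the entry describes.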