time data

Results 1 - 25 of 1151
By: NetApp     Published Date: Dec 19, 2013
SAP HANA enables real-time access to mission-critical business data and thus revolutionizes the way existing information can be used to address ever-changing business requirements. This white paper describes both the business and technical benefits of implementing the Cisco UCS with NetApp Storage for SAP HANA solution.
Tags : 
     NetApp
By: IBM     Published Date: Sep 16, 2015
6 criteria for evaluating a high-performance cloud services provider
Engineering, scientific, analytics, big data, and research workloads place extraordinary demands on technical and high-performance computing (HPC) infrastructure. Supporting these workloads can be especially challenging for organizations that have unpredictable spikes in resource demand, or that need access to additional compute or storage resources for a project or to support a growing business. Software Defined Infrastructure (SDI) enables organizations to deliver HPC services in the most efficient way possible, optimizing resource utilization to accelerate time to results and reduce costs. SDI is the foundation for a fully integrated environment, optimizing compute, storage, and networking infrastructure to adapt quickly to changing business requirements while dynamically managing workloads and data.
Tags : 
     IBM
By: Storiant     Published Date: Mar 16, 2015
Read this new IDC report about how today's enterprise datacenters are dealing with challenges far more demanding than ever before. Foremost is the exponential growth of data, most of it unstructured. Big data and analytics implementations are also quickly becoming a strategic priority in many enterprises, demanding online access to more data, retained for longer periods of time. Legacy storage solutions with fixed design characteristics and a cost structure that doesn't scale are proving ill-suited for these new needs. This Technology Spotlight examines the issues driving organizations to replace older archive and backup-and-restore systems with business continuity and always-available solutions that can scale to handle extreme data growth while leveraging a cloud-based pricing model. The report also looks at the role of Storiant and its long-term storage services solution in the strategically important long-term storage market.
Tags : storiant, big data, analytics implementations, cloud-based pricing model, long-term storage services solution, long-term storage market
     Storiant
By: Impetus     Published Date: Feb 04, 2016
This white paper explores strategies for leveraging the steady flow of new, advanced real-time streaming data analytics (RTSA) application development technologies. It defines a thoughtful approach to capitalizing on the window of opportunity: benefit from the power of real-time decision making now, while still being able to move to new and emerging technologies as they become enterprise-ready.
Tags : 
     Impetus
By: Impetus     Published Date: Mar 15, 2016
Streaming analytics platforms give businesses a way to extract strategic value from data-in-motion, much as traditional analytics tools operate on data-at-rest. Instead of historical analysis, the goal of streaming analytics is to enable near real-time decision making by letting companies inspect, correlate, and analyze data even as it flows into applications and databases from numerous different sources. Streaming analytics allows companies to do event processing against massive volumes of data streaming into the enterprise at high velocity.
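To make the event-processing idea concrete, below is a minimal sketch of a sliding-window aggregation over data-in-motion, written in plain Python with made-up event names; it illustrates the technique only and is not Impetus's platform or API.

```python
from collections import deque

def sliding_window_counts(events, window_seconds=60):
    """Count events per key over a sliding time window,
    emitting an updated snapshot as each event arrives."""
    window = deque()   # (timestamp, key) pairs currently inside the window
    counts = {}        # key -> count within the current window
    for ts, key in events:
        # Evict events that have aged out of the window.
        while window and window[0][0] <= ts - window_seconds:
            _, old_key = window.popleft()
            counts[old_key] -= 1
            if counts[old_key] == 0:
                del counts[old_key]
        window.append((ts, key))
        counts[key] = counts.get(key, 0) + 1
        yield ts, dict(counts)  # near real-time view, no batch step

# Hypothetical click events arriving over 90 seconds.
events = [(0, "checkout"), (30, "search"), (90, "checkout")]
for ts, snapshot in sliding_window_counts(events):
    print(ts, snapshot)
```

Unlike a query over data-at-rest, the snapshot updates per event, which is the property that enables near real-time decision making.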
Tags : impetus, guide to stream analytics, real time streaming analytics, streaming analytics, real time analytics, big data analytics
     Impetus
By: RedPoint     Published Date: Sep 22, 2014
Enterprises can gain serious traction by taking advantage of the scalability, processing power, and lower costs that Hadoop 2.0/YARN offers. YARN closes the functionality gap by opening Hadoop to mature enterprise-class data management capabilities. With a lot of data quality functionality left outside of Hadoop 1, and a lot of data inside HDFS originating outside the enterprise, the quality of the data residing in the Hadoop cluster is sometimes as stinky as elephant dung. Some of the topics discussed in this paper include:
• The key features, benefits, and limitations of Hadoop 1.0
• The benefits of performing data standardization, identity resolution, and master data management inside of Hadoop
• The transformative power of Hadoop 2.0 and its impact on the speed and cost of accessing, cleansing, and delivering high-quality enterprise data
Download this illuminating white paper about what YARN really means to the world of big data management.
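As a rough illustration of what record-level data standardization inside Hadoop can look like, here is a toy mapper in the style of Hadoop Streaming; the field layout and cleansing rules are invented for the example and are not from the paper.

```python
#!/usr/bin/env python3
"""Toy standardization mapper: reads tab-separated (name, phone, state)
records from stdin, normalizes each field, and writes them back out.
Hadoop Streaming would run one copy of this per input split."""
import re
import sys

def standardize(name, phone, state):
    name = " ".join(name.split()).title()   # collapse whitespace, fix casing
    phone = re.sub(r"\D", "", phone)        # strip everything but digits
    state = state.strip().upper()[:2]       # two-letter state code
    return name, phone, state

for line in sys.stdin:
    fields = line.rstrip("\n").split("\t")
    if len(fields) == 3:                    # skip malformed records
        print("\t".join(standardize(*fields)))
```

Running the cleansing where the data lives, rather than exporting it to an external tool, is the speed and cost argument the paper makes for Hadoop 2.0.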
Tags : 
     RedPoint
By: MEMSQL     Published Date: Apr 12, 2016
The pace of data is not slowing. Today's applications are built with infinite data sets in mind. As these real-time applications become the norm and batch processing becomes a relic of the past, digital enterprises will implement memory-optimized, distributed data systems to simplify Lambda architectures for real-time data processing and exploration.
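As a sketch of why memory-optimized systems can simplify a Lambda architecture, consider a store in which every write is immediately visible to queries, so no separate batch layer is needed to reconcile results; this is illustrative Python, not MemSQL's engine.

```python
from collections import defaultdict

class MemoryStore:
    """Single in-memory store standing in for both the batch
    and speed layers of a classic Lambda architecture."""
    def __init__(self):
        self.by_key = defaultdict(list)   # in-memory secondary index

    def insert(self, key, value):
        self.by_key[key].append(value)    # indexed on the write path

    def total(self, key):
        # Served from memory and reflects every insert so far:
        # there is no batch-recompute lag to wait out.
        return sum(self.by_key[key])

store = MemoryStore()
store.insert("sensor-1", 3.0)
store.insert("sensor-1", 4.5)
print(store.total("sensor-1"))  # 7.5, visible immediately
```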
Tags : 
     MEMSQL
By: Kx Systems     Published Date: Jan 16, 2015
Kdb+ is a column-based relational database with extensive in-memory capabilities, developed and marketed by Kx Systems. Like all such products, it is especially powerful when it comes to supporting queries and analytics. However, unlike other products in this domain, kdb+ is particularly good (both in terms of performance and functionality) at processing, manipulating, and analysing data (especially numeric data) in real time, alongside the analysis of historical data. Moreover, it has extensive capabilities for supporting time-series data. For these reasons Kx Systems has historically targeted the financial industry for trading analytics and black-box trading based on real-time and historic data, as well as real-time risk assessment; applications which are particularly demanding in their performance requirements. The company has had significant success in this market, with over a hundred major financial institutions and hedge funds deploying its technology.
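The column orientation mentioned above is what makes this style of analytics fast: each field is stored as its own contiguous array, so a query touches only the columns it needs. The following is a plain-Python illustration of that idea, not kdb+'s q language.

```python
# A "table" stored column-wise: one list per field, rows align by index.
trades = {
    "time":  [9.0, 9.1, 9.2, 9.3],            # hours, in arrival order
    "sym":   ["AAPL", "IBM", "AAPL", "IBM"],
    "price": [188.0, 142.5, 188.4, 142.9],
}

def avg_price(table, symbol):
    # Scans only the sym and price columns; time is never touched.
    prices = [p for s, p in zip(table["sym"], table["price"]) if s == symbol]
    return sum(prices) / len(prices) if prices else None

print(avg_price(trades, "AAPL"))  # 188.2
```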
Tags : kx systems, kdb+, relational database
     Kx Systems
By: snowflake     Published Date: Jun 09, 2016
THE CHALLENGE: DATA SOLUTIONS CAN’T KEEP PACE WITH DATA NEEDS
Organizations are increasingly dependent on different types of data to make successful business decisions. But as the volume, rate, and types of data expand and become less predictable, conventional data warehouses cannot consume all this data effectively. Big data solutions like Hadoop increase the complexity of the environment and generally lack the performance of traditional data warehouses. This makes it difficult, expensive, and time-consuming to manage all the systems and the data.
Tags : 
     snowflake
By: Cask     Published Date: Jun 28, 2016
A recent Gartner survey on Hadoop cited the two biggest challenges in working with Hadoop: “Skills gaps continue to be a major adoption inhibitor for 57% of respondents, while deciding how to get value from Hadoop was cited by 49% of respondents.” Cask is the company that makes building and deploying big data apps easy, allowing for 5 times faster time to value. To find out more, read about Cask Hydrator, a self-service, open source framework that lets data scientists easily develop and operate data pipelines using a graphical interface.
Tags : cask hydrator, hadoop, gartner survey, self-service data lakes
     Cask
By: BitStew     Published Date: May 26, 2016
The heaviest lift for an industrial enterprise is data integration, the Achilles’ heel of the Industrial Internet of Things (IIoT). Companies are now recognizing the enormous challenge involved in supporting big data strategies that can handle the data generated by information systems, operational systems, and extensive networks of old and new sensors. To compound these issues, business leaders expect data to be captured, analyzed, and used in near real time to optimize business processes, drive efficiency, and improve profitability. However, integrating this vast amount of dissimilar data into a unified data strategy can be overwhelming for even the largest organizations. Download this white paper, by Bit Stew’s Mike Varney, to learn why a big data solution alone will not get the job done, and how to leverage machine intelligence with a purpose-built IIoT platform to solve the data integration problem.
Tags : 
     BitStew
By: Hewlett Packard Enterprise     Published Date: Jul 29, 2019
Learn about the HPE Intelligent Data Platform and the new IT realities it addresses. With digital transformation underway in many organizations, more dynamic business models are becoming the key to success. This means infrastructure modernization and the introduction of technologies such as solid state storage, artificial intelligence and machine learning, software-defined infrastructure, and the cloud. At the same time, it means IT infrastructure management becomes much more complex. Enter HPE’s Intelligent Data Platform. With comprehensive coverage and AI/ML-driven real-time optimization that enables intelligent management of the entire data life cycle, the HPE Intelligent Data Platform enables an organization to get the most out of its IT resources while also meeting its evolving needs over time.
Tags : 
     Hewlett Packard Enterprise
By: Gigamon     Published Date: Sep 03, 2019
We’ve arrived at the second anniversary of the Equifax breach, and we now know much more about what happened thanks to the August 2018 release of the GAO report. New information came out of that report that was not well understood at the time of the breach. For example, did you know that while Equifax used a tool for network-layer decryption, its certificates were nine months out of date? This lapse gave the threat actors all the time they needed to break in and exfiltrate reams of personal data. As soon as Equifax updated the certs on its decryption tools, it began to realize what had happened. On the heels of the Equifax breach, we are reminded of the importance of efficient decryption for effective threat detection. That’s more important than ever today; Ponemon Institute reports that 50% of all malware attacks utilize encryption. During this webinar, we’ll talk about:
- How TLS/SSL encryption has become a threat vector
- Why decryption is essential to security and how to perform it effectively
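The certificate lapse at the center of the breach is straightforward to guard against. As a minimal sketch, the Python standard library alone can report how long a server's TLS certificate has left; the hostname below is a placeholder.

```python
import socket
import ssl
from datetime import datetime, timezone

def days_until_expiry(host, port=443):
    """Connect, validate the chain, and return days until the
    server certificate's notAfter date."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
    # notAfter is formatted like 'Jun  1 12:00:00 2026 GMT'
    not_after = datetime.strptime(cert["notAfter"], "%b %d %H:%M:%S %Y %Z")
    not_after = not_after.replace(tzinfo=timezone.utc)
    return (not_after - datetime.now(timezone.utc)).days

print(days_until_expiry("example.com"))  # placeholder host
```

An expired certificate on an inline decryption tool silently blinds everything behind it, which is exactly the failure the GAO report describes.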
Tags : 
     Gigamon
By: Gigamon     Published Date: Sep 03, 2019
The IT pendulum is swinging to distributed computing environments, network perimeters are dissolving, and compute is being distributed across various parts of organizations’ infrastructure—including, at times, their extended ecosystem. As a result, organizations need to ensure the appropriate levels of visibility and security at these remote locations, without dramatically increasing staff or tools. They need to invest in solutions that can scale to provide increased coverage and visibility, but that also ensure efficient use of resources. By implementing a common distributed data services layer as part of a comprehensive security operations and analytics platform architecture (SOAPA) and network operations architecture, organizations can reduce costs, mitigate risks, and improve operational efficiency.
Tags : 
     Gigamon
By: HERE Technologies     Published Date: Jul 11, 2019
Supply chain managers are increasingly leveraging location intelligence and location data to raise visibility throughout their whole logistics process and to optimize their delivery routes. Leveraging this data requires an ever-more-robust technology stack. As supply chain technology stacks become more complex, diverse, and defined by legacy system integrations, application programming interfaces (APIs) are becoming essential to making stacks scale, allowing supply chain managers to better meet the demands of the new generation of consumers. Innovative location APIs provide supply chain stacks and applications with:
• Greater agility
• Contextual intelligence
• Real-time data implementation
• Speed
• Scale
Introducing new technology into an organization can sometimes be daunting. As one of the world’s leading location platforms, HERE shares insights and tips to streamline supply chain technology integration across the whole organization.
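As a sketch of how such an API is typically consumed from a supply-chain application, here is a generic HTTP routing request; the endpoint URL, parameter names, and auth scheme are hypothetical placeholders, not HERE's documented API.

```python
import requests  # third-party: pip install requests

def fetch_route(api_key, origin, destination):
    """Request a route between two (lat, lon) pairs from a
    hypothetical routing service and return the parsed JSON."""
    resp = requests.get(
        "https://router.example.com/v1/routes",    # placeholder URL
        params={
            "origin": f"{origin[0]},{origin[1]}",
            "destination": f"{destination[0]},{destination[1]}",
            "apiKey": api_key,                     # assumed auth scheme
        },
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()

# route = fetch_route("YOUR_KEY", (52.52, 13.40), (52.50, 13.43))
```

Keeping the route call behind a small function like this is what lets a legacy stack swap providers or add real-time traffic data without touching the rest of the pipeline.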
Tags : here technologies, supply chain, mapping
     HERE Technologies
By: HERE Technologies     Published Date: Aug 28, 2019
Mapping, tracking, positioning, and real-time data are key to supporting defense and intelligence initiatives. Governments and agencies need location data they can trust to track and adjust fixed and mobile resources to address rapidly changing events and circumstances. With Ovum's Location Platform Index: Mapping and Navigation, agencies can assess location platform industry leaders and identify the platform that best meets their product development demands. This year, HERE Technologies cemented its role as the industry leader, earning the highest ranking and besting Google for the second time in a row. Download your free report to learn:
• The relative strengths and weaknesses of each vendor, including data, enablers, and features
• Vendor strategies to keep up with changes in technologies and trends
• The specific workings of the location platform market, to better understand what constitutes a healthy location platform and which provider offers the correct portfolio and the necessary capabilities
Tags : data, platform, mapping, developers, index, vendors, capabilities, maps
     HERE Technologies
By: Cisco Umbrella EMEA     Published Date: Sep 02, 2019
Users are working off-hours, off-network, and off-VPN. Are you up on all the ways DNS can be used to secure them? If not, maybe it’s time to brush up. More than 91% of malware uses DNS to gain command and control, exfiltrate data, or redirect web traffic. Because DNS is a protocol used by all devices that connect to the internet, security at the DNS layer is critical for achieving the visibility and protection you need for any users accessing the internet. Learn how DNS-layer security can help you block threats before they reach your network or endpoints.
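As a toy illustration of the DNS-layer idea, the check below refuses to resolve a domain on a blocklist before any connection is attempted; the blocklist entries are invented, and a real DNS-layer security service maintains its own threat intelligence.

```python
import socket

BLOCKLIST = {"malware-c2.example", "exfil.example"}  # hypothetical domains

def safe_resolve(domain):
    """Resolve a domain only if it is not on the blocklist,
    stopping malicious traffic before a connection exists."""
    if domain.lower().rstrip(".") in BLOCKLIST:
        raise PermissionError(f"{domain} blocked at the DNS layer")
    return socket.gethostbyname(domain)

print(safe_resolve("example.com"))      # resolves normally
# safe_resolve("malware-c2.example")    # would raise before any traffic
```

Because every device performs a DNS lookup before it talks to the internet, enforcing policy at this layer covers users even when they are off-network and off-VPN.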
Tags : 
     Cisco Umbrella EMEA
By: HERE Technologies     Published Date: Jun 14, 2019
With upward of a billion vehicles in operation across the world and rising urbanization, there is an unprecedented level of traffic and congestion in our major towns and cities. On the front line are emergency dispatchers and responders, who face complex challenges as they attempt to overcome congested traffic, unexpected road closures, and work zones as quickly as possible. Each additional minute of response time has the potential to save a life, reduce suffering, or prevent unnecessary property damage. Most public safety and security organizations, however, are still using legacy location technology, which has its limitations and does not properly address some key challenges. As one of the world’s leading location platforms, HERE shares insights and solutions to improve emergency response times with real-time location data.
Tags : mapping, public safety, location data
     HERE Technologies
By: TIBCO Software     Published Date: Feb 13, 2019
Analyst firms Gartner, Inc. and Forrester are projecting accelerated data virtualization adoption for both first-time and expanded deployments. What are the use cases for this technology? At its Data and Analytics Summit in London in March 2018, Gartner answered this question by identifying 13 data virtualization use cases. This paper explores each of these use cases by:
• Identifying key requirements
• Showing how you can apply TIBCO® Data Virtualization to address these needs
• Listing the benefits you can expect when implementing TIBCO Data Virtualization for the use case
Tags : 
     TIBCO Software
By: TIBCO Software     Published Date: Feb 14, 2019
Tips and best practices for data analytics executives
Organizations today understand the value to be derived from arguably their greatest asset—data. When successfully aggregated and analyzed, data can unlock valuable insights, solve problems, improve products and services, and help companies gain a competitive edge. However, analytics executives face significant challenges in collecting, validating, and analyzing data to deliver the right analytic insight to the right person at the right time. This e-book is designed to help. First, we'll explore the growing expectations for data analytics and the rise of the analytics executive. Then we'll explore a range of specific challenges those executives face, including those around data blending, analytics, and the organization itself, and offer best practices and strategies for meeting them. We'll also provide a short overview of TIBCO Statistica, an easy-to-use predictive analytics software solution.
Tags : 
     TIBCO Software
By: Group M_IBM Q3'19     Published Date: Jul 01, 2019
This white paper considers the pressures enterprises face as the volume, variety, and velocity of relevant data mount and the time to insight seems unacceptably long. Most IT environments seeking to leverage statistical data for analysis that can power decision making must glean that data from many sources, put it together in a relational database that requires special configuration and tuning, and only then make it available for data scientists to build models useful to business analysts. The complexity of all this is further compounded by the need to collect and analyze data that may reside in a classic on-premises datacenter as well as in private and public cloud systems, a need that demands the configuration support a hybrid cloud environment. After describing these issues, we consider the usefulness of a purpose-built database system that can accelerate access to and management of relevant data and is designed to deliver high performance.
Tags : 
     Group M_IBM Q3'19
By: MicroStrategy     Published Date: Aug 28, 2019
Why HyperIntelligence?
Today, despite massive investments in data, IT infrastructure, and analytics software, the adoption of analytics continues to lag. In fact, according to Gartner, most organizations fail to hit the 30% mark. That means more than 70% of people at most organizations go without access to the critical information they need to perform to the best of their abilities. What's stopping organizations from breaking through the 30% barrier and driving pervasive adoption of intelligence? Simple. The majority of existing tools cater only to users who are naturally analytically inclined—the analysts, data scientists, and architects of the world. The other 70%—the people making operational decisions daily within a business—simply lack the time, skill, or desire to seek out data and intelligence on their own. HyperIntelligence helps organizations operationalize their existing investments and arm everyone across the organization with intelligence.
Tags : 
     MicroStrategy