
Results 1 - 25 of 1714
By: IBM     Published Date: Sep 02, 2014
Research teams using next-generation sequencing (NGS) technologies face the daunting challenge of supporting compute-intensive analysis methods against petabytes of data while simultaneously keeping pace with rapidly evolving algorithmic best practices. NGS users can now solve these challenges by deploying the Accelrys Enterprise Platform (AEP) and the NGS Collection on optimized systems from IBM. Learn how you can benefit from the turnkey IBM Application Ready Solution for Accelrys with supporting benchmark data.
Tags : ibm, accelrys, turnkey ngs solution
     IBM
By: IBM     Published Date: Sep 02, 2014
With tougher regulations and continuing market volatility, financial firms are moving to active risk management with a focus on counterparty risk. Firms are revamping their risk and trading practices from top to bottom. They are adopting new risk models and frameworks that support a holistic view of risk. Banks recognize that technology is critical for this transformation, and are adding state-of-the-art enterprise risk management solutions, high performance data and grid management software, and fast hardware. Join IBM Algorithmics and IBM Platform Computing to gain insights on this trend and on technologies for enabling active "real-time" risk management.
Tags : 
     IBM
By: Splice Machine     Published Date: Feb 13, 2014
Hadoop: Moving Beyond the Big Data Hype
Let’s face it: there is a lot of hype surrounding Big Data and Hadoop, the de facto Big Data technology platform. Companies want to mine and act on massive data sets, or Big Data, to unlock insights that can help them improve operational efficiency, delight customers, and leapfrog their competition. Hadoop has become popular for storing massive data sets because it can distribute them across inexpensive commodity servers. Hadoop is fundamentally a file system (HDFS, the Hadoop Distributed File System) with a specialized programming model (MapReduce) to process the data in the files. Big Data has not lived up to expectations so far, partly because of limitations of Hadoop as a technology.
Tags : sql-on-hadoop® evaluation guide, splice machine, hadoop
     Splice Machine
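The HDFS-plus-MapReduce split described in the abstract above can be illustrated with a minimal sketch: a word count simulated in plain Python. This is only the programming model, not Hadoop's actual Java API; in a real cluster the map and reduce tasks run distributed across HDFS blocks.

```python
from collections import defaultdict

def map_phase(lines):
    # map: emit a (word, 1) pair for every word in every input line
    for line in lines:
        for word in line.split():
            yield word.lower(), 1

def shuffle(pairs):
    # shuffle: group intermediate values by key, as the framework does
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    # reduce: sum the counts for each word
    return {word: sum(counts) for word, counts in grouped.items()}

lines = ["big data big insights", "big data platforms"]
counts = reduce_phase(shuffle(map_phase(lines)))
print(counts["big"])  # 3
```

The same three-stage shape (map, shuffle, reduce) is what Hadoop parallelizes over commodity servers, which is also the source of the latency limitations the abstract alludes to.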
By: TIBCO     Published Date: Nov 09, 2015
As one of the most exciting and widely adopted open-source projects, Apache Spark in-memory clusters are driving new opportunities for application development as well as increased uptake of IT infrastructure. Apache Spark is now the most active Apache project, with more than 600 contributions made in the last 12 months by more than 200 organizations. A new survey of 1,417 IT professionals working with Apache Spark, conducted by Databricks, finds that high-performance analytics applications that can work with big data are driving a large proportion of that demand. Apache Spark is now being used to aggregate multiple types of data in-memory, versus only pulling data from Hadoop. For solution providers, the Apache Spark technology stack is a significant player because it’s one of the core technologies used to modernize data warehouses, a huge segment of the IT industry that accounts for multiple billions in revenue. Spark holds much promise for the future—with data lakes—a storage repo
Tags : 
     TIBCO
By: Storiant     Published Date: Mar 16, 2015
Read this new IDC Report about how today's enterprise datacenters are dealing with new challenges that are far more demanding than ever before. Foremost is the exponential growth of data, most of it unstructured data. Big data and analytics implementations are also quickly becoming a strategic priority in many enterprises, demanding online access to more data, which is retained for longer periods of time. Legacy storage solutions with fixed design characteristics and a cost structure that doesn't scale are proving to be ill-suited for these new needs. This Technology Spotlight examines the issues that are driving organizations to replace older archive and backup-and-restore systems with business continuity and always-available solutions that can scale to handle extreme data growth while leveraging a cloud-based pricing model. The report also looks at the role of Storiant and its long-term storage services solution in the strategically important long-term storage market.
Tags : storiant, big data, analytics implementations, cloud-based pricing model, long-term storage services solution, long-term storage market
     Storiant
By: Dell and Intel®     Published Date: Jun 18, 2015
The rapid evolution of big data technology in the past few years has changed forever the pursuit of scientific exploration and discovery. Along with traditional experiment and theory, computational modeling and simulation is a third paradigm for science. Its value lies in exploring areas of science in which physical experimentation is unfeasible and insights cannot be revealed analytically, such as in climate modeling, seismology and galaxy formation. More recently, big data has been called the “fourth paradigm” of science. Big data can be observed, in a real sense, by computers processing it and often by humans reviewing visualizations created from it. In the past, humans had to reduce the data, often using techniques of statistical sampling, to be able to make sense of it. Now, new big data processing techniques will help us make sense of it without traditional reduction.
Tags : 
     Dell and Intel®
By: RedPoint     Published Date: Nov 10, 2014
Adoption of Hadoop by data-driven organizations is exploding. Hadoop’s potential cost effectiveness and facility for accepting unstructured data is making it central to modern, “Big Data” architectures. The advancements in Hadoop 2.0 increase the technology’s promise to an even greater extent. But with these opportunities also come challenges and adoption hurdles that make getting the most out of Hadoop easier said than done. Read on as we review some Hadoop basics, highlight some of the adoption challenges that exist and explain how RedPoint Data Management for Hadoop helps organizations accelerate their work with Hadoop.
Tags : 
     RedPoint
By: GridGain     Published Date: Sep 24, 2014
In-memory computing (IMC) is an emerging field of importance in the big data industry. It is a quickly evolving technology, seen by many as an effective way to address the proverbial 3 V’s of big data—volume, velocity, and variety. Big data requires ever more powerful means to process and analyze growing stores of data, being collected at more rapid rates, and with increasing diversity in the types of data being sought—both structured and unstructured. In-memory computing’s rapid rise in the marketplace has the big data community on alert. In fact, Gartner picked in-memory computing as one of the Top Ten Strategic Initiatives.
Tags : gridgain, in-memory computing, big data industry, 3 v's of big data
     GridGain
By: Dell and Intel®     Published Date: Apr 02, 2015
In this Guide we have delivered the case for the benefits of big data technology applied to the needs of the manufacturing industry. In demonstrating the value of big data, we included:
• An overview of how manufacturing can benefit from the big data technology stack
• A high-level view of common big data pain points for manufacturers
• A detailed analysis of big data technology for manufacturers
• A view as to how manufacturers are going about big data adoption
• A proven case study with Omneo
• Dell PowerEdge servers with Intel® Xeon® processors
Tags : dell, intel, big data, manufacturing, technology stack, pain points, big data adoption, omneo
     Dell and Intel®
By: Dell and Intel®     Published Date: Aug 24, 2015
Many enterprises are embracing Hadoop because of the unique business benefits it provides. But, until now, this rapidly evolving big data technology hadn’t always met enterprise security needs. In order to protect big data today, organizations must have solutions that address four key areas: authentication, authorization, audit and lineage, and compliant data protection.
Tags : 
     Dell and Intel®
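As a hedged illustration of the first two of the four areas named above, Hadoop's cluster-wide security settings live in core-site.xml; enabling Kerberos authentication and service-level authorization looks roughly like this (the property names are from stock Apache Hadoop; audit, lineage, and compliant data protection are handled by additional ecosystem components, not shown here):

```xml
<!-- core-site.xml: cluster-wide security settings (sketch) -->
<configuration>
  <property>
    <name>hadoop.security.authentication</name>
    <!-- authentication: require Kerberos rather than the default "simple" mode -->
    <value>kerberos</value>
  </property>
  <property>
    <name>hadoop.security.authorization</name>
    <!-- authorization: enforce service-level access control lists -->
    <value>true</value>
  </property>
</configuration>
```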
By: Dell and Intel®     Published Date: Aug 24, 2015
Business need: Merkle needed a scalable, cost-effective way to capture and analyze large amounts of structured and unstructured consumer data for use in developing better marketing campaigns for clients.
Solution: The company deployed a Dell Hadoop™ cluster based on Dell and Intel® technologies to support a new big data insight solution that gives clients a unified view of customer data.
Benefits:
• Partnership with Dell and Intel® leads to new big data solution
• Cluster supports the Foundational Marketing Platform, a new data insight solution
• Merkle can find patterns in big data and create analytical models that anticipate consumer behavior
• Organization cuts costs by 60 percent and boosts processing speeds by 10 times
• Solution provides scalability and enables innovation
Tags : 
     Dell and Intel®
By: Dell and Intel®     Published Date: Aug 24, 2015
New technologies help decision makers gain insights from all types of data - from traditional databases to high-visibility social media sources. Big data initiatives must ensure data is cost-effectively managed, shared by systems across the enterprise, and quickly and securely made available for analysis and action by line-of-business teams. In this article, learn how Dell working with Intel® helps IT leaders overcome the challenges of IT and business alignment, resource constraints and siloed environments through a comprehensive big data portfolio based on choice and flexibility, redefined economics and connected intelligence.
Tags : 
     Dell and Intel®
By: Dell and Intel®     Published Date: Aug 24, 2015
In today’s digitally driven world, the success of a business is increasingly tied to its ability to extract value from data. Exploiting the untapped value of your data is now the pathway to success. By putting data-driven decision making at the heart of the business, your organization can harness a wealth of information to gain an unparalleled competitive advantage. In a future-ready enterprise, you must make a fundamental shift from a focus on technology to a strategic business focus. Data-driven insights can guide everything from the formulation of top-level corporate strategies to connected devices that monitor and enable immediate critical decisions, to the creation of personalized customer interactions. Data is the foundation for enabling business transformation and innovation.
Tags : 
     Dell and Intel®
By: Dell and Intel®     Published Date: Aug 24, 2015
To extract value from an ever-growing onslaught of data, your organization needs next-generation data management, integration, storage and processing systems that allow you to collect, manage, store and analyze data quickly, efficiently and cost-effectively. That’s the case with Dell | Cloudera® Apache™ Hadoop® solutions for big data. These solutions provide end-to-end scalable infrastructure, leveraging open source technologies, to allow you to simultaneously store and process large datasets in a distributed environment for data mining and analysis, on both structured and unstructured data, and to do it all in an affordable manner.
Tags : 
     Dell and Intel®
By: Dell and Intel®     Published Date: Sep 06, 2015
In conclusion, the retail experience has changed dramatically in recent years as there has been a power shift over to consumers. Shoppers can easily find and compare products from an array of devices, even while walking through a store. They can share their opinions about retailers and products through social media and influence other prospective customers. To compete in this new multi-channel environment, we’ve seen in this guide how retailers have to adopt new and innovative strategies to attract and retain customers. Big data technologies, specifically Hadoop, enable retailers to connect with customers through multiple channels at an entirely new level by harnessing the vast volumes of new data available today. Hadoop helps retailers store, transform, integrate and analyze a wide variety of online and offline customer data—POS transactions, e-commerce transactions, clickstream data, email, social media, sensor data and call center records—all in one central repository. Retailers can
Tags : 
     Dell and Intel®
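The multi-channel unification described above can be sketched with a toy example: records from hypothetical POS and clickstream feeds (field names and data invented for illustration) merged into one per-customer view, the kind of consolidation a central Hadoop repository performs at far larger scale.

```python
from collections import defaultdict

pos_transactions = [  # offline point-of-sale records (invented sample data)
    {"customer": "c1", "amount": 40.0},
    {"customer": "c2", "amount": 15.5},
    {"customer": "c1", "amount": 10.0},
]
clickstream = [  # online browsing events (invented sample data)
    {"customer": "c1", "page": "/shoes"},
    {"customer": "c2", "page": "/sale"},
    {"customer": "c2", "page": "/shoes"},
]

def unify(pos, clicks):
    # Merge both channels into a single per-customer profile.
    profiles = defaultdict(lambda: {"spend": 0.0, "pages_viewed": 0})
    for t in pos:
        profiles[t["customer"]]["spend"] += t["amount"]
    for e in clicks:
        profiles[e["customer"]]["pages_viewed"] += 1
    return dict(profiles)

profiles = unify(pos_transactions, clickstream)
print(profiles["c1"])  # {'spend': 50.0, 'pages_viewed': 1}
```

With all channels keyed to one customer identity, the unified profiles become the input for the segmentation and campaign analytics the guide describes.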
By: snowflake     Published Date: Jun 09, 2016
Data and the way that data is used have changed, but data warehousing has not. Today’s premises-based data warehouses are based on technology that is, at its core, two decades old. To meet the demands and opportunities of today, data warehouses have to fundamentally change.
Tags : 
     snowflake
By: snowflake     Published Date: Jun 09, 2016
Why Read This Report
In the era of big data, enterprise data warehouse (EDW) technology continues to evolve as vendors focus on innovation and advanced features around in-memory, compression, security, and tighter integration with Hadoop, NoSQL, and cloud. Forrester identified the 10 most significant EDW software and services providers — Actian, Amazon Web Services (AWS), Hewlett Packard Enterprise (HPE), IBM, Microsoft, Oracle, Pivotal Software, SAP, Snowflake Computing, and Teradata — in the category and researched, analyzed, and scored them. This report details our findings about how well each vendor fulfills our criteria and where they stand in relation to each other to help enterprise architect professionals select the right solution to support their data warehouse platform.
Tags : 
     snowflake
By: snowflake     Published Date: Jun 09, 2016
Today’s data, and how that data is used, have changed dramatically in the past few years. Data now comes from everywhere—not just enterprise applications, but also websites, log files, social media, sensors, web services, and more. Organizations want to make that data available to all of their analysts as quickly as possible, not limit access to only a few highly-skilled data scientists. However, these efforts are quickly frustrated by the limitations of current data warehouse technologies. These systems simply were not built to handle the diversity of today’s data and analytics. They are based on decades-old architectures designed for a different world, a world where data was limited, users of data were few, and all processing was done in on-premises datacenters.
Tags : 
     snowflake
By: Hewlett Packard Enterprise     Published Date: Jul 29, 2019
Learn about the HPE Intelligent Data Platform and the new IT realities it addresses. With digital transformation underway in many organizations, more dynamic business models are becoming the key to success. This means infrastructure modernization and the introduction of technologies such as solid state storage, artificial intelligence and machine learning, software-defined infrastructure, and the cloud. At the same time, it means IT infrastructure management becomes much more complex. Enter HPE’s Intelligent Data Platform. With comprehensive coverage and AI/ML-driven real-time optimization that enables intelligent management of the entire data life cycle, the HPE Intelligent Data Platform enables an organization to get the most out of its IT resources while also meeting its evolving needs over time.
Tags : 
     Hewlett Packard Enterprise
By: VMware     Published Date: Sep 12, 2019
You’ve heard the stories: a large Internet company exposing all three billion of its customer accounts; a major hotel chain compromising five hundred million customer records; and one of the big-three credit reporting agencies exposing more than 143 million records, leading to a 25 percent loss in value and a $439 million hit. At the time, all of these companies had security mechanisms in place. They had trained professionals on the job. They had invested heavily in protection. But the reality is that no amount of investment in preventative technologies can fully eliminate the threat of savvy attackers, malicious insiders, or inadvertent victims of phishing. Breaches are rising, and so are their costs. In 2018, the average cost of a data breach rose 6.4 percent to $3.86 million, and the cost of a “mega breach,” defined as losing 1 million to 50 million records, carried especially punishing price tags between $40 million and $350 million.² Despite increasing investment in security
Tags : 
     VMware
By: F5 Networks Singapore Pte Ltd     Published Date: Sep 09, 2019
Tech advances like the cloud, mobile technology, and the app-based software model have changed the way today’s modern business operates. They’ve also changed the way criminals attack and steal from businesses. Criminals strive to be agile in much the same way that companies do. Spreading malware is a favorite technique among attackers. According to the 2019 Data Breach Investigations Report, 28% of data breaches included malware.¹ While malware’s pervasiveness may not come as a surprise to many people, what’s not always so well understood is that automating app attacks—by means of malicious bots —is the most common way cybercriminals commit their crimes and spread malware. It helps them achieve scale.
Tags : 
     F5 Networks Singapore Pte Ltd
By: Gigamon     Published Date: Sep 03, 2019
Network performance and security are vital elements of any business. Organisations are increasingly adopting virtualisation and cloud technologies to boost productivity, cost savings and market reach. With the added complexity of distributed network architectures, full visibility is necessary to ensure continued high performance and security. Greater volumes of data, rapidly evolving threats and stricter regulations have forced organisations to deploy new categories of security tools, e.g. Web Application Firewalls (WAFs) or Intrusion Prevention Systems (IPS). Yet, simply adding more security tools may not always be the most efficient solution.
Tags : 
     Gigamon
By: HERE Technologies     Published Date: Aug 28, 2019
Mapping, tracking, positioning and real-time data are key to supporting defense and intelligence initiatives. Governments and agencies need location data they can trust to track and adjust fixed and mobile resources to address rapidly changing events and circumstances. With Ovum's Location Platform Index: Mapping and Navigation, agencies can assess location platform industry leaders and identify the platform that best meets their product development demands. This year, HERE Technologies cemented its role as the industry leader, earning the highest ranking, and besting Google, for the second time in a row. Download your free report to learn: the relative strengths and weaknesses of each vendor, including data, enablers and features; vendor strategies to keep up with changes in technologies and trends; and the specific workings of the location platform market, and to better understand what constitutes a healthy location platform and which provider offers the correct portfolio and the necess
Tags : data, platform, mapping, developers, index, vendors, capabilities, maps
     HERE Technologies
By: TIBCO Software     Published Date: Jul 22, 2019
Connected Intelligence in Insurance
Insurance as we know it is transforming dramatically, thanks to capabilities brought about by new technologies such as machine learning and artificial intelligence (AI). Download this IDC Analyst Infobrief to learn about how the new breed of insurers are becoming more personalized, more predictive, and more real-time than ever. What you will learn:
• The insurance industry's global digital trends, supported by data and analysis
• What capabilities will make the insurers of the future become disruptors in their industry
• Notable leaders based on IDC Financial Insights research and their respective use cases
• Essential guidance from IDC
Tags : 
     TIBCO Software
By: CloudHealth by VMware     Published Date: Sep 05, 2019
Both the speed of innovation and the uniqueness of cloud technology is forcing security teams everywhere to rethink classic security concepts and processes. In order to keep their cloud environment secure, businesses are implementing new security strategies that address the distributed nature of cloud infrastructure. Security in the cloud involves policies, procedures, controls, and technologies working together to protect your cloud resources, which includes stored data, deployed applications, and more. But how do you know which cloud service provider offers the best security services? And what do you do if you’re working on improving security for a hybrid or multicloud environment? This ebook provides a security comparison across the three main public cloud providers: Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform (GCP). With insight from leading cloud experts, we also analyze the differences between security in the cloud and on-premises infrastructure, debunk
Tags : 
     CloudHealth by VMware
Get your white papers featured in the insideBIGDATA White Paper Library contact: Kevin@insideHPC.com