data platform

Results 1 - 25 of 767. Sort Results By: Published Date | Title | Company Name
By: NetApp     Published Date: Dec 13, 2013
Despite the hype, Big Data has introduced critical challenges for modern organizations – and unprepared organizations risk getting buried beneath an avalanche of information. In this informative webcast, join industry and business intelligence (BI) expert Wayne Eckerson, as he tackles the challenges of Big Data. Uncover practical tips and tactics for driving value with your Big Data platform – watch now to learn more.
Tags : big data problems, how to get the most from your big data
     NetApp
By: IBM     Published Date: Sep 02, 2014
Life Sciences organizations need to build IT infrastructures that are dynamic, scalable, and easy to deploy and manage, with simplified provisioning, high performance, and high utilization, and that can exploit both data-intensive and server-intensive workloads, including Hadoop MapReduce. Solutions must scale, in terms of both processing and storage, to better serve the institution long-term. Data has a life cycle, and making it usable for mainstream analyses and applications is an important aspect of system design. This presentation describes these IT requirements and how Technical Computing solutions from IBM and Platform Computing address these challenges and deliver greater ROI and accelerated time to results for Life Sciences.
Tags : 
     IBM
By: IBM     Published Date: Sep 02, 2014
Whether in high-performance computing, Big Data or analytics, information technology has become an essential tool in today’s hyper-competitive business landscape. Organizations are increasingly being challenged to do more with less and this is fundamentally impacting the way that IT infrastructure is deployed and managed. In this short e-book, learn the top ten ways that IBM Platform Computing customers are using technologies like IBM Platform LSF and IBM Platform Symphony to help obtain results faster, share resources more efficiently, and improve the overall cost-effectiveness of their global IT infrastructure.
Tags : ibm, ibm platform computing, save money
     IBM
By: IBM     Published Date: Sep 02, 2014
Research teams using next-generation sequencing (NGS) technologies face the daunting challenge of supporting compute-intensive analysis methods against petabytes of data while simultaneously keeping pace with rapidly evolving algorithmic best practices. NGS users can now solve these challenges by deploying the Accelrys Enterprise Platform (AEP) and the NGS Collection on optimized systems from IBM. Learn how you can benefit from the turnkey IBM Application Ready Solution for Accelrys with supporting benchmark data.
Tags : ibm, accelrys, turnkey ngs solution
     IBM
By: IBM     Published Date: Sep 02, 2014
With tougher regulations and continuing market volatility, financial firms are moving to active risk management with a focus on counterparty risk. Firms are revamping their risk and trading practices from top to bottom. They are adopting new risk models and frameworks that support a holistic view of risk. Banks recognize that technology is critical for this transformation, and are adding state-of-the-art enterprise risk management solutions, high performance data and grid management software, and fast hardware. Join IBM Algorithmics and IBM Platform Computing to gain insights on this trend and on technologies for enabling active "real-time" risk management.
Tags : 
     IBM
By: IBM     Published Date: May 20, 2015
Whether in high-performance computing, Big Data or analytics, information technology has become an essential tool in today’s hyper-competitive business landscape. Organizations are increasingly being challenged to do more with less and this is fundamentally impacting the way that IT infrastructure is deployed and managed. In this short e-book, learn the top ten ways that IBM Platform Computing customers are using technologies like IBM Platform LSF and IBM Platform Symphony to help obtain results faster, share resources more efficiently, and improve the overall cost-effectiveness of their global IT infrastructure.
Tags : 
     IBM
By: IBM     Published Date: May 20, 2015
IBM Platform Symphony: accelerate big data analytics. This demonstration will highlight the benefits and features of IBM Platform Symphony to accelerate big data analytics by maximizing distributed system performance, fully utilizing computing resources and effectively harnessing the power of Hadoop.
Tags : 
     IBM
By: IBM     Published Date: May 20, 2015
Every day, the world creates 2.5 quintillion bytes of data and businesses are realizing tangible results from investments in big data analytics. IBM Spectrum Scale (GPFS) offers an enterprise class alternative to Hadoop Distributed File System (HDFS) for building big data platforms and provides a range of enterprise-class data management features. Spectrum Scale can be deployed independently or with IBM’s big data platform, consisting of IBM InfoSphere® BigInsights™ and IBM Platform™ Symphony. This document describes best practices for deploying Spectrum Scale in such environments to help ensure optimal performance and reliability.
Tags : 
     IBM
By: IBM     Published Date: May 20, 2015
There is a lot of hype around the potential of big data, and organizations are hoping to achieve new innovations in products and services, with big data and analytics driving more concrete insights about their customers and their own business operations. To meet these challenges, IBM has introduced IBM® Spectrum Scale™. The new IBM Spectrum Scale storage platform has grown from GPFS, which entered the market in 1998; clearly, IBM has put significant effort into developing this mature platform. Spectrum Scale addresses the key requirements of big data storage: extreme scalability for growth, reduced overhead of data movement, easy accessibility, geographic location independence, and advanced storage functionality. Read the paper to learn more!
Tags : 
     IBM
By: Splice Machine     Published Date: Feb 13, 2014
Hadoop: Moving Beyond the Big Data Hype
Let’s face it: there is a lot of hype surrounding Big Data and Hadoop, the de facto Big Data technology platform. Companies want to mine and act on massive data sets, or Big Data, to unlock insights that can help them improve operational efficiency, delight customers, and leapfrog their competition. Hadoop has become popular for storing massive data sets because it can distribute them across inexpensive commodity servers. Hadoop is fundamentally a file system (HDFS, the Hadoop Distributed File System) with a specialized programming model (MapReduce) to process the data in the files. Big Data has not lived up to expectations so far, partly because of limitations of Hadoop as a technology.
Tags : sql-on-hadoop® evaluation guide, splice machine, hadoop
     Splice Machine
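The MapReduce model the abstract describes (a map step that emits key-value pairs, followed by a reduce step that aggregates them) is classically illustrated with word count. The sketch below simulates the map, shuffle, and reduce phases in plain Python; this is an in-process illustration only, since a real Hadoop job would distribute these phases across the cluster over HDFS-resident data.

```python
from collections import defaultdict

def map_phase(lines):
    """Map: emit a (word, 1) pair for every word in the input lines."""
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def shuffle(pairs):
    """Shuffle: group values by key, as the framework does between phases."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: sum the counts emitted for each word."""
    return {word: sum(counts) for word, counts in groups.items()}

lines = ["big data big insights", "big data platforms"]
counts = reduce_phase(shuffle(map_phase(lines)))
print(counts["big"])  # 3
```

The same word-count logic, written against Hadoop's Java MapReduce API or an SQL-on-Hadoop layer, would scale to data sets far larger than any single machine's memory.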
By: EMC     Published Date: Jun 13, 2016
EMC® Isilon® is a simple and scalable platform for building a scale-out data lake. Consolidate storage silos, improve storage utilization, reduce costs, and provide a future-proofed platform to run today's and tomorrow's workloads.
Tags : 
     EMC
By: EMC     Published Date: Jun 13, 2016
IDC believes that EMC Isilon is indeed an easy-to-operate, highly scalable, and efficient Enterprise Data Lake Platform. IDC validated that a shared storage model based on the Data Lake can in fact provide enterprise-grade service levels while performing better than dedicated commodity off-the-shelf storage for Hadoop workloads.
Tags : 
     EMC
By: IBM     Published Date: Sep 04, 2013
This white paper outlines the value of the big data that continues to accumulate within your organization. It shows how, by making this data more accessible to relevant stakeholders, you can unlock value and insights while minimizing risk.
Tags : 
     IBM
By: IBM     Published Date: Nov 14, 2014
Every day, the world creates 2.5 quintillion bytes of data and businesses are realizing tangible results from investments in big data analytics. IBM Spectrum Scale (GPFS) offers an enterprise class alternative to Hadoop Distributed File System (HDFS) for building big data platforms and provides a range of enterprise-class data management features. Spectrum Scale can be deployed independently or with IBM’s big data platform, consisting of IBM InfoSphere® BigInsights™ and IBM Platform™ Symphony. This document describes best practices for deploying Spectrum Scale in such environments to help ensure optimal performance and reliability.
Tags : 
     IBM
By: IBM     Published Date: Feb 13, 2015
To quickly and economically meet new and peak demands, Platform LSF (SaaS) and Platform Symphony (SaaS) workload management as well as Elastic Storage on Cloud data management software can be delivered as a service, complete with SoftLayer cloud infrastructure and 24x7 support for technical computing and service-oriented workloads. Watch this demonstration to learn how the IBM Platform Computing Cloud Service can be used to simplify and accelerate financial risk management using IBM Algorithmics.
Tags : 
     IBM
By: Dell and Intel®     Published Date: Aug 24, 2015
Business need: Merkle needed a scalable, cost-effective way to capture and analyze large amounts of structured and unstructured consumer data for use in developing better marketing campaigns for clients. Solution: The company deployed a Hadoop™ cluster based on Dell and Intel® technologies to support a new big data insight solution that gives clients a unified view of customer data. Benefits:
• Partnership with Dell and Intel® leads to new big data solution
• Cluster supports the Foundational Marketing Platform, a new data insight solution
• Merkle can find patterns in big data and create analytical models that anticipate consumer behavior
• Organization cuts costs by 60 percent and boosts processing speeds by 10 times
• Solution provides scalability and enables innovation
Tags : 
     Dell and Intel®
By: snowflake     Published Date: Jun 09, 2016
Why Read This Report
In the era of big data, enterprise data warehouse (EDW) technology continues to evolve as vendors focus on innovation and advanced features around in-memory, compression, security, and tighter integration with Hadoop, NoSQL, and cloud. Forrester identified the 10 most significant EDW software and services providers — Actian, Amazon Web Services (AWS), Hewlett Packard Enterprise (HPE), IBM, Microsoft, Oracle, Pivotal Software, SAP, Snowflake Computing, and Teradata — in the category and researched, analyzed, and scored them. This report details our findings about how well each vendor fulfills our criteria and where they stand in relation to each other to help enterprise architect professionals select the right solution to support their data warehouse platform.
Tags : 
     snowflake
By: BitStew     Published Date: May 26, 2016
The heaviest lift for an industrial enterprise is data integration, the Achilles’ heel of the Industrial Internet of Things (IIoT). Companies are now recognizing the enormous challenge involved in supporting Big Data strategies that can handle the data generated by information systems, operational systems, and extensive networks of old and new sensors. To compound these issues, business leaders expect data to be captured, analyzed, and used in near real time to optimize business processes, drive efficiency, and improve profitability. However, integrating this vast amount of dissimilar data into a unified data strategy can be overwhelming for even the largest organizations. Download this white paper, by Bit Stew’s Mike Varney, to learn why a generic big data solution will not get the job done, and how to leverage machine intelligence with a purpose-built IIoT platform to solve the data integration problem.
Tags : 
     BitStew
By: Dell APAC     Published Date: May 29, 2019
Digital transformation (DX) is reaching a macroeconomic scale. DX business objectives balance the tactical and the strategic, ranging from improving operational efficiencies and customer satisfaction to increasing existing product revenue, improving profit margins, and launching new digital revenue streams. Successful DX relies on utilizing data for services as well as converting data into actionable insights. This reliance on data is contributing to a new digital era. 3rd Platform (cloud, social, mobile, and Big Data) computing is the underpinning of DX worldwide. It enables collection of a vast breadth of data sets and delivers the agility and efficiency needed to accelerate DX.
Tags : 
     Dell APAC
By: Dell APAC     Published Date: May 30, 2019
Digital transformation (DX) is reaching a macroeconomic scale. DX business objectives balance the tactical and the strategic, ranging from improving operational efficiencies and customer satisfaction to increasing existing product revenue, improving profit margins, and launching new digital revenue streams. Successful DX relies on utilizing data for services as well as converting data into actionable insights. This reliance on data is contributing to a new digital era. 3rd Platform (cloud, social, mobile, and Big Data) computing is the underpinning of DX worldwide. It enables collection of a vast breadth of data sets and delivers the agility and efficiency needed to accelerate DX.
Tags : 
     Dell APAC
By: Hewlett Packard Enterprise     Published Date: Jul 29, 2019
Learn about the HPE Intelligent Data Platform and the new IT realities it addresses. With digital transformation underway in many organizations, more dynamic business models are becoming the key to success. This means infrastructure modernization and the introduction of technologies such as solid state storage, artificial intelligence and machine learning, software-defined infrastructure, and the cloud. At the same time, it means IT infrastructure management becomes much more complex. Enter HPE’s Intelligent Data Platform. With comprehensive coverage and AI/ML-driven real-time optimization that enables intelligent management of the entire data life cycle, the HPE Intelligent Data Platform enables an organization to get the most out of its IT resources while also meeting its evolving needs over time.
Tags : 
     Hewlett Packard Enterprise
By: Hewlett Packard Enterprise     Published Date: Jul 25, 2019
Learn about the HPE Intelligent Data Platform and the new IT realities it addresses. With digital transformation underway in many organizations, more dynamic business models are becoming the key to success. This means infrastructure modernization and the introduction of technologies such as solid state storage, artificial intelligence and machine learning, software-defined infrastructure, and the cloud. At the same time, it means IT infrastructure management becomes much more complex. Enter HPE’s Intelligent Data Platform. With comprehensive coverage and AI/ML-driven real-time optimization that enables intelligent management of the entire data life cycle, the HPE Intelligent Data Platform enables an organization to get the most out of its IT resources while also meeting its evolving needs over time.
Tags : 
     Hewlett Packard Enterprise
By: HERE Technologies     Published Date: Jul 11, 2019
Supply chain managers are increasingly leveraging location intelligence and location data to raise visibility throughout their whole logistics process and to optimize their delivery routes. Leveraging this data requires an ever-more-robust technology stack. As supply chain technology stacks become more complex, diverse, and defined by legacy system integrations, Application Programming Interfaces (APIs) are becoming essential to making stacks scale, allowing supply chain managers to better meet the demands of the new generation of consumers. Innovative location APIs provide supply chain stacks and applications with:
• Greater agility
• Contextual intelligence
• Real-time data implementation
• Speed
• Scale
Introducing new technology into an organization can sometimes be daunting. As one of the world’s leading location platforms, HERE shares insights and tips to streamline supply chain technology integration across the whole organization.
Tags : here technologies, supply chain, mapping
     HERE Technologies
By: TIBCO Software     Published Date: Feb 14, 2019
With the new TIBCO Spotfire® A(X) Experience, we are revolutionizing analytics and business intelligence. This new platform accelerates the personal and enterprise analytics experience so you can get from data to insights in the fastest possible way. With the fusion of technology enablers like machine learning, artificial intelligence, and natural language search, the Spotfire® X platform redefines what’s possible for analytics and business intelligence, simplifying for everyone how data and insights are generated, consumed, and acted on. Download this whitepaper to learn more, then check out the new Spotfire analytics. It’s unlike anything you have ever seen. Simple, yet powerful, it changes everything.
Tags : 
     TIBCO Software
Get your white papers featured in the insideBIGDATA White Paper Library contact: Kevin@insideHPC.com