
Results 1 - 25 of 2327
By: NetApp     Published Date: Dec 13, 2013
Interested in running a Hadoop proof of concept on enterprise-class storage? Download this solutions guide for a technical overview of building Hadoop on NetApp E-Series storage. NetApp Open Solution for Hadoop delivers big analytics with pre-engineered, compatible, and supported solutions based on high-quality storage platforms, reducing the cost, schedule, and risk of do-it-yourself systems and relieving the skills gap most organizations have with Hadoop. See how ongoing operational and maintenance costs can be reduced with a highly available and scalable Hadoop solution.
Tags : open solutions, hadoop solutions guide
     NetApp
By: NetApp     Published Date: Dec 13, 2013
Despite the hype, Big Data has introduced critical challenges for modern organizations – and unprepared organizations risk getting buried beneath an avalanche of information. In this informative webcast, join industry and business intelligence (BI) expert Wayne Eckerson, as he tackles the challenges of Big Data. Uncover practical tips and tactics for driving value with your Big Data platform – watch now to learn more.
Tags : big data problems, how to get the most from your big data
     NetApp
By: IBM     Published Date: Sep 02, 2014
Life Sciences organizations need to build IT infrastructures that are dynamic, scalable, and easy to deploy and manage, with simplified provisioning, high performance, and high utilization, able to exploit both data-intensive and server-intensive workloads, including Hadoop MapReduce. Solutions must scale, in terms of both processing and storage, in order to better serve the institution long-term. Life cycle management of data, and making that data usable for mainstream analyses and applications, is an important aspect of system design. This presentation describes IT requirements and how Technical Computing solutions from IBM and Platform Computing address these challenges and deliver greater ROI and accelerated time to results for Life Sciences.
Tags : 
     IBM
By: IBM     Published Date: Sep 02, 2014
Whether in high-performance computing, Big Data or analytics, information technology has become an essential tool in today’s hyper-competitive business landscape. Organizations are increasingly being challenged to do more with less and this is fundamentally impacting the way that IT infrastructure is deployed and managed. In this short e-book, learn the top ten ways that IBM Platform Computing customers are using technologies like IBM Platform LSF and IBM Platform Symphony to help obtain results faster, share resources more efficiently, and improve the overall cost-effectiveness of their global IT infrastructure.
Tags : ibm, ibm platform computing, save money
     IBM
By: IBM     Published Date: Sep 02, 2014
Research teams using next-generation sequencing (NGS) technologies face the daunting challenge of supporting compute-intensive analysis methods against petabytes of data while simultaneously keeping pace with rapidly evolving algorithmic best practices. NGS users can now solve these challenges by deploying the Accelrys Enterprise Platform (AEP) and the NGS Collection on optimized systems from IBM. Learn how you can benefit from the turnkey IBM Application Ready Solution for Accelrys with supporting benchmark data.
Tags : ibm, accelrys, turnkey ngs solution
     IBM
By: IBM     Published Date: Sep 02, 2014
With tougher regulations and continuing market volatility, financial firms are moving to active risk management with a focus on counterparty risk. Firms are revamping their risk and trading practices from top to bottom. They are adopting new risk models and frameworks that support a holistic view of risk. Banks recognize that technology is critical for this transformation, and are adding state-of-the-art enterprise risk management solutions, high performance data and grid management software, and fast hardware. Join IBM Algorithmics and IBM Platform Computing to gain insights on this trend and on technologies for enabling active "real-time" risk management.
Tags : 
     IBM
By: IBM     Published Date: May 20, 2015
Whether in high-performance computing, Big Data or analytics, information technology has become an essential tool in today’s hyper-competitive business landscape. Organizations are increasingly being challenged to do more with less and this is fundamentally impacting the way that IT infrastructure is deployed and managed. In this short e-book, learn the top ten ways that IBM Platform Computing customers are using technologies like IBM Platform LSF and IBM Platform Symphony to help obtain results faster, share resources more efficiently, and improve the overall cost-effectiveness of their global IT infrastructure.
Tags : 
     IBM
By: IBM     Published Date: May 20, 2015
IBM Platform Symphony: accelerate big data analytics. This demonstration highlights the benefits and features of IBM Platform Symphony for accelerating big data analytics by maximizing distributed system performance, fully utilizing computing resources, and effectively harnessing the power of Hadoop.
Tags : 
     IBM
By: IBM     Published Date: May 20, 2015
Every day, the world creates 2.5 quintillion bytes of data and businesses are realizing tangible results from investments in big data analytics. IBM Spectrum Scale (GPFS) offers an enterprise class alternative to Hadoop Distributed File System (HDFS) for building big data platforms and provides a range of enterprise-class data management features. Spectrum Scale can be deployed independently or with IBM’s big data platform, consisting of IBM InfoSphere® BigInsights™ and IBM Platform™ Symphony. This document describes best practices for deploying Spectrum Scale in such environments to help ensure optimal performance and reliability.
Tags : 
     IBM
By: IBM     Published Date: May 20, 2015
There is a lot of hype around the potential of big data, and organizations hope to achieve new innovations in products and services with big data and analytics driving more concrete insights about their customers and their own business operations. To meet these challenges, IBM has introduced IBM® Spectrum Scale™. The new IBM Spectrum Scale storage platform has grown from GPFS, which entered the market in 1998; clearly, IBM has invested significant effort in developing this mature platform. Spectrum Scale addresses the key requirements of big data storage: extreme scalability for growth, reduced overhead of data movement, easy accessibility, geographic location independence, and advanced storage functionality. Read the paper to learn more!
Tags : 
     IBM
By: Splice Machine     Published Date: Feb 13, 2014
Hadoop: Moving Beyond the Big Data Hype
Let’s face it. There is a lot of hype surrounding Big Data and Hadoop, the de facto Big Data technology platform. Companies want to mine and act on massive data sets, or Big Data, to unlock insights that can help them improve operational efficiency, delight customers, and leapfrog their competition. Hadoop has become popular for storing massive data sets because it can distribute them across inexpensive commodity servers. Hadoop is fundamentally a file system (HDFS, the Hadoop Distributed File System) with a specialized programming model (MapReduce) to process the data in the files. Big Data has not lived up to expectations so far, partly because of limitations of Hadoop as a technology.
Tags : sql-on-hadoop® evaluation guide, splice machine, hadoop
     Splice Machine
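The Splice Machine entry above describes Hadoop's core abstraction: a distributed file system (HDFS) plus the MapReduce programming model. A minimal sketch of that model in plain Python (not Hadoop's actual Java API; the function names here are illustrative) shows the map-emit, shuffle-by-key, and reduce-aggregate flow using the classic word-count example:

```python
# Illustrative sketch of the MapReduce model: map emits (key, value)
# pairs, a shuffle groups them by key, and reduce aggregates each group.
from collections import defaultdict

def map_phase(documents):
    """Map step: emit a (word, 1) pair for every word in every document."""
    for doc in documents:
        for word in doc.split():
            yield word.lower(), 1

def shuffle_and_reduce(pairs):
    """Shuffle pairs by key, then reduce each group by summing its counts."""
    groups = defaultdict(int)
    for word, count in pairs:
        groups[word] += count
    return dict(groups)

docs = ["Hadoop stores big data", "MapReduce processes big data"]
counts = shuffle_and_reduce(map_phase(docs))
print(counts["big"])   # 2
print(counts["data"])  # 2
```

In a real Hadoop cluster the map and reduce steps run as parallel tasks on the nodes holding the HDFS blocks, which is what makes the model scale across commodity servers.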
By: RYFT     Published Date: Apr 03, 2015
The new Ryft ONE platform is a scalable 1U device that addresses a major need in the fast-growing market for advanced analytics — avoiding I/O bottlenecks that can seriously impede analytics performance on today's hyperscale cluster systems. The Ryft ONE platform is designed for easy integration into existing cluster and other server environments, where it functions as a dedicated, high-performance analytics engine. IDC believes that the new Ryft ONE platform is well positioned to exploit the rapid growth we predict for the high-performance data analysis market.
Tags : ryft, ryft one platform, 1u device, advanced analytics, avoiding i/o bottlenecks, idc
     RYFT
By: IBM     Published Date: Nov 14, 2014
Every day, the world creates 2.5 quintillion bytes of data and businesses are realizing tangible results from investments in big data analytics. IBM Spectrum Scale (GPFS) offers an enterprise class alternative to Hadoop Distributed File System (HDFS) for building big data platforms and provides a range of enterprise-class data management features. Spectrum Scale can be deployed independently or with IBM’s big data platform, consisting of IBM InfoSphere® BigInsights™ and IBM Platform™ Symphony. This document describes best practices for deploying Spectrum Scale in such environments to help ensure optimal performance and reliability.
Tags : 
     IBM
By: IBM     Published Date: Feb 13, 2015
To quickly and economically meet new and peak demands, Platform LSF (SaaS) and Platform Symphony (SaaS) workload management as well as Elastic Storage on Cloud data management software can be delivered as a service, complete with SoftLayer cloud infrastructure and 24x7 support for technical computing and service-oriented workloads. Watch this demonstration to learn how the IBM Platform Computing Cloud Service can be used to simplify and accelerate financial risk management using IBM Algorithmics.
Tags : 
     IBM
By: Dell and Intel®     Published Date: Aug 24, 2015
Business need: Merkle needed a scalable, cost-effective way to capture and analyze large amounts of structured and unstructured consumer data for use in developing better marketing campaigns for clients. Solution: The company deployed a Dell Hadoop™ cluster based on Dell and Intel® technologies to support a new big data insight solution that gives clients a unified view of customer data. Benefits:
• Partnership with Dell and Intel® leads to new big data solution
• Cluster supports the Foundational Marketing Platform, a new data insight solution
• Merkle can find patterns in big data and create analytical models that anticipate consumer behavior
• Organization cuts costs by 60 percent and boosts processing speeds by 10 times
• Solution provides scalability and enables innovation
Tags : 
     Dell and Intel®
By: Dell and Intel®     Published Date: Aug 24, 2015
This business-oriented white paper summarizes the wide-ranging benefits of the Hadoop platform, highlights common data processing use cases and explores examples of specific use cases in vertical industries. The information presented here draws on the collective experiences of three leaders in the use of Hadoop technologies—Dell and its partners Cloudera® and Intel®.
Tags : 
     Dell and Intel®
By: snowflake     Published Date: Jun 09, 2016
Why Read This Report
In the era of big data, enterprise data warehouse (EDW) technology continues to evolve as vendors focus on innovation and advanced features around in-memory, compression, security, and tighter integration with Hadoop, NoSQL, and cloud. Forrester identified the 10 most significant EDW software and services providers — Actian, Amazon Web Services (AWS), Hewlett Packard Enterprise (HPE), IBM, Microsoft, Oracle, Pivotal Software, SAP, Snowflake Computing, and Teradata — in the category and researched, analyzed, and scored them. This report details our findings about how well each vendor fulfills our criteria and where they stand in relation to each other to help enterprise architect professionals select the right solution to support their data warehouse platform.
Tags : 
     snowflake
By: BitStew     Published Date: May 26, 2016
The heaviest lift for an industrial enterprise is data integration, the Achilles’ heel of the Industrial Internet of Things (IIoT). Companies are now recognizing the enormous challenge involved in supporting Big Data strategies that can handle the data generated by information systems, operational systems, and extensive networks of old and new sensors. To compound these issues, business leaders expect data to be captured, analyzed, and used in near real time to optimize business processes, drive efficiency, and improve profitability. However, integrating this vast amount of dissimilar data into a unified data strategy can be overwhelming for even the largest organizations. Download this white paper, by Bit Stew’s Mike Varney, to learn why a big data solution alone will not get the job done, and how to leverage machine intelligence with a purpose-built IIoT platform to solve the data integration problem.
Tags : 
     BitStew
By: Cisco EMEA     Published Date: Oct 01, 2019
The Cisco experience for meetings and team collaboration. The workplace has changed, and companies with the most flexible workforces outperform traditional ones. But the reality of modern business is that you have to foster a culture of innovation yourself. Teams need a workspace geared toward innovation and speed. Meet Cisco Webex Teams, a platform that helps teams do it all. It is an essential tool for forward-thinking small and medium-sized businesses.
Tags : 
     Cisco EMEA
By: Hewlett Packard Enterprise     Published Date: Jul 29, 2019
Learn about the HPE Intelligent Data Platform and the new IT realities it addresses. With digital transformation underway in many organizations, more dynamic business models are becoming the key to success. This means infrastructure modernization and the introduction of technologies such as solid state storage, artificial intelligence and machine learning, software-defined infrastructure, and the cloud. At the same time, it means IT infrastructure management becomes much more complex. Enter HPE’s Intelligent Data Platform. With comprehensive coverage and AI/ML-driven real-time optimization that enables intelligent management of the entire data life cycle, the HPE Intelligent Data Platform enables an organization to get the most out of its IT resources while also meeting its evolving needs over time.
Tags : 
     Hewlett Packard Enterprise
By: Hewlett Packard Enterprise     Published Date: Jul 29, 2019
As you strive to deliver the speed your business demands, this new whitepaper from IDC provides the latest insights about the trends and challenges of delivering IT services in the new hybrid cloud world. You’ll also learn the characteristics at the heart of the new hybrid cloud platforms and find out how the right solutions and the right partner enable an extension of the cloud experience across your business in a way that’s open, flexible, and hybrid by design.
Tags : 
     Hewlett Packard Enterprise
By: Hewlett Packard Enterprise     Published Date: Jul 25, 2019
Provision IT the way you want it, when you want it, through a platform that automates operations. That's Composable Infrastructure. Learn how it represents a key competitive advantage for organizations that want to succeed in the Idea Economy.
Tags : 
     Hewlett Packard Enterprise
By: Nutanix     Published Date: Aug 22, 2019
Organizations can now fully automate hybrid cloud architecture deployments, scaling both multitiered and distributed applications across different cloud environments, including Amazon Web Services (AWS) and Google Cloud Platform (GCP). Ready to learn more about hyperconverged infrastructure and the Nutanix Enterprise Cloud? Contact us at info@nutanix.com, follow us on Twitter @nutanix, or send us a request at www.nutanix.com/demo to set up your own customized briefing and demonstration to see how validated and certified solutions from Nutanix can help your organization make the most of its enterprise applications.
Tags : 
     Nutanix
By: Gigamon     Published Date: Sep 03, 2019
This white paper examines the security issues introduced by more data over faster networks, explains how an architectural approach can solve those challenges, and introduces the GigaSECURE® Security Delivery Platform, the leading next-generation network packet broker purpose-built to help security tools work more efficiently across physical, virtual, and cloud environments. In fact, IHS Markit has named Gigamon the market leader and the best-known vendor in the space, with #1 market share in multiple industries: 36% overall and 59% in the government sector.
Tags : 
     Gigamon
By: Gigamon     Published Date: Sep 03, 2019
The IT pendulum is swinging to distributed computing environments, network perimeters are dissolving, and compute is being distributed across various parts of organizations’ infrastructure—including, at times, their extended ecosystem. As a result, organizations need to ensure the appropriate levels of visibility and security at these remote locations, without dramatically increasing staff or tools. They need to invest in solutions that can scale to provide increased coverage and visibility, but that also ensure efficient use of resources. By implementing a common distributed data services layer as part of a comprehensive security operations and analytics platform architecture (SOAPA) and network operations architecture, organizations can reduce costs, mitigate risks, and improve operational efficiency.
Tags : 
     Gigamon
Get your white papers featured in the insideBIGDATA White Paper Library contact: Kevin@insideHPC.com