data systems

Results 1 - 25 of 833
By: IBM     Published Date: Sep 02, 2014
Research teams using next-generation sequencing (NGS) technologies face the daunting challenge of supporting compute-intensive analysis methods against petabytes of data while simultaneously keeping pace with rapidly evolving algorithmic best practices. NGS users can now solve these challenges by deploying the Accelrys Enterprise Platform (AEP) and the NGS Collection on optimized systems from IBM. Learn how you can benefit from the turnkey IBM Application Ready Solution for Accelrys with supporting benchmark data.
Tags : ibm, accelrys, turnkey ngs solution
     IBM
By: IBM     Published Date: Sep 16, 2015
The IBM Spectrum Scale solution delivered up to 11x better throughput than EMC Isilon for Spectrum Protect (TSM) workloads. Using published data, Edison compared a solution built on EMC® Isilon® against an IBM® Spectrum Scale™ solution. (IBM Spectrum Scale was formerly IBM® General Parallel File System™ or IBM® GPFS™, also known by the code name Elastic Storage.) For both solutions, IBM® Spectrum Protect™ (formerly IBM Tivoli® Storage Manager or IBM® TSM®) was used as a common workload performing backups to the target storage systems evaluated.
Tags : 
     IBM
By: Storiant     Published Date: Mar 16, 2015
Read this new IDC report about how today's enterprise datacenters are dealing with new challenges that are far more demanding than ever before. Foremost is the exponential growth of data, most of it unstructured. Big data and analytics implementations are also quickly becoming a strategic priority in many enterprises, demanding online access to more data, retained for longer periods of time. Legacy storage solutions with fixed design characteristics and a cost structure that doesn't scale are proving ill-suited for these new needs. This Technology Spotlight examines the issues that are driving organizations to replace older archive and backup-and-restore systems with business continuity and always-available solutions that can scale to handle extreme data growth while leveraging a cloud-based pricing model. The report also looks at the role of Storiant and its long-term storage services solution in the strategically important long-term storage market.
Tags : storiant, big data, analytics implementations, cloud-based pricing model, long-term storage services solution, long-term storage market
     Storiant
By: MEMSQL     Published Date: Apr 12, 2016
The pace of data is not slowing. Applications of today are built with infinite data sets in mind. As these real-time applications become the norm, and batch processing becomes a relic of the past, digital enterprises will implement memory-optimized, distributed data systems to simplify Lambda Architectures for real-time data processing and exploration.
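To make the claim concrete, below is a minimal, purely illustrative Python sketch (not MemSQL code; the class and method names are hypothetical) of what "simplifying the Lambda Architecture" means: rather than maintaining separate batch and speed layers whose outputs must be merged, a single memory-optimized store serves both the real-time write path and the analytical read path.

```python
# Illustrative sketch only: one in-memory store handling both real-time
# ingest and analytical queries, in contrast to a Lambda Architecture's
# separate batch and speed layers.
from collections import defaultdict
from threading import Lock

class UnifiedStore:
    """Toy memory-optimized store: one write path, one query path."""
    def __init__(self):
        self._events = defaultdict(list)  # metric name -> list of values
        self._lock = Lock()

    def ingest(self, key, value):
        # Real-time path: an event is queryable the moment it lands.
        with self._lock:
            self._events[key].append(value)

    def query_sum(self, key):
        # Analytical path: reads the same data the write path just touched,
        # so there is no batch layer to wait for and no view-merging step.
        with self._lock:
            return sum(self._events[key])

store = UnifiedStore()
store.ingest("clicks", 3)
store.ingest("clicks", 5)
print(store.query_sum("clicks"))  # 8, visible immediately after ingest
```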
Tags : 
     MEMSQL
By: Data Direct Networks     Published Date: Apr 08, 2014
DataDirect Networks (DDN), the largest privately held provider of high-performance storage, has a large and growing presence in HPC markets. HPC users identify DDN as their storage provider more than any other storage-focused company, with twice the mentions of EMC, and more than twice the mentions of NetApp, Hitachi Data Systems, or Panasas. (5) DDN's strength in HPC is anchored by its Storage Fusion Architecture (SFA), winner of the HPCwire Editor's Choice Award for "Best HPC Storage Product or Technology" in each of the past three years. The DDN SFA12KX combines SATA, SAS, and solid-state disks (SSDs) for an environment that can be tailored to a balance of throughput and capacity.
Tags : 
     Data Direct Networks
By: Intel     Published Date: Aug 06, 2014
Powering Big Data Workloads with Intel® Enterprise Edition for Lustre* software

The Intel® portfolio for high-performance computing provides the following technology solutions:

• Compute - The Intel® Xeon processor E7 family provides a leap forward for every discipline that depends on HPC, with industry-leading performance and improved performance per watt. Add Intel® Xeon Phi coprocessors to your clusters and workstations to increase performance for highly parallel applications and code segments. Each coprocessor can add over a teraflop of performance and is compatible with software written for the Intel® Xeon processor E7 family. You don't need to rewrite code or master new development tools.

• Storage - High-performance, highly scalable storage solutions with Intel® Lustre and Intel® Xeon processor E7 based storage systems for centralized storage. Reliable and responsive local storage with Intel® Solid State Drives.

• Networking - Intel® True Scale Fabric and Networking technologies, built for HPC to deliver fast message rates and low latency.

• Software and Tools - A broad range of software and tools to optimize and parallelize your software and clusters.

Further, Intel Enterprise Edition for Lustre software is backed by Intel, the recognized technical support provider for Lustre, and includes 24/7 service level agreement (SLA) coverage.
Tags : 
     Intel
By: Dell and Intel®     Published Date: Aug 24, 2015
New technologies help decision makers gain insights from all types of data, from traditional databases to high-visibility social media sources. Big data initiatives must ensure data is cost-effectively managed, shared by systems across the enterprise, and quickly and securely made available for analysis and action by line-of-business teams. In this article, learn how Dell, working with Intel®, helps IT leaders overcome the challenges of IT and business alignment, resource constraints, and siloed environments through a comprehensive big data portfolio based on choice and flexibility, redefined economics, and connected intelligence.
Tags : 
     Dell and Intel®
By: Dell and Intel®     Published Date: Aug 24, 2015
Organizations working to gather insights from vast volumes of varied data types understand that they need more than traditional structured systems and tools. This paper discusses how the many Dell | Cloudera Hadoop solutions help organizations of all sizes, and with a variety of needs and use cases, tackle their big data requirements.
Tags : 
     Dell and Intel®
By: Dell and Intel®     Published Date: Aug 24, 2015
To extract value from an ever-growing onslaught of data, your organization needs next-generation data management, integration, storage and processing systems that allow you to collect, manage, store and analyze data quickly, efficiently and cost-effectively. That's the case with Dell | Cloudera® Apache™ Hadoop® solutions for big data. These solutions provide end-to-end scalable infrastructure, leveraging open source technologies, to allow you to simultaneously store and process large datasets in a distributed environment for data mining and analysis, on both structured and unstructured data, and to do it all in an affordable manner.
Tags : 
     Dell and Intel®
By: Kx Systems     Published Date: Jan 16, 2015
Kdb+ is a column-based relational database with extensive in-memory capabilities, developed and marketed by Kx Systems. Like all such products, it is especially powerful when it comes to supporting queries and analytics. However, unlike other products in this domain, kdb+ is particularly good (both in terms of performance and functionality) at processing, manipulating and analysing data (especially numeric data) in real time, alongside the analysis of historical data. Moreover, it has extensive capabilities for supporting time-series data. For these reasons Kx Systems has historically targeted the financial industry for trading analytics and black-box trading based on real-time and historic data, as well as real-time risk assessment; applications which are particularly demanding in their performance requirements. The company has had significant success in this market, with over a hundred major financial institutions and hedge funds deploying its technology. In this paper, however, we…
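Real kdb+ queries are written in the q language; as a language-neutral illustration of the columnar, time-series style of analysis described above (not q syntax; the table and column names are hypothetical), the following Python sketch computes a volume-weighted average price per symbol by operating on whole columns at once.

```python
# Illustrative Python sketch of column-oriented time-series aggregation,
# the access pattern kdb+ is optimized for. A "trade" table is stored
# column-wise as parallel arrays rather than as row objects.
import numpy as np

sym   = np.array(["AAPL", "MSFT", "AAPL", "AAPL", "MSFT"])
price = np.array([100.1,  50.2,  100.3,  100.2,  50.4])
size  = np.array([200,    100,   300,    100,    400])

# Volume-weighted average price per symbol: whole-column arithmetic,
# which is why columnar stores excel at this kind of analytic query.
for s in np.unique(sym):
    mask = sym == s
    vwap = np.sum(price[mask] * size[mask]) / np.sum(size[mask])
    print(f"{s}: vwap={vwap:.3f}")
```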
Tags : kx systems, kdb+, relational database
     Kx Systems
By: snowflake     Published Date: Jun 09, 2016
Today’s data, and how that data is used, have changed dramatically in the past few years. Data now comes from everywhere—not just enterprise applications, but also websites, log files, social media, sensors, web services, and more. Organizations want to make that data available to all of their analysts as quickly as possible, not limit access to only a few highly skilled data scientists. However, these efforts are quickly frustrated by the limitations of current data warehouse technologies. These systems simply were not built to handle the diversity of today’s data and analytics. They are based on decades-old architectures designed for a different world, a world where data was limited, users of data were few, and all processing was done in on-premises datacenters.
Tags : 
     snowflake
By: snowflake     Published Date: Jun 09, 2016
THE CHALLENGE: DATA SOLUTIONS CAN’T KEEP PACE WITH DATA NEEDS
Organizations are increasingly dependent on different types of data to make successful business decisions. But as the volume, rate, and types of data expand and become less predictable, conventional data warehouses cannot consume all this data effectively. Big data solutions like Hadoop increase the complexity of the environment and generally lack the performance of traditional data warehouses. This makes it difficult, expensive, and time-consuming to manage all the systems and the data.
Tags : 
     snowflake
By: BitStew     Published Date: May 26, 2016
The heaviest lift for an industrial enterprise is data integration, the Achilles’ heel of the Industrial Internet of Things (IIoT). Companies are now recognizing the enormous challenge involved in supporting Big Data strategies that can handle the data generated by information systems, operational systems, and the extensive networks of old and new sensors. To compound these issues, business leaders are expecting data to be captured, analyzed, and used in near real-time to optimize business processes, drive efficiency, and improve profitability. However, integrating this vast amount of dissimilar data into a unified data strategy can be overwhelming for even the largest organizations. Download this white paper, by Bit Stew’s Mike Varney, to learn why a big data solution will not get the job done. Learn how to leverage machine intelligence with a purpose-built IIoT platform to solve the data integration problem.
Tags : 
     BitStew
By: Oracle     Published Date: Sep 25, 2019
Research shows that legacy ERP 1.0 systems were not designed for usability and insight. More than three quarters of business leaders say their current ERP system doesn’t meet their requirements, let alone future plans [1]. These systems lack modern best-practice capabilities needed to compete and grow. To enable today’s data-driven organization, the very foundation from which you are operating needs to be re-established; it needs to be “modernized”. Oracle’s goal is to help you navigate your own journey to modernization by sharing the knowledge we’ve gained working with many thousands of customers using both legacy and modern ERP systems. To that end, we’ve crafted this handbook outlining the fundamental characteristics that define modern ERP.
Tags : 
     Oracle
By: F5 Networks Singapore Pte Ltd     Published Date: Sep 19, 2019
"Security analysts have a tougher job than ever. New vulnerabilities and security attacks used to be a monthly occurrence, but now they make the headlines almost every day. It’s become much more difficult to effectively monitor and protect all the data passing through your systems. Automated attacks from bad bots that mimic human behavior have raised the stakes, allowing criminals to have machines do the work for them. Not only that, these bots leave an overwhelming number of alert bells, false positives, and inherent stress in their wake for security practitioners to sift through. Today, you really need a significant edge when combating automated threats launched from all parts of the world. Where to start? With spending less time investigating all that noise in your logs."
Tags : 
     F5 Networks Singapore Pte Ltd
By: Gigamon     Published Date: Sep 03, 2019
Network performance and security are vital elements of any business. Organisations are increasingly adopting virtualisation and cloud technologies to boost productivity, cost savings and market reach. With the added complexity of distributed network architectures, full visibility is necessary to ensure continued high performance and security. Greater volumes of data, rapidly evolving threats and stricter regulations have forced organisations to deploy new categories of security tools, e.g. Web Application Firewalls (WAFs) or Intrusion Prevention Systems (IPS). Yet simply adding more security tools may not always be the most efficient solution.
Tags : 
     Gigamon
By: BehavioSec     Published Date: Oct 04, 2019
In this case study, a large enterprise with an increasing amount of off-site work, from both work-related travel and a fast-growing remote workforce, faces a unique challenge: ensuring that its data security is scalable and impenetrable. Its data access policies rely on physical access management provided at the company offices and do not always provide off-site employees with the ability to complete work-critical tasks. Legacy security solutions only burden productivity, sometimes causing employees to ignore security protocols simply to complete their work. Upon evaluating security vendors for a frictionless solution, the company selected BehavioSec for its enterprise-grade capabilities, on-premise deployment, and integration with existing legacy risk management systems.
Tags : 
     BehavioSec
By: TIBCO Software     Published Date: Aug 02, 2019
As an insurer, the challenges you face today are unprecedented. Siloed, heterogeneous existing systems make it difficult and costly to understand what’s going on inside and outside your business. Your systems weren’t set up to take advantage of, or even handle, the volume, velocity, and variety of new data streaming in from the internet of things, sensors, wearables, telematics, weather, social media, and more. And they weren’t designed for heavy human interaction. Millennials demand immediate information and services across digital channels. Can your systems keep up?
Tags : 
     TIBCO Software
By: Infinidat EMEA     Published Date: May 14, 2019
Big Data and analytics workloads represent a new frontier for organizations. Data is being collected from sources that did not exist 10 years ago. Mobile phone data, machine-generated data, and website interaction data are all being collected and analyzed. In addition, as IT budgets are already under pressure, Big Data footprints are getting larger and posing a huge storage challenge. This paper provides information on the issues that Big Data applications pose for storage systems and how choosing the correct storage infrastructure can streamline and consolidate Big Data and analytics applications without breaking the bank.
Tags : 
     Infinidat EMEA
By: Infinidat EMEA     Published Date: May 14, 2019
Continuous data availability is a key business continuity requirement for storage systems. It ensures protection against downtime in case of serious incidents or disasters and enables recovery to an operational state within a reasonably short period. To ensure continuous availability, storage solutions need to meet resiliency, recovery, and contingency requirements outlined by the organization.
Tags : 
     Infinidat EMEA
By: IBM APAC     Published Date: Sep 30, 2019
Companies that are undergoing a technology-enabled business strategy such as digital transformation urgently need modern infrastructure solutions. The solutions should be capable of supporting extreme performance and scalability, uncompromised data-serving capabilities, and pervasive security and encryption. According to IDC, IBM’s LinuxONE combines the advantages of both commercial (IBM Z) and open-source (Linux) systems, with security capabilities unmatched by any other offering and scalability for systems-of-record workloads. The report also adds that LinuxONE will be a good fit for enterprises as well as managed and cloud service provider firms. Read more about the benefits of LinuxONE in this IDC white paper.
Tags : 
     IBM APAC
By: Selligent Marketing Cloud     Published Date: Sep 24, 2019
Every company markets to consumers differently. From call centers to emails to apps and aggregator sites, orchestrating a relationship marketing strategy requires a bespoke collection of marketing technologies. Marketers have the budgets to spend on CRM, email, mobile and data management, but fitting these capabilities together and ensuring they work with legacy business systems is not easy.
Tags : 
     Selligent Marketing Cloud
By: ASG Software Solutions     Published Date: Feb 24, 2010
A recent survey of CIOs found that over 75% want to develop an overall information strategy in the next three years, yet over 85% are not close to implementing an enterprise-wide content management strategy. Meanwhile, data runs rampant, slows systems, and impacts performance. Hard-copy documents multiply, become damaged, or simply disappear.
Tags : asg, cmdb, bsm, itil, metacmdb, archiving, sap, ilm, mobius, workload automation, wla, visibility, configuration management, metadata, lob, sdm, service dependency mapping, ecommerce
     ASG Software Solutions
By: Gigaom     Published Date: Oct 24, 2019
A huge array of BI, analytics, data prep and machine learning platforms exist in the market, and each of those may have a variety of connectors to different databases, file systems and applications, both on-premises and in the cloud. But in today’s world of myriad data sources, simple connectivity is just table stakes. What’s essential is a data access strategy that accounts for the variety of data sources out there, including relational and NoSQL databases, file formats across storage systems — even enterprise SaaS applications — and can make them all consumable by tools and applications built for tabular data. In today’s data-driven business environment, fitting omni-structured data and disparate applications into a consistent data API makes comprehensive integration, and insights, achievable. Want to learn more and map out your data access strategy? Join us for this free 1-hour webinar from GigaOm Research. The webinar features GigaOm analyst Andrew Brust and special guests, Eric…
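As a rough illustration of the "consistent data API" idea described above (all names are hypothetical, not any vendor's actual API), the following Python sketch shows adapters that expose two very different sources, a CSV file and a document store, through one tabular row interface, so a tool written against that interface can consume either source unchanged.

```python
# Hypothetical sketch of a consistent data API: adapters normalize
# disparate sources into the same tabular row format, making them
# consumable by tools built for tabular data.
import csv
from typing import Iterator

class TabularSource:
    def rows(self) -> Iterator[dict]:
        raise NotImplementedError

class CsvSource(TabularSource):
    """Adapter for a flat file in a storage system."""
    def __init__(self, path):
        self.path = path
    def rows(self):
        with open(self.path, newline="") as f:
            yield from csv.DictReader(f)

class DocumentSource(TabularSource):
    """Stands in for a NoSQL/document store; flattens docs to rows."""
    def __init__(self, docs):
        self.docs = docs
    def rows(self):
        for doc in self.docs:
            yield {"id": doc.get("_id"), "name": doc.get("name")}

# A tool written against TabularSource works on any adapter, unchanged.
for source in (DocumentSource([{"_id": 1, "name": "acme"}]),):
    for row in source.rows():
        print(row)
```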
Tags : 
     Gigaom
By: ASME     Published Date: Oct 03, 2019
One of the most frustrating aspects of measuring severe pyroshock events is the acceleration offset that almost invariably occurs. Depending on its magnitude, this can result in large, low-frequency errors in both shock response spectra (SRS) and velocity-based damage analyses. Fortunately, recent developments in accelerometer technology, signal conditioning, and data acquisition systems have reduced these errors significantly. Best practices have been demonstrated to produce offset errors of less than 0.25% of the peak-to-peak value in measured near-field pyrotechnic accelerations: a remarkable achievement. This paper will discuss the sensing technologies, including both piezoelectric and piezoresistive, that have come together to minimize these offsets. More important, it will document the many other potential contributors to these offsets. Included among these are accelerometer mounting issues, cable and connector sources, signal conditioning amplitude range/bandwidth, and digitizi…
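A toy numerical illustration in Python (not from the paper; the pulse shape and sample rate are assumed) of why such offsets matter: integrating a constant acceleration offset yields a velocity error that grows linearly with time, which directly corrupts velocity-based damage analyses.

```python
# Why a DC acceleration offset corrupts velocity-based analysis:
# integrating a constant offset produces a linearly growing drift.
import numpy as np

fs = 100_000.0                      # sample rate, Hz (assumed)
t = np.arange(0, 0.1, 1 / fs)       # 100 ms record
# Synthetic pyroshock-like pulse: decaying 2 kHz oscillation, in g.
accel = 1e4 * np.exp(-t / 0.005) * np.sin(2 * np.pi * 2000 * t)

offset = 0.0025 * np.ptp(accel)     # DC offset = 0.25% of peak-to-peak
measured = accel + offset

g = 9.80665                          # m/s^2 per g
v_true = np.cumsum(accel) / fs * g   # integrate to velocity, m/s
v_meas = np.cumsum(measured) / fs * g

# Drift equals offset * g * record length, growing without bound.
drift = v_meas[-1] - v_true[-1]
print(f"velocity drift after {t[-1]:.3f} s: {drift:.3f} m/s")
```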
Tags : 
     ASME