enterprise data quality

Results 1 - 25 of 25
By: RedPoint     Published Date: Sep 22, 2014
Enterprises can gain serious traction by taking advantage of the scalability, processing power and lower costs that Hadoop 2.0/YARN offers. YARN closes the functionality gap by opening Hadoop to mature enterprise-class data management capabilities. With much data quality functionality left outside of Hadoop 1.0, and much of the data inside HDFS originating outside the enterprise, the quality of the data residing in the Hadoop cluster is sometimes as stinky as elephant dung. Some of the topics discussed in this paper include:
• The key features, benefits and limitations of Hadoop 1.0
• The benefits of performing data standardization, identity resolution and master data management inside of Hadoop
• The transformative power of Hadoop 2.0 and its impact on the speed and cost of accessing, cleansing and delivering high-quality enterprise data
Download this illuminating white paper about what YARN really means to the world of big data management.
Tags : 
     RedPoint
By: Zynapse     Published Date: Jun 16, 2010
Data Governance has emerged as the point of convergence for people, technology and process in managing the crucial data (information) of an enterprise. It is a vital link in the overall data management process, for it maintains the quality of data and makes it available to a wide range of decision-making hierarchies across an organization.
Tags : zynapse, erp projects, data information, data management, governance, mdm, master data management, odm
     Zynapse
By: Dun & Bradstreet     Published Date: Mar 03, 2017
Complexity, globalization and digitalization are just some of the elements at play in the risk landscape, and data is becoming a core part of understanding and navigating risk. How do modern finance leaders view, navigate and manage enterprise risk with data? Dun & Bradstreet surveyed global finance leaders across industries and business types. Here are the top trends that emerged from the study:
1. The Enterprise Risk & Strategy Disconnect: Finance leaders are using data and managing risk programs, but over 65% of finance leaders say there's a missing link between risk and strategy.
2. The Risks of the Use and Misuse of Data: Up to 50% of the data used to manage modern risk is disconnected. Only 15% of leaders are confident about the quality of their data.
3. Risky Relationships: Only 20% of finance leaders say the data they use to manage risk is fully integrated and shared.
Download the study to learn how finance leaders are approaching data and enterprise risk management.
Tags : 
     Dun & Bradstreet
By: ServiceNow     Published Date: Oct 18, 2013
Certifying the accuracy of a CMDB with inconsistent, manual methods is unreliable. ServiceNow's built-in data certification gives you the ability to automate the process. Learn how data certification works with the ServiceNow CMDB to deliver a trustworthy single system of record you can rely on.
Tags : configuration management, data certification, product management, servicenow, enterprise it cloud, it cloud company, infrastructure applications, management applications, operational applications, identifying configuration items, defining configuration items, integration, webinar, data quality
     ServiceNow
By: Schneider Electric     Published Date: Mar 28, 2019
To win the colocation race you need to be faster, reliable, innovative and efficient, all while making smarter design choices that will ensure positive returns. Customers, from small enterprises to large Internet giants, are demanding 100% uptime and always-on connectivity, and colocation providers need to meet these expectations. The growing adoption of prefabricated data centers allows just that. With the undisputed benefits of prefab modules and building components (like speed or quality), colocation providers can manage their business today and deploy faster in the future.
Chris Crosby, CEO of Compass Datacenters, is well known for his expertise in the data center industry. From its founding in 2012, Compass' data center solutions have used prefabricated components like exterior walls and power centers to deliver brandable, dedicated facilities for colocation providers. Prefabrication is the central element of the company's "Kit of Parts" methodology that delivers customizable data center solutions from the core to the edge.
By attending this webinar, colocation providers will:
• Understand the flexibility and value delivered via the use of prefabricated construction
• Hear the common misperceptions regarding prefabricated modules and data center components
• Learn how prefabricated solutions can provide more revenue-generation capability than competing alternatives
• Know what key things to consider when evaluating prefabricated data center design
Tags : data centers, colocation provider, schneider electric, modular data centers, prefabricated data center
     Schneider Electric
By: CA Technologies     Published Date: Jan 14, 2015
In the past, Enterprise IT managed the back office…the data center…the systems that kept the business running. Today, there is a mandate for IT to adopt a new role—one that speaks a common language with the business. This new IT mans the storefront, interacts directly with customers and thinks beyond uptime metrics to give equal consideration to cost, spend, quality and consumption.
Tags : enterprise it, metrics, cost, quality and consumption, front office
     CA Technologies
By: CA Technologies_Business_Automation     Published Date: Jun 29, 2018
Challenge
It is not uncommon for SAP system copies, including any post-editing, to take several days to complete. Meanwhile, testing, development and training activities come to a standstill, and the large number of manual tasks in the entire process ties up highly skilled SAP BASIS staff.
Opportunity
Enterprises are looking to automation as a way to accelerate SAP system copies and free up staff. However, this is only one part of the problem: what further complicates the system copy process is the need to safeguard sensitive data and manage huge data volumes while also ensuring that the data used in non-production systems adequately reflects the data in production systems, so the quality of development, testing and training activities is not compromised.
Benefits
This white paper explains how a considerable portion of the SAP system copy process can be automated using the CA Automic Automated System Copy for SAP solution and SNP T-Bone, helping enterprises become more agile.
Tags : 
     CA Technologies_Business_Automation
By: IBM     Published Date: Jul 09, 2018
As the information age matures, data has become the most powerful resource enterprises have at their disposal. Businesses have embraced digital transformation, often staking their reputations on insights extracted from collected data. While decision-makers home in on hot topics like AI and the potential of data to drive businesses into the future, many underestimate the pitfalls of poor data governance. If business decision-makers can't trust the data within their organization, how can stakeholders and customers know they are in good hands? Information that is not correctly distributed, or abandoned within an IT silo, can prove harmful to the integrity of business decisions. In search of instant analytical insights, businesses often prioritize data access and analysis over governance and quality. However, without ensuring the data is trustworthy, complete and consistent, leaders cannot be confident their decisions are rooted in facts and reality.
Tags : 
     IBM
By: Group M_IBM Q418     Published Date: Oct 15, 2018
The enterprise data warehouse (EDW) has been the cornerstone of enterprise data strategies for over 20 years. EDW systems have traditionally been built on relatively costly hardware infrastructures. But ever-growing data volume and increasingly complex processing have raised the cost of EDW software and hardware licenses while impacting the performance needed for analytic insights. Organizations can now use EDW offloading and optimization techniques to reduce the costs of storing, processing and analyzing large volumes of data. Getting data governance right is critical to your business success. That means ensuring your data is clean, of excellent quality, and of verifiable lineage. Such governance principles can be applied in Hadoop-like environments. Hadoop is designed to store, process and analyze large volumes of data at significantly lower cost than a data warehouse. But to get the return on investment, you must infuse data governance processes as part of offloading.
Tags : 
     Group M_IBM Q418
By: CyrusOne     Published Date: Jul 05, 2016
In June 2016, CyrusOne completed the Sterling II data center at its Northern Virginia campus. A custom facility featuring 220,000 square feet of space and 30 MW of power, Sterling II was built from the ground up and completed in only six months, shattering all previous data center construction records. The Sterling II facility represents a new standard in the building of enterprise-level data centers, and confirms that CyrusOne can apply the streamlined engineering elements and methods used for Sterling II to build customized, quality data centers anywhere in the continental United States, with a similarly rapid time to completion.
Tags : cyrusone, data, technology, productivity, engineering
     CyrusOne
By: MarkLogic     Published Date: Mar 29, 2018
Executives, managers, and users will not trust data unless they understand where it came from. Enterprise metadata is the “data about data” that makes this trust possible. Unfortunately, many healthcare and life sciences organizations struggle to collect and manage metadata with their existing relational and column-family technology tools. MarkLogic’s multi-model architecture makes it easier to manage metadata, and build trust in the quality and lineage of enterprise data. Healthcare and life sciences companies are using MarkLogic’s smart metadata management capabilities to improve search and discovery, simplify regulatory compliance, deliver more accurate and reliable quality reports, and provide better customer service. This paper explains the essence and advantages of the MarkLogic approach.
Tags : enterprise, metadata, management, organizations, technology, tools, mark logic
     MarkLogic
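The "data about data" idea in the abstract above can be made concrete with a minimal sketch. This is a generic illustration of lineage-carrying records, not MarkLogic's actual API or data model: each record bundles its value with a source label and a transformation trail, so "where did this come from?" can be answered later.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Record:
    """A data value bundled with lineage metadata ("data about data")."""
    value: dict
    source: str                                   # where the value originated
    lineage: list = field(default_factory=list)   # ordered transformation trail

    def transform(self, step_name, fn):
        """Apply a transformation and record it in the lineage trail."""
        self.value = fn(self.value)
        self.lineage.append({
            "step": step_name,
            "at": datetime.now(timezone.utc).isoformat(),
        })
        return self

# Usage: normalize a (hypothetical) patient record, then inspect its provenance.
rec = Record(value={"name": " Ada Lovelace "}, source="ehr_feed_v2")
rec.transform("trim_whitespace", lambda v: {"name": v["name"].strip()})
print(rec.value["name"])                          # Ada Lovelace
print(rec.source, [s["step"] for s in rec.lineage])
```

The names `Record`, `ehr_feed_v2` and `trim_whitespace` are assumptions for illustration only; the point is that trust in data quality reports rests on keeping this kind of trail alongside the data itself.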
By: Epicor Software Corporation     Published Date: Apr 12, 2017
An Enterprise Resource Planning (ERP) system is a series of software applications or modules that collects data from your sales, purchasing, finance, inventory, supply chain, manufacturing and quality functions into a common database so that your company can share the information, coordinate activities and collaborate. If you’re looking for your first ERP system or looking to upgrade from an existing system, the evaluation, selection and implementation process is a long-term strategic decision for your organization.
Tags : data management, data system, business development, software integration, resource planning, enterprise management, data collection
     Epicor Software Corporation
By: Alteryx, Inc.     Published Date: Sep 06, 2017
From small organizations using spreadsheets and visual discovery tools to large enterprises trying to improve data quality and delivery, data preparation difficulties are a major concern. Download your complimentary copy of the full report so you can tackle your data preparation challenges.
Tags : 
     Alteryx, Inc.
By: SAP Inc.     Published Date: Jul 28, 2009
Data quality is an elusive subject that can defy measurement and yet be critical enough to derail any single IT project, strategic initiative, or even a company as a whole.
Tags : roi, data quality, sap, return-on-investment, crm, erp, enterprise resource management, customer relationship management, business intelligence, referential integrity, sql, data quality scoring, target marketing
     SAP Inc.
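Data quality may "defy measurement," but it can be approximated. The sketch below is a minimal, generic scoring function (not SAP's methodology) that rates a dataset between 0.0 and 1.0 on two common dimensions, completeness (required fields present and non-empty) and validity (fields pass a check); the field names and validators are illustrative assumptions.

```python
def quality_score(rows, required_fields, validators=None):
    """Score a dataset 0.0-1.0: each required field counts one completeness
    check, and each non-empty validated field counts one validity check."""
    validators = validators or {}
    checks = passed = 0
    for row in rows:
        for f in required_fields:          # completeness
            checks += 1
            if row.get(f) not in (None, ""):
                passed += 1
        for f, is_valid in validators.items():  # validity of present values
            if f in row and row[f] not in (None, ""):
                checks += 1
                passed += is_valid(row[f])
    return passed / checks if checks else 1.0

# Usage: 6 completeness checks (5 pass) + 2 validity checks (1 passes) = 6/8.
customers = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": ""},              # incomplete
    {"id": 3, "email": "not-an-email"},  # invalid
]
score = quality_score(customers, ["id", "email"],
                      {"email": lambda e: "@" in e})
print(round(score, 2))  # 0.75
```

Tracking a score like this over time is one simple way to turn a data quality initiative into a measurable ROI argument.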
By: Forcepoint     Published Date: May 14, 2019
2018 NSS Labs SD-WAN Group Test
In this report, NSS Labs simulated an enterprise network that has branches connected to a data center through two links: an MPLS line and a commercial broadband connection. They reviewed a select number of vendors, testing their throughput performance, video quality and VoIP quality as well as security effectiveness. NSS Labs verified that Forcepoint NGFW handled all of their use cases and offers all the operational capabilities they recommend as necessary for SD-WAN, as well as scoring 100% across all security tests and blocking all evasion techniques.
"Forcepoint is one of the few vendors to support all of the use cases and capabilities we tested as well as strong security in their SD-WAN solution. They should be on the short list for any organization that's looking to connect and protect their distributed enterprise." - Vikram Phatak, CEO, NSS Labs
Read the report and learn how Forcepoint delivers SD-WAN with enterprise scale and security.
Tags : 
     Forcepoint
By: SRC,LLC     Published Date: Jun 01, 2009
To mine raw data and extract crucial insights, business decision-makers need fast and comprehensive access to all the information stored across their enterprise, regardless of its format or location. Furthermore, that data must be organized, analyzed and visualized in ways that permit easy interpretation of market opportunities (growth, shifts and trends) and the business-process changes required to address them. Gaining a true perspective on an organization's customer base, market area or potential expansion can be a challenging task, because companies use so many relational databases, data warehouse technologies, mapping systems and ad hoc data repositories to gather and house information for a wide variety of specialized purposes.
Tags : src, enterprise, enterprise applications, convergence, compared, counted, combined, reorganized, analyzed, visualized, mapped, database, gis, geographic business intelligence, data independence, etl, csv, delimited text file, mdb (both for microsoft access, esri personal geodatabase
     SRC,LLC
By: Trillium Software     Published Date: May 19, 2011
This report provides recommendations for improving the quality of operational data, which in turn contributes to an organization's drive toward operational excellence.
Tags : trillium software, tdwi checklist report, philip russom, operational data quality, opdq, analytic data, the data warehousing institute, enterprise data quality
     Trillium Software
By: VMTurbo     Published Date: Jul 08, 2015
This paper examines the evolution of enterprise applications, the data centers in which these applications reside and the increasing expectations of businesses and end-users on the Quality of Service (QoS) their applications deliver.
Tags : application control, application performance, enterprise applications, cloud-based applications, apm, application performance monitoring, data center, infrastructure performance
     VMTurbo
By: SAP     Published Date: Jun 23, 2009
Learn the importance of data quality and the six key steps you can put into practice to realize tangible ROI on your data quality initiative.
Tags : roi, data quality, sap, return-on-investment, crm, erp, enterprise resource management, customer relationship management, business intelligence, referential integrity, sql, data quality scoring, target marketing
     SAP
By: SAP     Published Date: Feb 21, 2008
Many significant business initiatives and large IT projects depend upon a successful data migration. Your goal is to minimize risk as much as possible through effective planning and scoping. This paper will provide insight into what issues are unique to data migration projects and offer advice on how best to approach them.
Tags : sap, data architect, data migration, business objects, information management software, bloor, sap r/3, application, enterprise applications, data quality management, master data management, mdm, extraction, transformation load, etl
     SAP
By: IBM     Published Date: Oct 03, 2017
Many new regulations are spurring banks to rethink how data from across the enterprise flows into the aggregated risk and capital reports required by regulatory agencies. Data must be complete, correct and consistent to maintain confidence in risk reports, capital reports and analyses. At the same time, banks need ways to monetize, grant access to and generate insight from data. To keep pace with regulatory changes, many banks will need to reapportion their budgets to support the development of new systems and processes. Regulators continually indicate that banks must be able to provide, secure and deliver high-quality information that is consistent and mature.
Tags : data aggregation, risk reporting, bank regulation, enterprise, reapportion budgets
     IBM
By: Dun & Bradstreet     Published Date: Feb 21, 2017
As the volume of data coming into organizations – from both internal and external sources – continues to grow and makes its way across departmental systems in many different formats, there is a critical need to create a single, holistic view of the key data entities in common use across the enterprise. Master Data Management (MDM) aims to accomplish this goal. Not surprisingly, MDM has become a significant priority for global enterprises, with the market expected to triple from $9.4B to $26.8B by 2020 according to analysts. The reality, though, is that while seemingly everyone is investing heavily in the tools to manage data, few are putting a great enough emphasis on the data itself. And that's a problem. Poor data quality is said to be costing businesses $3.1 trillion annually – and that's in the US alone. The information being put into MDM tools must be mastered first and foremost.
Tags : managing data, data management insight, mdm, master data management
     Dun & Bradstreet
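"Mastering" data before it reaches an MDM tool can be illustrated with a minimal sketch of one common step: collapsing duplicate entity records by matching on a normalized key and keeping the first occurrence as the surviving "golden" record. This is a generic illustration under simplified assumptions (company names, a short suffix list), not Dun & Bradstreet's matching technology.

```python
import re

def normalize(name):
    """Canonicalize a company name for matching: lowercase, strip
    punctuation, and drop common legal suffixes (illustrative list)."""
    name = re.sub(r"[^\w\s]", "", name.lower())
    name = re.sub(r"\b(inc|llc|ltd|corp|co)\b", "", name)
    return " ".join(name.split())

def master_records(records):
    """Group records whose normalized names collide; the first record
    seen for each key survives as the golden record."""
    golden = {}
    for rec in records:
        golden.setdefault(normalize(rec["name"]), rec)
    return list(golden.values())

# Usage: "Acme Corp." and "ACME, Inc" normalize to the same key.
dupes = [
    {"name": "Acme Corp.", "city": "Austin"},
    {"name": "ACME, Inc", "city": "Austin"},
    {"name": "Bolt Ltd.", "city": "Boston"},
]
masters = master_records(dupes)
print(len(masters))  # 2
```

Real mastering adds fuzzy matching, survivorship rules and external reference data, but even this toy version shows why the data itself, not just the MDM tooling, determines the quality of the result.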