data engineers

Results 1 - 16 of 16
By: Cisco     Published Date: Jan 05, 2015
Virtualization has transformed the data center over the past decade. IT departments use virtualization to consolidate multiple server workloads onto a smaller number of more powerful servers. They use virtualization to scale existing applications by adding more virtual machines to support them, and they deploy new applications without having to purchase additional servers to do so. They achieve greater resource utilization by balancing workloads across a large pool of servers in real time—and they respond more quickly to changes in workload or server availability by moving virtual machines between physical servers. Virtualized environments support private clouds on which application engineers can now provision their own virtual servers and networks in environments that expand and contract on demand.
Tags : datacenter, data management, collaborations, business capabilities
     Cisco
By: Dell Storage     Published Date: Apr 17, 2012
A scale-out storage architecture helps organizations deal with demands for growing data capacity and access. Dell engineers put Dell EqualLogic scale-out storage through its paces to demonstrate its scalability in both file and block I/O scenarios.
Tags : 
     Dell Storage
By: EMC Corporation     Published Date: Jul 07, 2013
3TIER helps organizations understand and manage the risks associated with renewable energy projects. A pioneer in wind and solar generation risk analysis, 3TIER uses science and technology to frame the risk of weather-driven variability, anywhere on Earth. 3TIER's unique expertise lies in combining the latest weather data with historical weather patterns, and in using the expertise of 3TIER's meteorologists, engineers and data scientists to create a detailed independent assessment of the future renewable energy potential of any location.
Tags : renewable energy, customer profile, emc, risk management, best practices, storage, technology, security
     EMC Corporation
By: Drillinginfo     Published Date: Nov 18, 2015
The Bakken is a very large hydrocarbon-bearing subsurface rock formation underlying much of the Williston Basin in North Dakota, Montana, Saskatchewan and Manitoba. The Bakken has been the scene of many advancements in drilling technology, including horizontal drilling, pad drilling and downspacing, to name a few. This first edition of the DI Expert eBook by Drillinginfo, the premier provider of data and insights for oil and gas exploration decisions, is a collection of articles posted by our staff of engineers, analysts and geologists about the Bakken over the past year.
Tags : geologists, downspacing, innovations, oil, gas exploration, drilling
     Drillinginfo
By: NetApp     Published Date: Feb 19, 2015
NetApp Flash Pool is a storage cache option within the NetApp Virtual Storage Tier product family, available for NetApp FAS storage systems. A Flash Pool configures solid state drives (SSDs) and hard disk drives (HDDs) into a single storage pool, known as an “aggregate” in NetApp parlance, with the SSDs providing a fast response time cache for volumes that are provisioned on the Flash Pool aggregate. In this lab evaluation, NetApp commissioned Demartek to evaluate the effectiveness of Flash Pool with different types and numbers of hard disk drives using an online transaction processing (OLTP) database workload, and to evaluate the performance of Flash Pool in a clustered Data ONTAP environment during a cluster storage node failover scenario. In the report, you’ll discover how Demartek test engineers documented a 283% gain in IOPS and a 66x reduction in latency after incorporating NetApp Flash Pool technology.
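The intuition behind an SSD-fronted HDD pool can be sketched with the standard effective-access-time formula: average latency is the hit-ratio-weighted blend of cache and backing-store latency. The hit ratio and per-device latencies below are illustrative round numbers, not Demartek's measurements.

```python
# Illustrative sketch: average latency of a hybrid pool where an SSD
# cache fronts HDDs. Figures are assumed, not from the report.

def effective_latency_ms(hit_ratio, ssd_ms, hdd_ms):
    """Weighted average latency of a cache-fronted storage pool."""
    return hit_ratio * ssd_ms + (1 - hit_ratio) * hdd_ms

# Example: 90% of hot OLTP reads served from the SSD cache
hybrid = effective_latency_ms(0.90, 0.2, 8.0)   # SSD ~0.2 ms, HDD ~8 ms
hdd_only = effective_latency_ms(0.0, 0.2, 8.0)
print(f"HDD only: {hdd_only:.2f} ms, hybrid: {hybrid:.2f} ms")
```

Even a modest hit ratio moves average latency most of the way toward SSD speed, which is why a small cache layer can produce outsized IOPS gains.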
Tags : 
     NetApp
By: NetApp     Published Date: Sep 22, 2014
NetApp Flash Pool is a storage cache option within the NetApp Virtual Storage Tier product family, available for NetApp FAS storage systems. A Flash Pool configures solid state drives (SSDs) and hard disk drives (HDDs) into a single storage pool, known as an “aggregate” in NetApp parlance, with the SSDs providing a fast response time cache for volumes that are provisioned on the Flash Pool aggregate. In this lab evaluation, NetApp commissioned Demartek to evaluate the effectiveness of Flash Pool with different types and numbers of hard disk drives using an online transaction processing (OLTP) database workload, and to evaluate the performance of Flash Pool in a clustered Data ONTAP environment during a cluster storage node failover scenario. In the report, you’ll discover how Demartek test engineers documented a 283% gain in IOPS and a 66x reduction in latency after incorporating NetApp Flash Pool technology.
Tags : flash pool, fas storage systems, ssd, online transaction processing, cluster storage
     NetApp
By: IBM     Published Date: Jan 18, 2017
In the domain of data science, solving problems and answering questions through data analysis is standard practice. Data scientists experiment continuously by constructing models to predict outcomes or discover underlying patterns, with the goal of gaining new insights. But data scientists can only go so far without support.
Tags : ibm, analytics, aps data, open data science, data science, data engineers
     IBM
By: Dassault Systemes SolidWorks Corp.     Published Date: Mar 27, 2018
When it comes to the Internet of Things (IoT), the evolution of connected devices and data can often make it difficult for teams, including designers, engineers and communication specialists, to work together efficiently. The SOLIDWORKS connected devices ecosystem, however, provides development capabilities that bring teams together to manage complexity and to synchronize and streamline product development. This webinar further explains smart, connected devices and how SOLIDWORKS can leverage these capabilities to help drive business change.
Tags : business, devices, solidworks, standard, iot, ecosystem, development
     Dassault Systemes SolidWorks Corp.
By: BubblewrApp     Published Date: Jan 20, 2015
The Company (name withheld) provides data center management and monitoring services to a number of enterprises across the United States. The Company maintains multiple network operations centers (NOCs) across the country where engineers monitor customer networks and application uptimes around the clock. The Company evaluated BubblewrApp’s Secure Access Service and was able to enable access to systems within customer data centers in 15 minutes. In addition, the Company was able to:
a. Do away with site-to-site VPNs, ending its reliance on jump hosts in the NOC
b. Build out monitoring systems in the NOC without worrying about possible IP subnet conflicts
c. Enable NOC engineers to access allowed systems in customer networks from any device
Tags : systems management, customer data centers, network operation centers, secure access service
     BubblewrApp
By: Mentor Graphics     Published Date: Mar 10, 2010
This paper provides CIMdata's perspective on Computational Fluid Dynamics (CFD) analysis; the motivations for its use, its value and future, and the importance for making CFD available to all engineers earlier in the product design/development lifecycle.
Tags : mentor graphics, cfd, mechanical design engineer, cimdata, computational fluid dynamics, product design, development lifecycle, cad
     Mentor Graphics
By: Amazon Web Services, Inc     Published Date: Sep 24, 2013
Amazon Web Services (AWS) provides a secure and dependable environment for deploying Microsoft Exchange Server 2010. Customers can use their existing Windows Server application licenses, such as MS Exchange or SharePoint, on AWS without paying additional licensing fees. Take advantage of the pay-as-you-go pricing, scalability, and data integrity of the AWS Cloud to run your Exchange Server workloads today. Download the Planning and Implementation Guide to learn more. This guide discusses planning topics, architectural considerations, and configuration steps to run a high-availability, site-resilient Exchange architecture. This guide is designed for Microsoft Exchange Server administrators, architects, and engineers. In the guide, we also provide a sample AWS CloudFormation template that is designed to help you easily provision the necessary AWS resources and update them in an orderly and predictable fashion. Learn more today!
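A CloudFormation template like the one the guide mentions is typically launched through the AWS console, CLI, or SDK. The sketch below shows the SDK route with boto3; the stack name, template URL, and parameter names are placeholders, not values from the AWS guide.

```python
# Hedged sketch: launching a CloudFormation stack with boto3.
# StackName, TemplateURL, and parameter names are hypothetical.

def to_cfn_parameters(params):
    """Convert a plain dict into the Parameters list CloudFormation expects."""
    return [{"ParameterKey": k, "ParameterValue": v} for k, v in params.items()]

if __name__ == "__main__":
    import boto3  # requires configured AWS credentials

    cfn = boto3.client("cloudformation")
    cfn.create_stack(
        StackName="exchange-2010-lab",                        # placeholder
        TemplateURL="https://example.com/exchange.template",  # placeholder
        Parameters=to_cfn_parameters({"KeyPairName": "my-key"}),
        Capabilities=["CAPABILITY_IAM"],
    )
```

`create_stack` returns immediately; in practice you would wait on the `stack_create_complete` waiter before using the provisioned resources.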
Tags : aws, ms exchange, windows server, share point, pay as you go pricing, scalability, data integrity, cloud, server, technology, configuration, cloudformation, security
     Amazon Web Services, Inc
By: IBM     Published Date: Oct 10, 2013
Virtualization is one of the most in-demand IT projects of this decade because it enables enterprises to reduce capex and opex costs while increasing business efficiency and agility. However, implementing, supporting, and managing virtualization can often be difficult, especially as deployments increase in scale and complexity and impact more areas of the datacenter. These more complex infrastructures often require highly skilled engineers with in-depth business knowledge and systems management capabilities. Projects such as infrastructure optimization and automation initiatives for self-service provisioning are key because they enable enterprises to streamline business processes and utilize cloud strategies and mobility solutions. IDC shows how partnering with experts who can implement, optimize, support, and manage virtualized environments may be the right course of action. By utilizing partners, enterprises will be able to avoid some of the potential pitfalls.
Tags : virtual environment, business initiatives, maximize initiatives, datacenter, complex infrastructure, business efficiency, business agility, deployments, automation initiatives, cloud strategies, mobility solutions
     IBM
By: Cisco     Published Date: Apr 10, 2015
Walk past your data center, and you might hear a soft, plaintive call: “Feed me, feed me…” It is not your engineers demanding more pizza. It is your servers and applications. And the call is growing louder. New Cisco® 40-Gbps bidirectional (BiDi) optical technology lets you bring 40-Gbps speeds to the access layer using the same 10-Gbps cable plant you are using today. That is a huge cost savings, whether you are upgrading your current data center or building a new one. And it means you can start taking advantage of 40-Gbps performance for your business right now, without needing special budget approval and without having to wait a year to get the capacity you need.
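The practical effect of the 10 to 40 Gbps jump can be shown with simple line-rate arithmetic; the figures below are back-of-the-envelope numbers, not Cisco benchmark results.

```python
# Illustrative line-rate arithmetic for the 10 -> 40 Gbps access-layer jump.

def transfer_seconds(gigabytes, gbps):
    """Time to move `gigabytes` of data over a `gbps` link at line rate."""
    return gigabytes * 8 / gbps

one_tb = 1000  # GB
print(f"1 TB at 10 Gbps: {transfer_seconds(one_tb, 10):.0f} s")
print(f"1 TB at 40 Gbps: {transfer_seconds(one_tb, 40):.0f} s")
```

Real throughput will be lower than line rate once protocol overhead and storage limits are counted, but the 4x headroom is what matters for rack uplinks.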
Tags : cisco, transceiver modules, wireless, cost effective, high performance switch, optical technology
     Cisco
By: Cisco     Published Date: Sep 15, 2015
Walk past your data center, and you might hear a soft, plaintive call: “Feed me, feed me…” It is not your engineers demanding more pizza. It is your servers and applications. And the call is growing louder. Mobile and virtualized workloads, cloud applications, big data, heterogeneous devices: they are all growing in your business, demanding previously unimagined capacity and performance from your servers and data center fabric. And that demand is not slackening. Your employees, applications, and competitive advantage increasingly depend on it. Those servers and applications need to be fed. And if you have not started planning for 40 gigabits per second (Gbps) to the server rack, you will need to soon.
Tags : applications, systems integration, cloud, development
     Cisco
By: Cisco     Published Date: Jan 05, 2016
With a global monthly reach of more than 500 million mobile users and billions of requests, Tapjoy historically relied heavily on its IT footprint. They needed an environment that would allow them to accelerate the development and improve the performance of their big data algorithms, which help drive real-time decision-making that delivers the best content to their global audience. Read more to learn how Tapjoy engineers opted for a cloud-based model to run their big data platform instead of bare metal, prioritizing agility and the ability to scale over bare metal performance.
Tags : 
     Cisco
By: Cisco     Published Date: Jul 11, 2016
With a global monthly reach of more than 500 million mobile users and billions of requests, Tapjoy historically relied heavily on its IT footprint. They needed an environment that would allow them to accelerate the development and improve the performance of their big data algorithms, which help drive real-time decision-making that delivers the best content to their global audience. Tapjoy engineers opted for a cloud-based model to run their big data platform instead of bare metal, prioritizing agility and the ability to scale over bare metal performance. Initially they deployed at AWS, but as the platform grew and their AWS costs increased, Tapjoy began to look for ways to better manage their growing public-cloud spend while increasing efficiency. They needed to do this without compromising the cloud experience they were giving their developers.
Tags : 
     Cisco
To get your white papers featured in the insideBIGDATA White Paper Library, contact: Kevin@insideHPC.com