data lake

Published By: Informatica     Published Date: Apr 30, 2020
To get the greatest value from your Amazon Web Services (AWS) data lake, you need an intelligent data management strategy that replaces slow, tedious legacy manual processes with fast, simple automation powered by machine learning. Download the white paper Power Your AWS Data Lake with AI-Driven Data Management to learn best practices for a successful data lake project—and how businesses are applying them for real-world insight. Written for enterprise data architects, the white paper explains how to create a systematic, intelligent approach to data management that includes: • Better understanding of business goals and objectives • Agile development methodologies • A metadata approach to data management Learn how to manage your AWS data lake in a way that delivers radically new business insights faster and more efficiently than ever. Download the white paper today.
Tags : 
    
Informatica
Published By: Zaloni     Published Date: Apr 23, 2019
Although data and analytics are highlighted throughout the popular press as well as in trade publications, too many managers think the value of this data processing is limited to a few numerically intensive fields such as science and finance. In fact, big data and the insights that emerge from analyzing it will transform every industry, from “precision farming” to manufacturing and construction. Governments must also be alert to the value of data and analytics as the enabler for smart cities. Institutions that master available data will leap ahead of their less statistically adept competitors through many advantages: finding hidden opportunities for efficiency, using data to become more responsive to clients, and developing entirely new and unanticipated product lines. The average tenure of companies on the S&P 500 Index has fallen from 60-70 years to just 22 years. There are winners and losers in the changes that come with the evolution of both technology…
Tags : 
    
Zaloni
Published By: Zaloni     Published Date: Apr 24, 2019
Why your data catalog won’t deliver significant ROI: According to Gartner, organizations that provide access to a curated catalog of internal and external data assets will derive twice as much business value from their analytics investments by 2020 as those that do not. That’s a ringing endorsement of data catalogs, and a growing number of enterprises seem to agree. In fact, the global data catalog market is expected to grow from US$210.0 million in 2017 to US$620.0 million by 2022, at a Compound Annual Growth Rate (CAGR) of 24.2%. Why such large and intensifying demand for data catalogs? The primary driver is that many organizations are working to modernize their data platforms with data lakes, cloud-based data warehouses, advanced analytics and various SaaS applications in order to grow profitable digital initiatives. To support these digital initiatives and other business imperatives, organizations need more reliable, faster access to their data. However, modernizing data platforms…
Tags : 
    
Zaloni
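To make the catalog concept in the Zaloni abstract above concrete, here is a minimal illustrative sketch, in Python, of what a catalog entry for a data lake asset might capture and how such entries could be searched. This is not Zaloni's product or API; every name below is hypothetical.

# Minimal, illustrative sketch of a data catalog entry and registry.
# This is NOT Zaloni's API; all names here are hypothetical.
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class CatalogEntry:
    name: str                   # logical dataset name, e.g. "sales.orders"
    location: str               # physical path or table URI
    owner: str                  # accountable data steward
    schema: Dict[str, str]      # column name -> type
    tags: List[str] = field(default_factory=list)   # business glossary terms


class DataCatalog:
    """In-memory registry; a real catalog would persist entries and track lineage."""

    def __init__(self) -> None:
        self._entries: Dict[str, CatalogEntry] = {}

    def register(self, entry: CatalogEntry) -> None:
        self._entries[entry.name] = entry

    def search(self, tag: str) -> List[CatalogEntry]:
        return [e for e in self._entries.values() if tag in e.tags]


catalog = DataCatalog()
catalog.register(CatalogEntry(
    name="sales.orders",
    location="s3://lake/raw/sales/orders/",
    owner="sales-data-team",
    schema={"order_id": "string", "amount": "decimal", "order_ts": "timestamp"},
    tags=["sales", "pii-free"],
))
print([e.name for e in catalog.search("sales")])

The point of the sketch is that a catalog adds curated metadata (owner, schema, business tags) on top of the physical lake location, which is what makes the data findable and trustworthy for analytics users.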
Published By: Oracle     Published Date: Jan 16, 2018
Download this webinar to gain insight into the data lake. Learn about definitions and drivers, barriers to data lake success, and cloud object storage.
Tags : 
    
Oracle
Published By: IBM APAC     Published Date: Jul 09, 2017
Organizations today collect a tremendous amount of data and are bolstering their analytics capabilities to generate new, data-driven insights from this expanding resource. To make the most of growing data volumes, they need to provide rapid access to data across the enterprise. At the same time, they need efficient and workable ways to store and manage data over the long term. A governed data lake approach offers an opportunity to manage these challenges. Download this white paper to find out more.
Tags : 
data lake, big data, analytics
    
IBM APAC
Published By: IBM APAC     Published Date: Jul 09, 2017
This Knowledge Brief investigates the impact of a data lake maintained in a cloud or hybrid infrastructure.
Tags : 
data lake, cloud, hybrid
    
IBM APAC
Published By: TIBCO Software     Published Date: Mar 16, 2020
Since its emergence 15 years ago, master data management (MDM) has come up against big data, data lakes, regulatory challenges and, of course, digital transformation. Where does MDM fit in the overall picture today? What are its main use cases? Two experts, Christophe Barriolade from TIBCO EBX software, and Pascal Anthoine from international consulting and engineering group Micropole, share their views.
Tags : 
    
TIBCO Software
Published By: TIBCO Software     Published Date: Mar 17, 2020
Data science and machine learning are key technologies for enterprises that want to take advantage of the massive insights buried in their data marts, data warehouses, Apache Hadoop lakes, and spreadsheets. But, despite the millions of dollars invested in analytics technologies, the majority of companies still struggle to establish an efficient and programmatic way to do analytics at scale. According to Gartner Inc., over 60% of models developed with the intention of operationalizing them were never actually operationalized. Why are these investments failing to meet expectations? In this paper, we delve into today's most common data science and ML myths and offer potential solutions.
Tags : 
    
TIBCO Software
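The TIBCO abstract above notes that most models built for production use are never actually operationalized. As a hedged illustration only (not TIBCO's tooling; the file and function names below are hypothetical), this sketch shows the smallest version of that step: persisting a trained scikit-learn model and exposing a single scoring function that a production batch job could call.

# Hedged illustration of "operationalizing" a model: persist the trained artifact
# and expose one scoring function that production jobs can call repeatedly.
# Not TIBCO's tooling; file names and function names are hypothetical.
import joblib
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# --- experimentation phase: train and persist ---
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X, y)
joblib.dump(model, "churn_model.joblib")

# --- operational phase: load the artifact and score incoming batches ---
def score_batch(feature_rows):
    """Return class probabilities for a batch of feature rows."""
    deployed = joblib.load("churn_model.joblib")
    return deployed.predict_proba(feature_rows)[:, 1]

print(score_batch(X[:5]))

The design point is simply that the scoring path is separated from the training path, so the model can be refreshed and redeployed without touching the jobs that consume its predictions.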
Published By: Teradata     Published Date: Jan 30, 2015
Our goal is to share best practices so you can understand how designing a data lake strategy can enhance and amplify existing investments and create new forms of business value.
Tags : 
data lake, data warehouse, enterprise data, migration, enterprise use, data lake strategy, business value, data management
    
Teradata
Published By: RedPoint Global     Published Date: May 11, 2017
While they’re intensifying, business-data challenges aren’t new. Companies have tried several strategies in their attempt to harness the power of data in ways that are feasible and effective. The best data analyses and game-changing insights will never happen without the right data in the right place at the right time. That’s why data preparation is a non-negotiable must for any successful customer-engagement initiative. The fact is, you can’t simply load data from multiple sources and expect it to make sense. This white paper examines the shortcomings of traditional approaches such as data warehouses/data lakes and explores the power of connected data.
Tags : 
customer engagement, marketing data, marketing data analytics, customer data platform
    
RedPoint Global
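The RedPoint abstract above argues that you cannot simply load data from multiple sources and expect it to make sense. The sketch below is a generic pandas illustration, not RedPoint's product: it normalizes join keys and deduplicates records before combining two hypothetical customer sources, which is the kind of preparation the paper refers to.

# Generic data-preparation sketch (pandas), not RedPoint's product:
# normalize join keys, deduplicate, then combine two customer sources.
import pandas as pd

crm = pd.DataFrame({
    "email": ["Ann@Example.com", "bob@example.com ", "bob@example.com"],
    "name":  ["Ann Lee", "Bob Chu", "Bob Chu"],
})
web = pd.DataFrame({
    "email": ["ann@example.com", "carol@example.com"],
    "last_visit": ["2020-04-01", "2020-04-03"],
})

def clean_keys(df: pd.DataFrame) -> pd.DataFrame:
    df = df.copy()
    df["email"] = df["email"].str.strip().str.lower()   # normalize the join key
    return df.drop_duplicates(subset="email")            # one row per customer

customers = clean_keys(crm).merge(clean_keys(web), on="email", how="outer")
print(customers)

Without the key normalization step, "Ann@Example.com" and "ann@example.com" would never match, which illustrates why raw loads from multiple sources rarely line up on their own.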
Published By: Attunity     Published Date: Nov 15, 2018
Change data capture (CDC) technology can modernize your data and analytics environment with scalable, efficient and real-time data replication that does not impact production systems. To realize these benefits, enterprises need to understand how this critical technology works, why it’s needed, and what their Fortune 500 peers have learned from their CDC implementations. This book serves as a practical guide for enterprise architects, data managers and CIOs as they enable modern data lake, streaming and cloud architectures with CDC. Read this book to understand: • The rise of data lake, streaming and cloud platforms • How CDC works and enables these architectures • Case studies of leading-edge enterprises • Planning and implementation approaches
Tags : 
optimize customer service
    
Attunity
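The Attunity abstract above describes CDC as real-time replication that does not impact production systems. The sketch below is a simplified, hedged illustration only: commercial CDC tools such as Attunity's read the database transaction log, whereas this example uses watermark-based polling against an in-memory SQLite source, with hypothetical table and column names, to convey the basic capture-and-apply contract.

# Simplified, illustrative change-data-capture loop (watermark-based polling).
# Real CDC products read the database transaction log instead of polling;
# table and column names here are hypothetical.
import sqlite3
import time

source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")
for db in (source, target):
    db.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL, updated_at REAL)")

def capture_changes(last_seen):
    """Return source rows modified since the last watermark."""
    return source.execute(
        "SELECT id, amount, updated_at FROM orders WHERE updated_at > ?", (last_seen,)
    ).fetchall()

def apply_changes(rows):
    """Upsert the captured rows into the replica without re-querying the source."""
    target.executemany("INSERT OR REPLACE INTO orders VALUES (?, ?, ?)", rows)
    target.commit()

watermark = 0.0
source.execute("INSERT INTO orders VALUES (1, 99.50, ?)", (time.time(),))
source.commit()
changes = capture_changes(watermark)
apply_changes(changes)
watermark = max((row[2] for row in changes), default=watermark)
print(target.execute("SELECT * FROM orders").fetchall())

Log-based CDC improves on this pattern by avoiding even the polling queries against the source, which is how it keeps the impact on production systems negligible.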
Published By: Attunity     Published Date: Jan 14, 2019
This whitepaper explores how to automate your data lake pipeline to address common challenges including how to prevent data lakes from devolving into useless data swamps and how to deliver analytics-ready data via automation. Read Increase Data Lake ROI with Streaming Data Pipelines to learn about: • Common data lake origins and challenges, including integrating diverse data from multiple source platforms and lakes both on premises and in the cloud. • Delivering real-time integration, with change data capture (CDC) technology that integrates live transactions with the data lake. • Rethinking the data lake with a multi-stage methodology, continuous data ingestion and merging processes that assemble a historical data store. • Leveraging a scalable and autonomous streaming data pipeline to deliver analytics-ready data sets for better business insights. Read this Attunity whitepaper now to get ahead on your data lake strategy in 2019.
Tags : 
data lake, data pipeline, change data capture, data swamp, hybrid data integration, data ingestion, streaming data, real-time data
    
Attunity
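The abstract above outlines a multi-stage methodology in which continuous ingestion feeds a merge process that assembles a historical data store. The sketch below is a generic, hypothetical illustration (not Attunity's pipeline) of one such merge step: folding a micro-batch of change records into a keyed historical store while keeping the latest version of each record.

# Illustrative merge step for a multi-stage data lake pipeline (not Attunity's product):
# fold a micro-batch of ingested change records into a keyed historical store,
# keeping only the most recent version of each record.
from typing import Dict, Iterable

def merge_batch(historical: Dict[str, dict], batch: Iterable[dict]) -> Dict[str, dict]:
    """Upsert each change record into the historical store by primary key."""
    for record in batch:
        key = record["id"]
        current = historical.get(key)
        # overwrite only if the incoming record is at least as new
        if current is None or record["version"] >= current["version"]:
            historical[key] = record
    return historical

# the raw zone delivers micro-batches of change records (simulated here)
historical_store: Dict[str, dict] = {}
micro_batches = [
    [{"id": "c1", "version": 1, "status": "new"}],
    [{"id": "c1", "version": 2, "status": "active"},
     {"id": "c2", "version": 1, "status": "new"}],
]
for batch in micro_batches:
    historical_store = merge_batch(historical_store, batch)

print(historical_store)   # c1 reflects version 2; c2 reflects version 1

Separating raw ingestion from this merge stage is what keeps the lake analytics-ready: consumers query the merged historical store rather than wading through every raw change record.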
Published By: Attunity     Published Date: Feb 12, 2019
This technical whitepaper by Radiant Advisors covers key findings from their work with a network of Fortune 1000 companies and clients from various industries. It assesses the major trends and tips to gain access to and optimize data streaming for more valuable insights. Read this report to learn from real-world successes in modern data integration, and better understand how to maximize the use of streaming data. You will also learn about the value of populating a cloud data lake with streaming operational data, leveraging database replication, automation and other key modern data integration techniques. Download this whitepaper today to learn about the latest approaches to modern data integration and streaming data technologies.
Tags : 
streaming data, cloud data lakes, cloud data lake, data lake, cloud, data lakes, streaming data, change data capture
    
Attunity
Published By: Attunity     Published Date: Feb 12, 2019
Read this checklist report, based on a survey by the Eckerson Group and the Business Application Research Center (BARC), on how the number of companies using the cloud for data warehousing and BI has increased by nearly 50%. BI teams must address multiple issues including data delivery, security, portability and more before moving to the cloud for its infinite scalability and elasticity. Read this report to understand all seven considerations – what, how and why they impact the decision to move to the cloud.
Tags : 
cloud, business intelligence, analytics, cloud data, data lake, data warehouse automation tools, dwa, data warehouse
    
Attunity
Published By: Larsen & Toubro Infotech(LTI)     Published Date: Jan 31, 2019
LTI built a transaction monitoring cognitive data lake to facilitate AML transaction monitoring across post-trade transactions for a leading global bank, resulting in a 30% reduction in human errors and a 50% improvement in turnaround time (TAT). Download Complete Case Study.
Tags : 
    
Larsen & Toubro Infotech(LTI)
Published By: Larsen & Toubro Infotech(LTI)     Published Date: Jan 31, 2019
LTI helped a leading global bank digitize its traditional product ecosystem for AML transaction monitoring. With the creation of a data lake and efficient learning models, the bank successfully reduced false positives and improved customer risk assessment. Download Complete Case Study.
Tags : 
    
Larsen & Toubro Infotech(LTI)
Published By: Paxata     Published Date: Nov 14, 2018
This eBook provides a step-by-step best practices guide for creating successful data lakes.
Tags : 
data lakes, governance, monetization
    
Paxata
Published By: Wavestone     Published Date: Apr 03, 2020
"Coronavirus has plunged the world into recession, so this might be a good time to take stock and streamline your operations. Application portfolio optimization (APO) is one of the most direct and effective ways to reduce costs, complexity, and risk, enabling you to focus on the areas that drive more value to the overall business. Do it right and you can achieve significant savings, drive down future costs exponentially and improve staff skills, capabilities, and productivity. This white paper serves as a comprehensive guide to optimizing your portfolio with Wavestone’s proven framework and sourcing strategies. You’ll learn: • Three sourcing strategies that will dramatically reduce IT spend • How to use technology enablers such as DevOps, APIs, data lakes, and more • What an optimized portfolio looks like—with actual case studies of industry leaders"
Tags : 
    
Wavestone
Published By: IBM APAC     Published Date: May 14, 2019
If anything is certain about the future, it’s that there will be more complexity, more data to manage and greater pressure to deliver instantly. The hardware you buy should meet today’s expectations and prepare you for whatever comes next. Power Systems are built for the most demanding, data-intensive computing on earth. Our cloud-ready servers help you unleash insight from your data pipeline — from managing mission-critical data, to managing your operational data stores and data lakes, to delivering the best server for cognitive computing. With industry-leading reliability and security, our infrastructure is designed to crush the most data-intensive workloads imaginable, while keeping your business protected. - Simplified Multicloud - Built-in end-to-end security - Proven Reliability - Industry-leading value and performance
Tags : 
    
IBM APAC
Published By: Google     Published Date: Feb 27, 2020
In this 25-criterion evaluation of data management for analytics (DMA) providers, we identified the 14 most significant ones — Amazon Web Services (AWS), Cloudera, Google, Hewlett Packard Enterprise (HPE), IBM, MemSQL, Micro Focus, Microsoft, MongoDB, Oracle, Pivotal Software, SAP, Snowflake, and Teradata — and researched, analyzed, and scored them. Forrester's research uncovered a market in which Google, SAP, Oracle, IBM, and Teradata are Leaders.
Tags : 
    
Google
Published By: Teradata     Published Date: May 02, 2017
Kylo overcomes common challenges of capturing and processing big data. It lets businesses easily configure and monitor data flows in and through the data lake so users have constant access to high-quality data. It also enhances data profiling while offering self-service and data wrangling capabilities.
Tags : 
cost reduction, data efficiency, data security, data integration, financial services, data discovery, data accessibility, data comprehension
    
Teradata
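The Kylo abstract above mentions data profiling as part of keeping lake data high quality. The sketch below is a generic pandas profiling pass, not Kylo's interface, showing the kind of per-column statistics (null rate, distinct count) such a profile typically surfaces.

# Generic data-profiling sketch (pandas); Kylo's own profiling interface is not shown here.
import pandas as pd

def profile(df: pd.DataFrame) -> pd.DataFrame:
    """Return basic per-column quality statistics: type, null rate and distinct count."""
    return pd.DataFrame({
        "dtype": df.dtypes.astype(str),
        "null_rate": df.isna().mean().round(3),
        "distinct": df.nunique(),
    })

feed = pd.DataFrame({
    "customer_id": [1, 2, 2, None],
    "country": ["US", "US", "DE", "DE"],
})
print(profile(feed))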
Published By: IBM     Published Date: Jan 27, 2017
Companies today increasingly look for ways to house multiple disparate forms of data under the same roof, maintaining original integrity and attributes. Enter the Hadoop-based data lake. While a traditional on-premise data lake might address the immediate needs for scalability and flexibility, research suggests that it may fall short in supporting key aspects of the user experience. This Knowledge Brief investigates the impact of a data lake maintained in a cloud or hybrid infrastructure.
Tags : 
    
IBM
Published By: IBM     Published Date: Apr 18, 2017
The data integration tool market was worth approximately $2.8 billion in constant currency at the end of 2015, an increase of 10.5% from the end of 2014. The discipline of data integration comprises the practices, architectural techniques and tools that ingest, transform, combine and provision data across the spectrum of information types in the enterprise and beyond — to meet the data consumption requirements of all applications and business processes. The biggest changes in the market from 2015 are the increased demand for data virtualization, the growing use of data integration tools to combine "data lakes" with existing integration solutions, and the overall expectation that data integration will become cloud- and on-premises-agnostic.
Tags : 
data integration, data security, data optimization, data virtualization, database security, data analytics, data innovation
    
IBM
Published By: IBM     Published Date: Jul 06, 2017
Companies today increasingly look for ways to house multiple disparate forms of data under the same roof, maintaining original integrity and attributes. Enter the Hadoop-based data lake. While a traditional on-premise data lake might address the immediate needs for scalability and flexibility, research suggests that it may fall short in supporting key aspects of the user experience. This Knowledge Brief investigates the impact of a data lake maintained in a cloud or hybrid infrastructure.
Tags : 
data lake, user experience, knowledge brief, cloud infrastructure
    
IBM
Published By: Group M_IBM Q1'20     Published Date: Jan 14, 2020
Ten years ago, the journey began to find a flexible, versatile approach to build a central data store where all enterprise data could reside. The solution was the data lake — a general-purpose data storage environment that would store practically any type of data. It would also allow business analysts and data scientists to apply the most appropriate analytics engines and tools to each data set, in its original location. As these data lakes began to grow, a set of problems became apparent. While the technology was physically capable of scaling to capture, store and analyze vast and varied collections of structured and unstructured data, too little attention was paid to the practicalities of how to embed these capabilities into business workflows. Today, many organizations have recognized these failures, changed leadership teams for the data lake implementation and are launching a second, third or even fourth attempt to implement a data lake successfully – this time leading with data…
Tags : 
data ops, data lake, data storage, data governance, data integration
    
Group M_IBM Q1'20