data warehouse

Results 1 - 25 of 219
Published By: TIBCO Software     Published Date: Mar 17, 2020
Data science and machine learning are key technologies for enterprises that want to take advantage of the massive insights buried in their data marts, data warehouses, Apache Hadoop data lakes, and spreadsheets. But despite the millions of dollars invested in analytics technologies, most companies still struggle to establish an efficient, programmatic way to do analytics at scale. According to Gartner Inc., over 60% of models developed with the intention of operationalizing them were never actually operationalized. Why are these investments failing to meet expectations? In this paper, we delve into today's most common data science and ML myths and offer potential solutions.
Published By: TIBCO Software     Published Date: Mar 17, 2020
Data virtualization is the part of a data architecture that accompanies analytic projects, providing data as a service, the logical data warehouse, and a 360-degree view, to maximize the potential of data. In this interview, François Rivard, managing partner at Astrakhan Consulting, and Sadaq Boutrif, consulting solutions director at TIBCO Software France, answer questions on data virtualization: How is data virtualization positioned in the stack of data solutions? How does it compare with other approaches, such as data wrangling or ETL? Gartner estimates that by 2020, 50% of companies will have deployed some form of data virtualization; what uses will drive this growth? What does data virtualization mean in terms of organization? What obstacles or risks can complicate the development of these projects? And how does TIBCO see its position in this data virtualization landscape today?
Published By: Attivio     Published Date: Aug 20, 2010
With the explosion of unstructured content, the data warehouse is under siege. In this paper, Dr. Barry Devlin discusses data and content as two ends of a continuum, and explores the depth of integration required for meaningful business value.
Tags : 
attivio, data warehouse, unified information, data, content, unstructured content, integration, clob, blob
Published By: Attivio     Published Date: Aug 20, 2010
Current methods for accessing complex, distributed information delay decisions and, even worse, provide incomplete insight. This paper details the impact of Unified Information Access (UIA) in improving the agility of information-driven business processes: by bridging information silos, UIA unites content and data in a single index that powers solutions and applications offering more complete insight.
Tags : 
attivio, data warehouse, unified information, data, content, unstructured content, integration, clob, blob
Published By: Group M_IBM Q2'20     Published Date: Mar 23, 2020
Enterprise Data Warehouses (EDWs) have existed for about 20 years. They serve as the foundations of insight-driven organizations, delivering timely analysis and reporting of structured data, handling large analytic workloads, and supporting the high levels of concurrency that these organizations demand (i.e., many users simultaneously accessing the EDW). But while EDWs have been a familiar presence in many organizations, as companies look to reduce their data center footprints, increase organizational agility, and incorporate as much data as possible into their analytic workflows, the architectural rigidity, complexity, and cost of a traditional EDW are becoming increasingly apparent. Because of their long-established presence, EDWs have been relegated to catchall status, with organizations using them for activities they weren't originally designed for. This unwieldy scenario is pushing the EDW to become more of a cost center than an insight enabler.
Published By: Zaloni     Published Date: Apr 24, 2019
Why your data catalog won't deliver significant ROI
According to Gartner, organizations that provide access to a curated catalog of internal and external data assets will derive twice as much business value from their analytics investments by 2020 as those that do not. That's a ringing endorsement of data catalogs, and a growing number of enterprises seem to agree. In fact, the global data catalog market is expected to grow from US$210.0 million in 2017 to US$620.0 million by 2022, a Compound Annual Growth Rate (CAGR) of 24.2%. Why such large and intensifying demand for data catalogs? The primary driver is that many organizations are working to modernize their data platforms with data lakes, cloud-based data warehouses, advanced analytics, and various SaaS applications in order to grow profitable digital initiatives. To support these digital initiatives and other business imperatives, organizations need more reliable, faster access to their data. However, modernizing data platforms…
Published By: SAP     Published Date: May 18, 2014
New data sources are fueling innovation while stretching the limitations of traditional data management strategies and structures. Data warehouses are giving way to purpose-built platforms more capable of meeting the real-time needs of more demanding end users and the opportunities presented by Big Data. Significant strategy shifts are under way to transform traditional data ecosystems by creating the unified view of the data terrain necessary to support Big Data and the real-time needs of innovative enterprises.
Tags : 
sap, big data, real time data, in memory technology, data warehousing, analytics, big data analytics, data management, business insights, architecture, business intelligence, big data tools
Published By: Oracle     Published Date: Nov 28, 2017
Today's leading-edge organizations differentiate themselves through analytics, furthering their competitive advantage by extracting value from all their data sources. Other companies are looking to become data-driven by modernizing their data management deployments. These strategies do come with challenges, such as managing large and growing volumes of data. Today's digital world is already creating data at an explosive rate, and the next wave is on the horizon, driven by the emergence of IoT data sources. The physical data warehouses of the past were great for collecting data from across the enterprise for analysis, but the storage and compute resources needed to support them cannot keep pace with this explosive growth. In addition, the manual, cumbersome tasks of patching, updating, and upgrading pose risks to data due to human error. To reduce risks, costs, complexity, and time to value, many organizations are taking their data warehouses to the cloud. Whether hosted locally…
Published By: AWS     Published Date: Sep 05, 2018
Big data alone does not guarantee better business decisions. Often that data needs to be moved and transformed so Insight Platforms can discern useful business intelligence. To deliver those results faster than traditional Extract, Transform, and Load (ETL) technologies, use Matillion ETL for Amazon Redshift. This cloud-native ETL/ELT offering, built specifically for Amazon Redshift, simplifies the process of loading and transforming data and can help reduce your development time. This white paper focuses on approaches that can help you maximize your investment in Amazon Redshift. Learn how the scalable, cloud-native architecture and fast, secure integrations can benefit your organization, and discover ways this cost-effective solution is designed with cloud computing in mind. In addition, we explore how Matillion ETL and Amazon Redshift make it possible for you to automate data transformation directly in the data warehouse to deliver analytics and business intelligence (BI).
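To make the ELT pattern described above concrete, here is a minimal Python sketch that loads raw data into the warehouse first and then transforms it in place with push-down SQL. It assumes a reachable Amazon Redshift cluster, the psycopg2 driver, and the hypothetical table, bucket, and IAM role names shown; Matillion ETL generates and orchestrates this kind of SQL for you, so this is only a hand-rolled illustration of the idea, not the product's workflow.

# ELT sketch: load first, then transform inside the warehouse (not before loading).
# Cluster endpoint, credentials, tables, bucket, and IAM role below are hypothetical.
import psycopg2

conn = psycopg2.connect(
    host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",  # hypothetical
    port=5439, dbname="analytics", user="etl_user", password="...",
)
cur = conn.cursor()

# 1) Load: COPY raw files straight into a staging table, no pre-processing.
cur.execute("""
    COPY staging_orders
    FROM 's3://example-bucket/orders/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftCopy'
    FORMAT AS CSV IGNOREHEADER 1;
""")

# 2) Transform: the aggregation runs inside Redshift, close to the data.
cur.execute("""
    INSERT INTO orders_daily (order_date, total_revenue, order_count)
    SELECT order_date, SUM(amount), COUNT(*)
    FROM staging_orders
    GROUP BY order_date;
""")

conn.commit()
cur.close()
conn.close()

The point is that the transformation step executes in the data warehouse itself rather than on a separate ETL server, which is what lets the work scale with the warehouse.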
Published By: AWS     Published Date: Sep 05, 2018
With Amazon Redshift and Matillion ETL for Amazon Redshift, AbeBooks has been able to upgrade to a comprehensive data warehouse. In this case study, we share AbeBooks' data warehouse success story.
Published By: Oracle CX     Published Date: Oct 20, 2017
With the growing size and importance of information stored in today's databases, accessing and using the right information at the right time has become increasingly critical. Real-time access and analysis of operational data is key to making faster and better business decisions, providing enterprises with unique competitive advantages. Running analytics on operational data has been difficult because operational data is stored in row format, which is best for online transaction processing (OLTP) databases, while column format is much better for analytics processing. As a result, companies normally maintain both an operational database with data in row format and a separate data warehouse with data in column format, which leads to reliance on "stale data" for business decisions. With Oracle's Database In-Memory and Oracle servers based on the SPARC S7 and SPARC M7 processors, companies can now store data in memory in both row and column formats and run analytics on their operational data.
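As a rough illustration of why the two layouts suit different workloads, the following Python sketch holds the same toy table in a row layout and a column layout. The table, names, and values are invented, and this is a conceptual picture only, not how Oracle Database In-Memory is implemented.

# Conceptual sketch: one "orders" table in row layout (good for OLTP) and in
# column layout (good for analytics). Data is invented for illustration.

# Row format: each element is one complete record.
rows = [
    {"order_id": 1, "customer": "alice", "amount": 40.0},
    {"order_id": 2, "customer": "bob",   "amount": 15.5},
    {"order_id": 3, "customer": "carol", "amount": 99.9},
]

# Column format: each attribute is stored contiguously.
columns = {
    "order_id": [1, 2, 3],
    "customer": ["alice", "bob", "carol"],
    "amount":   [40.0, 15.5, 99.9],
}

# OLTP-style access: touch one record in its entirety -- natural in row format.
order = rows[1]

# Analytic access: aggregate one attribute over every row -- the column layout
# reads only the "amount" values instead of dragging every field through memory.
total_row_format = sum(r["amount"] for r in rows)
total_col_format = sum(columns["amount"])
assert total_row_format == total_col_format

A single-record lookup or update is natural in the row layout, while an aggregate over one attribute only has to touch that attribute's values in the column layout, which is why keeping both representations in memory can serve OLTP and analytics at once.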
Published By: Oracle CX     Published Date: Oct 20, 2017
Databases have long served as the lifeline of the business. Therefore, it is no surprise that performance has always been top of mind. Whether it is a traditional row-formatted database handling millions of transactions a day or a columnar database for advanced analytics that helps uncover deep insights about the business, the goal is to service all requests as quickly as possible. This is especially true as organizations look to gain an edge on their competition by analyzing data from their transactional (OLTP) database to make more informed business decisions. The traditional model (see Figure 1) for doing this leverages two separate sets of resources, with an ETL process required to transfer the data from the OLTP database to a data warehouse for analysis. Two obvious problems exist with this implementation. First, I/O bottlenecks can quickly arise because the databases reside on disk; second, analysis is constantly being done on stale data. In-memory databases have helped address performance…
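The staleness problem in that traditional model can be seen in a small sketch: two separate stores kept in sync only by a periodic batch ETL job. The Python sketch below uses the built-in sqlite3 module purely as a stand-in for the OLTP database and the warehouse; the table names and copy logic are illustrative, not any particular vendor's tooling.

# Two separate systems synchronized only by a batch ETL job; sqlite3 is a stand-in.
import sqlite3

oltp = sqlite3.connect(":memory:")       # row-oriented transactional store
warehouse = sqlite3.connect(":memory:")  # separate analytic store with its own resources

oltp.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
warehouse.execute("CREATE TABLE orders (id INTEGER, amount REAL)")

def batch_etl():
    # Extract everything from the OLTP source and reload the warehouse copy.
    rows = oltp.execute("SELECT id, amount FROM orders").fetchall()
    warehouse.execute("DELETE FROM orders")
    warehouse.executemany("INSERT INTO orders VALUES (?, ?)", rows)
    warehouse.commit()

oltp.execute("INSERT INTO orders VALUES (1, 40.0)")
batch_etl()                                          # nightly run: warehouse is current

oltp.execute("INSERT INTO orders VALUES (2, 15.5)")  # a new transaction lands afterwards
in_warehouse = warehouse.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
in_oltp = oltp.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
print(in_warehouse, in_oltp)                         # 1 vs 2: analysis lags behind

Any transaction that lands between ETL runs is invisible to analysis until the next run, which is the "stale data" gap the paper refers to.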