dataset

Published By: SAP     Published Date: May 18, 2014
In-memory technology—in which entire datasets are pre-loaded into a computer’s random access memory, alleviating the need for shuttling data between memory and disk storage every time a query is initiated—has actually been around for a number of years. However, with the onset of big data, as well as an insatiable thirst for analytics, the industry is taking a second look at this promising approach to speeding up data processing.
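A minimal sketch of the idea, using Python and SQLite purely for illustration (no SAP product is involved, and the table and column names are hypothetical): the same aggregation runs against a disk-backed store and against a copy of the dataset held entirely in RAM, avoiding the memory-to-disk shuttling the abstract describes. On small datasets the operating system's file cache narrows the gap, so treat the timings as directional only.

```python
import os
import sqlite3
import time

def load(conn, rows):
    # Create and populate a hypothetical sales table
    conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
    conn.executemany("INSERT INTO sales VALUES (?, ?)", rows)
    conn.commit()

def run_query(conn):
    # Aggregate over the whole table, forcing a full scan of the dataset
    return conn.execute(
        "SELECT region, SUM(amount) FROM sales GROUP BY region"
    ).fetchall()

rows = [("EMEA" if i % 2 else "APAC", i * 0.01) for i in range(1_000_000)]

if os.path.exists("sales.db"):
    os.remove("sales.db")
disk = sqlite3.connect("sales.db")   # disk-backed store
mem = sqlite3.connect(":memory:")    # entire dataset pre-loaded into RAM
for conn in (disk, mem):
    load(conn, rows)

for name, conn in (("disk", disk), ("in-memory", mem)):
    start = time.perf_counter()
    run_query(conn)
    print(f"{name}: {time.perf_counter() - start:.3f}s")
```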
Tags : 
sap, big data, real time data, in memory technology, data warehousing, analytics, big data analytics, data management, business insights, architecture, business intelligence, big data tools
    
SAP
Published By: AWS     Published Date: Sep 05, 2018
Amazon Redshift Spectrum—a single service that can be used in conjunction with other Amazon services and products, as well as external tools—is revolutionizing the way data is stored and queried, allowing for more complex analyses and better decision making. Spectrum allows users to query very large datasets on S3 without having to load them into Amazon Redshift. This helps address the Scalability Dilemma—with Spectrum, data storage can keep growing on S3 and still be processed. By utilizing its own compute power and memory, Spectrum handles the hard work that would normally be done by Amazon Redshift. With this service, users can now scale to accommodate larger amounts of data than the cluster would have been capable of processing with its own resources.
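A hedged sketch of how such a query might be set up, assuming a Redshift cluster reachable over the PostgreSQL wire protocol via psycopg2; the endpoint, credentials, IAM role, bucket, and table definition below are placeholders, not values from the e-book. Because the external table only describes data that already sits on S3, that data can keep growing without consuming cluster storage.

```python
import psycopg2

# Placeholder connection details -- substitute your own cluster endpoint and credentials
conn = psycopg2.connect(
    host="my-cluster.abc123.us-east-1.redshift.amazonaws.com",
    port=5439, dbname="dev", user="admin", password="...",
)
conn.autocommit = True
cur = conn.cursor()

# Register an external schema backed by the AWS Glue Data Catalog (hypothetical IAM role)
cur.execute("""
    CREATE EXTERNAL SCHEMA IF NOT EXISTS spectrum
    FROM DATA CATALOG DATABASE 'spectrumdb'
    IAM_ROLE 'arn:aws:iam::123456789012:role/MySpectrumRole'
    CREATE EXTERNAL DATABASE IF NOT EXISTS
""")

# Describe Parquet files already on S3 -- nothing is loaded into the cluster
cur.execute("""
    CREATE EXTERNAL TABLE spectrum.clickstream (
        event_time TIMESTAMP,
        user_id    VARCHAR(64),
        url        VARCHAR(2048)
    )
    STORED AS PARQUET
    LOCATION 's3://my-bucket/clickstream/'
""")

# Query the S3-resident dataset directly; Spectrum supplies the scan compute
cur.execute(
    "SELECT DATE_TRUNC('day', event_time), COUNT(*) FROM spectrum.clickstream GROUP BY 1"
)
print(cur.fetchall())
```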
Tags : 
    
AWS
Published By: TIBCO Software     Published Date: Jan 17, 2019
Are you considering data virtualization for your organization today? In this paper you will learn 10 core truths about data virtualization and gain essential knowledge for overcoming analytic data bottlenecks and driving better outcomes.
Tags : 
virtualization, data, analytics, datasets, software, access, integration, projects, tools, scalability
    
TIBCO Software
Published By: Carbonite     Published Date: Apr 09, 2018
IT admins tasked with restoring servers or lost data during a disruption are consumed with a single-minded purpose: successful recovery. But it shouldn’t take an adverse event to underscore the importance of recovery as part of an overall backup strategy. This is especially true with large datasets. Before you consider how you’re going to back up large datasets, first consider how you may need to recover the data. Variables abound. Is it critical or non-critical data? A simple file deletion or a system-wide outage? A physical server running onsite or a virtual one hosted offsite? These and a handful of other criteria will determine your backup and disaster recovery (BDR) deployment. What do we mean by large? A simple question with a not-so-simple answer. If your total data footprint is 5 TB or more, that’s considered large. But what kind of data is it? How many actual files are there? How frequently do they change? How much can they be compressed? It’s likely that two different 5 TB en
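As a back-of-the-envelope illustration of why those questions matter (a generic estimate, not Carbonite's methodology; every number and parameter below is hypothetical), a rough backup-window calculation might look like this:

```python
def estimate_backup_hours(dataset_gb, daily_change_rate, compression_ratio,
                          link_mbps, full=False):
    """Rough backup-window estimate, in hours.

    dataset_gb        -- total data footprint in GB (e.g. 5 TB = 5120)
    daily_change_rate -- fraction of data that changes per day (e.g. 0.02)
    compression_ratio -- e.g. 2.0 means the data compresses to half its size
    link_mbps         -- usable throughput to the backup target, in megabits/s
    full              -- True for a full backup, False for a daily incremental
    """
    gb_to_move = dataset_gb if full else dataset_gb * daily_change_rate
    gb_after_compression = gb_to_move / compression_ratio
    # GB -> megabits (x 8 x 1024), then divide by link speed in Mb/s to get seconds
    seconds = gb_after_compression * 8 * 1024 / link_mbps
    return seconds / 3600

# A hypothetical 5 TB footprint, 2% daily change, 2:1 compression, 200 Mbps offsite link
print(f"full:        {estimate_backup_hours(5120, 0.02, 2.0, 200, full=True):.1f} h")
print(f"incremental: {estimate_backup_hours(5120, 0.02, 2.0, 200):.1f} h")
```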
Tags : 
    
Carbonite
Published By: Cognizant     Published Date: Oct 23, 2018
In the last few years, a wave of digital technologies changed the banking landscape: social/mobile altered the way banks engage with customers, analytics enabled hyper-personalized offerings by making sense of large datasets, and cloud technologies shifted the computing paradigm from CapEx to OpEx, enabling delivery of business processes as services from third-party platforms. Now, a second wave of disruption is set to drive even more profound changes, including robotic process automation (RPA), AI, IoT instrumentation, blockchain distributed ledgers and shared infrastructure, and open banking platforms controlled by application programming interfaces (APIs). As these technologies become commercialized, and demand increases for digitally enabled services, we will see unprecedented disruption as non-traditional banks and fintechs rush into all segments of the banking space. This whitepaper examines key considerations for banks as they explore value in the emerging Digital 2.0 world.
Tags : 
cognizant, banking, digital
    
Cognizant
Published By: Wasabi     Published Date: Oct 23, 2017
An explosion of data storage needs, both in terms of volume and accessibility, is going unmet by first-generation storage solutions. The massive datasets being generated are too costly to store and cannot be fully leveraged because of speed limitations. The needs of individual businesses, and of the greater economy, demand the commoditization of cloud storage. Cloud Storage 2.0 represents a new generation of solutions that promise to turn cloud storage into a utility along the lines of bandwidth and electricity. Leading this evolution with high-speed, low-cost, reliable cloud storage is Wasabi. In this white paper we look at the genesis and possibilities of Cloud Storage 2.0, and Wasabi’s place at its forefront. A free trial, with no credit card required, is available as well.
Tags : 
wasabi, cloud storage, data storage, storage solutions
    
Wasabi
Published By: HERE Technologies     Published Date: Jan 23, 2020
Managing and monetizing mobility data - How Chief Data Officers create the digital ecosystems needed for growth

As mobility firms become multi-modal platform businesses, new datasets can open up fresh monetization opportunities with the creation of new services and the enrichment of existing services. The prospect of growing revenue and insights from transportation data is enticing. Yet a distinct competitive advantage is only possible with the right digital infrastructure in place. This eBook looks at the challenges Chief Data Officers need to overcome as they build infrastructure. We'll also show how products and services from HERE can help them achieve their goals. Read the eBook to discover how to:
• Enrich datasets, enable new portfolio services and monetize data through use of a neutral platform to buy and sell data
• Develop key third-party relationships to create a true mobility data ecosystem
• Introduce infrastructure to establish and manage data architecture, governance, in
Tags : 
    
HERE Technologies
Published By: Carbonite     Published Date: Oct 10, 2018
IT admins tasked with restoring servers or lost data during a disruption are consumed with a single-minded purpose: successful recovery. But it shouldn’t take an adverse event to underscore the importance of recovery as part of an overall backup strategy. This is especially true with large datasets. Before you consider how you’re going to back up large datasets, first consider how you may need to recover the data.
Tags : 
    
Carbonite
Published By: Fiserv     Published Date: Nov 07, 2017
"In today’s ever-evolving lending landscape where loan quality and risk management challenge profitability and the customer experience, technology may be the key to thriving – both now and in the future. Winning financial services institutions will be the ones that transform their business models to place loan quality and risk management at the center of their operations. To facilitate continuous life-of-loan management, inclusive of the requisite data transparency and audit trails that support loan quality and loss mitigation, these institutions will implement and automate a loan completion process. Such a process will manage data quality and access to loan data and documents throughout origination, servicing and sale on the secondary market."
Tags : 
mortgage data quality, loan quality, loan data quality, mortgage quality, loan compliance, lending compliance, mortgage compliance, trid, tila respa integrated disclosure, lending efficiency, loan automation, lending automation, mortgage automation, ucd, uniform closing dataset
    
Fiserv
Published By: Fiserv     Published Date: Nov 07, 2017
Learn how loan onboarding can become more efficient and accurate by eliminating manual data validation with automation technology that is poised to transform mortgage servicing. From end-to-end, tools can simplify workflow processes, driving time and cost efficiencies. Trained staff can be deployed to greater effect and can be crucial to eliminating servicing errors. In the process, servicers improve data quality, save time and money, and deliver a better borrower experience.
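To make the idea of replacing manual data validation concrete, here is a minimal, hypothetical sketch of rule-based checks over a single loan record; the field names and thresholds are illustrative only and are not Fiserv's actual rules.

```python
from datetime import date

# Hypothetical validation rules illustrating automated, rule-based checks
# that can replace manual review during loan onboarding.
RULES = {
    "loan_amount":   lambda v: isinstance(v, (int, float)) and v > 0,
    "interest_rate": lambda v: isinstance(v, (int, float)) and 0 < v < 25,
    "note_date":     lambda v: isinstance(v, date) and v <= date.today(),
    "borrower_name": lambda v: isinstance(v, str) and v.strip() != "",
}

def validate_loan(record):
    """Return a list of human-readable exceptions for one loan record."""
    errors = [f"missing field: {f}" for f in RULES if f not in record]
    errors += [f"invalid value for {f}: {record[f]!r}"
               for f, check in RULES.items()
               if f in record and not check(record[f])]
    return errors

loan = {"loan_amount": 245000, "interest_rate": 4.1, "borrower_name": " "}
print(validate_loan(loan))   # flags the missing note_date and the blank name
```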
Tags : 
loan quality, loan data quality, mortgage quality, mortgage data quality, loan compliance, lending compliance, mortgage compliance, lending efficiency, loan automation, lending automation, mortgage automation, ucd, uniform closing dataset, borrower satisfaction, borrower experience, mortgage origination, mortgage origination automation, mortgage servicing, mortgage servicing automation
    
Fiserv
Published By: Fiserv     Published Date: Nov 07, 2017
"Recently, a number of factors have come together to decimate the profitability of the mortgage banking industry. To regain its footing, the industry must return to mortgage banking fundamentals. This paper carefully examines each function within the mortgage business to determine if there is a better approach that will save money and improve long-term profitability."
Tags : 
loan quality, loan data quality, mortgage quality, mortgage data quality, loan compliance, lending compliance, mortgage compliance, trid, tila respa integrated disclosure, lending efficiency, loan automation, lending automation, mortgage automation, ucd, uniform closing dataset, borrower satisfaction, borrower experience
    
Fiserv
Published By: Fiserv     Published Date: Nov 07, 2017
"Improve Loan Data Quality and Compliance from Origination to Delivery. This complimentary CEB Gartner paper helps identify process and technology issues that lead to loan defects. Learn strategies for fixing issues and recommends technologies to help lenders improve loan data quality and compliance to reduce costs and improve the borrower experience. "
Tags : 
loan quality, loan data quality, mortgage quality, mortgage data quality, loan compliance, lending compliance, mortgage compliance, trid, tila respa integrated disclosure, lending efficiency, loan automation, lending automation, mortgage automation, ucd, uniform closing dataset, borrower satisfaction, borrower experience
    
Fiserv
Published By: Fiserv     Published Date: Nov 09, 2017
Digital loan origination processes can still require significant manual support, which is often inaccurate and time-consuming. This National Mortgage News paper, sponsored by Fiserv, explains how you can improve your current loan production while reducing costs and risk of non-compliance.
Tags : 
loan quality, loan data quality, mortgage quality, mortgage data quality, loan compliance, lending compliance, mortgage compliance, trid, tila respa integrated disclosure, lending efficiency, loan automation, lending automation, mortgage automation, ucd, uniform closing dataset, borrower satisfaction, borrower experience
    
Fiserv
Published By: Waterline Data & Research Partners     Published Date: Nov 07, 2016
For many years, traditional businesses have had a systematic set of processes and practices for deploying, operating and disposing of tangible assets and some forms of intangible asset. Through significant growth in our inquiry discussions with clients, and in observing increased attention from industry regulators, Gartner now sees the recognition that information is an asset becoming increasingly pervasive. At the same time, CDOs and other data and analytics leaders must take into account both internally generated datasets and exogenous sources, such as data from partners, open data and content from data brokers and analytics marketplaces, as they come to terms with the ever-increasing quantity and complexity of information assets. This task is clearly impossible if the organization lacks a clear view of what data is available, how to access it, its fitness for purpose in the contexts in which it is needed, and who is responsible for it.
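A minimal sketch of what one record in such an inventory might capture, assuming a homegrown catalog rather than any particular product; the fields and example values below are hypothetical.

```python
from dataclasses import dataclass, field

# A hypothetical catalog record capturing the questions the paper raises:
# what data exists, where it lives, whether it is fit for purpose, and who owns it.
@dataclass
class CatalogEntry:
    name: str
    location: str                 # how to access it (path, table, API endpoint)
    source: str                   # internal system, partner feed, open data, broker
    steward: str                  # who is responsible for the asset
    fitness_for_purpose: dict = field(default_factory=dict)  # context -> assessment
    tags: list = field(default_factory=list)

catalog = [
    CatalogEntry(
        name="customer_transactions",
        location="s3://datalake/finance/transactions/",
        source="internal",
        steward="finance-data-team@example.com",
        fitness_for_purpose={"fraud-model-training": "approved",
                             "marketing": "needs PII masking"},
        tags=["finance", "pii"],
    ),
]

# A simple inventory question: which assets does a given steward own?
print([e.name for e in catalog if e.steward.startswith("finance-data-team")])
```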
Tags : 
    
Waterline Data & Research Partners
Published By: Intel     Published Date: Sep 27, 2019
As the first major memory and storage breakthrough in 25 years, Intel Optane technology combines industry-leading low latency, high endurance, QoS, and high throughput, allowing the creation of solutions that remove data bottlenecks and unleash CPU utilization. With Intel Optane technology, data centers can deploy bigger and more affordable datasets to gain new insights from large memory pools. Here are just ten ways Intel Optane technology can make a difference to your business. To find out more, download this whitepaper today.
Tags : 
    
Intel
Published By: Samasource     Published Date: Feb 26, 2020
While there’s no argument that data quality influences the success of your algorithm, the definition of “high quality” is often ambiguous. It can also be challenging to obtain quality datasets at an affordable cost and at scale. Ultimately, high-quality data is defined as data free of errors. We like to think of it as anything that could mislead your learning algorithm. Here are a few questions to ask when determining how to ensure data quality for your AI and ML algorithms.
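Some of those questions can be asked programmatically. Below is a minimal sketch (not Samasource's process; the checks and sample data are illustrative) that flags a few error types likely to mislead a learning algorithm: missing labels, duplicate examples, and class imbalance.

```python
import collections

def quality_report(examples):
    """examples: list of (text, label) pairs. Flags issues that could mislead a learner."""
    report = {}
    texts = [t for t, _ in examples]
    labels = [l for _, l in examples]

    report["missing_labels"] = sum(1 for l in labels if l in (None, ""))
    report["duplicate_examples"] = len(texts) - len(set(texts))

    counts = collections.Counter(l for l in labels if l not in (None, ""))
    if counts:
        report["label_counts"] = dict(counts)
        report["imbalance_ratio"] = max(counts.values()) / min(counts.values())
    return report

data = [("great product", "pos"), ("great product", "pos"),
        ("terrible", "neg"), ("okay I guess", None)]
print(quality_report(data))
```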
Tags : 
    
Samasource
Published By: Samasource     Published Date: Feb 26, 2020
Vulcan, the Seattle-based organization built by Microsoft co-founder Paul Allen, has a long history of supporting research and initiatives that make a global impact. Now the Vulcan Impact team is continuing its commitment to better protect wild plant and animal species and their habitat by using AI for wildlife conservation.
Tags : 
    
Samasource
Published By: Samasource     Published Date: Feb 26, 2020
Quid puts the world’s information at your fingertips, creating a bird’s-eye view of market landscapes, competitors, brand perception, and more. But even a visualization platform powered by natural language processing has limits when it collects information solely through algorithms, so Quid supplemented its data with human ingenuity. Learn how Quid improved and scaled its datasets by partnering with Samasource to enrich them.
Tags : 
    
Samasource
Published By: Alteryx, Inc.     Published Date: Sep 07, 2017
To learn how to get your Tableau datasets faster, download the How To Guide “6 Steps to Faster Data Blending for Tableau.”
Tags : 
    
Alteryx, Inc.
Published By: Quantum Metric     Published Date: Oct 18, 2019
Attaining your customers’ undivided attention is a key challenge for any online business. This challenge is amplified, however, when they explore your brand via their mobile device. Using Quantum Metric’s unique dataset, this infographic showcases the importance of keeping your mobile visitors engaged: among Fortune 500 companies, mobile visitors purchase 4X less than desktop visitors.
Tags : 
    
Quantum Metric
Published By: Plotly     Published Date: Jan 30, 2020
The emergent field of bioinformatics is an amalgamation of computer science, statistics, and biology; it has proven itself revolutionary in biomedical research. As scientific techniques in areas such as genomics and proteomics improve, experimentalists in bioinformatics may find themselves needing to interpret large volumes of data. In order to use this data to efficiently provide meaningful solutions to biological problems, it is important to have robust data visualization tools. Many bioinformaticians have already created analysis and visualization tools with Dash and plotly.py, but only through significant workarounds and modifications made to preexisting graph types. We present an interface to create single-line declarations of charts for complex datasets such as hierarchical clustering and multiple sequence alignment. In addition, we introduce several new chart types, three-dimensional and interactive molecule visualization tools, and components that are specifically related to g
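As a small taste of the kind of single-declaration chart the abstract describes, here is a sketch using stock plotly.py's dendrogram factory on a made-up matrix; it requires scipy and stands in for, rather than reproduces, the dash-bio components the authors present.

```python
import numpy as np
import plotly.figure_factory as ff

# Hypothetical expression-like matrix: 10 samples x 4 features
rng = np.random.default_rng(0)
X = rng.normal(size=(10, 4))

# One declaration produces a full hierarchical-clustering dendrogram
fig = ff.create_dendrogram(X, labels=[f"sample_{i}" for i in range(10)])
fig.show()
```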
Tags : 
    
Plotly
Published By: Sage     Published Date: Jul 08, 2015
This white paper describes how ERP technology can improve efficiency by:
• Standardizing and automating business processes, locally as well as across multiple locations and countries, to accelerate business operations.
• Offering a fully integrated suite of business management applications that share a common dataset, and extending these applications over the Internet to allow visibility and collaboration across departments as well as with customers, partners, suppliers, and remote users.
• Providing flexible and customizable reporting to improve business reporting, analysis, and insight.
Tags : 
enterprise resource planning, erp, efficiency, operating costs, standardization, automation, business management
    
Sage
Published By: AWS     Published Date: Nov 14, 2018
Amazon Redshift Spectrum—a single service that can be used in conjunction with other Amazon services and products, as well as external tools—is revolutionizing the way data is stored and queried, allowing for more complex analyses and better decision making. Spectrum allows users to query very large datasets on S3 without having to load them into Amazon Redshift. This helps address the Scalability Dilemma—with Spectrum, data storage can keep growing on S3 and still be processed. By utilizing its own compute power and memory, Spectrum handles the hard work that would normally be done by Amazon Redshift. With this service, users can now scale to accommodate larger amounts of data than the cluster would have been capable of processing with its own resources. This e-book aims to provide you with expert tips on how to use Amazon Redshift Spectrum to increase performance and potentially reduce the cost of your queries.
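One common performance-and-cost tip of the sort the e-book covers is partitioning the external table so Spectrum can skip S3 prefixes a query does not need. A hedged sketch, assuming the external schema from the earlier example already exists; the bucket, column names, and partition value are placeholders.

```python
import psycopg2

# Placeholder connection details -- substitute your own cluster endpoint and credentials
conn = psycopg2.connect(
    host="my-cluster.abc123.us-east-1.redshift.amazonaws.com",
    port=5439, dbname="dev", user="admin", password="...",
)
conn.autocommit = True
cur = conn.cursor()

# Declare the external table as partitioned by date, then register each partition's S3 prefix
cur.execute("""
    CREATE EXTERNAL TABLE spectrum.clickstream_by_day (
        user_id VARCHAR(64),
        url     VARCHAR(2048)
    )
    PARTITIONED BY (event_date DATE)
    STORED AS PARQUET
    LOCATION 's3://my-bucket/clickstream_by_day/'
""")
cur.execute("""
    ALTER TABLE spectrum.clickstream_by_day
    ADD IF NOT EXISTS PARTITION (event_date = '2018-11-01')
    LOCATION 's3://my-bucket/clickstream_by_day/event_date=2018-11-01/'
""")

# Filtering on the partition column lets Spectrum skip every other prefix,
# reducing the amount of S3 data scanned, which is what Spectrum queries are billed on
cur.execute("""
    SELECT COUNT(*) FROM spectrum.clickstream_by_day
    WHERE event_date = '2018-11-01'
""")
print(cur.fetchone())
```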
Tags : 
    
AWS
Published By: Vertica     Published Date: Feb 23, 2010
Ovum conducts a deep-dive technology audit of Vertica’s Analytic Database, which is designed specifically for storing and querying large datasets.
Tags : 
ovum, vertica, analytical databases, dbms, technology audit, mpp, rdbms
    
Vertica
Published By: HP     Published Date: Jan 16, 2015
Register below to gain exclusive access to the HP-NVIDIA® Autodesk Building Design Suite 2015 Graphics Optimization guide and get the most out of your workstation. Upon submission of your personal information, an HP representative will contact you regarding your interests and needs.
Tags : 
visualization, bim, business information, workstations, building design, viewport, autodesk, cloud datasets, optimization, best practice, business application, workflow, autodesk showcase, productivity
    
HP