Database

Published By: Oracle ZDLRA     Published Date: Jan 10, 2018
Business leaders expect two things from IT: keep mission-critical applications available and high-performing 24x7 and, if something does happen, recover quickly to be back in business without losing any critical data, so there is no impact on the revenue stream. Of course, there is a gap between this de facto expectation from nontechnical business leaders and what current technology is actually capable of delivering. For mission-critical workloads, which are most often hosted on databases, organizations may choose to implement high availability (HA) technologies within the database to avoid downtime and data loss.
Tags : 
recovery point, recovery time, backup appliance, san/nas, service level agreement, oracle
    
Oracle ZDLRA
Published By: Oracle ZDLRA     Published Date: Jan 10, 2018
Traditional backup systems fail to meet the database protection and recovery requirements of modern organizations. These systems require ever-growing backup windows, negatively impact performance in mission-critical production databases, and deliver recovery time objectives (RTO) and recovery point objectives (RPO) measured in hours or even days, failing to meet the requirements of high-volume, highly transactional databases -- potentially costing millions in lost productivity and revenue, regulatory penalties, and reputation damage due to an outage or data loss.
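Purely as an illustration (not taken from the paper), a minimal sketch of how worst-case RPO and RTO might be reasoned about under an assumed nightly-backup schedule; all figures are hypothetical:

from datetime import timedelta

# Hypothetical figures, only to illustrate how RPO and RTO are reasoned about.
backup_interval = timedelta(hours=24)   # assumed nightly full backup
restore_duration = timedelta(hours=6)   # assumed time to restore the database

# Worst case: the failure happens just before the next scheduled backup,
# so up to a full backup interval of transactions is lost.
worst_case_rpo = backup_interval
# The business is down at least as long as the restore takes.
worst_case_rto = restore_duration

print(f"Worst-case RPO: {worst_case_rpo}, worst-case RTO: {worst_case_rto}")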
Tags : 
data protection, backup speed, recovery, overhead, assurance, storage, efficiency, oracle
    
Oracle ZDLRA
Published By: Sitecore EMEA     Published Date: Jan 23, 2018
The global beauty company provides an exhilarating example of the power of context-driven commerce. When customers can rely on brands to achieve the outcomes that are important to them, brand owners can shift their focus from nonstop customer acquisition to long-term customer engagement. The Sitecore Experience Platform and the underlying Sitecore Experience Database are foundation technologies for IT organizations wanting to implement context-driven commerce environments that function elegantly today and into the future. Sitecore puts an end to digital retailers’ obsession with cart abandonment, replacing it with opportunities to delight customers with personal, meaningful commerce offers.
Tags : 
advertising, revenue, customers, social channels, sitecore
    
Sitecore EMEA
Published By: Quick Base     Published Date: Dec 18, 2017
Spreadsheets are good for number crunching – but many professionals use them to do things they were never meant for. The result? Long office hours, chasing down status updates, and compiling data manually. Learn the 5 signs of spreadsheet misuse and how to overcome them in this webcast.
Tags : 
audit, insights, accounting, ap management, compliance, finance, spreadsheets, collaboration
    
Quick Base
Published By: Oracle Dyn     Published Date: Dec 06, 2017
Every user’s first interaction with your website begins with a series of DNS queries. The Domain Name System (DNS) is a distributed internet database that maps human-readable names to IP addresses, ensuring users reach the correct website when entering a URL. DNS mappings are maintained in special-purpose servers called DNS nameservers. When a user enters your company’s URL, a DNS query is routed to a DNS nameserver containing the address mappings for your company’s internet domain.
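As a minimal sketch of the lookup described above (not from the paper), the following uses Python's standard library to resolve a placeholder hostname through the operating system's configured DNS resolver:

from socket import getaddrinfo, IPPROTO_TCP

# "www.example.com" is a placeholder hostname, not one taken from the paper.
hostname = "www.example.com"

# getaddrinfo issues the DNS queries on our behalf and returns one entry per
# resolved (address family, socket type, address) combination.
for family, _, _, _, sockaddr in getaddrinfo(hostname, 443, proto=IPPROTO_TCP):
    print(family.name, sockaddr[0])

Note that this goes through the system resolver; querying a specific nameserver directly would require a dedicated DNS library.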
Tags : 
traffic, dns, internet, visibility, programming, interfaces, optimization, routing
    
Oracle Dyn
Published By: Group M_IBM Q1'18     Published Date: Dec 19, 2017
As organizations develop next-generation applications for the digital era, many are using cognitive computing ushered in by IBM Watson® technology. Cognitive applications can learn and react to customer preferences, and then use that information to support capabilities such as confidence-weighted outcomes with data transparency, systematic learning and natural language processing. To make the most of these next-generation applications, you need a next-generation database. It must handle a massive volume of data while delivering high performance to support real-time analytics. At the same time, it must provide data availability for demanding applications, scalability for growth and flexibility for responding to changes.
Tags : 
database, applications, data availability, cognitive applications
    
Group M_IBM Q1'18
Published By: Group M_IBM Q1'18     Published Date: Dec 19, 2017
For increasing numbers of organizations, the new reality for development, deployment and delivery of applications and services is hybrid cloud. Few, if any, organizations are going to move all their strategic workloads to the cloud, but virtually every enterprise is embracing cloud for a wide variety of requirements. To accelerate innovation, improve the IT delivery economic model and reduce risk, organizations need to combine data and experience in a cognitive model that yields deeper and more meaningful insights for smarter decision-making. Whether the user needs a data set maintained in-house for customer analytics or access to a cloud-based data store for assessing marketing program results — or any other business need — a high-performance, highly available, mixed-load database platform is required.
Tags : 
cloud, database, hybrid cloud, database platform
    
Group M_IBM Q1'18
Published By: Group M_IBM Q1'18     Published Date: Dec 19, 2017
Effectively using and managing information has become critical to driving growth in areas such as pursuing new business opportunities, attracting and retaining customers, and streamlining operations. In the era of big data, you must accommodate a rapidly increasing volume, variety and velocity of data while extracting actionable business insight from that data, faster than ever before. These needs create a daunting array of workload challenges and place tremendous demands on your underlying IT infrastructure and database systems. This e-book presents six reasons why you should consider a database change, including opinions from industry analysts and real-world customer experiences. Read on to learn more.
Tags : 
database, streamlining, it infrastructure, database systems
    
Group M_IBM Q1'18
Published By: IBM     Published Date: Nov 08, 2017
This paper gives key considerations when making a strategic commitment to a database platform.
Tags : 
ibm, cloud, cloud computing, database
    
IBM
Published By: IBM     Published Date: Nov 08, 2017
IBM DB2 with BLU Acceleration helps tackle the challenges presented by big data. It delivers analytics at the speed of thought, always-available transactions, future-proof versatility, disaster recovery and streamlined ease-of-use to unlock the value of data.
Tags : 
ibm, cloud, cloud computing, database, ibm db2
    
IBM
Published By: IBM     Published Date: Nov 08, 2017
Flexible deployment options and licensing models help take the challenges out of change. As you move toward the cloud, you're likely planning or managing a mixed environment of on-premises and on-cloud applications. To help you succeed in this transition, you need a transformative, mixed-workload database that can handle a massive volume of data while delivering high performance, data availability and the flexibility to adapt and respond to business changes.
Tags : 
ibm, cloud, cloud computing, database, ibm db2
    
IBM
Published By: IBM     Published Date: Nov 08, 2017
Flexible deployment options and licensing models help take the challenges out of change. As you move toward the cloud, you're likely planning or managing a mixed environment of on-premises and on-cloud applications. To help you succeed in this transition, you need a transformative, mixed-workload database that can handle a massive volume of data while delivering high performance, data availability and the flexibility to adapt and respond to business changes.
Tags : 
ibm db2, cloud, on-cloud applications, mixed-workload database
    
IBM
Published By: Group M_IBM Q1'18     Published Date: Jan 23, 2018
This paper gives key considerations when making a strategic commitment to a database platform.
Tags : 
cloud, database, hybrid cloud, database platform
    
Group M_IBM Q1'18
Published By: Group M_IBM Q1'18     Published Date: Jan 23, 2018
Flexible deployment options and licensing models help take the challenges out of change. As you move toward the cloud, you're likely planning or managing a mixed environment of on-premises and on-cloud applications. To help you succeed in this transition, you need a transformative, mixed-workload database that can handle a massive volume of data while delivering high performance, data availability and the flexibility to adapt and respond to business changes.
Tags : 
cloud applications, database, data volume, data availability
    
Group M_IBM Q1'18
Published By: AstuteIT_ABM_EMEA     Published Date: Feb 02, 2018
The demand for databases is on the rise as organizations build next-generation business applications. NoSQL offers enterprise architecture (EA) pros new choices to store, process, and access new data formats, deliver extreme web-scale, and lower data management costs. Forrester’s 26-criteria evaluation of 15 big data NoSQL solutions will help EA pros understand the choices available and recommend the best for their organization. This report details our findings about how each vendor fulfills our criteria and where the vendors stand in relation to each other, to help EA pros choose the right fit.
Tags : 
nosql, market, industries, strategy, presence, vendor
    
AstuteIT_ABM_EMEA
Published By: AstuteIT_ABM_EMEA     Published Date: Feb 02, 2018
MongoDB is an open-source document database designed with both scalability and developer agility in mind. MongoDB bridges the gap between key-value stores, which are fast and scalable, and relational databases, which have rich functionality. Instead of storing data in rows and columns as one would with a relational database, MongoDB stores JSON documents with dynamic schemas. Customers should consider three primary factors when evaluating databases: technological fit, cost, and top-line implications. MongoDB's flexible and scalable data model, robust feature set, and high-performance, high-availability architecture make it suitable for a wide range of database use cases. Given that in many cases relational databases may also be a technological fit, it is helpful to consider the relative costs of each solution when evaluating which database to adopt.
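For illustration only, and not taken from the evaluation, a minimal sketch of the document model using MongoDB's Python driver; the connection URI, database, collection, and field names are all assumptions:

from pymongo import MongoClient

# Assumes a MongoDB instance reachable at this URI; adjust for your environment.
client = MongoClient("mongodb://localhost:27017")
products = client["shop"]["products"]

# Documents in one collection can carry different fields (dynamic schema),
# unlike rows in a fixed relational table.
products.insert_many([
    {"sku": "A100", "name": "Lamp", "price": 29.99},
    {"sku": "B200", "name": "Desk", "price": 149.00, "dimensions": {"w": 120, "d": 60}},
])

# Query on a field without any prior schema migration.
for doc in products.find({"price": {"$lt": 100}}):
    print(doc["sku"], doc["name"])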
Tags : 
total, cost, ownership, comparison, mongodb, oracle
    
AstuteIT_ABM_EMEA
Published By: AstuteIT_ABM_EMEA     Published Date: Feb 02, 2018
The relational database has been the foundation of enterprise data management for over thirty years. But the way we build and run applications today, coupled with unrelenting growth in new data sources and growing user loads, is pushing relational databases beyond their limits. This can inhibit business agility, limit scalability and strain budgets, compelling more and more organizations to migrate to alternatives like MongoDB or other NoSQL databases.
Tags : 
rdbms, mongodb, migration, data, enterprise, management
    
AstuteIT_ABM_EMEA
Published By: Oracle     Published Date: Oct 20, 2017
Oracle has just announced a new microprocessor, along with the servers and engineered systems that are powered by it. The SPARC M8 processor fits in the palm of your hand, but it contains the result of years of co-engineering hardware and software to run enterprise applications with unprecedented speed and security. The SPARC M8 chip contains 32 of today’s most powerful cores for running Oracle Database and Java applications. Benchmarking data shows that the performance of these cores reaches twice the performance of Intel’s x86 cores. This is the result of exhaustive work on designing smart execution units and threading architecture, and on balancing metrics such as core count, memory and I/O bandwidth. It also required millions of hours of testing chip design and operating system software on real workloads for database and Java. Having faster cores means increasing application capability while keeping the core count and software investment under control. In other words, a boost…
Tags : 
    
Oracle
Published By: Oracle     Published Date: Oct 20, 2017
Modern technology initiatives are driving IT infrastructure in a new direction. Big data, social business, mobile applications, the cloud, and real-time analytics all require forward-thinking solutions and enough compute power to deliver the performance required in a rapidly evolving digital marketplace. Customers increasingly drive the speed of business, and organizations need to engage with customers on their terms. The need to manage sensitive information with high levels of security, as well as to capture, analyze, and act upon massive volumes of data every hour of every day, has become critical. These challenges will dramatically change the way that IT systems are designed, funded, and run compared to the past few decades. Databases and Java have become the de facto foundation of modern, cloud-ready applications. The massive explosion in the volume, variety, and velocity of data increases the need for secure and effective analytics so that organizations can make better decisions.
Tags : 
    
Oracle
Published By: Oracle     Published Date: Oct 20, 2017
The Software in Silicon design of the SPARC M7 processor, and of the recently announced SPARC S7 processor, implements memory-access validation directly in the processor so that you can protect application data that resides in memory. It also includes on-chip Data Analytics Accelerator (DAX) engines that are specifically designed to accelerate analytic functions. The DAX engines make in-memory databases and applications run much faster, and they significantly increase usable memory capacity by allowing compressed databases to be stored in memory without a performance penalty. The following Software in Silicon technologies are implemented in the SPARC S7 and M7 processors (note that Security in Silicon encompasses both Silicon Secured Memory and cryptographic instruction acceleration, whereas SQL in Silicon includes In-Memory Query Acceleration and In-Line Decompression). Silicon Secured Memory is the first-ever end-to-end implementation of memory-access validation done in hardware. It…
Tags : 
    
Oracle
Published By: Oracle     Published Date: Oct 20, 2017
With the growing size and importance of information stored in today’s databases, accessing and using the right information at the right time has become increasingly critical. Real-time access and analysis of operational data is key to making faster and better business decisions, providing enterprises with unique competitive advantages. Running analytics on operational data has been difficult because operational data is stored in row format, which is best for online transaction processing (OLTP) databases, while storing data in column format is much better for analytics processing. Therefore, companies normally have both an operational database with data in row format and a separate data warehouse with data in column format, which leads to reliance on “stale data” for business decisions. With Oracle’s Database In-Memory and Oracle servers based on the SPARC S7 and SPARC M7 processors, companies can now store data in memory in both row and column formats and run analytics on their operational data.
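To make the row-versus-column distinction concrete, here is a small, generic Python sketch (not Oracle-specific, with invented data) contrasting the two layouts and the access patterns each favors:

# Row format: each record is kept together -- suits OLTP-style lookups.
orders_rows = [
    {"order_id": 1, "region": "EMEA", "amount": 250.0},
    {"order_id": 2, "region": "APAC", "amount": 410.0},
    {"order_id": 3, "region": "EMEA", "amount": 120.0},
]

# Column format: each attribute is stored contiguously -- suits analytic scans.
orders_cols = {
    "order_id": [1, 2, 3],
    "region": ["EMEA", "APAC", "EMEA"],
    "amount": [250.0, 410.0, 120.0],
}

# Transactional access: fetch one complete order by key (row layout wins).
order_2 = next(row for row in orders_rows if row["order_id"] == 2)

# Analytic access: aggregate one attribute across all orders
# (column layout scans a single contiguous list).
emea_total = sum(amt for amt, reg in zip(orders_cols["amount"], orders_cols["region"])
                 if reg == "EMEA")

print(order_2, emea_total)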
Tags : 
    
Oracle
Published By: Oracle     Published Date: Oct 20, 2017
Databases have long served as the lifeline of the business. Therefore, it is no surprise that performance has always been top of mind. Whether it be a traditional row-formatted database to handle millions of transactions a day or a columnar database for advanced analytics to help uncover deep insights about the business, the goal is to service all requests as quickly as possible. This is especially true as organizations look to gain an edge on their competition by analyzing data from their transactional (OLTP) database to make more informed business decisions. The traditional model (see Figure 1) for doing this leverages two separate sets of resources, with an ETL process required to transfer the data from the OLTP database to a data warehouse for analysis. Two obvious problems exist with this implementation. First, I/O bottlenecks can quickly arise because the databases reside on disk, and second, analysis is constantly being done on stale data. In-memory databases have helped address…
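A rough, hypothetical sketch of that two-system model, using SQLite stand-ins for the OLTP database and the warehouse; transactions arriving after the last ETL run stay invisible to analytics until the next refresh, which is the staleness problem described above. All table and column names are invented:

import sqlite3

# Two separate stores stand in for the OLTP database and the data warehouse.
oltp = sqlite3.connect(":memory:")
warehouse = sqlite3.connect(":memory:")
oltp.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, amount REAL)")
warehouse.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, amount REAL)")

oltp.executemany("INSERT INTO sales (amount) VALUES (?)", [(19.5,), (42.0,)])
oltp.commit()

def etl_refresh():
    # The periodic ETL step: extract from the OLTP store, load into the warehouse.
    rows = oltp.execute("SELECT id, amount FROM sales").fetchall()
    warehouse.execute("DELETE FROM sales")
    warehouse.executemany("INSERT INTO sales (id, amount) VALUES (?, ?)", rows)
    warehouse.commit()

etl_refresh()

# A transaction that arrives after the refresh is not seen by analytics
# until the next ETL run.
oltp.execute("INSERT INTO sales (amount) VALUES (?)", (99.0,))
oltp.commit()
print(warehouse.execute("SELECT SUM(amount) FROM sales").fetchone()[0])  # 61.5, not 160.5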
Tags : 
    
Oracle
Published By: Oracle     Published Date: Oct 20, 2017
Security has become top of mind for CIOs and CEOs. Encryption at rest is a piece of the solution, but not a big piece. Encryption over the network is another piece, but only a small piece. These and other pieces do not fit together well; the data must be decrypted and re-encrypted as it moves through the layers, leaving cleartext copies that create complex operational issues for intrusion monitoring and detection. Larger-scale, high-value applications requiring high security often use Oracle middleware, including Java and Oracle Database. Traditional security models hand the data to the processors to encrypt and decrypt, often many times. The overhead is large, and as a result encryption is used sparingly, on only a few applications. The risk to enterprises is that they may have created an illusion of security which, in reality, is ripe for exploitation. The modern best-practice security model is an end-to-end encryption architecture. The application deploys application-led encryption…
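As a generic sketch of application-led encryption (not Oracle's implementation), the example below encrypts a record inside the application before it is handed to any lower layer, so storage and transport only ever see ciphertext; it assumes the third-party cryptography package is installed:

from cryptography.fernet import Fernet

# In practice the key would come from a key-management service; generating it
# inline here is only to keep the sketch self-contained.
key = Fernet.generate_key()
cipher = Fernet(key)

record = b"card=4111111111111111;cvv=123"   # invented sample data

# Encrypt at the application layer, before the data crosses any boundary.
token = cipher.encrypt(record)

# Lower layers (cache, network, database) handle only the opaque token,
# so no cleartext copy exists at each hop to monitor and protect.
store = {"payment": token}

# Only the application, which holds the key, can recover the plaintext.
print(cipher.decrypt(store["payment"]))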
Tags : 
    
Oracle