What is an Active Data Warehouse?

An Active Data Warehouse is a combination of products, features, services, and business partnerships that support the Active Enterprise Intelligence business strategy. This term was coined by Teradata in 2001.

What is an Algorithm?

Within the context of big data, algorithms are the primary means for uncovering insights and detecting patterns. Thus, they are essential to realizing the big data business case.
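
As a minimal illustration (not tied to any particular product), the Python sketch below applies a simple frequency-counting algorithm to a stream of events to surface the most common patterns; the event data is invented for the example.

```python
from collections import Counter

# Hypothetical clickstream events; in practice these would come from logs or a warehouse.
events = ["login", "search", "view_item", "search", "view_item", "add_to_cart",
          "search", "view_item", "login", "checkout"]

# A simple counting algorithm: tally each event and report the most frequent ones.
counts = Counter(events)
for event, count in counts.most_common(3):
    print(f"{event}: {count}")
```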

What is an Analytics Platform?

An analytics platform is a full-featured technology solution designed to address the needs of large enterprises. Typically, it joins different tools and analytics systems together with an engine to execute, a database or repository to store and manage the data, data mining processes, and techniques and mechanisms for obtaining and preparing data that is not stored.

What is Apache Hive?

Apache Hive is an open-source data warehouse infrastructure that provides tools for data summarization, query and analysis. It is specifically designed to support the analysis of large datasets stored in Hadoop files and compatible file systems, such as Amazon S3. Hive was initially developed by data engineers at Facebook in 2008, but is now used by many other companies.
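
As a rough sketch only, the snippet below shows how a Python client might submit a HiveQL query; it assumes the PyHive package is installed, a HiveServer2 instance is reachable on the default port, and a table named page_views already exists.

```python
from pyhive import hive  # assumes the PyHive package is installed

# Connect to a HiveServer2 instance (host, port, and table name are placeholders).
conn = hive.Connection(host="hive.example.com", port=10000, username="analyst")
cursor = conn.cursor()

# HiveQL looks much like SQL; here we summarize a hypothetical page_views table.
cursor.execute("SELECT country, COUNT(*) FROM page_views GROUP BY country")
for country, views in cursor.fetchall():
    print(country, views)
```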

What are Behavioral Analytics?

Behavioral analytics measure how users engage with digital applications (web, mobile, IoT) and how seemingly unrelated data points can explain or predict outcomes.

What is Big Data?

Big data is a group of data sets too large and complex to manipulate or query with standard tools.

What is Big Data Analytics?

Big data analytics refers to the strategy of analyzing large volumes of data gathered from a wide variety of sources, including social networks, videos, digital images, sensors and sales transaction records.

What is Business Intelligence?

Business Intelligence (BI) parses business data and presents easy-to-digest reports, performance measures, and trends that drive management decisions.

What is Cascading?

Cascading is a platform for developing Big Data applications on Hadoop. It offers a computation engine, systems integration framework, data processing and scheduling capabilities.

What is a CDP?

A Customer Data Platform (CDP) is a type of packaged software which creates a persistent, unified customer database that is accessible to other systems.

What is Cloud Computing?

Cloud computing refers to the practice of using a network of remote servers to store, manage and process data (rather than an on-premise server or a personal computer) with access to such data provided through the Internet (the cloud).

What is Cluster Analysis?

Cluster analysis or clustering is a statistical classification technique or activity that involves grouping a set of objects or data so that those in the same group (called a cluster) are similar to each other, but different from those in other clusters.
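
For illustration, here is a minimal clustering sketch using scikit-learn's k-means implementation (an assumed dependency); the two-dimensional points are invented purely to show the grouping.

```python
from sklearn.cluster import KMeans  # assumes scikit-learn is installed

# Toy two-dimensional data: two loose groups of points.
points = [[1.0, 2.0], [1.2, 1.8], [0.8, 2.1],
          [8.0, 9.0], [8.3, 8.7], [7.9, 9.2]]

# Group the points into two clusters; each point gets a cluster label (0 or 1).
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(points)
print(labels)  # points in the same cluster share a label
```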

What is Comparative Analysis?

Comparative analysis refers to the comparison of two or more processes, documents, data sets or other objects. Pattern analysis, filtering and decision-tree analytics are forms of comparative analysis.

What is Connection Analytics?

Connection analytics is an emerging discipline that helps to discover interrelated connections and influences between people, products, processes, machines and systems within a network by mapping those connections and continuously monitoring interactions between them.
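
One way to make this concrete: the sketch below uses the NetworkX library (an assumption, not something named in this glossary) to map a tiny invented network of interactions and score how connected each node is.

```python
import networkx as nx  # assumes the networkx package is installed

# A tiny, made-up interaction network: edges represent observed interactions.
G = nx.Graph()
G.add_edges_from([("alice", "bob"), ("alice", "carol"),
                  ("bob", "carol"), ("carol", "dave")])

# Degree centrality is a simple measure of how connected/influential each node is.
for node, score in nx.degree_centrality(G).items():
    print(node, round(score, 2))
```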

What is Concurrency/Concurrent Computing?

Concurrency or concurrent computing refers to a form of computing in which multiple computing tasks occur simultaneously or at overlapping times. These tasks may run on a single computer, within a specific application, or be distributed across networks.
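
As a small illustration of overlapping tasks, the Python sketch below runs several simulated jobs concurrently with a thread pool from the standard library; the job itself is invented for the example.

```python
from concurrent.futures import ThreadPoolExecutor
import time

def fetch(source: str) -> str:
    # Simulate an I/O-bound task such as pulling data from a remote system.
    time.sleep(0.1)
    return f"data from {source}"

# Run the tasks at overlapping times instead of one after another.
with ThreadPoolExecutor(max_workers=3) as pool:
    for result in pool.map(fetch, ["sensors", "web_logs", "sales"]):
        print(result)
```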

What is Correlation Analysis?

Correlation analysis refers to the application of statistical analysis and other mathematical techniques to evaluate or measure the relationships between variables.
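
A minimal worked example, using NumPy (an assumed dependency) and invented figures: it computes the Pearson correlation coefficient between advertising spend and sales.

```python
import numpy as np  # assumes NumPy is installed

# Invented monthly figures for two variables.
ad_spend = [10, 12, 15, 17, 20, 24]
sales    = [40, 44, 50, 55, 62, 70]

# The off-diagonal entry of the correlation matrix is the Pearson coefficient.
r = np.corrcoef(ad_spend, sales)[0, 1]
print(f"correlation: {r:.3f}")  # close to 1.0 indicates a strong positive relationship
```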

What is a Data Analyst?

Data analysts serve the critical purpose of helping to operationalize big data within specific functions and processes, with a clear focus on performance trends and operational information.

What is Data Analytics?

Data analytics, also known as advanced analytics or big data analytics, is an autonomous or semi-autonomous inspection of data or content using sophisticated techniques and tools beyond those of traditional business intelligence (BI), to uncover deeper insights, make predictions, or produce recommendations. Techniques include data/text mining, machine learning, pattern matching, forecasting, visualization, semantic analysis, sentiment analysis, network and cluster analysis, multivariate statistics, graph analysis, simulation, complex event processing, and neural networks.

What is Data Architecture?

Data architecture defines how an organization's data is collected, stored, integrated, and consumed. Teradata Unified Data Architecture is the first comprehensive big data architecture; this framework harnesses relational and non-relational repositories via SQL and non-SQL analytics.

What is Data Cleansing?

Data cleansing, or data scrubbing, is the process of detecting and correcting or removing inaccurate data or records from a database.
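
As a hedged sketch of what cleansing can look like in practice, the snippet below uses pandas (an assumed dependency) on an invented customer table to drop duplicates and normalize inconsistent values.

```python
import pandas as pd  # assumes pandas is installed

# An invented, messy customer table with a duplicate row and inconsistent country codes.
raw = pd.DataFrame({
    "customer_id": [1, 2, 2, 3],
    "country":     ["US", "usa", "usa", None],
})

clean = (
    raw.drop_duplicates()                      # remove exact duplicate records
       .assign(country=lambda d: d["country"]
               .replace({"usa": "US"})         # normalize inconsistent values
               .fillna("UNKNOWN"))             # flag missing values explicitly
)
print(clean)
```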

What is Data Gravity?

Data gravity emerges as the volume of data in a repository grows, along with the number of uses that depend on it, making it increasingly onerous and expensive to copy or migrate the data.

What is a Data Lake?

Data lakes complement data warehouses with a design pattern that focuses on original raw data fidelity and long-term storage at a low cost while providing a new form of analytical agility.

What is Data Latency?

Data latency is the delay between when data is generated or updated and when it becomes available for analysis. Low-latency systems can load and update data in near real time while simultaneously supporting query workloads.

What is a Data Mart?

A data mart is a subject-oriented slice of the data warehouse logical model serving a narrow group of users.

What is Data Mining?

Data mining is the process of examining data from different perspectives to uncover hidden patterns and turn them into useful information, typically working against data collected and assembled in common repositories such as data warehouses.
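
A small, self-contained Python example of the kind of pattern mining this describes: counting which pairs of products are most often bought together in a handful of invented transactions.

```python
from collections import Counter
from itertools import combinations

# Invented transactions; real ones would be pulled from a data warehouse.
transactions = [
    {"bread", "milk", "eggs"},
    {"bread", "milk"},
    {"milk", "eggs"},
    {"bread", "milk", "butter"},
]

# Count how often each pair of items appears together (a simple association pattern).
pair_counts = Counter()
for basket in transactions:
    pair_counts.update(combinations(sorted(basket), 2))

for pair, count in pair_counts.most_common(3):
    print(pair, count)
```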

What is Data Modeling?

Data modeling is the process of defining how data is structured and how data elements relate to one another and to real-world business entities. Data models that are tailored to specific industries or business functions can provide a strong foundation or "jump-start" for big data programs and investments.

What are Descriptive Analytics?

Descriptive analytics is the analysis of historical data to determine what happened, what changed, and what patterns can be identified.
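
As an illustrative sketch (using pandas, an assumed dependency, and invented sales figures), descriptive analytics often boils down to summarizing what already happened:

```python
import pandas as pd  # assumes pandas is installed

# Invented historical sales records.
sales = pd.DataFrame({
    "month":   ["Jan", "Jan", "Feb", "Feb", "Mar"],
    "region":  ["East", "West", "East", "West", "East"],
    "revenue": [100, 80, 120, 90, 150],
})

# "What happened?" — total revenue per month and per region.
print(sales.groupby("month", sort=False)["revenue"].sum())
print(sales.groupby("region")["revenue"].sum())
```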

What is a Data Warehouse?

In computing, a data warehouse (DW or DWH), also known as an enterprise data warehouse (EDW), is a system used for reporting and data analysis.

What is Data Volume?

Data volume refers to the amount of data an organization stores and processes; at big data scale, this can mean storing and processing petabytes of data natively and in object storage.

What is Data Warehousing?

A data warehouse is a design pattern or data architecture that tracks integrated, consistent, and detailed data over time, establishing relationships among data elements using metadata and schemas.

What is Deep Learning?

Deep learning, also known as deep neural learning or deep neural network, is an artificial intelligence (AI) function that mimics how the human brain works to process data and create patterns that facilitate decision making.

What is ETL?

Extract, Transform and Load (ETL) refers to the process in data warehousing that concurrently reads (or extracts) data from source systems; converts (or transforms) the data into the proper format for querying and analysis; and loads it into a data warehouse, operational data store or data mart.
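
A deliberately tiny ETL sketch in Python using only the standard library: it extracts rows from a CSV source, transforms the amounts into a consistent numeric format, and loads them into a SQLite table standing in for the warehouse. All names and data are invented.

```python
import csv, io, sqlite3

# Extract: read rows from a source system (a CSV string stands in for it here).
source = "order_id,amount\n1,19.99\n2,5.00\n3,42.50\n"
rows = list(csv.DictReader(io.StringIO(source)))

# Transform: convert amounts to a consistent numeric type.
records = [(int(r["order_id"]), float(r["amount"])) for r in rows]

# Load: write the cleaned records into a table (SQLite stands in for the warehouse).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (order_id INTEGER, amount REAL)")
db.executemany("INSERT INTO orders VALUES (?, ?)", records)
print(db.execute("SELECT SUM(amount) FROM orders").fetchone()[0])
```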

What is an Exabyte?

An extraordinarily large unit of digital data, one Exabyte (EB) is equal to 1,000 Petabytes or one billion gigabytes (GB). Some technologists have estimated that all the words ever spoken by mankind would be equal to five Exabytes.
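
The unit arithmetic is easy to check directly; the short snippet below simply restates the decimal (SI) relationships used in this entry.

```python
# Decimal (SI) storage units, as used in this glossary.
GB = 10**9            # one gigabyte in bytes
TB = 1_000 * GB       # terabyte
PB = 1_000 * TB       # petabyte
EB = 1_000 * PB       # exabyte

print(EB // PB)   # 1,000 petabytes per exabyte
print(EB // GB)   # 1,000,000,000 gigabytes per exabyte
```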

What is Finance Analytics?

Finance analytics, also known as financial analytics, provides differing perspectives on the financial data of a given business, giving insights that can facilitate strategic decisions and actions that improve the overall performance of the business.

What is Hadoop?

Hadoop is a distributed data management platform or open-source software framework for storing and processing big data. It is sometimes described as a cut-down distributed operating system.

What is a Hybrid Cloud?

Hybrid cloud, also known as hybrid cloud architecture, is the combination of on-premises and cloud deployment – whether public cloud, private cloud, or multi-cloud. Whether an organization’s resources include on-premises, private, public, or managed cloud, a hybrid cloud ecosystem can deliver the best of all worlds: on-prem when needed and cloud when needed.

What is Internet of Things (IoT)?

The Internet of Things, also known as IoT, is a concept that describes the connection of everyday physical objects and products to the Internet so that they can identify themselves to other devices (through unique identifiers) and communicate with them.

What is Machine Learning?

Machine learning is a type of artificial intelligence (AI) that provides computers with the ability to learn without being explicitly programmed. It focuses on the development of computer programs that can teach themselves to grow and change when exposed to new data.
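
A minimal sketch, assuming scikit-learn is installed and using invented data: a decision tree learns a rule from labeled examples rather than being explicitly programmed with one.

```python
from sklearn.tree import DecisionTreeClassifier  # assumes scikit-learn is installed

# Invented training data: [hours_active, purchases] -> churned (1) or retained (0).
X_train = [[1, 0], [2, 1], [10, 5], [12, 7], [0, 0], [15, 9]]
y_train = [1, 1, 0, 0, 1, 0]

# The model infers the decision rule from the examples.
model = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)
print(model.predict([[3, 1], [11, 6]]))  # predictions for new, unseen customers
```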

What is Master Data Management (MDM)?

Master Data Management (MDM) provides a unified view of data across multiple systems to meet the analytic needs of a global business. MDM creates singular views of master and reference data, whether it describes customers, products, suppliers, locations, or any other important attribute.

What is Metadata?

Metadata is data that describes other data in a structured, consistent way, so that large amounts of data can be collected, stored, and analyzed over time.
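
A trivial illustration: the dictionary below is metadata describing a hypothetical table rather than the data itself.

```python
# Metadata about a hypothetical table: it describes the data, not the data itself.
table_metadata = {
    "name": "sales_2023",
    "owner": "finance_team",
    "row_count": 1_250_000,
    "columns": {
        "order_id": "INTEGER",
        "order_date": "DATE",
        "amount": "DECIMAL(10,2)",
    },
    "last_loaded": "2024-01-05T02:00:00Z",
}
print(sorted(table_metadata["columns"]))
```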

What is Mixed Workload?

A mixed workload refers to the ability to support multiple applications with different SLAs in a single environment.

What is MongoDB?

MongoDB is a cross-platform, open-source database that uses a document-oriented data model, rather than a traditional table-based relational database structure. This type of model makes the integration of structured and unstructured data easier and faster.
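
A rough sketch of the document model, assuming the pymongo driver is installed and a MongoDB server is running locally; the collection and fields are invented.

```python
from pymongo import MongoClient  # assumes pymongo is installed and MongoDB is running locally

client = MongoClient("mongodb://localhost:27017")
customers = client["demo_db"]["customers"]

# Documents are flexible JSON-like records; fields need not match a fixed schema.
customers.insert_one({"name": "Alice", "tags": ["vip"], "visits": 12})
customers.insert_one({"name": "Bob", "email": "bob@example.com"})

print(customers.find_one({"name": "Alice"}))
```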

What is Natural Language Processing?

A branch of artificial intelligence, natural language processing (NLP) deals with making human language (in both written and spoken forms) comprehensible to computers.

What is Pattern Recognition?

Pattern recognition occurs when an algorithm locates recurrences or regularities within large data sets or across disparate data sets. It is closely linked to, and sometimes considered synonymous with, machine learning and data mining.

What is a Petabyte?

An extremely large unit of digital data, one Petabyte is equal to 1,000 Terabytes. Some estimates hold that a Petabyte is the equivalent of 20 million tall filing cabinets or 500 billion pages of standard printed text.

What are Predictive Analytics?

Predictive analytics refers to the analysis of big data to make predictions and determine the likelihood of future outcomes, trends or events.

What are Prescriptive Analytics?

A type or extension of predictive analytics, prescriptive analytics is used to recommend or prescribe specific actions when certain information states are reached or conditions are met.

What is Python?

Python is an interpreted, object-oriented, high-level programming language with dynamic semantics. Python has a reputation as a beginner-friendly language, replacing Java as the most widely used introductory language because it handles much of the complexity for the user, allowing beginners to focus on fully grasping programming concepts rather than minute details.
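
A small self-contained example of the readability the entry refers to: counting word frequencies takes only a few lines.

```python
# Count word frequencies in a few lines of readable code.
text = "to be or not to be"
counts = {}
for word in text.split():
    counts[word] = counts.get(word, 0) + 1
print(counts)  # {'to': 2, 'be': 2, 'or': 1, 'not': 1}
```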

What is R?

R is an open-source programming language for statistical analysis. It includes a command line interface and several graphical interfaces. Popular algorithm types include linear and nonlinear modeling, time-series analysis, classification and clustering.

What is Retail Analytics?

Retail analytics is the analysis of data generated by retail operations with the goal of making business decisions that drive profitability. The use of retail analytics developed in response to the retail transformation driven by unprecedented changes in consumer behavior, intensified pressure on margins, the changing role of stores, and intensified competition across both online and offline channels.

What is Risk Management?

Risk management, sometimes referred to as risk mitigation, is the process of calculating the maximum acceptable level of overall risk to and from an activity. Risk assessment techniques are then used to pinpoint the initial level of risk and, if it is found to be excessive, to develop a strategy for mitigating specific individual risks until the collective risk is pared down to an acceptable level.

What is RTIM?

RTIM, also known as Real Time Interaction Manager or Management, uses real-time customer interactions, predictive modeling, and machine learning to deliver consistent, personalized customer experiences across channels.

What is Semi-Structured Data?

Semi-structured data does not follow the format of a tabular data model or relational databases because it does not have a fixed schema. However, the data is not completely raw or unstructured, and does contain some structural elements such as tags and organizational metadata that make it easier to analyze.
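
JSON is a common example: the sketch below (standard library only, invented record) shows data that carries its own tags and nesting but no fixed relational schema.

```python
import json

# A semi-structured record: named fields and nesting, but no fixed relational schema.
record = json.loads("""
{
  "user": "alice",
  "events": [
    {"type": "click", "page": "/home"},
    {"type": "purchase", "amount": 19.99}
  ]
}
""")

# The embedded tags make it easy to navigate despite the lack of a rigid schema.
print(record["user"], len(record["events"]))
```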

What is Sentiment Analysis?

Sentiment analysis is the capture and tracking of opinions, emotions or feelings expressed by consumers engaged in various types of interactions, such as social media posts, customer service calls and surveys.
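
Real systems use trained models, but a toy lexicon-based scorer (invented word lists, standard library only) shows the basic idea:

```python
# A toy lexicon-based sentiment scorer; production systems use trained models instead.
POSITIVE = {"great", "love", "excellent", "happy"}
NEGATIVE = {"poor", "hate", "terrible", "slow"}

def score(text: str) -> int:
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

print(score("Great product, love it"))         # positive score
print(score("Terrible support and slow app"))  # negative score
```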

What is Structured Data?

Structured data refers to data sets with strong and consistent organization. Structured data is managed by structured query language (SQL), by which users can easily search and manipulate the data.
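
A minimal illustration using SQLite from the Python standard library: rows conform to a declared schema and are queried with SQL (the table and values are invented).

```python
import sqlite3

db = sqlite3.connect(":memory:")
# Structured data: every row conforms to the declared columns and types.
db.execute("CREATE TABLE customers (id INTEGER, name TEXT, country TEXT)")
db.executemany("INSERT INTO customers VALUES (?, ?, ?)",
               [(1, "Alice", "US"), (2, "Bob", "FR"), (3, "Chloe", "FR")])

# SQL makes the data easy to search and manipulate.
for row in db.execute("SELECT country, COUNT(*) FROM customers GROUP BY country"):
    print(row)
```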

What is a Terabyte?

A relatively large unit of digital data, one Terabyte (TB) equals 1,000 Gigabytes. It has been estimated that 10 Terabytes could hold the entire printed collection of the U.S. Library of Congress, while a single TB could hold 1,000 copies of the Encyclopaedia Britannica.

What is Unstructured Data?

Unstructured data refers to unfiltered information with no fixed organizing principle. It is often called raw data. Common examples are web logs, text documents, images, video, and audio files; XML and JSON are sometimes grouped here as well, although they are better described as semi-structured.

What is VPC?

VPC stands for virtual private cloud. A VPC is a private virtual network space hosted within a public cloud environment; each VPC is secure and logically isolated from other virtual networks in the same public cloud.

What are the V's?

Big data – and the business challenges and opportunities associated with it – are often discussed or described in the context of multiple V's, most commonly volume, velocity, and variety, with veracity and value often added.

Want to see these concepts in action?

Explore Vantage