Using Local Intelligence to Actively Monitor Distributed Data Sources in Decision Making Architectures

  • Emma Chavez-Mora

Student thesis: Doctoral Thesis

Abstract

In data warehouse environments, operational processes move data from a variety of sources into a central repository, usually called the warehouse. These processes include data export, data preparation, and data loading, and are typically performed with Extraction, Transformation and Loading (ETL) tools. Most Business Intelligence (BI) applications must consolidate data before performing any sort of data analysis. They therefore use centralized decision support architectures in which a data warehouse is commonly employed to consolidate, analyse and report data, supporting decision making based on all the information collected from the various data sources.

ETL functions have been described as a critical component of a BI solution, as they are responsible for retrieving data from one or more sources, preparing it by normalizing and transforming it, and then inserting it into another repository, usually a data warehouse, for analysis and reporting [152][56]. A key characteristic of the ETL mechanisms used in BI is the use of incremental loading techniques. This movement of data, commonly called data refreshment, is usually driven from the central repository, which is programmed to request ("query") data from the sources in a timely manner. Consequently, data refreshments are time dependent.
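The pull-based incremental refreshment described above can be sketched as follows. This is a minimal illustration, not the thesis's implementation: the table names, column names, and the timestamp-watermark scheme are all assumptions chosen for the example, using SQLite in place of real source and warehouse systems.

```python
# Sketch of a warehouse-driven incremental "data refreshment":
# the central repository periodically queries each source for rows
# changed since the last load (tracked by a watermark timestamp).
# All schema and table names here are illustrative assumptions.
import sqlite3


def extract_incremental(source: sqlite3.Connection, watermark: int):
    """Extraction: pull only rows newer than the last-loaded watermark."""
    cur = source.execute(
        "SELECT id, amount, updated_at FROM sales WHERE updated_at > ?",
        (watermark,),
    )
    return cur.fetchall()


def transform(rows):
    """Transformation: normalize amounts (e.g. cents to dollars)."""
    return [(rid, amount / 100.0, ts) for rid, amount, ts in rows]


def load(warehouse: sqlite3.Connection, rows):
    """Loading: upsert the prepared rows into the warehouse fact table."""
    warehouse.executemany(
        "INSERT OR REPLACE INTO fact_sales (id, amount_usd, updated_at) "
        "VALUES (?, ?, ?)",
        rows,
    )
    warehouse.commit()


def refresh(source, warehouse, watermark):
    """One scheduled refreshment cycle; returns the advanced watermark."""
    rows = transform(extract_incremental(source, watermark))
    load(warehouse, rows)
    return max((ts for _, _, ts in rows), default=watermark)


# Demo with in-memory databases standing in for a source and a warehouse.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, "
            "amount INTEGER, updated_at INTEGER)")
src.executemany("INSERT INTO sales VALUES (?, ?, ?)",
                [(1, 1500, 10), (2, 2500, 20)])

wh = sqlite3.connect(":memory:")
wh.execute("CREATE TABLE fact_sales (id INTEGER PRIMARY KEY, "
           "amount_usd REAL, updated_at INTEGER)")

wm = refresh(src, wh, watermark=0)   # first cycle loads both rows
src.execute("INSERT INTO sales VALUES (3, 990, 30)")
wm = refresh(src, wh, watermark=wm)  # second cycle pulls only the new row
```

Because each cycle is driven by the warehouse on its own schedule rather than by events at the sources, any change at a source is invisible until the next request: this is the time dependence of data refreshments that the thesis takes as its starting point.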
Date of Award: 9 Feb 2013
Original language: English
Supervisor: Gavin Finnie (Supervisor)
