Consumer

Consumer of data. In most situations the consumer gets access to processed data (cleaned, filtered and transformed) based on a standardized model.
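A minimal sketch in Python of this role, with hypothetical record and field names: the consumer only works against the standardized model and never touches the raw sources.

from dataclasses import dataclass
from typing import Iterable

# Hypothetical standardized model exposed to consumers; field names are illustrative.
@dataclass
class StandardizedRecord:
    record_id: str
    category: str
    value: float

class Consumer:
    """Business role that only works with processed, standardized data."""

    def consume(self, records: Iterable[StandardizedRecord]) -> float:
        # The consumer never touches raw sources; it only aggregates
        # data that has already been cleaned, filtered and transformed.
        return sum(r.value for r in records)

if __name__ == "__main__":
    sample = [StandardizedRecord("1", "sales", 10.0),
              StandardizedRecord("2", "sales", 5.5)]
    print(Consumer().consume(sample))  # 15.5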

Package BD-DWH Integration ABB
Author Bert Dingemans
Alias
Stereotypes BusinessRole

Diagrams

Serial BDP-DWH ABB

Serial integration is implemented by introducing a big data platform for the transformation and extraction of unstructured and semi-structured data as a source for the EDWH functionality. A sketch of this pattern follows the characteristics below.

Characteristics:
- Introduction of the big data platform is relatively easy since it is an extra layer added to the DWH functionality
- Relatively simple big data patterns are available because the data warehouse is always the source
- Introducing big data solutions for functionalities other than the DWH is not possible
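A minimal sketch of the serial pattern in Python, with hypothetical function and feed names: the big data platform transforms a semi-structured feed and the result is loaded into the data warehouse, which remains the only place consumers read from.

import json

# Hypothetical in-memory "data warehouse": a list of structured rows.
data_warehouse = []

def bdp_transform(raw_lines):
    """Big data platform step: parse semi-structured (JSON lines) input
    into structured rows suitable for the data warehouse."""
    for line in raw_lines:
        doc = json.loads(line)
        yield {"id": doc["id"], "amount": float(doc.get("amount", 0))}

def load_into_dwh(rows):
    """EDWH step: the transformed rows are appended to the warehouse.
    Consumers only ever read from the warehouse, never from the raw feed."""
    data_warehouse.extend(rows)

raw_feed = ['{"id": "a", "amount": "12.5"}', '{"id": "b"}']
load_into_dwh(bdp_transform(raw_feed))
print(data_warehouse)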

Appliance BDP-DWH ABB

In the appliance integration of a big data platform with DWH functionality, the appliance acts as a black box in which all functionality is integrated into a (proprietary) solution. This solution is configured for optimal performance of transformation and analysis. A rough illustration follows the characteristics below.

Characteristics:
- The appliance is developed, configured and often maintained by an external supplier
- It is introduced as a fully integrated solution; existing implementations of the DWH therefore have to migrate to this solution
- Appliances are often introduced when a cloud solution is selected for the data platform
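A rough illustration in Python, with hypothetical class and method names: the appliance is one opaque interface where raw data goes in and analysis results come out, while transformation and storage stay hidden inside the proprietary solution.

class DwhAppliance:
    """Black-box appliance: transformation, storage and analysis are
    integrated internally and are not visible to the caller."""

    def __init__(self):
        self._internal_store = []  # proprietary internals, hidden from users

    def load(self, raw_records):
        # Internally the appliance cleans and stores the data; callers
        # only see that the load succeeded.
        self._internal_store.extend(r.strip().lower() for r in raw_records)

    def analyse(self, keyword):
        # Analysis is executed inside the appliance as well.
        return sum(1 for r in self._internal_store if keyword in r)

appliance = DwhAppliance()
appliance.load(["Sensor A OK", "Sensor B FAIL", "sensor a ok"])
print(appliance.analyse("ok"))  # 2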

Parallel BDP-DWH ABB

Parallel integration extends the DWH functionality with the Big Data Platform. This extension makes it possible to use both functionalities side by side, as sketched below.

Characteristics:
- Easy (incremental) introduction of the Big Data functionality
- Integration of both functionalities requires attention to the interconnect functionality, because this can become a bottleneck in performance and configuration
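A minimal sketch of the parallel pattern in Python, with hypothetical store and table names: both platforms hold their own data, and an interconnect function moves selected results from the big data platform into the warehouse.

# Two platforms used side by side (hypothetical, in-memory stand-ins).
dwh_tables = {"sales": [{"id": 1, "amount": 100.0}]}
bdp_datasets = {"clickstream": [{"user": "u1", "page": "/home"}]}

def interconnect(table_name, rows):
    """Interconnect functionality: moves selected results from the big
    data platform into the warehouse. In practice this link can become
    a bottleneck for performance and configuration."""
    dwh_tables.setdefault(table_name, []).extend(rows)

# The big data platform produces an aggregate that the DWH side also needs.
page_counts = {}
for event in bdp_datasets["clickstream"]:
    page_counts[event["page"]] = page_counts.get(event["page"], 0) + 1

interconnect("page_stats", [{"page": p, "visits": n} for p, n in page_counts.items()])
print(dwh_tables["page_stats"])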

Virtualisation BDP-DWH ABB

This integration pattern is closely related to the parallel integration, but an extra layer is introduced for the virtualisation and standardisation of data extraction by consumers of the data; a sketch follows the characteristics below.

Characteristics:
- The virtualisation layer encapsulates the internal configuration of the two platforms
- The virtualisation layer requires a standardized data or object model for extraction by the consumers
- The virtualisation layer can become a bottleneck for a number of qualities, such as performance and integratability
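A minimal sketch of the virtualisation layer in Python, with hypothetical data and field names: the layer hides which platform holds which data and returns every answer in one standardized model.

# Hypothetical back ends behind the virtualisation layer.
dwh_rows = [{"customer": "acme", "revenue": 100.0}]
bdp_rows = [{"cust_name": "acme", "events": 42}]

def virtual_query(customer):
    """Virtualisation layer: hides which platform holds which data and
    returns results in one standardized model for all consumers."""
    revenue = sum(r["revenue"] for r in dwh_rows if r["customer"] == customer)
    events = sum(r["events"] for r in bdp_rows if r["cust_name"] == customer)
    # Standardized object model exposed to consumers.
    return {"customer": customer, "revenue": revenue, "events": events}

print(virtual_query("acme"))  # {'customer': 'acme', 'revenue': 100.0, 'events': 42}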

Data Integration General view

The hourglass model is a specific model for the transformation of data sources into a standardized model in a target datastore. It is a simplified implementation of a layered Big Data architecture. The hourglass model can be used to model specific implementations of data transformation in a pattern called the data pipe, as sketched below. A number of other diagrams give a detailed view of these implementations in projects such as Digital Transformation, TDP, MaxLimit and others.
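A minimal sketch of the hourglass/data pipe idea in Python, with hypothetical source and field names: several source formats are narrowed down to one standardized record structure and then loaded into a target datastore.

import json

# Hourglass / data pipe sketch: many source formats are narrowed down to
# one standardized model and then loaded into a target datastore.
sources = {
    "csv_feed": ["id;amount", "a;10", "b;20"],
    "json_feed": ['{"id": "c", "amount": 5}'],
}

def to_standard_model(name, payload):
    """Narrow waist of the hourglass: every source is mapped to the same
    standardized record structure."""
    if name == "csv_feed":
        header, *rows = payload
        for row in rows:
            rec_id, amount = row.split(";")
            yield {"id": rec_id, "amount": float(amount)}
    elif name == "json_feed":
        for line in payload:
            doc = json.loads(line)
            yield {"id": doc["id"], "amount": float(doc["amount"])}

target_datastore = []
for name, payload in sources.items():
    target_datastore.extend(to_standard_model(name, payload))
print(target_datastore)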

Data Integration Consumers overview

Overview of internal and external consumers of the standardized datasets.

Linked elements

