Big Data Pipeline

--

Version 1.0
Creation date 02-05-2021

Automated Dataset Execution

How can the execution of a series of data processing activities, from data ingress to egress, be automated? (A minimal orchestration sketch follows the details below.)

Author Bert Dingemans
Alias --
Stereotypes ApplicationFunction
Details of Automated Dataset Execution
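
The pattern is essentially an orchestration concern: each processing activity becomes a step, and a runner executes the steps in order so that no manual hand-off is needed between ingress and egress. A minimal Python sketch under that reading, with hypothetical stage functions ingest, transform and egress (a real platform would substitute its own steps and a proper scheduler):

    def run_pipeline(stages, payload=None):
        """Execute each stage in order, feeding each stage's output to the next."""
        for stage in stages:
            payload = stage(payload)
        return payload

    # Hypothetical stages covering ingress -> processing -> egress.
    def ingest(_):
        return ["raw-record-1", "raw-record-2"]

    def transform(records):
        return [r.upper() for r in records]

    def egress(records):
        print("delivered:", records)
        return records

    if __name__ == "__main__":
        run_pipeline([ingest, transform, egress])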

Big Data Pipeline*

The Big Data Pipeline compound pattern generally comprises multiple stages whose objectives are to break complex processing operations down into modular steps, both for easier understanding and debugging and to remain amenable to future data processing requirements. (A staged sketch follows the details below.)

Author Bert Dingemans
Alias --
Stereotypes ApplicationFunction
Details of Big Data Pipeline*
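
The modularity pays off when each stage is individually observable and new requirements arrive as new stages rather than edits to existing ones. A sketch under those assumptions, using a hypothetical stage wrapper and two invented stages (cleanse, dedupe):

    import logging

    logging.basicConfig(level=logging.INFO)
    log = logging.getLogger("pipeline")

    def stage(name, fn):
        """Wrap a step so every modular stage logs record counts in and out,
        which is what makes a staged pipeline easier to debug."""
        def wrapped(data):
            log.info("stage %s: %d records in", name, len(data))
            out = fn(data)
            log.info("stage %s: %d records out", name, len(out))
            return out
        return wrapped

    # Hypothetical stages; a future requirement becomes one more entry here.
    cleanse = stage("cleanse", lambda rows: [r.strip() for r in rows])
    dedupe = stage("dedupe", lambda rows: list(dict.fromkeys(rows)))

    data = [" a", "a ", "b"]
    for step in (cleanse, dedupe):
        data = step(data)
    print(data)  # ['a', 'b']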

Big Data Processing Environment*

The Big Data Processing Environment represents an environment capable of handling the full range of distinct requirements of large-scale dataset processing. (An illustrative routing sketch follows the details below.)

Author Bert Dingemans
Alias --
Stereotypes ApplicationFunction
Details of Big Data Processing Environment*
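
One way to read "distinct requirements" is that the environment must route each workload to an execution model that suits it, for example streaming versus batch. A minimal Python sketch, with an invented Workload descriptor and invented engine names purely for illustration:

    from dataclasses import dataclass

    @dataclass
    class Workload:
        name: str
        streaming: bool
        data_volume_gb: int

    def select_engine(w):
        """Illustrative routing only: pick an execution model per workload profile."""
        if w.streaming:
            return "stream-processor"
        if w.data_volume_gb > 1000:
            return "distributed-batch-cluster"
        return "single-node-batch"

    for w in (Workload("clickstream", True, 50),
              Workload("monthly-rollup", False, 5000)):
        print(w.name, "->", select_engine(w))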

Poly Sink*

The Poly Sink compound pattern represents a part of a Big Data platform capable of egressing high-volume, high-velocity and high-variety data to downstream enterprise systems. (A fan-out sketch follows the details below.)

Author Bert Dingemans
Alias --
Stereotypes ApplicationFunction
Details of Poly Sink*
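
The defining feature is fan-out: the same processed records leave the platform in whatever shape each downstream system expects. A sketch under that assumption, with a hypothetical sink registry mapping two invented targets to JSON Lines and CSV serialisers:

    import csv
    import io
    import json

    def to_jsonl(records):
        return "\n".join(json.dumps(r) for r in records)

    def to_csv(records):
        buf = io.StringIO()
        writer = csv.DictWriter(buf, fieldnames=sorted(records[0]), lineterminator="\n")
        writer.writeheader()
        writer.writerows(records)
        return buf.getvalue()

    # Hypothetical sink registry: each downstream system gets its own format.
    SINKS = {"analytics-feed": to_jsonl, "reporting-export": to_csv}

    def egress(records, targets):
        """Fan the same records out to every requested downstream sink."""
        return {t: SINKS[t](records) for t in targets}

    rows = [{"id": 1, "value": "a"}, {"id": 2, "value": "b"}]
    for target, payload in egress(rows, list(SINKS)).items():
        print(target, "->", payload.replace("\n", " | "))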

Poly Source*

The Poly Source compound pattern represents a part of a Big Data platform capable of ingesting high-volume and high-velocity data from a range of structured, unstructured and semi-structured data sources. (An ingestion sketch follows the details below.)

Author Bert Dingemans
Alias --
Stereotypes ApplicationFunction
Details of Poly Source*
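
The mirror image of the Poly Sink: one ingestion entry point normalises heterogeneous inputs into a common record shape. A sketch assuming three invented readers for structured (CSV), semi-structured (JSON) and unstructured (plain text) payloads:

    import csv
    import io
    import json

    def read_csv(text):   # structured
        return list(csv.DictReader(io.StringIO(text)))

    def read_json(text):  # semi-structured
        return json.loads(text)

    def read_text(text):  # unstructured
        return [{"line": ln} for ln in text.splitlines()]

    READERS = {"csv": read_csv, "json": read_json, "text": read_text}

    def ingest(payload, fmt):
        """Single ingestion entry point over heterogeneous source formats."""
        return READERS[fmt](payload)

    print(ingest("id,v\n1,a", "csv"))
    print(ingest('[{"id": 2}]', "json"))
    print(ingest("free text line", "text"))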

Poly Storage*

The Poly Storage compound pattern represents a part of a Big Data platform capable of storing high-volume, high-velocity and high-variety data. (A routing sketch follows the details below.)

Author Bert Dingemans
Alias --
Stereotypes ApplicationFunction
Details of Poly Storage*
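
High-variety storage usually means several store types behind one facade, each suited to a particular data shape. A sketch with in-memory dictionaries standing in for hypothetical key-value, document and blob tiers:

    # In-memory stand-ins for the heterogeneous stores a Poly Storage
    # platform combines (hypothetical key-value, document and blob tiers).
    key_value_store = {}
    document_store = []
    blob_store = {}

    def store(item):
        """Route each item to the tier that suits its shape."""
        if isinstance(item, bytes):                     # raw, high-variety blobs
            blob_store["blob-%d" % len(blob_store)] = item
        elif isinstance(item, dict) and "key" in item:  # keyed lookups
            key_value_store[item["key"]] = item
        else:                                           # schemaless documents
            document_store.append(item)

    for item in ({"key": "k1", "v": 1}, {"free": "form"}, b"\x00\x01"):
        store(item)
    print(len(key_value_store), len(document_store), len(blob_store))  # 1 1 1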