On the Trade-off Between Robustness and Complexity in Data Pipelines
Paper in proceedings, 2021

Data pipelines play an important role throughout the data management process, whether they are used for data analytics or machine learning. Data-driven organizations can make use of data pipelines to produce high-quality data applications. Moreover, data pipelines ensure end-to-end velocity by automating the processes involved in extracting, transforming, combining, validating, and loading data for further analysis and visualization. However, the robustness of data pipelines is equally important, since unhealthy data pipelines can add noise to the input data. This paper identifies the essential elements of a robust data pipeline and analyses the trade-off between data pipeline robustness and complexity.
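The abstract describes a pipeline as automated extract, transform, combine, validate, and load stages, with robustness hinging on keeping unhealthy stages from adding noise to downstream data. As a minimal, hypothetical Python sketch of that idea (not taken from the paper; every name, value, and threshold below is illustrative), one way a dedicated validation stage can guard downstream steps is:

# A minimal sketch, not taken from the paper: an ETL-style pipeline in which a
# dedicated validation stage keeps noisy records from propagating downstream.
# All names, values, and thresholds here are hypothetical.
from dataclasses import dataclass
from typing import List


@dataclass
class Record:
    sensor_id: str
    value: float


def extract() -> List[Record]:
    # Stand-in for reading from a source system (file, API, message bus).
    return [Record("a", 1.0), Record("b", float("nan")), Record("c", 2.5)]


def validate(records: List[Record]) -> List[Record]:
    # Robustness element: drop records that would add noise downstream
    # (here, NaN values; r.value == r.value is False only for NaN).
    return [r for r in records if r.value == r.value]


def transform(records: List[Record]) -> List[Record]:
    # Illustrative transformation: clamp values to an assumed valid range.
    return [Record(r.sensor_id, min(max(r.value, 0.0), 10.0)) for r in records]


def load(records: List[Record]) -> None:
    # Stand-in for writing to a warehouse or feature store.
    for r in records:
        print(f"loaded {r.sensor_id}: {r.value}")


if __name__ == "__main__":
    records = extract()
    records = validate(records)  # extra check improves robustness, but is one more node to maintain
    records = transform(records)
    load(records)

Every guard of this kind adds another node to the pipeline, which is the robustness-versus-complexity trade-off the paper analyses.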

Keywords

Robustness

Data quality

Complexity

Composite nodes

Trade-off

Authors

Aiswarya Raj Munappy

Testing, Requirements, Innovation and Psychology

Jan Bosch

Testing, Requirements, Innovation and Psychology

Helena Holmström Olsson

Chalmers, Computer Science and Engineering (Chalmers), Software Engineering (Chalmers)

Communications in Computer and Information Science

1865-0929 (ISSN), 1865-0937 (eISSN)

Vol. 1439, pp. 401-415
978-3-030-85346-4 (ISBN)

Quality of Information and Communications Technology
Faro, Portugal

Software Engineering for AI/ML/DL

Chalmers AI Research Centre (CHAIR), 2019-11-01 -- 2022-11-01.

Subject Categories

Other Computer and Information Science

Bioinformatics (Computational Biology)

Media Engineering

DOI

10.1007/978-3-030-85347-1_29

Latest update

3/21/2023