On the Trade-off Between Robustness and Complexity in Data Pipelines
Paper in proceedings, 2021

Data pipelines play an important role throughout the data management process, whether they are used for data analytics or machine learning. Data-driven organizations can use data pipelines to produce good-quality data applications. Moreover, data pipelines ensure end-to-end velocity by automating the processes involved in extracting, transforming, combining, validating, and loading data for further analysis and visualization. However, the robustness of data pipelines is equally important, since unhealthy data pipelines can add noise to the input data. This paper identifies the essential elements of a robust data pipeline and analyses the trade-off between data pipeline robustness and complexity.
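The extract-transform-validate-load flow the abstract describes can be sketched minimally as below. All function names, the toy schema, and the quarantine behaviour are illustrative assumptions, not taken from the paper; the point is only that a robust pipeline validates and quarantines bad records rather than passing noise downstream.

```python
# Illustrative sketch of an extract -> transform -> validate -> load pipeline.
# The schema ("id", "value") and all names here are assumptions for the example.

def extract(raw_rows):
    """Parse raw CSV-like strings into records."""
    return [dict(zip(("id", "value"), row.split(","))) for row in raw_rows]

def transform(records):
    """Cast fields to their expected types; mark unparsable values as None."""
    out = []
    for r in records:
        try:
            out.append({"id": int(r["id"]), "value": float(r["value"])})
        except (ValueError, KeyError):
            out.append({"id": r.get("id"), "value": None})
    return out

def validate(records):
    """Split records into healthy and noisy ones; a robust pipeline
    quarantines bad records instead of silently loading them."""
    good = [r for r in records if r["value"] is not None]
    bad = [r for r in records if r["value"] is None]
    return good, bad

def load(records, sink):
    """Append validated records to the target store; return the count loaded."""
    sink.extend(records)
    return len(records)

sink = []
good, quarantined = validate(transform(extract(["1,3.5", "2,oops", "3,7.0"])))
loaded = load(good, sink)
print(loaded, len(quarantined))  # 2 records loaded, 1 quarantined
```

Each added stage (here, validation and quarantining) makes the pipeline more robust but also more complex, which is the trade-off the paper analyses.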

Keywords: Complexity, Trade-off, Data quality, Composite nodes, Robustness

Authors

Aiswarya Raj Munappy

Testing, Requirements, Innovation and Psychology

Jan Bosch

Testing, Requirements, Innovation and Psychology

Helena Holmström Olsson

Chalmers, Computer Science and Engineering (Chalmers), Software Engineering (Chalmers)

International Conference on the Quality of Information and Communications Technology

1865-0929 (ISSN) 1865-0937 (eISSN)

Vol. 1439, pp. 401-415
978-3-030-85346-4 (ISBN)

Quality of Information and Communications Technology, Faro, Portugal

Software Engineering for AI/ML/DL

Chalmers AI Research Centre (CHAIR), 2019-11-01 -- 2022-11-01.

Subject Categories

Other Computer and Information Science

Bioinformatics (Computational Biology)

Media Engineering

DOI

10.1007/978-3-030-85347-1

Latest update: 12/17/2021