Data Pipelines with Apache Airflow

Authors: Bas P. Harenslak, Julian Rutger de Ruiter

Publisher: Manning

Publication Date: 2021-04-27

Language: English

Print Length: 480 pages

ISBN-10: 1617296902

ISBN-13: 9781617296901


Book Description

Data Pipelines with Apache Airflow teaches you how to build and maintain effective data pipelines.

Summary
A successful pipeline moves data efficiently, minimizing pauses and blockages between tasks and keeping every process along the way operational. Apache Airflow provides a single customizable environment for building and managing data pipelines, eliminating the need for a hodgepodge collection of tools, snowflake code, and homegrown processes. Using real-world scenarios and examples, Data Pipelines with Apache Airflow teaches you how to simplify and automate data pipelines, reduce operational overhead, and smoothly integrate all the technologies in your stack.

Purchase of the print book includes a free eBook in PDF, Kindle, and ePub formats from Manning Publications.

About the technology
Data pipelines manage the flow of data from initial collection through consolidation, cleaning, analysis, visualization, and more. Apache Airflow provides a single platform you can use to design, implement, monitor, and maintain your pipelines. Its easy-to-use UI, plug-and-play options, and flexible Python scripting make Airflow perfect for any data management task.

About the book
Data Pipelines with Apache Airflow teaches you how to build and maintain effective data pipelines. You’ll explore the most common usage patterns, including aggregating multiple data sources, connecting to and from data lakes, and cloud deployment. Part reference and part tutorial, this practical guide covers every aspect of the directed acyclic graphs (DAGs) that power Airflow, and how to customize them for your pipeline’s needs.
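To make the DAG concept concrete, here is a minimal sketch of the kind of pipeline definition the book works with. It uses the Airflow 2 API; the DAG id, task names, and schedule are illustrative, not taken from the book:

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# A DAG is just Python: tasks plus the dependencies between them.
with DAG(
    dag_id="example_pipeline",           # hypothetical name, for illustration
    start_date=datetime(2021, 1, 1),
    schedule_interval="@daily",          # run once per day
    catchup=False,                       # skip runs for past intervals
) as dag:
    fetch = BashOperator(task_id="fetch_data", bash_command="echo fetching")
    process = BashOperator(task_id="process_data", bash_command="echo processing")

    # >> declares an edge of the graph: fetch_data runs before process_data.
    fetch >> process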

What’s inside
Build, test, and deploy Airflow pipelines as DAGs
Automate moving and transforming data
Analyze historical datasets using backfilling (see the sketch after this list)
Develop custom components
Set up Airflow in production environments
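
The backfilling item above is worth a concrete sketch. With a start_date in the past and catchup enabled, Airflow's scheduler creates one run per missed schedule interval, so a new DAG can process historical data from day one. The DAG id and dates here are hypothetical:

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="historical_load",            # hypothetical name
    start_date=datetime(2021, 1, 1),     # a date in the past
    schedule_interval="@daily",
    catchup=True,                        # create a run for every past interval
) as dag:
    # {{ ds }} is Airflow's templated execution date, so each backfilled
    # run processes the data for its own day.
    load = BashOperator(task_id="load_day", bash_command="echo loading {{ ds }}")

# A specific date range can also be replayed from the Airflow 2 CLI, e.g.:
#   airflow dags backfill -s 2021-01-01 -e 2021-01-31 historical_load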

About the reader
For DevOps, data engineers, machine learning engineers, and sysadmins with intermediate Python skills.

About the author
Bas Harenslak and Julian de Ruiter are data engineers with extensive experience using Airflow to develop pipelines for major companies. Bas is also an Airflow committer.


Table of Contents

PART 1 – GETTING STARTED

1 Meet Apache Airflow
2 Anatomy of an Airflow DAG
3 Scheduling in Airflow
4 Templating tasks using the Airflow context
5 Defining dependencies between tasks

PART 2 – BEYOND THE BASICS

6 Triggering workflows
7 Communicating with external systems
8 Building custom components
9 Testing
10 Running tasks in containers

PART 3 – AIRFLOW IN PRACTICE

11 Best practices
12 Operating Airflow in production
13 Securing Airflow
14 Project: Finding the fastest way to get around NYC

PART 4 – IN THE CLOUDS

15 Airflow in the clouds
16 Airflow on AWS
17 Airflow on Azure
18 Airflow in GCP

From the Back Cover

Pipelines can be challenging to manage, especially when your data has to flow through a collection of application components, servers, and cloud services. Airflow lets you schedule, restart, and backfill pipelines, and its easy-to-use UI and Python-scripted workflows have users praising its incredible flexibility. Data Pipelines with Apache Airflow takes you through best practices for creating pipelines for a range of tasks, including data lakes, cloud deployments, and data science.

Data Pipelines with Apache Airflow teaches you the ins and outs of the directed acyclic graphs (DAGs) that power Airflow, and how to write your own DAGs to meet the needs of your projects. With complete coverage of both foundational and lesser-known features, when you’re done you’ll be set to start using Airflow for seamless data pipeline development and management.

Key Features

Framework foundation and best practices

Airflow’s execution and dependency system

Testing Airflow DAGs

Running Airflow in production

For data-savvy developers, DevOps and data engineers, and system administrators with intermediate Python skills.

About the technology

Data pipelines are used to extract, transform, and load data to and from multiple sources, routing it wherever it’s needed, whether that’s visualization tools, business intelligence dashboards, or machine learning models. Airflow streamlines the whole process, giving you one tool for programmatically developing and monitoring batch data pipelines, and integrating all the pieces you use in your data stack.

About the Author

Bas Harenslak and Julian de Ruiter are data engineers with extensive experience using Airflow to develop pipelines for major companies including Heineken, Unilever, and Booking.com. Bas is a committer, and both Bas and Julian are active contributors to Apache Airflow.
