
Apache Airflow is a platform to programmatically author, schedule, and monitor workflows. It keeps your workflows simple, well organized, and systematic, so they can be easily authored and scheduled based on your requirements.

A workflow is divided into one or more tasks that relate to each other and together form a DAG (Directed Acyclic Graph).

If a task depends on another task, you can set that dependency explicitly, as in the sketch below.
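
To make this concrete, here is a minimal sketch of such a DAG, assuming Airflow 2.x and its bundled BashOperator; the dag_id example_pipeline, the task names, and the echo commands are placeholders for illustration only.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Two tasks forming a tiny DAG: "extract" must finish before "load" starts.
with DAG(
    dag_id="example_pipeline",          # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",         # run once per day
    catchup=False,
) as dag:
    extract = BashOperator(task_id="extract", bash_command="echo 'extracting'")
    load = BashOperator(task_id="load", bash_command="echo 'loading'")

    # The >> operator declares the dependency: load runs only after extract succeeds.
    extract >> load
```

The same dependency can also be declared with extract.set_downstream(load); the >> form is just the more common shorthand.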

You place your Python scripts (the DAG definitions) in the DAG folder; the Airflow scheduler parses them, they appear in the web UI, and you can then trigger them manually or, if they are scheduled, they will be triggered automatically.
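
If you are not sure where that DAG folder is on your installation, one way (a sketch assuming a default Airflow 2.x setup) is to read it from Airflow's configuration:

```python
from airflow.configuration import conf

# Print the directory Airflow scans for DAG files; by default this is
# ~/airflow/dags unless dags_folder is overridden in airflow.cfg.
print(conf.get("core", "dags_folder"))
```

Once a DAG file like the one above is saved in that directory and picked up by the scheduler, it appears in the UI, where you can trigger it manually (or with the CLI command airflow dags trigger followed by the dag_id); with a schedule set, it also runs automatically at each interval.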
