python - How to control DAG concurrency in Airflow -


I use Airflow v1.7.1.3.

I have two DAGs, dag_a and dag_b. I trigger 10 runs of dag_a at one time; theoretically they should execute one at a time. In reality, all 10 dag_a tasks execute in parallel. The concurrency parameter doesn't seem to work. Can anyone tell me why?

Here's the pseudocode:

In dag_a.py:

    from datetime import datetime
    from airflow import DAG

    dag = DAG('dag_a',
              start_date=datetime.now(),
              default_args=default_args,  # default_args defined elsewhere
              schedule_interval=None,
              concurrency=1,
              max_active_runs=1)

In dag_b.py:

    import time
    import logging

    from datetime import datetime
    from fabric.api import local
    from airflow import DAG
    from airflow.operators.python_operator import PythonOperator

    log = logging.getLogger(__name__)

    dag = DAG('dag_b',
              start_date=datetime.now(),
              default_args=default_args,  # default_args defined elsewhere
              schedule_interval='0 22 */1 * *',
              concurrency=1,
              max_active_runs=1)

    def trigger_dag_a(**context):
        # Trigger dag_a nine times, two seconds apart, via the CLI
        for rec in range(1, 10):
            time.sleep(2)
            cmd = "airflow trigger_dag dag_a"
            log.info("cmd:%s" % cmd)
            msg = local(cmd)  # "local" runs a shell command via Fabric
            log.info(msg)

    trigger_dag_a_proc = PythonOperator(python_callable=trigger_dag_a,
                                        provide_context=True,
                                        task_id='trigger_dag_a_proc',
                                        dag=dag)
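As an aside, Airflow also ships a TriggerDagRunOperator that can replace the Fabric shell-out. A minimal sketch, assuming the pre-2.0 call signature (context, dag_run_obj) available in the 1.x series; the callable name is illustrative:

    from airflow.operators.dagrun_operator import TriggerDagRunOperator

    def make_run(context, dag_run_obj):
        # Returning the dag_run_obj tells Airflow to create the run;
        # returning None would skip the trigger.
        return dag_run_obj

    trigger = TriggerDagRunOperator(task_id='trigger_dag_a',
                                    trigger_dag_id='dag_a',
                                    python_callable=make_run,
                                    dag=dag)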

AFAIK, externally triggered DAG runs do not respect the concurrency/max_active_runs parameters of a DAG. The same applies to backfills.

Only DAG runs created by the scheduler respect these parameters.
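If you still need to cap parallelism for externally triggered runs, one option is an Airflow pool, since pool slots are enforced per task instance regardless of how the DAG run was created. A minimal sketch, assuming a pool named dag_a_pool with a single slot has already been created in the UI (Admin -> Pools); the task and pool name are illustrative:

    from airflow.operators.bash_operator import BashOperator

    # Each task instance must acquire the single slot in 'dag_a_pool'
    # before running, so at most one executes at a time even when
    # many dag_a runs are triggered externally.
    work = BashOperator(task_id='do_work',
                        bash_command='echo "processing"',
                        pool='dag_a_pool',  # hypothetical pool with 1 slot
                        dag=dag)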

