digitalmars.D - Task Scheduler / Task Queues/Cron jobs tool
- aberba (42/42) Jul 12 2019
Mara (https://github.com/mara/data-integration) is a task scheduler: you define chainable tasks to be scheduled for execution, with support for sub-pipelines. There's an Apache project in the same space called Airflow, originally started at Airbnb. I came across it on Hacker News (https://news.ycombinator.com/item?id=17030102).

It would be really useful to have something like this in D (with the pipelines scripted in D). Such a tool is used by several major enterprises, per the Airflow docs (https://github.com/apache/airflow). A potential killer app that's actually needed for enterprise workflows.

It looks like this (Mara, in Python):

from data_integration.commands.bash import RunBash
from data_integration.pipelines import Pipeline, Task
from data_integration.ui.cli import run_pipeline, run_interactively

pipeline = Pipeline(
    id='demo',
    description='A small pipeline that demonstrates the interplay between pipelines, tasks and commands')

pipeline.add(Task(id='ping_localhost', description='Pings localhost',
                  commands=[RunBash('ping -c 3 localhost')]))

sub_pipeline = Pipeline(id='sub_pipeline', description='Pings a number of hosts')

for host in ['google', 'amazon', 'facebook']:
    sub_pipeline.add(Task(id=f'ping_{host}', description=f'Pings {host}',
                          commands=[RunBash(f'ping -c 3 {host}.com')]))

sub_pipeline.add_dependency('ping_amazon', 'ping_facebook')

sub_pipeline.add(Task(id='ping_foo', description='Pings foo',
                      commands=[RunBash('ping foo')]), ['ping_amazon'])

pipeline.add(sub_pipeline, ['ping_localhost'])

pipeline.add(Task(id='sleep', description='Sleeps for 2 seconds',
                  commands=[RunBash('sleep 2')]), ['sub_pipeline'])

Another one, in Node.js, is Agenda (https://github.com/agenda/agenda).

If you work in an enterprise with lots of scheduled cron jobs/task queues that sometimes depend on each other, and you need transparency and metrics, that's your tool.
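To make the idea more concrete, here is a very rough sketch of what a similar API could look like in D. Pipeline, Task, and the naive dependency-ordered runner below are invented for illustration; they are not an existing D library, just one possible shape for such a tool:

// Rough sketch only: Pipeline, Task and the runner below are invented for
// illustration, not an existing D library.
import std.algorithm : all;
import std.format : format;
import std.process : executeShell;
import std.stdio : writefln;

class Task
{
    string id;
    string description;
    string[] commands;

    this(string id, string description, string[] commands)
    {
        this.id = id;
        this.description = description;
        this.commands = commands;
    }

    void run()
    {
        foreach (cmd; commands)
        {
            writefln("[%s] running: %s", id, cmd);
            auto result = executeShell(cmd);
            writefln("[%s] exit status: %s", id, result.status);
        }
    }
}

class Pipeline
{
    string id;
    Task[string] tasks;          // tasks keyed by id
    string[][string] upstreams;  // task id -> ids of tasks it depends on

    this(string id) { this.id = id; }

    void add(Task t, string[] dependsOn = [])
    {
        tasks[t.id] = t;
        upstreams[t.id] = dependsOn;
    }

    // Naive dependency-ordered execution: run a task once all of its
    // upstream tasks have finished. Assumes the dependency graph is acyclic
    // and that every dependency refers to a task in this pipeline.
    void run()
    {
        bool[string] done;
        while (done.length < tasks.length)
        {
            foreach (id, task; tasks)
            {
                if (id in done)
                    continue;
                if (upstreams[id].all!(dep => (dep in done) !is null))
                {
                    task.run();
                    done[id] = true;
                }
            }
        }
    }
}

void main()
{
    auto pipeline = new Pipeline("demo");
    pipeline.add(new Task("ping_localhost", "Pings localhost",
                          ["ping -c 3 localhost"]));
    foreach (host; ["google", "amazon", "facebook"])
        pipeline.add(new Task("ping_" ~ host, "Pings " ~ host,
                              [format("ping -c 3 %s.com", host)]),
                     ["ping_localhost"]);
    pipeline.run();
}

A real tool would of course need cycle detection, parallel execution, retries, scheduling, logging and a UI for the transparency/metrics part, but the core data model is just tasks plus a dependency graph.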
Jul 12 2019