
Schedule, monitor, and manage data workflows using Apache Airflow. Build data pipelines by organizing tasks into Airflow DAGs, and use AWS resources such as S3 and Redshift to stage and move data between systems. Hands-on projects guide you through automating and maintaining complex data pipelines, streamlining operations and improving data reliability. You will gain expertise in workflow automation, data integration, and error handling, enabling you to build efficient, scalable data pipelines in production environments. Ideal for data engineers and professionals who want to advance their skills in managing and automating data workflows.
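To illustrate the DAG-based task organization described above: the following is a minimal sketch, not Airflow code. It uses Python's standard-library `graphlib` to model how a pipeline's tasks depend on one another and the order a scheduler would run them in. The task names (`stage_from_s3`, `load_to_redshift`, `run_quality_checks`) are hypothetical examples of the kind of S3-to-Redshift flow the course covers.

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline tasks modeled as a DAG: each key maps a task
# to the set of tasks that must finish before it can run.
dag = {
    "stage_from_s3": set(),                      # no upstream dependencies
    "load_to_redshift": {"stage_from_s3"},       # runs after staging
    "run_quality_checks": {"load_to_redshift"},  # runs after loading
}

# A scheduler (such as Airflow) executes tasks in a topological order
# consistent with these dependencies.
order = list(TopologicalSorter(dag).static_order())
print(order)  # ['stage_from_s3', 'load_to_redshift', 'run_quality_checks']
```

In Airflow itself, the same dependency chain would be declared on operator objects (e.g. with the `>>` operator) inside a DAG definition, and the scheduler would derive the execution order in the same way.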

Subscription · Monthly
8 skills
6 prerequisites
Prior to enrolling, you should have the prerequisite knowledge for this course. You will also need to communicate fluently and professionally in written and spoken English.
1 instructor
Unlike typical professors, our instructors come from Fortune 500 and Global 2000 companies and have demonstrated leadership and expertise in their professions:

Sean Murdock
Professor at Brigham Young University Idaho
