Player FM - Internet Radio Done Right
Checked 19h ago
Added four years ago
Content provided by The Data Flowcast. All podcast content, including episodes, graphics and podcast descriptions, is uploaded and provided directly by The Data Flowcast or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process described here: https://he.player.fm/legal.
Player FM - Podcast App
Go offline with the Player FM app!
Airflow Breeze
Manage episode 408336824 series 2948506
This week, we had the pleasure of meeting up with Jarek Potiuk, Principal Software Engineer at Polidea and Apache Airflow committer, to discuss his most recent contribution to the community, Airflow Breeze. Jarek deeply values developer productivity and, while building a team of Airflow committers, realized that opening a PR on the project (passing unit tests and waiting for the CI build) was a cumbersome process that could take up to a few hours. Breeze seeks to improve that experience for Airflow committers and lower the barrier to entry for folks who are new to the open-source community. You can read more about Airflow Breeze here: https://www.polidea.com/blog/its-a-breeze-to-develop-apache-airflow/#the-apache-airflow-projects-setup
56 episodes
All episodes
1 Building an End-to-End Data Observability System at Netflix with Joseph Machado 38:54
Building reliable data pipelines starts with maintaining strong data quality standards and creating efficient systems for auditing, publishing and monitoring. In this episode, we explore the real-world patterns and best practices for ensuring data pipelines stay accurate, scalable and trustworthy. Joseph Machado, Senior Data Engineer at Netflix, joins us to share practical insights gleaned from supporting Netflix’s Ads business as well as over a decade of experience in the data engineering space. He discusses implementing audit-publish patterns, building observability dashboards, defining in-band and separate data quality checks, and optimizing data validation across large-scale systems. Key Takeaways: (03:14) Supporting data privacy and engineering efficiency within data systems. (10:41) Validating outputs with reconciliation checks to catch transformation issues. (16:06) Applying standardized patterns for auditing, validating and publishing data. (19:28) Capturing historical check results to monitor system health and improvements. (21:29) Treating data quality and availability as separate monitoring concerns. (26:26) Using containerization strategies to streamline pipeline executions. (29:47) Leveraging orchestration platforms for better visibility and retry capability. (31:59) Managing business pressure without sacrificing data quality practices. (35:46) Starting simple with quality checks and evolving toward more complex frameworks.
Resources Mentioned: Joseph Machado https://www.linkedin.com/in/josephmachado1991/ Netflix | LinkedIn https://www.linkedin.com/company/netflix/ Netflix | Website https://www.netflix.com/browse Start Data Engineering https://www.startdataengineering.com/ Apache Airflow https://airflow.apache.org/ dbt Labs https://www.getdbt.com/ Great Expectations https://greatexpectations.io/ https://www.astronomer.io/events/roadshow/london/ https://www.astronomer.io/events/roadshow/new-york/ https://www.astronomer.io/events/roadshow/sydney/ https://www.astronomer.io/events/roadshow/san-francisco/ https://www.astronomer.io/events/roadshow/chicago/ Thanks for listening to “ The Data Flowcast: Mastering Apache Airflow® for Data Engineering and AI .” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations. #AI #Automation #Airflow #MachineLearning…
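The audit-publish pattern discussed in this episode can be sketched in a few lines of plain Python. This is a minimal, framework-agnostic illustration (the function names and checks are hypothetical, not Netflix's actual implementation): data lands in a staging area, audits run, and rows are promoted to production only when every check passes.

```python
def audit(rows):
    """Run data-quality checks against staged rows; return a list of failures."""
    failures = []
    if not rows:
        failures.append("empty dataset")
    if any(r.get("amount", 0) < 0 for r in rows):
        failures.append("negative amount found")
    return failures

def publish(staged, production, source_count):
    """Promote staged rows to production only if all audits pass."""
    failures = audit(staged)
    # Reconciliation check: staged row count must match the upstream source.
    if source_count != len(staged):
        failures.append("row count mismatch vs. source")
    if failures:
        raise ValueError(f"audit failed: {failures}")
    production.extend(staged)
    return len(staged)
```

In an orchestrator like Airflow, staging, audit and publish would typically be separate tasks, so a failed audit halts the pipeline before anything reaches consumers.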

1 Why Developer Experience Shapes Data Pipeline Standards at Next Insurance with Snir Israeli 30:28
Creating consistency across data pipelines is critical for scaling engineering teams and ensuring long-term maintainability. In this episode, Snir Israeli , Senior Data Engineer at Next Insurance , shares how enforcing coding standards and investing in developer experience transformed their approach to data engineering. He explains how implementing automated code checks, clear documentation practices and a scoring system helped drive alignment across teams, improve collaboration and reduce technical debt in a fast-growing data environment. Key Takeaways: (02:59) Inconsistencies in code style create challenges for collaboration and maintenance. (04:22) Programmatically enforcing rules helps teams scale their best practices. (08:55) Performance improvements in data pipelines lead to infrastructure cost savings. (13:22) Developer experience is essential for driving adoption of internal tools. (19:44) Dashboards can operationalize standards enforcement and track progress over time. (22:49) Standardization accelerates onboarding and reduces friction in code reviews. (25:39) Linting rules require ongoing maintenance as tools and platforms evolve. (27:47) Starting small and involving the team leads to better adoption and long-term success. Resources Mentioned: Snir Israeli https://www.linkedin.com/in/snir-israeli/ Next Insurance | LinkedIn https://www.linkedin.com/company/nextinsurance/ Next Insurance | Website https://www.nextinsurance.com/ Apache Airflow https://airflow.apache.org/ https://www.astronomer.io/events/roadshow/london/ https://www.astronomer.io/events/roadshow/new-york/ https://www.astronomer.io/events/roadshow/sydney/ https://www.astronomer.io/events/roadshow/san-francisco/ https://www.astronomer.io/events/roadshow/chicago/ Thanks for listening to “ The Data Flowcast: Mastering Apache Airflow® for Data Engineering and AI .” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. 
And be sure to subscribe so you never miss any of the insightful conversations. #AI #Automation #Airflow #MachineLearning…

1 Data Quality and Observability at Tekmetric with Ipsa Trivedi 22:49
Airflow’s adaptability is driving Tekmetric’s ability to unify complex data workflows, deliver accurate insights and support both internal operations and customer-facing services — all within a rapidly growing startup environment. In this episode, Ipsa Trivedi , Lead Data Engineer at Tekmetric , shares how her team is standardizing pipelines while supporting unique customer needs. She explains how Airflow enables end-to-end data services, simplifies orchestration across varied sources and supports scalable customization. Ipsa also highlights early wins with Airflow, its intuitive UI and the team's roadmap toward data quality, observability and a future self-serve data platform. Key Takeaways: (02:26) Powering auto shops nationwide with a unified platform. (05:17) A new data team was formed to centralize and scale insights. (07:23) Flexible, open source and made to fit — Airflow wins. (10:42) Pipelines handle anything from email to AWS. (12:15) Custom DAGs fit every team’s unique needs. (17:01) Data quality checks are built into the plan. (18:17) Self-serve data mesh is the end goal. (19:59) Airflow now fits so well, there's nothing left on the wishlist. 
Resources Mentioned: Ipsa Trivedi https://www.linkedin.com/in/ipsatrivedi/ Tekmetric | LinkedIn https://www.linkedin.com/company/tekmetric/ Tekmetric | Website https://www.tekmetric.com/ Apache Airflow https://airflow.apache.org/ AWS RDS https://aws.amazon.com/free/database/?trk=fc551e06-56b0-418c-9ddd-5c9dba18569b&sc_channel=ps&ef_id=CjwKCAjwzMi_BhACEiwAX4YZULS4jV2Xpnpcac_Q3eS9BAg-klKUDyCt6XSdOul8BLHkmWzFFh4NXRoCGhQQAvD_BwE:G:s&s_kwcid=AL!4422!3!548989592596!e!!g!!amazon%20sql%20database!11543056228!112002958549&gclid=CjwKCAjwzMi_BhACEiwAX4YZULS4jV2Xpnpcac_Q3eS9BAg-klKUDyCt6XSdOul8BLHkmWzFFh4NXRoCGhQQAvD_BwE Astro by Astronomer https://www.astronomer.io/product/ https://www.astronomer.io/events/roadshow/london/ https://www.astronomer.io/events/roadshow/new-york/ https://www.astronomer.io/events/roadshow/sydney/ https://www.astronomer.io/events/roadshow/san-francisco/ https://www.astronomer.io/events/roadshow/chicago/ Thanks for listening to “ The Data Flowcast: Mastering Apache Airflow® for Data Engineering and AI .” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations. #AI #Automation #Airflow #MachineLearning…

1 Introducing Apache Airflow® 3 with Vikram Koka and Jed Cunningham 27:28
The Airflow 3.0 release marks a significant leap forward in modern data orchestration, introducing architectural upgrades that improve scalability, flexibility and long-term maintainability. In this episode, we welcome Vikram Koka , Chief Strategy Officer at Astronomer , and Jed Cunningham , Principal Software Engineer at Astronomer , to discuss the architectural foundations, new features and future implications of this milestone release. They unpack the rationale behind DAG versioning and task execution interface, explain how Airflow now integrates more seamlessly within broader data ecosystems and share how these changes lay the groundwork for multi-cloud deployments, language-agnostic workflows and stronger enterprise security. Key Takeaways: (02:28) Modern orchestration demands new infrastructure approaches. (05:02) Removing legacy components strengthens system stability. (06:26) Major releases provide the opportunity to reduce technical debt. (08:31) Frontend and API modernization enable long-term adaptability. (09:36) Event-based triggers expand integration possibilities. (11:54) Version control improves visibility and execution reliability. (14:57) Centralized access to workflow definitions increases flexibility. (21:49) Decoupled architecture supports distributed and secure deployments. (26:17) Community collaboration is essential for sustainable growth. 
Resources Mentioned: Astronomer Website https://www.astronomer.io Apache Airflow https://airflow.apache.org/ Git Bundle https://git-scm.com/book/en/v2/Git-Tools-Bundling FastAPI https://fastapi.tiangolo.com/ React https://react.dev/ https://www.astronomer.io/events/roadshow/london/ https://www.astronomer.io/events/roadshow/new-york/ https://www.astronomer.io/events/roadshow/sydney/ https://www.astronomer.io/events/roadshow/san-francisco/ https://www.astronomer.io/events/roadshow/chicago/ Thanks for listening to “ The Data Flowcast: Mastering Apache Airflow® for Data Engineering and AI .” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations. #AI #Automation #Airflow #MachineLearning…

1 Airflow in Action: Powering Instacart's Complex Ecosystem 25:14
The evolution of data orchestration at Instacart highlights the journey from fragmented systems to robust, standardized infrastructure. This transformation has enabled scalability, reliability and democratization of tools for diverse user personas. In this episode, we’re joined by Anant Agarwal , Software Engineer at Instacart , who shares insights into Instacart's Airflow journey, from its early adoption in 2019 to the present-day centralized cluster approach. Anant discusses the challenges of managing disparate clusters, the implementation of remote executors, and the strategic standardization of infrastructure and DAG patterns to streamline workflows. Key Takeaways: (03:49) The impact of external events on business growth and technological evolution. (04:31) Challenges of managing decentralized systems across multiple teams. (06:14) The importance of standardizing infrastructure and processes for scalability. (09:51) Strategies for implementing efficient and repeatable deployment practices. (12:17) Addressing diverse user personas with tailored solutions. (14:47) Leveraging remote execution to enhance flexibility and scalability. (18:36) Benefits of transitioning to a centralized system for organization-wide use. (20:57) Maintaining an upgrade cadence to stay aligned with the latest advancements. (23:35) Anticipation for new features and improvements in upcoming software versions. 
Resources Mentioned: Anant Agarwal https://www.linkedin.com/in/anantag/ Instacart | LinkedIn https://www.linkedin.com/company/instacart/ Instacart | Website https://www.instacart.com Apache Airflow https://airflow.apache.org/ AWS Amazon https://aws.amazon.com/ecs/ Terraform https://www.terraform.io/ https://www.astronomer.io/events/roadshow/london/ https://www.astronomer.io/events/roadshow/new-york/ https://www.astronomer.io/events/roadshow/sydney/ https://www.astronomer.io/events/roadshow/san-francisco/ https://www.astronomer.io/events/roadshow/chicago/ Thanks for listening to “The Data Flowcast: Mastering Airflow for Data Engineering & AI.” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations. #AI #Automation #Airflow #MachineLearning…

1 From ETL to Airflow: Transforming Data Engineering at Deloitte Digital with Raviteja Tholupunoori 27:42
Data orchestration at scale presents unique challenges, especially when aiming for flexibility and efficiency across cloud environments. Choosing the right tools and frameworks can make all the difference. In this episode, Raviteja Tholupunoori, Senior Engineer at Deloitte Digital , joins us to explore how Airflow enhances orchestration, scalability and cost efficiency in enterprise data workflows. Key Takeaways: (01:45) Early challenges in data orchestration before implementing Airflow. (02:42) Comparing Airflow with ETL tools like Talend and why flexibility matters. (04:24) The role of Airflow in enabling cloud-agnostic data processing. (05:45) Key lessons from managing dynamic DAGs at scale. (13:15) How hybrid executors improve performance and efficiency. (14:13) Best practices for testing and monitoring workflows with Airflow. (15:13) The importance of mocking mechanisms when testing DAGs. (17:57) How Prometheus, Grafana and Loki support Airflow monitoring. (22:03) Cost considerations when running Airflow on self-managed infrastructure. (23:14) Airflow’s latest features, including hybrid executors and dark mode. Resources Mentioned: Raviteja Tholupunoori https://www.linkedin.com/in/raviteja0096/?originalSubdomain=in Deloitte Digital https://www.linkedin.com/company/deloitte-digital/ Apache Airflow https://airflow.apache.org/ Grafana https://grafana.com/solutions/apache-airflow/monitor/ Astronomer Presents: Exploring Apache Airflow® 3 Roadshows https://www.astronomer.io/events/roadshow/ https://www.astronomer.io/events/roadshow/london/ https://www.astronomer.io/events/roadshow/new-york/ https://www.astronomer.io/events/roadshow/sydney/ https://www.astronomer.io/events/roadshow/san-francisco/ https://www.astronomer.io/events/roadshow/chicago/ Thanks for listening to “The Data Flowcast: Mastering Airflow for Data Engineering & AI.” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. 
And be sure to subscribe so you never miss any of the insightful conversations. #AI #Automation #Airflow #MachineLearning…

1 A Deep Dive Into the 2025 State of Airflow Survey Results with Tamara Fingerlin of Astronomer 23:26
The 2025 State of Airflow report sheds light on how global users are adopting, evolving and innovating with Apache Airflow. With over 5,000 responses from 116 countries, the survey reveals critical insights into Airflow's role in business operations, new use cases and what's ahead for the community. In this episode, Tamara Fingerlin, Developer Advocate at Astronomer, walks us through her process of analyzing survey data, key trends from the report and what to expect from Airflow 3.0. Key Takeaways: (02:14) The State of Airflow report combines anonymized telemetry and survey results. (03:25) The survey received thousands of responses from many countries, showcasing global reach. (04:49) The survey process involves multiple steps, from question selection to report creation. (09:00) Many users expect to increase Airflow usage for revenue-generating or external use cases. (11:04) Experienced users tend to utilize Airflow more for advanced use cases like MLOps. (15:13) UI improvements offer enhanced navigation and error visibility. (18:15) Architectural changes enable new capabilities like remote execution and language support. (19:40) Long-requested features will be available in the new major release. (21:00) Future aspirations include integrating data visualization capabilities into the UI.
Resources Mentioned: Tamara Fingerlin https://www.linkedin.com/in/tamara-janina-fingerlin/ Astronomer | LinkedIn https://www.linkedin.com/company/astronomer/ Astronomer | Website https://www.astronomer.io Apache Airflow https://airflow.apache.org/ 2025 State of Airflow Webinar https://www.astronomer.io/airflow/state-of-airflow/ Airflow Slack https://apache-airflow-slack.herokuapp.com/ Astronomer Presents: Exploring Apache Airflow® 3 Roadshows https://www.astronomer.io/events/roadshow/ https://www.astronomer.io/events/roadshow/london/ https://www.astronomer.io/events/roadshow/new-york/ https://www.astronomer.io/events/roadshow/sydney/ https://www.astronomer.io/events/roadshow/san-francisco/ https://www.astronomer.io/events/roadshow/chicago/ Thanks for listening to “The Data Flowcast: Mastering Airflow for Data Engineering & AI.” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations. #AI #Automation #Airflow #MachineLearning…

1 Airflow’s Role in the Rise of DataOps with Andy Byron 26:15
The orchestration layer is evolving into a critical component of the modern data stack. Understanding its role in DataOps is key to optimizing workflows, improving reliability and reducing complexity. In this episode, Andy Byron , CEO at Astronomer , discusses the rapid growth of Apache Airflow, the increasing importance of orchestration and how Astronomer is shaping the future of DataOps. Key Takeaways: (01:54) Orchestration is central to modern data workflows. (03:16) Airflow 3.0 will enhance usability and flexibility. (05:14) AI-driven workloads demand zero-downtime orchestration. (08:13) DataOps relies on orchestration for seamless operations. (11:05) Integration across ingestion, transformation and governance is key. (17:24) The future of DataOps is consolidation and automation. (19:13) Enterprises use Airflow to process massive data volumes. (23:20) Product innovation is driven by customer needs and feedback. Resources Mentioned: Andy Byron https://www.linkedin.com/in/andy-byron-417a429/ Astronomer | LinkedIn https://www.linkedin.com/company/astronomer/ Astronomer | Website https://www.astronomer.io Apache Airflow https://airflow.apache.org/ State of Airflow Webinar https://www.astronomer.io/events/webinars/the-state-of-airflow-2025-video/ Astronomer Observe https://www.astronomer.io/product/observe/ Astronomer Roadshow: Exploring Apache Airflow 3 | London https://www.astronomer.io/events/roadshow/london/ Astronomer Roadshow: Exploring Apache Airflow 3 | New York https://www.astronomer.io/events/roadshow/new-york/ Astronomer Roadshow: Exploring Apache Airflow 3 | Sydney https://www.astronomer.io/events/roadshow/sydney/ Astronomer Roadshow: Exploring Apache Airflow 3 | San Francisco https://www.astronomer.io/events/roadshow/san-francisco/ Astronomer Roadshow: Exploring Apache Airflow 3 | Chicago https://www.astronomer.io/events/roadshow/chicago/ Thanks for listening to “The Data Flowcast: Mastering Airflow for Data Engineering & AI.” If you enjoyed this 
episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations. #AI #Automation #Airflow #MachineLearning…

1 The Software Risk That Affects Everyone and How To Address It with Michael Winser and Jarek Potiuk 28:27
The security of open-source software is a growing concern, especially as dependencies and regulations become more complex, making it essential to understand how to manage software supply chains effectively. In this episode, we sit down with Michael Winser , Co-Founder at Alpha-Omega and Security Strategy Ambassador at Eclipse Foundation , and Jarek Potiuk , Member of the Security Committee at the Apache Software Foundation , to discuss the challenges of securing Airflow’s dependencies, the evolving landscape of open-source security and how contributors can help strengthen the ecosystem. Key Takeaways: (02:43) Jarek quit his full-time engineer position and uses Airflow as a freelancer. (04:32) Michael finds happiness in having meaningful work with open-source security. (07:01) Software supply chain security focuses on correctness, integrity and availability. (08:44) Airflow’s 790 dependencies present a unique security challenge. (09:43) Airflow’s security team has significantly improved its vulnerability response. (10:22) The transition to Airflow 3 emphasizes enterprise security readiness. (16:20) The ‘Three Fs’ approach: fix it, fork it, or forget it. (18:45) Dependency health is often more critical than fixing known vulnerabilities. (23:32) The ‘Three Fs’ in action. (26:26) Open-source contributors play a key role in supply chain security. 
Resources Mentioned: Michael Winser - https://www.linkedin.com/in/michaelw/ Jarek Potiuk - https://www.linkedin.com/in/jarekpotiuk/ Apache Airflow - https://airflow.apache.org/ Apache Software Foundation | LinkedIn - https://www.linkedin.com/company/the-apache-software-foundation/ Apache Software Foundation | Website - https://www.apache.org/ Eclipse Foundation | LinkedIn - https://www.linkedin.com/company/eclipse-foundation/ Eclipse Foundation | Website - https://www.eclipse.org/org/foundation/ OpenSSF Working Groups - https://openssf.org/community/openssf-working-groups/ Astronomer Roadshow: Exploring Apache Airflow 3 | London https://www.astronomer.io/events/roadshow/london/ Astronomer Roadshow: Exploring Apache Airflow 3 | New York https://www.astronomer.io/events/roadshow/new-york/ Astronomer Roadshow: Exploring Apache Airflow 3 | Sydney https://www.astronomer.io/events/roadshow/sydney/ Astronomer Roadshow: Exploring Apache Airflow 3 | San Francisco https://www.astronomer.io/events/roadshow/san-francisco/ Astronomer Roadshow: Exploring Apache Airflow 3 | Chicago https://www.astronomer.io/events/roadshow/chicago/ Thanks for listening to “The Data Flowcast: Mastering Airflow for Data Engineering & AI.” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations. #AI #Automation #Airflow #MachineLearning…

1 Building Scalable ML Infrastructure at Outerbounds with Savin Goyal 36:46
Machine learning is changing fast, and companies need better tools to handle AI workloads. The right infrastructure helps data scientists focus on solving problems instead of managing complex systems. In this episode, we talk with Savin Goyal , Co-Founder and CTO at Outerbounds , about building ML infrastructure, how orchestration makes workflows easier and how Metaflow and Airflow work together to simplify data science. Key Takeaways: (02:02) Savin spent years building AI and ML infrastructure, including at Netflix. (04:05) ML engineering was not a defined role a decade ago. (08:17) Modernizing AI and ML requires balancing new tools with existing strengths. (10:28) ML workloads can be long-running or require heavy computation. (15:29) Different teams at Netflix used multiple orchestration systems for specific needs. (20:10) Stable APIs prevent rework and keep projects moving. (21:07) Metaflow simplifies ML workflows by optimizing data and compute interactions. (25:53) Limited local computing power makes running ML workloads challenging. (27:43) Airflow UI monitors pipelines, while Metaflow UI gives ML insights. (33:13) The most successful data professionals focus on business impact, not just technology. Resources Mentioned: Savin Goyal - https://www.linkedin.com/in/savingoyal/ Outerbounds - https://www.linkedin.com/company/outerbounds/ Apache Airflow - https://airflow.apache.org/ Metaflow - https://metaflow.org/ Netflix’s Maestro Orchestration System - https://netflixtechblog.com/maestro-netflixs-workflow-orchestrator-ee13a06f9c78?gi=8e6a067a92e9#:~:text=Maestro%20is%20a%20fully%20managed,data%20between%20different%20storages%2C%20etc. TensorFlow - https://www.tensorflow.org/ PyTorch - https://pytorch.org/ Thanks for listening to “The Data Flowcast: Mastering Airflow for Data Engineering & AI.” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. 
And be sure to subscribe so you never miss any of the insightful conversations. #AI #Automation #Airflow #MachineLearning…

1 Customizing Airflow for Complex Data Environments at Stripe with Nick Bilozerov and Sharadh Krishnamurthy 27:40
Keeping data pipelines reliable at scale requires more than just the right tools — it demands constant innovation. In this episode, Nick Bilozerov , Senior Data Engineer at Stripe , and Sharadh Krishnamurthy , Engineering Manager at Stripe, discuss how Stripe customizes Airflow for its needs, the evolution of its data orchestration framework and the transition to Airflow 2. They also share insights on scaling data workflows while maintaining performance, reliability and developer experience. Key Takeaways: (02:04) Stripe’s mission is to grow the GDP of the internet by supporting businesses with payments and data. (05:08) 80% of Stripe engineers use data orchestration, making scalability critical. (06:06) Airflow powers business reports, regulatory needs and ML workflows. (08:02) Custom task frameworks improve dependencies and validation. (08:50) "User scope mode" enables local testing without production impact. (10:39) Migrating to Airflow 2 improves isolation, safety and scalability. (16:40) Monolithic DAGs caused database issues, prompting a service-based shift. (19:24) Frequent Airflow upgrades ensure stability and access to new features. (21:38) DAG versioning and backfill improvements enhance developer experience. (23:38) Greater UI customization would offer more flexibility. Resources Mentioned: Nick Bilozerov - https://www.linkedin.com/in/nick-bilozerov/ Sharadh Krishnamurthy - https://www.linkedin.com/in/sharadhk/ Apache Airflow - https://airflow.apache.org/ Stripe | LinkedIn - https://www.linkedin.com/company/stripe/ Stripe | Website - https://stripe.com/ Thanks for listening to “The Data Flowcast: Mastering Airflow for Data Engineering & AI.” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations. #AI #Automation #Airflow #MachineLearning…

1 Harnessing Airflow for Data-Driven Policy Research at CSET with Jennifer Melot 17:54
Turning complex datasets into meaningful analysis requires robust data infrastructure and seamless orchestration. In this episode, we’re joined by Jennifer Melot , Technical Lead at the Center for Security and Emerging Technology (CSET) at Georgetown University, to explore how Airflow powers data-driven insights in technology policy research. Jennifer shares how her team automates workflows to support analysts in navigating complex datasets. Key Takeaways: (02:04) CSET provides data-driven analysis to inform government decision-makers. (03:54) ETL pipelines merge multiple data sources for more comprehensive insights. (04:20) Airflow is central to automating and streamlining large-scale data ingestion. (05:11) Larger-scale databases create challenges that require scalable solutions. (07:20) Dynamic DAG generation simplifies Airflow adoption for non-engineers. (12:13) DAG Factory and dynamic task mapping can improve workflow efficiency. (15:46) Tracking data lineage helps teams understand dependencies across DAGs. (16:14) New Airflow features enhance visibility and debugging for complex pipelines. Resources Mentioned: Jennifer Melot - https://www.linkedin.com/in/jennifer-melot-aa710144/ Center for Security and Emerging Technology (CSET) - https://www.linkedin.com/company/georgetown-cset/ Apache Airflow - https://airflow.apache.org/ Zenodo - https://zenodo.org/ OpenLineage - https://openlineage.io/ Cloud Dataplex - https://cloud.google.com/dataplex Thanks for listening to “The Data Flowcast: Mastering Airflow for Data Engineering & AI.” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations. #AI #Automation #Airflow #MachineLearning…
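The dynamic DAG generation Jennifer describes can be sketched as a loop over a config file: one pipeline definition is generated per data source, so non-engineers only edit configuration. This is an illustrative, stand-alone sketch (source names and fields are made up; in real Airflow the loop would instantiate DAG objects rather than dicts):

```python
# Config an analyst could edit without touching pipeline code.
SOURCES = {
    "zenodo_metadata": {"schedule": "@daily", "steps": ["extract", "load"]},
    "patent_feed": {"schedule": "@weekly", "steps": ["extract", "clean", "load"]},
}

def build_dag(name, cfg):
    """Build one pipeline definition from a single config entry."""
    return {
        "dag_id": f"ingest_{name}",
        "schedule": cfg["schedule"],
        # Chain the steps linearly: extract >> clean >> load, etc.
        "tasks": [(a, b) for a, b in zip(cfg["steps"], cfg["steps"][1:])],
    }

# One generator loop yields a pipeline per configured source.
dags = {name: build_dag(name, cfg) for name, cfg in SOURCES.items()}
```

Tools like DAG Factory, mentioned in the takeaways, package exactly this idea: YAML config in, Airflow DAGs out.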

1 Leveraging Airflow To Build Scalable and Reliable Data Platforms at 99acres.com with Samyak Jain 25:08
Data orchestration is evolving rapidly, with dynamic workflows becoming the cornerstone of modern data engineering. In this episode, we are joined by Samyak Jain , Senior Software Engineer - Big Data at 99acres.com . Samyak shares insights from his journey with Apache Airflow, exploring how his team built a self-service platform that enables non-technical teams to launch data pipelines and marketing campaigns seamlessly. Key Takeaways: (02:02) Starting a career in data engineering by troubleshooting Airflow pipelines. (04:27) Building self-service portals with Airflow as the backend engine. (05:34) Utilizing API endpoints to trigger dynamic DAGs with parameterized templates. (09:31) Managing a dynamic environment with over 1,400 active DAGs. (11:14) Implementing fault tolerance by segmenting data workflows into distinct layers. (14:15) Tracking and optimizing query costs in AWS Athena to save $7K monthly. (16:22) Automating cost monitoring with real-time alerts for high-cost queries. (17:15) Streamlining Airflow metadata cleanup to prevent performance bottlenecks. (21:30) Efficiently handling one-time and recurring marketing campaigns using Airflow. (24:18) Advocating for Airflow features that improve resource management and ownership tracking. Resources Mentioned: Samyak Jain - https://www.linkedin.com/in/samyak-jain-ab5830169/ 99acres.com - https://www.linkedin.com/company/99acres/ Apache Airflow - https://airflow.apache.org/ AWS Athena - https://aws.amazon.com/athena/ Kafka - https://kafka.apache.org/ Thanks for listening to “The Data Flowcast: Mastering Airflow for Data Engineering & AI.” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations. #AI #Automation #Airflow #MachineLearning…
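The self-serve pattern Samyak describes, where a portal triggers parameterized pipeline runs, can be sketched as template rendering: a fixed pipeline definition plus a small per-request payload. The field names below are hypothetical; in Airflow terms this corresponds to triggering a DAG run with a `conf` payload.

```python
# A templated campaign pipeline with defaults; None marks required params.
TEMPLATE = {
    "dag_id": "marketing_campaign",
    "params": {"segment": None, "channel": "email", "send_at": None},
}

def render_run(template, request_payload):
    """Merge a trigger payload into the template, rejecting unknown keys
    and refusing to run with required parameters still unset."""
    unknown = set(request_payload) - set(template["params"])
    if unknown:
        raise KeyError(f"unknown params: {sorted(unknown)}")
    params = {**template["params"], **request_payload}
    missing = [k for k, v in params.items() if v is None]
    if missing:
        raise ValueError(f"missing params: {missing}")
    return {"dag_id": template["dag_id"], "conf": params}
```

Validating the payload up front, as sketched here, is what lets one DAG safely serve many non-technical teams.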

Hybrid Testing Solutions for Autonomous Driving at Bosch with Jens Scheffler and Christian Schilling (33:45)
Testing autonomous vehicles demands precision, scalability and powerful orchestration tools — enter Apache Airflow, a key component of Bosch’s cutting-edge testing framework. In this episode, we sit down with Jens Scheffler, Test Execution Cluster Technical Architect, and Christian Schilling, Product Owner Open Loop Testing Automated Driving, both at Bosch, to explore how Bosch harnesses Airflow to streamline complex testing scenarios. They share insights on scaling workflows, integrating hybrid infrastructures and ensuring vehicle safety through rigorous automated testing.

Key Takeaways:
- (01:35) Airflow orchestrates millions of test hours for autonomous systems.
- (03:15) Jens scales distributed systems with Kubernetes for job orchestration.
- (06:02) Airflow runs hundreds of tests simultaneously.
- (06:44) Virtual testing reduces costs and on-road trials.
- (12:19) Unified APIs and GUIs streamline operations.
- (15:05) Self-service setups empower Bosch teams.
- (18:00) Physical hardware integration ensures real-world timing.
- (20:30) Dynamic task mapping scales workflows efficiently.
- (25:22) Open-source contributions improve stability.
- (31:06) Edge and Celery executors power Bosch's hybrid scheduling.

Resources Mentioned:
- Jens Scheffler - https://www.linkedin.com/in/jens-scheffler/
- Christian Schilling - https://www.linkedin.com/in/christian-schilling-a5078831a/
- Bosch - https://www.linkedin.com/company/bosch/
- Apache Airflow - https://airflow.apache.org/
- Kubernetes - https://kubernetes.io
- GitHub - https://github.com
- Edge Executor - https://airflow.apache.org/docs/apache-airflow/stable/core-concepts/executor/index.html

Thanks for listening to “The Data Flowcast: Mastering Airflow for Data Engineering & AI.” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.

#AI #Automation #Airflow #MachineLearning
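The dynamic task mapping mentioned at (20:30) fans one task out over a list computed at runtime, then gathers the results. A minimal pure-Python imitation of that fan-out/fan-in shape (no Airflow dependency; the scenario names are invented for illustration):

```python
# A stand-in for Airflow's dynamic task mapping: one "task" function is
# mapped over a runtime-discovered list, then results are reduced.

def list_scenarios() -> list[str]:
    # In a real pipeline this might query a test-case database.
    return ["highway_merge", "urban_left_turn", "night_pedestrian"]

def run_scenario(name: str) -> dict:
    # Placeholder for launching one simulation job.
    return {"scenario": name, "passed": True}

def summarize(results: list[dict]) -> str:
    passed = sum(r["passed"] for r in results)
    return f"{passed}/{len(results)} scenarios passed"

# Roughly equivalent to: run_scenario.expand(name=list_scenarios())
results = [run_scenario(n) for n in list_scenarios()]
print(summarize(results))  # 3/3 scenarios passed
```

In Airflow itself, `.expand()` creates one mapped task instance per list element, so the number of test runs can change every day without editing the DAG.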

Overcoming Airflow Scaling Challenges at Monzo Bank with Jonathan Rainer (43:39)
Scaling a data orchestration platform to manage thousands of tasks daily demands innovative solutions and strategic problem-solving. In this episode, we explore the complexities of scaling Airflow and the challenges of orchestrating thousands of tasks in dynamic data environments. Jonathan Rainer, Former Platform Engineer at Monzo Bank, joins us to share his journey optimizing data pipelines, overcoming UI limitations and ensuring DAG consistency in high-stakes scenarios.

Key Takeaways:
- (03:11) Using Airflow to schedule computation in BigQuery.
- (07:02) How DAGs with 8,000+ tasks were managed nightly.
- (08:18) Ensuring accuracy in regulatory reporting for banking.
- (11:35) Handling task inconsistency and DAG failures with automation.
- (16:09) Building a service to resolve DAG consistency issues in Airflow.
- (25:05) Challenges with scaling the Airflow UI for thousands of tasks.
- (27:03) The role of upstream and downstream task management in Airflow.
- (37:33) The importance of operational metrics for monitoring Airflow health.
- (39:19) Balancing new tools with root cause analysis to address scaling issues.
- (41:35) Why scaling solutions require both technical and leadership buy-in.

Resources Mentioned:
- Jonathan Rainer - https://www.linkedin.com/in/jonathan-rainer/
- Monzo Bank - https://www.linkedin.com/company/monzo-bank/
- Apache Airflow - https://airflow.apache.org/
- BigQuery - https://airflow.apache.org/docs/apache-airflow-providers-google/stable/operators/cloud/bigquery.html
- Kubernetes - https://kubernetes.io/

Thanks for listening to “The Data Flowcast: Mastering Airflow for Data Engineering & AI.” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.

#AI #Automation #Airflow #MachineLearning
The Data Flowcast: Mastering Apache Airflow ® for Data Engineering and AI

Orchestrating Analytics and AI Workflows at Telia with Arjun Anandkumar (26:00)
The future of data engineering lies in seamless orchestration and automation. In this episode, Arjun Anandkumar, Data Engineer at Telia, shares how his team uses Airflow to drive analytics and AI workflows. He highlights the challenges of scaling data platforms and how adopting best practices can simplify complex processes for teams across the organization. Arjun also discusses the transformative role of tools like Cosmos and Terraform in enhancing efficiency and collaboration.

Key Takeaways:
- (02:16) Telia operates across the Nordics and Baltics, focusing on telecom and energy services.
- (03:45) Airflow runs dbt models seamlessly with Cosmos on AWS MWAA.
- (05:47) Cosmos improves visibility and orchestration in Airflow.
- (07:00) Medallion Architecture organizes data into bronze, silver and gold layers.
- (08:34) Task group challenges highlight the need for adaptable workflows.
- (15:04) Scaling managed services requires trial, error and tailored tweaks.
- (19:46) Terraform scales infrastructure, while YAML templates manage DAGs efficiently.
- (20:00) Templated DAGs and robust testing enhance platform management.
- (24:15) Open-source resources drive innovation in Airflow practices.

Resources Mentioned:
- Arjun Anandkumar - https://www.linkedin.com/in/arjunanand1/?originalSubdomain=dk
- Telia - https://www.linkedin.com/company/teliacompany/
- Apache Airflow - https://airflow.apache.org/
- Cosmos by Astronomer - https://www.astronomer.io/cosmos/
- Terraform - https://www.terraform.io/
- Medallion Architecture by Databricks - https://www.databricks.com/glossary/medallion-architecture

Thanks for listening to “The Data Flowcast: Mastering Airflow for Data Engineering & AI.” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.

#AI #Automation #Airflow #MachineLearning
T
The Data Flowcast: Mastering Apache Airflow ® for Data Engineering and AI

The Role of Airflow in Finance Transformation at Etraveli Group with Mihir Samant (21:19)
Transforming bottlenecked finance processes into streamlined, automated systems requires the right tools and a forward-thinking approach. In this episode, Mihir Samant, Senior Data Analyst at Etraveli Group, joins us to share how his team leverages Airflow to revolutionize finance automation. With extensive experience in data workflows and a passion for open-source tools, Mihir provides valuable insights into building efficient, scalable systems. We explore the transformative power of Airflow in automating workflows and enhancing data orchestration within the finance domain.

Key Takeaways:
- (02:14) Etraveli Group specializes in selling affordable flight tickets and ancillary services.
- (03:56) Mihir’s finance automation team uses Airflow to tackle month-end bottlenecks.
- (06:00) Airflow's flexibility enables end-to-end automation for finance workflows.
- (07:00) Open-source Airflow tools offer cost-effective solutions for new teams.
- (08:46) Sensors and dynamic DAGs are pivotal features for optimizing tasks.
- (13:30) GitSync simplifies development by syncing environments seamlessly.
- (16:27) Plans include integrating Databricks for more advanced data handling.
- (17:58) Airflow and Databricks offer multiple flexible methods to trigger workflows and execute SQL queries seamlessly.

Resources Mentioned:
- Mihir Samant - https://www.linkedin.com/in/misamant/?originalSubdomain=ca
- Etraveli Group - https://www.linkedin.com/company/etraveli-group/
- Apache Airflow - https://airflow.apache.org/
- Docker - https://www.docker.com/
- Databricks - https://www.databricks.com/

Thanks for listening to “The Data Flowcast: Mastering Airflow for Data Engineering & AI.” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.

#AI #Automation #Airflow #MachineLearning

Inside Ford’s Data Transformation: Advanced Orchestration Strategies with Vasantha Kosuri-Marshall (38:54)
Data engineering is entering a new era, where orchestration and automation are redefining how large-scale projects operate. This episode features Vasantha Kosuri-Marshall, Data and ML Ops Engineer at Ford Motor Company. Vasantha shares her expertise in managing complex data pipelines. She takes us through Ford's transition to cloud platforms, the adoption of Airflow and the intricate challenges of orchestrating data in a diverse environment.

Key Takeaways:
- (03:10) Vasantha’s transition to the Advanced Driving Assist Systems team at Ford.
- (05:42) Early adoption of Airflow to orchestrate complex data pipelines.
- (09:29) Ford's move from on-premise data solutions to Google Cloud Platform.
- (12:03) The importance of Airflow's scheduling capabilities for efficient data management.
- (16:12) Using Kubernetes to scale Airflow for large-scale data processing.
- (19:59) Vasantha’s experience in overcoming challenges with legacy orchestration tools.
- (22:22) Integration of data engineering and data science pipelines at Ford.
- (28:03) How deferrable operators in Airflow improve performance and save costs.
- (32:12) Vasantha’s insights into tuning Airflow properties for thousands of DAGs.
- (36:09) The significance of monitoring and observability in managing Airflow instances.

Resources Mentioned:
- Vasantha Kosuri-Marshall - https://www.linkedin.com/in/vasantha-kosuri-marshall-0b0aab188/
- Apache Airflow - https://airflow.apache.org/
- Google Cloud Platform (GCP) - https://cloud.google.com/
- Ford Motor Company | LinkedIn - https://www.linkedin.com/company/ford-motor-company/
- Ford Motor Company | Website - https://www.ford.com/
- Astronomer - https://www.astronomer.io/

Thanks for listening to “The Data Flowcast: Mastering Airflow for Data Engineering & AI.” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.

#AI #Automation #Airflow #MachineLearning
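The deferrable operators mentioned at (28:03) save cost because a waiting task releases its worker slot and hands the wait to an async event loop (Airflow's triggerer). The idea can be sketched with plain asyncio; this is a conceptual simulation, not Airflow's actual implementation:

```python
import asyncio

# Why deferrable operators save resources: a classic sensor blocks a
# worker slot while polling, but a deferred task parks its wait on an
# event loop, so many waits share one process.

async def wait_for_partition(ready_after: float) -> str:
    # No worker thread is blocked during this sleep.
    await asyncio.sleep(ready_after)
    return "partition ready"

async def main() -> list[str]:
    # Five concurrent waits on a single event loop, like triggers
    # running inside one triggerer process.
    return await asyncio.gather(*(wait_for_partition(0.01) for _ in range(5)))

results = asyncio.run(main())
print(len(results))  # 5
```

With blocking sensors, those five waits would have pinned five worker slots for their whole duration.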

Powering Finance With Advanced Data Solutions at Ramp with Ryan Delgado (24:35)
Data is the backbone of every modern business, but unlocking its full potential requires the right tools and strategies. In this episode, Ryan Delgado, Director of Engineering at Ramp, joins us to explore how innovative data platforms can transform business operations and fuel growth. He shares insights on integrating Apache Airflow, optimizing data workflows and leveraging analytics to enhance customer experiences.

Key Takeaways:
- (01:52) Data is the lifeblood of Ramp, touching every vertical in the company.
- (03:18) Ramp’s data platform team enables high-velocity scaling through tailored tools.
- (05:27) Airflow powers Ramp’s enterprise data warehouse integrations for advanced analytics.
- (07:55) Centralizing data in Snowflake simplifies storage and analytics pipelines.
- (12:08) Machine learning models at Ramp integrate seamlessly with Airflow for operational excellence.
- (14:11) Leveraging Airflow datasets eliminates inefficiencies in DAG dependencies.
- (17:22) Platforms evolve from solving narrow business problems to scaling organizationally.
- (18:55) ClickHouse enhances Ramp’s OLAP capabilities with 100x performance improvements.
- (19:47) Ramp’s OLAP platform improves performance by reducing joins and leveraging ClickHouse.
- (21:46) Ryan envisions a lighter-weight, more Python-native future for Airflow.

Resources Mentioned:
- Ryan Delgado - https://www.linkedin.com/in/ryan-delgado-69544568/
- Ramp - https://www.linkedin.com/company/ramp/
- Apache Airflow - https://airflow.apache.org/
- Snowflake - https://www.snowflake.com/
- ClickHouse - https://clickhouse.com/
- dbt - https://www.getdbt.com/

Thanks for listening to “The Data Flowcast: Mastering Airflow for Data Engineering & AI.” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.

#AI #Automation #Airflow #MachineLearning

Exploring the Power of Airflow 3 at Astronomer with Amogh Desai (30:24)
What does it take to go from fixing a broken link to becoming a committer for one of the world’s leading open-source projects? Amogh Desai, Senior Software Engineer at Astronomer, takes us through his journey with Apache Airflow. From small contributions to building meaningful connections in the open-source community, Amogh’s story provides actionable insights for anyone on the cusp of their open-source journey.

Key Takeaways:
- (02:09) Building data engineering platforms at Cloudera with Kubernetes.
- (04:00) Brainstorming led to contributing to Apache Airflow.
- (05:17) Starting small with link fixes, progressing to Breeze development.
- (07:00) Becoming a committer for Apache Airflow in September 2023.
- (09:51) The steep learning curve for contributing to Airflow.
- (16:30) Using GitHub’s “good-first-issue” label to get started.
- (18:15) Setting up a development environment with Breeze.
- (22:00) Open-source contributions enhance your resume and career.
- (24:51) Amogh’s advice: Start small and stay consistent.
- (28:12) Engage with the community via Slack, email lists and meetups.

Resources Mentioned:
- Amogh Desai - https://www.linkedin.com/in/amogh-desai-385141157/
- Astronomer - https://www.linkedin.com/company/astronomer/
- Apache Airflow GitHub Repository - https://github.com/apache/airflow
- Contributors Quick Guide - https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst
- Breeze Development Tool - https://github.com/apache/airflow/tree/main/dev/breeze

Thanks for listening to “The Data Flowcast: Mastering Airflow for Data Engineering & AI.” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.

#AI #Automation #Airflow #MachineLearning

Using Airflow To Power Machine Learning Pipelines at Optimove with Vasyl Vasyuta (24:11)
Data orchestration and machine learning are shaping how organizations handle massive datasets and drive customer-focused strategies. Tools like Apache Airflow are central to this transformation. In this episode, Vasyl Vasyuta, R&D Team Leader at Optimove, joins us to discuss how his team leverages Airflow to optimize data processing, orchestrate machine learning models and create personalized customer experiences.

Key Takeaways:
- (01:59) Optimove tailors marketing notifications with personalized customer journeys.
- (04:25) Airflow orchestrates Snowflake procedures for massive datasets.
- (05:11) DAGs manage workflows with branching and replay plugins.
- (05:41) The "Joystick" plugin enables seamless data replays.
- (09:33) Airflow supports MLOps for customer data grouping.
- (11:15) Machine learning predicts customer behavior for better campaigns.
- (13:20) Thousands of DAGs run every five minutes for data processing.
- (15:36) Custom versioning allows rollbacks and gradual rollouts.
- (18:00) Airflow logs enhance operational observability.
- (23:00) DAG versioning in Airflow 3.0 could boost efficiency.

Resources Mentioned:
- Vasyl Vasyuta - https://www.linkedin.com/in/vasyl-vasyuta-3270b54a/
- Optimove - https://www.linkedin.com/company/optimove/
- Apache Airflow - https://airflow.apache.org/
- Snowflake - https://www.snowflake.com/
- Datadog - https://www.datadoghq.com/
- Apache Airflow Survey - https://astronomer.typeform.com/airflowsurvey24

Thanks for listening to “The Data Flowcast: Mastering Airflow for Data Engineering & AI.” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.

#AI #Automation #Airflow #MachineLearning

Maximizing Business Impact Through Data at GlossGenius with Katie Bauer (25:49)
Bridging the gap between data teams and business priorities is essential for maximizing impact and building value-driven workflows. Katie Bauer, Senior Director of Data at GlossGenius, joins us to share her principles for creating effective, aligned data teams. In this episode, Katie draws from her experience at GlossGenius, Reddit and Twitter to highlight the common pitfalls data teams face and how to overcome them. She offers practical strategies for aligning team efforts with organizational goals and fostering collaboration with stakeholders.

Key Takeaways:
- (02:36) GlossGenius provides an all-in-one platform for beauty professionals.
- (03:59) Airflow orchestrates data and MLOps workflows at GlossGenius.
- (04:41) Focusing on value helps data teams achieve greater impact.
- (06:23) Aligning team priorities with company goals minimizes friction.
- (08:44) Building strong stakeholder relationships requires curiosity.
- (12:46) Treating roles as flexible fosters team innovation.
- (13:21) Adapting to new technologies improves effectiveness.
- (18:28) Acting like your time is valuable earns respect.
- (23:38) Proactive data initiatives drive strategic value.
- (24:20) Usage data offers critical insights into tool effectiveness.

Resources Mentioned:
- Katie Bauer - https://www.linkedin.com/in/mkatiebauer/
- GlossGenius - https://www.linkedin.com/company/glossgenius/
- Apache Airflow - https://airflow.apache.org/
- DBT - https://www.getdbt.com/
- Cosmos - https://cosmos.apache.org/
- Apache Airflow Survey - https://astronomer.typeform.com/airflowsurvey24

Thanks for listening to “The Data Flowcast: Mastering Airflow for Data Engineering & AI.” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.

#AI #Automation #Airflow #MachineLearning

Optimizing Large-Scale Deployments at LinkedIn with Rahul Gade (27:47)
Scaling deployments for a billion users demands innovation, precision and resilience. In this episode, we dive into how LinkedIn optimizes its continuous deployment process using Apache Airflow. Rahul Gade, Staff Software Engineer at LinkedIn, shares his insights on building scalable systems and democratizing deployments for over 10,000 engineers. Rahul discusses the challenges of managing large-scale deployments across 6,000 services and how his team leverages Airflow to enhance efficiency, reliability and user accessibility.

Key Takeaways:
- (01:36) LinkedIn minimizes human involvement in production to reduce errors.
- (02:00) Airflow powers LinkedIn’s Continuous Deployment platform.
- (05:43) Continuous deployment adoption grew from 8% to a targeted 80%.
- (11:25) Kubernetes ensures scalability and flexibility for deployments.
- (12:04) A custom UI offers real-time deployment transparency.
- (16:23) No-code YAML workflows simplify deployment tasks.
- (17:18) Canaries and metrics ensure safe deployments across fabrics.
- (20:45) A gateway service ensures redundancy across Airflow clusters.
- (24:22) Abstractions let engineers focus on development, not logistics.
- (25:20) Multi-language support in Airflow 3.0 simplifies adoption.

Resources Mentioned:
- Rahul Gade - https://www.linkedin.com/in/rahul-gade-68666818/
- LinkedIn - https://www.linkedin.com/company/linkedin/
- Apache Airflow - https://airflow.apache.org/
- Kubernetes - https://kubernetes.io/
- Open Policy Agent (OPA) - https://www.openpolicyagent.org/
- Backstage - https://backstage.io/
- Apache Airflow Survey - https://astronomer.typeform.com/airflowsurvey24

Thanks for listening to The Data Flowcast: Mastering Airflow for Data Engineering & AI. If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.

#AI #Automation #Airflow #MachineLearning

How Uber Manages 1 Million Daily Tasks Using Airflow, with Shobhit Shah and Sumit Maheshwari (28:44)
When data orchestration reaches Uber’s scale, innovation becomes a necessity, not a luxury. In this episode, we discuss the innovations behind Uber’s unique Airflow setup. With our guests Shobhit Shah and Sumit Maheshwari, both Staff Software Engineers at Uber, we explore how their team manages one of the largest data workflow systems in the world. Shobhit and Sumit walk us through the evolution of Uber’s Airflow implementation, detailing the custom solutions that support 200,000 daily pipelines. They discuss Uber's approach to tackling complex challenges in data orchestration, disaster recovery and scaling to meet the company’s extensive data needs.

Key Takeaways:
- (02:03) Airflow as a service streamlines Uber’s data workflows.
- (06:16) Serialization boosts security and reduces errors.
- (10:05) Java-based scheduler improves system reliability.
- (13:40) Custom recovery model supports emergency pipeline switching.
- (15:58) No-code UI allows easy pipeline creation for non-coders.
- (18:12) Backfill feature enables historical data processing.
- (22:06) Regular updates keep Uber aligned with Airflow advancements.
- (26:07) Plans to leverage Airflow’s latest features.

Resources Mentioned:
- Shobhit Shah - https://www.linkedin.com/in/shahshobhit/
- Sumit Maheshwari - https://www.linkedin.com/in/maheshwarisumit/
- Uber - https://www.linkedin.com/company/uber-com/
- Apache Airflow - https://airflow.apache.org/
- Airflow Summit - https://airflowsummit.org/
- Uber - https://www.uber.com/tw/en/
- Apache Airflow Survey - https://astronomer.typeform.com/airflowsurvey24

Thanks for listening to The Data Flowcast: Mastering Airflow for Data Engineering & AI. If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.

#AI #Automation #Airflow #MachineLearning

Building Resilient Data Systems for Modern Enterprises at Astrafy with Andrea Bombino (28:29)
Efficient data orchestration is the backbone of modern analytics and AI-driven workflows. Without the right tools, even the best data can fall short of its potential. In this episode, Andrea Bombino, Co-Founder and Head of Analytics Engineering at Astrafy, shares insights into his team’s approach to optimizing data transformation and orchestration using tools like datasets and Pub/Sub to drive real-time processing. Andrea explains how they leverage Apache Airflow and Google Cloud to power dynamic data workflows.

Key Takeaways:
- (01:55) Astrafy helps companies manage data using Google Cloud.
- (04:36) Airflow is central to Astrafy’s data engineering efforts.
- (07:17) Datasets and Pub/Sub are used for real-time workflows.
- (09:59) Pub/Sub links multiple Airflow environments.
- (12:40) Datasets eliminate the need for constant monitoring.
- (15:22) Airflow updates have improved large-scale data operations.
- (18:03) New Airflow API features make dataset updates easier.
- (20:45) Real-time orchestration speeds up data processing for clients.
- (23:26) Pub/Sub enhances flexibility across cloud environments.
- (26:08) Future Airflow features will offer more control over data workflows.

Resources Mentioned:
- Andrea Bombino - https://www.linkedin.com/in/andrea-bombino/
- Astrafy - https://www.linkedin.com/company/astrafy/
- Apache Airflow - https://airflow.apache.org/
- Google Cloud - https://cloud.google.com/
- dbt - https://www.getdbt.com/
- Apache Airflow Survey - https://astronomer.typeform.com/airflowsurvey24

Thanks for listening to “The Data Flowcast: Mastering Airflow for Data Engineering & AI.” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.

#AI #Automation #Airflow #MachineLearning
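The dataset-driven triggering described at (07:17) and (12:40) replaces polling: a consumer DAG runs once every dataset it depends on has received a fresh update. A minimal pure-Python simulation of that scheduling rule (the dataset URIs are made up for the sketch; Airflow's real implementation lives in its scheduler):

```python
# Simulation of Airflow dataset-driven scheduling: a consumer runs only
# once all of its upstream datasets have been updated since its last run.

class DatasetScheduler:
    def __init__(self, consumers: dict[str, set[str]]):
        self.consumers = consumers  # dag_id -> required dataset URIs
        self.pending: dict[str, set[str]] = {d: set() for d in consumers}

    def update(self, uri: str) -> list[str]:
        """Record one dataset update; return consumer DAGs now triggered."""
        triggered = []
        for dag_id, required in self.consumers.items():
            if uri in required:
                self.pending[dag_id].add(uri)
                if self.pending[dag_id] == required:
                    triggered.append(dag_id)
                    self.pending[dag_id] = set()  # reset for the next cycle
        return triggered

sched = DatasetScheduler({"gold_report": {"s3://bronze/orders", "s3://bronze/users"}})
print(sched.update("s3://bronze/orders"))  # []
print(sched.update("s3://bronze/users"))   # ['gold_report']
```

This is why "datasets eliminate the need for constant monitoring": nothing polls; the producer's update event is what schedules the consumer.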

Inside Airflow 3: Redefining Data Engineering with Vikram Koka (30:08)
Data orchestration is evolving faster than ever and Apache Airflow 3 is set to revolutionize how enterprises handle complex workflows. In this episode, we dive into the exciting advancements with Vikram Koka, Chief Strategy Officer at Astronomer and PMC Member at The Apache Software Foundation. Vikram shares his insights on the evolution of Airflow and its pivotal role in shaping modern data-driven workflows, particularly with the upcoming release of Airflow 3.

Key Takeaways:
- (02:36) Vikram leads Astronomer’s engineering and open-source teams for Airflow.
- (05:26) Airflow enables reliable data ingestion and curation.
- (08:17) Enterprises use Airflow for mission-critical data pipelines.
- (11:08) Airflow 3 introduces major architectural updates.
- (13:58) Multi-cloud and edge deployments are supported in Airflow 3.
- (16:49) Event-driven scheduling makes Airflow more dynamic.
- (19:40) Tasks in Airflow 3 can run in any language.
- (22:30) Multilingual task support is crucial for enterprises.
- (25:21) Data assets and event-based integration enhance orchestration.
- (28:12) Community feedback plays a vital role in Airflow 3.

Resources Mentioned:
- Vikram Koka - https://www.linkedin.com/in/vikramkoka/
- Astronomer - https://www.linkedin.com/company/astronomer/
- The Apache Software Foundation LinkedIn - https://www.linkedin.com/company/the-apache-software-foundation/
- Apache Airflow LinkedIn - https://www.linkedin.com/company/apache-airflow/
- Apache Airflow - https://airflow.apache.org/
- Astronomer - https://www.astronomer.io/
- The Apache Software Foundation - https://www.apache.org/
- Join the Airflow slack and/or Dev list - https://airflow.apache.org/community/
- Apache Airflow Survey - https://astronomer.typeform.com/airflowsurvey24

Thanks for listening to The Data Flowcast: Mastering Airflow for Data Engineering & AI. If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.

#AI #Automation #Airflow #MachineLearning

Building a Data-Driven HR Platform at 15Five with Guy Dassa (20:25)
Data and AI are revolutionizing HR, empowering leaders to measure performance and drive strategic decisions like never before. In this episode, we explore the transformation of HR technology with Guy Dassa, Chief Technology Officer at 15Five, as he shares insights into their evolving data platform. Guy discusses how 15Five equips HR leaders with tools to measure and take action on team performance, engagement and retention. He explains their data-driven approach, highlighting how Apache Airflow supports their data ingestion, transformation and AI integration.

Key Takeaways:
- (01:54) 15Five acts as a command center for HR leaders.
- (03:40) Tools like performance reviews, engagement surveys and an insights dashboard guide actionable HR steps.
- (05:33) Data visualization, insights and action recommendations enhance HR effectiveness and improve people outcomes.
- (07:08) Strict data confidentiality and sanitized AI model training.
- (09:21) Airflow is central to data transformation and enrichment.
- (11:15) Airflow enrichment DAGs integrate AI models.
- (13:33) Integration of Airflow and DBT enables efficient data transformation.
- (15:28) Synchronization challenges arise with reverse ETL processes.
- (17:10) Future plans include deeper Airflow integration with AI.
- (19:31) Emphasizing the need for DAG versioning and improved dependency visibility.

Resources Mentioned:
- Guy Dassa - https://www.linkedin.com/in/guydassa/
- 15Five - https://www.linkedin.com/company/15five/
- Apache Airflow - https://airflow.apache.org/
- MLflow - https://mlflow.org/
- DBT - https://www.getdbt.com/
- Kubernetes - https://kubernetes.io/
- RedShift - https://aws.amazon.com/redshift/
- 15Five - https://www.15five.com/

Thanks for listening to The Data Flowcast: Mastering Airflow for Data Engineering & AI. If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.

#AI #Automation #Airflow #MachineLearning

The Intersection of AI and Data Management at Dosu with Devin Stein (20:18)
Unlocking engineering productivity goes beyond coding — it’s about managing knowledge efficiently. In this episode, we explore the innovative ways in which Dosu leverages Airflow for data orchestration and supports the Airflow project. Devin Stein, Founder of Dosu, shares his insights on how engineering teams can focus on value-added work by automating knowledge management. Devin dives into Dosu’s purpose, the significance of AI in their product, and why they chose Airflow as the backbone for scheduling and data management.

Key Takeaways:
- (01:33) Dosu's mission to democratize engineering knowledge.
- (05:00) AI is central to Dosu's product for structuring engineering knowledge.
- (06:23) The importance of maintaining up-to-date data for AI effectiveness.
- (07:55) How Airflow supports Dosu’s data ingestion and automation processes.
- (08:45) The reasoning behind choosing Airflow over other orchestrators.
- (11:00) Airflow enables Dosu to manage both traditional ETL and dynamic workflows.
- (13:04) Dosu assists the Airflow project by auto-labeling issues and discussions.
- (14:56) Thoughtful collaboration with the Airflow community to introduce AI tools.
- (16:37) The potential of Airflow to handle more dynamic, scheduled workflows in the future.
- (18:00) Challenges and custom solutions for implementing dynamic workflows in Airflow.

Resources Mentioned:
- Apache Airflow - https://airflow.apache.org/
- Dosu Website - https://dosu.dev/

Thanks for listening to The Data Flowcast: Mastering Airflow for Data Engineering & AI. If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.

#AI #Automation #Airflow #MachineLearning

AI-Powered Vehicle Automation at Ford Motor Company with Serjesh Sharma (26:11)
Harnessing data at scale is the key to driving innovation in autonomous vehicle technology. In this episode, we uncover how advanced orchestration tools are transforming machine learning operations in the automotive industry. Serjesh Sharma, Supervisor of ADAS Machine Learning Operations (MLOps) at Ford Motor Company, joins us to discuss the challenges and innovations his team faces in enhancing vehicle safety and automation. Serjesh shares insights into the intricate data processes that support Ford's Advanced Driver Assistance Systems (ADAS) and how his team leverages Apache Airflow to manage massive data loads efficiently.

Key Takeaways:
(01:44) ADAS involves advanced features such as pre-collision assist and self-driving capabilities.
(04:47) Ensuring sensor accuracy and vehicle safety requires extensive data processing.
(05:08) Combining on-prem and cloud infrastructure optimizes data handling.
(09:27) Ford processes around one petabyte of data per week, using both CPUs and GPUs.
(10:33) Implementing software engineering best practices improves scalability and reliability.
(15:18) GitHub Issues streamline onboarding and infrastructure provisioning.
(17:00) Airflow's modular design allows Ford to manage complex data pipelines.
(19:00) Kubernetes pod operators help optimize resource usage for CPU-intensive tasks.
(20:35) Ford's scale challenges led to customized Airflow configurations for high concurrency.
(21:02) Advanced orchestration tools are pivotal in managing vast data landscapes in automotive innovation.

Resources Mentioned:
Serjesh Sharma - www.linkedin.com/in/serjeshsharma/
Ford Motor Company - www.linkedin.com/company/ford-motor-company/
Apache Airflow - airflow.apache.org/
Kubernetes - kubernetes.io/

Thanks for listening to The Data Flowcast: Mastering Airflow for Data Engineering & AI. If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.

#AI #Automation #Airflow #MachineLearning
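The episode mentions customized Airflow configurations for high concurrency (20:35) without spelling them out. As an illustration only, these are the kinds of Airflow 2.x settings a team might raise for a workload at this scale; the values below are hypothetical, not Ford's actual configuration:

```ini
[core]
# Maximum task instances allowed to run simultaneously across the deployment.
parallelism = 512

# Per-DAG cap on concurrently running task instances.
max_active_tasks_per_dag = 64

# How many runs of a single DAG may be active at once.
max_active_runs_per_dag = 4
```

Settings like these trade scheduler and worker load against throughput, so they are typically tuned against the capacity of the underlying Kubernetes cluster rather than set once.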
The Data Flowcast: Mastering Apache Airflow® for Data Engineering and AI
From Task Failures to Operational Excellence at GumGum with Brendan Frick (24:06)
Data failures are inevitable, but how you manage them can define the success of your operations. In this episode, we dive deep into the challenges of data engineering and AI with Brendan Frick, Senior Engineering Manager, Data at GumGum. Brendan shares his unique approach to managing task failures and DAG issues in a high-stakes ad-tech environment, and discusses how GumGum leverages Apache Airflow to streamline data processes, ensuring efficient data movement and orchestration while minimizing disruptions in their operations.

Key Takeaways:
(02:02) Brendan's role at GumGum and its approach to ad tech.
(04:27) How GumGum uses Airflow for daily data orchestration, moving data from S3 to warehouses.
(07:02) Handling task failures in Airflow with Jira for actionable, developer-friendly responses.
(09:13) Transitioning from email alerts to a more structured system built on Jira and PagerDuty.
(11:40) Monitoring task retry rates as a key metric to identify potential issues early.
(14:15) Using Looker dashboards to track and analyze task performance and retry rates.
(16:39) Moving from the Kubernetes operator to a more reliable system for data processing.
(19:25) The importance of automating stakeholder communication with data lineage tools like Atlan.
(20:48) Implementing data contracts to ensure SLAs are met across all data processes.
(22:01) The role of scalable SLAs in Airflow in ensuring data reliability and meeting business needs.

Resources Mentioned:
Brendan Frick - https://www.linkedin.com/in/brendan-frick-399345107/
GumGum - https://www.linkedin.com/company/gumgum/
Apache Airflow - https://airflow.apache.org/
Jira - https://www.atlassian.com/software/jira
Atlan - https://atlan.com/
Kubernetes - https://kubernetes.io/

Thanks for listening to The Data Flowcast: Mastering Airflow for Data Engineering & AI. If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.

#AI #Automation #Airflow #MachineLearning