Bas Harenslak is a Senior Solutions Architect in the Netherlands with a decade of experience building zero-maintenance, data-driven products that prioritize performance, scalability and reliability. He is a co-author of Data Pipelines with Apache Airflow and a committer to the widely used Apache Airflow project, contributing core refactors and improved DAG examples. His hands-on background spans cloud migrations and platform builds across AWS, GCP and Azure, containerized deployments (Kubernetes/EKS/GKE), and productionizing ML workflows and data pipelines. He combines an SRE mindset with a start-up mentality—obsessed with making systems that a user can trigger with a single click and then never worry about again. A cum laude MSc in Computer Science and an eclectic early career (including a role as an executive chauffeur) reflect both technical rigor and an unusual emphasis on professionalism and reliability.
10 years of coding experience
12 years of employment as a software developer
Master of Science (MSc), Computer Science, Cum laude at Leiden University
B ICT (Bachelor of Information and Communication Technology), Software Engineering at De Haagse Hogeschool / The Hague University of Applied Sciences
VWO (Pre-University Education) at Maerlant Lyceum
Contributions: 1 review, 89 commits, 23 PRs in 3 years 7 months
Contributions summary: Bas primarily contributed data pipelines built with Apache Airflow, creating DAGs for a range of data-processing tasks. He implemented DAGs that download and process data from external sources, specifically rocket launch data and pageview data, and set up execution environments for both the sequential and local executors. His work also included the scripting and configuration that support the surrounding pipeline infrastructure.
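Pipelines like those described above follow Airflow's standard pattern: a Python file defines a DAG of tasks and their dependencies, here a download step followed by a processing step. The sketch below is illustrative only, not Bas's actual code; the task names, file path, and API endpoint are assumptions, and it targets the Airflow 2.x operator imports.

```python
# Illustrative download-then-process DAG in the style described above.
# Assumes Airflow 2.x; names, paths, and the endpoint are examples only.
import json
import pathlib

import pendulum
from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def _count_launches(path: str = "/tmp/launches.json") -> None:
    # Parse the downloaded JSON and log a simple derived metric.
    data = json.loads(pathlib.Path(path).read_text())
    print(f"{len(data.get('results', []))} upcoming launches")


with DAG(
    dag_id="download_rocket_launches",
    start_date=pendulum.datetime(2024, 1, 1, tz="UTC"),
    schedule=None,  # triggered manually, no fixed schedule
    catchup=False,
) as dag:
    download = BashOperator(
        task_id="download_launches",
        # Hypothetical endpoint used purely for illustration.
        bash_command=(
            "curl -sSf -o /tmp/launches.json "
            "'https://ll.thespacedevs.com/2.2.0/launch/upcoming'"
        ),
    )
    process = PythonOperator(
        task_id="count_launches",
        python_callable=_count_launches,
    )
    download >> process  # download must finish before processing runs
```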
Apache Airflow - A platform to programmatically author, schedule, and monitor workflows
Role in this project:
Back-end & DevOps Engineer
Contributions: 134 reviews, 211 PRs, 57 pushes in 7 years 2 months
Contributions summary: Bas refactored and improved various example DAGs within the Apache Airflow project, cleaning up code and fixing typos to raise overall code quality. He also moved models and features such as DagPickle, the Kube classes, SkipMixin, Connection, and errors out of models.py, contributing to a cleaner architecture. In addition, he added features to the `BranchPythonOperator` to improve its usability.
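For context, `BranchPythonOperator` lets a DAG choose which downstream path to follow at runtime: its callable returns the task_id (or list of task_ids) to execute, and the other branches are skipped. A minimal sketch of typical usage, assuming Airflow 2.x imports; the task names and the branching rule are invented for illustration and do not reflect the specific features Bas added:

```python
# Minimal BranchPythonOperator sketch (Airflow 2.x). The branching rule
# and task names are illustrative assumptions, not Airflow project code.
import pendulum
from airflow import DAG
from airflow.operators.empty import EmptyOperator
from airflow.operators.python import BranchPythonOperator


def _pick_branch(**context) -> str:
    # Return the task_id of the branch to follow; the rest are skipped.
    logical_date = context["logical_date"]
    return "weekend_path" if logical_date.day_of_week in (5, 6) else "weekday_path"


with DAG(
    dag_id="branch_example",
    start_date=pendulum.datetime(2024, 1, 1, tz="UTC"),
    schedule="@daily",
    catchup=False,
) as dag:
    branch = BranchPythonOperator(
        task_id="pick_branch",
        python_callable=_pick_branch,
    )
    # Only one of these runs per DAG run, depending on the callable's return.
    branch >> [
        EmptyOperator(task_id="weekday_path"),
        EmptyOperator(task_id="weekend_path"),
    ]
```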