Overview
On Site
Depends on Experience
Accepts corp to corp applications
Contract - W2
Contract - Independent
Contract - 12 Month(s)
No Travel Required
Skills
Apache Airflow
Artificial Intelligence
Cloud Computing
Collaboration
Communication
Data Architecture
Job Details
- The Solutions Architect specializing in Cloud Orchestration will work with a small group of technologists and become a trusted advisor delivering technical solutions leveraging Apache Airflow hosted on the Astronomer platform. In this role, the architect will engineer a wide range of use cases where Apache Airflow sits at the center of the solution, with the goal of rapidly developing solutions for those use cases.
- Help guide a small group of technologists in their Apache Airflow journeys, including identifying new use cases and onboarding new domain teams.
- Review, optimize, and tune data ingestion and extraction pipelines orchestrated by Airflow.
- Develop frameworks to manage and engineer a large number of pipelines across the entire domain.
- Build architecture, data flow, and operational diagrams and documents, with detailed physical and logical layers.
- Provide reference implementations of various activities, including composing data pipelines in Airflow, implementing new Airflow features, or integrating Airflow with third-party solutions (a minimal DAG sketch follows this list).
- Keep up with the latest Astro and Airflow features in order to better advise the technologists on impactful improvements they can make.
- Collaborate to build reusable assets, automation tools, and documented best practices.
- Interact with Domain and Engineering teams to channel product feedback and requirements discussions.
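For illustration, below is a minimal sketch of the kind of Airflow reference implementation described above, assuming a recent Airflow 2.x deployment such as one hosted on Astronomer; the DAG id, task names, and schedule are placeholders, not details of this role.

```python
# Minimal, illustrative Airflow DAG using the TaskFlow API.
# All names here are placeholders for a generic extract-and-load pipeline.
from datetime import datetime

from airflow.decorators import dag, task


@dag(
    dag_id="example_ingest_pipeline",
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    start_date=datetime(2024, 1, 1),
    catchup=False,
    tags=["example"],
)
def example_ingest_pipeline():
    @task
    def extract() -> list[dict]:
        # Placeholder extract step; a real pipeline would pull from a source system.
        return [{"id": 1, "value": 42}]

    @task
    def load(records: list[dict]) -> None:
        # Placeholder load step; a real pipeline would write to a warehouse.
        print(f"Loaded {len(records)} records")

    load(extract())


example_ingest_pipeline()
```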
Skills Required:
- Experience with Apache Airflow in production environments.
- Experience in designing and implementing ETL, Data Warehousing, and ML/AI use cases.
- Proficiency in Python.
- Knowledge of Azure cloud-native data architecture.
- Demonstrated technical leadership on team projects.
- Strong oral and written communication skills.
- Willingness to learn new technologies and build reference implementations.
- Experience in migrating workflows from legacy schedulers (Tidal) to Apache Airflow.
- Experience in integrating with Azure Data Factory pipelines.
- Snowflake and Databricks experience.
- Experience working with containers.
- SQL experience.
- Kubernetes experience, either on-premise or in the cloud.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.