Overview
Skills
Job Details
Required Skills -
Data Engineering with Python & PySpark, Databricks, Containers, Cloud, Automated testing, Agile
Job Duties -
Be an active participant in all scrum ceremonies and lead in the delivery of the strategic roadmap.
Responsible for the full application development lifecycle and support.
Provide subject matter expertise and direction, guidance, and support on complex engagements.
Collaborate with the Architecture, Product, Development, and other Information Technology (IT) teams.
Initiate process improvements for new and existing systems.
Design, build, and maintain scalable and efficient data pipelines using Python and PySpark.
Develop and optimize ETL/ELT processes to ingest, transform, and load large datasets from various sources.
Implement data quality checks and monitoring to ensure the accuracy and integrity of our data.
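The three pipeline duties above share one shape: extract, transform, validate, load. A minimal stdlib-only Python sketch of that shape is below; a production PySpark job would do the same steps on distributed DataFrames (e.g., reading with spark.read.csv), and all field names and rules here are hypothetical.

```python
import csv
import io

# Hypothetical raw source: CSV text with an id and an amount per row.
RAW = """id,amount
1,10.5
2,-3.0
3,abc
4,42.0
"""

def extract(text):
    """Extract stage: parse CSV text into a list of dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform stage: cast types, dropping rows that fail to parse
    (a basic data-quality check; a real pipeline would quarantine/log them)."""
    out = []
    for r in rows:
        try:
            out.append({"id": int(r["id"]), "amount": float(r["amount"])})
        except ValueError:
            continue
    return out

def validate(rows):
    """Data-quality gate: reject rows with negative amounts."""
    return [r for r in rows if r["amount"] >= 0]

# Load stage stands in for writing to a warehouse/lake target.
rows = validate(transform(extract(RAW)))
print(len(rows), sum(r["amount"] for r in rows))
```

Row 3 is dropped at the transform stage (unparseable amount) and row 2 at the validation gate (negative amount), leaving two clean rows to load.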
Job Requirements -
7+ years of Data/Software Engineering experience
5+ years of hands-on development experience in Python and PySpark for large-scale data processing
4+ years of experience with containers (Docker and orchestration, e.g., Kubernetes, EKS, or Red Hat OpenShift)
4+ years of experience with test automation
5+ years of experience working with SQL and NoSQL databases
Experience with version control systems, such as Git
Work experience in complex enterprise application delivery using Agile development methodology
Bachelor's degree in Computer Science or equivalent work experience