Sr Python Developer & Lead

  • Auburn Hills, MI
  • Posted 3 hours ago | Updated 3 hours ago

Overview

On Site
$65 - $70
Contract - W2
Contract - 12 Month(s)

Skills

Advanced Analytics
Agile
Automated Testing
Cloud Computing
Cloudera
Collaboration
Command-line Interface
Computer Science
Confluence
Continuous Delivery
Continuous Integration
Data Analysis
Data Engineering
Data-flow Diagrams
Design Patterns
Docker
Documentation
Fluency
Git
GitHub
GitLab
Google Cloud Platform
IT Management
JIRA
Knowledge Sharing
Linux
Management
PySpark
Python
Quality Assurance
SQL
Mentorship
Object-Oriented Programming
Orchestration
Pair Programming
Pivotal
Testing
Unix
Version Control
Waterfall
Workflow
Streaming
Systems Design
Tableau
Technical Writing
Test-driven Development

Job Details

Title: Sr Python Developer & Lead

Location: Auburn Hills, MI

Overall Experience: 8+ years of relevant experience

Mandatory Skills: Data Engineering, Python, PySpark, CI/CD, Airflow, Workflow Orchestration

Job Requirements

The Sr. Architect & Technical Lead (SDET Lead) will play a pivotal role in delivering major data engineering initiatives within the Data & Advanced Analytics space. This position requires hands-on expertise in building, deploying, and maintaining robust data pipelines using Python, PySpark, and Airflow, as well as designing and implementing CI/CD processes for data engineering projects.

Key Responsibilities
1. Data Engineering: Design, develop, and optimize scalable data pipelines using Python and PySpark for batch and streaming workloads.
2. Workflow Orchestration: Build, schedule, and monitor complex workflows using Airflow, ensuring reliability and maintainability (an illustrative sketch follows this list).
3. CI/CD Pipeline Development: Architect and implement CI/CD pipelines for data engineering projects using GitHub, Docker, and cloud-native solutions.
4. Testing & Quality: Apply test-driven development (TDD) practices and automate unit/integration tests for data pipelines.
5. Secure Development: Implement secure coding best practices and design patterns throughout the development lifecycle.
6. Collaboration: Work closely with Data Architects, QA teams, and business stakeholders to translate requirements into technical solutions.
7. Documentation: Create and maintain technical documentation, including process/data flow diagrams and system design artifacts.
8. Mentorship: Lead and mentor junior engineers, providing guidance on coding, testing, and deployment best practices.
9. Troubleshooting: Analyze and resolve technical issues across the data stack, including pipeline failures and performance bottlenecks.
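
For illustration only, here is a minimal sketch of the kind of pipeline-plus-orchestration work described in responsibilities 1 and 2: a small PySpark batch aggregation wrapped in an Airflow DAG. The dataset paths, column names, schedule, and DAG/task ids are assumptions chosen for the example, not project specifics, and it presumes Airflow 2.4+ and PySpark are installed.

    # illustrative_daily_aggregates.py -- a sketch, not project code.
    # Assumes Airflow 2.4+ and PySpark; paths and column names are made up.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator
    from pyspark.sql import SparkSession, functions as F


    def build_daily_aggregates(input_path: str, output_path: str) -> None:
        """Batch job: keep positive amounts and total them per customer."""
        spark = SparkSession.builder.appName("daily_aggregates").getOrCreate()
        try:
            df = spark.read.parquet(input_path)
            totals = (
                df.filter(F.col("amount") > 0)
                  .groupBy("customer_id")
                  .agg(F.sum("amount").alias("total_amount"))
            )
            totals.write.mode("overwrite").parquet(output_path)
        finally:
            spark.stop()


    # Airflow DAG that schedules the job daily; failed runs surface in the UI
    # and alerting, which is the "monitoring" half of responsibility 2.
    with DAG(
        dag_id="daily_aggregates",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        PythonOperator(
            task_id="build_daily_aggregates",
            python_callable=build_daily_aggregates,
            op_kwargs={
                "input_path": "/data/raw/transactions",
                "output_path": "/data/curated/daily_aggregates",
            },
        )

In a production setup the Spark step would more likely be submitted through a dedicated operator (such as the Spark provider's SparkSubmitOperator) or run in its own container, keeping the Airflow scheduler lightweight.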

Cross-Team Knowledge Sharing:
Cross-train team members outside the project team (e.g., operations support) for full knowledge coverage. Includes all of the above skills, plus the following:

  • 7+ years of overall IT experience
  • Experienced in waterfall, iterative, and agile methodologies

Technical Experience:

1. Hands-on Data Engineering: Minimum 5+ years of practical experience building production-grade data pipelines using Python and PySpark.
2. Airflow Expertise: Proven track record of designing, deploying, and managing Airflow DAGs in enterprise environments.
3. CI/CD for Data Projects: Ability to build and maintain CI/CD pipelines for data engineering workflows, including automated testing and deployment (a brief test sketch follows this list).
4. Cloud & Containers: Experience with containerization (Docker) and cloud platforms (Google Cloud Platform) for data engineering workloads; appreciation for twelve-factor design principles.
5. Python Fluency: Ability to write object-oriented Python code, manage dependencies, and follow industry best practices.
6. Version Control: Proficiency with Git for source code management and collaboration (commits, branching, merging, GitHub/GitLab workflows).
7. Unix/Linux: Strong command-line skills in Unix-like environments.
8. SQL: Solid understanding of SQL for data ingestion and analysis.
9. Collaborative Development: Comfortable with code reviews, pair programming, and using remote collaboration tools effectively.
10. Engineering Mindset: Writes code with an eye for maintainability and testability; excited to build production-grade software.
11. Education: Bachelor's or graduate degree in Computer Science, Data Analytics, or a related field, or equivalent work experience.
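
As a flavor of the test-driven, CI-friendly style referenced above, here is a minimal pytest sketch for a PySpark transformation. The function under test, column names, and expected values are assumptions chosen for illustration; the test needs only a local SparkSession, so it can run inside a CI job without a cluster.

    # test_daily_aggregates.py -- illustrative sketch; names and data are made up.
    import pytest
    from pyspark.sql import SparkSession, functions as F


    def filter_and_total(df):
        """Logic under test: drop non-positive amounts, total per customer."""
        return (
            df.filter(F.col("amount") > 0)
              .groupBy("customer_id")
              .agg(F.sum("amount").alias("total_amount"))
        )


    @pytest.fixture(scope="session")
    def spark():
        # Local single-threaded session so the suite runs anywhere CI does.
        session = (
            SparkSession.builder.master("local[1]").appName("tests").getOrCreate()
        )
        yield session
        session.stop()


    def test_filter_and_total_ignores_non_positive_amounts(spark):
        df = spark.createDataFrame(
            [("c1", 10.0), ("c1", -5.0), ("c2", 3.0)],
            ["customer_id", "amount"],
        )
        result = {
            row["customer_id"]: row["total_amount"]
            for row in filter_and_total(df).collect()
        }
        assert result == {"c1": 10.0, "c2": 3.0}

Writing the assertion first and then the transformation is the TDD loop the posting describes, and the same test file can be wired into a GitHub Actions or GitLab CI step so every merge request runs it automatically.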

Unique Skills

  • Graduate degree in a related field, such as Computer Science or Data Analytics
  • Familiarity with Test-Driven Development (TDD)
  • A high tolerance for OpenShift, Cloudera, Tableau, Confluence, Jira, and other enterprise tools.