Sr Python Developer & Lead (SDET)

Overview

On Site
Hybrid
BASED ON EXPERIENCE
Contract - W2
Contract - 6+ mo(s)

Skills

PYTHON
PYSPARK
AIRFLOW
DATA ENGINEERING
CI/CD
CLOUD
GIT
SDET

Job Details

Sr Python Developer & Lead (SDET) - 25-06712
Hybrid/Onsite in Auburn Hills, MI (3-Days per Week Onsite)
6+ Months Duration
W2 ONLY - Must be able to work directly with NTT Data | NO C2C

At NTT DATA, we know that with the right people on board, anything is possible. The quality, integrity, and commitment of our employees have been key factors in our company's growth and market presence. By hiring the best people and helping them grow both professionally and personally, we ensure a bright future for NTT DATA and for the people who work here.

For more than 25 years, NTT DATA Services has focused on impacting the core of your business operations with industry-leading outsourcing services and automation. With our industry-specific platforms, we deliver continuous value addition and innovation that will improve your business outcomes. Outsourcing is not just a method of gaining a one-time cost advantage, but an effective strategy for gaining and maintaining competitive advantages when executed as part of an overall sourcing strategy.

Day-to-Day Job Duties: (what this person will do on a daily/weekly basis)

The Senior Data Engineer & Technical Lead (SDET Lead) will play a pivotal role in delivering major data engineering initiatives within the Data & Advanced Analytics space. This position requires hands-on expertise in building, deploying, and maintaining robust data pipelines using Python, PySpark, and Airflow, as well as designing and implementing CI/CD processes for data engineering projects.

Key Responsibilities

  1. Data Engineering: Design, develop, and optimize scalable data pipelines using Python and PySpark for batch and streaming workloads.
  2. Workflow Orchestration: Build, schedule, and monitor complex workflows using Airflow, ensuring reliability and maintainability.
  3. CI/CD Pipeline Development: Architect and implement CI/CD pipelines for data engineering projects using GitHub, Docker, and cloud-native solutions.
  4. Testing & Quality: Apply test-driven development (TDD) practices and automate unit/integration tests for data pipelines.
  5. Secure Development: Implement secure coding best practices and design patterns throughout the development lifecycle.
  6. Collaboration: Work closely with Data Architects, QA teams, and business stakeholders to translate requirements into technical solutions.
  7. Documentation: Create and maintain technical documentation, including process/data flow diagrams and system design artifacts.
  8. Mentorship: Lead and mentor junior engineers, providing guidance on coding, testing, and deployment best practices.
  9. Troubleshooting: Analyze and resolve technical issues across the data stack, including pipeline failures and performance bottlenecks.

Basic Qualifications: (skills required for this job, with minimum years of experience for each)

Minimum years of practical experience are listed for each of the points below.

  1. Hands-on Data Engineering: Minimum 5+ years of practical experience building production-grade data pipelines using Python and PySpark.
  2. Airflow Expertise: Proven track record of designing, deploying, and managing Airflow DAGs in enterprise environments. Requires 5+ years of experience.
  3. CI/CD for Data Projects: Ability to build and maintain CI/CD pipelines for data engineering workflows, including automated testing and deployment. Requires 5+ years of experience.
  4. Cloud & Containers: Experience with containerization (Docker) and cloud platforms (Google Cloud Platform) for data engineering workloads; appreciation for twelve-factor design principles. Requires 5+ years of experience.
  5. Python Fluency: Ability to write object-oriented Python code, manage dependencies, and follow industry best practices. Requires 5+ years of experience.
  6. Version Control: Proficiency with Git for source code management and collaboration (commits, branching, merging, GitHub/GitLab workflows). Requires 5+ years of experience.
  7. Unix/Linux: Strong command-line skills in Unix-like environments. Requires 3+ years of experience.
  8. SQL: Solid understanding of SQL for data ingestion and analysis. Requires 3+ years of experience.
  9. Collaborative Development: Comfortable with code reviews, pair programming, and using remote collaboration tools effectively. Requires 3+ years of experience.
  10. Engineering Mindset: Writes code with an eye for maintainability and testability; excited to build production-grade software. Requires 3+ years of experience.
  11. Education: Bachelor's or graduate degree in Computer Science, Data Analytics, or related field, or equivalent work experience.

Cross-Team Knowledge Sharing: Cross-train team members outside the project team (e.g., operations support) for full knowledge coverage.

Includes all of the above key responsibilities and skills, plus the following:
  • Minimum of 7+ years of overall IT experience
  • Experience with waterfall, iterative, and agile methodologies


Preferred Qualifications:
Graduate degree in a related field, such as Computer Science or Data Analytics
Familiarity with Test-Driven Development (TDD)
Familiarity with OpenShift, Cloudera, Tableau, Confluence, Jira, and other enterprise tools

About NTT DATA Services:

NTT DATA Services is a recognized leader in IT and business services, including cloud, data and applications, headquartered in Texas. As part of NTT DATA, a $30 billion trusted global innovator with a combined global reach of over 80 countries, we help clients transform through business and technology consulting, industry and digital solutions, applications development and management, managed edge-to-cloud infrastructure services, BPO, systems integration and global data centers. We are committed to our clients' long-term success. Visit nttdata.com or LinkedIn to learn more.

NTT DATA Services is an equal opportunity employer and considers all applicants without regard to race, color, religion, citizenship, national origin, ancestry, age, sex, sexual orientation, gender identity, genetic information, physical or mental disability, veteran or marital status, or any other characteristic protected by law. We are committed to creating a diverse and inclusive environment for all employees. If you need assistance or accommodation due to a disability, please inform your recruiter so that we may connect you with the appropriate team.

Where required by law, NTT DATA provides a reasonable range of compensation for specific roles. The starting hourly range for this role is $65 to $70. This range reflects the minimum and maximum target compensation for the position across all US locations. Actual compensation will depend on several factors, including the candidate's actual work location, relevant experience, technical skills, and other qualifications.

This position is eligible for company benefits that will depend on the nature of the role offered. Company benefits may include medical, dental, and vision insurance, flexible spending or health savings account, life, and AD&D insurance, short-and long-term disability coverage, paid time off, employee assistance, participation in a 401k program with company match, and additional voluntary or legally required benefits.

Employers have access to artificial intelligence language tools ("AI") that help generate and enhance job descriptions, and AI may have been used to create this description. The position description has been reviewed for accuracy, and Dice believes it correctly reflects the job opportunity.
