Position: Senior DevOps / Data Engineer
Work Location: Coopersburg, PA (Onsite)
Type of Employment: Contract
Experience Required: 12+ Years
Start Date: Immediate
Role Overview:
We are seeking a hybrid DevOps and Data Engineering professional, with a stronger emphasis on DevOps capabilities. The role owner will be required to modify and manage ETL pipelines as part of platform operations, and should have hands-on DevOps experience along with a good understanding of data pipelines.
Key Result Areas and Activities
- Modify and manage data pipelines in a DevOps setup
- Good understanding of CI/CD pipelines
- Development responsibilities
- Software Asset Maintenance & Upgrades
  - Upgrade and maintain third-party components such as ingress controllers, service meshes, monitoring agents, infrastructure libraries, and cloud-native tools.
  - Apply the latest versions and security patches to ensure compliance, stability, and performance.
- Infrastructure as Code (IaC) Enhancements
  - Update and enhance IaC scripts to support version upgrades across development, QA, and production environments.
  - Validate changes through sandbox testing before deployment to production (a minimal validation sketch follows this list).
- Compatibility & Dependency Management
  - Ensure upgraded components remain compatible with dependent services and applications.
  - Identify and mitigate potential breaking changes or dependency conflicts.
- Application Code Adjustments
  - Implement necessary code changes in supported languages (e.g., Python) to accommodate new versions or configuration requirements.
  - Address minor and moderate changes required for compatibility with upgraded components.
  - Update existing unit tests in response to the application code changes.
- Security & Compliance
  - Apply immediate fixes for vulnerabilities.
  - Maintain adherence to organizational security and governance guidelines.
- Testing & Validation
  - Create and execute test strategies for validating the upgrades.
  - Execute existing unit tests and manual test cases post-upgrade.
  - Conduct functional testing of impacted applications to ensure end-to-end stability.
  - Validate application behaviour after code changes and infrastructure updates.
- Reporting & Governance
  - Provide weekly status reports detailing software versions, security posture, upgrade activities, and testing outcomes.
  - Participate in regular reviews and acceptance processes.
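For illustration, a minimal Python sketch of the sandbox-validation step above, assuming the Terraform CLI is on PATH and the sandbox configuration lives in a hypothetical envs/sandbox directory; the -detailed-exitcode flag makes terraform plan exit 0 (no changes), 1 (error), or 2 (changes pending):

import subprocess
import sys

SANDBOX_DIR = "envs/sandbox"  # hypothetical sandbox layout, not a real path from this posting

def run(cmd):
    """Run a Terraform command in the sandbox directory and return its exit code."""
    print("$ " + " ".join(cmd))
    return subprocess.run(cmd, cwd=SANDBOX_DIR).returncode

def validate_upgrade():
    # Re-resolve providers and modules against the new version constraints.
    if run(["terraform", "init", "-input=false", "-upgrade"]) != 0:
        sys.exit("init failed: check provider/module version constraints")
    # -detailed-exitcode: 0 = no changes, 1 = error, 2 = changes pending.
    code = run(["terraform", "plan", "-input=false", "-detailed-exitcode"])
    if code == 1:
        sys.exit("plan errored: the upgrade likely breaks the configuration")
    if code == 2:
        print("plan succeeded with pending changes: review before promoting to QA/prod")
    else:
        print("plan succeeded with no changes")

if __name__ == "__main__":
    validate_upgrade()

Gating promotion on the plan exit code keeps a broken upgrade out of QA and production while still surfacing pending changes for human review.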
Work and Technical Experience
Must-Have Skill Set
- Terraform
- CI/CD Pipelines
- Platform upgrades and maintenance
- QA exposure: integration and platform testing
- Deep understanding of cloud data services (AWS) and migration strategies.
- Strong proficiency in ETL/ELT pipelines and framework development using Python; development may not be required, but understanding is (a minimal ETL sketch follows this list).
- Modifying & monitoring data pipelines for batch and real-time processing.
- Exceptional communication skills for engaging executives and non-technical stakeholders.
- Knowledge of containerization (Docker, Kubernetes) and orchestration for data workloads.
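For illustration, a minimal batch ETL skeleton in Python; the file paths and the extract/transform/load helpers are placeholders, not the actual pipeline framework used on this platform:

import csv
import json
import logging
from pathlib import Path

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("etl")

def extract(src):
    """Yield one dict per row from a CSV source."""
    with src.open(newline="") as f:
        yield from csv.DictReader(f)

def transform(row):
    """Normalize a row; a real pipeline would also validate and enrich here."""
    return {k.strip().lower(): v.strip() for k, v in row.items()}

def load(rows, dst):
    """Write rows as JSON lines and return the record count."""
    count = 0
    with dst.open("w") as f:
        for row in rows:
            f.write(json.dumps(row) + "\n")
            count += 1
    return count

if __name__ == "__main__":
    n = load((transform(r) for r in extract(Path("input.csv"))), Path("output.jsonl"))
    log.info("loaded %d records", n)

The same three-stage shape applies whether the source is a file, a relational database, or a streaming topic.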
Good-to-Have Skill Set
- Certifications in cloud platforms (AWS) and data engineering.
- Experience with advanced analytics and machine learning pipelines.
- Prior consulting experience or leading large-scale data transformation programs.
- Knowledge of data extraction from SAP OData Services.
- Experience with multiple relational and NoSQL databases (Redshift, RDS, and Athena are required; see the Athena sketch after this list).
- Experience with BI tools integration with enterprise data platforms.
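For illustration, a hedged Python sketch of running a query against Athena with boto3; the database name and the S3 output location are placeholders, and credentials are assumed to come from the standard AWS credential chain:

import time
import boto3

athena = boto3.client("athena")

def run_query(sql, database, output):
    """Start an Athena query, poll until it finishes, and return the raw result rows."""
    qid = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": database},
        ResultConfiguration={"OutputLocation": output},
    )["QueryExecutionId"]
    while True:
        state = athena.get_query_execution(QueryExecutionId=qid)["QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(2)  # simple polling; production code would bound retries
    if state != "SUCCEEDED":
        raise RuntimeError("Athena query ended in state " + state)
    return athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"]

if __name__ == "__main__":
    rows = run_query(
        "SELECT 1",                       # trivial health-check query
        database="default",               # placeholder database
        output="s3://my-bucket/athena/",  # placeholder results location
    )
    print(rows)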
Qualification:
- Bachelor's degree in computer science, engineering, or related field (master's degree is a plus)
- Demonstrated continued learning through one or more technical certifications or related methods
- At least 12 years of relevant experience; a master's degree may be substituted for two years of experience
Key expectations:
- Please submit a maximum of 3 quality submissions per week.
- Key Skills: strong hands-on experience with DevOps (platform upgrades, CI/CD pipelines, IaC with Terraform) and Data Engineering (managing ETL pipelines, Python knowledge).