STRATEGIC STAFFING SOLUTIONS HAS AN OPENING!
This is a W2-only contract opportunity with our company; no C2C eligibility for this position. Visa sponsorship is available! The details are below.
Beware of scams. S3 never asks for money during its onboarding process.
Job Title: Senior Database Engineer
Contract Length: 18+ months
Some on-site work required
Location: Charlotte, NC / Iselin, NJ 08830
Ref# 246059
We are seeking a Senior-level Database Engineer to design, build, and optimize large-scale data pipelines within a high-volume enterprise data environment. This role supports critical applications tied to fraud and claims analysis, working across legacy and modern cloud platforms.
The environment is undergoing a major transformation from Teradata to Google Cloud Platform (GCP), requiring hands-on engineering expertise in both the existing and target-state architectures.
Key Responsibilities
- Design, develop, and maintain scalable ETL/data pipeline solutions
- Work with large-scale datasets (hundreds of terabytes across hundreds of tables)
- Support migration efforts from Teradata to Google Cloud Platform (BigQuery-based ecosystem)
- Build and optimize pipelines using PySpark and ETL frameworks
- Collaborate with stakeholders across fraud and analytics teams to support data needs
- Ensure performance, reliability, and data quality across pipeline workflows
- Troubleshoot and resolve production issues in distributed data environments
- Work within scheduling and orchestration tools to manage pipeline execution
Required Qualifications
- 5+ years of data engineering or software engineering experience
- Strong expertise in:
  - SQL
  - ETL development
  - PySpark
- Hands-on experience with:
  - Autosys (job scheduling)
  - Ab Initio
- Experience building and maintaining large-scale data pipelines
- Ability to work in hybrid environments (on-prem + cloud)
Preferred Qualifications
- Experience with Google Cloud Platform (GCP), especially BigQuery
- Prior experience with Teradata
- Familiarity with Hadoop ecosystem
- Exposure to tools such as Dremio and distributed storage systems
- Cloud certifications (GCP preferred)
Technical Environment
- Current: Teradata-based platform
- Target: Google Cloud Platform (BigQuery ecosystem)
- Tools & Technologies:
  - PySpark
  - Hadoop
  - Ab Initio
  - Autosys
  - Dremio
  - S3-compatible storage systems