Overview
Skills
Job Details
STRATEGIC STAFFING SOLUTIONS HAS AN OPENING!
This is a Contract Opportunity with our company that MUST be worked on a W2 only. No C2C eligibility for this position. Visa sponsorship is available! The details are below.
Beware of scams. S3 never asks for money during its onboarding process.
Job Title: Software Engineer
Location: CHARLOTTE NC 28202
Contract Length: 24+ Months
Job ref #242847
About the Role
We are seeking a highly skilled and adaptable Lead Software Engineer to join our Counterparty Credit Risk organization. This role supports Data Services, with a focus on modernizing legacy systems, managing high-volume data pipelines, and contributing to full-stack application development.
You will be a team member supporting business-as-usual (BAU) processes, while also contributing to the development of a new data platform over the next two years. The ideal candidate is a strong communicator, a proactive problem-solver, and comfortable working in a Kanban Agile environment.
Key Responsibilities
- Lead Agile Development: Guide and support multiple Agile teams focused on data extraction, ingestion, and transformation.
- Modernize Legacy Systems: Migrate data pipelines from Ab Initio and file-system-based storage to modern technologies such as PySpark, S3, Airflow, Parquet, and Iceberg (see the sketch after this list).
- Full-Stack Engineering: Design and develop scalable backend services using PySpark and Python.
- Data Platform Enablement: Support ingestion of 300+ data feeds into the platform to ensure timely nightly batch processing.
- Cross-Functional Collaboration: Partner with business stakeholders and product owners to understand requirements and deliver effective solutions.
- Agile Execution: Work with both Kanban and Scrum teams, participate in regular check-ins, and manage tasks via Jira.
- Mentorship and Communication: Provide technical guidance, foster collaboration, and proactively seek help when encountering blockers.
- Platform Transition Support: Contribute to the migration from legacy systems to a new data platform over the next two years.
- BAU and Strategic Support: Balance business-as-usual responsibilities while contributing to long-term platform modernization.
- Documentation and Data Modeling: Maintain clear technical documentation and demonstrate a strong understanding of columnar data structures.
- Experience with common big-data file formats (Parquet, ORC, Avro).
- Experience with containerized deployments using Docker/Kubernetes.
- A Java background is a plus.
- Good knowledge of large-scale ETL frameworks.
- Experience with ETL tools (Ab Initio).
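For context on the kind of modernization work described above, here is a minimal, illustrative PySpark sketch of reading a legacy flat-file feed and rewriting it as partitioned Parquet. The feed, paths, and column names are hypothetical and not taken from the actual platform.

```python
# Illustrative sketch only: a hypothetical legacy feed landed as delimited
# text, rewritten as partitioned Parquet. Paths and columns are assumptions.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("legacy_feed_to_parquet")  # hypothetical job name
    .getOrCreate()
)

# Read a pipe-delimited legacy extract from a file-system landing zone.
raw = (
    spark.read
    .option("header", True)
    .option("sep", "|")
    .csv("/data/landing/counterparty_exposure/")  # hypothetical path
)

# Write to object storage as Parquet, partitioned by business date so
# nightly batch jobs can prune partitions efficiently.
(
    raw.write
    .mode("overwrite")
    .partitionBy("business_date")  # assumes the feed carries this column
    .parquet("s3a://risk-data-platform/exposure/")  # hypothetical bucket
)

spark.stop()
```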
Required Skills & Experience
Top Technical Skills
- 3+ years of experience with PySpark, S3, Iceberg, Git, Python, Airflow, and Parquet
- 5+ years of experience with SQL
- Experience with Agile methodologies and tools like Jira
- Familiarity with Kafka
- Experience with GitHub Copilot, Web Services, Visual Studio, IntelliJ, and Gradle
- Experience with monitoring tools like Grafana or Prometheus
Preferred Qualifications
- Proven experience leading Agile teams and mentoring junior developers
- Strong communication skills and the ability to collaborate with business stakeholders
- Comfortable working in both Scrum and Kanban models with frequent check-ins
- Ability to identify blockers and proactively seek help when needed
- Experience working in a regulated environment with a focus on compliance and data governance.
- 2+ years of working with Ab Initio graphs and plans
Team Structure & Projects
- You will be part of a team that handles 300+ data feeds, ensuring timely ingestion for nightly batch processing (a rough scheduling sketch follows below).
- The role will focus on Data Services, modernizing data ingestion pipelines.
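As a rough illustration of the nightly batch pattern mentioned above (not the team's actual pipeline), an Airflow DAG scheduled nightly might look like the following; the DAG id, task logic, and feed names are all hypothetical.

```python
# Illustrative sketch only: a hypothetical nightly Airflow DAG that kicks off
# ingestion for a few feeds. DAG id, schedule, and feed names are assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest_feed(feed_name: str) -> None:
    # Placeholder for real ingestion logic (e.g. submitting a PySpark job).
    print(f"Ingesting {feed_name}")


with DAG(
    dag_id="nightly_feed_ingestion",  # hypothetical
    schedule_interval="0 1 * * *",    # run nightly at 01:00
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    for feed in ["positions", "collateral", "ratings"]:  # hypothetical feeds
        PythonOperator(
            task_id=f"ingest_{feed}",
            python_callable=ingest_feed,
            op_kwargs={"feed_name": feed},
        )
```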