Sr. Python/Spark Platform Engineer

Charlotte, NC, US • Posted 7 hours ago • Updated 7 hours ago
Contract W2
On-site
$53.56 - $58.70/hr

Job Details

Skills

  • Risk Management
  • Financial Services
  • Statistical Models
  • Workflow
  • Orchestration
  • Microservices
  • Data Processing
  • Continuous Integration
  • Continuous Delivery
  • Sonar
  • Grafana
  • Debugging
  • Regulatory Compliance
  • RBAC
  • Encryption
  • Collaboration
  • Django
  • Google Cloud Platform
  • Google Cloud
  • Cloud Computing
  • Apache Spark
  • PySpark
  • Kubernetes
  • Docker
  • Python
  • Scala
  • Java
  • GitHub
  • Cloud Storage
  • Amazon S3
  • GCS
  • HDFS
  • Computer Science

Summary

Job Description: Software Engineer 3
Senior Cloud Engineer - Apache Spark / Google Cloud Platform / Python / Microservices (S3)

Overview
This role supports the Model Risk Management platform used to run statistical risk models and large-scale data workloads. The engineer will maintain and enhance a cloud-native platform serving statisticians and data scientists who work with data covering millions of customer accounts. The platform runs on Google Cloud Platform and supports distributed data processing with Apache Spark.
Experience with large-scale datasets, Google Cloud Platform services, Spark, and containerized microservices is essential. Financial services experience is preferred but not required.

Key Responsibilities
Platform Engineering
Deploy, configure, and maintain OpenShift clusters or Google Cloud Platform projects for Spark workloads
Support platform capabilities for statistical model execution

Distributed Data Processing
Design and implement large-scale data processing workflows using Apache Spark
Tune Spark jobs using Kubernetes orchestration and auto-scaling

Application & Tooling
Build Python/Django services and microservices supporting platform users
Configure tooling and internal applications for data processing and model execution

Automation & CI/CD
Build and maintain CI/CD pipelines using GitHub Actions, Sonar, Harness, Helm

Monitoring & Troubleshooting
Monitor Spark jobs and cluster health using Prometheus, Grafana, and Google Cloud Platform tools
Debug distributed systems and optimize resource utilization

Security & Compliance
Implement RBAC and encryption for data in transit and at rest

Collaboration
Work closely with cross-functional teams to define requirements and support deployments

Qualifications
Experience
3+ years with Apache Spark
2+ years Django development
1+ years creating/maintaining conda environments
2+ years with OpenShift/Kubernetes
Experience working with Google Cloud Platform or another major cloud provider

Technical Skills
Spark frameworks (PySpark, Scala, or Java)
OpenShift/Kubernetes administration
Docker container experience
Python, Scala, or Java programming
Experience with GitHub Actions, Helm, Harness
Knowledge of distributed systems and cloud storage (S3, GCS, HDFS)

Education
Bachelor's degree in Computer Science, Engineering, or related field
  • Dice Id: 10105282
  • Position Id: 870577