Job Details
STRATEGIC STAFFING SOLUTIONS (S3) HAS AN OPENING!
Strategic Staffing Solutions is currently looking for a Python/Cloud full-stack engineer for a W2 contractor opportunity with one of our largest clients!
Candidates should be willing to work on our W2 ONLY, NO C2C.
Job Title: Python/Cloud Full-Stack Engineer
Role Type: W2 only
Duration: 12 months
Location: Charlotte, NC
W2 Hourly Rate: up to $75
PROJECT DETAILS:
Seeking a skilled engineer with expertise in Red Hat OpenShift and Apache Spark to join our team. This role focuses on designing, deploying, and managing scalable data processing solutions in a cloud-native environment. You will work closely with data scientists, software engineers, and the DevOps team to ensure robust, high-performance data pipelines and analytics platforms.
Responsibilities:
- Platform Management: Deploy, configure, and maintain OpenShift clusters to support containerized Spark applications.
- Data Pipeline Development: Design and implement large-scale data processing workflows using Apache Spark.
- Optimization: Tune Spark jobs for performance, leveraging OpenShift's resource management capabilities (e.g., Kubernetes orchestration, auto-scaling).
- Integration: Integrate Spark with data sources (e.g., Kafka, S3, cloud storage) and sinks (e.g., databases, data lakes); see the pipeline sketch after this list.
- CI/CD Implementation: Build and maintain CI/CD pipelines for deploying Spark applications to OpenShift using tools such as GitHub Actions, Sonar, and Harness.
- Monitoring & Troubleshooting: Monitor cluster health, Spark job performance, and resource utilization using OpenShift tooling (e.g., Prometheus, Grafana) and resolve issues proactively.
- Security: Ensure compliance with security standards, implementing role-based access control (RBAC) and encryption for data in transit and at rest.
- Collaboration: Work with cross-functional teams to define requirements, architect solutions, and support production deployments.
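For illustration only, here is a minimal PySpark sketch of the kind of pipeline described above: a structured streaming job that reads events from Kafka and appends them to a Parquet data lake. The broker address, topic, and bucket paths are hypothetical placeholders (not details of the client environment), and the Spark-Kafka connector package is assumed to be on the classpath.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    # Requires the Spark-Kafka connector (org.apache.spark:spark-sql-kafka-0-10)
    # on the classpath; all names and paths below are placeholders.
    spark = (
        SparkSession.builder
        .appName("events-to-datalake")
        .getOrCreate()
    )

    # Source: read a stream of events from a Kafka topic.
    events = (
        spark.readStream
        .format("kafka")
        .option("kafka.bootstrap.servers", "kafka:9092")  # placeholder broker
        .option("subscribe", "events")                    # placeholder topic
        .load()
        .select(
            F.col("value").cast("string").alias("payload"),
            F.col("timestamp"),
        )
    )

    # Sink: append the stream as Parquet files in a data lake bucket.
    query = (
        events.writeStream
        .format("parquet")
        .option("path", "s3a://example-bucket/events/")
        .option("checkpointLocation", "s3a://example-bucket/checkpoints/events/")
        .trigger(processingTime="1 minute")
        .start()
    )

    query.awaitTermination()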
Qualifications:
- 5+ years of experience with Apache Spark for big data processing.
- 3+ years of Django development experience.
- 2+ years of experience creating and maintaining conda environments.
- 4+ years of experience managing containerized environments with OpenShift or Kubernetes.
Technical Skills:
- Proficiency in Spark frameworks (Python/PySpark, Scala, or Java)
- Hands-on experience with OpenShift administration (e.g., cluster setup, networking, storage)
- Proficiency in creating and maintaining conda environments and dependencies; see the packaging sketch after this list.
- Familiarity with Docker and Kubernetes concepts (e.g., pods, deployments, services, and images)
- Knowledge of distributed systems, cloud platforms (AWS, Google Cloud Platform, Azure), and data storage solutions (e.g., S3, HDFS)
- Programming: Strong coding skills in Python, Scala, or Java; experience with shell scripting is a plus.
- Tools: Experience with GitHub Actions, Helm, Harness, and CI/CD tools.
- Problem-solving: Ability to debug complex issues across distributed systems and optimize resource usage.
- Education: Bachelor's degree in Computer Science, Engineering, or a related field.
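For the conda and PySpark items above, here is a brief sketch of packaging a conda environment for Spark executors, following the Python package management pattern described in the Spark documentation; the environment and archive names are illustrative placeholders.

    import os
    from pyspark.sql import SparkSession

    # Assumes the environment was packed beforehand with conda-pack, e.g.:
    #   conda create -y -n pyspark_conda_env -c conda-forge pandas pyarrow conda-pack
    #   conda activate pyspark_conda_env
    #   conda pack -f -o pyspark_conda_env.tar.gz

    # Executors unpack the archive as "environment" and use its interpreter.
    os.environ["PYSPARK_PYTHON"] = "./environment/bin/python"

    spark = (
        SparkSession.builder
        .appName("conda-packaged-job")
        .config("spark.archives", "pyspark_conda_env.tar.gz#environment")
        .getOrCreate()
    )

    # The job now runs with the packaged dependencies available on every executor.
    print(spark.range(10).count())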