Senior Cloud Developer / Cloud Data Platform Architect / Software Engineer :: REMOTE

Overview

  • Location: Remote
  • Compensation: DOE
  • Employment type: Full Time / Part Time / Contract - W2 / Contract - Independent / Contract - 6
  • Accepts corp-to-corp applications

Skills

Java
AWS
Python
Data

Job Details

Job Title: Senior Cloud Developer / Full Stack Developer with Data / Cloud Data Platform Architect (AWS, Spark, Python) / Software Engineer

Location: Remote
Duration: Long-term contract
Visa: / GC

FinTech experience required.

Must have: Java, Python, Data, and AWS

About the Role

We are seeking a Software Engineer to lead the design and implementation of cloud-native, large-scale data platforms. The ideal candidate will have extensive experience in AWS, PySpark, Java, Python, and Terraform, with a proven track record of building high-performance, event-driven data solutions.

You'll collaborate closely with architects, DevOps engineers, and product teams to design secure, scalable, and resilient ETL pipelines that power enterprise-grade applications and analytics systems.

Key Responsibilities

  • Architect, build, and maintain data pipelines leveraging AWS Glue, Lambda, Step Functions, EMR, ECS, and Kinesis.
  • Develop ETL and ELT frameworks using PySpark / Spark / Databricks for large-scale data processing.
  • Lead migration of legacy batch processes to event-driven, real-time architectures on AWS.
  • Design and implement Terraform-based infrastructure for modular, reusable AWS resource provisioning.
  • Optimize performance and scalability of distributed data processing pipelines.
  • Integrate and orchestrate data ingestion from multiple systems (S3, Kafka, APIs, databases).
  • Work with Snowflake / Redshift / Postgres for warehousing and data modeling.
  • Partner with architecture, product, and analytics teams to define data governance, lineage, and quality frameworks.
  • Implement CI/CD pipelines with Jenkins, pytest, and JUnit to ensure test-driven deployments.
  • Mentor junior engineers, review code, and promote best practices for data engineering and DevOps.

Required Skills & Experience

  • 15+ years of total software engineering experience, with 8+ years in Data Engineering / Big Data / Cloud roles.
  • Hands-on experience with AWS (Lambda, Glue, Step Functions, EMR, ECS, S3, Kinesis, SQS, SNS).
  • Strong programming experience in Python and Java.
  • Strong knowledge of Spark / PySpark for batch and stream processing.
  • Experience with Terraform, Docker, Jenkins, and CI/CD automation.
  • Working knowledge of Kafka or Kinesis for data streaming.
  • Familiarity with Snowflake / Redshift / Postgres / DynamoDB.
  • Expertise in event-driven architectures and microservices.
  • Strong understanding of data lake, data warehouse, and lakehouse design principles.
  • Proven experience delivering secure, scalable, and cost-optimized AWS solutions.

Must Have

  • AWS Certified Solutions Architect or AWS Data Analytics certification.
  • Background in FinTech, Banking, or Government data platforms.
  • Experience with Databricks, Airflow, or similar orchestration tools.
  • Experience with Spring Boot / Java-based microservices.
  • Experience in test-driven development (TDD/BDD) using Cucumber, JUnit, or pytest.

Education

  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
  • MBA or relevant technical management experience is a plus.