Overview
Hybrid - 3 days onsite
Depends on Experience
Contract - W2
Contract - 24 Month(s)
No Travel Required
Skills
Python
AWS
Lambda
Step Functions
S3
Glue
Job Details
Job Title: Python AWS Cloud Developer
Job Location: Dallas, TX (Locals only)
Duration: 24+ months, with possibility of extension.
Overview:
Seeking a Python Developer to join the Data Lake team. This is a hands-on role focused on Python-based application development within AWS, supporting large-scale data movement and cloud-native orchestration workflows.
The successful candidate should be:
- A backend Python developer who happens to work with data, not a data engineer who happens to code
- Comfortable jumping into a complex, Lambda-driven environment with existing workflows
- Able to handle end-to-end application updates across multiple cloud services
- A proactive communicator who thrives in remote collaboration settings
- Excited by solving backend challenges and improving cloud workflows
Key Responsibilities
- Design, build, and maintain Python-based, serverless applications within AWS
- Develop and optimize Lambda-heavy workflows using AWS services like Lambda, Step Functions, S3, and Glue
- Tune existing pipelines to improve performance, reliability, and maintainability, especially for data movement across zones (Raw, Kafka, Refined, Replication)
- Update and manage shared Python modules used across multiple functions
- Refactor legacy code and streamline workflows by removing unnecessary processing steps and reducing payload size
- Collaborate with teammates to break down large stories into deliverable pieces and ensure smooth development cycles
- Participate in code reviews, testing, and deployment activities as part of an agile, cloud-native team
Requirements:
- 5-7+ years of hands-on experience with Python as your primary language
- Moderate to strong experience with AWS services, especially:
  - Lambda
  - Step Functions
  - S3
  - Glue (nice to have)
- Experience building and supporting cloud-based, event-driven workflows
- Comfort working with large, sometimes legacy, codebases and optimizing performance
- Strong understanding of cloud-native design and modern application practices
- Knowledge of cloud architecture best practices
- Familiarity with CI/CD and version control (e.g., Git, CodePipeline)
- Ability to work independently and communicate effectively in a remote team
- Must have great collaboration skills, a good personality fit, and excellent communication skills
Nice to Have:
- Experience with data pipelines or file ingestion systems
- Exposure to Data Lake architectures or tools like Databricks (not required, but helpful)
Example of Real Work Being Done:
A current senior-level developer on the team is:
- Refactoring 7+ Lambda functions, a Step Function, and a Glue job to streamline a single process
- Updating a common module shared across Lambdas to reduce performance overhead
- Tuning the Refined Zone + S3 replication workflows end-to-end to reduce payload size and improve reliability
- Managing orchestration for 3 core data flows: Raw ingestion, Kafka ingestion, and S3 replication
- Migrating and upgrading legacy Python code (e.g., moving to 2.2-compatible modules)
- Collaborating remotely with peers on large stories broken into smaller deliverables
Employers have access to artificial intelligence language tools ("AI") that help generate and enhance job descriptions, and AI may have been used to create this description. The position description has been reviewed for accuracy, and Dice believes it correctly reflects the job opportunity.