Job Details
Role: Python AWS Cloud Developer
Experience: Must have 9+ years of U.S. work experience.
Communication: Must have great communication skills to interact with leaders and other Southwest Airlines employees.
Hours/Time Zone: Must be able to work 8 am to 5 pm CST (CST and EST candidates only).
Travel: Must be able to travel for 1 week to the client's headquarters in Dallas to work onsite. This could be quarterly or monthly depending on the manager. (Expenses paid by SWA/Odyssey)
Vacation: No international travel or long vacations during the first 90 days.
Environment: Agile/Scrum with 2-week sprints; daily video conference meetings
Interview Process: Multi-step process, all via Microsoft Teams video calls.
Overview:
Seeking a Python Developer to join the Data Lake team. This is a hands-on role focused on Python-based application development within AWS, supporting large-scale data movement and cloud-native orchestration workflows.
The successful candidate should be:
- A backend Python developer who happens to work with data, not a data engineer who happens to code
- Comfortable jumping into a complex, Lambda-driven environment with existing workflows
- Able to handle end-to-end application updates across multiple cloud services
- A proactive communicator who thrives in remote collaboration settings
- Excited by solving backend challenges and improving cloud workflows
Requirements:
- 5-7+ years of hands-on experience with Python as your primary language
- Moderate to strong experience with AWS services, especially:
  - Lambda
  - Step Functions
  - S3
  - Glue (nice to have)
- Experience building and supporting cloud-based, event-driven workflows (a minimal sketch follows this list)
- Comfort working with large, sometimes legacy, codebases and optimizing performance
- Strong understanding of cloud-native design and modern application practices
- Knowledge of cloud architecture best practices
- Familiarity with CI/CD and version control (e.g., Git, CodePipeline)
- Ability to work independently and communicate effectively in a remote team
- Strong collaboration and communication skills and a good personality fit with the team
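
For illustration only, here is a minimal sketch of the kind of event-driven Lambda work described above: a Python handler that reacts to an S3 object-created event and copies the object onward. The bucket name and copy step are hypothetical, not the team's actual workflow.

```python
import json
import urllib.parse

import boto3

s3 = boto3.client("s3")


def handler(event, context):
    """Minimal S3-triggered Lambda: copy each newly created object to a
    hypothetical 'refined' bucket. Illustrative sketch only."""
    records = event.get("Records", [])
    for record in records:
        bucket = record["s3"]["bucket"]["name"]
        # Object keys arrive URL-encoded in S3 event notifications.
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        # Hypothetical destination bucket; the real zone layout differs.
        s3.copy_object(
            Bucket="example-refined-zone",
            Key=key,
            CopySource={"Bucket": bucket, "Key": key},
        )
    return {"statusCode": 200, "body": json.dumps({"processed": len(records)})}
```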
Key Responsibilities:
- Design, build, and maintain Python-based, serverless applications within AWS
- Develop and optimize Lambda-heavy workflows using AWS services such as Step Functions, S3, and Glue
- Tune existing pipelines to improve performance, reliability, and maintainability, especially for data movement across zones (Raw, Kafka, Refined, Replication)
- Update and manage shared Python modules used across multiple functions
- Refactor legacy code and streamline workflows by removing unnecessary processing steps and reducing payload size (see the sketch after this list)
- Collaborate with teammates to break down large stories into deliverable pieces and ensure smooth development cycles
- Participate in code reviews, testing, and deployment activities as part of an agile, cloud-native team
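
As a rough illustration of the payload-reduction responsibility above, one common pattern is for a Step Functions task to write large intermediate results to S3 and pass only an object reference between states. The bucket name and transform step below are hypothetical placeholders, not the team's actual pipeline.

```python
import json

import boto3

s3 = boto3.client("s3")

# Hypothetical staging bucket; the real buckets and zone names differ.
STAGING_BUCKET = "example-staging-zone"


def transform(event):
    """Placeholder for whatever processing the real workflow performs."""
    return {"source": event, "status": "transformed"}


def handler(event, context):
    """Step Functions task sketch: write a large intermediate result to S3
    and return only a reference, keeping the state payload small."""
    result = transform(event)
    key = f"intermediate/{context.aws_request_id}.json"
    s3.put_object(
        Bucket=STAGING_BUCKET,
        Key=key,
        Body=json.dumps(result).encode("utf-8"),
    )
    # Downstream states fetch the object from S3 instead of receiving it inline.
    return {"bucket": STAGING_BUCKET, "key": key}
```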
Nice to Have:
- Experience with data pipelines or file ingestion systems
- Exposure to Data Lake architectures or tools like Databricks