Overview
Remote
Depends on Experience
Contract - W2
Skills
API
Python
SQL
ETL/ELT
Data Pipelines
Google Cloud Platform (GCP)
Amazon Web Services (AWS)
Azure
Microservices
CI/CD
Continuous Delivery
Big Data
Cloud
TensorFlow
PyTorch
Artificial Intelligence (AI)
Machine Learning (ML)
Automated Testing
Data Lake
Data Visualization
Data Validation
Data Structures
Healthcare
Job Details
Years of experience: 5+ years
Reason for opening: effort to move work from offshore to onshore; FTE headcount must increase from 10% to 40%
Requirements:
- 5+ years of experience as an integration/software engineer, including:
- Python
- SQL
- ETL/ELT and building high-volume data pipelines
- Hands-on experience building modern data pipelines within a major cloud platform (Google Cloud Platform, AWS, Azure)
- Building microservices
- CI/CD, observability, and automated testing
Job description / Day to day:
Summary:
As a Software Engineer, you'll play a crucial role in building and delivering high-quality software that enhances customer experiences. You'll be involved in all phases of software engineering, from requirements analysis to deployment, while adhering to agile software development methodologies. Collaboration is key, as you'll work closely with cross-functional teams to deliver integrated solutions that meet the evolving needs of our business.
As a Software Engineer, you will:
- Build integration processes to fast APIs and system APIs using Python scripting
- Build microservices leveraging various programming languages and streaming solutions
- Own the features you develop, from engineering through production support
- Follow software engineering best practices such as CI/CD, observability, and automated testing
What you will do
- Design scalable and efficient data pipelines to extract, transform, and load data from various sources into data warehouses or data lakes.
- Implement data validation and quality checks to ensure the accuracy, completeness, and consistency of the data and to identify and address anomalies or errors.
- Design data warehousing and data lake solutions that facilitate data storage, retrieval, and analysis through MongoDB.
- Document data engineering processes, workflows, and systems for reference and knowledge sharing.
- Identify opportunities to streamline data engineering processes, improve efficiency, and enhance the quality of deliverables.
- Provide guidance and mentorship to junior data engineers to help them develop their technical skills and grow in their roles.
- Coordinate with product teams to align technical delivery with business goals and customer needs.
- Contribute to the implementation of AI-powered tools and automation strategies that enhance performance and engagement.
- Participate in code reviews, architecture discussions, and agile ceremonies.
- Foster a culture of innovation, technical excellence, and continuous improvement.
Required:
- 5+ years of experience with Python or a comparable scripting language
- 5+ years of experience with SQL and NoSQL databases
- 5+ years of experience with ETL/ELT and building high-volume data pipelines.
- 5+ years of experience with reporting/analytic tools
- 5+ years of experience with query optimization, data structures, transformation, metadata, dependency, and workload management
- 5+ years of experience with Big Data and cloud architecture
- 5+ years of hands-on experience building modern data pipelines within a major cloud platform (Google Cloud Platform, AWS, Azure)
- 3-5 years of experience deploying and scaling applications in containerized environments
- 5+ years of experience with real-time and streaming technology
- 3+ years of experience soliciting complex requirements and managing relationships with key stakeholders
- 3+ years of experience independently managing deliverables
Preferred:
- Strong programming skills in Python and experience with machine learning libraries such as scikit-learn, TensorFlow, and PyTorch
- Experience promoting proper implementation of SAFe (Scaled Agile Framework) process techniques
- Experience working with distributed teams, working across multiple time zones and geographies
- Strong understanding of delivery practices
- Experience leading large technical programs with responsibility for end-to-end planning
- Experience in designing and building data engineering solutions in cloud environments (preferably Google Cloud Platform)
- Experience with Git, CI/CD pipeline, and other DevOps principles/best practices
- ML/AI Experience is a plus
- Understanding of software development methodologies including waterfall and agile.
- Ability to leverage multiple tools and programming languages to analyze and manipulate data sets from disparate data sources.
- Knowledge of API development
- Experience with complex systems and solving challenging analytical problems.
- Strong collaboration and communication skills within and across teams
- Knowledge of data visualization and reporting