Overview
Hybrid: 2 days onsite and 3 days remote
Depends on Experience
Full Time
No Travel Required
Skills
AWS
Databricks
Scala
Python
PySpark
SQL
S3
EC2
Job Details
Position: AWS Data Engineer
Location: Quincy, MA (Hybrid Onsite)
Client: Financial Services (State Street)
Note: C2C not applicable
We are looking for candidates who have hands-on experience migrating legacy applications from on-prem to the AWS public cloud using AWS services such as EC2, ALB/ELB, S3, FSx, OpenSearch, Kibana, CloudWatch, Grafana, AWS Secrets Manager with CyberArk, KMS, and OIDC/SAML integration with Azure, etc. Expertise in Databricks is a plus.
A minimum of 10 years of overall experience is mandatory.
Roles & Responsibilities
- Assess the current application infrastructure and suggest new approaches to improve performance
- Document the best practices and strategies associated with application deployment and infrastructure support
- Produce reusable, efficient, and scalable programs, and cost-effective migration strategies
- Develop data engineering and machine learning pipelines in Databricks and various AWS services, including S3, EC2, API, RDS, Kinesis/Kafka, and Lambda
- Work jointly with the IT team and other departments to migrate data engineering and machine learning applications to Databricks/AWS
- Be comfortable working on tight timelines when required
Skill Sets Required
- Good decision-making and problem-solving skills
- Solid understanding of Databricks fundamentals/architecture and hands-on experience with Databricks modules (Data Engineering, Machine Learning, and SQL Warehouse)
- Knowledge of medallion architecture, Delta Live Tables (DLT), and Unity Catalog within Databricks
- Knowledge of the machine learning model development process
- Experience in migrating data from on-prem Hadoop to Databricks/AWS
- Understanding of core AWS services, uses, and AWS architecture best practices
- Hands-on experience in different domains, like database architecture, business intelligence, machine learning, advanced analytics, big data, etc.
- Solid knowledge of Airflow
- Solid knowledge of CI/CD pipelines in AWS technologies
- Experience with application migration of RDBMS, Java/Python applications, model code, Elasticsearch, etc.
- Solid programming background in Scala and Python
- Experience with Docker and Kubernetes is a plus
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.