Senior Data Architect - AWS, Data Lake, Terraform

    • $190,000 - $215,000

    • Full Time

    • Work from home

Skills

  • AWS
  • Data Lake
  • Terraform

Job Description

RG19764

Position:     Senior Data Architect - AWS, Data Lake, Terraform
Location:     Greenwich, CT  
Compensation: $190,000 - $215,000 + Bonus + Excellent Benefits + Paid Relocation

This is a hybrid position where you can work remotely 2 days per week.

Job Description:

As Senior Data Architect – Team Lead, you will collaborate with the enterprise architect to develop a program, serve as process owner, and build and configure acquired tooling to design solutions with performance, maintainability, scalability, and reliability in mind.
The ideal candidate will have demonstrated leadership skills with the ability to mentor, influence, and partner with engineering teams to deliver scalable, robust solutions.

Responsibilities:

Develop patterns, standards, and best practices in the data engineering and solutions space.
Define data architecture strategy, design, and best practices for data platforms and related solutions.
Review and identify data solutions in collaboration with the technology teams across the enterprise.
Work with engineering teams to implement architectures.
Influence product and delivery requirements, roadmaps, and vision.
Manage data engineering team under the Enterprise Architecture organization.
Deliver enterprise solutions and architectures with emphasis on data privacy, integrity, scalability, and availability.
Produce organized and comprehensive system architectures and the related documentation.
Design and build functioning proof of concepts for proposed architectures.

Qualifications:

Bachelor's or Master's degree in Computer Science or a related field.
7+ years of hands-on cloud engineering experience with public clouds (AWS required).
5+ years of experience with Data Lake platform implementation and Terraform.
3+ years of IT project/team management experience.
Experience building and leading engineering teams.
Experience designing, building, and running enterprise distributed systems, real-time ETL, and data pipelines.
Experience with AWS services such as Athena, Glue, Lambda, S3, DynamoDB and other NoSQL stores, Relational Database Service (RDS), Amazon EMR, and Amazon Redshift.
Experience with RDBMS, Spark, Hadoop, Kafka.
Experience with programming languages such as Java and/or Python.
Experience with CI/CD pipelines and infrastructure as code (IaC).
Experience with serverless big data solutions in AWS.