Overview
Remote
$140,000 - $150,000
Full Time
Skills
SQL
Python
Spark
Data Modeling
AWS
Azure
GCP
Job Details
Data Warehouse Engineer / Architect
Location: Remote
Mandatory Skills
- The candidate's overall IT experience should not exceed 15 years. Experience is calculated from the date of graduation or post-graduation (whichever is earlier) and from the start date of their first project; profiles exceeding 15 years of total experience will not be considered.
- Hands-on coding is mandatory. During the first round of interviews, the panel will ask the candidate to write code using Spark with Python, so the candidate should be well prepared (see the illustrative sketch after this list).
- Additionally, the candidate must provide a brief summary of their experience with Snowflake, Databricks, BigQuery, and Data Fabric, including the percentage of hands-on coding involvement in their most recent project.
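
As a rough illustration of the hands-on Spark-with-Python coding mentioned above (the actual interview problem is not specified in this posting; the dataset, column names, and aggregation task below are hypothetical), a candidate might be asked to write something along these lines:

    # Hypothetical warm-up exercise: total order revenue per customer,
    # then rank customers by spend. Dataset and column names are illustrative only.
    from pyspark.sql import SparkSession, functions as F
    from pyspark.sql.window import Window

    spark = SparkSession.builder.appName("interview-warmup").getOrCreate()

    # Small in-memory sample standing in for a real orders table.
    orders = spark.createDataFrame(
        [
            ("c1", "2024-01-01", 120.0),
            ("c1", "2024-01-02", 80.0),
            ("c2", "2024-01-01", 200.0),
            ("c3", "2024-01-03", 50.0),
        ],
        ["customer_id", "order_date", "amount"],
    )

    # Aggregate spend per customer, then rank customers by total spend descending.
    spend = orders.groupBy("customer_id").agg(F.sum("amount").alias("total_spend"))
    ranked = spend.withColumn(
        "rank", F.dense_rank().over(Window.orderBy(F.desc("total_spend")))
    )

    ranked.show()
    spark.stop()

Beyond producing working code, candidates should expect to explain their choices (for example, the window function above and how they would approach partitioning, caching, and shuffle behaviour when tuning such a job at scale).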
Job Description:
- The ideal candidate should be 80% technical; the remaining 20% involves working with the customer and offshore team members
- Cloud DWs: Snowflake / Databricks / BigQuery / Data Fabric / Redshift
- Mandatory Skills:
o SQL
o Python
o Spark
- Should be strong in performance tuning
- Should have data modelling experience / knowledge
- Experience on at least one cloud platform (AWS / Azure / Google Cloud Platform)
- Preferably has experience working on large-scale and long-running engagements
- Should have an engineering bent of mind and experience working with offshore teams