Sr Cloud Data Engineer (13+ years of experience only)

Overview

Hybrid
Depends on Experience
Accepts corp to corp applications
Contract - Independent
Contract - W2
Contract - 6 Month(s)
75% Travel

Skills

SQL
Snowflake
AWS
ETL
CI/CD
DevOps
Python
Docker
Power BI / Tableau

Job Details

Sr Cloud Data Engineer
Boston, Plano TX, Indianapolis, or Portsmouth office (Hybrid - 2 days per week)
6+ Months Contract
Job Responsibilities
  • SQL Expertise: Demonstrate expert-level proficiency in SQL for querying and managing data, including the use of complex joins, subqueries, window functions, and performance tuning techniques (see the SQL sketch after this list).
  • Snowflake Proficiency: Utilize advanced capabilities of the Snowflake Cloud Data Platform, including Snowflake SQL, schema design, Snowpipe, streams/tasks, and integration with other systems.
  • Data Warehousing & Modeling: Apply strong knowledge of dimensional modeling techniques (e.g., star and snowflake schemas), design of fact and dimension tables, and implementation of ETL/ELT frameworks aligned with data architecture best practices.
  • CI/CD & GitHub Actions: Use Git and GitHub for version control and implement GitHub Actions to automate CI/CD pipelines, including testing, deployment workflows, and environment management.
  • Power BI Reporting: Develop and publish data visualizations and dashboards using Microsoft Power BI, supporting both operational and strategic reporting needs.
  • Python Scripting: Write Python scripts for automation and data manipulation tasks, leveraging libraries such as pandas for lightweight data transformations when required.
  • DevOps & Tooling Familiarity: Demonstrate working knowledge of DevOps tools and practices, including containerization with Docker, monitoring tools, and relevant IDEs or project management platforms to ensure efficient and reliable delivery.
  • Collaborate with the Product Owner, Scrum Master, Subject Matter Experts, and development team to define and analyze user stories tracked in Jira.
  • Demonstrate solutions and articulate business value to business partners at sprint showcases.
  • Actively participate in platform investigations and perform end-to-end testing when needed.
  • Prior experience working on Agile scrum teams in a scaled framework is preferred.
  • Develop and maintain professional relationships with all customers.
  • Assist customers, provide development support for all applications, and test installation processes for infrastructure.
  • Test production and development applications, prepare recovery procedures for them, and perform upgrades as needed.
  • Experience in the investment domain
  • Good knowledge of batch processing
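For illustration, here is a minimal, self-contained sketch of the kind of windowed SQL described in the SQL Expertise bullet above. It runs against an in-memory SQLite database (window functions require SQLite 3.25+); the trades table and its data are hypothetical stand-ins:

import sqlite3

# Hypothetical trade data; illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE trades (trade_id INTEGER, account TEXT, trade_date TEXT, amount REAL);
    INSERT INTO trades VALUES
        (1, 'A-100', '2024-01-02', 1500.0),
        (2, 'A-100', '2024-01-03', -400.0),
        (3, 'B-200', '2024-01-02', 900.0),
        (4, 'B-200', '2024-01-05', 2100.0);
""")

# Window function: a running balance per account, ordered by trade date.
query = """
    SELECT account,
           trade_date,
           amount,
           SUM(amount) OVER (PARTITION BY account ORDER BY trade_date) AS running_balance
    FROM trades
    ORDER BY account, trade_date;
"""
for row in conn.execute(query):
    print(row)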
Required Skills:
  • Minimum of 13 years of development experience, including 7+ years in data engineering, with a proven track record of designing, building, and optimizing scalable data pipelines and architectures
  • Expert-level proficiency in SQL and strong experience in data transformation, automation, and orchestration
  • Deep understanding of data modeling, data warehousing, and data architecture best practices
  • Hands-on experience with modern public cloud data platforms: Snowflake (preferred), AWS, Azure
  • Advanced proficiency in AWS services including (but not limited to) S3, Glue, Lambda, EC2, and Athena (see the AWS sketch after this list)
  • Proficient with ETL/ELT data pipelines and common patterns for loading data warehouses and data lakes
  • Solid experience with DevOps practices and automation frameworks, including CI/CD pipelines, GitHub Actions, Bamboo, and pipeline-as-code principles
  • Skilled in BI/reporting tools (e.g., Power BI, Tableau)
  • Master's or engineering degree
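As a hedged sketch of the AWS proficiency listed above, the snippet below starts an Athena query over S3-resident data and polls for completion. The database, table, and results bucket names are hypothetical, and valid AWS credentials are assumed:

import time
import boto3

# Hypothetical names throughout; real AWS credentials and resources are assumed.
athena = boto3.client("athena", region_name="us-east-1")

execution = athena.start_query_execution(
    QueryString="SELECT account, SUM(amount) AS net FROM trades GROUP BY account",
    QueryExecutionContext={"Database": "analytics_db"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
query_id = execution["QueryExecutionId"]

# Athena is asynchronous: poll the execution status until it finishes.
while True:
    status = athena.get_query_execution(QueryExecutionId=query_id)
    state = status["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)
print(state)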
Tools/Technologies
  • SQL: Expert use of SQL for querying and managing data, including complex joins, subqueries, window functions, and performance tuning.
  • Snowflake: Advanced use of Snowflake Cloud Data Platform for data warehousing, including Snowflake SQL, schema design, Snowpipe, streams/tasks, and integration capabilities (see the Snowflake sketch after this list).
  • Data Warehousing & Modeling: Dimensional modeling techniques, star and snowflake schemas, fact and dimension table design, ETL/ELT frameworks, and general data architecture best practices.
  • GitHub Actions & CI/CD: Git and GitHub for version control, with GitHub Actions for automating CI/CD pipelines (testing, deployment workflows, environment management).
  • Power BI: Microsoft Power BI for creating data visualizations and dashboards, including basic report development and publishing.
  • Python: Scripting with Python for automation and data manipulation tasks (using libraries like pandas for simple transforms, if required).
  • DevOps/Other Tools: Familiarity with development and DevOps tools such as Docker (for containerizing data tools), monitoring tools, and any relevant IDEs or project management tools to facilitate efficient delivery (optional, as needed per project).
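To make the Snowflake streams/tasks item above concrete, here is a minimal sketch using the snowflake-connector-python package. All credentials and object names are hypothetical; the stream-plus-task pattern shown is the standard Snowflake approach to incremental loads:

import snowflake.connector

# Hypothetical account and object names; supply real credentials via a secret store.
conn = snowflake.connector.connect(
    account="myorg-myaccount",
    user="etl_user",
    password="...",  # placeholder
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
cur = conn.cursor()

# A stream captures row-level changes on the raw table...
cur.execute("CREATE OR REPLACE STREAM TRADES_STREAM ON TABLE RAW_TRADES")

# ...and a scheduled task drains the stream into the curated table.
cur.execute("""
    CREATE OR REPLACE TASK LOAD_TRADES
      WAREHOUSE = LOAD_WH
      SCHEDULE = '5 MINUTE'
    AS
      INSERT INTO TRADES_CLEAN
      SELECT trade_id, account, trade_date, amount
      FROM TRADES_STREAM
""")
cur.execute("ALTER TASK LOAD_TRADES RESUME")  # tasks are created suspended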
Bonus Skills:
  • Previous experience in investments/asset management and finance data modeling is highly desirable.
  • Snowflake or AWS certification
  • Cognizance of security concerns, from access control and authentication to secure processes.
  • A comprehensive understanding of agile environments and the ability to adapt to rapidly changing circumstances.