Job Details
Senior Cloud Data Engineer
Location/Work Mode: Hybrid, 2 days per week in office. Boston, MA (1st choice); Portsmouth, NH; Plano, TX; or Indianapolis, IN.
Job Type: Contract
Duration: 6 months, subject to extension
Compensation:
Interview Process: recruiter prescreening (20-minute intro video call), technical interview (30–45 minute video call), customer interview (45–60 minute video call).
Project Start: first week of July 2025.
- Experience: Minimum of 13 years of development experience, including 7+ years in data engineering, with a proven track record of designing, building, and optimizing scalable data pipelines and architectures.
- Expert-level proficiency in SQL and strong experience in data transformation, automation, and orchestration.
- SQL: Expert use of SQL for querying and managing data, including complex joins, subqueries, window functions, and performance tuning.
- Snowflake: Advanced use of Snowflake Cloud Data Platform for data warehousing, including Snowflake SQL, schema design, Snowpipe, streams/tasks, and integration capabilities.
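As an illustration of the window-function skill called for above, here is a minimal sketch using Python's built-in sqlite3 module (the trades table, its columns, and the ranking query are hypothetical examples, not part of the client's environment):

```python
import sqlite3

# Hypothetical example: rank trades by amount within each account
# using the RANK() window function with PARTITION BY.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE trades (account TEXT, amount REAL);
    INSERT INTO trades VALUES
        ('A', 100), ('A', 250), ('B', 75), ('B', 300);
""")

# Each account gets its own independent ranking, largest amount first.
rows = conn.execute("""
    SELECT account, amount,
           RANK() OVER (PARTITION BY account ORDER BY amount DESC) AS rnk
    FROM trades
""").fetchall()

for account, amount, rnk in rows:
    print(account, amount, rnk)
```

The same `RANK() OVER (PARTITION BY ... ORDER BY ...)` pattern carries over directly to Snowflake SQL, where it is commonly paired with `QUALIFY` for deduplication.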
Role: | Senior Cloud Data Engineer |
Employment Type: Contract | Minimum 6 months, subject to extension |
Work Location: | Hybrid, 2 days per week onsite in Boston, Plano TX, Indianapolis IN, or Portsmouth NH |
About the Role | We are looking for a Senior Cloud Data Engineer. This is a highly skilled, senior position on the team requiring deep technical expertise in SQL, Snowflake, AWS, Power BI, data warehouse design, and CI/CD pipelines, with the ability to work independently and proactively in a client-facing environment. |
Job Responsibilities | SQL Expertise: Demonstrate expert-level proficiency in SQL for querying and managing data, including the use of complex joins, subqueries, window functions, and performance tuning techniques.
Snowflake Proficiency: Utilize advanced capabilities of the Snowflake Cloud Data Platform, including Snowflake SQL, schema design, Snowpipe, streams/tasks, and integration with other systems.
Data Warehousing & Modelling: Apply strong knowledge of dimensional modelling techniques (e.g., star and snowflake schemas), design of fact and dimension tables, and implementation of ETL/ELT frameworks aligned with data architecture best practices.
CI/CD & GitHub Actions: Use Git and GitHub for version control and implement GitHub Actions to automate CI/CD pipelines, including testing, deployment workflows, and environment management.
Power BI Reporting: Develop and publish data visualizations and dashboards using Microsoft Power BI, supporting both operational and strategic reporting needs.
Python Scripting: Write Python scripts for automation and data manipulation tasks, leveraging libraries such as pandas for lightweight data transformations when required.
DevOps & Tooling Familiarity: Demonstrate working knowledge of DevOps tools and practices, including containerization with Docker, monitoring tools, and relevant IDEs or project management platforms to ensure efficient and reliable delivery.
Collaborate with the Product Owner, Scrum Master, Subject Matter Experts, and development team to define and analyze user stories tracked in Jira.
Demonstrate solutions and articulate business value to business partners at sprint showcases.
Actively participate in investigating platform issues and perform end-to-end testing when needed.
Prior experience working on Agile scrum teams in a scaled framework is preferred.
Develop and maintain professional relationships with all customers.
Assist customers, provide development support for all applications, and test installation processes for infrastructure.
Test production and development applications, prepare recovery procedures for all applications, and provide upgrades to the same.
Experience in the investment domain.
Good knowledge of batch processing. |
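The CI/CD responsibility above could be met with a GitHub Actions workflow along these lines (a minimal sketch; the workflow name, triggers, Python version, and test/deploy commands are illustrative assumptions, not the client's actual pipeline):

```yaml
# Hypothetical CI/CD workflow: run tests on every push and pull request,
# then deploy only from main after tests pass.
name: ci
on:
  push:
    branches: [main]
  pull_request:

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"
      - run: pip install -r requirements.txt   # assumed project layout
      - run: pytest                            # assumed test command

  deploy:
    needs: test
    if: github.ref == 'refs/heads/main'
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: echo "deploy step placeholder"    # replace with the real deployment step
```

The `needs: test` dependency and the `if:` branch guard are the standard GitHub Actions mechanisms for gating deployment behind passing tests, matching the "testing, deployment workflows, and environment management" responsibility above.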
Required Education | Master's or engineering degree |
Required Experience |
|
Work-shift Timings (If required) |
|