Please do not submit resumes without Snowflake experience.
Job Title: Data Engineer
Location: REMOTE
Duration: Long Term
Experience Range: 9-12 years, including significant hands-on expertise in Snowflake data architecture and data engineering
Key Responsibilities:
1. Design and implement scalable Snowflake data architectures to support enterprise data warehousing and analytics needs
2. Optimize Snowflake performance through advanced query tuning, virtual warehouse sizing strategies, and efficient data sharing solutions
3. Develop robust data pipelines using Python and DBT, including modeling, testing, macros, and snapshot management
4. Implement and enforce security best practices such as RBAC, data masking, and row-level security across cloud data platforms
5. Architect and manage AWS-based data solutions leveraging S3, Redshift, Lambda, Glue, EC2, and IAM for secure and reliable data operations
6. Orchestrate and monitor complex data workflows using Apache Airflow, including DAG design, operator configuration, and scheduling (see the sample DAG sketch after this list)
7. Utilize version control systems such as Git to manage codebase and facilitate collaborative data engineering workflows
8. Integrate and process high-volume data using Apache ecosystem tools such as Spark, Kafka, and Hive, with an understanding of Hadoop environments
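For illustration only, a minimal sketch of the kind of Airflow DAG responsibility 6 describes, assuming Airflow 2.4+ (for the schedule argument); the DAG id, task ids, and task bodies are hypothetical placeholders, not part of this posting:

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders():
    # Placeholder: pull raw data from a source system into a staging area.
    print("extracting")


def load_to_snowflake():
    # Placeholder: copy staged files into a Snowflake table.
    print("loading")


with DAG(
    dag_id="daily_orders_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # Airflow 2.4+; older 2.x uses schedule_interval
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    load = PythonOperator(task_id="load_to_snowflake", python_callable=load_to_snowflake)

    extract >> load  # simple linear dependency: extract first, then load

In practice the task bodies would call out to the actual source systems and Snowflake, but the DAG structure, scheduling, and dependency wiring shown here are the core of the orchestration work.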
Required Skills:
1. Advanced hands-on experience with Snowflake, including performance tuning and virtual warehouse strategies
2. Expertise in Snowflake security features such as RBAC, data masking, and row-level security (see the policy sketch after this list)
3. Proficiency in advanced Python programming for data engineering tasks
4. In-depth knowledge of DBT for data modeling, testing, macros, and snapshot management
5. Strong experience with AWS services including S3, Redshift, Lambda, Glue, EC2, and IAM
6. Extensive experience designing and managing Apache Airflow DAGs and scheduling workflows
7. Proficiency in version control using Git for collaborative development
8. Hands-on experience with Apache Spark, Kafka, and Hive
9. Solid understanding of Hadoop ecosystem
10. Expertise in SQL, from basic to advanced, including SnowSQL, PL/SQL, and T-SQL
11. Strong requirements-gathering, presentation, and documentation skills; able to translate business needs into clear, structured functional and technical documents and present them effectively to stakeholders
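As a concrete reference for the Snowflake security items above, a minimal sketch assuming the snowflake-connector-python package and a role privileged to create policies; the connection parameters, table, policy, and role names are all hypothetical:

import snowflake.connector

# Hypothetical connection details; in practice these come from a secrets manager.
conn = snowflake.connector.connect(
    account="my_account",
    user="my_user",
    password="my_password",
    warehouse="my_wh",
    database="my_db",
    schema="public",
)

statements = [
    # RBAC: grant read access on a table to an analyst role.
    "GRANT SELECT ON TABLE customers TO ROLE analyst",
    # Dynamic data masking: hide emails from everyone except a PII-reader role.
    """
    CREATE MASKING POLICY IF NOT EXISTS email_mask AS (val STRING)
    RETURNS STRING ->
      CASE WHEN CURRENT_ROLE() IN ('PII_READER') THEN val ELSE '***MASKED***' END
    """,
    "ALTER TABLE customers MODIFY COLUMN email SET MASKING POLICY email_mask",
    # Row-level security: non-admin roles see only US rows.
    """
    CREATE ROW ACCESS POLICY IF NOT EXISTS us_only AS (region STRING)
    RETURNS BOOLEAN ->
      CURRENT_ROLE() = 'ADMIN' OR region = 'US'
    """,
    "ALTER TABLE customers ADD ROW ACCESS POLICY us_only ON (region)",
]

cur = conn.cursor()
try:
    # The connector executes one statement per call.
    for stmt in statements:
        cur.execute(stmt)
finally:
    cur.close()
    conn.close()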
Preferred Skills:
1. Experience with Salesforce Data Cloud integration
2. Familiarity with data cataloging tools such as Alation
3. Exposure to real-time streaming architectures (see the consumer sketch after this list)
4. Experience working in multi-cloud environments
5. Knowledge of DevOps or DataOps practices
6. Certifications in data cloud technologies
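For the real-time streaming item above, a minimal consumer sketch assuming the kafka-python package and a locally reachable broker; the topic, broker address, and consumer group are hypothetical:

import json

from kafka import KafkaConsumer

# Hypothetical topic and broker; adjust to the actual cluster.
consumer = KafkaConsumer(
    "orders",
    bootstrap_servers=["localhost:9092"],
    group_id="orders-demo",
    auto_offset_reset="earliest",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

for message in consumer:
    # Each record carries topic, partition, offset, and the deserialized value.
    print(message.topic, message.partition, message.offset, message.value)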
Desired Qualifications:
1. Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related field
2. Relevant certifications in Snowflake, AWS, or data engineering technologies are highly desirable
EDUCATION
Bachelor's Degree in Computer Science, Information Systems, Engineering, or a related field, or equivalent work experience.