Must-Have Technical/Functional Skills
• Programming Languages: Proficient in SQL, PySpark, and Python
• Data Pipeline & ETL: Proficiency in building ETL/ELT pipelines using tools like Apache Airflow, AWS EC2, EMR, S3, dbt, and Azure Data Factory (see the pipeline sketch after this list)
• Database Management: Expertise in data warehousing solutions such as Snowflake, Redshift, or Databricks
• Cloud Platforms: Working knowledge of cloud providers (AWS) is essential.
• Data Modeling: Designing efficient star/snowflake schemas for dimensional modeling and semantic models for visualization tools (see the star-schema sketch after this list)
• Data Governance & Security: Implementing data encryption, masking, and role-based access control (see the masking sketch after this list).
• Problem-Solving & Troubleshooting: Identifying bottlenecks in data workflows and creating scalable, robust solutions.
• Collaboration: Working with data scientists and analysts to understand business needs and deliver actionable data.
• Version Control: Using tools like Git to manage data pipeline code.
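For illustration, the sketch below shows the kind of ETL pipeline referenced in the Data Pipeline & ETL bullet. It is a minimal example assuming Apache Airflow 2.4+ with the TaskFlow API; the DAG name, sample rows, and load target are hypothetical placeholders, not a prescribed design.

    # Minimal daily ETL pipeline sketch, assuming Apache Airflow 2.4+ (TaskFlow API).
    # The DAG name, sample data, and load target are hypothetical placeholders.
    from datetime import datetime, timezone

    from airflow.decorators import dag, task


    @dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
    def orders_etl():
        @task
        def extract() -> list[dict]:
            # In practice this step would read from S3, an API, or a source database.
            return [{"order_id": 1, "amount": 120.5}, {"order_id": 2, "amount": 0.0}]

        @task
        def transform(rows: list[dict]) -> list[dict]:
            # Drop zero-value orders and stamp each row with a load timestamp.
            loaded_at = datetime.now(timezone.utc).isoformat()
            return [dict(r, loaded_at=loaded_at) for r in rows if r["amount"] > 0]

        @task
        def load(rows: list[dict]) -> None:
            # In practice this step would write to Snowflake, Redshift, or S3.
            print(f"Loading {len(rows)} rows")

        load(transform(extract()))


    orders_etl()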
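The next sketch illustrates the dimensional-modeling bullet with a small star layout built in PySpark: a date dimension keyed by a surrogate date_key, and an orders fact table carrying measures plus that foreign key. The sample rows, column names, and keys are illustrative assumptions only.

    # A minimal star-schema sketch in PySpark; source data, column names, and
    # surrogate keys are hypothetical examples, not a prescribed design.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("star_schema_sketch").getOrCreate()

    orders = spark.createDataFrame(
        [("2024-01-05", "C001", 120.50), ("2024-01-05", "C002", 80.00)],
        ["order_date", "customer_id", "amount"],
    )

    # Dimension: one row per calendar date, keyed by a surrogate date_key.
    dim_date = (
        orders.select("order_date").distinct()
        .withColumn("date_key", F.date_format("order_date", "yyyyMMdd").cast("int"))
        .withColumn("year", F.year("order_date"))
        .withColumn("month", F.month("order_date"))
    )

    # Fact: measures plus the foreign key pointing at the date dimension.
    fact_orders = (
        orders
        .withColumn("date_key", F.date_format("order_date", "yyyyMMdd").cast("int"))
        .select("date_key", "customer_id", "amount")
    )

    dim_date.show()
    fact_orders.show()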
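Finally, a small sketch of column-level masking for the Data Governance & Security bullet: a PII column is replaced with a one-way SHA-256 hash in PySpark. Column names are hypothetical, and a production setup would normally pair this with warehouse-native masking policies and role-based grants rather than hashing alone.

    # A minimal data-masking sketch in PySpark: hash a PII column so downstream
    # consumers see only a pseudonymous value. Column names are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("masking_sketch").getOrCreate()

    customers = spark.createDataFrame(
        [("C001", "alice@example.com"), ("C002", "bob@example.com")],
        ["customer_id", "email"],
    )

    # Replace the raw email with its SHA-256 digest before sharing the table.
    masked = customers.withColumn("email", F.sha2(F.col("email"), 256))
    masked.show(truncate=False)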
Roles & Responsibilities
• Designing and implementing thorough data architecture strategies
• Analyzing complex issues to create innovative solutions
• Upholding standards in project deliverables
• Developing a thorough understanding of the business environment
• Managing and navigating complex scenarios effectively
• Building and maintaining excellent client relationships
• Enhancing personal brand and technical skills
• Mentoring and guiding junior team members
• Implementing end-to-end solutions that cater to business requirements
Generic Managerial Skills, If any
• Track assigned deliverables and deliver them on time with high quality
• Participate in Scrum calls, sprint planning, and other Agile ceremonies
• Coordinate with offshore teams