Data Engineer – 4 roles – in-person interview
Location: Alpharetta, GA – 3 days/week onsite
Must have recent investment banking and financial industry experience.
Job Description:
Job Functions/Duties and Responsibilities:
Solve complex engineering problems and lead system design and development activities.
Understand business processes, the bigger picture, and the core ideas behind the software being developed.
Define engineering guidelines and quality control pipeline; perform code reviews.
Advocate for and advance cutting-edge engineering practices.
Work within Agile development methodologies, collaborating with business and technology teams located globally.
Actively contribute to and participate in sprint grooming and planning discussions, daily stand-ups, and other Agile ceremonies.
Work with various teams and stakeholders across geographies and time zones.
Skills Required:
5–7 years of experience in data-focused roles.
Strong proficiency in developing and maintaining database objects, including stored procedures and functions.
Demonstrated experience in database design and development, advanced SQL/PL/SQL, and performance tuning (SQL Server preferred).
Hands-on expertise with enterprise database platforms such as DB2, SQL Server, Oracle, and/or Teradata.
Solid ETL and BI experience, including tools such as Informatica and Tableau or Power BI.
Excellent communication skills with the ability to partner effectively with both business users and IT teams across regions, and to lead delivery of IT outcomes.
Proven ability to collaborate with stakeholders and partners across geographies and time zones.
Strong growth mindset with a track record of personal excellence, collaboration, and continuous improvement.
Understanding of large-scale enterprise application requirements, including security controls, entitlements, and related governance considerations.
Experience working within Agile delivery methodologies (e.g., Scrum/Kanban).
Skills Desired:
Proficiency with Git and modern source control/development workflows (e.g., branching, code reviews, CI-friendly practices).
Strong computer science fundamentals, including scalable, resilient system design and sound engineering principles.
Hands-on experience building and maintaining BI dashboards using Tableau and/or Power BI.
Experience working with the Snowflake data platform (e.g., data modeling, performance considerations, and best practices).
Proven delivery experience in Agile environments (e.g., Scrum/Kanban), including iterative development and stakeholder collaboration.
Strong Python skills applied to data use cases, including:
Data engineering and automation (data extraction/transformation, process automation, database/API integrations; e.g., pandas, SQLAlchemy)
Analytics and reporting (data analysis, reusable workflows; e.g., pandas, NumPy)
ETL pipeline support (orchestration support, reusable utilities, and data quality validation/checks)
Working knowledge of database fundamentals, covering both relational and NoSQL technologies.
Basic familiarity with DevOps concepts and practices (e.g., CI/CD fundamentals, environment management, and deployment hygiene).