Job Title: Data Engineer
Location: New York (Hybrid)
Skills: Apptio, Big Data, ETL/Big Data/Data Warehousing, MS SQL, Oracle PL/SQL, Stored Procedure Coding, Python, PySpark, Node.js, RDBMS, Data Analytics
Job Description
Individual contributor on an Agile team following the SDLC
Data engineering expert: concept, strategy, and an end-to-end data-as-a-product/service philosophy
Expert in Python
Capable of devising ETL pipelines using Python from scratch
Expert in Airflow, including developing DAGs
Expert in Spark and PySpark
Willing to learn new cloud-based business apps and tools
Hands-on proficiency and expert-level knowledge of at least one programming language such as C, C++, or Java
Deep knowledge of RDBMS concepts and of data engineering as a concept and service (data warehousing, data lakes, and master data management)
Expert at analyzing and developing SQL queries in various dialects (SQL Server, DB2, Oracle)
Hands-on experience manipulating databases via DML, packages, stored procedures, triggers, and materialized views
Hands-on experience developing reports and dashboards using tools such as Qlik, Tableau, or Power BI
Hands-on experience integrating systems to process data through various channels (SOAP, REST, ETL, and SSIS)
Hands-on experience with Node.js and JSON
Ability to independently write efficient and reusable code for ETL pipelines
Expert in data modeling concepts such as schemas and entity relationships