Position: Data Engineer
Location: Remote
We currently have a contract opportunity for a Data Engineer to lend subject matter expertise in data architecture and design. We are looking for Data Engineers with Oracle experience who can assist in the technical assessment and design of new data pipelines that perform data mastering, data quality fixes, and application of business rules, to be executed in Snowflake. Strong communication skills are essential, and a design orientation is also key, as the role works closely with an external client.
- Assist in the technical assessment and design of new data pipelines that perform data mastering, data quality fixes, and application of business rules, to be executed in Snowflake
- Build and maintain complex data models
- Assess and research the current implementation of the platforms and define the course of action for modernization
- Provide technical vision and leadership; implement hands-on technology solutions to meet business requirements
- Collaborate with peers and the leadership team on process improvement ideas, policy and procedure enhancements, and opportunities to improve the customer service experience
- Collect, collate, and rationalize existing consumption requirement documents, and identify any major gaps
- Finalize consumption requirements, the data catalog, access methods, and process design
- Catalog available data subject areas and entities in Enterprise Data Lake in Snowflake & S3
- Catalog data subject areas and entities in source systems (primarily Oracle)
- Create a high-level current-state processing model documenting the business rules, mastering, and calculations performed across current data pipelines
- Document fiduciary non-functional requirements relating to data timing and recovery
- Perform a gap assessment of the EDL subject areas against the consumption requirements, given the catalog of subject areas and entities/relationships in existing source systems
- Create a target-state design of the data processing pipelines, describing the processing, reference data handling, data mastering, and data quality remediation steps performed in the target-state platform
- Design landing and API consumption patterns to support consumption requirements
- Bachelor’s Degree or higher in a technology-related field
- 8+ years of related experience in data engineering, analysis, data warehouses, and data lakes
- Strong understanding of and experience with methodologies such as data warehousing, data visualization, and data integration
- Solid experience with and understanding of designing and operationalizing large-scale data and analytics solutions on the Snowflake Cloud Data Warehouse is required
- Experience with extracting, transforming, and loading (ETL) data from structured and unstructured data sources
- Experience scripting and scheduling data load packages
- Experience with AWS S3, Lambda, Spring Batch, and Oracle planning tools preferred
If you are interested, or know someone who is looking for a new opportunity, please contact me.
Thanks and regards,
Xoriant is an equal opportunity employer. No person shall be excluded from consideration for employment because of race, ethnicity, religion, caste, gender, gender identity, sexual orientation, marital status, national origin, age, disability or veteran status.