Overview
On Site
Depends on Experience
Accepts corp to corp applications
Contract - Independent
Contract - 12 Month(s)
Skills
Apache Kafka
Data Lake
Databricks
Data Governance
Google Cloud Platform
Microsoft Azure
Machine Learning (ML)
SQL
Microsoft SSIS
Scripting
Job Details
Role: Senior Data Engineer
Location: Columbus, OH (Onsite)
Experience: 10+ Years
Duration: Long Term
Job Description:
Senior Data Engineer with strong Azure and ADF skills and hands-on Snowflake experience
- Design, develop and maintain scalable data pipelines
- Develop data ingestion and integration processes (REST, SOAP, SFTP, MQ, etc.)
- Take ownership of building data pipelines
- Actively engage in technology discovery and implementation, both on-premises and in the cloud (e.g., Azure or AWS), to build solutions for future systems
- Develop high-performance scripts in SQL, Python, etc. to meet enterprise data, BI, and analytics needs
- Incorporate standards and best practices into engineering solutions
- Manage code versions in source control and coordinate changes across the team
- Participate in architecture design and discussions
- Provide logical and physical data design, and database modeling
- Collaborate as part of the Agile team to help shape requirements
- Solve complex data issues around data integration, unusable data elements, unstructured data sets, and other data processing incidents
- Support the design and development of the internal data integration framework
- Work with system owners to resolve source data issues and refine transformation rules
- Partner with enterprise teams, data scientists, and architects to define requirements and solutions
Key Qualifications:
- Have a B.A./B.S. and 5-8 years of relevant work experience, or an equivalent combination of education and experience
- Must have strong experience with Snowflake
- Hands-on experience with the Microsoft stack (SSIS, SQL, etc.)
- Possess strong analytical skills with the ability to analyze raw data, draw conclusions, and develop actionable recommendations
- Experience with the Agile development process preferred
- Proven track record of excellence and of consistently delivering projects successfully
- Hands-on experience with Azure Data Factory V2, Azure Databricks, SQL DW or Snowflake, Azure Analysis Services, and Cosmos DB
- Experience with Python or Scala.
- Understanding of continuous integration and continuous deployment on Azure
- Experience with large-scale data lake or warehouse implementations on any of the public clouds (AWS, Azure, Google Cloud Platform)
- Have excellent interpersonal and written/verbal communication skills
- Manage financial information in a confidential and professional manner
- Be highly motivated and flexible
- Effectively handle multiple projects simultaneously and pay close attention to detail
- Have experience in a multi-dimensional data environment
- Google Cloud Platform Professional Data Engineer Certification or similar.
- Exposure to Apache Kafka, Cloud Functions, or AI/ML pipelines on Google Cloud Platform.
- Industry experience in healthcare, retail, or financial services.
- Understanding of data governance, security, and compliance standards in the cloud.