Overview
On Site
$70 - $75
Contract - W2
Contract - 12 Month(s)
Skills
Python
SQL
Snowflake
Google BigQuery
Amazon Redshift
EC2
S3
Lambda
SQS
SNS
Matillion
Dell Boomi
Informatica Cloud
Talend
AWS Glue
NumPy
Pandas
Job Details
We are seeking an experienced Sr. Data Engineer, Data Solutions to join our team. The ideal candidate will be responsible for developing Python modules using NumPy, Pandas, and dynamic programming in AWS and Snowflake, and will expand and optimize our ETL and data pipeline architecture and data flow across our Business Portfolio. This is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The Data Engineer will support our data product owners, developers, data architects, data analysts, and data scientists on BI and analytics initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products. The right candidate will be excited by the prospect of optimizing or even re-designing our company's data architecture to support our next generation of products and data initiatives.
The candidate will also assist in issue resolution, job orchestration, automation, and continuous improvement of our data integration processes.
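To give a flavor of the Python/NumPy/Pandas module work described above, here is a minimal, hypothetical sketch of a pipeline transformation step (the column names and business rules are illustrative assumptions, not this employer's actual code):

```python
import numpy as np
import pandas as pd

def transform_orders(raw: pd.DataFrame) -> pd.DataFrame:
    """Clean and enrich raw order records before loading to the warehouse.

    Hypothetical example of the Pandas/NumPy transformation work
    described in this posting; column names are assumed.
    """
    df = raw.copy()
    # Normalize column names to a consistent warehouse-friendly schema.
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
    # Coerce types; invalid values become NaN/NaT instead of raising.
    df["order_date"] = pd.to_datetime(df["order_date"], errors="coerce")
    df["amount"] = pd.to_numeric(df["amount"], errors="coerce")
    # Drop rows missing the keys downstream models depend on.
    df = df.dropna(subset=["order_id", "order_date"])
    # Derive a column with a vectorized NumPy expression.
    df["amount_bucket"] = np.where(df["amount"] >= 100, "large", "small")
    return df

sample = pd.DataFrame({
    "Order ID": [1, 2, None],
    "Order Date": ["2024-01-05", "not-a-date", "2024-02-01"],
    "Amount": ["150.0", "20", "5"],
})
clean = transform_orders(sample)
```

In practice, a module like this would be one stage in a larger orchestrated job that lands the cleaned frame in Snowflake or another cloud warehouse.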
Key Responsibilities
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using Cloud Integration ETL Tools, Cloud Data Warehouse, SQL, and AWS.
- Design and develop ELT, ETL, event-driven data integration architecture solutions.
- Work with Data Analysts, Data Architects, BI Architects, Data Scientists, and Data Product Owners to establish an understanding of source data and determine data transformation/integration requirements.
- Troubleshoot and tune complex SQL queries.
- Utilize On-Prem and Cloud-based ETL platforms, Cloud Data Warehouse, AWS, GitHub, various scripting languages, SQL, querying tools, data quality tools, and metadata management tools.
- Develop data validation processes to ensure data quality.
- Work both individually and as part of a team in a collaborative manner.
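As one illustration of the data-validation responsibility above, lightweight row-level checks can gate a load before it reaches the warehouse. This is a hypothetical sketch with assumed column names and thresholds, not the team's actual validation framework:

```python
import pandas as pd

def validate(df: pd.DataFrame) -> list:
    """Return a list of data-quality failures; an empty list means the batch passes."""
    failures = []
    # Required key must be present on every row.
    if df["customer_id"].isna().any():
        failures.append("null customer_id")
    # Keys must be unique before loading to a warehouse table.
    if df["customer_id"].duplicated().any():
        failures.append("duplicate customer_id")
    # Simple range check on a derived attribute (threshold is illustrative).
    if (df["signup_year"] < 2000).any():
        failures.append("signup_year out of range")
    return failures

batch = pd.DataFrame({
    "customer_id": [101, 102, 102],
    "signup_year": [2021, 1995, 2023],
})
issues = validate(batch)
```

A pipeline step like this would typically fail the job (or quarantine the offending rows) when `issues` is non-empty.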
Preferred Skills & Qualifications
- Bachelor's degree (or foreign equivalent) in Computer Science, Computer Engineering, or a related field.
- 8+ years of experience with data engineering, ETL, data warehouse, data mart, and data lake development.
- Advanced working knowledge of SQL, experience with relational databases and query authoring, and familiarity with a variety of databases.
- Experience working with Cloud Data Warehouses such as Snowflake, Google BigQuery, and Amazon Redshift.
- Experience with AWS cloud services: EC2, S3, Lambda, SQS, SNS, etc.
- Experience with Cloud Integration Tools like Matillion, Dell Boomi, Informatica Cloud, Talend, AWS Glue.
- Experience with GitHub and its integration with ETL tools for version control.
- Familiarity with modern data management tools and platforms.
- Experience with object-oriented/object function scripting languages: Python, Java, Scala, etc., is a plus.
- Experience with Agile/Scrum methodologies is valuable.
- Must speak English.
- Ability to work with business as well as IT stakeholders.
- Strong written and oral communication skills with the ability to work with users, peers, and management.
- Strong interpersonal skills.
- Ability to work independently and as part of a team to successfully execute projects.
- Highly motivated, self-starter with problem-solving skills.
- Ability to multitask and meet aggressive deadlines efficiently and effectively.
- Extraordinary attention to detail and accuracy.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.