Overview
Hybrid
Depends on Experience
Accepts corp to corp applications
Contract - W2
Contract - Independent
Contract - 1 Year(s)
No Travel Required
Skills
Advanced Analytics
Agile
Analytics
Apache Hadoop
Big Data
Business Intelligence
Informatica PowerCenter
Informatica
IBM Cognos
SQL
Shell Scripting
Unix
Tableau
Optimization
EMC GreenPlum
Data Warehouse
Data Management
Warehouse
VLDB
Qlik Sense
Reporting
Distribution
MPP
Database Design
Job Details
Description & Requirements
Our Team:
Company runs on data. As the Data Management & Analytics team within Engineering, we support our organization's needs around managing data efficiently and enabling everyone across the company to make informed decisions by providing insights into the data.
We are responsible for ingesting and preparing massive amounts of data for reporting, dashboards, self-service and advanced analytics.
A key objective of this role is for you to help build and support enterprise-level data analytics programs leveraging traditional warehouse technologies, Informatica, MPP databases and Hadoop.
To be successful:
- You should have a working knowledge of industry-standard data infrastructure tools (warehouse, BI, analytics, big data) with the goal of providing end users with analytics at the speed of thought.
- You should be proficient at developing, architecting, standardizing and supporting technology platforms using industry-leading ETL solutions.
- You should thrive in building scalable, high-throughput systems.
- You should have experience with agile BI & ETL practices to assist with interim data preparation for data discovery and self-service needs.
- You must have strong communication, presentation, problem-solving, and trouble-shooting skills.
- You should be highly motivated to drive innovations company-wide.
You'll need to have:
- 5+ years of experience in designing and developing ETL pipelines leveraging Informatica PowerCenter/IDMC.
- Strong understanding of data warehousing methodologies, ETL processing and dimensional data modeling.
- Ability to design, implement, test and maintain ETL components for multiple applications.
- Advanced SQL capabilities are required. Knowledge of database design techniques and experience working with extremely large data volumes is a plus.
- Demonstrated experience and ability to work with business users to gather requirements and manage scope.
- Experience programming in a Linux/UNIX environment including shell scripting.
- Experience working in a big data environment with technologies such as Greenplum, Hadoop and Hive.
- BA, BS, MS or PhD in Computer Science, Engineering or a related technology field.
We'd love to see:
- Experience with large database and data warehouse implementations (20+ TB).
- Understanding of VLDB performance aspects, such as table partitioning, sharding, table distribution and optimization techniques.
- Knowledge of reporting tools such as Qlik Sense, Tableau and Cognos.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.