Overview
On Site
$Competitive
Accepts corp to corp applications
Contract - W2
Skills
Advanced Analytics
Master Data Management
Data Quality
Data Marts
Data Integration
Information Management
Agile
Data Management
Application Development
Enterprise Architecture
RDBMS
Flat File
Parallel Computing
Microsoft SQL Server
Real-time
Change Data Capture
Data Mining
Unstructured Data
Data Collection
Analytical Skill
Research
TensorFlow
PyTorch
Artificial Intelligence
Amazon SageMaker
scikit-learn
Apache MXNet
Microsoft Azure
Analytics
Data Modeling
Logical Data Model
Regulatory Compliance
Meta-data Management
Mapping
Data Governance
As-is Process
Data Architecture
Deep Learning
Database Administration
Oracle
Amazon Redshift
MySQL
Snowflake Schema
Microsoft
Data Warehouse
Extract
Transform
Load
Informatica PowerCenter
Reporting
Qlik Sense
Tableau
Microsoft Power BI
Management
Programming Languages
R
Python
Database
SQL
Apache Hive
Apache Pig
Scala
Java
C++
Regression Analysis
Statistics
Machine Learning (ML)
k-nearest neighbors
Support Vector Machine
Mathematics
Calculus
Linear Algebra
Algorithms
Optimization
Job Details
Job Description:
The position is responsible for designing and implementing strategic data initiatives including enterprise data warehousing, metadata management, and machine learning solutions.
The candidate must have extensive experience in data architecture, ETL development, and advanced analytics using Qlik Sense.
Responsibilities:
Designs and implements strategic data initiatives, such as enterprise data warehouse (EDW), master data, data governance, data quality, metadata management, and data marts.
Administers and monitors automated and manual data integration and ETL jobs to verify execution and measure performance.
Designs and implements the Airports Authority-wide metadata and information management program, including development of enterprise conceptual, logical, and physical data models. Provides data architecture support to major application development initiatives in an agile environment.
Implements program standards and procedures to support data warehouse administration. Provides technical advice on data management and design to Application Development and Enterprise Architecture project teams, end-users, and business stakeholders.
Develops the design and Extract, Transform, and Load (ETL) coding of Source Dependent Extracts (relational databases, APIs, flat files, etc.), Source Independent Loads, and Post-Load Processes from source to target systems for operational data stores and dimensional data warehouses, loading into an in-memory or Massively Parallel Processing platform or a Structured Query Language (SQL) Server/Oracle data warehouse using near-real-time loads and Change Data Capture (see the incremental-load sketch after this list).
Performs data mining to extract usable data from valuable data sources. Uses machine learning tools to select features and to create and optimize classifiers (see the pipeline sketch after this list). Carries out preprocessing of structured and unstructured data.
Enhances data collection procedures to include all relevant information for developing analytic systems.
Processes, cleanses, and validates the integrity of data to be used for analysis.
Analyzes large amounts of information to find patterns and solutions. Develops prediction systems and machine learning algorithms.
Delivers ML projects from beginning to end, including understanding the business need, aggregating data, exploring data, building and validating predictive models, and deploying completed models with concept-drift monitoring and retraining to deliver business impact to the organization (see the drift-monitoring sketch after this list).
Researches and implements novel ML approaches using AI services, ML platforms, and frameworks (e.g., TensorFlow, PyTorch, OpenNN, H2O.ai, Spark ML, SageMaker, scikit-learn, MXNet, Azure Synapse Analytics, Google BigQuery). Implementation includes descriptive, predictive, and prescriptive analytics.
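For illustration, a minimal Python sketch of the change-data-capture-style incremental load pattern described above, using sqlite3 so it runs standalone; the table names (src_orders, dw_orders) and the updated_at watermark column are hypothetical stand-ins for the operational sources and warehouse targets named in the posting.

```python
# Illustrative only: table and column names are hypothetical.
import sqlite3

def incremental_load(src_conn, dw_conn, watermark):
    """Extract rows changed since the last watermark, transform, and load."""
    # Extract: change-data-capture-style pull based on a timestamp column.
    rows = src_conn.execute(
        "SELECT id, amount, updated_at FROM src_orders WHERE updated_at > ?",
        (watermark,),
    ).fetchall()

    # Transform: a trivial cleanup step (normalize amounts to cents).
    staged = [(r[0], int(round(r[1] * 100)), r[2]) for r in rows]

    # Load: upsert into the warehouse target table.
    dw_conn.executemany(
        "INSERT OR REPLACE INTO dw_orders (id, amount_cents, updated_at) VALUES (?, ?, ?)",
        staged,
    )
    dw_conn.commit()

    # Advance the watermark to the newest change seen in this batch.
    return max((r[2] for r in rows), default=watermark)

if __name__ == "__main__":
    src = sqlite3.connect(":memory:")
    dw = sqlite3.connect(":memory:")
    src.execute("CREATE TABLE src_orders (id INTEGER PRIMARY KEY, amount REAL, updated_at TEXT)")
    dw.execute("CREATE TABLE dw_orders (id INTEGER PRIMARY KEY, amount_cents INTEGER, updated_at TEXT)")
    src.execute("INSERT INTO src_orders VALUES (1, 19.99, '2024-01-02')")
    print(incremental_load(src, dw, "2024-01-01"))  # -> '2024-01-02'
```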
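Likewise, a small scikit-learn sketch of the feature selection, classifier building, and optimization workflow mentioned above, run on synthetic data; the pipeline steps and hyperparameter grid are arbitrary illustrative choices, not requirements of the role.

```python
# Preprocessing + feature selection + classifier optimization on synthetic data.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

X, y = make_classification(n_samples=500, n_features=20, n_informative=5, random_state=0)

pipe = Pipeline([
    ("scale", StandardScaler()),                 # preprocessing
    ("select", SelectKBest(f_classif)),          # feature selection
    ("clf", LogisticRegression(max_iter=1000)),  # classifier
])

# Optimize the number of selected features and the regularization strength.
search = GridSearchCV(pipe, {"select__k": [5, 10, 20], "clf__C": [0.1, 1.0, 10.0]}, cv=5)
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 3))
```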
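And a minimal sketch of the concept-drift monitoring mentioned above, comparing a live feature distribution against the training distribution with a two-sample Kolmogorov-Smirnov test; the single-feature check and the 0.05 threshold are simplifying assumptions, not a mandated policy.

```python
# Flag drift on one numeric feature; in practice this would run per feature.
import numpy as np
from scipy.stats import ks_2samp

def needs_retraining(train_feature, live_feature, alpha=0.05):
    """Return True when the live distribution differs significantly from training."""
    stat, p_value = ks_2samp(train_feature, live_feature)
    return p_value < alpha  # distributions differ -> schedule retraining

rng = np.random.default_rng(0)
train = rng.normal(0.0, 1.0, 2000)   # distribution seen at training time
live = rng.normal(0.5, 1.0, 2000)    # shifted distribution in production
print(needs_retraining(train, live))  # True -> retrain and redeploy
```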
Minimum Qualifications:
Eight years of progressively responsible experience in data warehousing and integration, including experience applying data modeling techniques and ETL coding.
Knowledge of and skill in developing and implementing a corporate data architecture program, with responsibility for enterprise conceptual/logical data modeling, data policies, standards and compliance monitoring, metadata mapping, data governance, and as-is/target data architecture, to participate in strategic data initiatives.
Experience in an ML engineer or data scientist role building and deploying ML models, or hands-on experience developing deep learning models.
Knowledge of and skill in working with database management systems, such as Oracle, Redshift, MySQL, Snowflake, and Microsoft SQL Server, as well as data warehouse solutions, ETL tools (such as Informatica PowerCenter and IICS), and reporting tools (such as Qlik Sense, Tableau, and Power BI), to centrally manage and analyze data originating from disparate source systems.
Programming skills: knowledge of statistical programming languages such as R and Python, and database query languages such as SQL, Hive, and Pig, is desirable. Familiarity with Scala, Java, or C++ is an added advantage.
Statistics: good applied statistical skills, including knowledge of statistical tests, distributions, regression, and maximum likelihood estimators. Proficiency in statistics is essential for data-driven companies (see the statistics sketch after this list).
Machine learning: good knowledge of machine learning methods such as k-Nearest Neighbors, Naive Bayes, support vector machines (SVM), and decision forests (see the model-comparison sketch after this list).
Strong math skills (multivariable calculus and linear algebra): understanding the fundamentals of multivariable calculus and linear algebra is important, as they form the basis of many predictive performance and algorithm optimization techniques (see the optimization sketch after this list).
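For illustration of the statistical toolkit named above (tests, distributions, regression, maximum likelihood), a short Python sketch on synthetic data; the distribution parameters and sample sizes are arbitrary.

```python
# Maximum likelihood estimation and simple regression with a significance test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# MLE for a normal sample: the sample mean and the (biased) sample standard deviation.
sample = rng.normal(loc=10.0, scale=2.0, size=1000)
mu_hat, sigma_hat = sample.mean(), sample.std()
print("MLE:", round(mu_hat, 2), round(sigma_hat, 2))

# Simple linear regression with a hypothesis test on the slope.
x = rng.uniform(0, 1, 200)
y = 3.0 * x + rng.normal(0, 0.1, 200)
fit = stats.linregress(x, y)
print("slope:", round(fit.slope, 2), "p-value:", fit.pvalue)
```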
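A side-by-side cross-validation of the machine learning methods named above (k-nearest neighbors, Naive Bayes, SVM, decision forests) on a standard toy dataset; the hyperparameters are illustrative defaults.

```python
# Compare the classical classifiers listed in the qualifications on the iris dataset.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
models = {
    "kNN": KNeighborsClassifier(n_neighbors=5),
    "Naive Bayes": GaussianNB(),
    "SVM": SVC(kernel="rbf"),
    "Decision forest": RandomForestClassifier(n_estimators=200, random_state=0),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: {scores.mean():.3f}")
```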
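Finally, a sketch of how multivariable calculus and linear algebra underpin algorithm optimization: gradient descent on a least-squares objective, checked against the closed-form normal-equations solution. The data, learning rate, and iteration count are arbitrary illustrative choices.

```python
# Gradient descent for least squares, verified against the normal equations.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w + rng.normal(0, 0.1, 200)

# Gradient of the mean squared error: (2/n) * X^T (Xw - y)
w = np.zeros(3)
lr = 0.1
for _ in range(500):
    grad = 2.0 / len(y) * X.T @ (X @ w - y)
    w -= lr * grad

w_closed = np.linalg.solve(X.T @ X, X.T @ y)  # normal-equations solution
print(np.round(w, 3), np.round(w_closed, 3))  # both approximate true_w
```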
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.