Job Details
1. Minimum of a Bachelor's (4-year) degree in Computer Science, Analytics/Data Science, Information Systems, Business Administration, Public Health Informatics, or another related field.
2. Current data and/or analytics certification such as Certified Data Management Professional (CDMP). Eighteen or more hours of participation over the last three years in webinars or conferences related to data and analytics may be substituted for the certification.
3. Five or more years of experience collaborating with various lines of business and the
analyst/data science community; must demonstrate an understanding of general
business operations related to healthcare and administration, the ability to translate use
case requirements into data and analytics (D&A) solutions, and the ability to articulate
D&A solutions, tool usage and data models to a non-technical audience.
4. Five or more years of experience modeling, engineering, implementing, and supporting
data warehouses (both normalized and dimensional) including two or more years of
experience with Snowflake Data Warehouse.
5. Three or more years of experience designing, engineering, implementing, and supporting cloud-based data lakes, including lake layers and bucket structures in AWS S3 and related Apache Software Foundation formats and tools such as Parquet.
6. Two or more years of experience modeling, engineering, implementing, and supporting
cloud data lakehouse structures (Delta Lake, Hudi, and Iceberg) utilizing Databricks.
7. Seven or more years of experience in data modeling (including Entity Relationship,
Logical, Conceptual, and Physical models) for analytical purposes, and data
profiling/reverse engineering for both schema-on-read and schema-on-write
environments. The experience must include advanced proficiency with erwin Data Modeler.
8. Five or more years of experience with data pipeline/integration tools, including source-to-target mapping, coding, data quality and enrichment transforms, observability, orchestration, performance optimization, and testing, via various methods (e.g., ETL/ELT/batch, CDC, streaming) using Informatica tools.
8.1. This experience must include two or more years of experience using the Informatica IDMC suite of tools, including Cloud Data Integration (CDI), Cloud Data Quality (CDQ), Cloud Data Profiling (CDP), and Cloud Data Ingestion and Replication (CDIR).
9. Five or more years of experience with SQL programming and three or more years of experience with Python or a similar object-oriented, high-level programming language. Experience with AWS Lambda functions is helpful.
10. Five or more years of experience engineering relational (both row and columnar store) and NoSQL (e.g., document, graph, vector, key-value) databases.
11. Two or more years of experience implementing and supporting Master Data Management (MDM) and Reference Data Management (RDM) in Informatica IDMC (Cloud) Customer 360 and Reference 360 SaaS.
12. Four or more years of experience supporting cloud-based Analytics & Business Intelligence (ABI) tools: Qlik and Tableau, including secure data warehouse connections, performance tuning, and user access controls.
13. Two or more years of experience supporting cloud-based Data Science and Machine Learning (DSML) platforms: Dataiku, including statistical model life-cycle management, endpoints, and machine learning.
14. Must possess excellent communication skills, including verbal, written, and diagramming.