Role: Senior Data Architect – AWS & Snowflake (Healthcare Analytics)
Location: Tallahassee, FL - Hybrid
Job Description:
· Bachelor’s (4-year) degree in Computer Science, Analytics/Data Science, Information Systems, Business Administration, Public Health Informatics, or a related field.
· Current data and/or analytics certification, such as Certified Data Management Professional (CDMP), or, in lieu of certification, 18 or more hours of data and analytics webinars or conferences within the last 3 years.
· Minimum 5+ years of experience interfacing directly with various lines of business, with the ability to articulate technical architectures, solutions, and data models to non-technical audiences, preferably in healthcare.
· Minimum 6+ years of experience architecting, engineering, implementing, and supporting data warehouses including 2+ years with Snowflake Data Warehouse.
· Minimum 3+ years of experience architecting and supporting cloud-based data lakes including AWS S3 lake layers, bucket structures, and Parquet.
· Minimum 2+ years of experience with cloud data lakehouse architectures using Databricks, Snowflake, AWS, Delta Lake, Hudi, or Iceberg.
· Minimum 10+ years of experience in data modeling, covering entity-relationship (ER), conceptual, logical, and physical models with both schema-on-read and schema-on-write approaches, and proficiency with ERwin Data Modeler.
· Minimum 6+ years of experience with data pipeline and integration tools spanning ETL/batch, CDC, and streaming, using Informatica, AWS Glue, MuleSoft, Spark, Kinesis, or Kafka.
· Minimum 6+ years of SQL programming experience.
· Minimum 3+ years of experience with Python or a similar object-oriented language.
· Minimum 1+ year of AWS Lambda development experience.
· Minimum 5+ years of experience architecting relational and NoSQL databases including document, graph, vector, and key-value stores.
· Minimum 3+ years of experience working with cloud infrastructure and security teams to design AWS data ecosystems.
· Minimum 3+ years of experience architecting data protection controls, including DLP, RBAC/ABAC, and HIPAA compliance.
· Minimum 3+ years of experience designing internal and external data sharing hubs and governing APIs.
· Minimum 2+ years of experience utilizing DevOps or DataOps processes.
· Minimum 5+ years of experience in data and analytics testing, QA, and automation.
· Minimum 3+ years of experience with data governance solutions covering data quality, catalog, metadata, and lineage, using Informatica IDMC, Collibra, or Precisely.
· Minimum 2+ years of experience with Master Data Management using Informatica MDM, Semarchy, or Reltio.
· Minimum 4+ years of experience with Analytics & BI tools such as Qlik, Tableau, or Power BI.
· Minimum 2+ years of experience with DS/ML platforms such as SAS Viya, AWS SageMaker, or Dataiku.