***We are unable to sponsor as this is a permanent full time role***
A prestigious company is searching for a Sr. Enterprise Data Architect. This role revolves around developing data architecture while defining and enforcing data architecture standards, governance, and controls. This person needs experience developing architecture reference models for data governance and control. For this role, the candidate must have 7+ years of data architecture experience and 3+ years working with big data architecture.
- Create, maintain, and govern architectural views and blueprints depicting the Business and IT landscape in its current, transitional, and future state.
- Recommend long-term direction on strategic advancements within the technical portfolio.
- Define and maintain standards for artifacts containing architectural content within the operating model.
- Build a Community of Practice for solutions architecture while leveraging architectural tools, processes, and practices.
- Offer insight, guidance, and direction on the usage of emerging trends and technical capabilities.
- Strong cloud data architecture knowledge with experience developing architecture strategies and plans to enable cloud data transformation, MDM, data governance, and data science capabilities.
- Design reusable data architecture and best practices to support batch/streaming ingestion; efficient batch, real-time, and near real-time integration/ETL; integration of quality rules; and structuring of data for analytic consumption by end users.
- Ability to lead software evaluations including RFP development, capabilities assessment, formal scoring models, and delivery of executive presentations supporting a final recommendation.
- Well versed in the Data domains (Data Warehousing, Data Governance, MDM, Data Quality, Data Catalog, Analytics, BI, Operational Data Store, Metadata, Unstructured Data, ETL, ESB).
- Experience with cloud data technologies such as Azure Data Factory, Azure Storage, Azure Data Lake Storage, Azure Databricks, Azure AD, Azure ML, etc.
- Experience with big data technologies such as Hadoop, Spark, Sqoop, Hive, Flume, Storm, and Kafka.
- Bachelor's degree and at least 15 years of experience in information technology OR,
- Master's degree and at least 12 years of experience in information technology OR,
- At least 17 years of experience in information technology.
- AND at least 2 years of lead experience.