Data / BI Architect

  • Oakland County, MI
  • Posted 2 days ago | Updated 2 days ago

Overview

On Site
$60 - $90 per hour
Accepts corp to corp applications
Contract - Independent
Contract - W2
Contract - 1 Year

Skills

Azure Data Lake Storage
Synapse Analytics
BigQuery
Databricks
Snowflake
Oracle
PostgreSQL
Business Objects
Tableau
Crystal

Job Details


Title: Data / BI Architect
Client: Public Sector
Location: Oakland County, MI (Hybrid)

Skills Required

Experience architecting data solutions that support descriptive, diagnostic, predictive & prescriptive analytics.
Work closely with business & IT stakeholders to gather requirements & translate business needs into technical specifications, including identification of data sources.
Architect & implement scalable, secure & efficient data solutions, including data warehouses, data lakes, and/or data marts.
Design conceptual, logical & physical data models.
Evaluate, recommend & implement tools that fit the recommended architecture, including visualization tools suited to business needs.
Design, develop & test data pipelines, integrations with source systems & ETL/ELT processes to move data from various sources into the data warehouse.
Design, create & maintain an enterprise-wide data catalog, automating metadata ingestion, establishing data dictionaries, and ensuring that all data assets are properly documented & tagged.
Enforce data governance policies through the data catalog, ensuring data quality, security & compliance.
Enable self-service data discovery for users by curating & organizing data assets in an intuitive way.
Monitor & optimize BI systems & data pipelines to ensure high performance, reliability & cost-effectiveness.
Provide technical guidance & mentorship across the organization, establishing best practices for data management & BI development.

Data Platforms: Data warehouse & data lake concepts, including dimensional modeling & cloud services (S3, AWS Redshift, RDS, Azure Data Lake Storage, Synapse Analytics, BigQuery, Databricks, Snowflake, Informatica)
Databases: SQL & relational/non-relational databases (SQL Server, Oracle, PostgreSQL, MongoDB)
BI Tools: Power BI, Business Objects, Tableau, Crystal, Looker
ETL/ELT: Cloud-native services (AWS Glue, Azure Data Factory, Google Cloud Dataflow) & in-warehouse transformation tools (Fivetran, Talend, dbt)
Big Data Technologies: Hadoop, Spark, Kafka
Programming/API: Python, Keras, Scikit-learn, R, XML
ML/DL/Analytic Engines: TensorFlow, PyTorch, Trillium, Apache Spark
Modeling Tools: MS Visio, ER/Studio, PowerDesigner
Source Systems: On-premises, cloud & SaaS

About Dilytics Inc