Overview
On Site
$DOE
Accepts corp to corp applications
Contract - Independent
Contract - W2
Contract - 12-24 month(s)
Skills
Oracle
Tableau
Snowflake
Kafka
PostgreSQL
Data Warehousing & Data Lake Platforms (AWS Redshift, Azure Synapse, BigQuery, Databricks)
ETL/ELT Development (AWS Glue, Azure Data Factory, Talend, dbt, Informatica)
BI & Reporting Tools (Power BI, Looker, Business Objects)
Big Data & Analytics (Hadoop, Spark, ML/DL with TensorFlow & PyTorch)
Database Management (SQL Server, MongoDB)
Job Details
Job Title: Data / BI Architect
Duration: 12 months
Location: Oakland, MI
Key responsibilities:
- Data Strategy: Design, develop & maintain the overall data strategy, ensuring data is accessible, reliable & secure for analysis and decision-making.
- Stakeholder Collaboration: Work closely with business & IT stakeholders to gather requirements & translate business needs into technical specifications, including identification of data sources.
- Data Architecture Design & Data Modeling: Architect & implement scalable, secure & efficient data solutions, including data warehouses, data lakes, and/or data marts. Design conceptual, logical & physical data models.
- Tool and Platform Selection: Evaluate, recommend & implement tools aligned with the recommended architecture, including visualization tools aligned with business needs.
- ETL/ELT Pipeline Management: Design, develop & test data pipelines, source-system integrations & ETL/ELT processes to move data from various sources into the data warehouse.
- Data Catalog & Metadata Management: Design, create & maintain an enterprise-wide data catalog, automating metadata ingestion, establishing data dictionaries, and ensuring that all data assets are properly documented & tagged.
- Data Governance and Discovery: Enforce data governance policies through the data catalog, ensuring data quality, security & compliance. Enable self-service data discovery for users by curating & organizing data assets in an intuitive way.
- Performance Optimization: Monitor & optimize BI systems & data pipelines to ensure high performance, reliability & cost-effectiveness.
- Technical Leadership: Provide technical guidance & mentorship across the organization, establishing best practices for data management & BI development.
Required Qualifications:
- 5+ years of experience with Data Platforms including Data Warehouse & Data Lake concepts, dimensional modeling, and cloud services (S3, AWS Redshift, RDS, Azure Data Lake Storage, Synapse Analytics, BigQuery, Databricks, Snowflake, Informatica).
- Strong expertise with Databases (SQL Server, Oracle, PostgreSQL, MongoDB) in both relational and non-relational environments.
- Hands-on experience with BI & Reporting Tools such as Power BI, Business Objects, Tableau, Crystal Reports, and Looker.
- Proficiency in ETL/ELT tools, including cloud-native solutions (AWS Glue, Azure Data Factory, Google Cloud Dataflow) and in-warehouse transformation tools (Fivetran, Talend, dbt).
- Knowledge of Big Data technologies such as Hadoop, Spark, and Kafka.
- Skilled in Programming & API development (Python, R, XML) with exposure to machine learning libraries/frameworks (Keras, Scikit-learn).
- Experience with ML/DL & Analytics Engines including TensorFlow, PyTorch, Trillium, and Apache Spark.
- Proficiency with Data Modeling Tools (MS Visio, ER/Studio, PowerDesigner).
- Familiarity with source systems across on-premises, cloud, and SaaS environments.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.