DATA ARCHITECT

Overview

On Site
Depends on Experience
Contract - W2

Skills

Java
Tableau
Spring Framework
Microservices
Power BI
BigQuery
Databricks
PostgreSQL
Python
Kubernetes
Docker

Job Details

Needs:
Full end-to-end data lifecycle
Pipeline engineering - Spark or PySpark
Languages - Java, Python, or Scala
Visualization - Power BI, Alteryx, Tableau
Some AI experience

Nice to haves:
Dremio
Alteryx
Financial services experience

Interview process: 2 rounds, both virtual
Job Description:
8+ years of hands-on experience as a Data Architect working on innovative, scalable, data-heavy, on-demand enterprise applications and guiding data analysts & engineers

5+ years of:

- hands-on experience working on greenfield solutions, leading data tracks, and building data sourcing and integration from scratch

- working closely with partners in Infosec, Infrastructure, and Operations to establish data and system integration best practices

- identifying, managing, and remediating IT risk and tech debt

7+ years of hands-on experience building and delivering self-serve solutions leveraging various approved on-prem and cloud capabilities

Split time between leading design, implementation, and software engineers, and performing the hands-on role below:

7+ years of back-end and ETL experience consuming, producing, and integrating with enterprise data providers in batch or on-demand modes

5+ years of experience with data analytics and reporting using Power BI, SAS Viya, Tableau

5+ years of experience with cloud-based data solutions like OpenShift Data Foundation (ODF), OpenShift Data Access (ODA), BigQuery, Snowflake, Talend Cloud, Databricks

5+ years of experience with ETL tools like Alteryx, Talend, Xceptor, Apache Camel, Kafka Streams

4+ years of experience with data virtualization using tools like Dremio, BigQuery Omni, Red Hat Virtualization

7+ years of experience with REST APIs, Apigee, Kafka, and JSON to consume and produce data topics

7+ years of experience with data modeling, metadata management, data governance, data lineage and data dictionary

7+ years of experience with MS SQL, MySQL, PostgreSQL, MongoDB, Teradata, Apache Spark, and deployment tools like Liquibase

3+ years of experience with Java 8+, Spring Framework, Spring Boot, and microservices

3+ years of experience with Docker, Kubernetes, OpenShift containers

3+ years of experience with cloud providers like OpenShift (OCP), Google Cloud Platform, Azure

3+ years of experience with CI/CD tools like Jenkins, Harness, UCD, GitHub, Maven, Gradle

3+ years of experience with DevSecOps tools like SonarQube, GitLab, Checkmarx, Black Duck

3+ years of experience with logging and monitoring tools like ELK Stack, Splunk, AppDynamics

5+ years of proficiency in scripting languages such as Python, Bash, shell

3+ years of experience with test frameworks like Jasmine, Karma, Selenium, JUnit, JMeter, REST Assured, Postman

3+ years of experience working in an Agile environment using Scrum/Kanban

3+ years of experience working with Jira and Confluence
