Overview
On Site
Depends on Experience
Accepts corp to corp applications
Contract - W2
Contract - Independent
Contract - 6 Month(s)
Able to Provide Sponsorship
Skills
Data Architect
SQL
Python
Databricks
Power BI
Spark
Kafka
PySpark
ETL
Job Details
Role: Data Architect
Location: Santa Clara, CA or Remote
Job Description:
- Design, develop, and maintain robust end-to-end data pipelines for various data sources (e.g., databases, APIs, cloud storage).
- Extract, transform, and load (ETL) data using appropriate tools and technologies (e.g., SQL, Python, Databricks).
- Optimize data pipelines for performance, scalability, and reliability.
- Develop interactive and insightful Power BI dashboards and reports to visualize key business metrics and trends.
- Process large-scale data with Spark.
Please look for candidates who have excellent abilities in the following areas and very good communication skills.
- SQL
- Python
- Databricks
- Power BI
- Big Data (Spark and Kafka specifically)
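The ETL duties above (extract from a source, transform, load into a reporting store) can be sketched minimally in Python. This is an illustrative sketch only, using the standard library's sqlite3 in place of the actual Databricks/Spark stack; the table and column names are assumptions, not part of the client's environment.

```python
import sqlite3

def extract(conn):
    """Extract raw rows from the source table."""
    return conn.execute("SELECT id, amount FROM raw_sales").fetchall()

def transform(rows):
    """Transform: drop invalid amounts and convert cents to dollars."""
    return [(rid, amount / 100.0)
            for rid, amount in rows
            if amount is not None and amount >= 0]

def load(conn, rows):
    """Load cleaned rows into the reporting table."""
    conn.executemany(
        "INSERT INTO sales_clean (id, amount_usd) VALUES (?, ?)", rows)
    conn.commit()

def run_pipeline(conn):
    """End-to-end pipeline: extract -> transform -> load."""
    load(conn, transform(extract(conn)))

# Demo with an in-memory database (hypothetical sample data)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_sales (id INTEGER, amount INTEGER)")
conn.execute("CREATE TABLE sales_clean (id INTEGER, amount_usd REAL)")
conn.executemany("INSERT INTO raw_sales VALUES (?, ?)",
                 [(1, 1999), (2, None), (3, -5)])
run_pipeline(conn)
print(conn.execute("SELECT id, amount_usd FROM sales_clean").fetchall())
# [(1, 19.99)] — only the valid row survives the transform step
```

In a Databricks/PySpark setting the same extract/transform/load split would map onto DataFrame reads, transformations, and writes rather than raw SQL cursors.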
We will run a coding test on SQL and Python before taking the candidate to the client round.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.