Overview
Hybrid. Will need to be on-site at a minimum every Wednesday, and 2-3 days a week as needed.
$70 - $80
Full Time
Skills
Google Cloud Platform (GCP)
BigQuery
Cloud Dataflow
Cloud Storage
SQL
Data Modeling
Data Warehousing
ETL
ELT
Python
R
Java
GitHub
Job Details
A large government agency in Downtown Los Angeles is looking to hire a Database Architect for a 12-month contract. The position is hybrid: on-site at a minimum every Wednesday, and up to 2-3 days a week as needed.
The contractor will be responsible for designing, developing, and maintaining data pipelines, data models, and reporting solutions using Google Cloud Platform's BigQuery and Looker.
Duties to be performed
The contractor is expected to perform the following duties under the direction of the database section manager:
- Design and Develop Data Pipelines:
- Create and maintain scalable data pipelines using tools like Cloud Run, Cloud Dataflow, Cloud Functions, or other relevant ETL/ELT technologies to ingest, process, and store data in BigQuery.
- Build and Optimize Data Warehouses:
- Design, implement, and optimize BigQuery-based data warehouses to support reporting and analytics needs.
- Ensure Data Quality and Integrity:
- Implement data quality checks and validation processes to ensure the accuracy and consistency of data within BigQuery. Build and maintain the agency's cloud data infrastructure, including databases and data pipelines.
- Manage Data Security and Governance:
- Implement and maintain data security and governance policies to ensure compliance and protect sensitive data.
- Design and Develop Looker Dashboards and Visualizations:
- Build and maintain interactive dashboards and visualizations using Looker's reporting and analytics platform.
- Manage Code using Git:
- Utilize Git for version control, branching, merging, and pull requests
- Interface with various project team members and vendor representatives to resolve database-related software and hardware problems
- Communicate clearly, both in writing and verbally, with a diverse group of people
- Perform other related duties incidental to the work described herein
Required Skills
- SQL Proficiency: Strong SQL skills for data querying, manipulation, and analysis.
- Cloud Platform Expertise: Experience with Google Cloud Platform (GCP) services such as BigQuery, Cloud Dataflow, and Cloud Storage.
- Data Modeling and Warehousing: Knowledge of data modeling techniques and data warehousing concepts.
- ETL/ELT Processes: Experience with building and optimizing Extract, Transform, Load (ETL) or Extract, Load, Transform (ELT) pipelines.
- Programming Languages: Proficiency in programming languages such as Python, R, or Java.
- Strong experience using GitHub for version control and continuous integration and deployment (CI/CD).
Education and Certification
- 4-year B.S. degree in Computer Science or related field
- Equivalent coursework or technical training will be considered
- Certification in Google Data Engineering is a plus
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.