Data Engineer II

  • Sunnyvale, CA
  • Posted 8 hours ago | Updated 8 hours ago

Overview

On Site
USD 60.00 - 64.00 per hour
Full Time

Skills

HTML
Artificial Intelligence
Analytics
SOAP
Management
Data Storage
Database
Data Quality
Technical Writing
Data Security
FOCUS
Database Administration
API
Authentication
SQL
Database Design
Query Optimization
Relational Databases
PostgreSQL
Data Loading
Performance Tuning
Data Processing
ELT
PySpark
Data Engineering
Python
Cloud Computing
Amazon Web Services
Microsoft Azure
Google Cloud Platform
Snowflake
Databricks
Data Warehouse
Problem Solving
Conflict Resolution
Communication
Collaboration
Extract, Transform, Load (ETL)
Orchestration
Docker
Data Governance
Regulatory Compliance
Tableau
Dashboard
Business Intelligence
Visualization
Data Structure
Information Technology
Privacy
Finance
Credit Cards
Banking
Onboarding
Payroll
Training

Job Details

Position: Data Engineer II
Location: Remote
Duration: Contract
Job ID: 168661

Job Overview:
We are seeking an experienced and highly skilled Data Engineer II for a contract position. This role focuses on building robust data pipelines, integrating data from various APIs, and managing critical data infrastructure. The successful candidate will work hands-on with Databricks, Snowflake, and relational databases to ensure data is clean, reliable, and optimized for analytics tools like Tableau. This is a project-based contract role that requires someone who can ramp up quickly and deliver results.

Responsibilities:
  • Design, develop, and maintain scalable data ingestion pipelines to extract data from various third-party and internal APIs (REST, SOAP, etc.).
  • Implement efficient data transformation and loading processes (ETL/ELT) within the data platform.
  • Manage and optimize data storage and schemas in Snowflake and Postgres databases.
  • Utilize Databricks for data processing, transformation, and orchestration tasks.
  • Ensure data quality, accuracy, and integrity throughout the data pipelines.
  • Collaborate with data analysts and BI developers (particularly Tableau users) to understand data requirements and optimize data models for performance.
  • Monitor data pipelines and systems for performance issues, errors, and data discrepancies, implementing necessary fixes and improvements.
  • Develop and maintain technical documentation for data pipelines, processes, and data models.
  • Troubleshoot data-related issues and provide timely resolutions.
  • Implement best practices for data security and governance within the data platform.
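To illustrate the kind of API ingestion work described above, here is a minimal Python sketch of a paginated pull with retry handling and defensive record parsing. The endpoint shape, field names (`results`, `id`, `amount`), and the injected `fetch_page` callable are hypothetical, not part of the posting:

```python
import json
import time

def parse_records(payload: str) -> list[dict]:
    """Parse a JSON API response body, keeping only well-formed records."""
    data = json.loads(payload)
    records = []
    for item in data.get("results", []):
        # Skip records missing the fields downstream loads depend on.
        if "id" in item and "amount" in item:
            records.append({"id": item["id"], "amount": float(item["amount"])})
    return records

def ingest(fetch_page, max_retries: int = 3) -> list[dict]:
    """Pull all pages from an API, retrying transient failures.

    `fetch_page(page)` returns a raw JSON body, or None when pages run out.
    """
    all_records, page = [], 0
    while True:
        for attempt in range(max_retries):
            try:
                body = fetch_page(page)
                break
            except ConnectionError:
                time.sleep(2 ** attempt)  # exponential backoff between retries
        else:
            raise RuntimeError(f"page {page} failed after {max_retries} retries")
        if body is None:
            return all_records
        all_records.extend(parse_records(body))
        page += 1
```

In practice the `fetch_page` callable would wrap an authenticated HTTP client (token refresh, rate-limit headers); keeping it injected like this makes the pipeline testable without network access.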

Qualifications:
  • Proven experience as a Data Engineer, with a strong focus on API integration and database management.
  • Experience building data pipelines for API data ingestion, including handling authentication, error handling, and data parsing.
  • Strong proficiency in SQL and experience with database design, query optimization, and performance tuning in relational databases (e.g., Postgres).
  • Hands-on experience with Snowflake as a cloud data warehouse, including data loading, querying, and performance optimization.
  • Experience using Databricks for data processing, ETL/ELT, and pipeline orchestration (e.g., PySpark, notebooks).
  • Proficiency in a programming language commonly used for data engineering (e.g., Python).
  • Experience working with cloud platforms (e.g., AWS, Azure, Google Cloud Platform) where Snowflake and Databricks are deployed.
  • Understanding of data warehousing concepts and best practices.
  • Experience supporting BI tools (like Tableau) by providing clean, structured, and performant data sources.
  • Excellent problem-solving skills and ability to work independently in a fast-paced environment.
  • Strong communication and collaboration skills.
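As a small example of the transformation and deduplication work implied by the SQL and data-loading qualifications above, the sketch below keeps only the latest version of each record before a warehouse load. The field names are hypothetical; in Snowflake the same logic is typically a `ROW_NUMBER() OVER (PARTITION BY ...)` filter, shown here in plain Python for clarity:

```python
def latest_by_key(rows: list[dict], key: str, version: str) -> list[dict]:
    """Deduplicate rows, keeping the row with the highest `version` value
    per `key` — the equivalent of filtering ROW_NUMBER() = 1 over a
    partition ordered by `version` descending."""
    best: dict = {}
    for row in rows:
        k = row[key]
        if k not in best or row[version] > best[k][version]:
            best[k] = row
    # Return in a deterministic order for stable downstream loads.
    return sorted(best.values(), key=lambda r: r[key])
```

Pushing this dedup into the warehouse query itself (rather than Python) is usually the better-performing choice at scale; the in-memory version is useful for unit tests and small staging batches.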

Desired Skills (Nice to Have):
  • Experience with specific ETL/orchestration tools (e.g., Airflow, Fivetran, dbt).
  • Familiarity with containerization technologies (e.g., Docker).
  • Experience in a contract or consulting role.
  • Knowledge of data governance and compliance standards.
  • Experience with BI tools like Tableau, including dashboard creation or assisting BI developers with visualization best practices related to data structure.

About PTR Global: PTR Global is a leading provider of information technology and workforce solutions. PTR Global has become one of the largest providers in its industry, with over 5000 professionals providing services across the U.S. and Canada.
At PTR Global, we understand the importance of your privacy and security. We NEVER ASK job applicants to:
  • Pay any fee to be considered for, submitted to, or selected for any opportunity.
  • Purchase any product, service, or gift cards from us or for us as part of an application, interview, or selection process.
  • Provide sensitive financial information such as credit card numbers or banking information. Successfully placed or hired candidates would only be asked for banking details after accepting an offer from us during our official onboarding processes as part of payroll setup.

Pay Range: $60 - $64 per hour

The specific compensation for this position will be determined by several factors, including the scope, complexity, and location of the role, as well as the cost of labor in the market; the skills, education, training, credentials, and experience of the candidate; and other conditions of employment. Our full-time consultants have access to benefits, including medical, dental, vision, and 401K contributions, as well as PTO, sick leave, and other benefits mandated by applicable state or localities where you reside or work.

#LI-KW

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.
