ELT Jobs in Pennsylvania

81 - 100 of 119 Jobs

Google Cloud Platform Data Engineer - Permanent Role

Oscar Technology

Remote

Full-time

A New York-based media advertising software company is currently hiring a remote Google Cloud Platform Data Engineer to join their engineering team. As part of the team, your primary responsibility will be to build and maintain ETL pipelines, monitor and clean data, and support the dashboard team with optimized data views.
Permanent - $100K annual salary
Location - Remote
Requirements: 2+ years of data engineering experience; familiarity with Google Cloud Platform in a professional environment
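For illustration, a minimal sketch of the kind of ETL step this role describes, assuming the google-cloud-bigquery client library; the project, dataset, table, and bucket names are hypothetical placeholders:

```python
# Minimal ETL sketch: load a raw CSV from Cloud Storage into a BigQuery staging
# table, then materialize a cleaned table for the dashboard team.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

# Extract + load: pull a raw file from GCS into a staging table.
load_job = client.load_table_from_uri(
    "gs://example-bucket/events/2024-01-01.csv",      # placeholder bucket/path
    "example_project.staging.raw_events",             # placeholder table
    job_config=bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,
        write_disposition="WRITE_TRUNCATE",
    ),
)
load_job.result()  # wait for completion

# Transform: deduplicate and drop obviously bad rows into a clean table.
transform_sql = """
CREATE OR REPLACE TABLE example_project.analytics.events_clean AS
SELECT DISTINCT * FROM example_project.staging.raw_events
WHERE event_timestamp IS NOT NULL
"""
client.query(transform_sql).result()
```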

Data Lake Engineer-W2

Narvee Tech Inc

Remote

Contract

Job Summary: Seeking a Data Lake Engineer to design, develop, and manage scalable cloud-based data lake solutions. You will lead data ingestion, transformation, and governance, and support analytics and ML use cases across the enterprise.
Key Skills:
- Data lake architecture (AWS Lake Formation, Azure Data Lake, or Google Cloud Platform)
- ETL/ELT using Spark, Glue, NiFi, ADF
- Python, SQL, big data tools (Hive, Presto)
- Real-time ingestion (Kafka, Kinesis)
- Data governance, cataloging, and security (IAM
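For illustration, a minimal PySpark batch job of the kind this role describes, landing raw JSON into partitioned Parquet in a curated data lake zone; the bucket paths and column names are hypothetical:

```python
# Hypothetical data-lake ingestion step: read raw JSON events, apply light
# cleansing, and write partitioned Parquet to the curated zone.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("raw-to-curated").getOrCreate()

raw = spark.read.json("s3a://example-lake/raw/events/")  # placeholder path

curated = (
    raw.dropDuplicates(["event_id"])
       .filter(F.col("event_ts").isNotNull())
       .withColumn("event_date", F.to_date("event_ts"))
)

(curated.write
        .mode("overwrite")
        .partitionBy("event_date")
        .parquet("s3a://example-lake/curated/events/"))
```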

Senior Data Scientist

Hexacorp

Remote or Atlanta, Georgia, USA

Contract

We are looking for a highly skilled Senior Data Engineer with hands-on experience in AVEVA PI Historian and Google Cloud Platform to join our growing data engineering team. You'll be responsible for building reliable data pipelines, integrating industrial time-series data, and supporting real-time analytics for critical operations. Key Responsibilities: Design, build, and maintain ETL/ELT pipelines for time-series data from AVEVA PI and similar historian systems. Develop
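Purely as a sketch, one shape such a historian-ingestion step could take: polling a REST endpoint for recorded time-series values and staging them for loading. The URL, parameters, and response shape below are hypothetical placeholders, not the actual AVEVA PI Web API contract:

```python
# Hypothetical historian-to-warehouse step: fetch recorded values for one tag
# over a time window and stage them as a Parquet file for downstream loading.
import requests
import pandas as pd

HISTORIAN_URL = "https://historian.example.com/api/recorded"  # placeholder endpoint

def fetch_recorded(tag: str, start: str, end: str) -> pd.DataFrame:
    """Pull recorded values for one tag and return a tidy DataFrame."""
    resp = requests.get(
        HISTORIAN_URL,
        params={"tag": tag, "start": start, "end": end},
        timeout=30,
    )
    resp.raise_for_status()
    rows = resp.json()["items"]  # assumed response shape
    df = pd.DataFrame(rows)      # assumed columns: timestamp, value, quality
    df["timestamp"] = pd.to_datetime(df["timestamp"], utc=True)
    return df

if __name__ == "__main__":
    df = fetch_recorded("BOILER_1.TEMP", "2024-01-01T00:00:00Z", "2024-01-02T00:00:00Z")
    df.to_parquet("boiler_1_temp.parquet", index=False)  # staging file for the warehouse load
```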

Senior AVEVA PI Engineer

Hexacorp

Remote or Atlanta, Georgia, USA

Contract

We are looking for a highly skilled Senior Data Engineer with hands-on experience in AVEVA PI Historian and Google Cloud Platform to join our growing data engineering team. You'll be responsible for building reliable data pipelines, integrating industrial time-series data, and supporting real-time analytics for critical operations. Key Responsibilities: Design, build, and maintain ETL/ELT pipelines for time-series data from AVEVA PI and similar historian systems. Develop

Senior Data Engineer AVEVA PI System

Hexacorp

Remote or Atlanta, Georgia, USA

Contract, Third Party

We are looking for a highly skilled Senior Data Engineer with hands-on experience in AVEVA PI Historian and Google Cloud Platform to join our growing data engineering team. You'll be responsible for building reliable data pipelines, integrating industrial time-series data, and supporting real-time analytics for critical operations. Key Responsibilities: Design, build, and maintain ETL/ELT pipelines for time-series data from AVEVA PI and similar historian systems. Develop

Senior Data Engineer

Hexacorp

Remote or Atlanta, Georgia, USA

Contract, Third Party

We are looking for a highly skilled Senior Data Engineer with hands-on experience in AVEVA PI Historian and Google Cloud Platform to join our growing data engineering team. You'll be responsible for building reliable data pipelines, integrating industrial time-series data, and supporting real-time analytics for critical operations. Key Responsibilities: Design, build, and maintain ETL/ELT pipelines for time-series data from AVEVA PI and similar historian systems. Develop

Senior Devops Engineer

Hexacorp

Remote or Atlanta, Georgia, USA

Third Party, Contract

We are looking for a highly skilled Senior Data Engineer with hands-on experience in AVEVA PI Historian and Google Cloud Platform to join our growing data engineering team. You'll be responsible for building reliable data pipelines, integrating industrial time-series data, and supporting real-time analytics for critical operations. Key Responsibilities: Design, build, and maintain ETL/ELT pipelines for time-series data from AVEVA PI and similar historian systems. Develop

Snowflake Data Architect - Houston, TX

Activesoft, Inc.

Remote

Contract

Minimum Qualifications:
- 10+ years end-to-end data engineering experience; 3+ years Snowflake (enterprise scale)
- 4+ dbt projects in production (or 3 if each >12 months, multi-domain)
- Advanced SQL (analytic functions, dynamic pivots, query-profile debugging)
- Working Python (Snowpark, pandas) for ELT helpers and test harnesses
- Proven delivery of executive-grade financial dashboards or data marts; must demonstrate understanding of eliminations, inter-company balances, multi-GAAP roll-ups
- CI/CD with Gi
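For illustration, a small Snowpark-style ELT helper of the sort the "Working Python (Snowpark, pandas)" line implies; the account settings, database, schema, and table names are invented, and credentials would normally come from config or environment:

```python
# Hypothetical Snowpark ELT helper: aggregate intercompany balances by entity
# and fiscal month, then write the result to a reporting table.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, sum as sum_

connection_parameters = {
    "account": "example_account",   # placeholders only
    "user": "example_user",
    "password": "...",
    "warehouse": "ANALYTICS_WH",
    "database": "FINANCE",
    "schema": "RAW",
}

session = Session.builder.configs(connection_parameters).create()

balances = (
    session.table("FINANCE.RAW.INTERCOMPANY_BALANCES")
           .filter(col("STATUS") == "POSTED")
           .group_by("ENTITY", "FISCAL_MONTH")
           .agg(sum_(col("AMOUNT_USD")).alias("TOTAL_USD"))
)

balances.write.mode("overwrite").save_as_table("FINANCE.MART.ENTITY_MONTHLY_BALANCES")
session.close()
```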

Varicent Architect

Xoriant Corporation

Remote

Contract

o Bachelor's degree
o 7+ years Varicent experience including Version 10 / SaaS
o 3+ years' experience delivering in an Agile manner utilizing JIRA, Confluence, or similar tools
o Solution architect type; ability to frame and design the why and how-to of solutions
o Significant experience in consulting with mid- to senior-level executives
o Experience with Varicent Presenter, Adaptive Reports, Workflows, Webforms, and Query Optimization
o Experience with Varicent ELT Data Integrator
Responsibilities: Co

Technical Architect/Data Architect

MSquare Systems Inc.

Remote

Full-time

Key Responsibilities:
- Design scalable and secure end-to-end data pipeline architectures across cloud-native platforms (AWS, Azure)
- Build and demonstrate technical solutions using Databricks, Snowflake, Informatica, AWS Glue, etc.
- Integrate data validation and observability tools such as Great Expectations, Monte Carlo, or DataGaps
- Optimize pipeline performance, cost-efficiency, and data governance frameworks across diverse environments
- Collaborate with sales teams to understand client needs and
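For illustration, a plain pandas data-quality check in the spirit of the validation tools named above (a generic stand-in, not the Great Expectations or Monte Carlo APIs); the column names are hypothetical:

```python
# Generic data-quality gate: validate a pipeline output before publishing it.
import pandas as pd

def validate_orders(df: pd.DataFrame) -> list[str]:
    """Return human-readable failures; an empty list means the batch passes."""
    failures = []
    if df["order_id"].duplicated().any():
        failures.append("order_id contains duplicates")
    if df["order_total"].lt(0).any():
        failures.append("order_total has negative values")
    if df["order_date"].isna().any():
        failures.append("order_date has nulls")
    return failures

batch = pd.DataFrame(
    {"order_id": [1, 2, 2],
     "order_total": [10.0, -5.0, 7.5],
     "order_date": ["2024-01-01", None, "2024-01-02"]}
)

for problem in validate_orders(batch):
    print("Data quality check failed:", problem)  # or raise / alert downstream
```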

Technical Architect

MSquare Systems Inc.

Remote

Full-time

Position: Remote & Full-Time
Key Responsibilities: Solution Architecture & Implementation
- Design scalable and secure end-to-end data pipeline architectures across cloud-native platforms (AWS, Azure)
- Build and demonstrate technical solutions using Databricks, Snowflake, Informatica, AWS Glue, etc.
- Integrate data validation and observability tools such as Great Expectations, Monte Carlo, or DataGaps
- Optimize pipeline performance, cost-efficiency, and data governance frameworks across diverse environments

Azure Databricks Data Engineer

Innorev Technologies, Inc

Remote

Contract

Job Title: Azure Databricks Data Engineer (Remote)
Job Description: We are seeking a skilled Azure Databricks Data Engineer to join our team and lead the development of scalable data pipelines and analytics solutions on the Azure cloud platform. The ideal candidate will have hands-on experience with big data technologies, cloud integration, and data engineering best practices using Azure Databricks and related tools. Key Responsibilities: Design, develop, and maintain scalable and reliable ETL/ELT

Staff Database Engineer

Gardner Resources Consulting, LLC

Remote

Contract

Overview: We are seeking an experienced Staff Database Engineer (contractor) to design, build, and optimize complex data systems. This senior-level contractor will work across multiple domains, including data architecture, pipeline development, and system operations. Key Responsibilities: Design and implement scalable and reliable data architectures that support large-scale data processing, transformation, and analysis. Develop, maintain, and optimize ETL/ELT pipelines using modern tools and framew

Azure Data Engineer with Databricks - W2 Only

Everest Global Solutions

Remote

Contract

Job Summary: We are seeking a skilled Azure Data Engineer with hands-on experience in Azure Databricks and Azure Data Factory (ADF) to design, develop, and maintain scalable data pipelines and analytics solutions. The ideal candidate will have a strong understanding of cloud data engineering best practices and a passion for solving complex data problems. Key Responsibilities: Design and implement ETL/ELT pipelines using Azure Data Factory and Databricks. Develop scalable data models and workflows

Data Engineer - Snowflake and Python - W2 only

Rocket

Remote

Contract

Responsibilities: Architect, design, implement, and operate end-to-end data engineering solutions using Agile methodology. Develop and manage robust data integrations with external vendors and organizations (including complex API integrations). Collaborate closely with Data Analysts, Data Scientists, DBAs, and cross-functional teams to understand requirements and deliver high-impact data solutions. Lead and take ownership of assigned technical projects in a fast-paced environment. Drive continuous

Data Architect

EdgeAll

Philadelphia, Pennsylvania, USA

Contract

JD: Hands-on expertise with the Databricks platform, Unity Catalog, Delta Lake, and Lakehouse architecture, and streaming platforms like Kafka. Strong skills in SQL, Python, Spark, and the Azure cloud environment. Proven experience migrating SAP HANA databases, S/4 HANA, and SAP applications to Azure cloud or other public clouds. Develop data models, schemas, and data pipelines to support ETL/ELT processes, data warehousing, and analytics workloads in cloud environments. Work closely with data scientists, en
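For illustration, a minimal Structured Streaming sketch of the Kafka-to-Delta pattern this JD alludes to, assuming a Spark runtime with Delta Lake and the Kafka source available; the broker address, topic, checkpoint path, and table name are placeholders:

```python
# Hypothetical streaming ingestion: read events from Kafka and append them to
# a Delta table, with a checkpoint directory for progress tracking.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()  # e.g. a Databricks cluster session

events = (
    spark.readStream
         .format("kafka")
         .option("kafka.bootstrap.servers", "broker1.example.com:9092")  # placeholder
         .option("subscribe", "orders")                                   # placeholder topic
         .option("startingOffsets", "latest")
         .load()
)

# Kafka delivers binary key/value; keep the payload as a string plus metadata.
parsed = events.select(
    F.col("key").cast("string").alias("key"),
    F.col("value").cast("string").alias("payload"),
    F.col("timestamp").alias("kafka_ts"),
)

query = (
    parsed.writeStream
          .format("delta")
          .option("checkpointLocation", "/mnt/checkpoints/orders")  # placeholder path
          .outputMode("append")
          .toTable("bronze.orders_raw")                             # placeholder table
)

query.awaitTermination()
```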

Remote :: Data Architect || Perm Role

IT First Source

Remote

Full-time

Data Architect || Remote or Collegeville / Philadelphia, PA || Perm Role
Job Details
Required Skills: Extensive experience in data engineering, data governance, or cloud data architecture. Hands-on expertise with the Databricks platform, Unity Catalog, Delta Lake, and Lakehouse architecture, and streaming platforms like Kafka. Strong skills in SQL, Python, Spark, and the Azure cloud environment. Proven experience migrating SAP HANA databases, S/4 HANA, and SAP applications to Azure cloud or other public clouds

Data Architect with AI

Parkar Consulting Group, LLC

Remote

Contract

Job Title: Data Architect with AI
Location: 100% Remote
Job Type: Contract
Summary: We are seeking a highly skilled and experienced Data Architect with AI expertise for a remote contract opportunity. The ideal candidate will possess deep knowledge of modern data architecture and advanced AI integrations, including experience working with large language models (LLMs) and Retrieval-Augmented Generation (RAG) architectures. This role is critical to designing and implementing robust, scalable data
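For illustration, a toy sketch of the retrieval step in a RAG architecture; it uses a trivial bag-of-words similarity purely to show the control flow, where a real system would use an embedding model and a vector store, and the documents and question are invented:

```python
# Toy RAG retrieval step: rank documents against a question and assemble the
# prompt that would be sent to an LLM.
from collections import Counter
import math

DOCS = {
    "doc1": "The data platform ingests events through Kafka into Delta Lake tables.",
    "doc2": "Quarterly finance marts are rebuilt nightly by dbt jobs in Snowflake.",
    "doc3": "Access to curated datasets is governed through Unity Catalog grants.",
}

def vectorize(text: str) -> Counter:
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(question: str, k: int = 2) -> list[str]:
    q = vectorize(question)
    ranked = sorted(DOCS, key=lambda d: cosine(q, vectorize(DOCS[d])), reverse=True)
    return [DOCS[d] for d in ranked[:k]]

question = "How does data get into Delta Lake?"
context = "\n".join(retrieve(question))
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
print(prompt)  # this prompt would then be passed to the chosen LLM
```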

Enterprise Data Architect

CGT Staffing

Pittsburgh, Pennsylvania, USA

Full-time

Description: As the Enterprise Data Architect, you will play a crucial role in shaping the data landscape at our organization. The Enterprise Data Architect is responsible for designing, maintaining, and evolving the enterprise-wide data architecture to ensure data consistency, accessibility, and integrity across the organization. This role bridges the gap between business strategy and technical implementation by creating and enforcing standards for data modeling, governance, and integration. Responsibilities: