Overview
Remote
On Site
$68 - $76 hourly
Accepts corp to corp applications
Contract - W2
Contract - Independent
Contract - Temp
Skills
Storage
Backup Administration
Replication
AWS Lambda
PL/SQL
Flat File
Extract Transform Load (ETL)
ELT
Shell Scripting
Collaboration
Terraform
Workflow
Analytics
Agile
Data Engineering
Cloud Computing
Amazon Web Services
Amazon S3
Amazon Kinesis
Microsoft Azure
DevOps
Continuous Integration
Continuous Delivery
Git
Data Security
PostgreSQL
Amazon RDS
NoSQL
Amazon DynamoDB
Python
Java
SQL
Business Intelligence
Microsoft Power BI
Tableau
Machine Learning (ML)
Virtualization
Snowflake Schema
Artificial Intelligence
Messaging
Job Details
RESPONSIBILITIES:
Kforce has a client that is seeking a Data Engineer in Phoenix, AZ. This is a critical, hands-on role where you'll help build and support foundational data infrastructure. You'll be responsible for designing, implementing, and optimizing cloud-based data pipelines and storage solutions, while documenting processes and supporting cross-functional teams.
Key Responsibilities:
* Administer and optimize Amazon RDS for PostgreSQL: backups, tuning, replication, and patching
* Design secure, highly available PostgreSQL environments (cloud-native and hybrid)
* Implement and support DynamoDB solutions with AWS Lambda/Kinesis integration
* Build scalable ETL pipelines using Azure Data Factory
* Ingest data from Oracle, SQL Server, and flat files into AWS
* Develop and maintain data lakes on AWS S3
* Automate ETL/ELT processes using Python, shell scripts, and cloud-native tools
* Collaborate on CI/CD, infrastructure as code (Terraform/CloudFormation), and Git workflows
* Enable BI and analytics access to structured/semi-structured data
* Participate in Agile ceremonies and contribute to best practices in data engineering
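For candidates gauging fit, the Lambda/Kinesis/DynamoDB integration described above typically looks something like the following minimal sketch. This is illustrative only and not part of the posting; the event shape follows the standard Kinesis-to-Lambda record format, while the record schema and the commented table name are hypothetical.

```python
import base64
import json

def lambda_handler(event, context):
    """Decode Kinesis stream records delivered to Lambda and stage
    them as DynamoDB-style items.

    Kinesis record payloads arrive base64-encoded under
    event["Records"][i]["kinesis"]["data"]. The "pk" attribute and
    the item layout here are hypothetical, for illustration only.
    """
    items = []
    for record in event["Records"]:
        payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
        items.append({"pk": payload["id"], "data": payload})
    # In a real deployment the items would be persisted with boto3,
    # e.g. via a DynamoDB Table batch writer (table name hypothetical):
    #   table = boto3.resource("dynamodb").Table("events")
    #   with table.batch_writer() as batch:
    #       for item in items:
    #           batch.put_item(Item=item)
    return {"processed": len(items)}
```

A sketch like this is usually paired with error handling and partial-batch retry reporting in production, which the role's pipeline work would cover.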
REQUIREMENTS:
* Bachelor's degree in Computer Science, Engineering, Statistics, or related field
* 5+ years in data engineering and cloud integration
* 3+ years of experience in data ingestion and automation using Python or Java
* 3+ years of hands-on experience with AWS data services (RDS, S3, Lambda, Kinesis, IAM, DynamoDB)
* 3+ years building pipelines with Azure Data Factory
* Experience with DevOps, CI/CD, Git, and data security best practices
* Deep expertise in PostgreSQL and RDS administration
* Familiarity with NoSQL systems (DynamoDB preferred)
* Strong SQL skills and working knowledge of Python or Java
* Proven ability to write efficient, optimized SQL queries
* Exposure to BI tools (Power BI, Qlik, Tableau), ML/AI, and data virtualization (e.g., Denodo) is a plus
* Bonus: Experience with Snowflake, DBT, or other modern data stack tools
The pay range represents the lowest to highest compensation we reasonably and in good faith believe we would pay for this role at the time of posting. We may ultimately pay more or less than this range. Employee pay is based on factors such as relevant education, qualifications, certifications, experience, skills, seniority, location, performance, union contract, and business needs. This range may be modified in the future.
We offer comprehensive benefits including medical/dental/vision insurance, HSA, FSA, 401(k), and life, disability & AD&D insurance to eligible employees. Salaried personnel receive paid time off. Hourly employees are not eligible for paid time off unless required by law. Hourly employees on a Service Contract Act project are eligible for paid sick leave.
Note: Pay is not considered compensation until it is earned, vested and determinable. The amount and availability of any compensation remains in Kforce's sole discretion unless and until paid and may be modified in its discretion consistent with the law.
This job is not eligible for bonuses, incentives or commissions.
Kforce is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, pregnancy, sexual orientation, gender identity, national origin, age, protected veteran status, or disability status.
By clicking "Apply Today" you agree to receive calls, AI-generated calls, text messages or emails from Kforce and its affiliates, and service providers. Note that if you choose to communicate with Kforce via text messaging the frequency may vary, and message and data rates may apply. Carriers are not liable for delayed or undelivered messages. You will always have the right to cease communicating via text by using key words such as STOP.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.