Software Engineer III

Greenwood Village, CO, US • Posted 30+ days ago • Updated 3 hours ago
Contract W2
On-site
$60 - $62 hourly

Job Details

Skills

  • Adaptability
  • Conflict Resolution
  • Problem Solving
  • Regulatory Compliance
  • Cloud Computing
  • Communication
  • Collaboration
  • Knowledge Sharing
  • Offshoring
  • Mentorship
  • Documentation
  • Technical Writing
  • Python
  • Pandas
  • NumPy
  • Object-Oriented Programming
  • Code Optimization
  • SQL
  • Performance Tuning
  • Stored Procedures
  • Database
  • Scripting
  • Database Design
  • Amazon S3
  • Data Storage
  • Lifecycle Management
  • Amazon RDS
  • Database Administration
  • Amazon EC2
  • Identity Management
  • Amazon Web Services
  • Optimization
  • Data Engineering
  • Extract, Transform, Load (ETL)
  • ELT
  • Apache Spark
  • Data Processing
  • PySpark
  • Data Lake
  • Data Warehouse
  • Data Governance
  • DevOps
  • Continuous Integration
  • Continuous Delivery
  • Terraform
  • Git
  • Workflow
  • Version Control
  • Docker
  • Google Cloud Platform
  • Google Cloud
  • Cloud Storage
  • Artificial Intelligence
  • Messaging

Summary

RESPONSIBILITIES:
Kforce has a client that is seeking a Software Engineer III in Greenwood Village, CO.

Summary:
The Software Engineer III will support data engineering and infrastructure optimization. They will work alongside the Lead Data Developer on the Data team to build and maintain data pipelines, harden existing infrastructure, and establish cross-cloud connectivity. This role requires strong technical skills in Python and SQL, combined with excellent communication, adaptability, and collaborative problem-solving abilities.

Responsibilities:
Technical Execution:
* Learn and understand existing AWS infrastructure and data pipelines
* Build and maintain Spark-based ETL/ELT pipelines using Python and SQL
* Optimize existing data processing workflows for performance and cost
* Harden existing infrastructure and improve operational stability
* Refactor and optimize current processes for better maintainability
* Troubleshoot data pipeline and connectivity issues
* Ensure security and compliance across cloud platforms

Collaboration & Communication:
* Collaborate with Lead Data Developer on architecture decisions
* Work closely with Data team, DevOps team, Frontend team, and Business Analysts
* Participate in code reviews and knowledge sharing sessions
* Communicate progress, blockers, and technical decisions clearly
* Support offshore team members and mentor junior team members

Documentation & Best Practices:
* Document processes, patterns, and best practices
* Create runbooks and troubleshooting guides
* Maintain clear and up-to-date technical documentation
* Contribute to team standards and coding guidelines

REQUIREMENTS:
Core Technical Competencies (Required):
Python (Critical):
* Strong proficiency in Python for data engineering and automation
* Experience with data processing libraries (pandas, PySpark, NumPy)
* Script development for ETL/ELT workflows
* Object-oriented programming and code optimization
* Error handling and logging best practices
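As an illustration only (not taken from the posting), the Python skills listed above — pandas-based data processing, ETL-style scripting, and error handling with logging — might look something like this minimal sketch; all table and column names are hypothetical:

```python
import logging

import pandas as pd

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("etl_sketch")


def transform_orders(raw: pd.DataFrame) -> pd.DataFrame:
    """Illustrative transform step: coerce types, drop bad rows, derive a column."""
    df = raw.copy()
    # Coerce defensively; invalid values become NaN/NaT instead of raising
    df["amount"] = pd.to_numeric(df["amount"], errors="coerce")
    df["order_date"] = pd.to_datetime(df["order_date"], errors="coerce")
    bad = df["amount"].isna() | df["order_date"].isna()
    if bad.any():
        logger.warning("Dropping %d malformed rows", int(bad.sum()))
    df = df[~bad].copy()
    df["amount_usd_cents"] = (df["amount"] * 100).round().astype(int)
    return df


if __name__ == "__main__":
    raw = pd.DataFrame(
        {"amount": ["10.50", "oops", "3.25"],
         "order_date": ["2024-01-02", "2024-01-03", "not-a-date"]}
    )
    print(transform_orders(raw))  # only the fully valid first row survives
```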

SQL (Critical):
* Advanced SQL skills for data transformation and analysis
* Experience with SQL databases
* Complex query development and optimization
* Performance tuning and execution plan analysis
* Stored procedures, functions, and database scripting
* Understanding of indexing and database design
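The SQL skills above — query optimization, execution-plan analysis, and indexing — can be illustrated with a small self-contained sketch. Python's built-in sqlite3 module is used here purely for portability; the posting does not name a specific database, and the table is hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer_id, amount) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

query = "SELECT SUM(amount) FROM orders WHERE customer_id = 42"

# Without an index, the filter requires a full table scan
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()
print(plan_before)  # the detail text includes "SCAN"

# A covering index on (customer_id, amount) lets SQLite answer the
# query from the index alone, without touching the table
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id, amount)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()
print(plan_after)  # the detail text now includes "COVERING INDEX"
```

The same habit — inspect the plan before and after adding an index — carries over to the `EXPLAIN` facilities of server databases.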

AWS Services (Highly Recommended):
Hands-on experience with core AWS services:
* S3 (data storage and lifecycle management)
* RDS (relational database management)
* Lambda (serverless computing)
* EC2 (compute instances)
* Glue (ETL service)
* Athena (query service)
* AWS IAM for security and access management
* CloudWatch for monitoring and logging
* Understanding of AWS cost optimization strategies

Data Engineering (Required):
* Strong ETL/ELT pipeline development and maintenance
* Apache Spark and distributed data processing (PySpark)
* Experience with medallion architecture or similar data patterns
* Data lake and warehouse concepts
* Understanding of data governance, quality, and security
* Ability to learn and work within existing data architectures
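For orientation only: "medallion architecture," mentioned above, layers data as bronze (raw), silver (cleaned), and gold (business-ready) tables. In practice this would be built with Spark/PySpark as the posting describes; the pandas stand-in below, with hypothetical data, just shows the layering idea:

```python
import pandas as pd

# Bronze layer: raw ingested data, kept as-is
bronze = pd.DataFrame({
    "user_id": ["1", "2", "2", "x"],
    "event": ["click", "click", "buy", "click"],
})

# Silver layer: cleaned and conformed (valid ids only, typed columns)
silver = bronze[bronze["user_id"].str.isdigit()].copy()
silver["user_id"] = silver["user_id"].astype(int)

# Gold layer: business-level aggregate ready for analytics
gold = silver.groupby("event", as_index=False).agg(events=("user_id", "count"))
print(gold)
```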

DevOps & Infrastructure (Nice to Have):
* CI/CD pipeline experience
* Infrastructure as Code (Terraform)
* Git workflows and version control
* Container technologies (Docker)
* Google Cloud Platform services (BigQuery, Cloud Storage)

The pay range is the lowest to highest compensation we reasonably and in good faith believe we would pay for this role at the time of posting. We may ultimately pay more or less than this range. Employee pay is based on factors such as relevant education, qualifications, certifications, experience, skills, seniority, location, performance, union contract, and business needs. This range may be modified in the future.

We offer comprehensive benefits including medical/dental/vision insurance, HSA, FSA, 401(k), and life, disability & AD&D insurance to eligible employees. Salaried personnel receive paid time off. Hourly employees are not eligible for paid time off unless required by law. Hourly employees on a Service Contract Act project are eligible for paid sick leave.

Note: Pay is not considered compensation until it is earned, vested and determinable. The amount and availability of any compensation remains in Kforce's sole discretion unless and until paid and may be modified in its discretion consistent with the law.

This job is not eligible for bonuses, incentives or commissions.

Kforce is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, pregnancy, sexual orientation, gender identity, national origin, age, protected veteran status, or disability status.

By clicking "Apply Today" you agree to receive calls, AI-generated calls, text messages or emails from Kforce and its affiliates and service providers. Note that if you choose to communicate with Kforce via text messaging, the frequency may vary, and message and data rates may apply. Carriers are not liable for delayed or undelivered messages. You will always have the right to cease communicating via text by using key words such as STOP.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.
  • Dice Id: kforcecx
  • Position Id: ITTVT2171091

Company Info

About Kforce Technology Staffing

Kforce is a solutions firm specializing in technology, finance and accounting, and professional staffing services. Our KNOWLEDGEforce® empowers industry-leading companies to achieve their digital transformation goals. We curate teams of technical experts who deliver solutions custom-tailored to each client’s needs. These scalable, flexible outcomes are shaped by deep market knowledge, thought leadership and our multi-industry expertise. 

Our integrated approach is rooted in 60 years of proven success deploying highly skilled professionals on a temporary and direct-hire basis. Each year, approximately 18,000 talented experts work with the Fortune 500 and other leading companies. Together, we deliver Great Results Through Strategic Partnership and Knowledge Sharing®.

NYSE: KFRC

