Cloud Data Engineer

Chicago, IL, US • Posted 5 hours ago • Updated 5 hours ago
Contract W2
Travel Required
Able to Sponsor
On-site
Depends on Experience

Job Details

Skills

  • Caching
  • Analytics
  • Apache Avro
  • Apache Kafka
  • Artificial Intelligence
  • Cloud Computing
  • Clustering
  • Continuous Delivery
  • Continuous Integration
  • Data Validation
  • Data Modeling
  • Automated Testing
  • Computer Networking
  • Design Patterns
  • Data Engineering
  • Data Wrangling
  • Dashboard
  • Dependency Injection
  • Authentication
  • Authorization
  • Orchestration
  • LinkedIn
  • Management
  • Microservices
  • Microsoft Azure
  • GitHub
  • GitLab
  • IO
  • JSON
  • Kubernetes
  • Documentation
  • Extract, Transform, Load
  • Financial Services
  • Collaboration
  • Communication
  • Snowflake Schema
  • RESTful
  • SQL
  • SQL Tuning
  • Semantics
  • DevOps
  • Soft Skills
  • Terraform
  • Testing
  • Sprint
  • Docker
  • Agile
  • ELT
  • PySpark
  • Python
  • Pandas
  • API
  • Time Series
  • Workflow
  • Warehouse

Summary

Title: Senior Python Data Engineer

Location: Chicago, IL (Need local candidates)

Note: Only local candidates will be considered. Candidates must provide documentation confirming local residence, along with a link to their LinkedIn profile.

REQUIRED SKILLS

• Python expertise with deep experience in pandas for ETL/ELT and data wrangling (vectorization, memory management, IO, time series).
• Hands-on experience with Snowflake (SQL, performance tuning, warehouse configuration).
• Hands-on experience with Snowpark (Python) for scalable transformations.
• Strong FastAPI experience building production services (dependency injection, Pydantic models, async IO).
• Practical knowledge of Kafka (consumer groups, offsets, partitions, schema management).
• Experience designing event-driven microservices.
• Proficiency with Docker and Kubernetes (deployment strategies, networking, volumes; service meshes a plus).
• Solid understanding of testing, code quality, design patterns, API design, and clean architecture.
• Experience with CI/CD (GitHub Actions, GitLab CI, or Azure DevOps).
• Experience with IaC (Terraform or Helm preferred).
• Familiarity with data modeling and SQL.
• Familiarity with GitHub Copilot or similar AI-assisted coding tools.

Soft Skills

• Strong communication skills and the ability to work in a cross-functional, agile environment.

Nice to Have (Optional)

• Financial Services industry exposure.

We are seeking a Python Developer with strong expertise in data transformation, pandas, and modern data engineering practices. The ideal candidate will design and implement scalable data pipelines and APIs, leveraging Snowflake, Snowpark, and containerized environments. Experience with FastAPI and Kubernetes is essential. Familiarity with the financial services industry is a plus.


Project Overview/Role

• Design & Build Data Pipelines: Create reliable, testable data transformation workflows using Python (pandas, PySpark/Snowpark), optimizing for performance and maintainability.
• Snowflake Engineering: Implement Snowflake objects (tables, stages, tasks), write efficient SQL, and develop Snowpark-based transformations; manage performance (clustering, warehouses, caching) and cost.
• Service Development (FastAPI): Build RESTful/JSON APIs and backend services in FastAPI to expose data and business logic; implement authentication/authorization, rate limiting, and request validation.
• Containerization & Orchestration: Package services with Docker and deploy/operate them on Kubernetes; manage manifests, Helm charts, ConfigMaps/Secrets, health probes, autoscaling, and observability.
• Event-Driven Architecture: Produce/consume Kafka topics; design schemas (Avro/JSON/Protobuf), ensure idempotency, and implement exactly-once or at-least-once semantics where appropriate; apply stream processing patterns.
• Quality & Reliability: Write unit/integration tests, data validation checks, and contract tests; implement CI/CD (linting, type checks, security scans, test automation) and support blue/green or canary releases.
• Observability & Operations: Instrument services with logging, metrics, and tracing (e.g., OpenTelemetry); build dashboards and alerts.
• Collaboration: Partner with product, analytics, and platform teams; document designs, APIs, SLAs, and runbooks; participate in reviews and sprint ceremonies.

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.
  • Dice Id: 10495938
  • Position Id: 8938718

Company Info

About Ekcel Technologies Inc

Ekcel was founded on the idea of reimagining services and solutions in Telecom, IT, and Networking from a fresh perspective. At Ekcel, we believe innovation is the way forward to meet ever-increasing demand from a shrinking global market.

At Ekcel, we endeavor to achieve and perfect exactly this: implementing solutions for a changing world. A glimpse of the future, some might say. Achieving this is no easy task. It requires a highly dedicated and talented engineering workforce with deep knowledge, multiple skill sets, and the right attitude.
