At Allstate, great things happen when our people work together to protect families and their belongings from life's uncertainties. For more than 90 years, our innovative drive has kept us a step ahead of our customers' evolving needs, from advocating for seat belts, air bags, and graduated driving laws to being an industry leader in pricing sophistication, telematics and, more recently, device and identity protection.

Job Description: Founded by The Allstate Corporation in 2016, Arity is a da
Job Title: Data Architect with Canonical Modeling
Location: Remote (PST hours)
Duration: 3+ months
Pay Rate: $75/hr on C2C
Note: Candidates must use their own laptop.
Key Skills: Data Vault, Canonical Modeling, and Databricks
Position Overview: As a Data Solution Engineer/Architect (Advanced Analytics) within our professional services practice, you will lead the design and delivery of cloud-native analytics architectures that solve complex data challenges for enterprise clients. This is a hi
Job Title: Google Cloud Platform Engineer
Location: Remote
Contract: 6+ months, contract to hire
Job Description: The incumbent is responsible for the definition, development, and implementation of new systems and major enhancements to existing systems, as well as production support for highly complex systems. The incumbent is capable of providing project leadership for major feasibility or business systems analysis studies.
Required Qualifications: Bachelor's Degree or additional years of
Position: Senior Data Engineer (ETL, Python, and Google Cloud Platform)
Location: Canada (remote)
Job Type: Contract to hire
Job Description: We are looking for a Senior Data Engineer with deep expertise in ETL development and data warehousing, and strong skills in Python and SQL. The ideal candidate will have over 8 years of hands-on experience developing robust ETL pipelines, managing large data sets, and optimizing data processing workflows. While familiarity with BigQuery is a plus, we value pro
Position: Data Engineer
Location: 100% Remote (EST time zone)
Contract Duration: 6+ months (W2 only)
Must-Have Tech Stack:
- Python & PySpark (Spark SQL): 3+ years
- Airflow (or any orchestration tool): 2+ years
- Google Cloud Platform (BigQuery, GCS, Pub/Sub, Cloud Run, Functions, Cloud SQL): 3+ years
- Real-time data ingestion (Kafka, webhooks, file-based): 2+ years
- API integration (REST/webhooks): 2+ years
- Kubernetes (GKE preferred): 1-2 years
- BigQuery SQL & PostgreSQL: 2+ years
- YAML/config-driven pipeline