Spark Jobs

1,841 - 1,860 of 2,090 Jobs

Senior Python Developer

Strategic Staffing Solutions

Charlotte, North Carolina, USA

Contract

Job Title: Senior Python Developer. Location: Charlotte, NC. Department: Data Engineering / AI & Machine Learning. Employment Type: Contract. Job Summary: We are seeking a highly skilled Senior Python Developer with deep experience in distributed computing, machine learning, and data engineering. The ideal candidate will have a strong command of Python, hands-on expertise with Apache Spark and Airflow, and a background in building scalable APIs and microservices. Familiarity with cloud computing plat

Google Cloud Platform Data Engineer

Ztek Consulting

Richardson, Texas, USA

Contract, Third Party

Hi, I would like to share an excellent contract opening for a Google Cloud Platform Data Engineer; please go through the details and kindly send me your updated resume. Location: Richardson, TX (Hybrid). Type of Hire: Contract. Mode of interview: WebEx / Teams. Job Description: Google Cloud Platform - Dataproc, Kubernetes Engine, Cloud Storage, BigQuery, Pub/Sub, Cloud Functions, and Dataflow. Cloud Composer. ETL experience - working with large data sets, PySpark, Python, Spark SQL, DataFrames, PyTest. Tha

Contract Only-Azure Databricks Lead with Pyspark Python

Corporate Biz Solutions Inc

Branchville, New Jersey, USA

Third Party, Contract

Azure Databricks Lead with Pyspark Python. Branchville, NJ (onsite). Contract. Primary Technologies: Azure Databricks, Spark, Python. Database Technologies: SQL, MS SQL Server, relational databases. Adherence to testing best practices and methodologies (unit, end-to-end, etc.). Adherence to application development documentation best practices. Experience in conducting design/code reviews. We are an Equal Opportunity Employer. Thanks! Tara

Sr. Data Engineer With strong Google Cloud Platform Background

Aroha Technologies

US

Third Party

Title: Google Cloud Platform Data Engineer. Experience: 8+ Years. Location: Remote. Note: Need eCommerce Domain. Primary Skills: PySpark, Spark, Python, Big Data, Google Cloud Platform, Apache Beam, Dataflow, Airflow, Kafka, and BigQuery. Good to Have: GFO, Google Analytics. Job Description: 7-10 years' experience as a data warehouse engineer/architect designing and deploying data systems in a startup environment. Mastery of database and data warehouse methodologies and techniques, from tra

Data Engineer

VLink Inc

Seattle, Washington, USA

Third Party, Contract

Data Engineer - Seattle, WA (local to Seattle only). MUST HAVE: strong ecommerce, ETL, Azure, Snowflake. Responsibilities: Technical leadership for the expansion and optimization of data collection and data pipeline architecture. Subject matter expert in development and integration with web services and APIs. Monitor and support data pipelines and ETL workflows. Cloud (Azure) infrastructure administration: network configurations, access and permissions, cloud services. Oversee, participate in, and manage

Big Data Developer

URSI Technologies Inc.

Remote

Contract, Third Party

Role: Big Data Developer (10+ years' experience needed). Location: Austin, Texas. Duration: 12 months. Qualifications / Required Skills: Should be able to communicate effectively with business and other stakeholders. Demonstrate ownership. Hands-on experience designing and implementing data applications in production using Java/Python/R, etc., on a big data platform. Experience with related/complementary open-source software platforms and languages such as Java, Linux, Apache, OpenStreetMap, D3.js, etc. Str

Java Developer with MongoDB

Accuro Group

McLean, Virginia, USA

Third Party

Note: We are looking for local candidates in McLean, VA & Plano, TX. Design, develop, and maintain scalable and high-performance Java-based applications with a focus on data-intensive processes. Work extensively with MongoDB and other NoSQL databases to manage and optimize data storage and retrieval. Collaborate with cross-functional teams to architect solutions leveraging AWS services such as S3, Lambda, EC2, RDS, etc. Implement and maintain data pipelines and streaming solutions using tools li

Data Engineer

DVARN

Remote

Contract

Position: Data Engineer. Location: 100% remote. Must Haves: Python & PySpark (Spark SQL), 3+ years; Airflow (or any orchestration tool), 2+ years; Google Cloud Platform (BigQuery, GCS, Pub/Sub, Cloud Run, Functions, Cloud SQL), 3+ years; real-time data ingestion (Kafka, webhooks, file-based), 2+ years; API integration (REST/webhooks), 2+ years; Kubernetes (GKE preferred), 1-2 years; BigQuery SQL & PostgreSQL, 2+ years; YAML/config-driven pipeline design, 2+ years; schema transformation, hashing, DQF, 2+ years; CI

Full stack Engineer

Prism IT Corp

Remote

Full-time

Job Description: About the Role: The Relevance Team builds machine learning models to power product experiences and item rankings across various surfaces. The Difference You Will Make: Explore, shape, and develop new product experiences alongside engineers and product managers, from ideation to implementation. Develop prototypes to iteratively validate ideas that push technical capabilities. Integrate AI capabilities into future products. Your Expertise: 3+ years of frontend software development ex

Python Developer

Cynet Systems

McLean, Virginia, USA

Contract

Job Description: Responsibilities: Minimum 6+ years of IT experience implementing Python, Spark, Pandas, and related Python technologies. Experience in developing highly scalable enterprise-level web applications and RESTful APIs using microservices. Gained experience in design patterns. Work on implementation of industry-standard protocols related to API security, including OAuth. Good exposure and progressive experience working on AWS Cloud and integrations. Good with DevOps, CI/CD - Jenkins. Mu

AWS Engineer

Technology Ventures

Reston, Virginia, USA

Contract, Third Party

Job title: AWS Engineer. Location: Reston, VA. Job Description: Skilled in AWS services such as EC2, SSM, CloudFormation, CloudWatch, and Lambda. Experience with Spark and Amazon EMR/Hadoop, PySpark, and AWS Glue. Skilled in programming languages like Python and Unix shell scripting. Experience with Terraform module creation and usage. Experience automating repetitive tasks related to provisioning of AWS services, installation, configuration, or management of infrastructure using Terraform, GitLab, Ansible

Azure Data Engineer Consultant

Korn Ferry

Remote

Contract

Job Summary: We are seeking a highly skilled Azure Data Engineering Consultant to join our data team and drive the development and optimization of enterprise-grade data solutions. The ideal candidate will have extensive experience in enterprise ETL, strong SQL skills, and a solid background in Azure-based data engineering. Proficiency in either Python or Spark SQL is essential to succeed in this role. Key Responsibilities: Design, develop, and optimize scalable ETL pipelines and data workflows

Data Engineer (Hudson)

Aroha Technologies

California, USA

Third Party

Job Title: Senior Data Engineer. Location: Remote. Job Requirements: Expert in Databricks, Lakehouse architecture/design/implementation, Spark, and building data pipelines, both batched and event-based. Good working knowledge of Azure services (primarily ADF, Logic Apps, Azure Functions, Azure SQL). Good understanding of Git. Solid understanding of dimensional data modeling. Experience needs to include implementation of data warehouses/data marts. Good grasp of relational databases, ability to understand

Google Cloud Engineer Terraform development exp

Zenosys

Remote

Contract

Hello, Zenosys is looking for a Google Cloud Platform Engineer for a client (remote work). If you are available and interested in the opportunity below, please send me your updated resume. Thanks. Job Title: Google Cloud Platform Engineer. Location: Remote. Duration: 12+ month contract, with a high chance to extend. If you are interested, please provide the following details along with your updated resume to speed up the interview process. Full legal name as mentioned on driver's license: Current location: Willing t

Senior Data Engineer

Data Capital Inc

Sunnyvale, California, USA

Full-time

Total Experience: 11+ years. Google Cloud Platform Experience: 4+ years of recent Google Cloud Platform experience; experience building data pipelines in Google Cloud Platform; Google Cloud Platform Dataproc, GCS & BigQuery experience. 12+ years of hands-on experience developing data warehouse solutions and data products. 6+ years of hands-on experience developing a distributed data processing platform with Hadoop, Hive or Spark, and Airflow or a workflow orchestration solution are required. 5

Senior Java Developer with Capital Markets Industry Background - Toronto, ON, Canada - Fulltime/Permanent Position

Activesoft, Inc.

Toronto, Ontario, Canada

Full-time

Senior Java Developer with Capital Markets Industry Background. Fulltime/Permanent Position. Toronto, ON, Canada. The ideal candidate is an experienced data pipeline builder and data wrangler who understands the Capital Markets domain and enjoys optimizing data systems and building them from the ground up. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products. The right candidate will be excited by the prospect of optimizing or even re-designin

Databricks Administrator

MRoads

Memphis, Tennessee, USA

Full-time

Hi, our client is looking for a Databricks Administrator in multiple locations. Title: Databricks Administrator. Location: Memphis, TN; Maryville, TN; Birmingham, AL; Lafayette, LA; New Orleans, LA; Charlotte, NC; Raleigh, NC. Duration: Long term. Technical Competencies: Strong understanding of data engineering needs. Strong understanding of the Azure infrastructure required by Databricks. Experience with compliance audits/audit documentation. Experience using Databricks APIs with tools like Postman. Understandin

Senior Python Developer - 6+ Months - Princeton, NJ (3 days Onsite per week)

Avenues International, Inc.

Princeton, New Jersey, USA

Contract, Third Party

Hi, we are looking to hire suitable candidates for a Senior Python Developer opportunity with one of our clients in Princeton, NJ (3 days onsite per week). Position Name: Senior Python Developer. Duration: 6+ Months. Location: Princeton, NJ (3 days onsite per week). Skills needed: 4+ years of experience programming in Python. Skills in Java are also required. Solid understanding of database systems and SQL, including Oracle and PostgreSQL. AWS experience including services like S3, Lambda, and CloudWatch. A Degre

Databricks Engineer / Developer @ Louisville KY (Local only)

Borza Inc.

Louisville, Kentucky, USA

Contract

Hello, please find the JD below and let me know if you have any suitable match. Job Title: Databricks Engineer / Developer. Location: Louisville, KY (local only) - Kentucky DL/proof is a must, and the DL's created date should be more than 5 years ago. Role Description: We are seeking a skilled Databricks Engineer to design, develop, and optimize scalable data solutions using Databricks, Apache Spark, and cloud-based data platforms. In this role, you will collaborate with data scientists, analysts, and b

Data Engineer

BridgeNexus Technologies Inc

Sunnyvale, California, USA

Contract

Title: Data Engineer. Location: Sunnyvale, CA (Hybrid). Duration: 6+ months contract (possibility of extension). Top 4 Skills Needed or Required: Airflow, Scala, Google Cloud Storage, SQL analysis. What are the day-to-day responsibilities: Develop the pipeline, testing, validation, and release. Description: Designs, develops, and implements Hadoop ecosystem-based applications to support business requirements. Follows approved life cycle methodologies, creates design documents, and performs program coding and tes