Job Title: Data Engineer Location: Remote Experience Required: 12+ Years Job Description: We are seeking a highly experienced Data Engineer with a strong background in modern data engineering tools and cloud platforms. The ideal candidate will have hands-on expertise in Databricks, DBT, AWS, and Apache Spark, and will be proficient in writing complex SQL queries to support data transformation and analytics workflows. Mandatory Skills: Databricks: Deep understanding and practical experience with
Role: AWS Data Engineer Location: Remote We are looking for a Data Engineer with 12+ years of experience in AWS, DBT, Databricks, SQL, and Python to build and optimize scalable data pipelines. This role involves working with large datasets, implementing ETL processes, and ensuring efficient data transformation. Key Responsibilities: Design, develop, and maintain data pipelines using DBT and Databricks. Optimize SQL queries for efficient data retrieval and processing. Work with AWS services (S3,
Role: Senior Data Engineer *Huge Databricks Experience* Location - REMOTE Job Summary: We are seeking a highly experienced Senior Data Engineer with 12+ years of hands-on experience in data engineering and data pipeline development. The ideal candidate will have strong expertise in Databricks, DBT, AWS, and Apache Spark, and a proven track record of working with modern data platforms in large-scale enterprise environments. This role requires a professional who is adept at building scalable dat
Position: AWS Data Engineer Location: Remote. Full-time *Need min 12+ years exp* Skills: Redshift, Glue, strong SQL & Python, streaming data experience; EMR optional; Snowflake nice to have
Fully Remote - Immediate interviews for Senior AWS Data Engineer - contract. Must have: AWS Glue, Airflow, Lake Formation, Lambda, Python, PySpark, Terraform, CI/CD. Experience/Minimum Requirements: Proven experience as a Data Engineer, with a strong emphasis on AWS Glue and AWS services. In-depth understanding of architecture, performance optimization techniques, and best practices. Proficiency in SQL and experience with database design principles. Hands-on expertise in designing, building, an
Greetings from Accion Labs, Our direct Client is looking for Sr. AWS Data Engineer - 12 Months Contract - Remote opportunity. Primary Skills: Spark (using PySpark), AWS Glue, AWS DynamoDB, Snowflake. Responsibilities: Collaborate with cross-functional teams including Data Scientists, Analysts, and Engineers to gather data requirements and build scalable data solutions. Design, develop, and maintain complex ETL pipelines using AWS Glue and PySpark, ensuring efficient data processing across batch and
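For context only (this is an illustration, not part of the posting): the Glue/PySpark ETL work these roles describe usually reduces to batch cleaning and transformation steps. Below is a minimal sketch of such a step in plain Python, as a local stand-in for what would run inside a Glue PySpark job; the field names ("order_id", "amount") and cleaning rules are hypothetical examples.

```python
# Minimal local stand-in for the batch-cleaning step of a Glue/PySpark ETL job.
# Field names ("order_id", "amount") are hypothetical examples.

def clean_records(records):
    """Drop rows missing the key, deduplicate by order_id, cast amount to float."""
    seen = set()
    cleaned = []
    for row in records:
        order_id = row.get("order_id")
        if not order_id or order_id in seen:
            continue  # skip duplicates and rows missing the key
        seen.add(order_id)
        cleaned.append({"order_id": order_id, "amount": float(row.get("amount", 0))})
    return cleaned

raw = [
    {"order_id": "A1", "amount": "19.99"},
    {"order_id": "A1", "amount": "19.99"},  # duplicate
    {"amount": "5.00"},                     # missing key
    {"order_id": "B2", "amount": "7"},
]
print(clean_records(raw))  # two cleaned rows remain
```

In an actual Glue job the same logic would be expressed as DataFrame operations (dropDuplicates, filter, cast) over data read from the Glue Data Catalog rather than an in-memory list.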
We are seeking a highly skilled AWS Data Engineer with a strong background in Red Hat Linux and expertise in building ETL pipelines to support cloud data migration initiatives. The ideal candidate will have hands-on experience with AWS Migration Services and will play a critical role in designing, developing, and maintaining scalable data integration solutions in a secure and reliable cloud environment. Key Responsibilities: Design, build, and optimize robust ETL pipelines for data migration an
Data Engineer - AWS/Hadoop/Python. Chandler, AZ - 3 days onsite. Duration: 12 Months+. Position is STRICTLY W2 ONLY. Minimum 4 years of hands-on experience with: building data pipelines using the big-data stack (Hadoop, Hive, PySpark, Python); Amazon AWS S3 object storage, security, and data service integration with S3; data modelling and database design; job scheduler Autosys; PowerBI, Dremio; Unix/shell scripting, CI/CD pipelines. Exposure to Google Cloud Platform cloud data engineering is a plus. Manager Notes: The co
Job Title: Python Developer Duration: 6 Months contract with possible extensions Location: Hybrid at Tempe, AZ Responsibilities: Design, develop, and deploy full-stack solutions on AWS, focusing on cloud architecture and end-to-end application delivery. Develop serverless applications using AWS Lambda, API Gateway, and other relevant AWS services. Implement, manage, and optimize databases using DynamoDB and other AWS storage solutions. Write clean, scalable, and efficient code in Python, with
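For context only (an illustration, not part of the posting): a serverless endpoint of the kind this role describes, AWS Lambda behind API Gateway, typically comes down to a handler like the sketch below. The event shape follows the API Gateway proxy-integration convention; the "name" query parameter and greeting payload are hypothetical placeholders.

```python
import json

# Hedged sketch of an AWS Lambda handler for an API Gateway proxy integration.
# The "name" query parameter and greeting payload are hypothetical examples.

def lambda_handler(event, context):
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }

# Local invocation with a minimal API Gateway-style event:
resp = lambda_handler({"queryStringParameters": {"name": "Tempe"}}, None)
print(resp["statusCode"], resp["body"])
```

Deployed behind API Gateway, the same handler receives the full proxy event (path, headers, body) and its return dict is serialized into the HTTP response.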
Pay Range: $60 - $85/hr. The pay rate may differ depending on your skills, education, experience, and other qualifications. Featured Benefits: Medical insurance in compliance with the ACA; 401(k); sick leave in compliance with applicable state, federal, and local laws. Must Have: AWS CloudFormation, AWS DynamoDB, AWS Lambda, AWS S3, AWS SNS/SQS, AWS Step Functions, BI tooling, Gen AI, Node.js development, Python, Redshift/SQL. Profile: Senior Data Engineer with strong data and native AWS skills. Design & Implementat
Job Title: AWS Data Engineer Location: Remote Job Type: Full-time Job Description: We are seeking a skilled AWS Data Engineer to design, build, and maintain scalable, high-performance data pipelines and solutions. The ideal candidate will have strong expertise in AWS services, advanced SQL, and Python, with hands-on experience in data integration, transformation, and analytics. Key Responsibilities: Design, develop, and maintain robust ETL/ELT data pipelines using AWS services such as Redshift, Glue,
Data Engineer Manager (AWS) - Remote My client is looking for a hands-on and passionate Data Engineer Manager to lead a talented team building scalable, cloud-native data platforms on AWS. This is a fantastic leadership opportunity for someone who loves mentoring engineers, driving innovation, and delivering meaningful data solutions. U.S. Citizens/Holders Only. Details: Full-Time, Permanent Position Salary: $170k - $200k Location: 100% Remote About the Role: You'll lead a talented team of da
A worldwide consulting company is urgently hiring a Junior to Mid-level .NET Software Engineer! The ideal candidate has 3+ years of professional experience with C#, .NET Core, JavaScript, SQL, and AWS. You will be 100% hands-on with a team of 8, modernizing projects to the latest technology, including migrating to AWS. This contract-to-hire role is onsite 3x a week and has W2 benefits. Required Skills & Experience: 3+ years of professional Software Engineering experience C# .NET Core SQL Server Desir
Basic Qualifications Requires a Bachelor's degree in Software Engineering, or a related Science, Engineering or Mathematics field. Also requires 5+ years of job-related experience, or a Master's degree plus 3 years of job-related experience. Agile experience preferred. CLEARANCE REQUIREMENTS: Department of Defense Secret security clearance is preferred at time of hire. Candidates must be able to obtain a Secret clearance within a reasonable amount of time from date of hire. Applicants selected
Job Title: AWS Engineer (Glue/Terraform) Location: Phoenix, Arizona (hybrid) Duration: Long-term contract Job Description: Engineer should have 6+ years of overall AWS cloud engineering experience. Good knowledge of and hands-on experience with AWS, Terraform, Terraform Enterprise, and Linux. Should be an expert in Glue. Must be AWS certified. Hybrid role: 3 days in office, NO remote option at all.
Oracle DBA / Migration Engineer (OnPrem-to-AWS) | 453245 DETAILS Location: 100% Remote Position Type: 2M+ Contract Hourly/Salary: to $95 W2 JOB SUMMARY Vaco Technology is currently seeking an Oracle DBA / Migration Engineer (OnPrem-to-AWS) for a 2M+ Contract that is 100% remote. The Oracle DBA / Migration Engineer will have strong expertise in Oracle 19C Database Administration and AWS DMS (AWS Database Migration Service). The Oracle DBA / Migration Engineer will lead the migration of OnPrem Oracle
Now Hiring: Data Engineer - Medicaid Data Warehouse (AWS Infrastructure) Work Mode: 100% Remote Duration: Long-term (21-month contract) Role Overview: We are seeking a senior-level Data Engineer to lead the design and maintenance of a high-performance Medicaid Data Warehouse built on AWS infrastructure. You will develop robust ETL pipelines, manage large-scale data transformation tasks using IBM DataStage, and support reporting teams by ensuring timely, accurate, and secure data delivery for ana
Hi, please find my direct client's job requirement for your consideration. Title: Data Engineer - Medicaid Data Warehouse (AWS) Location: Lincoln, NE Duration: 24 Months Job Summary: We are seeking a highly experienced Data Engineer to support the development and maintenance of an AWS-based Data Warehouse that enables advanced business analytics and reporting in a Medicaid data environment. The engineer will be responsible for developing and managing ETL/data pipeline jobs, transforming data to align
Data Architect with AWS (InsureMO/eBao) - Iselin, NJ (Remote is acceptable) Job Description: We are seeking a highly experienced and motivated Solution Architect with strong hands-on expertise in AWS and Databricks, and experience integrating rating engines such as InsureMO or eBaoTech. Roles & Responsibilities: Lead architecture and design of a centralized pricing database for actuarial modeling. Define the data strategy and integration roadmap in collaboration with stakeholders. Architect scalable