Hadoop Developer Jobs in Minneapolis, MN

61 - 80 of 112 Jobs

Senior Data Architect - REMOTE

AMH

Remote or Las Vegas, Nevada, USA

Full-time

Since 2012, we've grown to become one of the leading single-family rental companies and homebuilders in the country, recently recognized as a top employer by Fortune and Great Place To Work. At AMH, our goal is to simplify the experience of leasing a home through professional management and maintenance support, so our residents can focus on what really matters to them, wherever they are in life. The Senior Data Architect (Remote) is responsible for designing, implementing, and maintainin…

AI/ ML Engineer

Resource 1

Remote

Contract

Resource 1 is in need of an Artificial Intelligence (AI) / Machine Learning (ML) Engineer for a long-term, remote contract position. Responsibilities:
- Architect, build, maintain, and improve new and existing suites of algorithms and their underlying systems
- Implement end-to-end solutions for batch and real-time algorithms, along with requisite tooling around monitoring, logging, automated testing, performance testing, and A/B testing
- Establish scalable and efficient automated processes for data analyse…

Kafka Engineer

iQuest Solutions Corp

Remote or US

Contract, Third Party

Kafka Engineer (Remote) Job Description:
- Experience with Big Data solutions such as Cassandra, Google Pub/Sub, Hadoop, Spark, Kafka, Elasticsearch, and Solr is a plus
- Working knowledge and experience of Bitbucket, Git, or Gitflow
- Continuous integration and automated testing
- Knowledge of Kafka schemas and use of the Schema Registry
- Strong fundamentals in Kafka client configuration and troubleshooting
- Development of RESTful services employing the Spring and Spring Boot frameworks
- Building Cloud Nat…

Full Stack Software Engineer

iQuest Solutions Corp

Remote or New York, New York, USA

Contract

Job Title: Full Stack Software Engineer (US)
Location of work: Preference for the NY/NJ area, but can be virtual
Start Date: ASAP
The Opportunity: We are seeking a strong Full Stack Software Engineer for our Cloud Service Business Networks Technical Solutions client. The team provides subject-matter expertise to deliver business solutions for financial institutions.
You are great at:
- Time management, organizing, and prioritizing tasks
- Communicating with a virtual, globally located team
- Implem…

Data Scientist

Charter Solutions Inc.

Remote

Full-time

Data Scientist (Information Technology)
Roles and Responsibilities:
- Building algorithms and designing experiments to merge, manage, analyze, and extract data for tailored reports to colleagues, clients, or management
- Creating, managing, and utilizing high-performance relational and NoSQL databases, such as Microsoft SQL Server, Oracle, Microsoft Access, OLAP, and other software
- Selecting and implementing data mining methods most relevant to compan…

Data Scientist

Diverse Lynx Llc

Eagan, Minnesota, USA

Full-time

Data Scientist
Location: Eagan, MN
- Collecting data through means such as analyzing business results or by setting up and managing new studies
- Transferring data into a new format to make it more appropriate for analysis
- Searching through large data sets for usable information
- Creating new, experimental frameworks to collect data
- Building tools to automate data collection
- Correlating similar data to find actionable results
- Creating reports and presentations for business uses
Skills Required: Statistics, F…

Spark Cluster Administrator

Aptino

Remote

Contract, Third Party

We are seeking a Spark Cluster Administrator to manage and optimize our Spark infrastructure. The ideal candidate will handle cluster configuration, performance tuning, and monitoring. Responsibilities include managing Hadoop ecosystem components, troubleshooting issues, and ensuring high availability. Experience with Spark, Hadoop, and monitoring tools is essential. Knowledge of Helm or Kubernetes is good to have.

AI / ML Engineer

Shrive Technologies LLC

Remote

Contract

Role: AI/ML (Machine Learning) Engineer
Location: Remote
Skills: AWS, Python; Airflow, Kedro, or Luigi; Hadoop, Spark, or similar frameworks. Experience with graph databases a plus.
Responsibilities:
- Strong experience in Python
- Experience in data product development, analytical models, and model governance
- Experience with AI workflow management tools such as Airflow, Kedro, or Luigi
- Exposure to statistical modeling, machine learning algorithms, and predictive analytics
- Highly structured and organized work pla…

Scala Developer

Resourcesoft, Inc.

Remote

Contract

Requirements:
- Minimum 4 years' experience in Scala
- Proficiency in building REST services from the ground up
- Experience with TDD and ATDD, using Cucumber-JVM and ScalaTest
- Experience using technologies like Http4s, Play2, and Akka
- Experience with Kafka, ELK, Scala, and Hadoop
- Experience working with Apache Spark and AWS (Lambda, S3, Kinesis, SQS)
- Strong skills in OOP and functional paradigms
Responsibilities:
- Develop and maintain high-quality code in Scala
- Build and manage RESTful services
- I…

Bigdata Hadoop Engineer

Softsol Resources Inc

Remote or California City, California, USA

Contract

Please send your resume to if you are interested and are able to work on W2. Candidates local to California are needed.
Job Title: Hadoop Engineer (CR243)
Location: Remote, but local to California preferred
Duration: Long term
Job Description — must-have skills:
- Strong project experience in Big Data, Cloudera Distribution 7.x, cloud migration, and RDBMS is a must
- Strong project experience with Amazon EMR/Databricks/Cloudera CDP is a must
- 4-5 years' experience building data pipelines using Hadoop components Sqoop, Hive, S…

AWS Data Engineer

Allwyn Corporation

Remote

Contract, Third Party

Job Title: AWS Data Engineer
Location: Washington, D.C. (Remote)
- AWS hands-on development; extensive experience in cloud technologies for streaming platforms, with a focus on AWS services for data lake creation, orchestration, and analytics
- Innovative problem solver with a demonstrated ability to develop intricate algorithms based on deep-dive statistical analysis, enhancing customer relationships, and personalizing interactions
- Hands-on experience with cloud-based tools such as AWS EMR, EC2, Data…

AWS Data Engineer (AWS Data Engineer certified only)

Allwyn Corporation

Remote

Contract, Third Party

Position: AWS Data Engineer
Location: Remote
Duration: Long term
Job Description:
- AWS hands-on development; extensive experience in cloud technologies for streaming platforms, with a focus on AWS services for data lake creation, orchestration, and analytics
- Innovative problem solver with a demonstrated ability to develop intricate algorithms based on deep-dive statistical analysis, enhancing customer relationships, and personalizing interactions
- Hands-on experience with cloud-based tools such…

Data Lead

Coforge

Remote

Full-time

Experience: 9-12 years.
- Experience with data pipeline and workflow management tools: Google Cloud Platform services (Dataproc and GCS)
- BigQuery, Cosmos DB, Hive
- Scala is mandatory
- PySpark (in-memory processing), with big data ecosystem experience: Hadoop, Cloudera, MapReduce

MLOPS Architect (Machine Learning / AI Architect)

IDC Technologies

Remote

Contract, Third Party

Dear Applicant, hope you are doing well. We have an urgent requirement for an MLOps Architect (Machine Learning / AI Architect) with one of our global consulting clients. Kindly click to apply if you are available and interested in the job role mentioned below.
Title: MLOps Architect (Machine Learning / AI Architect)
Location: Remote
Role: Long-term contract position
Responsibilities:
- Strong experience in Python
- Experience in data product development, analytical models, and model governance
- Experience…

MLOPS Architect (Machine Learning / AI Architect)

eTeam, Inc.

Remote

Contract

Skills: AWS, Python; Airflow, Kedro, or Luigi; Hadoop, Spark, or similar frameworks. Experience with graph databases a plus.
Designing Cloud Architecture: As an AWS Cloud Architect, you'll be responsible for designing cloud architectures, preferably on AWS, Azure, or multi-cloud environments. Your architecture design should enable seamless scalability, flexibility, and efficient resource utilization for MLOps implementations.
Data Pipeline Design: Develop data taxonomy and data pipeline designs to ensure…