Remote opportunity: Hadoop Admin. Accountable for storage, performance tuning, and volume management of Hadoop clusters and MapReduce routines. Involves design, capacity planning, cluster setup, performance fine-tuning, monitoring, structure planning, scaling, and administration. Monitor Hadoop cluster connectivity and performance. SQL and performance tuning at all levels. Document new environments and develop standards for the organization to support a robust
Hadoop/Big Data Administrator || Remote. Note: the position is hybrid and will require the resource to work from the office three days a week. Description: Monitor the Hadoop platforms and debug both platform and application issues. Should be able to automate manual tasks via Ansible or Python. Knowledge of HBase is a plus. Should have good Hadoop issue-debugging skills.
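The automation requirement above is the kind of task a small script covers. A minimal sketch, assuming a health check that parses the text output of `hdfs dfsadmin -report`; the report fragment below is invented for illustration:

```python
import re

def count_datanodes(report_text: str) -> dict:
    """Parse the 'Live datanodes' / 'Dead datanodes' counts from
    `hdfs dfsadmin -report` output. Returns a dict of counts."""
    counts = {"live": 0, "dead": 0}
    for state in counts:
        m = re.search(rf"{state} datanodes \((\d+)\)", report_text, re.IGNORECASE)
        if m:
            counts[state] = int(m.group(1))
    return counts

# Abridged, invented sample of the report output; the real command
# prints far more detail per node.
sample = """Configured Capacity: 120034123776 (111.79 GB)
Live datanodes (3):
Dead datanodes (1):
"""
print(count_datanodes(sample))  # → {'live': 3, 'dead': 1}
```

A wrapper like this could run from cron or an Ansible task and alert when the dead-node count is nonzero.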
Position: Hadoop Administrator/Developer. Location: Remote. Duration: 6 months. Requirement: a Hadoop Administrator/Developer who is currently managing Hadoop in a Linux environment. Linux is very important. Shell scripting. Recent Hortonworks experience. Exposure to a data migration project from on-prem to Azure. Hive, Spark, Flume, HBase, Oozie.
Data Engineer. Experience: 8+ years only. 100% Remote. H1B transfer and OPT accepted (we also sponsor visas). Employment Type: Contract W2. Must-Have Skills/Attributes: Python, SQL. Job Description: Job Title: Data Engineer. Duration: 12+ months (potential to extend). Location: Remote. Qualifications: a Bachelor's or Master's degree in Computer Science, Engineering, Information Systems, or a relevant field is required. 7+ years of professional work experience designing and implemen
Job Title: Google Cloud Platform Data Architect/Data Engineer. Location: Remote. Contract role. Need senior data folks with a focus on data development. Job Description: Responsibilities: Design appropriate data models for use in transactional and big data environments as an input into machine learning processing. Design and build the necessary infrastructure for optimal ETL from a variety of data sources to be used on Google Cloud Platform services. Develop data and semantic interoperabilit
MongoDB Architect (W2 ONLY). Remote. Long-term. Job Description/Responsibilities: Maintain and configure MongoDB instances. Keep clear documentation of the database setup and architecture. Write procedures for backup and disaster recovery. Ensure that the databases achieve maximum performance and availability. Design indexing strategies; configure, monitor, and deploy replica sets. Upgrade databases through patches. Create roles and users and set their permissions. Experience in optimizing insertions
Our client is looking to hire a Data Engineer. 2 months (extendable). Immediate start. 100% remote work. As our Data Engineer, you will partner with Data Scientists, ML Engineers, and Application Developers to develop robust pipelines ingesting, transforming, and refining data at scale. On any given day we hope that you will: Partner closely with Data Scientists, Machine Learning Engineers, and Application Developers to understand data requirements and contribute to the design of data solutions. Desig
Role: AlloyDB Developer with Google Cloud Platform. Remote. Design, implement, and maintain AlloyDB solutions on the Google Cloud Platform. Collaborate with development teams to optimize data models, queries, and database performance. Perform AlloyDB installations, upgrades, and patch management following best practices. Monitor and ensure the availability, integrity, and security of AlloyDB databases. Implement and maintain backup and recovery strategies to safeguard data and ensure business cont
Jr. Data Modeler. Pay Rate: $45-60 per hour. Duration: 6-month contract. 100% Remote. 2+ years of data modeler experience. Experience building data sets for consumption. Experience building data sets using SQL. Advanced-level SQL development. Additional experience: Cloud Experience: proven experience with cloud platforms like AWS (preferred), Azure, or Google Cloud Platform. SQL Expertise: advanced proficiency in SQL programming, with a preference for Hive/Impala or Snowflake. Data Modeling: Experience buil
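As a concrete illustration of "building data sets using SQL," here is a minimal sketch using SQLite in memory; the table and column names are invented for the example:

```python
import sqlite3

# Shape raw transactional rows into a per-customer data set with one
# aggregation query — the basic pattern behind a consumable data set.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer_id INTEGER, amount REAL, status TEXT);
    INSERT INTO orders VALUES (1, 40.0, 'paid'), (1, 10.0, 'paid'),
                              (2, 99.0, 'refunded'), (3, 25.0, 'paid');
""")
rows = conn.execute("""
    SELECT customer_id, SUM(amount) AS total_spend, COUNT(*) AS n_orders
    FROM orders
    WHERE status = 'paid'
    GROUP BY customer_id
    ORDER BY customer_id
""").fetchall()
print(rows)  # → [(1, 50.0, 2), (3, 25.0, 1)]
```

The same SELECT/GROUP BY shape carries over to Hive, Impala, or Snowflake, with dialect differences in functions and DDL.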
Position: ML Infrastructure Engineer. Location: Mountain View, CA (Remote). No Corp-to-Corp. Description: Experience in machine learning engineering or infrastructure roles, with a focus on machine learning infrastructure. Proficiency in programming languages such as Python and Java. Experience with cloud platforms such as AWS, Google Cloud Platform, or Azure; Google Cloud Platform experience is strongly preferred. Familiarity with machine learning frameworks (e.g., TensorFlow, PyTorch) and libraries (e.g., scikit-l
Hello, please review the JD below and let me know your interest. Job Title: Google Cloud Platform Data Engineer / Platform Data Engineer. Location: Remote. Duration: Long term. A minimum of 12+ years of experience is a must. Google Cloud Platform Data Engineer - Latest Update: We are seeking a talented and experienced Google Cloud Platform Data Engineer to join our team. As a Google Cloud Platform Data Engineer, you will be responsible for designing, implementing, and maintaining scalable data sol
Job Description: Professional experience in application design, architecture, and product development using the full SDLC, primarily with Hadoop and Java. Experience with CCDA/FHIR; expert in distributed computing, algorithms, and data analytics. Strong experience providing software solutions, architecting data models, and delivering quality products. An excellent team player with technical and communication skills and knowledge of various business domains. Good experience with Hadoop and core Java, with 3 plus
Lead ETL Tester. 12 months. Remote. 12+ years of hands-on testing experience working with big data technologies including Azure, Databricks, and Confluent Kafka. Must have onsite/offshore lead experience. Experience working in and leading enterprise EDW/DWH QA engagements. Proficiency in programming languages, including Structured Query Language (SQL). Strong experience and expertise in creating test automation frameworks. Experience in QA for streaming data ingestion would be a plus. Knowledge of data pipelines
Direct Client Requirement. Position: Lead Software Developer - Big Data (Cloudera, Hadoop, Scala). Location: Remote. Type: Contract to hire. Overview: The client builds software solutions for innovative home health solutions that produce better outcomes and reduce overall costs through partnerships with providers and payors. The Universal Data Hub (UDH) engineering team builds the next generation of software integration components leveraging NoSQL DBs, streaming data, and microservices. We are looking for a L
NLP Engineer (must have strong experience with Natural Language Processing (NLP)). Remote (anywhere in USA). Contract. We are seeking a highly skilled and experienced Data Architect with expertise in Natural Language Processing (NLP) to join our innovative team. As a Data Architect, you will play a crucial role in designing and implementing robust data architectures that support advanced NLP applications and drive business value. Responsibilities: Design and develop scalable and efficient data archit
Job Description - Generative AI Data Engineer SME. 6 months. Remote. Phone/Skype. Need candidates with all the mentioned and highlighted skills. Essential Requirements: 10+ years working with enterprise-level data analytics, big data, data warehousing, data lakes, data lakehouses, and data meshes. 10+ years working with data modeling tools. 5+ years building data pipelines for large customers. 10+ years working with data quality management tools and data ETL/ELT tools. 5+ years working with data cata
Our direct, Fortune 100 client in a Northeast suburb of Cleveland is looking for a REMOTE Scrum Master for a long-term, extendable contract position. You will lead and facilitate traditional Scrum ceremonies, work with the Product Owner to manage and prioritize the product backlog, and lead the user story collection and creation process. This position is open to fully remote candidates but will follow an EST schedule. Day-to-day responsibilities: - Facilitates agile events and activities as nece
Title: Software Developer - Machine Learning. Contract: 12+ months. Location: Remote. Pay Rate: $75/hr on W2. Mid-level / experienced: 3-5 years of experience needed. (No C2C or third-party vendors.) Description: Ability to write robust code in one or more of Python, Go, and Java. Proficient in core technologies like Spark, Hadoop, and Hive. Experience in building real-time applications, preferably in Spark and streaming platforms like Kafka and Kinesis. Good understanding of machine learning pipelines and ma
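The "machine learning pipelines" mentioned above boil down to ordered, composable transform stages. A toy sketch of that idea in plain Python; real pipelines (e.g., Spark ML, scikit-learn) add fitting, schemas, and persistence, and every name below is invented:

```python
# Each stage transforms the output of the previous one; the pipeline
# runner just threads the data through the stage list in order.

def scale(values, factor):
    return [v * factor for v in values]

def clip(values, lo, hi):
    return [min(max(v, lo), hi) for v in values]

def run_pipeline(values, stages):
    for fn in stages:
        values = fn(values)
    return values

result = run_pipeline(
    [0.2, 1.5, -0.3],
    [lambda v: scale(v, 10), lambda v: clip(v, 0, 10)],
)
print(result)  # → [2.0, 10, 0]
```

Swapping list-of-floats for a DataFrame or RDD and adding a fit step gets you to the shape the production frameworks use.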
Title: Big Data / Data Analyst. Description: Work closely with ACE workstreams to help with analysis, reporting, and data ingestion from many databases, including the data lake. Design and build the infrastructure for data extraction, preparation, and loading of data from a variety of sources using technology such as SQL and big data tools. Build data and analytics tools that will offer deeper insight into the pipeline, allowing for critical discoveries surrounding key performance indicators and customer
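The extraction-preparation-loading responsibility above can be sketched as a minimal extract-and-load step; the CSV contents and table name are invented for illustration:

```python
import csv
import io
import sqlite3

# Extract: read rows from a CSV "source" (an in-memory stand-in here),
# coercing each field to its proper type during preparation.
raw = io.StringIO("id,region,revenue\n1,east,100\n2,west,250\n")
rows = [(int(r["id"]), r["region"], float(r["revenue"]))
        for r in csv.DictReader(raw)]

# Load: insert the prepared rows into a SQL table, then verify with a query.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER, region TEXT, revenue REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", rows)
total = conn.execute("SELECT SUM(revenue) FROM sales").fetchone()[0]
print(total)  # → 350.0
```

At data-lake scale the same extract/prepare/load shape holds; the CSV reader becomes a distributed source and the SQLite sink becomes a warehouse table.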