In the Technology division, we leverage innovation to build the connections and capabilities that power our Firm, enabling our clients and colleagues to redefine markets and shape the future of our communities. This is a Cloud & Infrastructure Engineering II position at the Associate level, part of the job family responsible for managing and optimizing technical infrastructure and ensuring the seamless operation of IT systems to support business needs.
Job Title: Unix Administrator
Location: Alpharetta, GA (work from office 2-3 days a week)
Type: FTE/Full-time
Key Responsibilities:
System Administration:
- Install, configure, and maintain Linux operating systems on servers and devices.
- Monitor system and server performance, ensure the reliability and availability of systems, identify bottlenecks, and implement performance tuning strategies.
- Perform regular system updates, patches, and security configurations.
Job Title: Microservices Software Engineer
Location: Alpharetta, GA (1-2 days onsite)
Duration: 12 months (contract), potential contract-to-hire
Job Description:
Required Skills:
- 8 years of relevant experience (preferably from a software development background)
- BS in Computer Science or a related field
- Experience developing microservices
- Java, Spring frameworks, Spring Boot
- Splunk or Datadog
- RESTful API development
- Experience developing and deploying applications on Azure or a similar cloud platform
Job Title: Kore.ai Administrator
Location: Remote
Duration: 9-12 months
Implementing Client: Infosys
Job Description: Kore.ai administrator with strong infra/DevOps experience.
- Hands-on experience in Python
- Good knowledge of application integrations and monitoring
- Good knowledge of debugging issues
- Strong experience in Unix/Linux environment administration
- Proficient in managing web servers (Apache, Nginx, IIS) and understanding of web technologies
- Ability to identify and mitigate network vulnerabilities
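Illustrative only: a minimal sketch, in Python, of the kind of application-monitoring work this posting describes, polling a hypothetical service health endpoint and logging failures (the URL, interval, and endpoint are assumptions, not details from the posting).

    import logging
    import time

    import requests  # third-party HTTP client

    # Hypothetical health endpoint; not taken from the posting.
    HEALTH_URL = "https://bot-platform.example.com/health"

    logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")

    def check_health(url: str, timeout: float = 5.0) -> bool:
        """Return True if the service answers HTTP 200 within the timeout."""
        try:
            return requests.get(url, timeout=timeout).status_code == 200
        except requests.RequestException as exc:
            logging.error("Health check failed: %s", exc)
            return False

    if __name__ == "__main__":
        for _ in range(3):  # a few polls for demonstration
            logging.info("Service status: %s", "UP" if check_health(HEALTH_URL) else "DOWN")
            time.sleep(10)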
Required Skills:
- 12+ years in data engineering or architecture, with a strong focus on Databricks (at least 4-5 years) and AI/ML enablement
- Deep hands-on experience with Apache Spark, Databricks (Azure/AWS), and Delta Lake
- Proficiency in AI/ML pipeline integration using Databricks MLflow or custom model deployment strategies
- Strong knowledge of Apache Airflow, Databricks Jobs, and cloud-native orchestration patterns
- Experience with structured streaming, Kafka, and real-time analytics frameworks
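Illustrative only: a minimal PySpark sketch of the kind of Databricks/Delta Lake pipeline work listed above, reading a Delta table, aggregating it, writing the result back, and logging the run with MLflow. Table paths and column names are assumptions, not details from the posting.

    import mlflow
    from pyspark.sql import SparkSession, functions as F

    # On Databricks a SparkSession already exists; getOrCreate() reuses it.
    spark = SparkSession.builder.appName("delta-pipeline-sketch").getOrCreate()

    SOURCE = "/mnt/raw/events"                    # hypothetical Delta paths
    TARGET = "/mnt/curated/daily_event_counts"

    with mlflow.start_run(run_name="daily_event_counts"):
        events = spark.read.format("delta").load(SOURCE)

        # Aggregate events per day -- a stand-in for real business logic.
        daily = (
            events
            .withColumn("event_date", F.to_date("event_ts"))
            .groupBy("event_date")
            .count()
        )

        daily.write.format("delta").mode("overwrite").save(TARGET)

        # Track a simple pipeline metric alongside the run.
        mlflow.log_metric("rows_written", daily.count())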
Role: Regulatory Reporting Developer
Experience: 6-9 years
Work Location: Remote
Project Duration: 12-month contract
Key Responsibilities:
- Develop and maintain regulatory reporting data pipelines and processing solutions
- Design and optimize SQL queries and procedures using Oracle PL/SQL
- Work with OFSAA (Oracle Financial Services Analytical Applications) for regulatory reporting implementations
- Contribute to development tasks using Java where required
- Implement workflow orchestration and scheduling
Role: Google Cloud Platform BigQuery Architect
Experience: 15+ years
Note: architect profiles only; no Data Engineers, please.
We are seeking a highly skilled BigQuery Data Architect to lead the design, implementation, and optimization of our cloud data platform. The ideal candidate will have deep experience with Google Cloud Platform (GCP), particularly BigQuery, and possess strong data modeling, ETL/ELT pipeline, and cloud architecture expertise.
Key Responsibilities:
- Design scalable and secure cloud-based data architectures
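Illustrative only: a short Python sketch of the kind of BigQuery design work this posting describes, creating a partitioned, clustered table and querying it with the google-cloud-bigquery client. The project, dataset, and schema are assumptions.

    from google.cloud import bigquery

    client = bigquery.Client(project="analytics-prod")  # hypothetical project

    ddl = """
    CREATE TABLE IF NOT EXISTS analytics.events (
      event_ts    TIMESTAMP,
      customer_id STRING,
      amount      NUMERIC
    )
    PARTITION BY DATE(event_ts)   -- prune scans by date
    CLUSTER BY customer_id        -- co-locate rows for common filters
    """
    client.query(ddl).result()  # wait for the DDL job to finish

    sql = """
    SELECT DATE(event_ts) AS event_date, SUM(amount) AS total
    FROM analytics.events
    WHERE event_ts >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
    GROUP BY event_date
    ORDER BY event_date
    """
    for row in client.query(sql).result():
        print(row.event_date, row.total)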
Role: Mainframe Developers with Kafka and AWS experience
Location: Remote (Connecticut)
Start Date: ASAP
Type: Contract (W2 candidates only; no C2C)
We are looking for 10-15 skilled professionals with a blend of mainframe and modern cloud technology skills to support a critical ramp-up for a Healthcare Claims team.
Required Skillsets:
- Mainframe: VSAM, DB2
- Streaming Platforms: Apache Kafka
- Cloud: AWS (preferably with DocumentDB experience)
- Change Data Capture (CDC): ideally using Precisely or a similar tool
JOB-7352: Java Developer
Location: Remote
Type: Contract
Link Technologies (LinkTechConsulting.com), a Las Vegas-based IT consulting firm, is currently seeking a Java Developer to join our team. Employer asks for I-9 information.
QUALIFICATIONS
The ideal candidate must have proven experience and proficiency in the following technologies and tools:
- Strong experience with the Java platform
- Proficient in Spring Boot
- Hands-on experience with Maven
- Proficient in Angular
- Experience using Eclipse
- Proficient in MS SQL Server databases
Job Title: Data Architect (Databricks)
Location: Remote
Type: Full-time
Role: We are seeking a seasoned Data Architect with deep expertise in Databricks, Lakehouse architecture, and AI/ML/GenAI enablement to lead a critical modernization initiative. The role involves transforming a legacy platform into a future-ready, scalable, cloud-native Databricks-based architecture. You will drive the design and implementation of high-performance data pipelines, orchestrate data workflows, and integrate AI/ML capabilities.
- Design and develop RESTful and GraphQL APIs using Java and Scala
- Build event-driven services and microservices that produce and consume messages via Apache Kafka
- Implement backend systems using frameworks like Spring Boot, Akka HTTP, or Play
- Ensure reliable, fault-tolerant Kafka integration, including schema validation, error handling, and retries
- Collaborate with frontend and platform teams to define clear API contracts and integration patterns
- Contribute to code quality, testing, and CI/CD automation
Key Responsibilities:
- Design, develop, and maintain scalable data pipelines using Apache Spark on Databricks
- Implement data processing logic in Java 8+, leveraging functional programming and OOP best practices
- Optimize Spark jobs for performance, reliability, and cost-efficiency
- Collaborate with cross-functional teams to gather requirements and deliver data solutions
- Ensure compliance with data security, privacy, and governance standards
- Troubleshoot and debug production issues in distributed data environments
Skill mix: roughly 55% React and 45% Python. Interview process: 3 technical video rounds.
Job Description:
- 7 to 10 years of web development experience: front-end UI development using HTML, CSS, and JavaScript frameworks like React JS (preferably a version after 14), plus Python 3.7 with web frameworks like Django and Flask server runtimes
- Connecting web applications to backend databases: SQL (Postgres or MySQL) and NoSQL (preferably Snowflake, but not mandatory)
- Good with Python development tools like VS Code or Anaconda
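Illustrative only: a minimal Flask sketch of the kind of Python backend this posting describes, exposing a JSON endpoint (which a React front end could fetch) backed by a Postgres query. The connection string, table, and route are assumptions.

    from flask import Flask, jsonify
    import psycopg2

    app = Flask(__name__)

    DSN = "dbname=appdb user=app password=secret host=localhost"  # hypothetical

    @app.route("/api/orders")
    def list_orders():
        """Return recent orders as JSON for the React front end."""
        with psycopg2.connect(DSN) as conn:
            with conn.cursor() as cur:
                cur.execute(
                    "SELECT id, status, total FROM orders ORDER BY created_at DESC LIMIT 20"
                )
                rows = cur.fetchall()
        return jsonify([{"id": r[0], "status": r[1], "total": float(r[2])} for r in rows])

    if __name__ == "__main__":
        app.run(debug=True)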
Job Description: We are seeking a skilled and motivated Data Engineer to design, build, and optimize scalable data pipelines and architectures in support of advanced analytics and business intelligence initiatives. The ideal candidate will have hands-on experience with modern data platforms and a strong foundation in data modeling, pipeline orchestration, and cloud-native data services. This role involves collaborating cross-functionally to ensure data availability, integrity, and performance.
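Illustrative only: a minimal sketch of pipeline orchestration of the kind this posting describes, written as an Apache Airflow DAG (Airflow is a common choice but an assumption here; the posting does not name a tool). Task names and the schedule are hypothetical.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        print("pull data from a source system")

    def transform():
        print("clean and model the data")

    def load():
        print("publish to the warehouse")

    with DAG(
        dag_id="daily_etl_sketch",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",   # Airflow 2.4+; older versions use schedule_interval
        catchup=False,
    ) as dag:
        t_extract = PythonOperator(task_id="extract", python_callable=extract)
        t_transform = PythonOperator(task_id="transform", python_callable=transform)
        t_load = PythonOperator(task_id="load", python_callable=load)

        t_extract >> t_transform >> t_load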
Job Title: Data Engineer (W2, Remote)
Employment Type: Full-time, W2
Location: Remote (U.S.-based)
Job Description: We are seeking a skilled and motivated Data Engineer to join our remote team. The ideal candidate will have experience building and maintaining data infrastructure, optimizing data pipelines, and supporting data-driven applications. You will work closely with data scientists, analysts, and product teams to ensure high data quality and availability across the organization.
Work with channels across the bank.
Pyramid Consulting Group Inc has a client with an immediate need for an Actimize Developer Lead with the following background and skills:
- Leadership experience
- Actimize IFM-X 10.1
- Actimize ActOne 6.5, 6.6
- Oracle 19c
- Linux RHEL 8.x
- SQL
- Agile
Flex Skills:
- Apache 8, 9.x
- Java
Contract-to-hire (no C2C). Hybrid in hubs or remote.
Hi, we have an urgent requirement for our direct client; please go through the job description below. If you are interested, please send your updated Word-format resume.
Job Title: AWS & Snowflake Architect
Location: Remote
Duration: Full-time
Role Overview: The AWS & Snowflake Architect will design and develop enterprise-scale data solutions using Snowflake, focusing on AWS-native frameworks for data ingestion, transformation, and governance.
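Illustrative only: a short Python sketch of the kind of Snowflake-on-AWS ingestion this role covers, using the snowflake-connector-python package to copy files from an S3 external stage into a table. Account, stage, and table names are assumptions.

    import snowflake.connector

    # Hypothetical connection parameters.
    conn = snowflake.connector.connect(
        account="xy12345.us-east-1",
        user="ETL_SVC",
        password="***",
        warehouse="LOAD_WH",
        database="RAW",
        schema="SALES",
    )

    try:
        cur = conn.cursor()
        # Load Parquet files already staged in S3 into a raw table.
        cur.execute("""
            COPY INTO RAW.SALES.ORDERS
            FROM @SALES_S3_STAGE/orders/
            FILE_FORMAT = (TYPE = PARQUET)
            MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
        """)
        print(cur.fetchall())  # per-file load results
    finally:
        conn.close()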
Job Description:
- Be a critical senior member of a data engineering team focused on creating distributed analysis capabilities around a large variety of datasets
- Take pride in software craftsmanship; apply a deep knowledge of algorithms and data structures to continuously improve and innovate
- Work with other top-level talent solving a wide range of complex and unique challenges that have real-world impact
- Explore relevant technology stacks to find the best fit for each dataset
Job Title: Software Engineer, Python
Location: Hillsboro, OR
Job Type: Contract, 12+ months
Job Description: We are seeking a highly motivated Scientific Software Engineer to join a leading biotech research team focused on accelerating small-molecule drug discovery through advanced machine learning and cheminformatics solutions. In this role, you'll collaborate with computational and experimental researchers to design, develop, and deploy Python-based workflows and user interfaces.
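Illustrative only: a tiny Python sketch of the kind of cheminformatics workflow this posting describes, computing simple molecular descriptors with RDKit (RDKit is an assumption; the posting does not name a library), the sort of features often fed into ML models for small-molecule discovery.

    from rdkit import Chem
    from rdkit.Chem import Descriptors

    # Hypothetical input molecules as SMILES strings: ethanol, benzene, aspirin.
    smiles = ["CCO", "c1ccccc1", "CC(=O)Oc1ccccc1C(=O)O"]

    for smi in smiles:
        mol = Chem.MolFromSmiles(smi)
        if mol is None:
            print(f"could not parse {smi}")
            continue
        # Simple descriptors of the kind used as ML features.
        print(smi, Descriptors.MolWt(mol), Descriptors.MolLogP(mol))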