Resource 1 is in need of an Artificial Intelligence (AI)/Machine Learning (ML) Engineer for a long-term, remote contract position.
Responsibilities:
- Architect, build, maintain, and improve new and existing suites of algorithms and their underlying systems
- Implement end-to-end solutions for batch and real-time algorithms, along with the requisite tooling for monitoring, logging, automated testing, performance testing, and A/B testing
- Establish scalable and efficient automated processes for data analysis
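The A/B testing tooling mentioned above usually starts from a significance test. As an illustrative sketch only (the function name and figures are hypothetical, not from the posting), a two-proportion z-test in plain Python might look like:

```python
import math

def ab_z_score(conversions_a, visitors_a, conversions_b, visitors_b):
    """Two-proportion z-test: a common building block for A/B test tooling.
    Returns the z statistic for H0: the two conversion rates are equal."""
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    # Pooled conversion rate under the null hypothesis.
    p_pool = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    return (p_b - p_a) / se

# Variant B converts at 26% vs. 20% for A, 1,000 visitors each.
z = ab_z_score(200, 1000, 260, 1000)
print(round(z, 2))  # prints 3.19 -- above 1.96, significant at the 5% level
```

In production this check would sit behind the monitoring/logging pipeline rather than be computed ad hoc, but the statistic itself is this small.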
Triveni is a technology company located in the New York City area. Triveni utilizes agile technologies to develop solutions for our clients. We are seeking highly motivated engineers to join our team. The successful candidate will focus on the engineering and development of complex business requirements. We provide a casual work environment where hard work is rewarded.
Role: Big Data Spark Developer
Location: Tampa, FL
Type: FTE
Visa: Citizens and those who are authorized to work can apply.
This individual will be responsible for:
- Designing robust, componentized, and adaptable processes to scale up our business intelligence platform
- Understanding big data and cloud technologies, quickly becoming productive with these tools, and providing analysis on them
- Applying strong SQL experience and the ability to work with cloud infrastructure programs, OOD, and data pipelines
- Identifying inefficiencies in code or processes and recommending solutions
- Adapting quickly to changing requirements
The Opportunity: We are seeking a full-stack engineer/developer with experience in projects emphasizing big data and application migration. This person will be responsible for supporting our customers' migration of hundreds of applications to Databricks, including authentication, rewiring of connections, building new libraries, and adapting notebooks and code. They will work with the Business Intelligence team and operational stakeholders to design and implement a migration strategy for read-only and transactional workloads.
Databricks Developer: Java Spark
Location: Remote
Duration: Long Term
Task Description: The Databricks Developer will be responsible for designing, developing, and maintaining scalable data processing solutions on the Databricks platform, with a focus on integrating and transforming IRS datasets such as the Information Returns Master File (IRMF), Business Master File (BMF), and Individual Master File (IMF). This role requires advanced proficiency in Java and Apache Spark, and a deep understanding of these IRS datasets.
Key Responsibilities:
- Design, develop, and maintain scalable data pipelines using Apache Spark on Databricks
- Implement data processing logic in Java 8+, leveraging functional programming and OOP best practices
- Optimize Spark jobs for performance, reliability, and cost-efficiency
- Collaborate with cross-functional teams to gather requirements and deliver data solutions
- Ensure compliance with data security, privacy, and governance standards
- Troubleshoot and debug production issues in distributed data environments
Mandatory Skills:
- Snowflake and dbt; experience with other ETL tools and RDBMS/SQL
- Python, cloud data services, Spark, big data, RDBMS, SQL
- Hands-on experience in Snowflake and related data platforms
- Working knowledge of Python, especially for automation and data orchestration use cases
- Experience with SAML and SCIM integrations for identity and access management
- Proficiency in configuring new Snowflake accounts and setting up PrivateLink for secure connectivity
- Deep understanding of Snowflake sharing capabilities, Listings (auto fulfillment), and Marketplace features
- Hands-on experience with Snowflake cost management in multi-tenant environments
Title: Architect (Google Cloud Platform, Python & SQL)
Location: Remote
Duration: FTE (Permanent)
- Architect with recent, heavy hands-on expertise in SQL, Python, ERD, Google Cloud Platform (all services, especially BigQuery, GCS, Cloud Functions, Composer), and dbt
- Must have worked with big data (100 TB+)
- Stays current with newly released modern data technologies
- Designs and optimizes conceptual and logical database models
- Extensive experience in data modeling
- Analyzes system requirements
Role: Snowflake Architect
Location: NYC/Lake Mary, FL
Job Description:
- Hands-on experience in Snowflake and related data platforms
- Working knowledge of Python, especially for automation and data orchestration use cases
- Experience with SAML and SCIM integrations for identity and access management
- Proficiency in configuring new Snowflake accounts and setting up PrivateLink for secure connectivity
- Deep understanding of Snowflake sharing capabilities, Listings (auto fulfillment), and Marketplace features
Databricks Architect
Location: Remote (US)
Job Type: Contract, 6+ months
Experience Level: 10+ years
About the Role: We are seeking an experienced Databricks Architect to lead the design and implementation of scalable data solutions across cloud platforms. This role demands strong technical leadership and hands-on expertise in Databricks, Apache Spark, and cloud data architecture (Azure or AWS). You will work closely with stakeholders to define data strategies, optimize performance, and ensure architectural best practices.
Data Engineer - AWS Cloud Services
Remote role, 6 months
Bachelor's degree in Computer Science, Information Technology, Computer Engineering, or a related field; 4 years of relevant experience will be considered in lieu of a degree.
5+ years of experience in Data Engineering, with a focus on AWS Cloud services. Proficiency in Amazon Web Services (AWS):
- Data warehouse: Redshift
- Databases: RDS, DynamoDB
- Visualization: QuickSight
- Storage: S3
- Other: Glue, Athena, CloudWatch
We are looking for more ML Engineering profiles with experience in Python, APIs (Flask, FastAPI), microservices, Kubernetes, Docker, solid knowledge of data science, and model productionization. Candidates should have Google Colab open on their system; we will test data analysis, OOP, and model serving. We expect candidates to write their own code, i.e., no copy-paste. We are okay with candidates googling syntax and checking the documentation, but no Gen AI tools. Core skills: microservices, Python
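As an illustrative sketch of the kind of exercise this posting describes (OOP plus a model-serving interface, standard library only; all names are hypothetical and not part of the posting), a candidate might be asked to produce something like:

```python
import json

class LinearModel:
    """Tiny 1-D least-squares model: y = slope * x + intercept."""

    def __init__(self):
        self.slope = 0.0
        self.intercept = 0.0

    def fit(self, xs, ys):
        n = len(xs)
        mean_x = sum(xs) / n
        mean_y = sum(ys) / n
        # Closed-form ordinary least squares for a single feature.
        cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
        var = sum((x - mean_x) ** 2 for x in xs)
        self.slope = cov / var
        self.intercept = mean_y - self.slope * mean_x
        return self

    def predict(self, x):
        return self.slope * x + self.intercept

def handle_predict(model, request_body):
    """Minimal 'model serving' handler: JSON in, JSON out
    (a stand-in for a Flask/FastAPI route)."""
    payload = json.loads(request_body)
    return json.dumps({"prediction": model.predict(payload["x"])})

model = LinearModel().fit([1, 2, 3, 4], [2, 4, 6, 8])
print(handle_predict(model, '{"x": 5}'))  # data is y = 2x, so prediction is 10.0
```

In the real exercise the handler would presumably be mounted as an actual FastAPI endpoint; the point here is the separation of model object from serving layer.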
Role: ML Data Scientist
Location: Remote
- Proven experience as a Machine Learning Engineer (5+ years)
- Proficiency in the Python programming language
- Experience with RDBMS & NoSQL databases (e.g., MongoDB, BigQuery, PostgreSQL)
- Experience with data preprocessing, feature engineering, and model evaluation
- Strong practical experience with Google Cloud Platform (GCP) services for Machine Learning Operations (MLOps), including hands-on experience with Vertex AI for model training and deployment
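The preprocessing, feature-engineering, and model-evaluation skills listed above boil down to small, testable routines. A minimal sketch (standard library only; function names are illustrative, not from the posting):

```python
import math

def standardize(values):
    """Z-score a feature column: zero mean, unit (population) variance.
    A typical preprocessing step before model training."""
    mean = sum(values) / len(values)
    std = math.sqrt(sum((v - mean) ** 2 for v in values) / len(values))
    return [(v - mean) / std for v in values]

def rmse(y_true, y_pred):
    """Root-mean-squared error, a common regression evaluation metric."""
    return math.sqrt(
        sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
    )

scaled = standardize([10.0, 20.0, 30.0])
print(scaled)                    # symmetric around 0
print(rmse([1, 2, 3], [1, 2, 5]))
```

In practice these would come from scikit-learn or a Vertex AI pipeline component, but interviews for roles like this often probe whether a candidate can derive them from scratch.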
Location: NYC or Lake Mary, FL (locals only)
Duration: Long Term
Job Description:
- Hands-on experience in Snowflake and related data platforms
- Working knowledge of Python, especially for automation and data orchestration use cases
- Experience with SAML and SCIM integrations for identity and access management
- Proficiency in configuring new Snowflake accounts and setting up PrivateLink for secure connectivity
- Deep understanding of Snowflake sharing capabilities, Listings (auto fulfillment), and Marketplace features
Job Title: Data Scientist (W2, Remote)
Employment Type: Full-time, W2
Location: Remote (U.S.-based)
Job Description: We are seeking a skilled and motivated Data Scientist to join our team remotely. The ideal candidate will have a strong foundation in data analysis, machine learning, and statistical modeling, with hands-on experience in real-world data applications.
Responsibilities:
- Develop and deploy predictive models and machine learning algorithms
- Analyze large datasets to extract insights and support decision-making
Job Summary: We are seeking a highly skilled and analytical Data Scientist to join our team. You will be responsible for leveraging data to drive business insights, develop predictive models, and support data-driven decision-making across departments. The ideal candidate is proficient in data mining, statistical modeling, machine learning, and data visualization techniques.
Responsibilities:
- Design and implement machine learning models, predictive analytics, and statistical algorithms
- Analyze large datasets
ThoughtSpot Developer
Remote, 6+ months
Key Result Areas and Activities: As a Senior ThoughtSpot Developer, you will be responsible for designing, developing, and maintaining ThoughtSpot applications and solutions that enable our clients to extract actionable insights from their data. You will collaborate with cross-functional teams to create intuitive and interactive data visualization solutions. This role requires a strong understanding of ThoughtSpot's architecture and capabilities.
We are seeking a Principal Search Engineer with deep expertise in Java, Elasticsearch, and Solr to drive the development and optimization of our search and discovery platform. This role focuses on eCommerce search, product relevance ranking, and personalized search experiences to enhance user engagement and conversion rates. You will work with large-scale search architectures, apply machine learning techniques for relevance tuning, and collaborate with data scientists, product teams, and engineers.
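The relevance ranking this role centers on ultimately reduces to scoring documents against a query. A deliberately simplified sketch of that idea (plain TF-IDF in Python for brevity, though the role itself is Java; Elasticsearch and Solr use the more refined BM25 by default, and all names below are hypothetical):

```python
import math
from collections import Counter

def tf_idf_scores(query, docs):
    """Score each document against the query with a plain TF-IDF sum,
    a simplified stand-in for Lucene-style relevance scoring."""
    tokenized = [doc.lower().split() for doc in docs]
    n = len(docs)

    def idf(term):
        # Smoothed inverse document frequency: rare terms weigh more.
        df = sum(1 for toks in tokenized if term in toks)
        return math.log((n + 1) / (df + 1)) + 1

    scores = []
    for toks in tokenized:
        tf = Counter(toks)
        scores.append(sum(tf[t] * idf(t) for t in query.lower().split()))
    return scores

docs = ["red running shoes", "blue denim jacket", "trail running shoes sale"]
scores = tf_idf_scores("running shoes", docs)
print(scores.index(max(scores)))  # a shoe document ranks first
```

Production relevance work layers length normalization, field boosts, and learned ranking signals on top of this base, which is where the posting's machine-learning tuning comes in.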