Job Title: Databricks Architect
Duration: 12 months
Location: Remote
Skills Required:
Architecting and Leadership:
- Deep understanding of modern data architecture, including Lakehouse, Medallion architecture, and Data Mesh.
Data Engineering:
- Strong programming experience in Python; Scala is a significant plus.
- Proficiency in complex query tuning.
- Experience building Slowly Changing Dimension (SCD) types.
- Familiarity with DBT.
- Experience with structured streaming.
- Knowledge of data formats like Iceberg
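For candidates unfamiliar with the Slowly Changing Dimension types named above, the following is a minimal, hedged sketch of the SCD Type 2 pattern (history preserved by expiring the current row and appending a new version); the field names and helper are invented for illustration, not taken from any posting.

```python
from datetime import date

def apply_scd2(dim_rows, key, new_attrs, as_of):
    """SCD Type 2 sketch: expire the current row for `key` if its tracked
    attributes changed, then append a new current row. `dim_rows` is a
    plain list of dicts standing in for a dimension table."""
    current = next(
        (r for r in dim_rows if r["key"] == key and r["end_date"] is None), None
    )
    if current is not None:
        if all(current[k] == v for k, v in new_attrs.items()):
            return dim_rows  # no attribute change: keep the current version
        current["end_date"] = as_of  # close out the old version
    # append the new current version with an open-ended validity window
    dim_rows.append({"key": key, **new_attrs, "start_date": as_of, "end_date": None})
    return dim_rows

dim = []
apply_scd2(dim, "cust-1", {"city": "Austin"}, date(2024, 1, 1))
apply_scd2(dim, "cust-1", {"city": "Denver"}, date(2024, 6, 1))
# dim now holds two versions: Austin (expired) and Denver (current)
```

In a warehouse this logic is typically expressed as a MERGE against the dimension table rather than in application code; the sketch only shows the row-versioning idea.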
Position Overview: We are looking for a hands-on Lead Data Engineer to join our team! In this role, you will be responsible for leading a team of data engineers (onsite/offshore) to build new data pipelines (loading millions of transformed, accurate records into our cloud data warehouse in a timely manner) to support evolving analytical needs for critical business decisions. You will also manage existing data pipelines, keeping data relevant, accurate, and ready for analytical consumption. Key responsibilities: Lead d
Meta Platforms, Inc. (f/k/a Facebook, Inc.) has the following positions in Menlo Park, CA Data Engineer: Design, model, or implement data warehousing activities in order to contribute to the design and development of Facebook products. Telecommuting is permitted from anywhere in the U.S. (ref. code REQ-2505-151081: $229,173/year - $235,400/year). Individual pay is determined by skills, qualifications, experience, and location. Compensation details listed in this posting reflect the base salary only
This is an FTE role. This role is not open for sponsorship.
- 10 years of experience in data analytics, data modelling, or business intelligence roles.
- Strong understanding of the Property & Casualty insurance business lines.
- Proficient in Snowflake, including SQL, schema design, and performance optimization.
- Strong analytical skills with the ability to interpret data and translate findings into actionable insights.
- Proficiency in data visualization tools (e.g., Power BI, Tableau) and familiarity with Python
Job Summary: We are seeking a highly experienced Informatica Extract Transform Load (ETL) Developer to join our Enterprise Data Warehouse (EDW) team. This role is responsible for designing, developing, and maintaining robust ETL processes using Informatica PowerCenter and Teradata. The ideal candidate will have a deep understanding of data integration, performance tuning, and data warehousing best practices, with a focus on supporting healthcare-related data systems. Required Qualifications: 7+
Meta Platforms, Inc. (f/k/a Facebook, Inc.) has the following positions in Menlo Park, CA Data Engineer, Product Analytics: Design, model, or implement data warehousing activities in order to contribute to the design and development of Facebook products. Telecommuting is permitted from anywhere in the U.S. (ref. code REQ-2506-152543: $195,254/year - $235,400/year). Individual pay is determined by skills, qualifications, experience, and location. Compensation details listed in this posting reflect the base salary only
Cycle3 IT Staffing is seeking several senior-level Resident Solutions Architects for a REMOTE role.
Architecting and Leadership:
o Deep understanding of modern data architecture, including Lakehouse, Medallion architecture, and Data Mesh.
Data Engineering:
o Strong programming experience in Python; Scala is a significant plus.
o Proficiency in complex query tuning.
o Experience building Slowly Changing Dimension (SCD) types.
o Familiarity with DBT.
o Experience with structured streaming
Note: GC-EAD and L2-EAD consultants who can work on W2 can apply for this position. Job Description: Very senior Databricks Admin/Lead (ADB Admin/Lead) with vast architectural and solutioning experience in Data, AI, and ML, and strong in Azure Databricks Data Lakehouse. Data warehouse, ETL, BI, and AI/ML knowledge, along with strong Python and SQL skills, in designing/developing architectural plans, frameworks, and prototypes. Expert in solutioning, not limited to concepts using Power BI, Exc
Data Modeler - Pharma Commercial
- 12+ years of Conceptual, Logical, and Physical Data Modeling experience.
- Hands-on experience with Erwin Data Modeling software.
- Create database objects.
- Create, review, and tune SQL scripting.
- Good knowledge of metadata management, data modeling, and related tools (Erwin, ER Studio, Oracle Designer, or PowerDesigner).
- Data analysis and database design experience.
- Knowledge and understanding of different database platforms, such as Oracle, Postgres, Hadoop, Snowflake,
Meta Platforms, Inc. (f/k/a Facebook, Inc.) has the following positions in Menlo Park, CA Data Engineer, Analytics: Design, model, and implement data warehousing activities to deliver the data foundation that drives impact through informed decision making. Telecommuting is permitted from anywhere in the U.S. (ref. code REQ-2502-147413: $212,275/year - $235,400/year). Individual pay is determined by skills, qualifications, experience, and location. Compensation details listed in this posting reflect the base salary only
Sr. Systems Analyst - IBM i / Supply Chain Applications (Remote, EST Hours)
Location: Remote - USA
Hours: Must work Eastern Time hours, 8 AM - 5 PM; occasional on-call support 24x7
Contract Type: W2 with Health Benefits (enhanced benefits with PTO available)
Duration: 12 months, possible extension
About the Role: We're seeking a highly skilled Sr. Systems Analyst with expertise in IBM i (AS/400) and supply chain systems integration. In this role, you'll support and enhance enterprise-level plannin
This individual will be responsible for:
- Designing robust, componentized, and adaptable processes to scale up our business intelligence platform.
- Understanding big data and cloud technologies, quickly becoming productive with these tools, and providing analysis on them.
- Having strong SQL experience and the ability to work with cloud infrastructure programs, OOD, and data pipelines.
- Identifying inefficiencies in code or processes and recommending solutions.
- Being able to adapt quickly to changing requirements a
Description: Justice Information Management System (JIMS) 3.0 is a data integration project aimed at amalgamating, re-designing, and implementing several disjointed reporting data marts containing court data into an Enterprise Data Warehouse (EDW) and a set of dependent data marts and views. This project will replace the existing six stand-alone IBM Cognos BI data marts (JIMS 1.0): Crown File Ownership (CFO), Court Case Management (CCM), Courtroom Utilization, Courtroom Activity, and Case Adjournment, wit
10+ years of experience is required.
- Data warehousing solutions on Oracle, Teradata, DB2, etc.
- Snowflake internals and integration with other data processing technologies
- Creation of data ingestion pipelines with tools like Informatica, Talend, etc.
- Cloud computing experience with AWS, Microsoft Azure, Google Cloud
- Knowledge of SQL and complex query writing
- Understanding of data compliance and necessary security protocols
- Data lakes, data structures, and data models suited to Snowflake architecture
- ETL pro
COMPANY OVERVIEW: adroitts is a fast-growing IT solutions company that helps businesses adapt and grow in a continuously evolving market. Our tailor-made technological solutions are perfectly aligned with our clients' business goals and objectives. We strive to be a long-term, trusted, and reliable partner for our customers' organizations, helping them overcome IT challenges. Our solutions, methodologies, and implementations are designed with a customer-centric focus and our customers' ROI in mind. We pride ourselves in ou
Tech Stack: ETL Tools, Data Pipelines, DevOps
We're looking for a Technical Lead to architect and guide backend data integration efforts within large-scale insurance systems. You will own the ETL design, technical reviews, and vendor oversight, ensuring secure, performant, and standards-compliant data flows.
Responsibilities:
- Design and approve architecture of ETL pipelines and backend systems
- Ensure compliance with enterprise data and security standards
- Conduct code reviews, approve data mapp
Required Knowledge and Level of Experience
- 15+ years of experience implementing data management solutions (required).
- Management consulting experience (highly desired).
- Expertise in Snowflake concepts such as setting up resource monitors, RBAC controls, scalable virtual warehouses, SQL performance tuning, zero-copy clone, and time travel, and automating them.
- Experience handling semi-structured data (JSON, XML) and columnar PARQUET using the VARIANT attribute in Snowflake.
- Experience in re-clustering of t
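As background on the semi-structured data requirement above: Snowflake's VARIANT column type allows dot-path access into nested JSON. The following is a stdlib-Python analogue of that dot-path idea, included only as an illustration; the helper name and sample record are invented, not part of Snowflake's API.

```python
import json

def variant_get(doc, path):
    """Follow a dot-separated path into nested dicts (like payload:customer.address.city
    in Snowflake), returning None when any segment is absent."""
    for part in path.split("."):
        if not isinstance(doc, dict) or part not in doc:
            return None
        doc = doc[part]
    return doc

raw = '{"customer": {"name": "Acme", "address": {"city": "Reno", "zip": "89501"}}}'
payload = json.loads(raw)
city = variant_get(payload, "customer.address.city")  # "Reno"
missing = variant_get(payload, "customer.phone")      # None: path not present
```

In Snowflake itself the equivalent would be a SQL expression over a VARIANT column rather than Python; the sketch only conveys how nested fields are addressed.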
Position Overview: We are looking for an experienced Senior ETL SSIS Developer to join our team and contribute to critical enhancements of the Admission, Discharge, and Transfer (ADT) System. This role will focus on developing and optimizing ETL solutions using Microsoft SSIS, with data sourced from Teradata, SQL Server, and CSV files. The ideal candidate will have a strong background in healthcare data integration and must possess a solid understanding of HL7 standards and HIPAA compliance du
We are seeking an experienced Senior Data Analyst to join our client's Enterprise Data team. The ideal candidate will have 8+ years of hands-on experience, with a deep understanding of the P&C insurance domain, strong data modelling capabilities, and advanced skills in Snowflake. Experience working with Agency Management Systems such as EPIC and AMS360 is a significant advantage.
Requirements:
- 8+ years of experience in data analytics, data modelling, or business intelligence roles.
- Strong underst
Required Skills:
- 5+ years of experience in BI development with a focus on Power BI and MicroStrategy.
- Strong proficiency in DAX, Power Query, and Power BI Service (publishing, gateways, scheduling).
- Solid experience creating and managing MicroStrategy dashboards, reports, and schema objects.
- Deep understanding of data modeling, star/snowflake schemas, and ETL processes.
- Experience with SQL, stored procedures, and data warehousing concepts.
- Excellent problem-solving skills and attention to detail.
- Stro
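For reference on the star-schema requirement above, a minimal sketch of the idea: a central fact table keyed to dimension tables and queried with a join plus aggregate. This uses Python's stdlib sqlite3 purely for illustration; the table and column names are invented.

```python
import sqlite3

# Star schema in miniature: fact_sales (measures) references dim_product
# (descriptive attributes) by surrogate key.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE fact_sales (product_id INTEGER, amount REAL);
    INSERT INTO dim_product VALUES (1, 'books'), (2, 'games');
    INSERT INTO fact_sales VALUES (1, 10.0), (1, 5.0), (2, 7.5);
""")
# Typical BI query shape: join the fact to a dimension, group by an attribute.
rows = conn.execute("""
    SELECT d.category, SUM(f.amount)
    FROM fact_sales f JOIN dim_product d USING (product_id)
    GROUP BY d.category ORDER BY d.category
""").fetchall()
# rows == [('books', 15.0), ('games', 7.5)]
```

A snowflake schema differs only in that the dimensions themselves are further normalized into sub-dimension tables.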