Qualifications:
14+ years of experience in ETL & Data Warehousing.
Should have excellent leadership & communication skills.
Should have strong working experience with Data Lakehouse architecture.
Should have strong working experience implementing CI/CD on DevOps tools such as GitHub, Jenkins, Docker, Azure Kubernetes Service, and Ansible, and be proficient in JSON, YAML, etc.
Should have strong experience implementing Azure infrastructure and Infrastructure-as-Code (IaC).
Should have very good knowledge…
Hello Everyone,
Hope you are doing well. My name is Pavan and I work with SPAR Information System. I have a great opportunity for you; please find the job details below. If you are interested in applying, please send me your updated resume and the best time for you to discuss this opportunity in detail.
Role: Data Warehouse Principal Architect
Location: Chicago, IL / Seattle, WA (Remote until COVID)
Duration: 12 Months Contract to Hire
Job description:
Basic qualifications: Experience designing…
Job Title: Technical Manager - Master Data and Enterprise Data Warehouse Program
Job Summary: We are seeking a skilled and experienced Technical Manager to lead the design, implementation, and maintenance of Master Data Management and enterprise data warehousing solutions. The ideal candidate will manage technical teams, oversee data integration projects, and ensure the delivery of high-performance, scalable, and reliable data warehousing solutions.
Key Responsibilities:
1. Strategic Planning and Leadership…
Job Details:
Experience: 10+ Years
Responsibilities:
* Data Warehouse Design and Development (Snowflake): Design, develop, and maintain scalable and efficient data models and data warehouses within the Snowflake environment.
* ETL/ELT Pipeline Development: Design, build, and optimize robust ETL/ELT pipelines using various tools and technologies to ingest, transform, and load data from diverse sources into Snowflake and other data stores.
* Azure Data Services: Utilize and integrate various Azure…
Knowledge of SQL and data modeling.
Strong knowledge and experience working with LookML and Looker development.
Experience with data visualization and dashboards.
Understanding of source data optimization for Looker processing.
Understanding of data modeling, relational database design, ETL processes, SQL, data analysis, and basic data warehousing concepts.
Looker Developer | 2 | SQL, Looker | N/A | Tax terms: C2C, W2 | Location: Canada
Role: ETL DataStage Developer
Location: Remote
Required Skills:
Hands-on experience with an Enterprise Data Warehouse, an Encounter Processing Module, reporting engines, and an Integrated Case Management system.
Experience with IBM InfoSphere DataStage and DB2.
Experience with the Medicaid Statistical Information System or the Children's Health Insurance Program.
Minimum 15 years of experience in data product management, with strong stakeholder alignment and strategic planning skills.
Minimum 10 years of domain experience in banking (preferably in commercial and retail banking) and regulatory data.
Minimum 10 years of experience working within Enterprise Data Warehouse (EDW) environments.
Define and manage product scope, vision, priorities, roadmap, and backlog aligned with business objectives.
Drive PI and sprint planning, proactively identifying and mitigating…
Hands-on experience with Power BI.
Strong proficiency in data modeling concepts, including relationships, hierarchies, and normalization/denormalization.
Deep understanding of cross-filtering behavior (single vs. both directions), context transition, and filter propagation in Power BI.
Solid experience with DAX and Power Query (M language).
Experience with performance tuning and troubleshooting report slowness.
Knowledge of data warehousing concepts and ETL workflows.
Familiarity with the Power BI REST API…
We are seeking a seasoned Data Migration & Data Warehousing Specialist with over 10 years of experience in data management, ETL, and large-scale system rollouts. The ideal candidate will have strong expertise in data mapping, migration strategies, and supply chain systems, with exposure to ERP platforms like SAP or Oracle. This role requires a hands-on expert who can ensure accurate and efficient data migration while collaborating closely with business teams.
Key Responsibilities:
Data Migration…
Job Title: Databricks Architect
Duration: 12 months
Location: Remote
Skills Required:
Architecting and Leadership:
Deep understanding of modern data architecture, including Lakehouse, Medallion architecture, and Data Mesh.
Data Engineering:
Strong programming experience in Python; Scala is a significant plus.
Proficiency in complex query tuning.
Experience building Slowly Changing Dimension (SCD) types (a minimal SCD Type 2 sketch follows this posting).
Familiarity with DBT.
Experience with structured streaming.
Knowledge of data formats like Iceberg…
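As a point of reference for the SCD requirement above, here is a minimal, dependency-free Python sketch of Slowly Changing Dimension Type 2 logic: when a tracked attribute changes, the current row version is expired and a new version is appended. All names (DimRow, customer_id, city) are hypothetical; on Databricks this would more typically be a Delta Lake MERGE or a dbt snapshot rather than in-memory Python.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class DimRow:
    customer_id: int                  # business key
    city: str                         # attribute tracked for history
    valid_from: date
    valid_to: Optional[date] = None   # None = still the current version
    is_current: bool = True

def apply_scd2(dimension: list[DimRow], updates: dict[int, str], as_of: date) -> None:
    """Expire changed rows and append new versions (SCD Type 2)."""
    current = {r.customer_id: r for r in dimension if r.is_current}
    for key, new_city in updates.items():
        row = current.get(key)
        if row is None:
            # New business key: insert its first version.
            dimension.append(DimRow(key, new_city, valid_from=as_of))
        elif row.city != new_city:
            # Attribute changed: close the old version, open a new one.
            row.valid_to = as_of
            row.is_current = False
            dimension.append(DimRow(key, new_city, valid_from=as_of))

dim = [DimRow(1, "Chicago", date(2023, 1, 1))]
apply_scd2(dim, {1: "Seattle", 2: "Austin"}, as_of=date(2024, 6, 1))
for r in dim:
    print(r)  # Chicago row expired; Seattle and Austin rows current
```

The key design point an interviewer would probe: the history is preserved because updates never overwrite attribute values in place; point-in-time queries filter on valid_from/valid_to instead.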
Meta Platforms, Inc. (f/k/a Facebook, Inc.) has the following positions in Menlo Park, CA. Data Engineer: Design, model, and implement data warehousing activities to deliver the data foundation that drives impact through informed decision making. Telecommute from anywhere in the U.S. permitted. (ref. code REQ-2507-153156: $200,066/year - $208,069/year). Individual pay is determined by skills, qualifications, experience, and location. Compensation details listed in this posting reflect the base salary only…
Meta Platforms, Inc. (f/k/a Facebook, Inc.) has the following positions in Menlo Park, CA. Data Engineer, Analytics: Design, model, and implement data warehousing activities to deliver the data foundation that drives impact through informed decision making. Telecommute from anywhere in the U.S. permitted. (ref. code REQ-2506-151308: $222,388/year to $235,400/year). Individual pay is determined by skills, qualifications, experience, and location. Compensation details listed in this posting reflect the base salary only…
Meta Platforms, Inc. (f/k/a Facebook, Inc.) has the following positions in Menlo Park, CA. Data Engineer: Design, model, or implement data warehousing activities in order to contribute to the design and development of Facebook products. Telecommuting is permitted from anywhere in the U.S. (ref. code REQ-2505-151081: $229,173/year - $235,400/year). Individual pay is determined by skills, qualifications, experience, and location. Compensation details listed in this posting reflect the base salary only…
10+ years of experience in data modeling, data architecture, or data engineering roles.
4+ years of experience modeling data in Snowflake or other cloud data warehouses.
Strong understanding of and hands-on experience with Medallion Architecture and modern data platform design.
Experience using data modeling tools (Erwin, etc.).
Proficiency in data modeling techniques: 3NF, dimensional modeling, data vault, and star/snowflake schemas (a toy star-schema sketch follows this list).
Expert-level SQL and experience working with semi-structured data.
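For the dimensional-modeling requirement above, here is a toy illustration of a star schema (one fact table referencing two dimensions by surrogate key) in plain Python. Table and column names are hypothetical; in a real warehouse this shape would be expressed as DDL and queried with a JOIN plus GROUP BY.

```python
# Dimensions: one row per customer / per calendar date, keyed by surrogate key.
dim_customer = {
    1: {"name": "Acme", "region": "Midwest"},
    2: {"name": "Globex", "region": "West"},
}
dim_date = {
    20240601: {"year": 2024, "quarter": "Q2"},
}
# Fact table: grain is one row per order line; foreign keys point at the dimensions.
fact_sales = [
    {"customer_key": 1, "date_key": 20240601, "amount": 120.0},
    {"customer_key": 2, "date_key": 20240601, "amount": 80.0},
    {"customer_key": 1, "date_key": 20240601, "amount": 50.0},
]

# "Join" facts to a dimension and aggregate: total sales by region.
totals: dict[str, float] = {}
for row in fact_sales:
    region = dim_customer[row["customer_key"]]["region"]
    totals[region] = totals.get(region, 0.0) + row["amount"]
print(totals)  # {'Midwest': 170.0, 'West': 80.0}
```

The point of the star shape is that descriptive attributes live once in the dimensions while the fact table stays narrow, keyed, and additive.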
Responsibilities:
Lead the migration of the existing SSIS-based ETL workflows to cloud-native pipelines using DBT and/or Google Cloud Platform tools such as Dataflow, Dataform, or Cloud Composer (Airflow).
Design and implement scalable, efficient data models in BigQuery, following best practices for dimensional modeling.
Optimize and maintain existing SQL transformations, ensuring correctness and performance in the cloud.
Collaborate with BI developers and analysts to ensure data marts align with…
Meta Platforms, Inc. (f/k/a Facebook, Inc.) has the following positions in Menlo Park, CA. Data Engineer, Product Analytics: Design, model, or implement data warehousing activities in order to contribute to the design and development of Facebook products. Telecommuting is permitted from anywhere in the U.S. (ref. code REQ-2506-152543: $195,254/year - $235,400/year). Individual pay is determined by skills, qualifications, experience, and location. Compensation details listed in this posting reflect the base salary only…
Cycle3 IT Staffing is seeking several senior-level Resident Solutions Architects for a remote role.
Architecting and Leadership:
o Deep understanding of modern data architecture, including Lakehouse, Medallion architecture, and Data Mesh.
Data Engineering:
o Strong programming experience in Python; Scala is a significant plus.
o Proficiency in complex query tuning.
o Experience building Slowly Changing Dimension (SCD) types.
o Familiarity with DBT.
o Experience with structured streaming…
Job Summary: We are seeking a highly experienced Informatica Extract, Transform, Load (ETL) Developer to join our Enterprise Data Warehouse (EDW) team. This role is responsible for designing, developing, and maintaining robust ETL processes using Informatica PowerCenter and Teradata. The ideal candidate will have a deep understanding of data integration, performance tuning, and data warehousing best practices, with a focus on supporting healthcare-related data systems.
Required Qualifications: 7+ years…
Job Title: Sr. SQL Developers - Remote - 2 positions
Contract: 6 months, 1099 or C2C
Experience: 12+ years minimum
Note: no H-1Bs
Responsibilities:
Design, build, and optimize data pipelines for the Data Warehouse using SQL Server and SSIS.
Integrate data from various sources (e.g., SQL Server, Excel, APIs, Smartsheet).
Support reporting and analytics using the Kimball methodology.
Ideal Candidate Profile:
7+ years of experience with SQL Server, T-SQL, and SSIS.
Nice to have: experience with Azure Data Factory, Az…
Design, develop, and automate ETL processes using DBT and AWS.
Build robust data pipelines to move data from various sources to data warehouses or data lakes.
Collaborate with cross-functional teams to ensure data accuracy, completeness, and consistency.
Perform data cleansing, validation, and transformation to ensure data quality and integrity (a minimal sketch follows this posting).
Optimize database and query performance to ensure efficient data processing.
Work with data analysts and data scientists to provide clean, reliable data.
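To make the cleansing/validation responsibility above concrete, here is a minimal, dependency-free Python sketch of one such pass: normalize types, reject structurally invalid records, and enforce simple business rules. The field names and rules (order_id, email, amount, order_date) are hypothetical; in the stack this posting describes, the same checks would more likely live as DBT tests or models over warehouse tables.

```python
from datetime import datetime
from typing import Optional

def clean_record(raw: dict) -> Optional[dict]:
    """Normalize one source record; return None if it fails validation."""
    try:
        record = {
            "order_id": int(raw["order_id"]),
            "email": raw["email"].strip().lower(),
            "amount": round(float(raw["amount"]), 2),
            "order_date": datetime.strptime(raw["order_date"].strip(), "%Y-%m-%d").date(),
        }
    except (KeyError, ValueError, AttributeError):
        return None  # structurally invalid: route to a reject/quarantine table
    # Business-rule validation: non-negative amounts, plausible email address.
    if record["amount"] < 0 or "@" not in record["email"]:
        return None
    return record

rows = [
    {"order_id": "101", "email": " A@B.com ", "amount": "19.994", "order_date": "2024-06-01"},
    {"order_id": "oops", "email": "x", "amount": "-5", "order_date": "n/a"},
]
cleaned = [r for r in (clean_record(row) for row in rows) if r is not None]
print(cleaned)  # only the first record survives cleansing and validation
```

Keeping rejected rows in a quarantine table rather than silently dropping them is what lets the "data accuracy, completeness, and consistency" goal above be audited.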