Hands-on experience with Power BI.
- Strong proficiency in data modeling concepts including relationships, hierarchies, and normalization/denormalization.
- Deep understanding of cross-filtering behavior (single vs. both directions), context transition, and filter propagation in Power BI.
- Solid experience with DAX and Power Query (M language).
- Experience with performance tuning and troubleshooting report slowness.
- Knowledge of data warehousing concepts and ETL workflows.
- Familiarity with Power BI REST
Job Title: Databricks Architect
Duration: 12 months
Location: Remote
Skills Required:
Architecting and Leadership:
- Deep understanding of modern data architecture, including Lakehouse, Medallion architecture, and Data Mesh.
Data Engineering:
- Strong programming experience in Python; Scala is a significant plus.
- Proficiency in complex query tuning.
- Experience building Slowly Changing Dimension (SCD) types.
- Familiarity with DBT.
- Experience with structured streaming.
- Knowledge of data formats like Iceberg
Meta Platforms, Inc. (f/k/a Facebook, Inc.) has the following positions in Menlo Park, CA. Data Engineer: Design, model, and implement data warehousing activities to deliver the data foundation that drives impact through informed decision making. Telecommute from anywhere in the U.S. permitted. (ref. code REQ-2507-153156: $200,066/year - $208,069/year). Individual pay is determined by skills, qualifications, experience, and location. Compensation details listed in this posting reflect the base salary only.
Meta Platforms, Inc. (f/k/a Facebook, Inc.) has the following positions in Menlo Park, CA. Data Engineer, Analytics: Design, model, and implement data warehousing activities to deliver the data foundation that drives impact through informed decision making. Telecommute from anywhere in the U.S. permitted. (ref. code REQ-2506-151308: $222,388/year to $235,400/year). Individual pay is determined by skills, qualifications, experience, and location. Compensation details listed in this posting reflect the base salary only.
- 10+ years of experience in data modeling, data architecture, or data engineering roles.
- 4+ years of experience modeling data in Snowflake or other cloud data warehouses.
- Strong understanding of and hands-on experience with Medallion Architecture and modern data platform design.
- Experience using data modeling tools (Erwin, etc.).
- Proficiency in data modeling techniques: 3NF, dimensional modeling, data vault, and star/snowflake schemas.
- Expert-level SQL and experience working with semi-structured data
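The dimensional modeling and star-schema techniques this posting asks for can be illustrated with a minimal sketch. The table and column names below are hypothetical, and a real warehouse would do this in SQL; the sketch only shows the core idea of splitting flat records into a fact table plus a dimension keyed by surrogate keys:

```python
# Minimal star-schema sketch: split flat sales records into a fact
# table and a product dimension with surrogate keys.
# All table and column names are hypothetical illustrations.

def build_star(records):
    product_keys = {}        # natural key -> surrogate key
    dim_rows, fact_rows = [], []
    for r in records:
        nk = r["product_name"]
        if nk not in product_keys:
            product_keys[nk] = len(product_keys) + 1   # assign surrogate key
            dim_rows.append({"product_key": product_keys[nk],
                             "product_name": nk})
        fact_rows.append({
            "product_key": product_keys[nk],           # FK into the dimension
            "sale_date": r["sale_date"],
            "amount": r["amount"],                     # additive measure
        })
    return dim_rows, fact_rows

dims, facts = build_star([
    {"product_name": "Widget", "sale_date": "2024-01-01", "amount": 10.0},
    {"product_name": "Widget", "sale_date": "2024-01-02", "amount": 12.5},
    {"product_name": "Gadget", "sale_date": "2024-01-02", "amount": 7.0},
])
```

Deduplicating the dimension while keeping one fact row per event is what makes the fact table narrow and additive, which is the property BI tools rely on for fast aggregation.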
Meta Platforms, Inc. (f/k/a Facebook, Inc.) has the following positions in Menlo Park, CA. Data Engineer: Design, model, or implement data warehousing activities in order to contribute to the design and development of Facebook products. Telecommuting is permitted from anywhere in the U.S. (ref. code REQ-2505-151081: $229,173/year - $235,400/year). Individual pay is determined by skills, qualifications, experience, and location. Compensation details listed in this posting reflect the base salary only.
Job Title: Azure Databricks Architect
Location: Remote
Duration/Term: 12+ months
Job Description:
Experience Desired: 12+ years.
Key required skills:
- Manage end-to-end delivery by investigating problem areas and working cross-functionally with product managers and other stakeholders.
- Follow the Agile development methodology; think strategically and execute methodically.
- Develop and manage capacity and growth projection forecasts of the environment within budgets.
- Create and maintain optimal da
Responsibilities:
- Lead the migration of the existing SSIS-based ETL workflows to cloud-native pipelines using DBT and/or Google Cloud Platform tools such as Dataflow, Dataform, or Cloud Composer (Airflow).
- Design and implement scalable, efficient data models in BigQuery, following best practices for dimensional modeling.
- Optimize and maintain existing SQL transformations, ensuring correctness and performance in the cloud.
- Collaborate with BI developers and analysts to ensure data marts align wi
Job Summary: We are seeking a highly experienced Informatica Extract Transform Load (ETL) Developer to join our Enterprise Data Warehouse (EDW) team. This role is responsible for designing, developing, and maintaining robust ETL processes using Informatica PowerCenter and Teradata. The ideal candidate will have a deep understanding of data integration, performance tuning, and data warehousing best practices, with a focus on supporting healthcare-related data systems. Required Qualifications: 7+
Cycle3 IT Staffing is seeking several senior-level Resident Solutions Architects for a remote role.
Architecting and Leadership:
- Deep understanding of modern data architecture, including Lakehouse, Medallion architecture, and Data Mesh.
Data Engineering:
- Strong programming experience in Python; Scala is a significant plus.
- Proficiency in complex query tuning.
- Experience building Slowly Changing Dimension (SCD) types.
- Familiarity with DBT.
- Experience with structured streaming
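Several of these postings ask for experience building Slowly Changing Dimensions. As a minimal sketch of SCD Type 2 (column names are hypothetical, and real pipelines typically express this as a SQL MERGE in the warehouse), an incoming attribute change expires the current dimension row and appends a new current version:

```python
from datetime import date

# SCD Type 2 sketch: when a tracked attribute changes, expire the
# current dimension row and append a new current row, preserving
# history. Column names here are hypothetical illustrations.

def scd2_upsert(dim_rows, natural_key, new_attrs, as_of):
    for row in dim_rows:
        if row["natural_key"] == natural_key and row["is_current"]:
            if all(row.get(k) == v for k, v in new_attrs.items()):
                return dim_rows                  # no change: nothing to do
            row["is_current"] = False            # expire the old version
            row["end_date"] = as_of
            break
    dim_rows.append({"natural_key": natural_key, **new_attrs,
                     "start_date": as_of, "end_date": None,
                     "is_current": True})
    return dim_rows

rows = [{"natural_key": "C1", "city": "Austin",
         "start_date": date(2023, 1, 1), "end_date": None,
         "is_current": True}]
rows = scd2_upsert(rows, "C1", {"city": "Denver"}, date(2024, 6, 1))
```

After the call, the Austin row is closed out with an end date and the Denver row is the single current version, so point-in-time joins against the fact table remain possible.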
Meta Platforms, Inc. (f/k/a Facebook, Inc.) has the following positions in Menlo Park, CA. Data Engineer, Product Analytics: Design, model, or implement data warehousing activities in order to contribute to the design and development of Facebook products. Telecommuting is permitted from anywhere in the U.S. (ref. code REQ-2506-152543: $195,254/year - $235,400/year). Individual pay is determined by skills, qualifications, experience, and location. Compensation details listed in this posting reflect the base salary only.
- Design, develop, and automate ETL processes using DBT and AWS.
- Build robust data pipelines to move data from various sources to data warehouses or data lakes.
- Collaborate with cross-functional teams to ensure data accuracy, completeness, and consistency.
- Perform data cleansing, validation, and transformation to ensure data quality and integrity.
- Optimize database and query performance to ensure efficient data processing.
- Work with data analysts and data scientists to provide clean, reliable data
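The data cleansing and validation step described above can be sketched as a simple Python pass that separates valid rows from rejects. The field names and rules below are hypothetical illustrations, not any particular pipeline's schema:

```python
# Minimal data-validation sketch: apply per-field rules and split a
# batch into clean rows and rejects annotated with the failed fields.
# Field names and rules are hypothetical illustrations.

RULES = {
    "id": lambda v: isinstance(v, int) and v > 0,
    "email": lambda v: isinstance(v, str) and "@" in v,
}

def validate(rows):
    clean, rejects = [], []
    for row in rows:
        failed = [f for f, ok in RULES.items() if not ok(row.get(f))]
        if failed:
            rejects.append({"row": row, "failed_fields": failed})
        else:
            clean.append(row)
    return clean, rejects

clean, rejects = validate([
    {"id": 1, "email": "a@example.com"},
    {"id": -5, "email": "bad"},
])
```

Routing rejects to a quarantine table with the reason attached, rather than silently dropping them, is what makes the downstream data both clean and auditable.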
Please note that the consultant needs to be on our W2.
Title: Senior Analyst/Programmer
Duration: 1 Year
Client: Mayo Clinic
Req ID: 36265123
Remote
Scope: The IT Image Management Systems unit is seeking a talented and motivated individual to join the team supporting the InfinityView application. The InfinityView application is the general medical image viewer for patient care at Mayo Clinic. Primary responsibilities will include support, maintenance, and limited development.
Requirements:
- A b
Note: GC-EAD and L2-EAD consultants who can work on W2 can apply for this position.
Job Description: Very senior Databricks Admin/Lead (ADB Admin/Lead) with vast architectural and solutioning experience in Data, AI, and ML, and strong Azure Databricks Data Lakehouse, data warehouse, ETL, BI, and AI/ML knowledge, along with strong Python and SQL skills in designing/developing architectural plans, frameworks, and prototypes. Expert in solutioning, not limited to concepts using Power BI, Exc
Data Modeler - Pharma Commercial
- 12+ years of Conceptual, Logical, and Physical Data Modeling experience.
- Hands-on experience with Erwin Data Modeling software.
- Create database objects.
- Create, review, and tune SQL scripting.
- Good knowledge of metadata management, data modeling, and related tools (Erwin, ER Studio, Oracle Designer, or PowerDesigner).
- Data analysis and database design experience.
- Knowledge/understanding of different database platforms, such as Oracle, Postgres, Hadoop, Snowflake,
About DataAffect: We are a boutique data/service management firm specializing in the delivery of Data Governance, Enterprise Data Strategy, Solutions Architecture, Data Warehousing, Data Integrations, Data Security & Privacy Management, IT Service Management, Business Analysis, and Agile Project Management services to diverse clients across multiple industries.
Role: BigID Engineer
Location: Remote
Duration: 12+ months
Required skills:
- Must have 10+ years of IT experience.
- BigID experience: 5+ ye
This individual will be responsible for:
- Designing robust, componentized, and adaptable processes to scale up our business intelligence platform.
- Understanding big data and cloud technologies, quickly becoming productive with these tools, and providing analysis on them.
- Having strong SQL experience and the ability to work with cloud infrastructure programs, OOD, and data pipelines.
- Identifying inefficiencies in code or processes and recommending solutions.
- Being able to adapt quickly to changing requirements a
COMPANY OVERVIEW adroitts is a fast-growing IT solutions company that helps businesses adapt and grow in a continuously evolving market. Our tailor-made technological solutions are perfectly aligned to our clients' business goals and objectives. We strive to be a long-term, trusted, and reliable partner for our customers' organizations, helping them overcome IT challenges. Our solutions, methodologies, and implementations are designed with a customer-centric focus and our customers' ROI in mind. We pride ourselves in ou
The primary goal is to replace the legacy reporting function with a modern, scalable Data Operations Team (initially focusing on these four roles) that can deliver faster, more automated, and insight-driven reporting aligned with the organization's evolving data needs. Specifically, these four new hires will:
- Audit and rationalize 700+ legacy reports
- Build baseline executive and operational dashboards
- Introduce automation and standard templates to reduce manual report requests (currently over 200/
Hi, hope you are doing well. This is Shahbaz from Prodware Solutions LLC. We have a role that closely matches your skill set. Please go through the job description below and let me know if you, or anyone you know, is looking for a good opportunity. You can reach me on .
Job title: Lead Data Engineer
Duration: Full-time
Location: Remote
- 7-12 years of experience in a Data Engineering role working with Databricks & cloud technologies.
- Strong proficiency in PySpark, Python, SQL.
- Strong experience in data mode