Full-time Opportunity Role: Google Cloud Platform Data Engineer Location: Remote - USA Duration: FTE only Job Description: 7+ years' proven experience as a Data Engineer with a focus on Google Cloud Platform services. Strong proficiency in Google Cloud Platform services such as GCS, Dataflow with Apache Beam (batch and stream data processing), BigQuery, Cloud Composer, and Pub/Sub. Proficiency in SQL and Python for data manipulation and analysis is mandatory. Solid understanding of data warehousing
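To illustrate the kind of work this role describes, here is a minimal Apache Beam batch pipeline that reads CSV files from GCS and writes rows to BigQuery. It is a sketch only: the project, bucket, dataset, table, and column names are placeholders, and a real Dataflow job would add schema management and error handling.

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_event(line):
    # Assumes a simple CSV layout of user_id,event,amount (hypothetical).
    user_id, event, amount = line.split(",")
    return {"user_id": user_id, "event": event, "amount": float(amount)}

def run():
    # All names below (project, bucket, dataset, table) are placeholders.
    options = PipelineOptions(
        runner="DataflowRunner",
        project="my-project",
        region="us-central1",
        temp_location="gs://my-bucket/tmp",
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromGCS" >> beam.io.ReadFromText("gs://my-bucket/events/*.csv")
            | "Parse" >> beam.Map(parse_event)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",
                schema="user_id:STRING,event:STRING,amount:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )

if __name__ == "__main__":
    run()

The same pipeline can be tested locally by switching the runner to "DirectRunner" before deploying it to Dataflow.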
Meta Platforms, Inc. (f/k/a Facebook, Inc.) has the following positions in Menlo Park, CA. Data Engineer, Analytics: Design, model, and implement data warehousing activities to deliver the data foundation that drives impact through informed decision making. Telecommuting from anywhere in the US is permitted. (ref. code REQ-2506-151308: $222,388/year to $235,400/year). Individual pay is determined by skills, qualifications, experience, and location. Compensation details listed in this posting reflect the base salary only
10+ years of experience in data modeling, data architecture, or data engineering roles. 4+ years of experience modeling data in Snowflake or other cloud data warehouses. Strong understanding and hands-on experience with Medallion Architecture and modern data platform design. Experience using data modeling tools (Erwin etc.). Proficiency in data modeling techniques: 3NF, dimensional modeling, data vault, and star/snowflake schemas. Expert-level SQL and experience working with semi-structured data
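As a small illustration of the dimensional modeling skills listed above, the sketch below creates a star-schema dimension and fact table in Snowflake via the snowflake-connector-python client. The account, credentials, database, and table definitions are placeholders invented for this example, not details from the posting.

import snowflake.connector

# Connection parameters are placeholders; real credentials belong in a secrets manager.
conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="my_password",
    warehouse="ANALYTICS_WH", database="ANALYTICS", schema="MARTS",
)
cur = conn.cursor()

# Dimension table: one row per customer, surrogate key plus descriptive attributes.
cur.execute("""
    CREATE OR REPLACE TABLE dim_customer (
        customer_sk INTEGER IDENTITY,
        customer_id STRING,
        customer_name STRING,
        region STRING
    )
""")

# Fact table: one row per order, foreign key to the dimension plus additive measures.
cur.execute("""
    CREATE OR REPLACE TABLE fct_orders (
        order_id STRING,
        customer_sk INTEGER,
        order_date DATE,
        amount NUMBER(12, 2)
    )
""")

cur.close()
conn.close()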
Meta Platforms, Inc. (f/k/a Facebook, Inc.) has the following positions in Menlo Park, CA. Data Engineer: Design, model, or implement data warehousing activities in order to contribute to the design and development of Facebook products. Telecommuting is permitted from anywhere in the U.S. (ref. code REQ-2505-151081: $229,173/year to $235,400/year). Individual pay is determined by skills, qualifications, experience, and location. Compensation details listed in this posting reflect the base salary only
Responsibilities: Lead the migration of the existing SSIS-based ETL workflows to cloud-native pipelines using DBT and/or Google Cloud Platform tools such as Dataflow, Dataform, or Cloud Composer (Airflow). Design and implement scalable, efficient data models in BigQuery, following best practices for dimensional modeling. Optimize and maintain existing SQL transformations, ensuring correctness and performance in the cloud. Collaborate with BI developers and analysts to ensure data marts align with
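A hedged sketch of one migration step described above: replacing an SSIS transformation with a SQL model materialized in BigQuery. The project, dataset, and column names are invented for illustration; in practice this logic would typically live in a DBT or Dataform model rather than an ad-hoc script.

from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project

# Rebuild a partitioned, clustered fact table from a staging dataset.
sql = """
CREATE OR REPLACE TABLE marts.fct_orders
PARTITION BY DATE(order_ts)
CLUSTER BY customer_id AS
SELECT
  order_id,
  customer_id,
  order_ts,
  CAST(amount AS NUMERIC) AS amount
FROM staging.orders
WHERE order_id IS NOT NULL
"""

client.query(sql).result()  # blocks until the query job finishes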
Job Summary: We are seeking a highly experienced Informatica Extract Transform Load (ETL) Developer to join our Enterprise Data Warehouse (EDW) team. This role is responsible for designing, developing, and maintaining robust ETL processes using Informatica PowerCenter and Teradata. The ideal candidate will have a deep understanding of data integration, performance tuning, and data warehousing best practices, with a focus on supporting healthcare-related data systems. Required Qualifications: 7+
Meta Platforms, Inc. (f/k/a Facebook, Inc.) has the following positions in Menlo Park, CA. Data Engineer, Product Analytics: Design, model, or implement data warehousing activities in order to contribute to the design and development of Facebook products. Telecommuting is permitted from anywhere in the U.S. (ref. code REQ-2506-152543: $195,254/year to $235,400/year). Individual pay is determined by skills, qualifications, experience, and location. Compensation details listed in this posting reflect the base salary only
Cycle3 IT Staffing is seeking several senior-level Resident Solutions Architects for a REMOTE role. Architecting and Leadership: o Deep understanding of modern data architecture, including Lakehouse, Medallion architecture, and Data Mesh. Data Engineering: o Strong programming experience in Python; Scala is a significant plus. o Proficiency in complex query tuning. o Experience building Slowly Changing Dimension (SCD) types. o Familiarity with DBT. o Experience with structured streaming
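Since the posting above calls out experience building Slowly Changing Dimensions, here is a minimal pandas sketch of an SCD Type 2 merge: changed current rows are expired and new versions appended. The column names (is_current, valid_from, valid_to) and the as_of convention are assumptions for this example, not anything specified by the client.

import pandas as pd

def scd2_merge(dim, snapshot, key, tracked, as_of):
    # dim: existing dimension with is_current/valid_from/valid_to columns.
    # snapshot: latest source extract with the business key and tracked attributes.
    current = dim[dim["is_current"]]
    joined = current.merge(snapshot, on=key, suffixes=("_old", ""))

    # Keys whose tracked attributes differ from the current dimension version.
    changed = joined[
        (joined[[c + "_old" for c in tracked]].values != joined[tracked].values).any(axis=1)
    ][key]

    # Expire the superseded current rows (mutates dim in place for brevity).
    expire_mask = dim[key].isin(changed) & dim["is_current"]
    dim.loc[expire_mask, "is_current"] = False
    dim.loc[expire_mask, "valid_to"] = as_of

    # Append new versions for changed keys and brand-new keys.
    new_keys = set(changed) | (set(snapshot[key]) - set(dim[key]))
    new_rows = snapshot[snapshot[key].isin(new_keys)].copy()
    new_rows["valid_from"] = as_of
    new_rows["valid_to"] = None
    new_rows["is_current"] = True
    return pd.concat([dim, new_rows], ignore_index=True)

# Hypothetical usage:
# dim = scd2_merge(dim, daily_extract, "customer_id", ["region", "tier"], "2024-01-01")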
Role: SAS Developer Location: Sacramento, CA / Remote Duration: 24+ Months (multiyear project) Job Description: Key Skills: SAS, COBOL, and any Business Intelligence tool. Five (5) years' experience with front-end data warehouse or similar processing, which includes extracting data from large files from various sources, cleansing data, and preparing the data for loading into a data warehouse. Five (5) years' experience as an analyst-programmer performing various system development life cycle functions
Job Title: Sr. SQL Developers - Remote - 2 positions Contract - 6 months 1099 or C2C Exp: 12+ years minimum Note: no H1Bs Responsibilities: Design, build, and optimize data pipelines for the Data Warehouse using SQL Server and SSIS Integrate data from various sources (e.g., SQL Server, Excel, APIs, Smartsheet) Support reporting and analytics using Kimball methodology Ideal Candidate Profile: 7+ years of experience with SQL Server, T-SQL, and SSIS Nice to have experience with Azure Data Factory, Az
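As a small, hedged illustration of the incremental warehouse loads these positions describe, the sketch below uses pyodbc to run a T-SQL MERGE that upserts a Kimball-style customer dimension from a staging table. The server, database, and table names are placeholders, and the Type 1 upsert shown is only one of several load patterns.

import pyodbc

# Connection string values are placeholders.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=my-server;DATABASE=EDW;Trusted_Connection=yes;"
)
cursor = conn.cursor()

# Type 1 upsert: update changed attributes in place, insert new customers.
cursor.execute("""
    MERGE dbo.DimCustomer AS tgt
    USING staging.Customer AS src
        ON tgt.CustomerID = src.CustomerID
    WHEN MATCHED THEN
        UPDATE SET tgt.CustomerName = src.CustomerName, tgt.Region = src.Region
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (CustomerID, CustomerName, Region)
        VALUES (src.CustomerID, src.CustomerName, src.Region);
""")
conn.commit()
cursor.close()
conn.close()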
Design, develop, and automate ETL processes using DBT and AWS. Build robust data pipelines to move data from various sources to data warehouses or data lakes. Collaborate with cross-functional teams to ensure data accuracy, completeness, and consistency. Perform data cleansing, validation, and transformation to ensure data quality and integrity. Optimize database and query performance to ensure efficient data processing. Work with data analysts and data scientists to provide clean, reliable data
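The cleansing and validation responsibilities above might look something like the pandas sketch below: normalize types, drop duplicates, and fail fast on basic quality checks before loading. The column names and checks are invented for illustration; a real pipeline might use DBT tests or a validation framework instead.

import pandas as pd

def clean_orders(df):
    # Normalize types and remove obvious junk before loading downstream.
    df = df.drop_duplicates(subset=["order_id"])
    df["order_ts"] = pd.to_datetime(df["order_ts"], errors="coerce")
    df["amount"] = pd.to_numeric(df["amount"], errors="coerce")
    df = df.dropna(subset=["order_id", "order_ts"])

    # Lightweight validation: fail the job rather than silently load bad data.
    assert df["order_id"].is_unique, "duplicate order_id after cleansing"
    assert (df["amount"].fillna(0) >= 0).all(), "negative amounts found"
    return df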
Role: Azure .NET Lead Location: Chicago, IL Job Type: Hybrid / Full-time Key Skills: Azure APIM, Azure DevOps, Logic Apps, core .NET development, API development, app modernization, microservices architecture, CI/CD pipelines, Azure AI services Job Summary OptimeTech is seeking a highly skilled Azure .NET developer to lead enterprise-wide Azure rollouts and API development initiatives. The ideal candidate must have deep expertise in Azure API Management (APIM), CI/CD pipelines, Function/Logic Apps
Note: GC-EAD and L2-EAD consultants who can work on W2 can apply for this position. Job Description: Very senior Databricks Admin/Lead (ADB Admin/Lead) with vast architectural and solutioning experience in Data, AI, and ML, and strong in Azure Databricks Data Lakehouse; data warehouse, ETL, BI, and AI/ML knowledge, along with strong Python and SQL skills, in designing/developing architectural plans, frameworks, and prototypes. Expert in solutioning, not limited to concepts using Power BI, Exc
Please note that the consultant needs to be on our W2. Title: Senior Analyst/Programmer Duration: 1 Year Client: Mayo Clinic Req ID: 36265123 Remote. Scope: The IT Image Management Systems unit is seeking a talented and motivated individual to join the team supporting the InfinityView application. The InfinityView application is the general medical image viewer for patient care at Mayo Clinic. Primary responsibilities will include support, maintenance, and limited development. Requirements: - A b
Position: Google Cloud Platform Architect Location: Remote - USA Mode of Hire: Full Time Job Description: 10+ years' proven experience as a Data Architect with expertise in the Google Cloud Platform data platform. Strong proficiency in Google Cloud Platform services such as Dataflow with Apache Beam (batch and stream data processing), BigQuery, and Pub/Sub. Proficiency in SQL and Python for data manipulation and analysis is mandatory. Understanding of Data Governance and Data Quality frameworks
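For the streaming side of the stack described above, here is a minimal Apache Beam sketch that reads JSON messages from a Pub/Sub subscription and streams them into BigQuery. The subscription, table, and field names are placeholders, and messages are assumed to carry the fields listed in the schema; a production pipeline would add dead-lettering and schema validation.

import json
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def run():
    # streaming=True marks the pipeline as unbounded; other names are placeholders.
    options = PipelineOptions(
        streaming=True,
        runner="DataflowRunner",
        project="my-project",
        region="us-central1",
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                subscription="projects/my-project/subscriptions/events-sub")
            | "ParseJSON" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events_stream",
                schema="user_id:STRING,event:STRING,amount:FLOAT",
                method=beam.io.WriteToBigQuery.Method.STREAMING_INSERTS,
            )
        )

if __name__ == "__main__":
    run()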
About DataAffect: We are a boutique data/service management firm specializing in the delivery of Data Governance, Enterprise Data Strategy, Solutions Architecture, Data Warehousing, Data Integrations, Data Security & Privacy Management, IT Service Management, Business Analysis, and Agile Project Management services to diverse clients across multiple industries. Role: BigID Engineer Location: Remote Duration: 12+ months Required skills: Must have 10+ years of IT experience. BigID Experience: 5+ years
Data Modeler - Pharma Commercial 12+ years of experience in conceptual, logical, and physical data modeling. Hands-on experience with Erwin data modeling software. Create database objects. Create, review, and tune SQL scripting. Good knowledge of metadata management, data modeling, and related tools (Erwin, ER Studio, Oracle Designer, or PowerDesigner). Data analysis and database design experience. Knowledge and understanding of different database platforms, such as Oracle, Postgres, Hadoop, Snowflake,
DIRECT CLIENT REQUIREMENT Job Summary: We are seeking a highly experienced Data Engineer to support the development and maintenance of an AWS-based Data Warehouse that enables advanced business analytics and reporting in a Medicaid data environment. The engineer will be responsible for developing and managing ETL/data pipeline jobs, transforming data to align with defined DB2 Warehouse schemas, and ensuring consistent data quality and availability for downstream Cognos reports. The role involves
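One pipeline step described above, sketched with boto3 and pandas: pull an extract from S3, reshape it to match a target warehouse schema, and stage it back out for loading. The bucket, keys, and column mappings are hypothetical, and the actual DB2 Warehouse schema is not known from the posting.

import io
import boto3
import pandas as pd

s3 = boto3.client("s3")

# Read the raw extract from S3 (placeholder bucket and key).
obj = s3.get_object(Bucket="my-data-bucket", Key="raw/claims_2024.csv")
raw = pd.read_csv(io.BytesIO(obj["Body"].read()))

# Rename and type columns to line up with the target warehouse schema (columns are illustrative).
claims = raw.rename(columns={
    "memberId": "MEMBER_ID",
    "svcDate": "SERVICE_DATE",
    "paidAmt": "PAID_AMOUNT",
})
claims["SERVICE_DATE"] = pd.to_datetime(claims["SERVICE_DATE"]).dt.date
claims["PAID_AMOUNT"] = pd.to_numeric(claims["PAID_AMOUNT"], errors="coerce").fillna(0)

# Stage the conformed file back to S3 for the warehouse load step.
buf = io.StringIO()
claims.to_csv(buf, index=False)
s3.put_object(Bucket="my-data-bucket", Key="staged/claims_2024.csv", Body=buf.getvalue().encode("utf-8"))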
This individual will be responsible for: designing robust, componentized, and adaptable processes to scale up our business intelligence platform; understanding big data and cloud technologies and quickly becoming productive with, and providing analysis on, these tools; having strong SQL experience and the ability to work with cloud infrastructure programs, OOD, and data pipelines; identifying inefficiencies in code or processes and recommending solutions; and being able to adapt quickly to changing requirements
COMPANY OVERVIEW adroitts is a fast-growing IT solutions company that helps businesses adapt and grow in a continuously evolving market. Our tailor-made technological solutions are perfectly aligned with our clients' business goals and objectives. We strive to be a long-term, trusted, and reliable partner for our customers' organizations to help them overcome IT challenges. Our solutions, methodologies, and implementations are designed with a customer-centric focus and customers' ROI in mind. We pride ourselves in ou