Overview
On Site
USD 150,000.00 - 190,000.00 per year
Full Time
Skills
Bloomberg
Wholesale
Data Science
Machine Learning (ML)
Data Validation
Continuous Integration
Continuous Delivery
Software Development
Data Modeling
Python
Process Improvement
Data Management
Reporting
Advanced Analytics
Optimization
Data Storage
Change Data Capture
SQL Azure
Cosmos DB
Data Processing
Extraction
API
Databricks
Informatica
IBM DB2
Oracle
Flat File
CSV
XML
JSON
Business Data
Git
DevOps
Big Data
Apache Kafka
Apache Hadoop
SQL
Workflow Management
Apache Spark
Streaming
Apache Storm
Business Analytics
Business Analysis
Computer Science
Google Cloud
Google Cloud Platform
Cloud Computing
Agile
Retail
Communication
SAP PI
SAP BODS
Job Scheduling
Orchestration
IBM iSeries
SAP CRM
Microsoft Azure
Extract, Transform, Load (ETL)
ELT
Database
Storage
Data Lake
Relational Databases
NoSQL
Data Warehouse
Data Quality
Data Integration
Gmail
Privacy
Pharmacy
Health Care
Insurance
Life Insurance
Recruiting
Authorization
Employment Authorization
Job Details
Costco IT is responsible for the technical future of Costco Wholesale, the third largest retailer in the world, with wholesale operations in fourteen countries. Despite our size and explosive international expansion, we continue to provide a family-oriented, employee-centric atmosphere in which our employees thrive and succeed.
This is an environment unlike anything in the high-tech world, and the secret of Costco's success is its culture. The value Costco puts on its employees is well documented in articles from a variety of publishers, including Bloomberg and Forbes. Our employees and our members come FIRST. Costco is well known for its generosity and community service and has won many awards for its philanthropy. The company joins with its employees to take an active role in volunteering by sponsoring many opportunities to help others.
Come join the Costco Wholesale IT family. Costco IT is a dynamic, fast-paced environment, working through exciting transformation efforts. We are building the next-generation retail environment, where you will be surrounded by dedicated and highly professional employees.
Data Engineers are responsible for developing and operationalizing data pipelines/integrations to make data available for consumption (e.g., reporting, Data Science/Machine Learning, data APIs). This includes data ingestion, data transformation, data validation/quality, data pipeline optimization, and orchestration, as well as deploying code to production via CI/CD. The Data Engineer role requires knowledge of software development/programming methodologies, various data sources (relational databases, flat files (csv, delimited), APIs, XML, JSON, etc.), and data access (SQL, Python, etc.), along with expertise in data modeling, cloud architectures/platforms, data warehousing, and data lakes. This role also partners closely with Product Owners, Data Architects, Platform/DevOps Engineers, and others to design, build, test, implement, and maintain data pipelines.
If you want to be a part of one of the BEST companies to work for in the world, simply apply and let your career be reimagined.
ROLE
Develops complex SQL and Python code against a variety of data sources.
Implements streaming data pipelines using event/message-based architectures.
Defines and maintains optimal data pipeline architecture.
Identifies, designs, and implements internal process improvements: automating manual processes, optimizing data delivery/orchestration.
Analyzes data to spot anomalies and trends, and correlates data to ensure data quality.
Identifies ways to improve data reliability, efficiency, and quality of data management.
Performs peer reviews of other Data Engineers' work.
Develops and operationalizes data pipelines to create enterprise-certified data sets that are made available for consumption (reporting, advanced analytics, APIs/services).
Works in tandem with Architects, Data Analysts, and Software Engineers to design data requirements and recommends ongoing optimization of data storage, data ingestion, data quality, and orchestration.
Designs, develops, and implements ETL/ELT/CDC processes using Informatica Intelligent Cloud Services (IICS).
Uses Azure services, such as Azure SQL DW (Synapse), ADLS, Azure Event Hub, Cosmos DB, Databricks, and Delta Lake, to improve and speed delivery of data products and services.
Implements big data and NoSQL solutions by developing scalable data processing platforms to drive high-value insights to the organization.
Builds required infrastructure for optimal extraction, transformation, and loading of data from various data sources using Azure and SQL technologies.
Develops and maintains scalable data pipelines and builds out new API integrations to support continuing increases in data volume and complexity.
REQUIRED
5 years' experience engineering and operationalizing data pipelines with large and complex datasets.
5 years' experience with data pipelines, ETL, and data warehousing.
3 years' experience working with cloud technologies, such as ADLS, Azure Databricks, Spark, Azure Synapse, Cosmos DB, and Google Cloud Platform services (BigQuery, Cloud Spanner, AlloyDB, Cloud Logging).
2 years' hands-on experience with Informatica IICS or other ETL tools.
Extensive experience working with various data sources (DB2, SQL, Oracle, flat files (csv, delimited), APIs, XML, JSON).
Experience implementing data integration techniques, such as event/message-based integration (Kafka, Azure Event Hub) and ETL.
Advanced SQL skills; solid understanding of relational databases and business data; ability to write complex SQL queries against a variety of data sources.
Strong understanding of database storage concepts (data lake, relational databases, NoSQL, Graph, data warehousing).
Experience with Git/Azure DevOps.
Ability to build and optimize data sets, 'big data' pipelines, and architectures.
Demonstrated understanding of and experience using software and tools, including big data tools such as Kafka, Spark, and Hadoop; relational SQL and NoSQL databases, including Cassandra and Postgres; workflow management and pipeline tools such as Airflow, Luigi, and Azkaban; Azure cloud services; and stream-processing systems such as Spark Streaming and Storm.
Ability to work in a fast-paced agile development environment.
Recommended
BA/BS in Computer Science, Engineering, or equivalent software/services experience.
Azure and Google Certifications.
Experience with Google Cloud Platform services, including BigQuery, Cloud Spanner, AlloyDB, and Cloud Logging.
Experience delivering data solutions through agile software development methodologies.
Exposure to the retail industry.
Excellent verbal and written communication skills.
Experience working with SAP integration tools, including BODS.
Experience with job scheduling and orchestration tools.
Experience with the Costco membership platforms (iSeries, SAP CRM, and Azure).
Ability to demonstrate a strong understanding of data integration techniques and tools (e.g., Extract, Transform, Load (ETL) and Extract, Load, Transform (ELT)).
Ability to demonstrate a strong understanding of database storage concepts (data lake, relational databases, NoSQL, Graph, data warehousing).
Ability to demonstrate a strong understanding of coding and programming concepts used to build data pipelines (e.g., data transformation, data quality, data integration).
Ability to communicate technical concepts to non-technical audiences in both written and verbal form.
Proficient in Google Workspace applications, including Sheets, Docs, Slides, and Gmail.
Required Documents
Cover Letter
Resume
California applicants, please review the Costco Applicant Privacy Notice.
Pay Ranges:
Level 1 - $85,000 - $110,000
Level 2 - $105,000 - $135,000
Level 3 - $130,000 - $160,000
Level SR - $150,000 - $190,000, Bonus and Restricted Stock Unit (RSU) eligible
Level STF - $180,000 - $225,000, Bonus and Restricted Stock Unit (RSU) eligible
We offer a comprehensive benefits package to eligible employees, including paid time off; health benefits (medical/dental/vision/hearing aid/pharmacy/behavioral health/employee assistance); a health care reimbursement account; a dependent care assistance plan; short-term and long-term disability insurance; AD&D insurance; life insurance; a 401(k); and a stock purchase plan.
Costco is committed to a diverse and inclusive workplace. Costco is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, national origin, gender, gender identity, sexual orientation, protected veteran status, disability, age, or any other legally protected status. If you need assistance and/or a reasonable accommodation due to a disability during the application or the recruiting process, please send a request to
If hired, you will be required to provide proof of authorization to work in the United States. In some cases, applicants and employees for selected positions will not be sponsored for work authorization, including, but not limited to, H-1B visas.