Lead Data Engineer

Overview

On Site
Full Time

Skills

Data Flow
IoT
Real-time
Collaboration
Data Architecture
Scalability
Decision-making
Data Processing
Documentation
SAP BASIS
Authorization
Computer Science
Data Lake
Storage
HDFS
Apache Spark
Apache Flink
Databricks
Unity Catalog
Microsoft
Apache Kafka
Apache Airflow
Apache NiFi
Talend
Informatica
Data Integration
NoSQL
Database
PostgreSQL
MongoDB
Apache Cassandra
Data Warehouse
Cloud Computing
Microsoft Azure
Google Cloud Platform
Amazon Web Services
Amazon Redshift
Programming Languages
Data Engineering
Python
Java
Scala
Workflow
Data Modeling
Star Schema
Snowflake Schema
Analytics
Reporting
Data Quality
Data Integrity
Regulatory Compliance
Continuous Integration
Continuous Delivery
Jenkins
GitLab
Extract
Transform
Load
Analytical Skill
Problem Solving
Conflict Resolution
Mentorship
Knowledge Sharing
Continuous Improvement
Management

Job Details

Job Description:

Position Overview

The primary responsibility of the Lead Data Engineer is to spearhead the design and implementation of our data pipelines and integration strategies for a casino management system being built from the ground up. This role requires a strong technical background and experience in building scalable data architectures that support real-time data processing, analytics, and reporting. The Lead Data Engineer will collaborate with cross-functional teams to ensure seamless data flow and integration across various systems.

All duties are to be performed in accordance with departmental and Las Vegas Sands Corp.'s policies, practices, and procedures. All Las Vegas Sands Corp. Team Members are expected to conduct themselves in a professional manner at all times. Team Members are required to observe the Company's standards, work requirements, and rules of conduct.

Essential Duties & Responsibilities
  • Design, develop, and maintain robust data pipelines that support data ingestion, transformation, and storage, ensuring high data quality and reliability.
  • Lead the integration of diverse data sources (e.g., transactional systems, third-party APIs, IoT devices) to create a unified data ecosystem for the casino management system.
  • Implement and optimize Extract, Transform, Load (ETL) processes to ensure efficient data movement and processing for both batch and real-time analytics.
  • Collaborate with the Principal Data Architect to establish the overall data architecture, ensuring it meets business needs and supports future scalability.
  • Develop and implement data quality checks and monitoring processes to ensure accuracy and consistency across all data sources.
  • Work closely with data analysts, data scientists, and business stakeholders to understand data requirements and deliver solutions that enable data-driven decision-making.
  • Monitor and optimize data pipeline performance, identifying bottlenecks and implementing improvements to enhance data processing speeds.
  • Maintain comprehensive documentation of data engineering processes, architecture, and workflows to support ongoing development and maintenance.
  • Perform job duties in a safe manner.
  • Attend work as scheduled on a consistent and regular basis.
  • Perform other related duties as assigned.


Minimum Qualifications
  • At least 21 years of age.
  • Proof of authorization to work in the United States.
  • Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
  • Must be able to obtain and maintain any certification or license, as required by law or policy.
  • 5+ years of experience in data engineering, with at least 2 years in a lead or senior role, preferably in the gaming or casino industry.
  • Hands-on experience with on-premise Data Lake pipelines, including storage components (HDFS, Apache Cassandra), compute components (Apache Spark), and messaging components (Apache Kafka).
  • Experience with on-premise and cloud Data Lakehouse technologies (Apache Iceberg, Dremio, Apache Flink, AWS Lake Formation, AWS Glue, AWS Athena, Azure Databricks with Unity Catalog, Azure Synapse, Microsoft Fabric Lakehouse).
  • Proficiency with data pipeline technologies (e.g., Apache Kafka, Apache Airflow, Apache NiFi) for orchestrating data workflows.
  • Demonstrated experience with ETL frameworks and tools (e.g., Talend, Informatica, AWS Glue) for data integration and processing.
  • Strong knowledge of relational and NoSQL databases (e.g., PostgreSQL, MongoDB, Cassandra) and experience with data warehousing concepts.
  • Familiarity with cloud data solutions (e.g., AWS, Azure, Google Cloud) and their associated data services (e.g., Amazon Redshift, Google BigQuery).
  • Proficiency in programming languages commonly used for data engineering (e.g., Python, Java, Scala) for building data pipelines and processing workflows.
  • Understanding of data modeling techniques (e.g., star schema, snowflake schema) to support analytics and reporting needs.
  • Demonstrated experience with data quality tools and frameworks to ensure data integrity and compliance.
  • Knowledge of continuous integration/continuous deployment (CI/CD) practices and tools (e.g., Jenkins, GitLab CI) for automating data pipeline deployment.
  • Strong analytical and problem-solving skills with a focus on delivering high-quality data solutions.
  • Proven ability to lead and mentor junior data engineers, fostering a culture of knowledge sharing and continuous improvement.
  • Strong interpersonal skills with the ability to communicate effectively and interact appropriately with management, other Team Members, and outside contacts of different backgrounds and levels of experience.


Physical Requirements

Must be able to:
  • Physically access assigned workspace areas with or without reasonable accommodation.
  • Work indoors and be exposed to various environmental factors such as, but not limited to, CRT, noise, and dust.
  • Utilize laptop and standard keyboard to perform essential functions of the job.

Employers have access to artificial intelligence language tools ("AI") that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.