Job Title: Big Data Engineer
Location: Phoenix, AZ
Job Summary:
We are seeking a Big Data Engineer experienced in designing, developing, and maintaining large-scale data processing systems. The ideal candidate has hands-on experience with big data technologies, data pipelines, cloud platforms, and distributed data processing.
Required Skills & Responsibilities:
Design, build, and maintain scalable data pipelines for structured and unstructured data.
Develop ETL/ELT workflows for data ingestion, transformation, and processing.
Apply hands-on experience with big data technologies such as Apache Spark, Apache Hadoop, and Apache Kafka.
Strong programming skills in Java, Python, or Scala.
Experience with SQL and NoSQL databases.
Experience working with cloud platforms such as Microsoft Azure, Amazon Web Services, or Google Cloud.
Build batch and real-time data processing solutions.
Optimize data workflows for performance, scalability, and reliability.
Collaborate with cross-functional teams, including data analysts, data scientists, and application teams.
Troubleshoot production issues and support data platform operations.
Nice to Have:
Experience with Apache Airflow, Databricks, or Snowflake.
Knowledge of CI/CD, containerization, and data governance.