Overview
Hybrid
Depends on Experience
Contract - W2
Contract - 12 Month(s)
25% Travel
Skills
ADF
Agile
Amazon ECS
Amazon Kinesis
Amazon Web Services
Apache Airflow
Apache Flink
Apache Hive
Apache Kafka
Apache NiFi
Apache Spark
Big Data
Cloud Computing
Cloud Storage
Collaboration
Communication
Computer Science
Conflict Resolution
Continuous Delivery
Continuous Integration
Data Analysis
Data Engineering
Data Governance
Data Lake
Data Modeling
Data Processing
Data Quality
Data Security
Data Visualization
Data Warehouse
Databricks
Debugging
DevOps
Docker
Extract, Transform, Load (ETL)
FOCUS
Git
IBM Cognos Analytics
Information Technology
Java
Knowledge Sharing
MPP
Microsoft Azure
Microsoft Power BI
Orchestration
Problem Solving
PySpark
Python
Regulatory Compliance
Relational Databases
SAP
Salesforce.com
Scala
Scalability
Scrum
Snowflake Schema
Streaming
Tableau
Teamwork
Terraform
Version Control
Job Details
Job Title: Data Engineer
Location: Vienna, VA
Note: Only candidates with 12+ years of experience will be considered.
Mandatory Skills: Azure, AWS, ETL, ADF (Azure Data Factory), Snowflake, PySpark, Azure Databricks, Tableau, Cognos, Power BI, Azure Data Lake
Responsibilities:
- Design, develop, and maintain data pipelines using Amazon ECS and Databricks to ingest, process, and transform data from various sources.
- Collaborate with data scientists and analysts to understand their data requirements and ensure the data engineering infrastructure can support their needs.
- Optimize data pipelines for efficiency, scalability, and performance to meet business objectives.
- Troubleshoot and resolve issues related to data processing and pipeline failures.
- Ensure data security and compliance with relevant data protection regulations.
- Document data engineering processes and procedures for future reference and knowledge sharing.
- Stay up-to-date with the latest trends and best practices in data engineering and cloud-based data processing technologies.
Requirements:
- Bachelor's degree in computer science, information technology, or a related field. A master's degree is a plus.
- Proven experience as a data engineer with a focus on data processing and integration.
- Strong knowledge and hands-on experience with Amazon ECS for container orchestration.
- Experience working in an Agile/Scrum environment.
- 5+ years of experience in developing/supporting a data platform in Azure Databricks.
- Proficiency in Databricks for data analytics, ETL, and data engineering.
- Familiarity with big data technologies such as Apache Spark.
- Experience with Azure cloud platforms and Azure Data Lake cloud storage.
- Solid understanding of data warehousing concepts and relational databases.
- Strong programming skills in languages such as Python, Java, or Scala.
- Knowledge of data modeling and data warehouse design principles.
- Experience working with Big Data processing frameworks such as Spark, Hive, etc.
- Experience working with Big Data streaming frameworks such as NiFi, Spark Streaming, Flink, etc.
- Experience working with Big Data streaming services such as Kinesis, Kafka, etc.
- Excellent problem-solving and debugging skills.
- Strong teamwork and communication skills to collaborate effectively with cross-functional teams.
Required Qualifications:
- AWS Certified Data Analytics - Specialty or relevant certifications.
- Experience working with data streaming technologies like Apache Kafka or Amazon Kinesis.
- Knowledge of DevOps practices for CI/CD pipelines.
- Familiarity with data governance and data quality best practices.
- Experience with version control systems like Git.
Bonus Skills:
- Experience with stream processing technologies like PySpark
- Experience with pipeline technologies like dbt, Apache Airflow, or Fivetran
- Experience with MPP technologies and databases
- Experience with data visualization tools like Tableau or Sigma
- Experience with containerization and orchestration tools like Docker or Kubernetes
- Experience with Azure data product offerings and platform
- Experience working with Salesforce and SAP data.
- Experience using Terraform or other infrastructure as code tools.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.