Data Analyst 3

Overview

On Site
$90
Contract - W2

Skills

ASP.NET
AWS Lambda
Amazon S3
Amazon Web Services
Analytical Skill
Apache Airflow
Apache Kafka
Apache Spark
Attention To Detail
Business Intelligence
Cloud Computing
Collaboration
Communication
Conflict Resolution
Data Integration
Data Processing
Database
Database Design
Docker
Documentation
ELT
Extract, Transform, Load (ETL)
FOCUS
File Systems
Git
Google Cloud
Google Cloud Platform
Kubernetes
Management
Microsoft Azure
Microsoft Power BI
Microsoft SQL Server
Orchestration
Problem Solving
Programming Languages
Python
RESTful
Real-time
SQL
Snowflake
Soft Skills
Software Engineering
Tableau
C#
Version Control
Workflow

Job Details

Description: Restricted States - Alabama, Arkansas, Delaware, Florida, Indiana, Iowa, Louisiana, Maryland, Mississippi, Missouri, Oklahoma, Pennsylvania, South Carolina, and Tennessee.

One round of interviews

Job Description: Data Engineer

We are seeking a skilled Data Application Engineer to design, build, and maintain data-driven applications and pipelines that enable seamless data integration, transformation, and delivery across systems. The ideal candidate will have a strong foundation in software engineering, database technologies, and cloud data platforms, with a focus on building scalable, robust, and efficient data applications.

Key Responsibilities:

Develop Data Applications: Build and maintain data-centric applications, tools, and APIs to enable real-time and batch data processing.
Data Integration: Design and implement data ingestion pipelines, integrating data from various sources such as databases, APIs, and file systems.
Data Transformation: Create reusable ETL/ELT pipelines to process and transform raw data into consumable formats using tools like Snowflake, DBT, or Python.
Collaboration: Work closely with analysts and stakeholders to understand requirements and translate them into scalable solutions.
Documentation: Maintain comprehensive documentation for data applications, workflows, and processes.

Required Skills and Qualifications:

Education: Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience).
Programming: Proficiency in programming languages including Python, C#, and ASP.NET (Core).
Databases: Strong understanding of SQL and database design, with experience in relational databases (e.g., Snowflake, SQL Server).
Data Tools: Hands-on experience with ETL/ELT tools and frameworks such as Apache Airflow (DBT nice to have).
Cloud Platforms: Familiarity with cloud platforms such as AWS, Azure, or Google Cloud and their data services (e.g., S3, AWS Lambda).
Data Pipelines: Experience with real-time data processing tools (e.g., Kafka, Spark) and batch data processing.
APIs: Experience designing and integrating RESTful APIs for data access and application communication.
Version Control: Knowledge of version control systems like Git for code management.
Problem-Solving: Strong analytical and problem-solving skills, with the ability to troubleshoot complex data issues.

Preferred Skills:

Knowledge of containerization tools like Docker and orchestration platforms like Kubernetes.
Experience with BI tools like Tableau, Power BI, or Looker.

Soft Skills:

Excellent communication and collaboration skills to work effectively in cross-functional teams.
Ability to prioritize tasks and manage projects in a fast-paced environment.
Strong attention to detail and commitment to delivering high-quality results.
