Job Details
One of our premier clients is looking for a Data Analyst (REMOTE) for a contract position. If interested, please submit your resume as soon as possible, indicating (1) your current location, (2) your desired hourly rate and whether you prefer W2 or 1099, and (3) your email address.
RESPONSIBILITIES:
Support the design and delivery of secure, scalable, and traceable data pipelines in Azure, with a focus on integrating and transforming sensitive, public-facing datasets.
Build ingestion workflows and transformation logic to support reporting, analytics, and public transparency initiatives.
This role requires experience with Azure-native tools, strong ETL/ELT capabilities, and a firm commitment to data quality, auditability, and responsible data use.
Work with an existing software development team in an Agile/Scrum environment.
REQUIRED SKILLS:
At least 3 years of experience building and maintaining ETL/ELT pipelines in enterprise environments using Azure-native tools.
Hands-on expertise with Azure Data Factory, Dataflows, Synapse Pipelines, or similar orchestration tools.
Proficiency in SQL, Python, or PySpark for transformation logic and data cleansing workflows.
Experience with Delta Lake, Azure Data Lake Storage Gen2, JSON, and Parquet formats.
Ability to build modular, reusable pipeline components using metadata-driven approaches and robust error handling.
Familiarity with public data sources, government transparency datasets, and publishing workflows.
Knowledge of data masking, PII handling, and encryption techniques to manage sensitive data responsibly.
Experience with data quality frameworks, including automated validation, logging, and data reconciliation methods.
Strong grasp of DevOps/DataOps practices, including versioning, testing, and CI/CD for data pipelines.
Experience supporting data publishing for oversight, regulatory, or open data initiatives is highly desirable.
Certifications such as DP-203 (Azure Data Engineer Associate) or Azure Solutions Architect are a plus.