Job Details
Title: ETL Solution Architect
Location: Reston, VA; Herndon, VA; Washington, DC (Hybrid)
Duration: Long Term
Job Summary:
We are seeking an experienced ETL Solution Architect to lead the design and implementation of data integration and ETL/ELT solutions that support enterprise data strategy and analytics. The ideal candidate will have a strong background in data warehousing, data modeling, and ETL pipeline architecture, with deep experience across modern ETL tools, cloud platforms, and best practices.
This role requires both hands-on expertise and strategic leadership, working closely with cross-functional teams to ensure data integration solutions are scalable, maintainable, and aligned with business needs.
Key Responsibilities:
Architect end-to-end ETL/ELT solutions for enterprise data warehouses, data lakes, and analytics platforms.
Define data integration standards, best practices, and reusable patterns.
Evaluate and select appropriate ETL tools and technologies (e.g., Informatica, Talend, Apache NiFi, Azure Data Factory, dbt).
Collaborate with business and technical stakeholders to gather data requirements and translate them into scalable technical solutions.
Lead data integration projects, ensuring timely delivery, data quality, and system performance.
Design and document data pipelines, transformations, data lineage, and orchestration workflows.
Optimize data flows for performance, scalability, and cost-efficiency.
Ensure solutions comply with data governance, security, and regulatory requirements (e.g., GDPR, HIPAA).
Provide mentorship and technical leadership to ETL developers and data engineers.
Collaborate with cloud and infrastructure teams on deployment and automation strategies.
Required Skills & Qualifications:
Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
8+ years of experience in data integration, including 3-5 years in an ETL/solution architecture role.
Deep knowledge of ETL/ELT concepts, data modeling, and data warehousing.
Expertise in one or more ETL tools: Informatica, Talend, SSIS, Apache NiFi, Matillion, Azure Data Factory, dbt, or equivalent.
Strong SQL skills and experience working with relational databases and big data environments.
Hands-on experience with cloud data platforms (Azure, AWS, or Google Cloud Platform) and cloud-native ETL architectures.
Familiarity with data lakes, streaming data (Kafka, Spark Streaming), and modern data stack concepts.
Knowledge of DevOps, CI/CD, and data pipeline automation tools.
Strong understanding of data governance, metadata management, and data quality frameworks.
Preferred Qualifications:
Experience with Azure Data Services, AWS Glue, Google Cloud Dataflow, or similar.
Knowledge of containerization (Docker, Kubernetes) for data pipeline deployment.
Exposure to real-time data processing and event-driven architecture.
Experience working with data catalogs (e.g., Alation, Collibra, or Azure Purview).
Thanks,
Naveen S