Job Details
*****TO BE CONSIDERED, CANDIDATES MUST BE U.S. CITIZENS*****
***** TO BE CONSIDERED, CANDIDATES MUST BE LOCAL TO THE DC/MD/VA METRO AREA AND BE OPEN TO A HYBRID SCHEDULE IN RESTON, VA*****
Formed in 2011, Inadev is focused on its founding principle: to build innovative, customer-centric solutions incredibly fast, securely, and at scale. We deliver world-class digital experiences to some of the largest federal agencies and commercial companies. Our technical expertise and innovations include codeless automation, identity intelligence, immersive technology, artificial intelligence/machine learning (AI/ML), virtualization, and digital transformation.
POSITION DESCRIPTION:
Inadev is seeking a strong Lead Principal Data Solutions Architect. The primary focus will be natural language processing (NLP), applying data mining techniques, performing statistical analysis, and building high-quality prediction systems.
PROGRAM DESCRIPTION:
This initiative focuses on modernizing and optimizing a mission-critical data environment within the immigration domain to enable advanced analytics and improved decision-making capabilities. The effort involves designing and implementing a scalable architecture that supports complex data integration, secure storage, and high-performance processing. The program emphasizes agility, innovation, and collaboration to deliver solutions that meet evolving stakeholder requirements while maintaining compliance with stringent security and governance standards.
RESPONSIBILITIES:
- Lead system architecture decisions, ensure technical alignment across teams, and advocate for best practices in cloud and data engineering.
- Serve as a senior technical leader and trusted advisor, driving architectural strategy and guiding development teams through complex solution design and implementation.
- Serve as the lead architect and technical authority for enterprise-scale data solutions, ensuring alignment with strategic objectives and technical standards.
- Drive system architecture design, including data modeling, integration patterns, and performance optimization for large-scale data warehouses.
- Provide expert guidance to development teams on Agile analytics methodologies and best practices for iterative delivery.
- Act as a trusted advisor and advocate for the government project lead, translating business needs into actionable technical strategies.
- Oversee technical execution across multiple teams, ensuring quality, scalability, and security compliance.
- Evaluate emerging technologies and recommend solutions that enhance system capabilities and operational efficiency.
NON-TECHNICAL REQUIREMENTS:
- Must be a U.S. citizen.
- Must be willing to work a hybrid schedule (2-3 days per week) in Reston, VA, and at client locations in the Northern Virginia/DC/MD area as required.
- Ability to pass a 7-year background check and obtain/maintain a U.S. Government clearance.
- Strong communication and presentation skills.
- Must be able to prioritize and self-start.
- Must be adaptable/flexible as priorities shift.
- Must be enthusiastic and have passion for learning and constant improvement.
- Must be open to collaboration, feedback, and client requests.
- Must enjoy working with a vibrant team of outgoing personalities.
MANDATORY REQUIREMENTS/SKILLS:
- Bachelor of Science degree in Computer Science, Engineering, or a related field, and at least 10 years of experience leading architectural design of enterprise-level data platforms, with significant focus on Databricks Lakehouse architecture.
- Experience within the federal government, specifically DHS, is preferred.
- Must possess demonstrable experience with Databricks Lakehouse Platform, including Delta Lake, Unity Catalog for data governance, Delta Sharing, and Databricks SQL for analytics and BI workloads.
- Must demonstrate deep expertise in Databricks Lakehouse architecture, medallion architecture (Bronze/Silver/Gold layers), Unity Catalog governance framework, and enterprise-level integration patterns using Databricks workflows and Auto Loader.
- Demonstrated professional experience organizing the technical execution of Agile analytics using Databricks Repos, Jobs, and collaborative notebooks.
- Expertise in Apache Spark on Databricks, including performance optimization, cluster management, Photon engine utilization, and Delta Lake optimization techniques (Z-ordering, liquid clustering, data skipping).
- Proficiency in Databricks Unity Catalog for centralized data governance, metadata management, data lineage tracking, and access control across multi-cloud environments.
- Experience with Databricks Delta Live Tables (DLT) for declarative ETL pipeline development and data quality management.
- Certification in one or more: Databricks Certified Data Engineer Associate/Professional, Databricks Certified Solutions Architect, AWS, Apache Spark, or cloud platform certifications.
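Several of the requirements above center on the medallion pattern: raw (Bronze) data is progressively cleaned and conformed (Silver), then aggregated for analytics and BI (Gold). As a rough illustration of that data flow only — real implementations use Delta Lake tables and Spark or Delta Live Tables, not plain Python, and the record fields here are invented for the example:

```python
# Toy sketch of the medallion (Bronze/Silver/Gold) layering.
# Plain Python stands in for Delta tables so the flow is easy to follow.

bronze = [  # raw ingested records, possibly malformed (hypothetical fields)
    {"case_id": "A1", "status": "approved", "days": "12"},
    {"case_id": "A2", "status": "APPROVED", "days": "8"},
    {"case_id": None, "status": "denied", "days": "x"},  # bad record
]

def to_silver(records):
    """Silver layer: drop bad rows, normalize types and casing."""
    silver = []
    for r in records:
        if r["case_id"] is None or not r["days"].isdigit():
            continue  # a real pipeline would quarantine these rows
        silver.append({"case_id": r["case_id"],
                       "status": r["status"].lower(),
                       "days": int(r["days"])})
    return silver

def to_gold(records):
    """Gold layer: aggregate for BI - average processing days per status."""
    totals = {}
    for r in records:
        n, s = totals.get(r["status"], (0, 0))
        totals[r["status"]] = (n + 1, s + r["days"])
    return {status: s / n for status, (n, s) in totals.items()}

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'approved': 10.0}
```

The same shape maps onto Delta Live Tables, where each layer becomes a declared table with data-quality expectations attached.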
DESIRED REQUIREMENTS/SKILLS:
- Expertise in ETL tools.
- Advanced knowledge of cloud platforms (AWS preferred; Azure or Google Cloud Platform a plus).
- Proficiency in SQL, PL/SQL, and performance tuning for large datasets.
- Understanding of security frameworks and compliance standards in federal environments.
PHYSICAL DEMANDS:
- Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.
Inadev Corporation does not discriminate against qualified individuals based on their status as protected veterans or individuals with disabilities and prohibits discrimination against all individuals based on their race, color, religion, sex, sexual orientation/gender identity, or national origin.