Data Resource Engineer
Location: Dallas 75244 (near LBJ Freeway and Dallas North Tollway)
Schedule: Hybrid (3 days onsite)
Interview process: 2 rounds (1st round virtual; 2nd round onsite)
We are partnering with our client to find a Data Resource Engineer who will serve as a critical technical leader within the technology organization, acting as the company's in-house expert in Snowflake, Python, and AI-driven data architecture. This role integrates data engineering, advanced analytics, and AI solution development to enable data-driven decision-making across the enterprise.
Partnering closely with stakeholders across both business and technology functions, the engineer will design, build, and enhance data pipelines, machine-learning-ready datasets, and AI-powered analytical solutions, ensuring the organization maximizes the value of its data ecosystem through scalable, high-quality architecture and insights.
Needed:
- Bachelor's degree in Computer Science, Engineering, Information Systems, or a related discipline.
- 5+ years of experience in data engineering, business intelligence, or analytics-focused roles.
- Demonstrated success in building and deploying scalable data solutions using Snowflake and Python.
- Hands-on experience integrating AI/ML models or automation into analytical workflows.
- Working knowledge of modern data stack tools such as dbt, Airflow, Git, Docker, and CI/CD preferred.
- Background in metrics-driven, high-growth environments; experience in private-equity-backed organizations is a strong plus.
Other skills:
- Strong Python development skills leveraging tools such as pandas, PySpark, scikit-learn, and FastAPI for data engineering and AI-driven solutions.
- Practical experience with AI/ML frameworks (e.g., TensorFlow, PyTorch, OpenAI, AWS SageMaker) is a plus.
- Skilled in data visualization tools such as Tableau, Power BI, or comparable platforms.
- Solid understanding of data modeling, ETL/ELT processes, and modern data architecture best practices.
- Strong project management capabilities paired with clear, effective stakeholder communication.
- Highly analytical, with the ability to convert complex data into meaningful, actionable insights.
- Organized, proactive, and adept at managing multiple priorities in a fast-paced setting.
Focus:
- Lead the architecture, development, and optimization of Snowflake-based data platforms, encompassing data ingestion, transformation, quality, and governance.
- Build Python-driven automations, data processing pipelines, and AI workflows that enhance scalability and support advanced analytics and decision-making.
- Integrate AI and machine learning capabilities into business intelligence solutions, leveraging APIs, predictive modeling, and NLP where applicable.
- Collaborate with business and technology leaders to shape a modern, AI-enabled BI strategy aligned with organizational goals.
- Promote a data-as-a-service approach by delivering reliable, scalable, and secure data assets across the enterprise.
- Cultivate teamwork and innovation, maintaining a strong customer-service mindset when partnering with internal stakeholders.
Other duties:
- Act as a technical thought leader in Snowflake architecture, data engineering, and AI enablement.
- Design, develop, and maintain robust data pipelines and ETL/ELT processes using Snowflake, dbt, Python, and orchestration tools such as Airflow.
- Build and manage data models that power analytics, forecasting, and AI/ML initiatives.
- Partner with cross-functional teams to prototype, refine, and deploy AI solutions, including predictive analytics, recommendation engines, and process automation.
- Oversee data governance, security, and compliance to ensure all analytics and AI practices meet applicable privacy and regulatory standards (e.g., GDPR, CCPA).
- Mentor team members on engineering best practices, AI integration techniques, and scalable analytics architecture.
- Optimize Snowflake environments through performance tuning and cost-management best practices.
- Lead the ongoing advancement of the BI and AI architectural roadmap, staying abreast of emerging technologies such as LLMs, vector databases, and data observability platforms.