Data Engineer
Duration - Contract
Location - Blue Ash, OH
Visa - any visa is fine
Top 3 skills: Azure Databricks, Python, and Spark
Soft Skills Needed: problem solving, attention to detail, and the ability to work independently and as part of an agile team
Team details, i.e. size, dynamics, locations: 10 team members, working independently but pair programming throughout the day.
VERY IMPORTANT DETAILS
· Work Location must be local
· Interviews will be in person, onsite.
· Candidates must not only be local; they must also be willing to come on-site for their interview, and they will be expected to work on-site with the team.
· Prescreening Details: 3 video questions; prior screenings will not carry over. These are specific questions given by the hiring manager. Please coach candidates to reply with their own knowledge and experience, and to NOT use AI-generated responses. Any candidate who appears to be reading responses will be rejected.
· Please include a link to your candidate's LinkedIn profile with their submittal!
Notes from MSP:
· Some takeaways so far: the candidate must be local! They can relocate to Cincy by day one, but they have to be able to be in the office 5 days a week. This is a tough find. The manager does not want to manage someone remote; remote workers get their assigned work done but do not participate in strategy.
· The manager is finding that basic questions cannot be answered in the interview. Please do not enhance candidates' resumes. Databricks is a must for this req, and lead experience is needed.
· The bill rate is flexible! Please don't send over the highest rate possible, but the manager is aware he will probably have to pay more for a solid candidate.
· Please add their LinkedIn profile with every submittal.
· PLEASE tell your candidate NOT to read during the prescreen. No matter how good they are, we can tell who is reading and who is being genuine. We are not looking for perfect videos; we are looking for candidates who know what they are talking about: natural conversation versus recited facts.
Requirements
· Senior experience as a Data Engineer
· Strong experience with Azure Databricks, Spark, and Python
· Strong SQL skills and database experience
· Experience monitoring and optimizing Databricks clusters or workflows
· Experience working with Azure data services and integrating them with Databricks and enterprise data platforms
· Experience building and optimizing distributed data processing systems (e.g., partitions, joins, shuffles, cluster performance)
· Experience with data pipeline development using tools such as Delta Live Tables (DLT) or Databricks SQL
· Experience with orchestration, messaging services, or serverless components (e.g., Azure Functions)
· Experience with version control and CI/CD tools such as GitHub and GitHub Actions
· Experience using Terraform for cloud infrastructure provisioning
· Familiarity with SDLC and modern data engineering best practices
· Strong organizational skills with the ability to manage multiple priorities and work independently
Nice to Have
· Experience with data governance, lineage, or cataloging tools (e.g., Purview, Unity Catalog)
Responsibilities
· Analyze, design, and develop enterprise data solutions using Azure, Databricks, Spark, Python, and SQL
· Develop, optimize, and maintain Spark/PySpark data pipelines, addressing performance issues such as data skew, partitioning, caching, and shuffle optimization
· Build and support Delta Lake tables and data models for analytical and operational use cases
· Apply reusable design patterns, data standards, and architectural guidelines across the enterprise, including collaboration with 84.51° when needed
· Use Terraform to provision and manage cloud and Databricks resources, supporting Infrastructure as Code (IaC) practices
· Implement and maintain CI/CD workflows using GitHub and GitHub Actions for source control, testing, and pipeline deployment
· Manage Git-based workflows for Databricks notebooks, jobs, and data engineering artifacts
· Troubleshoot failures and improve reliability across Databricks jobs, clusters, and data pipelines
· Apply cloud computing skills to deploy fixes, upgrades, and enhancements in Azure environments
· Collaborate with engineering teams to enhance tools, systems, development processes, and data security
· Participate in the development and communication of data strategy, standards, and roadmaps
· Create architectural diagrams, interface specifications, and design documentation
· Promote reuse of data assets and contribute to enterprise data catalog practices
· Provide timely support and communication to stakeholders and end users
· Mentor team members on data engineering best practices and emerging technologies
Thanks & Regards
Deepak Singh
Vcentrix Services US LLC
8 The Green, Suite B, Dover, DE 19901, USA
Contact No: Ext. 110
Email ID: