CCS Global Tech is a rapidly growing Information Technology company with a diverse portfolio of technology products and services and a large network of industry partnerships. With over 22 years of successful business, a global talent pool, and a worldwide presence, CCS is a certified Microsoft Gold Partner specializing in delivering expert Microsoft-based solutions for technical and business needs. We have been recognized by Inc. 500 Magazine as one of the fastest-growing small companies in the United States.
We are a Tier 1 vendor for the City and County of San Francisco for Cloud Services, Staffing Services, and Training Services. For this multi-year opportunity with a diverse set of needs to address, we are currently focused on establishing partnerships with individuals as well as companies who can help us enhance our overall service portfolio, cut lead times, and ultimately deliver successfully. We currently hold sizable government accounts in the San Francisco Bay Area, including the City and County of San Francisco, San Mateo County, and Santa Clara County.
We take great pride in our global reach and local influence. Working alongside our highly skilled and talented internal team, who will guide you along the way, offers key insights into what helps you stand out in a competitive job market.
If you are a partner company, please submit resumes with contact information of your own W2 Consultants only. Submitted consultants are expected to have excellent communication skills.
Role - Databricks Data Engineer
Location - Silver Spring, MD
Clearance - Public Trust clearance
Salary - $130,000 - $140,000

Description
We are seeking a Databricks Data Engineer with strong hands-on experience in Python and PySpark to support a fast-paced, high-impact delivery environment. This role requires someone who can quickly step in, take ownership, and deliver scalable data solutions with minimal ramp-up.
Key Responsibilities
Design, develop, and optimize data pipelines and ETL workflows using Databricks
Build scalable data processing solutions leveraging PySpark and Python
Work with large-scale datasets across cloud platforms (AWS and/or Azure)
Implement and maintain Medallion Architecture (Bronze, Silver, Gold layers)
Ensure data quality, performance optimization, and reliability of data pipelines
Collaborate with cross-functional teams to support data-driven initiatives
Required Qualifications
3 to 5 years of hands-on experience with Databricks
Strong hands-on experience with Databricks, Python, PySpark, and SQL
Proven experience building and optimizing ETL pipelines
Experience working with large-scale data environments
Solid understanding of cloud platforms (AWS and/or Azure)
Familiarity with Medallion architecture (Delta Lake)
Ability to work independently in fast-paced, deadline-driven environments
What We're Looking For
Someone who can hit the ground running with minimal onboarding
Strong sense of ownership and accountability
Ability to deliver under tight timelines without compromising quality
Proactive problem-solver with a delivery-first mindset

Nice to Have
Experience with Delta Lake
Exposure to data governance and data quality frameworks
Prior experience in federal or regulated environments
Requirements
Required Qualifications:
Education:
Bachelor's degree in information systems, computer science, or another related field. A suitable combination of education, training, or experience will be accepted.
Clearance:
Public Trust clearance with a federal health agency within the U.S. Department of Health and Human Services
Onsite Requirement:
Hybrid - candidates must be local to the DMV (DC/Maryland/Virginia) area