Boston, Massachusetts
Job Description:

Primary Skills: Databricks, PySpark

- Proficiency in PySpark and Databricks (Delta Lake, clusters, jobs).
- Experience in architecting designs for integrating Databricks (DBX) with different applications such as Salesforce, MDM, etc., and with tools such as Collibra.
- Hands-on with Apache Airflow (DAG design, monitoring).
- Strong in AWS services: S3, EC2, Lambda, IAM.
- Strong SQL and Python for transformations and orchestration.
- Knowledge of Lakehouse architecture (Delta Lake) and data modeling.
- Experience in
Contract, Third Party
Depends on Experience