Job Details
Hi,
Hope you are doing well. Please find the job description below and let me know if you are interested.
Senior Python Developer with Databricks and Kafka (Capital Markets)
Iselin, NJ (3 days/week onsite).
We are seeking a skilled Python Developer with hands-on experience in Databricks and Kafka to join our technology team. The ideal candidate will design, develop, and optimize large-scale data processing pipelines and real-time data streaming solutions to support our trading, risk, and compliance functions. You will collaborate with business stakeholders and data teams to deliver high-performance data solutions in a fast-paced financial environment.
________________________________________
Responsibilities:
Develop, test, and maintain scalable ETL/ELT data pipelines using Python, PySpark, and Databricks on cloud platforms.
Build and manage real-time data streaming solutions with Kafka to support low-latency data feeds.
Collaborate with quantitative analysts, traders, and risk managers to understand data requirements and deliver effective solutions.
Optimize existing data workflows for performance, reliability, and efficiency.
Implement data quality checks and monitoring mechanisms.
Participate in code reviews, documentation, and knowledge sharing within the team.
Ensure compliance with financial data governance and security standards.
Stay updated with emerging technologies and propose innovative solutions for data processing challenges.
________________________________________
Required Skills & Qualifications:
8+ years of experience in Python development.
Strong experience with Databricks platform and cloud-based data engineering.
Proven expertise in Kafka for building scalable, real-time streaming applications.
Knowledge of relational and NoSQL databases (e.g., SQL, Cassandra, MongoDB).
Familiarity with investment banking processes, trading systems, risk management, or financial data workflows.
Good understanding of distributed computing concepts and the big data ecosystem.
Experience with version control systems (e.g., Git) and Agile development methodologies.
Excellent problem-solving skills, attention to detail, and ability to work under tight deadlines.
________________________________________
Preferred Qualifications:
Experience with other big data tools such as Hadoop, Spark SQL, or Flink.
Knowledge of financial data standards and regulations.
Certification in cloud platforms (AWS, Azure, Google Cloud Platform).
Previous experience working in a regulated financial environment.