Overview
On Site
Full Time
Skills
Jersey
Collaboration
Information Technology
Testing
Cloud Computing
Cyber Security
Leadership
Mentorship
Data Structure
Data Warehouse
Business Management
Dashboard
Metadata Management
IBM
TWS
Job Scheduling
Scheduling
Data Quality
Databricks
Data Lake
Storage
GRID
Analytics
Unity
Data Governance
Data Architecture
Financial Services
Continuous Integration
Continuous Delivery
Git
Jenkins
DevOps
Data Management
Data Modeling
IBM InfoSphere
Microsoft SQL Server
PL/SQL
Transact-SQL
SQL
Stored Procedures
Python
Java
Identity Management
Cloud Security
Incident Management
Vulnerability Management
PaaS
SaaS
ServiceNow
IO
JIRA
Microsoft Azure
Active Directory
Microsoft
Event Management
SIEM
Information Security
Regulatory Compliance
RESTful
Web Services
Microservices
Extract
Transform
Load
ELT
Tivoli
IBM Tivoli Workload Scheduler
Microsoft Power BI
Data Visualization
Reporting
Problem Solving
Conflict Resolution
Analytical Skill
Electronic Warfare
Job Details
Job Title: Senior Databricks Engineer
Jersey City NJ, hybrid 3 days per week on site
Contract-to-hire, 2-3 positions
W2 only
No third parties, please
Role Description
This role is responsible for designing, developing, and deploying cloud data solutions for ISDAD as part of the overall cyber data initiative to build out the security and risk data platform for Information Security. This individual will develop the data feeds and be part of the larger development effort of building out a Cybersecurity Data Lake. The goal of the data lake is to centralize the data and establish effective data governance around the data sources and their data lineage, relying heavily on Databricks. This role will collaborate with developers, data owners, governance leads, and business analysts within the Information Technology (IT) department, as well as other stakeholders aligned with the applications.
Responsibilities
- Design, develop, test, and support cloud data solutions using Databricks, focusing on data ingestion, data quality, tuning, and performance of upstream cybersecurity data sources.
- Provide indirect leadership and mentorship to junior team members; lead data feed development efforts and design initiatives. Participate in development meetings to align development priorities and objectives, assign tasks, and share experiences and challenges with applications under development.
- Consult with other technology and development teams as needed to coordinate on the integration of applications with the larger company software ecosystem.
- Capture and document metadata for identified Key Data Elements (KDEs) to ensure accuracy and completeness for Data Quality (DQ) rules and processing of daily datasets.
- Work with the data architecture team to align KDEs to the logical data models, develop physical data structures, and document physical data names, definitions, and data types.
- Partner with the data owners and stakeholders to create technical requirements and DQ rules around the data elements needed in the data warehouse. Partner with the Business Management team and Data Owners to understand what critical metrics and data fields are needed for Metric Dashboards.
- Establish views that encapsulate the data so it is fit for downstream consumption, and ensure the data aligns with the DQ rules established on that metadata so it is fit for daily use. Utilize the IBM TWS production job scheduling system and adhere to standards around the daily scheduling and batch monitoring of production jobs.
- Identify and resolve DQ issues including inaccuracies and incomplete information. Enhance data quality efforts by implementing improved procedures and processes.
Qualifications and Skills
- Strong knowledge of Azure Databricks, Azure Data Factory, Azure Functions, Azure Data Lake Storage, Azure Event Grid, Azure Log Analytics, Azure Monitor, and Unity Catalog repository configuration.
- 10+ years' experience in IT development, data governance, data architecture or related roles, preferably in a highly regulated environment such as financial services.
- Strong knowledge of CI/CD and DevOps tooling (e.g., Git, Jenkins, Azure DevOps).
- Proficient in data management & data modeling tools (e.g., Collibra DQIM/DQE, IBM Infosphere DA).
- Proficient in SQL Server, Oracle PL/SQL, T-SQL, and SQL stored procedures.
- Proficient in Python, Java or similar high-level server-side languages
- Strong knowledge of enterprise Information Security data (e.g., Phishing, Identity Management, Privileged Access, Cloud Security, Incident Response, Vulnerability Management, Threat Detection).
- Knowledge of data from PaaS/SaaS products (e.g., ServiceNow, CrowdStrike, MS Purview, Proofpoint, WIZ.IO, JIRA, SharePoint, Azure Active Directory, SAI360). Knowledge of Microsoft Sentinel for security information and event management (SIEM) is a plus.
- Understanding of information security frameworks (e.g., NIST, CIS, CRI Profile) and regulatory compliance (e.g., NYSDFS, GDPR, CCPA).
- Experience with RESTful web services and microservice architecture. Strong understanding of ETL/ELT. Knowledge of IBM Tivoli Workload Scheduler is a plus.
- Exposure to Power BI for data visualization and reporting is a plus.
- Strong problem-solving and analytical skills, with an initiative-taking, results-oriented approach.