Overview
Remote
On Site
Depends on Experience
Full Time
Skills
Apache Airflow
IBM
Docker
IBM DB2
Jenkins
Kubernetes
Linux
Microsoft Azure
PySpark
Python
SQL
Scripting
Snowflake Schema
Splunk
Workflow
z/OS
Unix
Grafana
Job Details
Job Title: Senior DB2 Database Administrator
We are seeking a highly experienced Senior DB2 Database Administrator with over 12 years of expertise in administering, managing, and optimizing DB2 databases across LUW (Linux, Unix, Windows) and/or z/OS platforms. The ideal candidate will demonstrate deep technical proficiency, leadership, and the ability to work independently in both development and production environments. Experience with modern data engineering tools and practices such as Python, PySpark, Airflow, and CI/CD pipelines is highly desirable.
Responsibilities:
- Lead the design, implementation, and maintenance of enterprise-level DB2 LUW/z/OS database systems.
- Plan and perform DB2 software installations, upgrades, and patching.
- Monitor database performance, analyze system metrics, and apply performance tuning strategies.
- Develop and execute backup, recovery, and high-availability solutions using native tools and third-party utilities.
- Enforce database security, manage user roles, access controls, and support audit/compliance activities.
- Create and maintain automation scripts for database maintenance, health checks, and monitoring.
- Collaborate with application developers and architects to review and optimize SQL queries, stored procedures, and database structures.
- Lead and contribute to capacity planning, disaster recovery planning, and business continuity strategies.
- Document all aspects of database configuration, operational procedures, and troubleshooting guides.
- Stay current with industry trends and proactively recommend improvements to database architecture and operations.
Required Skills:
- 12+ years of experience in DB2 Database Administration on LUW and/or z/OS platforms.
- In-depth knowledge of SQL, query optimization, and performance tuning.
- Strong experience with backup/recovery techniques and high availability solutions (e.g., HADR, Q Replication).
- Experience with Python programming:
  - Strong fundamentals including virtual environments, package management, and writing modular, configurable code.
  - Ability to mentor junior team members and resolve version/library issues.
- Hands-on experience with Databricks, preferably on Azure Cloud.
- PySpark programming skills for big data processing and ETL workloads.
- Familiarity with Apache Airflow for workflow orchestration.
- Working knowledge of Snowflake (basic to intermediate level acceptable).
- CI/CD implementation experience with Jenkins, including pipelines, code quality, and security scans.
- Exposure to modern containerization and orchestration tools such as Docker and Kubernetes (a plus).
- Proactive, self-motivated, and able to work independently with minimal guidance.
Preferred Tools and Technologies:
- IBM Data Studio, Toad, or equivalent DB2 tools
- Shell scripting, Unix/Linux command-line expertise
- Monitoring and logging tools (e.g., Grafana, Prometheus, Splunk)
Soft Skills:
- Strong analytical and problem-solving abilities
- Excellent communication and documentation skills
- Ability to lead initiatives, mentor juniors, and collaborate with cross-functional teams
Education:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field
- Relevant certifications (e.g., IBM Certified Database Administrator) are a plus