Overview
On Site
$69.50/hr - $78.31/hr
Contract - W2
Skills
Financial Services
Finance
Big Data
Hortonworks
Management
DevOps
Mentorship
Splunk
Grafana
Cloud Computing
Analytical Skill
OCP
Root Cause Analysis
Development Testing
Scripting
Virtualization
Testing
Modeling
Documentation
Performance Metrics
Collaboration
IT Infrastructure
Apache Hadoop
Cloudera
Data Lake
Job Details
Outstanding long-term contract opportunity! A well-known Financial Services Company is looking for an Infrastructure Engineer / Cloud and Big Data Tools Engineer in Dallas, TX or Charlotte, NC (Hybrid).
Work with the brightest minds at one of the largest financial institutions in the world. This is a long-term contract opportunity that includes a competitive benefit package! Our client has been around for over 150 years and is continuously innovating in today's digital age. If you want to work for a company that is not only a household name, but also truly cares about satisfying customers' financial needs and helping people succeed financially, apply today.
Contract Duration: 12 months+ with possible extensions. W2 only.
Required Skills & Experience
- Strong experience with big data platforms: MapR, Hortonworks, Cloudera Data Platform.
- Hands-on expertise with data virtualization tools: Dremio, JupyterHub, AtScale.
- Proficiency in deploying and managing tools in cloud and containerized environments (CDP, OCP).
- Solid understanding of platform engineering, automation scripting, and DevOps practices.
- Proven ability to troubleshoot complex issues and perform root cause analysis.
- Experience in leading technical efforts and mentoring team members.
- Experience with Dremio, Hadoop, Splunk, and Grafana.
What You Will Be Doing
- Administer and support tools on the Data private cloud, including CDP, HWX, and MapR.
- Install, configure, and maintain data analytical and virtualization tools such as Dremio, JupyterHub, and AtScale across multiple clusters.
- Develop proof-of-concept solutions leveraging CDP and OCP technologies.
- Deploy tools and troubleshoot issues, perform root cause analysis, and remediate vulnerabilities.
- Act as a technical subject matter expert, supporting programming staff during development, testing, and implementation phases.
- Develop automation scripts for configuration and maintenance of data virtualization tools.
- Lead complex platform design, coding, and testing efforts.
- Drive advanced modeling, simulation, and analysis initiatives.
- Maintain comprehensive documentation of Hadoop cluster configurations, processes, and procedures.
- Generate reports on cluster usage, performance metrics, and capacity utilization.
- Work closely with data engineers, data scientists, and other stakeholders to understand their requirements and provide necessary support.
- Collaborate with IT infrastructure teams to integrate Dremio and Hadoop clusters with existing systems and services.
Desired Skills & Experience
- Certifications in Cloudera, OpenShift, or related technologies.
- Experience with enterprise-level data lake architectures and governance.