Overview
On Site
Contract - Independent
Skills
Artificial Intelligence
Teradata
Migration
Wireless Communication
Data Migration
IT Management
Data Profiling
eXist
Interfaces
IT Architecture
Data Modeling
Collaboration
Automated Testing
Data Integrity
Data Quality
Project Planning
Strategy Development
Acceptance Testing
Data Validation
Product Management
Design Architecture
Data Domain
Data Architecture
Testing
Extract
Transform
Load
Real-time
Data Processing
Analytics
Apache Spark
SQL
Scala
Apache Hadoop
Business Intelligence
Data Warehouse
Continuous Integration
Continuous Delivery
Cloud Computing
Software Development
Analytical Skill
Problem Solving
Conflict Resolution
Big Data
Organizational Skills
Communication
Google Cloud
Google Cloud Platform
Project Management
Bug Tracking
JIRA
Computer Science
Privacy
Marketing
Job Details
Location: Temple Terrace, FL
Salary: Negotiable
Description: Our client is currently seeking a BIG DATA DEVELOPER - IV
Targeted Years of Experience: 15-20 years
Location: Temple Terrace, FL; Irving/Dallas, TX; or Alpharetta, GA
Working Model: Hybrid (three days a week in office)
Job Responsibilities: As part of our AI & D team, the Data Engineer will be responsible for leading development and validation activities for Big Data products and applications that run on large Hadoop and Teradata clusters.
The qualified engineer will bring technical leadership in developing and testing ETL processes, migrating applications to the cloud, and developing data validation tools used to perform quality assessments and measurements on the data sets that feed Big Data products. Should have VZ data and system knowledge of Wireless and Wireline. The candidate will be involved in:
Lead the design, development, and testing of data ingestion pipelines; perform end-to-end validation of ETL processes for the various datasets ingested into the big data platform. Perform data migration and conversion validation activities across different applications and platforms.
Provide technical leadership on data profiling and analysis, including discovery, suitability, and coverage of data, and identify the data types, formats, and data quality issues that exist within a given data source. Contribute to the development of transformation logic, interfaces, and reports as needed to meet project requirements.
Participate in discussions on technical architecture, data modeling, and ETL standards; collaborate with Product Managers, Architects, and Senior Developers to establish the physical application framework (e.g., libraries, modules, execution environments).
Lead the design and development of a validation framework and integrated automated test suites to validate end-to-end data pipeline flow, data transformation rules, and data integrity. Develop tools to measure data quality and visualize anomaly patterns in source and processed data.
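To illustrate the kind of data-quality and validation tooling described above, here is a minimal sketch in Scala using Spark. It is an assumed example only: the table name (ingested_events) and key column (event_id) are hypothetical and not taken from this posting.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, count, when}

object DataQualitySketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("data-quality-sketch").getOrCreate()

    // Hypothetical table ingested by a pipeline on the cluster.
    val df = spark.table("ingested_events")

    // Completeness check: count nulls in every column.
    val nullCounts = df.select(df.columns.map(c =>
      count(when(col(c).isNull, c)).alias(s"${c}_null_count")): _*)
    nullCounts.show(truncate = false)

    // Integrity check: flag duplicate values of a hypothetical key column.
    val duplicateKeys = df.groupBy("event_id").count().filter(col("count") > 1)
    println(s"Duplicate event_id values: ${duplicateKeys.count()}")

    spark.stop()
  }
}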
Assist the Manager in project planning and validation strategy development. Provide support for user acceptance testing and production validation activities. Provide technical recommendations for identifying data validation tools, and recommend new technologies to improve the validation process.
Evaluate existing methodologies and processes and recommend improvements. Work with stakeholders, Product Management, Data and Design, Architecture teams, and executives to call out issues and to guide and contribute to resolution discussions.
Required Skills: 10+ years of software development and testing experience.
15 years of VZ data domain knowledge with a data architecture background. Experience developing and testing ETL, real-time data processing, and analytics application systems.
Strong knowledge of Spark SQL and Scala development in a big data Hadoop environment, and/or BI/DW development experience.
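As an illustration of this type of Spark SQL and Scala development on a Hadoop cluster, a minimal sketch follows. The Hive table (wireless_usage), its columns, and the output path are hypothetical examples used only to show the pattern, not details from this role.

import org.apache.spark.sql.SparkSession

object DailyUsageReport {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("daily-usage-report")
      .enableHiveSupport() // read Hive tables on the Hadoop cluster
      .getOrCreate()

    // Aggregate a hypothetical usage table with Spark SQL.
    val report = spark.sql(
      """SELECT usage_date, COUNT(*) AS record_count, SUM(bytes_used) AS total_bytes
        |FROM wireless_usage
        |GROUP BY usage_date
        |ORDER BY usage_date""".stripMargin)

    // Persist the result for downstream BI/DW consumers.
    report.write.mode("overwrite").parquet("/tmp/reports/daily_usage")

    spark.stop()
  }
}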
Experience developing automated frameworks in a CI/CD environment. Experience with cloud environments; Google Cloud Platform is a plus. A solid understanding of common software development practices and tools. Strong analytical skills with a methodical approach to problem solving applied to the Big Data domain. Good organizational skills and strong written and verbal communication skills.
Desired Skills:
Working experience on large VZ projects is a big plus. Working experience on Google Cloud Platform is a big plus.
Development experience with tools and utilities for monitoring, alerting, and similar tasks. Familiarity with project management and bug tracking tools, e.g., JIRA or a similar tool.
EDUCATION/CERTIFICATIONS: Bachelor's degree in Computer Science or Engineering.
This job and many more are available through The Judge Group. Please apply with us today!