Data Engineer III

Bentonville, AR, US • Posted 3 days ago • Updated 6 hours ago
Full Time
On-site
USD $92,934.00 - $180,000.00 per year
Job Details

Skills

  • Augmented Reality
  • Big Data
  • Business Cases
  • Data Governance
  • Data Science
  • Analytics
  • Service Level
  • Data Quality
  • Logical Data Model
  • Data Marts
  • Stored Procedures
  • System Integration
  • Instructional Design
  • Data Modeling
  • Test Cases
  • Software Design
  • Servers
  • Documentation
  • Computer Science
  • Software Engineering
  • Data Flow
  • Google Cloud
  • Google Cloud Platform
  • Scala
  • Object-Oriented Programming
  • Python
  • Data Warehouse
  • Apache Spark
  • PySpark
  • SQL
  • Extract
  • Transform
  • Load
  • Apache Hive
  • Git
  • Continuous Integration
  • Continuous Delivery
  • Shell
  • Scripting
  • Real-time
  • Streaming
  • Apache Kafka
  • Apache Sqoop
  • Testing
  • Database
  • MySQL
  • SAFE

Summary

What you'll do...

Position: Data Engineer III

Job Location: 1 Customer Dr, Mail Stop# 0215, Bentonville, AR, 72716

Duties:

  • Identifies possible options to address business problems within one's discipline through analytics, big data analytics, and automation; supports the development of business cases and recommendations.
  • Owns delivery of project activities and tasks assigned by others; supports process updates and changes; solves business issues.
  • Supports the documentation and implementation of data governance processes and practices.
  • Understands, articulates, and applies principles of the defined strategy to routine business problems that involve a single function.
  • Extracts data from identified databases; creates data pipelines and transforms data into a structure relevant to the problem by selecting appropriate techniques.
  • Develops knowledge of current data science and analytics trends.
  • Supports the understanding of the priority order of requirements and service-level agreements; helps identify the most suitable source of data that is fit for purpose; performs initial data quality checks on extracted data.
  • Analyzes complex data elements, systems, data flows, dependencies, and relationships to contribute to conceptual, physical, and logical data models.
  • Develops logical and physical data models, including data warehouse and data mart designs; defines relational tables, primary and foreign keys, and stored procedures to create a data model structure.
  • Evaluates existing data models and physical databases for variances and discrepancies; develops efficient data flows.
  • Analyzes data-related system integration challenges and proposes appropriate solutions.
  • Creates training documentation and trains end users on data modeling; oversees the tasks of less experienced programmers and provides system troubleshooting support.
  • Writes code to develop the required solution and application features by determining the appropriate programming language and leveraging business, technical, and data requirements.
  • Creates test cases to review and validate the proposed solution design; creates proofs of concept; tests the code using the appropriate testing approach.
  • Deploys software to production servers; contributes code documentation, maintains playbooks, and provides timely progress updates.
  • Demonstrates up-to-date expertise and applies it to the development, execution, and improvement of action plans by providing expert advice and guidance to others in the application of information and best practices; supporting and aligning efforts to meet customer and business needs; and building commitment for perspectives and rationales.
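The data-modeling duties above (relational tables, primary and foreign keys, dimension and fact tables for a data mart) can be sketched minimally as follows. The schema and data here are hypothetical examples, not anything from this role; SQLite is used only because it ships with Python (the posting targets BigQuery and MySQL, and SQLite has no stored procedures, so that part is omitted):

```python
import sqlite3

# In-memory database standing in for a data mart (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # have SQLite enforce FK constraints

# Dimension table: one row per store.
conn.execute("""
    CREATE TABLE dim_store (
        store_id   INTEGER PRIMARY KEY,
        store_name TEXT NOT NULL
    )
""")

# Fact table: sales events, linked to the dimension via a foreign key.
# Amounts are stored as integer cents to avoid float rounding.
conn.execute("""
    CREATE TABLE fact_sales (
        sale_id      INTEGER PRIMARY KEY,
        store_id     INTEGER NOT NULL REFERENCES dim_store(store_id),
        amount_cents INTEGER NOT NULL
    )
""")

conn.execute("INSERT INTO dim_store VALUES (1, 'Bentonville #100')")
conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                 [(1, 1, 1999), (2, 1, 500)])

# A typical data-mart query: total sales per store.
total = conn.execute("""
    SELECT s.store_name, SUM(f.amount_cents)
    FROM fact_sales f JOIN dim_store s USING (store_id)
    GROUP BY s.store_id
""").fetchone()
print(total)  # ('Bentonville #100', 2499)
```

The star-schema split (narrow fact table keyed to descriptive dimension tables) is the usual starting point for the data warehouse and data mart designs this role describes.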

Minimum education and experience required: Master's degree or the equivalent in Computer Science or related field; OR Bachelor's degree or the equivalent in Computer Science or related field plus 2 years of experience in software engineering or related field.

Skills required: Must have experience with:

  • Designing and developing data flows in Google Cloud Platform
  • Coding in Scala and object-oriented programming in Python
  • Designing and developing data warehouses in BigQuery
  • Developing data pipelines and analyzing ad-hoc requests with Apache Spark and PySpark
  • Designing and developing SQL-like queries and data pipelines on Google Cloud Storage buckets with Apache Hive
  • Committing code and version control in Git, and code deployment in Looper with CI/CD
  • Developing shell scripts to automate jobs
  • Developing real-time data streaming with Apache Kafka
  • Retrieving relevant data and performing analysis with Data Discovery
  • Developing data ingestion pipelines with Sqoop
  • Designing and testing relational database management systems in BigQuery and MySQL

The employer will accept any amount of experience with the required skills.
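The extract-transform-load pattern running through the skills above can be sketched in plain Python. The records and field names below are hypothetical, and in-memory lists stand in for real sources and sinks; on this stack the same shape would appear as PySpark transformations over BigQuery or Hive tables:

```python
# Minimal ETL sketch (hypothetical data throughout).

# "Extract": pretend these rows came from a source database or file.
raw_rows = [
    {"sku": "A1", "qty": "3", "price": "2.50"},
    {"sku": "B2", "qty": "0", "price": "9.99"},  # zero-qty row: fails quality check
    {"sku": "A1", "qty": "1", "price": "2.50"},
]

def transform(row):
    """Cast string fields and derive a revenue column in integer cents."""
    qty = int(row["qty"])
    price_cents = round(float(row["price"]) * 100)
    return {"sku": row["sku"], "qty": qty, "revenue_cents": qty * price_cents}

# "Transform": a data-quality filter plus a per-row transformation,
# analogous to a filter/map pair over a Spark DataFrame.
clean = [transform(r) for r in raw_rows if int(r["qty"]) > 0]

# "Load": aggregate into a target keyed by SKU (a dict standing in for
# a warehouse table).
warehouse = {}
for r in clean:
    warehouse[r["sku"]] = warehouse.get(r["sku"], 0) + r["revenue_cents"]

print(warehouse)  # {'A1': 1000}
```

Keeping the quality check ahead of the load step mirrors the "initial data quality checks on extracted data" named in the duties: bad rows are rejected before they reach the target store.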

Rate of pay: $92,934 - $180,000/year

Wal-Mart is an Equal Opportunity Employer.

Walmart and its subsidiaries are committed to maintaining a drug-free workplace and have a zero-tolerance policy regarding the use of illegal drugs and alcohol on the job. This policy applies to all employees and aims to create a safe and productive work environment.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.
  • Dice Id: walar001
  • Position Id: 43404534f7659f1cda0431febe6c88ba