Workday Finance Data Engineer - W2 Position

Troy, MI, US • Posted 2 days ago • Updated 2 days ago
Full Time
No Travel Required
On-site
$65 - $70/hr
Job Details

Skills

  • Accounts Payable
  • Accounts Receivable
  • Finance
  • Databricks
  • Data Integration
  • Python
  • PySpark
  • Microsoft SSIS
  • Data Modeling

Summary

Hi,

The following requirement is open with our client.

Title: Workday Finance Data Engineer

Location: Troy, MI - Onsite

Duration: 12+ Months

Rate: $70/hr on W2

Position: W2 Position

Relevant Experience (in Yrs.):

Job Description:
Key Responsibilities
• Lead extraction of Workday Finance and HCM data, with a primary focus on Finance reporting and analytics use cases
• Design, develop and maintain Workday reports and APIs to support reliable data integrations
• Build and manage scalable ETL/ELT pipelines in Databricks, following Bronze, Silver and Gold Lakehouse patterns
• Manage Workday integration setup including Integration System Users, security domains and access controls
• Collaborate closely with Finance, HR and Analytics teams to deliver trusted, analytics-ready datasets
Required Skills
• Deep understanding of Workday Finance data structures and reporting concepts
• Hands-on expertise in Workday Reporting, RaaS, WQL and integration services
• Strong Databricks skills including PySpark, Spark SQL and Delta Lake
• Data integration and ETL best practices in cloud environments
• Advanced SQL, Python and data modeling skills


Must Have Experience
• Strong hands-on experience with Workday Financial Management modules including General Ledger, Accounts Payable, Accounts Receivable, Customer Invoicing, Supplier and Customer Master data
• Proven experience with Workday Reporting including RaaS (Reports-as-a-Service), custom reports, calculated fields and WQL for downstream data consumption
• 3+ years of experience building and optimizing data pipelines on Databricks using PySpark, Spark SQL and Delta Lake
• Strong SQL and Python skills with experience in enterprise-scale data modeling
• Experience working on cloud platforms, preferably Azure
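As a hedged illustration of the RaaS (Reports-as-a-Service) consumption mentioned above: Workday commonly exposes custom reports over a REST endpoint under `/ccx/service/customreport2/`, authenticated with Integration System User (ISU) credentials. The host, tenant, report owner, and report name below are placeholders, not details from this posting.

```python
# Hypothetical sketch: pulling a Workday custom report via RaaS with an
# ISU account. All identifiers here are placeholders.
import requests


def build_raas_url(host, tenant, owner, report):
    """Conventional URL for a Workday custom report exposed via RaaS."""
    return f"https://{host}/ccx/service/customreport2/{tenant}/{owner}/{report}"


def fetch_raas_report(host, tenant, owner, report, user, password, params=None):
    """Fetch a RaaS report as JSON and return its rows."""
    resp = requests.get(
        build_raas_url(host, tenant, owner, report),
        params={"format": "json", **(params or {})},
        auth=(user, password),  # ISU credentials, scoped via security domains
        timeout=60,
    )
    resp.raise_for_status()
    # RaaS JSON payloads wrap the report rows in a "Report_Entry" array.
    return resp.json().get("Report_Entry", [])
```

A downstream pipeline would land these rows into the Bronze layer before any transformation, keeping the raw API payload reproducible.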

  • Dice Id: 10117479
  • Position Id: 8963888
