Data Engineer with IICS Experience - Issaquah, WA (hybrid)

Hybrid in Issaquah, WA, US • Posted 2 days ago • Updated 2 days ago
Contract W2
Contract Corp To Corp
No Travel Required
Hybrid
Depends on Experience

Job Details

Skills

  • 10+ years of experience in Data Engineering
  • 5+ years of experience in Python
  • IICS experience
  • Strong understanding of database storage concepts (data lake, relational databases, NoSQL, Graph, data warehousing)
  • ETL

Summary

Role: Data Engineer with IICS Experience

Location: Issaquah, WA (hybrid)

Duration: 12+ Months

 

Must have:

  • 10+ years of experience in Data Engineering
  • IICS experience
  • 5+ years of experience in Python
  • 4+ years of experience in Google Cloud Platform/Azure
  • Strong understanding of database storage concepts (data lake, relational databases, NoSQL, Graph, data warehousing).

 

Overview:

The Data Engineer will be responsible for designing, building, and maintaining scalable, high-performance data pipelines and integration solutions using Python and Google Cloud Platform (GCP) services.

This role requires a hands-on engineer with strong expertise in data architecture, ETL/ELT development, and real-time/batch data processing, who can collaborate closely with analytics, development, and DevOps teams to ensure reliable, secure, and efficient data delivery across the organization.
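To illustrate the kind of pipeline work described above, here is a minimal batch ETL sketch in Python. The source data, table name, and transformation rules are hypothetical placeholders; a production pipeline would pull from real source systems (IICS, GCP services, etc.) rather than an inline CSV string.

```python
import csv
import io
import sqlite3

# Hypothetical raw extract; in practice this would come from a source
# system such as a flat file, API, or DB2/Oracle table.
RAW_CSV = """order_id,amount,currency
1001,25.50,usd
1002,10.00,USD
1003,,usd
"""

def extract(raw):
    """Read raw CSV rows into dictionaries."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows):
    """Drop rows with missing amounts and normalize currency codes."""
    clean = []
    for row in rows:
        if not row["amount"]:
            continue  # basic data-quality filter
        clean.append((int(row["order_id"]),
                      float(row["amount"]),
                      row["currency"].upper()))
    return clean

def load(rows, conn):
    """Load transformed rows into the target table (the certified data set)."""
    conn.execute("CREATE TABLE IF NOT EXISTS orders "
                 "(order_id INTEGER PRIMARY KEY, amount REAL, currency TEXT)")
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
total = conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone()
print(total)  # the row with a missing amount is filtered out
```

The extract/transform/load split mirrors the ETL/ELT structure the role works with, just at toy scale.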

 

Job Duties/Essential Functions:

  • Builds data models and develops data pipelines to store data in defined data models and structures.
  • Identifies ways to improve data reliability, efficiency and quality of data management.
  • Conducts ad-hoc data retrieval for business reports and dashboards.
  • Assesses the integrity of data from multiple sources.
  • Manages database configuration including installing and upgrading software and maintaining relevant documentation.
  • Develops and operationalizes data pipelines to create enterprise certified data sets that are made available for consumption (BI, Advanced analytics, APIs/Services).
  • Works in tandem with Data Architects, Data Stewards and Data Quality Engineers to design data pipelines and recommends ongoing optimization of data storage, data ingestion, data quality and orchestration.
  • Designs, develops, & implements ETL/ELT processes using Informatica Intelligent Cloud Services (IICS).
  • Uses Google Cloud and Azure services such as Azure SQL DW (Synapse), ADLS, Azure Event Hub, Cosmos, Databricks, Delta Lake to improve and speed up delivery of our data products and services.
  • Implements big data and NoSQL solutions by developing scalable data processing platforms to drive high-value insights to the organization.
  • Identifies, designs, and implements internal process improvements: automating manual processes, optimizing data delivery.
  • Communicates technical concepts to non-technical audiences both in written and verbal form.
  • Performs peer reviews of other data engineers’ work.
  • Regular and reliable workplace attendance at your assigned location.

 

Requirements:

  • 5+ years’ experience engineering and operationalizing data pipelines with large and complex datasets.
  • 3+ years’ hands-on experience with Informatica PowerCenter and/or IICS.
  • 4+ years’ experience working with cloud technologies such as Dataflow, Data Fusion, Pub/Sub, Dataform, dbt, GCS, BigQuery, Cloud SQL, Firestore/Datastore, Apigee, ADLS, Azure Databricks, Spark, Azure Synapse, Cosmos DB, and other big data technologies.
  • Extensive experience working with various data sources (DB2, SQL, Oracle, flat files (CSV, delimited), APIs, XML, JSON).
  • Advanced SQL skills required. Solid understanding of relational databases and business data; ability to write complex SQL queries against a variety of data sources.
  • 3+ years’ experience with Data Modeling, ETL, and Data Warehousing.
  • Strong understanding of database storage concepts (data lake, relational databases, NoSQL, Graph, data warehousing).
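As a small illustration of the "complex SQL against a variety of sources" requirement above, here is a join-plus-aggregate query run through Python's standard sqlite3 module. The schema and data are invented for the example; the same query shape applies to BigQuery, Synapse, or Oracle.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customers (id INTEGER PRIMARY KEY, region TEXT);
CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL);
INSERT INTO customers VALUES (1, 'WEST'), (2, 'EAST'), (3, 'WEST');
INSERT INTO orders VALUES (10, 1, 100.0), (11, 1, 50.0), (12, 2, 75.0);
""")

# Total order amount per region; the LEFT JOIN keeps customers with
# no orders, and COALESCE turns their NULL sums into zeros.
rows = conn.execute("""
    SELECT c.region,
           COUNT(o.id)                AS order_count,
           COALESCE(SUM(o.amount), 0) AS total_amount
    FROM customers c
    LEFT JOIN orders o ON o.customer_id = c.id
    GROUP BY c.region
    ORDER BY total_amount DESC
""").fetchall()
print(rows)
```

Being able to reason about join direction, NULL handling, and grouping like this is the core of the "advanced SQL" requirement.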

 

Recommended:

  • Azure or Google Cloud data certifications.
  • Experience implementing data integration techniques such as event/message-based integration (Kafka, Azure Event Hub), ETL.
  • Experience with Git/Azure DevOps.
  • Experience delivering data solutions through agile software development methodologies.
  • Exposure to the retail industry.
  • Excellent verbal and written communication skills.
  • Experience with UC4 Job Scheduler.
  • Proficient in Google Workspace applications, including Sheets, Docs, Slides, and Gmail.
  • Successful internal candidates will have spent one year or more on their current team.

 

 

Thanks

Mayank Verma

Senior Technical Recruiter | Empower Professionals


| Phone:  x 364

LinkedIn:

Fax: | 100 Franklin Square Drive – Suite 104 | Somerset, NJ 08873

Certified NJ and NY Minority Business Enterprise (NMSDC)

  • Dice Id: 10120856
  • Position Id: 8949238
  • Posted 2 days ago