Python Data Engineer - Local to VA Required

  • McLean, VA
  • Posted 4 hours ago | Updated 3 hours ago

Overview

On Site
Depends on Experience
Contract - W2

Skills

Python
ETL
Informatica
Apache Spark
PySpark
SQL
Snowflake

Job Details

Python Data Engineer

5 days onsite

Location: Freddie Mac, McLean, VA

Duration: 6-7 months

Description:

The Senior Developer will be part of the Client's Enterprise Risk Business Technology Office.

This role will be responsible for supporting our organization's data-driven initiatives, specifically designing and building a data warehouse solution.

This position requires strong experience in data analysis, modeling, and engineering, with the ability to translate complex technical issues into easily understood communications that influence executive audiences with varied technical backgrounds and capabilities.

Qualifications:

Bachelor's degree in computer science, information technology, or a related field; advanced studies/degree preferred.

5 years of extensive knowledge and experience in data technologies for data analytics, data lake/mart/warehouse, SQL/NoSQL databases (DB2, MongoDB, PostgreSQL), big data technologies (Spark or PySpark), ETL (Informatica, Talend), REST APIs, and integration/EAI technologies such as Informatica

3+ years of experience with technologies including web service APIs, XML, JSON, JDBC, Java, and Python.

3+ years of experience working with SaaS platforms such as Snowflake, Collibra, and MongoDB Atlas

Knowledge of enterprise data models, information classification, metadata models, taxonomies, and ontologies.

Exposure to full-stack enterprise application development (Angular, Spring Boot, automation testing using Selenium)

5-7 years of experience in a logical/physical data modeling, data architecture, data analysis, and data management role

Experience with different query languages such as PL/SQL, T-SQL, and ANSI SQL

Experience with database technologies such as DB2, PostgreSQL, Snowflake

Knowledge of data warehousing and business intelligence concepts including data mesh, data fabric, data lake, data warehouse, and data marts

Keys to Success in this Role:

Ability to operate as a self-motivated, proactive, and results-driven problem solver with excellent analytical and interpersonal skills

Quick learner of new technologies, tools, and concepts, with the ability to translate them into action.

Excellent problem-solving skills and attention to detail

Effective communication and interpersonal skills

Ability to work independently and in a team environment

Must Have Qualifications:

5+ years of hands-on experience with Python, PySpark, and SQL. Understanding of Agile practices. Snowflake and Informatica preferred.

Technical Skills:

Must Have:

Apache Spark

PySpark

Python (Programming Language)

Snowflake

Structured Query Language (SQL)

Web APIs

Nice To Have:

Attunity

AWS Step Functions

Control-M

Data Warehousing (DW)

IICS, AWS, Cloud

Talend

Call Notes:

Required Skills:

It's basically a Python Developer role. Someone who has experience with PySpark development along with Python.

A Data Engineer who has experience with Python development will also work for this position.

The standard database under their Risk Division is being migrated to a data warehouse in Snowflake.

Snowflake experience is required.

The position involves a lot of ETL processing (a rough sketch of this kind of pipeline follows these notes).

ETL/Informatica experience is required.

AWS Step Functions experience is nice to have.

Automation using Selenium (nice to have)

Need a self-starter who can take the initiative and can hit the ground running.
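
For context only, a minimal sketch of the kind of PySpark-to-Snowflake ETL work described in these notes might look like the following. Every name here (JDBC URL, tables, credentials, warehouse settings) is a placeholder assumption rather than a detail of the client's environment, and the snippet assumes the Snowflake Spark connector and a JDBC driver are available on the Spark classpath.

    # Illustrative PySpark ETL sketch: extract a relational source table over JDBC,
    # apply a trivial transformation, and load it into Snowflake via the
    # Snowflake Spark connector (net.snowflake.spark.snowflake). All names are placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("risk-dw-load").getOrCreate()

    # Extract: read a source table from the relational database (placeholder URL/table).
    source_df = (
        spark.read.format("jdbc")
        .option("url", "jdbc:postgresql://source-host:5432/riskdb")
        .option("dbtable", "risk.positions")
        .option("user", "<db_user>")
        .option("password", "<db_password>")
        .load()
    )

    # Transform: tag each row with a load timestamp (stand-in for real business logic).
    transformed_df = source_df.withColumn("load_ts", F.current_timestamp())

    # Load: write the result to a Snowflake table (placeholder account settings).
    sf_options = {
        "sfURL": "<account>.snowflakecomputing.com",
        "sfUser": "<sf_user>",
        "sfPassword": "<sf_password>",
        "sfDatabase": "RISK_DW",
        "sfSchema": "PUBLIC",
        "sfWarehouse": "LOAD_WH",
    }
    (
        transformed_df.write.format("net.snowflake.spark.snowflake")
        .options(**sf_options)
        .option("dbtable", "POSITIONS")
        .mode("overwrite")
        .save()
    )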

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.