Software Engineer III

Overview

On Site
USD 169,623.63 - 179,623.63 per year
Full Time

Skills

Telecommuting
Data Processing
Use Cases
Communication
Collaboration
Migration
Testing
Agile
Data Quality
Job Scheduling
Flat File
Git
Continuous Integration
Continuous Delivery
Computer Science
Computer Engineering
Electrical Engineering
Electronic Engineering
Software Development
Big Data
Apache Hadoop
Apache Hive
Cloudera Impala
Informatica
IPC
Microsoft SQL Server
Microsoft SSIS
Python
Unix
Scripting
Warehouse
JSON
Apache Parquet
Apache Avro
CSV
XML
Management
Workflow
Automic
Scheduling
Extract
Transform
Load
Microsoft Azure
Amazon Web Services
Data Warehouse
Cloud Computing
Snowflake

Job Details

Title: Software Engineer III

Job Location: 301 West 11th Street, Wilmington, DE 19801. Telecommuting permitted up to 40% of the week.

Job Description: Design and develop data ETL pipelines that include data processing, enrichment, archival, security, and data quality to transform Commercial data assets into business insights by delivering data for consumption. Assess future commercial domain needs and collaborate with other groups to bring business needs to realization. Analyze use cases and design reusable solutions to avoid duplication and reduce costs while improving cross-domain communication and collaboration. Work towards migrating the commercial domain to the cloud by collaborating with the cloud platform group, automating deployments and testing, and designing and providing solutions focused on enabling business changes. Utilize agile methodologies to build data products and product enhancements; coordinate with business technology partners and customers on future requirements and prioritization. Perform hands-on development using Hadoop, Informatica, SQL Server (SSIS), cloud technologies, and Snowflake; Big Data technologies such as Hive, Impala, and BDM; data quality in Unix; and scripting in Python. Use scheduling tools, including CA and Automic, for job scheduling; work with multiple file structures such as flat files, JSON, and XML in ETL; and use Git for CI/CD.

Minimum requirements: Bachelor's degree, or foreign equivalent, in Computer Science, Computer Engineering, Electrical/Electronic Engineering, or a related field of study, plus six (6) years of experience in the job offered or as a Software Developer, IT Consultant, or related occupation. Requires six (6) years of experience with each of the following: ETL software development using big data technologies; designing and developing data warehouse solutions for business requirements using Hadoop technologies, including Hive or Impala; using Informatica BDM; developing ETL solutions using IPC, SQL Server (SSIS), Python, and Unix scripts to generate and load batches into the warehouse and to generate extracts for downstream teams and business users; developing solutions handling different file structures, including JSON, Parquet, Avro, CSV, XML, and plain text files in both ASCII and EBCDIC; developing and managing jobs and workflows in Automic or another scheduling tool in order to automate processes; designing and developing objects (tables, views, etc.) in Snowflake to transform data into a desired format and generate extracts and reports for business users' daily operations; and designing and developing ETL solutions in a cloud technology (Azure, AWS, or others) to migrate the on-premises data warehouse to the cloud. Requires one (1) year of experience with Snowflake.

Salary: $169,623.63 - $179,623.63 per year

Location
Wilmington, Delaware, United States of America