Senior Data Platform Engineer

Agile, Analysis, Architecture, Automated, Bash, Customer Service, Data Analysis, Data Modeling, Data Warehouse, Development, Ecommerce, Hadoop, Java, Linux, Micro, Modeling, NoSQL, Python, Scrum, SQL, XML
Full Time
Work from home not available. Travel not required.

Job Description


COMPANY OVERVIEW

For over a century, Neiman Marcus Group has served the unique needs of our discerning customers by staying true to the principles of our founders: to be the premier omni-channel retailer of luxury and fashion merchandise dedicated to providing superior service and a distinctive shopping experience in our stores and on our websites. Neiman Marcus Group comprises the Specialty Retail Stores division, which includes Neiman Marcus and Bergdorf Goodman, and our international brand, mytheresa.com. Our portfolio of brands offers the finest luxury and fashion apparel, accessories, jewelry, beauty, and home décor. The Company operates more than 40 Neiman Marcus full-line stores in the most affluent markets across the United States, including U.S. gateway cities that draw an international clientele. In addition, we operate 2 Bergdorf Goodman stores in landmark locations on Fifth Avenue in New York City. We also operate more than 40 Last Call by Neiman Marcus off-price stores that cater to a value-oriented, yet fashion-minded customer. Our upscale eCommerce and direct-to-consumer division includes NeimanMarcus.com, BergdorfGoodman.com, Horchow.com, LastCall.com, and CUSP.com. Every day, each of our 15,000 NMG associates works toward the goal of enabling our customer to shop any of our brands "anytime, anywhere, and on any device." Whether through the merchandise we sell, the customer service we offer, or our investments in technology, everything we do is to enhance the customer experience across all channels and brands.


Neiman Marcus Group has an immediate opening for a Cloud Data Engineer.



The Data Engineer will have the unique combination of business acumen needed to interface directly with key stakeholders to understand the problem, along with the skills and vision to translate that need into a world-class technical solution using the latest technologies.


This is a hands-on role responsible for building data engineering solutions for NMG Enterprise using a cloud-based data platform. The Data Engineer will provide day-to-day technical deliverables and participate in technical design, development, and support for data engineering workloads. In this role, you need to be equally skilled with the whiteboard and the keyboard.



Job Duties


  • Understand and analyze data from multiple data sources, and develop technology to integrate the enterprise data layer

  • Create robust, automated pipelines to ingest and process structured and unstructured data from source systems into analytical platforms, using batch and streaming mechanisms and leveraging a cloud-native toolset

  • Process complex data sets, leverage the technologies used to process these disparate data sets, and understand the correlations and patterns that exist among them

  • Implement orchestration of data pipelines and environments using Airflow

  • Implement custom applications using Kinesis, Lambda, and other AWS services as required to address streaming use cases

  • Implement automation to optimize data platform compute and storage resources

  • Develop and enhance end to end monitoring capability of cloud data platforms

  • Support migration of on-premises data platforms to the AWS cloud

  • Participate in educating and cross-training other team members

  • Provide regular updates to all relevant stakeholders

  • Participate in daily scrum calls and provide clear visibility to work products


Job Requirements


  • BS in Computer Science or related field

  • 4+ years of experience in the data and analytics space

  • Certification preferred, such as AWS Certified Big Data or an equivalent certification for other cloud or big data platforms

  • 2+ years of experience developing and implementing enterprise-level data solutions utilizing Python (scikit-learn, SciPy, Pandas, NumPy, TensorFlow), Java, Spark, Scala, Airflow, and Hive

  • 2+ years in key aspects of software engineering such as parallel data processing, data flows, REST APIs, JSON, XML, and microservice architectures

  • 1+ year of experience working with big data processing frameworks and tools (MapReduce, YARN, Hive, Pig, Oozie, Sqoop), and good knowledge of common big data file formats (e.g., Parquet, ORC)

  • 4+ years of experience with RDBMS concepts, with strong data analysis and SQL skills

  • 3+ years of proficiency with Linux command-line tools and Bash scripting

  • Solid programming experience in Python; expert level (4/5) required


Nice to have:


  • Kubernetes and Docker experience a plus

  • Prior working experience with a data science workbench

  • Cloud data warehouse experience (Snowflake a plus)

  • Data Modeling experience a plus

  • Knowledge of data engineering aspects within machine learning pipelines (e.g., train/test splitting, scoring processes)


Knowledge, Skills and Abilities:


  • A passion for technology and data analytics with a strong desire to constantly be learning and honing skills

  • Ability to work in a team environment

  • Flexibility to work in a matrix reporting structure

  • Strong understanding of Hadoop fundamentals, with experience working on big data processing frameworks and tools (MapReduce, YARN, Hive, Pig, Oozie, Sqoop) and good knowledge of common big data file formats (e.g., Parquet, ORC)

  • Ability to develop large-scale, event-based streaming architectures

  • Strong communication and documentation skills

  • Ability to mentor other team members and participate in cross-training

  • Working knowledge of NoSQL and in-memory databases

  • Working knowledge of developing data ingestion and data transformation capabilities using Hive, Python, Spark, Scala, and Airflow

  • Background in all aspects of software engineering, with strong skills in parallel data processing, data flows, REST APIs, JSON, XML, and microservice architectures

  • Experience working in a Scrum/Agile environment and with associated tools (e.g., Jira)

  • Experience with large data sets and associated job performance tuning and troubleshooting

  • Able to collaborate with cross-functional IT teams and global delivery teams


Dice Id : RTX1854dc
Position Id : 10019