Data Architects are aligned with engineering teams, anticipate business needs, and proactively bring new data assets from operational systems into the analytics platform. They are the experts on a given area of data, covering everything from how it is generated in an application to how it is moved, transformed, and mined for key business insights. They are big data experts, comfortable working with datasets of varying latency and size, and excited about unlocking valuable data hidden in inaccessible raw tables and logs. These experts pair deep data modeling expertise with business understanding to streamline answering key business questions.
What you’ll do
• Own an entire key area of data within Client and create foundational datasets, such as a trillion-row clickstream dataset or product information from thousands of suppliers.
• Build, schedule, and manage data movement from application origin through batch and streaming systems so the data is available for key business decisions.
• Build flexible data structures, data pipelines, and data integrity processes that deliver high-quality data in a timely manner to support business needs.
• Develop a robust, sustainable plan for the foundational data area, including projecting space requirements, procuring technology, and partnering with engineering on improvements to the data.
• Collaborate with Engineering, the Business Intelligence team, and business stakeholders to evaluate and make appropriate changes to the foundational data layer as engineering applications are enhanced.
• Ensure data products stay aligned with the rapidly evolving needs of a multibillion-dollar business.
• Act as a subject matter expert, providing technical guidance on large-scale data engineering.
What you have
• Demonstrated success working with stakeholders and implementing large-scale databases to meet complex business requirements.
• Hands-on experience with advanced SQL, including writing complex, multi-stage transformations, user-defined functions, and stored procedures, and tuning query performance.
• Experience scheduling, structuring, and owning data transformation jobs that span multiple systems and have demanding requirements for volume, duration, or timing.
• Experience designing and implementing data warehouse architectures, data models, star schemas, snowflake schemas, and aggregation techniques.
• Experience architecting database management systems with traditional and big data technologies such as Oracle, SQL Server, Vertica, Cassandra, Hadoop, Spark, and MongoDB. Prior experience with databases of 100 TB or more is highly desired.
• Strong business acumen and critical thinking, along with technical and problem-solving skills.
• Bachelor's or Master's degree in Computer Science, Computer Engineering, Analytics, Mathematics, Statistics, Information Systems, Management, or another engineering or quantitative discipline.