Position: Big Data Architect
Location: Connecticut / Seattle
Duration: Long Term
Info Services is looking for a strong Big Data Architect for the largest entertainment company. This person will handle petabytes of consumer data for analytics (both content engagement and ad engagement). Salary is commensurate with relevant experience.
Act as the proactive, technical architect point person for Consumer Data Platforms end to end: from data collection through knowledge extraction via statistical, machine learning, and deep learning approaches, to distribution via streams, APIs, and files, as well as ad-hoc analysis, reporting, and visualization.
Present to and educate the management team on the technical direction for achieving maximum profitability through the best data management technologies while reducing the overall cost of operation.
Lead and coach other software engineers by developing reusable frameworks. Review designs and code produced by other engineers.
Provide expert-level advice to data scientists, data engineers, and operations teams to deliver high-quality machine learning and deep learning analytics through data pipelines and APIs.
Lead the transformation of a petabyte-scale, batch-based processing platform into a near-real-time streaming platform using technologies such as Apache Kafka, Cassandra, and Spark.
Design and build efficient ETL/ELT processes to move data through the data processing pipeline to meet the demands of business use cases, using Java, open-source tools, and AWS products. Build an easy-to-reuse workflow model and lead the entire team to follow this pattern when implementing ETL processes, improving efficiency and reducing cost.
Optimize and automate the ingestion, processing, and distribution of data from a variety of sources, including clickstream data, ratings data, advertising data, third-party sources, and sources not yet identified.
Manage complex data dependencies across datasets and incremental data loading workflows.
Design and build API-, stream-, and batch-based data export mechanisms for use by other products such as AdSales, Web, and App platforms.
Be a fearless leader in championing smart, scalable, and flexible design
Collaborate with product management, acting as the bridge between product management, engineering teams, and customers to understand requirements and define technical solutions
Help us stay ahead of the curve by working closely with the data management team, data engineers, our DevOps team, and analysts to design systems that can scale overnight in ways that make other groups envious
Have 8+ years of professional experience building large-scale data platforms, from architecture through implementation and support. The platform is expected to handle petabytes of data in a cloud environment in real time.
Must be hands-on with the latest technologies, such as Java, Scala, Apache Spark, Apache Kafka, Hadoop, API design and development, NoSQL databases such as Cassandra, OLAP columnar storage systems, and bitmap indexes, to handle millions of consumers and thousands of attributes while allowing real-time querying/segmentation.
Solid understanding of software development, from design and architecture through implementation, to build software for the future.
Familiar with technologies relevant to the data and integration space, including Hadoop, Spark, Apache Druid, Cassandra, Java, Python, and ML frameworks.
Genuine interest in learning cutting-edge technologies and sharing them with the rest of the engineering team to keep everyone up to date on technology trends.
Enjoy new and meaningful technology or business challenges which require you to think and respond quickly
Are passionate about data, technology, & creative innovation
Prefer open-source technologies and a build-it-yourself mentality.
Enjoy working collaboratively with a talented group of people to tackle challenging business problems so we all succeed (or fail fast) as a team
17177 North Laurel Park Drive, Suite 236 Livonia, MI, 48152