Overview
Remote
On Site
Full Time
Skills
Media
Video
Sales
Software Documentation
Research
Editing
Documentation
Evaluation
Patents
Collaboration
Quality Assurance
Computer Science
Scala
Python
Shell
Apache Flink
Streaming
Apache Spark
API
SQL
Apache HBase
NoSQL
HDFS
Real-time
Data Processing
Apache Kafka
Amazon Web Services
Amazon EC2
Amazon S3
Amazon Route 53
Grafana
MySQL
Linux
Kubernetes
Inventory
Apache Hadoop
Java
SAP BASIS
Law
Job Details
FreeWheel, a Comcast company, provides comprehensive ad platforms for publishers, advertisers, and media buyers. Powered by premium video content, robust data, and advanced technology, we're making it easier for buyers and sellers to transact across all screens, data types, and sales channels. As a global company, we have offices in nine countries and can deliver advertisements around the world.
Job Summary
Job Description
DUTIES: Develop and optimize large-scale, real-time data processing pipelines using Spark Streaming, Flink, and languages including Java, Scala, Python, and Shell; handle large-scale streaming data to meet business requirements for low-latency and high-reliability processing; perform data transformations and aggregations using the Spark API and SQL; develop and maintain scalable, fault-tolerant systems within the Hadoop ecosystem, utilizing HBase, NoSQL, HDFS, and YARN to ensure reliable and efficient real-time data processing; transmit data across systems using Kafka; use AWS services including EC2, Lambda, S3, and Route 53; monitor project statuses using Datadog and Grafana; store and query relational data using MySQL and Presto; use Linux; containerize applications using Kubernetes; support applications under development and customize current applications; assist with the software update process for existing applications and with rollouts of software releases; analyze, test, and assist with the integration of new applications; document all development activity; research, write, and edit documentation and technical requirements, including software designs, evaluation plans, test results, technical manuals, and formal recommendations and reports; monitor and evaluate competitive applications and products; review literature, patents, and current practices relevant to the solution of assigned projects; collaborate with project stakeholders to identify product and technical requirements; conduct analysis to determine integration needs; work with the Quality Assurance team to determine whether applications meet specifications and technical requirements. Position is eligible to work remotely one or more days per week, per company policy.
REQUIREMENTS: Bachelor's degree, or foreign equivalent, in Computer Science, Engineering, or a related technical field, and two (2) years of experience developing software using Java, Scala, Python, and Shell; leveraging Flink and Spark Streaming to process large-scale, real-time data streams; performing data transformations and aggregations using the Spark API and SQL; developing and maintaining scalable, fault-tolerant systems within the Hadoop ecosystem, utilizing HBase, NoSQL, HDFS, and YARN to ensure reliable and efficient real-time data processing; transmitting data across systems using Kafka; using AWS services including EC2, Lambda, S3, and Route 53; monitoring project statuses using Datadog and Grafana; storing and querying relational data using MySQL and Presto; using Linux; and containerizing applications using Kubernetes.
Disclaimer: This information has been designed to indicate the general nature and level of work performed by employees in this role. It is not designed to contain or be interpreted as a comprehensive inventory of all duties, responsibilities, and qualifications.
Skills
Datadog, Hadoop Ecosystem, Java
We believe that benefits should connect you to the support you need when it matters most, and should help you care for those who matter most. That's why we provide an array of options, expert guidance, and always-on tools that are personalized to meet the needs of your reality, helping to support you physically, financially, and emotionally through the big milestones and in your everyday life.
Please visit the benefits summary on our careers site for more details.
Comcast is an equal opportunity workplace. We will consider all qualified applicants for employment without regard to race, color, religion, age, sex, sexual orientation, gender identity, national origin, disability, veteran status, genetic information, or any other basis protected by applicable law.