Senior Software Engineer, Data - Advertising Engineering
Roku
Job Description
Teamwork makes the stream work.
Roku is changing how the world watches TV
Roku is the #1 TV streaming platform in the U.S., Canada, and Mexico, and we've set our sights on powering every television in the world. Roku pioneered streaming to the TV. Our mission is to be the TV streaming platform that connects the entire TV ecosystem. We connect consumers to the content they love, enable content publishers to build and monetize large audiences, and provide advertisers unique capabilities to engage consumers.
From your first day at Roku, you'll make a valuable - and valued - contribution. We're a fast-growing public company where no one is a bystander. We offer you the opportunity to delight millions of TV streamers around the world while gaining meaningful experience across a variety of disciplines.
About the team
The Ads Analytics team plays a critical role in Roku’s Advertising organization, leading measurement and analytics initiatives that power decision-making across the advertising ecosystem. We develop and manage products that deliver actionable insights for advertisers while meeting the operational and analytical needs of internal business teams.
We work closely with Product Managers, Ad Sales, Ad Operations, Data Science, and multiple other software teams within the Advertising Engineering organization to deliver high-impact solutions.
Looking ahead, we are exploring AI-driven measurement capabilities to further enhance the effectiveness of advertising campaigns and strengthen internal analytics.
About the role
We are seeking a highly skilled Senior Software Engineer with deep expertise in API development using Spring Boot and working knowledge of Apache Spark-based big data pipelines orchestrated with Airflow. This hybrid position bridges software engineering and data engineering, requiring the ability to design, build, and maintain scalable systems for both application development and large-scale data processing.
You will communicate and collaborate closely with both business and engineering stakeholders to ensure high quality standards and successful project delivery. You will partner with cross-functional teams to architect and manage robust, production-grade data products that power critical analytics and measurement capabilities, working with technologies such as Apache Spark, Apache Airflow, Trino, Druid, Spring Boot, StarRocks, and Looker to deliver reliable, high-performance solutions.
The ideal candidate is a proactive, self-motivated professional with a strong track record of building applications and services and a dedication to delivering exceptional results.
What you’ll be doing
Software Development
- Design and build APIs and backend services using Spring Boot to support data products and analytics workflows (see the sketch after this list).
- Write clean, maintainable, and efficient code and tests, ensuring adherence to best practices through code reviews.
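For illustration only, here is a minimal sketch of the kind of Spring Boot endpoint this responsibility involves. The application name, route, and metric fields are hypothetical assumptions, not Roku's actual API; a production service would delegate to a query layer (for example Trino or StarRocks) rather than returning a stub.

```java
// Hypothetical sketch of a Spring Boot metrics endpoint; names and fields are illustrative.
package com.example.adsanalytics;

import java.util.Map;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

@SpringBootApplication
public class CampaignMetricsApplication {
    public static void main(String[] args) {
        SpringApplication.run(CampaignMetricsApplication.class, args);
    }
}

@RestController
class CampaignMetricsController {

    // Returns aggregate delivery metrics for one campaign over a date range.
    // A real service would call a repository or analytics query layer here;
    // this stub only echoes the request parameters with placeholder counts.
    @GetMapping("/campaigns/{campaignId}/metrics")
    public Map<String, Object> getMetrics(
            @PathVariable String campaignId,
            @RequestParam String startDate,
            @RequestParam String endDate) {
        return Map.of(
                "campaignId", campaignId,
                "startDate", startDate,
                "endDate", endDate,
                "impressions", 0L,
                "clicks", 0L);
    }
}
```

With the standard spring-boot-starter-web dependency, running the application (for example via `mvn spring-boot:run`) would expose the endpoint at /campaigns/{campaignId}/metrics.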
Big Data Engineering
- Design, develop, and maintain data pipelines and ETL workflows using Apache Spark and Apache Airflow (see the sketch after this list).
- Optimize data storage, retrieval, and processing systems to ensure reliability, scalability, and performance.
- Develop and fine-tune complex queries and analytics solutions using Druid, Trino, and StarRocks for large-scale datasets.
- Monitor, troubleshoot, and improve data systems to minimize downtime and maximize efficiency.
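As a hedged illustration of this kind of pipeline, the sketch below shows a Spark batch job written against Spark's Java Dataset API: it aggregates raw impression events per campaign and day and writes the result for downstream query engines. The bucket paths and column names are hypothetical assumptions; in practice an Airflow DAG would schedule a job like this and handle backfills and failure alerting.

```java
// Hypothetical sketch of a daily aggregation job; paths and columns are illustrative.
package com.example.adsanalytics;

import static org.apache.spark.sql.functions.col;
import static org.apache.spark.sql.functions.count;
import static org.apache.spark.sql.functions.countDistinct;
import static org.apache.spark.sql.functions.to_date;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class DailyCampaignAggregation {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("daily-campaign-aggregation")
                .getOrCreate();

        // Raw impression events, stored as Parquet in this sketch.
        Dataset<Row> impressions = spark.read()
                .parquet("s3://example-bucket/raw/impressions/");

        // Aggregate impressions and unique devices per campaign and day.
        Dataset<Row> daily = impressions
                .withColumn("event_date", to_date(col("event_timestamp")))
                .groupBy(col("campaign_id"), col("event_date"))
                .agg(
                        count("*").alias("impressions"),
                        countDistinct(col("device_id")).alias("unique_devices"));

        // Write date-partitioned output for downstream engines such as Trino,
        // Druid, or StarRocks to ingest.
        daily.write()
                .mode("overwrite")
                .partitionBy("event_date")
                .parquet("s3://example-bucket/agg/daily_campaign_metrics/");

        spark.stop();
    }
}
```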
Collaboration & Mentorship
- Partner with data scientists, software engineers, and other teams to deliver integrated, high-quality solutions.
- Provide technical guidance and mentorship to junior engineers, promoting best practices in software and data engineering.
We’re excited if you have
- 10+ years of professional software development experience.
- Experience building highly scalable, low-latency REST services and API frameworks.
- Ability to write efficient SQL queries and analyze the results to explore business problems and questions.
- Ability to translate queries and reporting requirements into production-level data processing pipelines.
- Proficient in day-to-day use of Python / Java / Scala.
- Strong understanding of and experience with distributed computing frameworks like Hive/Hadoop and Apache Spark.
- Expertise in data modeling, schema design, and data visualization tools.
- Experience working in GCP and/or AWS is a plus.
- Experience with JavaScript, Node.js and React is nice to have.
- Experience deploying services on Kubernetes is a plus.
- Knowledge of CI/CD pipelines, DevOps practices, and infrastructure-as-code tools (e.g., Terraform) is a plus.
- Background in computer science or a similar quantitative field.
Our Hybrid Work Approach
Roku fosters an inclusive and collaborative environment where teams work in the office Monday through Thursday. Fridays are flexible for remote work, except for employees whose roles require them to be in the office five days a week or who work in offices with a five-day in-office policy.
Benefits
Roku is committed to offering a diverse range of benefits as part of our compensation package to support our employees and their families. Our comprehensive benefits include global access to mental health and financial wellness support.