Mid-Level

SW Engineer, Cloud Services

Roku

Bengaluru, India
Hybrid
Posted April 6, 2026

Job Description

Teamwork makes the stream work.

Roku is changing how the world watches TV

Roku is the #1 TV streaming platform in the U.S., Canada, and Mexico, and we've set our sights on powering every television in the world. Roku pioneered streaming to the TV. Our mission is to be the TV streaming platform that connects the entire TV ecosystem. We connect consumers to the content they love, enable content publishers to build and monetize large audiences, and provide advertisers unique capabilities to engage consumers.

From your first day at Roku, you'll make a valuable - and valued - contribution. We're a fast-growing public company where no one is a bystander. We offer you the opportunity to delight millions of TV streamers around the world while gaining meaningful experience across a variety of disciplines.

About the Team

Roku is the No. 1 TV streaming platform in the U.S., Canada, and Mexico with 70+ million active accounts. Roku pioneered streaming to the TV and continues to innovate and lead the industry. A highly scalable, highly available, real-time advertising platform is critical to support and grow Roku's rapidly expanding ad business. The inventory forecasting team's mission is to build a world-class service across all advertising verticals, integrating with planning and serving systems to accurately predict advertiser goals and maximize revenue for Roku based on supply and demand.

About the Role

We are seeking a highly skilled Software Engineer with deep expertise in Big Data technologies. This hybrid position bridges software engineering and data engineering, requiring the ability to design, build, and maintain scalable systems for both application development and large-scale data processing. In this role, you will collaborate with cross-functional teams to architect and manage robust, production-grade data products. You will work with technologies such as Apache Spark, Apache Airflow, Trino, Apache Pinot, Spring Boot, and Looker to deliver reliable, high-performance solutions. The ideal candidate is a proactive, self-motivated professional with a strong track record of building high-scale data services and a dedication to delivering exceptional results.

What you will be doing

  • Engage with stakeholders and product managers to understand their needs, assess application features, and design solutions to business problems.
  • Collaborate with cross-functional teams: partner with product managers, data scientists, and other engineers to deliver impactful solutions.
  • Design, develop, and maintain data pipelines and ETL workflows using Apache Spark and Apache Airflow.
  • Optimize data storage, retrieval, and processing systems to ensure reliability, scalability, and performance.
  • Develop and fine-tune complex queries and analytics solutions using Apache Pinot and Trino for large-scale datasets.
  • Monitor, troubleshoot, and improve data systems to minimize downtime and maximize efficiency.
  • Design and build APIs and backend services using Spring Boot to support data products and workflows.
  • Write clean, maintainable, and efficient code, ensuring adherence to best practices through code reviews.

We are excited if you have

  • 5+ years of experience building large-scale distributed systems.
  • 5+ years of experience with Big Data technologies.
  • Advanced SQL skills, with expertise in query optimization for large datasets.
  • Experience with cloud platforms such as AWS, GCP, or Azure, and containerization tools like Docker and Kubernetes.
  • Proven experience in distributed data processing, data warehousing, and real-time data pipelines.
  • Programming experience in Python, Scala, or Java preferred.
  • Exceptional problem-solving abilities and the capacity to work independently or collaboratively.
  • Demonstrated ability to drive timely consensus in design with other senior team members.
  • Bachelor's or Master's degree in Computer Science, Engineering, or equivalent.
  • Experience with Ad-Tech and building Agentic AI systems to automate decision-making is a plus.
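As a concrete illustration of the query-optimization skill asked for above, here is a minimal, stdlib-only sketch of how adding an index changes a query plan. It uses SQLite purely for demonstration (the role's actual engines, Trino and Pinot, are distributed systems); the table and column names are invented for the example:

```python
import sqlite3

# In-memory table standing in for a large ad-impressions dataset.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE impressions (ad_id INTEGER, ts INTEGER, revenue REAL)")
conn.executemany(
    "INSERT INTO impressions VALUES (?, ?, ?)",
    [(i % 100, i, 0.01) for i in range(10_000)],
)

def plan(sql):
    # EXPLAIN QUERY PLAN reports whether SQLite scans the table or uses an index.
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT SUM(revenue) FROM impressions WHERE ad_id = 7"
before = plan(query)   # without an index: a full table scan
conn.execute("CREATE INDEX idx_ad ON impressions(ad_id)")
after = plan(query)    # with an index: a targeted index search
print(before)
print(after)
```

The same principle (let the planner prove it can avoid a full scan) carries over to partitioning and indexing decisions in distributed engines, even though the tooling differs.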