Mid-Level

Data Engineer (L3)

Twilio

Remote - India
Posted March 27, 2026

Job Description

Who we are 

At Twilio, we’re shaping the future of communications, all from the comfort of our homes. We deliver innovative solutions to hundreds of thousands of businesses and empower millions of developers worldwide to craft personalized customer experiences.

Our dedication to remote-first work and our strong culture of connection and global inclusion mean that no matter your location, you're part of a vibrant team with diverse experiences making a global impact each day. As we continue to revolutionize how the world interacts, we're acquiring new skills and experiences that make work feel truly rewarding. Your career at Twilio is in your hands.

We use Artificial Intelligence (AI) to help make our hiring process efficient. That said, every hiring decision is made by real Twilions!

See yourself at Twilio

Join the team as Twilio’s next Data Engineer (L3), Data Platform

About the job

This position exists to build a highly scalable, reliable, and efficient data platform that makes it easy for users to draw deep insights from vast amounts of distributed data. This platform will be a differentiator for Twilions and our customers.

We are the data backbone for strategic decisions at Twilio. In this role, you will drive key technical decisions to completion for the organization and guide other engineers.

Responsibilities

In this role, you’ll:

  • Develop, construct, test, and maintain data architectures (e.g., databases, large-scale processing systems).
  • Design and implement efficient data pipelines for the acquisition, storage, and analysis of large datasets.
  • Collaborate with cross-functional teams to understand data requirements and implement solutions.
  • Ensure the availability and integrity of data pipelines and systems.
  • Optimize and fine-tune performance of data solutions.
  • Troubleshoot and resolve data-related issues in a timely manner.
  • Develop and maintain documentation for data processes and pipelines.
  • Stay abreast of emerging technologies and industry trends related to data engineering.

Qualifications 

Twilio values diverse experiences from all kinds of industries, and we encourage everyone who meets the required qualifications to apply. If your career is just starting or hasn't followed a traditional path, don't let that stop you from considering Twilio. We are always looking for people who will bring something new to the table!

Required:

  • 5–8 years of technical experience in data engineering, data infrastructure, and data platforms.
  • Proven experience with high-level programming languages such as Scala, Python, or Java.
  • Deep technical understanding of needle-moving technologies such as Kafka, Spark, dbt, Hudi-based data lakes, Presto, ETL tools, low-latency data stores, multiple data warehouses, and data catalogs.
  • Proficiency working with large datasets and distributed computing frameworks (e.g., Hadoop, Spark).
  • Knowledge of data warehousing concepts and best practices.
  • Demonstrated technical breadth and depth, as evidenced by papers, code, and/or presentations.
  • Clear understanding of the software development lifecycle, from reviewing requirements to debugging complex systems in production.
  • Experience building data systems and pipelines.
  • A drive to explore ways to enhance data quality and reliability.
  • Familiarity with cloud platforms (e.g., AWS, Azure, Google Cloud).
  • Proven experience with software development methodologies.
  • Experience working both independently and as part of a team.