
Senior Data Engineer


Truecaller


Bangalore
On-site
Posted April 14, 2026

Job Description

Join Truecaller – The place where innovation meets impact!

Truecaller's mission is to build trust in communication by making it safer, smarter, and more efficient. Born in Sweden and trusted by the world, here's why we stand out:

  • We are trusted by over 450 million active users every month across 190+ countries
  • We identify over 15 billion calls daily, helping users avoid spam and scams
  • We are powered by a team of 450+ employees from 45+ nationalities

We always look for people who take initiative, own their work, and keep raising the bar. An entrepreneurial mindset matters here, especially when it turns bold ideas into real actions. We stay collaborative and focused, always searching for smarter paths forward. If you want to make an impact and grow with a team that inspires millions, you’ll fit right in.

The role:

You will play an important role in developing the data pipelines, frameworks, and models that support our understanding of users and enable better product decisions. You will help empower product teams with a complete self-serve analytics platform by building scalable, robust solutions in collaboration with data engineers, data scientists, and data analysts across the company.

What you’ll do: 

  • Design, develop, and maintain scalable data pipelines to process and analyze large data sets in real-time and batch environments.
  • Play a crucial role in the team and take ownership of its ETL pipelines.
  • Collaborate with data scientists, analysts, and stakeholders to gather data requirements, translate them into robust ETL solutions, and optimize the data flows.
  • Implement best practices for data ingestion, transformation, and data quality to ensure data consistency and accuracy.
  • Develop, test, and deploy complex data models and ensure the performance, reliability, and security of the infrastructure.
  • Own the architecture and design of data pipelines and systems, ensuring they are aligned with business needs and capable of handling growing volumes of data.
  • Make data-driven decisions informed by past experience.
  • Monitor data pipeline performance and troubleshoot any issues related to data ingestion, processing, or extraction.
  • Work with big data technologies to enable storage, processing, and analysis of massive datasets.
  • Ensure compliance with data protection and privacy regulations, particularly in regions like the EU where GDPR compliance is essential.

What you bring in: 

  • 6+ years of experience as a Data Engineer
  • Hands-on experience with Airflow for managing workflows and building complex data pipelines in a production environment.
  • Experience working with big data and ETL development.
  • Strong proficiency in SQL and experience working with relational databases
  • Programming skills in Apache Spark (PySpark or Scala), Kafka, or Flink.
  • Experience working with cloud computing services (e.g., GCP, AWS, Azure).
  • Experience with Data Science workflows.
  • Experience in data modeling and creating data lakes using GCP services like BigQuery and Cloud Storage.
  • Expertise in containerization and orchestration using Docker and Kubernetes (GKE) for scaling applications and services on GCP.
  • Experience building data models and transformations using dbt, following software engineering best practices (modularity, testing).
  • Version control experience with Git and familiarity with CI/CD pipelines (e.g., GitHub Actions).
  • Strong understanding of data security, encryption, and GCP IAM roles to ensure privacy and compliance (especially in relation to GDPR and other regulations).
  • Experience in ML model lifecycle management (model deployment, versioning, and retraining) using GCP tools like AI Platform, TensorFlow Extended (TFX), or Kubeflow, and Vertex AI. 
  • Experience working with data analysts and data scientists to build systems in production.
  • Excellent problem-solving and communication skills, both with peers and with experts from other areas.
  • Self-motivated, with a proven ability to take initiative and solve problems.

It would be great if you also have:

  • Python, Java, Go, Rust
  • AWS, GCP, Azure
  • Kubernetes, Docker
  • AI