Mid-Level

Cloud Engineer - GCP & Databricks


Valtech


Bengaluru
Hybrid
Posted May 6, 2026

Job Description



Why Valtech?
We’re the experience innovation company - a trusted partner to the world’s most recognized brands. To our people we offer growth opportunities, a values-driven culture, international careers and the chance to shape the future of experience.
 

The opportunity

At Valtech, you’ll find an environment designed for continuous learning, meaningful impact, and professional growth. Whether you're pioneering new digital solutions, challenging conventional thinking or building the next generation of customer experiences, your work will help transform industries. 


The role  

As a Cloud Engineer, you are passionate about experience innovation and eager to push the boundaries of what’s possible. You bring 6+ years of experience, a growth mindset and a drive to make a lasting impact. 

You will thrive in this role if you are: 

  • A curious problem solver who challenges the status quo 
  • A collaborator who values teamwork and knowledge-sharing 
  • Excited by the intersection of technology, creativity and data 
  • Experienced in Agile methodologies and consulting (a plus) 

 

Role responsibilities

  • Design GCP-native architectures using services such as BigQuery, Dataflow (Apache Beam), Cloud Composer (Airflow), Pub/Sub, Cloud Storage, Vertex AI, and Cloud Run
  • Build and maintain data pipelines — batch and streaming — following medallion architecture patterns (Bronze / Silver / Gold)
  • Implement infrastructure as code using Terraform and manage deployments via Cloud Build or equivalent CI/CD tooling
  • Define and apply GCP landing zone standards including IAM, VPC, Shared VPC, Private Service Connect, and Org policies
  • Develop end-to-end Lakehouse solutions on Databricks — including Delta Lake table design, Unity Catalog governance, and MLflow model tracking
  • Write and optimise PySpark and SQL workloads for large-scale transformations and feature engineering
  • Configure Databricks clusters, job scheduling, auto-scaling, and cost controls in a Databricks-on-GCP environment
  • Implement Databricks Workflows and Asset Bundles for orchestration and deployment automation
  • Advise clients on Databricks platform adoption, including migration pathways from legacy Hadoop or on-premise data warehouses
  • Lead technical workshops and requirements-gathering sessions with client technical and business teams
  • Produce high-quality client-facing deliverables: architecture diagrams, technical specifications, migration runbooks, and data dictionaries
  • Present solution designs and progress updates to client stakeholders — adapting technical communication to audience level (CTO to Business Analyst)
  • Manage technical dependencies, risks, and issues across delivery workstreams — escalating to Delivery Managers as appropriate
  • Participate in sprint ceremonies and maintain delivery velocity within Agile/Scrum or Kanban frameworks
  • Implement data quality frameworks using tools such as Great Expectations, dbt tests, or Databricks Delta constraints
  • Support data catalogue initiatives and metadata management using Dataplex or Unity Catalog
  • Ensure GDPR/data residency compliance is embedded into solution design from the outset
  • Support practice leads in responding to RFPs and RFIs — contributing to effort estimation, solution sizing, and proposal narrative
  • Contribute to internal knowledge-sharing — through documentation, runbooks, and team enablement sessions
  • Stay current with GCP and Databricks product roadmaps and proactively identify opportunities to introduce new capabilities to clients
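The medallion (Bronze / Silver / Gold) pattern referenced above can be summarised in a minimal sketch. This is plain Python over in-memory records purely for illustration; the actual pipelines described in this role would use PySpark DataFrames and Delta Lake tables, and all record and field names here are invented, not taken from any real project.

```python
# Illustrative Bronze / Silver / Gold (medallion) flow.
# Bronze: raw ingested events, kept as-is (duplicates, bad rows and all).
bronze = [
    {"order_id": "A1", "amount": "19.99", "country": "IN"},
    {"order_id": "A1", "amount": "19.99", "country": "IN"},  # duplicate
    {"order_id": "A2", "amount": "oops", "country": "IN"},   # malformed
    {"order_id": "A3", "amount": "5.00", "country": "DE"},
]

def to_silver(records):
    """Silver: validated, typed, de-duplicated records."""
    seen, silver = set(), []
    for r in records:
        try:
            amount = float(r["amount"])
        except ValueError:
            continue  # a real pipeline would quarantine malformed rows
        if r["order_id"] in seen:
            continue
        seen.add(r["order_id"])
        silver.append({**r, "amount": amount})
    return silver

def to_gold(records):
    """Gold: business-level aggregate (here, revenue per country)."""
    gold = {}
    for r in records:
        gold[r["country"]] = gold.get(r["country"], 0.0) + r["amount"]
    return gold

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'IN': 19.99, 'DE': 5.0}
```

Each layer is materialised as its own table in practice, so downstream consumers can re-derive Gold from Silver without re-ingesting raw data.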
 

Must have qualifications

  • Bachelor's degree in Computer Science, Engineering, Information Systems, or equivalent practical experience
  • 6+ years of hands-on professional experience in cloud data engineering or GCP platform roles
  • Google Cloud Professional Data Engineer or Professional Cloud Architect certification
  • Databricks Certified Associate Developer for Apache Spark or Databricks Certified Data Engineer certification
  • BigQuery — advanced SQL, partitioning, clustering, cost optimization
  • Cloud Storage, Cloud Functions, Cloud Run
  • Dataflow (Apache Beam) — batch & streaming pipelines
  • Cloud Composer / Airflow — DAG authoring and management
  • Pub/Sub — event-driven architectures
  • Vertex AI — model serving and Pipelines exposure
  • IAM, VPC, Org Policies — security and governance
  • Cloud Build / Artifact Registry — CI/CD and containerization
  • Terraform — infrastructure as code on GCP
  • Looker / Looker Studio — reporting layer familiarity
  • PySpark — DataFrame API, optimization, broadcast joins
  • Delta Lake — ACID transactions, time travel, Z-ordering
  • Unity Catalog — governance, lineage, access control
  • Databricks Workflows / Job orchestration
  • Databricks on GCP — cluster config, instance pools
  • MLflow — experiment tracking, model registry
  • dbt on Databricks or BigQuery
  • Databricks Asset Bundles / CI/CD integration
  • Python — data engineering and scripting proficiency
  • SQL — advanced analytical queries
  • Git / version control — branching, PR workflows
  • Docker — containerization basics
  • Unit testing and data pipeline testing practices
  • REST API integration patterns
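"REST API integration patterns", listed above, typically means wrapping flaky endpoints in retry-with-backoff logic. A minimal sketch follows; `fake_call` stands in for a real HTTP client call (e.g. via the `requests` library), and every name here is illustrative:

```python
import time

def call_with_retry(fn, retries=3, base_delay=0.01):
    """Call fn(), retrying on failure with exponential backoff.

    In a real integration fn would issue an HTTP request and retry
    only on retryable status codes (429, 5xx).
    """
    for attempt in range(retries):
        try:
            return fn()
        except RuntimeError:
            if attempt == retries - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))  # 10ms, 20ms, ...

# Fake endpoint that fails twice, then succeeds -- purely illustrative.
calls = {"n": 0}
def fake_call():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("503 Service Unavailable")
    return {"status": 200}

print(call_with_retry(fake_call))  # {'status': 200} after two retries
```

The same shape applies whether the caller is a Cloud Function, a Composer task, or a Databricks job step.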

Nice to have qualifications  

  • Experience with Azure or AWS alongside GCP — hybrid cloud delivery experience
  • Familiarity with commercetools, Contentful, or other composable commerce/content platforms
  • Exposure to Salesforce Marketing Cloud data connectors or CDPs
  • Knowledge of Generative AI / LLM integration patterns — Vertex AI GenAI Studio, Gemini APIs
  • Experience with Databricks Agent Framework or LangChain for agentic AI workloads
  • Apache Kafka or Confluent Cloud for event streaming architectures
  • Google Cloud Professional Machine Learning Engineer
  • HashiCorp Terraform Associate
  • Experience in luxury, retail, or FMCG verticals — understanding of product data models and PIM systems
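Kafka-style event streaming, listed above as a nice-to-have, centres on two ideas: append-only topic logs and per-consumer read offsets. A toy in-memory model of that shape (this is not the Kafka client API; real systems would use the `confluent-kafka` or `google-cloud-pubsub` client libraries, and all names here are invented):

```python
class Topic:
    """Append-only event log; each consumer tracks its own read offset."""
    def __init__(self):
        self.log = []
        self.offsets = {}  # consumer name -> next index to read

    def produce(self, event):
        self.log.append(event)

    def consume(self, consumer):
        """Return unread events for this consumer and advance its offset."""
        start = self.offsets.get(consumer, 0)
        events = self.log[start:]
        self.offsets[consumer] = len(self.log)
        return events

orders = Topic()
orders.produce({"order_id": "A1"})
orders.produce({"order_id": "A2"})

print(orders.consume("billing"))    # both events
orders.produce({"order_id": "A3"})
print(orders.consume("billing"))    # only the new event
print(orders.consume("analytics"))  # all three: offsets are independent
```

The independent-offset property is what lets multiple downstream systems replay the same stream at their own pace.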

If you do not meet all the listed qualifications or have gaps in your experience, we still encourage you to apply. At Valtech, we recognize that talent comes in many forms, and we value diverse perspectives and a willingness to learn. 

 

Commitment to reaching all kinds of people 

We design experiences that work for all kinds of people - and that starts with our own teams. At Valtech, we’re intentional about building an inclusive culture where everyone feels supported to grow, thrive and achieve their goals. No matter your background, you belong here. Explore our Diversity & Inclusion site to see how we’re creating a more equitable Valtech for all. 

 

The benefits  

This is a full-time position based in Bengaluru.

Beyond a competitive compensation package, we offer: 

  • Flexibility, with remote and hybrid work options (country-dependent) 
  • Career advancement, with international mobility and professional development programs 
  • Learning and development, with access to cutting-edge tools, training and industry experts 

Our benefits are tailored to each location. Your Talent Partner will provide full details during the hiring process. 

  

Your application process

Once you apply, our Talent Acquisition team will review your application. Your CV should cover key information on relevant experiences and expertise. We do not require information such as age, gender, marital status, or a headshot in your application. We review all candidates based on skills, experience, and potential.

⚠️ Beware of recruitment fraud!

We are committed to inclusion and accessibility. If you need reasonable accommodations during the interview process, please either indicate it in your application or let your Talent Partner know. 

  

About Valtech

Valtech is the experience innovation company that exists to unlock a better way to experience the world. By blending crafts, categories, and cultures, we help brands unlock new value in an increasingly digital world. 

At the intersection of data, AI, creativity, and technology, we drive transformation for leading organizations, including L’Oréal, Mars, Audi, P&G, Volkswagen, Dolby, and more. 

At Valtech, we don’t just talk about transformation. We make it happen. Our people are the heart of our success, and we foster a workplace where everyone has the support to thrive, grow and innovate. 

Are you ready to create what’s next? Join us.
