Data Architect
Valtech
Job Description
Why Valtech? We’re the experience innovation company - a trusted partner to the world’s most recognized brands. To our people, we offer growth opportunities, a values-driven culture, international careers and the chance to shape the future of experience.
The opportunity
At Valtech, you’ll find an environment designed for continuous learning, meaningful impact, and professional growth. Whether you're pioneering new digital solutions, challenging conventional thinking or building the next generation of customer experiences, your work will help transform industries.
We are proud of:
- The work we do and the innovation we drive
- Our values of share, care and dare
- A workplace culture that fosters creativity, diversity and autonomy
- Our borderless, global framework, which enables seamless collaboration
The role
As a Data Architect, you are passionate about experience innovation and eager to push the boundaries of what’s possible. You bring 10+ years of experience, a growth mindset and a drive to make a lasting impact.
You will thrive in this role if you are:
- A curious problem solver who challenges the status quo
- A collaborator who values teamwork and knowledge-sharing
- Excited by the intersection of technology, creativity and data
- Experienced in Agile methodologies and consulting (a plus)
Role responsibilities
- Define and document the end-to-end target-state data architecture for enterprise client programmes — covering ingestion, storage, transformation, serving, and consumption layers
- Establish domain-driven data architecture boundaries aligned to business domains (e.g. Product, Customer, Order, Finance) using Domain-Driven Design (DDD) principles
- Lead architecture design sessions with clients to align on technology choices, topology, and migration sequencing
- Produce architecture artefacts to a consultancy standard: C4 diagrams, data flow diagrams, architecture decision records (ADRs), and technology selection rationale documents
- Evaluate and recommend GCP-native vs. third-party component trade-offs — with clear cost, scalability, and maintainability justification
- Define and enforce enterprise data modelling standards across the programme — covering 3NF (operational layer), dimensional modelling (Kimball star/snowflake for analytics), and Data Vault 2.0 (historized, auditable Lakehouse layers) as appropriate
- Establish canonical data models for core business domains — ensuring consistency across squads and brands or business units
- Design and govern the metadata framework: schema standards, naming conventions, entity definitions, data dictionaries, and lineage documentation
- Oversee data contract design between producing and consuming domains — defining SLAs, schemas, versioning, and change management protocols
- Ensure models are optimized for the target query engine — BigQuery partitioning/clustering strategies, Delta Lake Z-ordering, and Databricks Photon engine considerations
- Design reusable, domain-oriented data product patterns — encapsulating ingestion, transformation, quality, and serving logic as deployable, versioned units
- Define the data product interface contract: output ports (APIs, tables, streams), SLOs, ownership, and discoverability metadata in Unity Catalog or Dataplex
- Establish a data product taxonomy aligned to business capability domains — enabling a self-serve data mesh posture for mature clients
- Create accelerators and reference implementations that mid-level engineers can adopt — reducing bespoke build and enforcing consistency
- Collaborate with Data Science and Analytics Engineering teams to ensure feature stores and ML feature pipelines are aligned to the broader data product architecture
- Drive data governance strategy across the programme — defining policies for data classification, access control, retention, and quality thresholds
- Design the governance operating model: data stewardship roles, data ownership accountability (domain owners vs. platform owners), and escalation paths
- Define the data release sequencing strategy — prioritizing domains and data products based on business value, dependency mapping, and technical readiness
- Establish lifecycle management policies for schema evolution, deprecation, and backward compatibility — enforced through Unity Catalog or equivalent cataloguing tooling
- Implement or oversee data quality frameworks (Great Expectations, dbt tests, Databricks Delta constraints) aligned to governance thresholds
- Provide hands-on architectural oversight during foundational delivery phases (Milestone 1 and equivalent programme gates) — ensuring the build conforms to the agreed architecture and standards
- Conduct architecture reviews and code/design walkthroughs with engineering squads — identifying deviations, technical debt, and remediation paths
- Chair Architecture Review Board (ARB) sessions with client technical leadership — presenting architecture decisions, trade-offs, and risk assessments
- Define and track architecture fitness functions — measurable criteria that validate the architecture is being implemented correctly across squads
- Serve as the escalation point for cross-squad architectural decisions, integration conflicts, and technology blockers
- Lead client architecture workshops, data strategy sessions, and roadmap definition exercises at executive and technical levels
- Produce and present strategic client-facing artefacts: Data Platform Vision documents, Architecture Blueprints, Governance Frameworks, and Capability Roadmaps
- Contribute to pre-sales and bid activities — leading the data architecture strand of RFP/RFI responses, solution sizing, and pitch presentations
- Represent Valtech's data architecture capability externally — through client advisory conversations, partner events, and thought leadership content.
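For candidates less familiar with data contracts between producing and consuming domains (one of the responsibilities above), the idea can be sketched in a few lines. This is an illustrative sketch only: the `DataContract` class, the `order_events` schema, and the SLA value are hypothetical and not part of this role description.

```python
from dataclasses import dataclass

# Hypothetical, minimal data contract between a producing and a
# consuming domain: a promised schema, a version, and a freshness SLA.
@dataclass(frozen=True)
class DataContract:
    name: str
    version: str                # semantic version, e.g. "1.2.0"
    schema: dict                # field name -> expected Python type
    freshness_sla_minutes: int  # max acceptable data latency

    def validate(self, record: dict) -> list[str]:
        """Return a list of violations for one record (empty = valid)."""
        errors = []
        for fname, ftype in self.schema.items():
            if fname not in record:
                errors.append(f"missing field: {fname}")
            elif not isinstance(record[fname], ftype):
                errors.append(f"bad type for {fname}: expected {ftype.__name__}")
        return errors

def is_breaking_change(old: DataContract, new: DataContract) -> bool:
    """A change is breaking if any previously promised field was removed
    or retyped -- the kind of change that needs a major version bump
    under the versioning and change-management protocols described above."""
    return any(
        fname not in new.schema or new.schema[fname] is not ftype
        for fname, ftype in old.schema.items()
    )

# Usage with a hypothetical Order-domain contract
order_v1 = DataContract(
    name="order_events", version="1.0.0",
    schema={"order_id": str, "amount": float}, freshness_sla_minutes=15,
)
print(order_v1.validate({"order_id": "A1", "amount": 9.99}))  # []
print(order_v1.validate({"order_id": "A1"}))  # ['missing field: amount']
```

In practice the schema half of such a contract would typically be expressed in Avro, Protobuf, or JSON Schema and enforced in the platform (e.g. via a schema registry), with the SLA and ownership metadata published to a catalogue such as Unity Catalog or Dataplex.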
Must have qualifications
To be considered for this role, you must meet the following essential qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or equivalent professional experience
- 10+ years of progressive experience in data architecture, data platform engineering, or enterprise BI/analytics roles
- Demonstrable experience as a lead data architect on at least two large-scale enterprise data platform programmes
Data modelling and architecture practice:
- 3NF — operational and transactional layer design
- Dimensional modelling — Kimball star/snowflake, slowly changing dimensions
- Data Vault 2.0 — hubs, links, satellites, business keys
- Data mesh & domain-oriented data product design
- Architecture Decision Records (ADRs) and C4 modelling
- Data contract design — schema, SLAs, versioning
- Data catalogue & metadata management design
- Data quality framework design and governance policy
Google Cloud Platform (GCP):
- BigQuery — advanced design, partitioning, clustering, BI Engine
- Dataflow (Apache Beam) — streaming and batch architecture
- Cloud Composer / Airflow — orchestration design
- Pub/Sub — event-driven data product patterns
- Dataplex — data cataloguing, lineage, data quality
- Vertex AI Pipelines — ML feature pipeline integration
- Cloud Storage — data lake zone design (raw/curated/serving)
- Terraform — IaC standards and landing zone design
Databricks:
- Databricks Lakehouse — medallion architecture, zone design
- Delta Lake — ACID, time travel, schema evolution, Z-ordering
- Unity Catalog — governance, lineage, access control, data products
- Databricks Workflows and Asset Bundles — lifecycle automation
- PySpark — advanced optimization, broadcast, AQE, dynamic pruning
- dbt on Databricks — transformation layer standards
- Databricks on GCP — cluster policy, instance pools, cost governance
- MLflow — model registry integration with data product layer
Governance, metadata, and privacy:
- Unity Catalog — fine-grained access, row/column security
- Dataplex — data mesh governance on GCP
- Data lineage — column-level lineage design and tooling
- GDPR, data residency, and privacy-by-design principles
- Data stewardship operating models
- Metadata standards — Dublin Core, OpenLineage, OpenMetadata
Certifications:
- Google Cloud Professional Data Engineer or Professional Cloud Architect certification
- Databricks Certified Professional Data Engineer or Databricks Certified Associate Developer for Apache Spark
- TOGAF, Zachman, or equivalent enterprise architecture framework certification.
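To make one of the modelling terms above concrete: a Type 2 slowly changing dimension keeps full history by closing the current row and inserting a new one whenever a tracked attribute changes. Below is a minimal, illustrative sketch in plain Python; the customer dimension layout and field names are hypothetical, and in a real platform this logic would run as a SQL `MERGE` or a Delta Lake merge, not row by row in Python.

```python
from datetime import date

def scd2_merge(dimension: list[dict], incoming: dict, today: date) -> list[dict]:
    """Apply one incoming source record to a hypothetical Type 2
    customer dimension: close the old current row on change, then
    insert a fresh current row."""
    out = []
    changed = False
    found = False
    for row in dimension:
        if row["customer_id"] == incoming["customer_id"] and row["is_current"]:
            found = True
            if row["city"] != incoming["city"]:  # tracked attribute changed
                # Close the old version instead of overwriting it.
                out.append({**row, "valid_to": today, "is_current": False})
                changed = True
                continue
        out.append(row)
    if changed or not found:
        # Insert the new current version (open-ended validity).
        out.append({
            "customer_id": incoming["customer_id"],
            "city": incoming["city"],
            "valid_from": today,
            "valid_to": None,
            "is_current": True,
        })
    return out

# Usage: a customer moves city, so history is preserved as two rows.
dim = [{"customer_id": 1, "city": "Pune",
        "valid_from": date(2020, 1, 1), "valid_to": None, "is_current": True}]
dim = scd2_merge(dim, {"customer_id": 1, "city": "Bengaluru"}, date(2024, 6, 1))
print(len(dim))  # 2: one closed historical row, one current row
```

The same pattern underpins auditable history in Data Vault satellites and in Delta Lake time travel, which is why it appears alongside them in the qualifications above.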
Nice to have qualifications
- Azure Synapse / Microsoft Fabric — cross-cloud architecture experience
- Apache Iceberg or Apache Hudi — open table format evaluation and migration experience
- Event-driven architecture — Apache Kafka, Confluent Cloud, or Pub/Sub advanced patterns
- Generative AI integration patterns — RAG pipelines, vector stores (AlloyDB, Vertex AI Vector Search), LLM-as-data-product
- Exposure to composable commerce data models — product, pricing, inventory, order domains (commercetools, SFCC, SAP Commerce)
- Experience in luxury, retail, or FMCG verticals — multi-brand data architectures, PIM/DAM integration patterns
- OpenMetadata or Collibra — enterprise-grade data catalogue implementation
- Familiarity with Salesforce Marketing Cloud data connectors or Customer Data Platforms (CDPs)
- Google Cloud Professional Machine Learning Engineer
- CDMP (Certified Data Management Professional) — DAMA certification
- HashiCorp Terraform Associate
If you do not meet all the listed qualifications or have gaps in your experience, we still encourage you to apply. At Valtech, we recognize that talent comes in many forms, and we value diverse perspectives and a willingness to learn.
Commitment to reaching all kinds of people
We design experiences that work for all kinds of people - and that starts with our own teams. At Valtech, we’re intentional about building an inclusive culture where everyone feels supported to grow, thrive and achieve their goals. No matter your background, you belong here. Explore our Diversity & Inclusion site to see how we’re creating a more equitable Valtech for all.
The benefits
This is a full-time position based in Bengaluru.
Beyond a competitive compensation package, we offer:
- Flexibility, with remote and hybrid work options (country-dependent)
- Career advancement, with international mobility and professional development programs
- Learning and development, with access to cutting-edge tools, training and industry experts
Our benefits are tailored to each location. Your Talent Partner will provide full details during the hiring process.
Your application process
Once you apply, our Talent Acquisition team will review your application. Your CV should cover key information on relevant experiences and expertise. We do not require information such as age, gender, marital status, or a headshot in your application. We review all candidates based on skills, experience, and potential.
We are committed to inclusion and accessibility. If you need reasonable accommodations during the interview process, please either indicate it in your application or let your Talent Partner know.
About Valtech
Valtech is the experience innovation company that exists to unlock a better way to experience the world. By blending crafts, categories, and cultures, we help brands unlock new value in an increasingly digital world.
At the intersection of data, AI, creativity, and technology, we drive transformation for leading organizations, including L’Oréal, Mars, Audi, P&G, Volkswagen, Dolby, and more.
At Valtech, we don’t just talk about transformation. We make it happen. Our people are the heart of our success, and we foster a workplace where everyone has the support to thrive, grow and innovate.
Are you ready to create what’s next? Join us.