
Senior Data Engineer


Baker Tilly

IND KA Bangalore - Cherry Hills
On-site
Posted April 29, 2026

Job Description

Overview

BTVK Advisory is a leading advisory firm whose specialized professionals guide clients through an ever-changing business world, helping them win now and anticipate tomorrow. BTVK Advisory, and its affiliated entities, have operations in North America, South America, Europe, Asia, and Australia. BTVK Advisory’s ultimate parent entity, Baker Tilly US, LLP, is an independent member of Baker Tilly International, a worldwide network of independent accounting and business advisory firms in 141 territories, with 43,000 professionals and a combined worldwide revenue of $5.2 billion.

Baker Tilly is an equal opportunity/affirmative action employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability or protected veteran status, gender identity, sexual orientation, or any other legally protected basis, in accordance with applicable federal, state or local law.


Any unsolicited resumes submitted through our website or to Baker Tilly Advisory Group, LP, employee e-mail accounts are considered property of Baker Tilly Advisory Group, LP, and are not subject to payment of agency fees. To be an authorized recruitment agency ("search firm") for Baker Tilly Advisory Group, LP, there must be a formal written agreement in place, and the agency must be invited by Baker Tilly's Talent Attraction team to submit candidates for review via our applicant tracking system.


Responsibilities

  • Design, build, and optimize modern data solutions for our mid-market and enterprise clients.
  • Transform raw data into trusted, analytics-ready assets that power dashboards, advanced analytics, and AI use cases, working primarily inside the Microsoft stack (Azure, Synapse, and Microsoft Fabric).
  • Develop scalable, well-documented ETL/ELT pipelines using T-SQL, Python, Azure Data Factory/Fabric Data Pipelines, and Databricks; implement best-practice patterns for performance, security, and cost control.
  • Design relational and lakehouse models; create Fabric OneLake shortcuts, medallion-style layers, and dimensional/semantic models for Power BI (a minimal sketch of this pattern follows this list).
  • Build automated data-quality checks, lineage, and observability metrics; contribute to CI/CD workflows in Azure DevOps or GitHub.
  • Gather requirements, demo iterative deliverables, document technical designs, and translate complex concepts for non-technical audiences.
  • Research new Fabric capabilities, share findings in internal communities of practice, and contribute to reusable accelerators.
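
As a concrete illustration of the medallion-style modeling and pipeline work described above, the sketch below shows a minimal bronze-to-silver transformation in PySpark. It is a sketch only: the lakehouse paths, table name (sales_orders), and columns (order_id, order_date, amount) are hypothetical examples, not details of any actual client engagement.

    # Minimal medallion-style bronze -> silver transform (PySpark).
    # All paths, table names, and columns are hypothetical examples.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("bronze_to_silver").getOrCreate()

    # Read raw landed data from the (hypothetical) bronze layer.
    bronze = spark.read.format("delta").load("/lakehouse/bronze/sales_orders")

    # Cleanse and conform: typed columns, deduplication, basic null handling.
    silver = (
        bronze
        .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
        .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
        .dropDuplicates(["order_id"])
        .filter(F.col("order_id").isNotNull())
    )

    # Write the cleansed, analytics-ready table to the silver layer.
    (silver.write.format("delta")
           .mode("overwrite")
           .save("/lakehouse/silver/sales_orders"))

The same bronze/silver/gold layering maps directly onto Fabric Lakehouses, where OneLake shortcuts can expose the silver and gold tables to Power BI semantic models without copying data.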

Qualifications

  • Bachelor’s degree required.
  • Minimum of 3 years delivering production data solutions, preferably in a consulting or client-facing role.
  • Strong SQL for data transformation and performance tuning.
  • Python for data wrangling, orchestration, or notebook-based development (PySpark), and hands-on ETL/ELT experience with at least one Microsoft service (ADF, Synapse Pipelines, Fabric Data Pipelines).
  • Solid grasp of Azure fundamentals: storage, networking, security, and cost management.
  • Project experience with Microsoft Fabric (OneLake, Lakehouses, Data Pipelines, Notebooks, Warehouse, Power BI DirectLake).
  • Familiarity with Databricks, Delta Lake, or comparable lakehouse technologies, and exposure to DevOps (YAML pipelines, Terraform/Bicep) and test automation frameworks.
  • Experience writing unit tests for data pipelines and transformation logic (see the sketch after this list), working with metadata frameworks for data governance and lineage, and integrating SaaS/ERP sources (e.g., Dynamics 365, Workday, Costpoint, SAP, NetSuite, Sage Intacct, IFS).
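
For the unit-testing qualification above, a minimal sketch of the pattern follows: transformation logic is factored into a plain Python function so it can be exercised with pytest, without standing up a live pipeline. The function name and cleansing rules are hypothetical examples.

    # Hypothetical transformation helper plus pytest-style unit tests.
    # Factoring logic out of the pipeline makes it cheap to test in isolation.

    def normalize_amount(raw: str) -> float:
        """Strip currency symbols and thousands separators; return a float."""
        cleaned = raw.replace("$", "").replace(",", "").strip()
        return round(float(cleaned), 2)

    def test_normalize_amount_strips_symbols():
        assert normalize_amount("$1,234.50") == 1234.50

    def test_normalize_amount_handles_whitespace():
        assert normalize_amount("  99.9 ") == 99.90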