
Sr Data Engineer

Lowe's

Bengaluru
On-site
Posted April 21, 2026

Job Description

Innovate in Bengaluru

This position is based at our on-site office in Bengaluru. Lowe's offers an ultramodern work environment, complete with cutting-edge technology, collaborative workspaces, an on-site gym and clinic, and other perks to enhance your work experience.

About Lowe’s

Lowe’s is a FORTUNE® 100 home improvement company serving approximately 16 million customer transactions a week in the United States. With total fiscal year 2024 sales of more than $83 billion, Lowe’s operates over 1,700 home improvement stores and employs approximately 300,000 associates. Based in Mooresville, N.C., Lowe’s supports the communities it serves through programs focused on creating safe, affordable housing, improving community spaces, helping to develop the next generation of skilled trade experts, and providing disaster relief to communities in need. For more information, visit Lowes.com.

Lowe’s India, the Global Capability Center of Lowe’s Companies Inc., is a hub for driving our technology, business, analytics, and shared services strategy. Based in Bengaluru with over 4,500 associates, it powers innovations across omnichannel retail, AI/ML, enterprise architecture, supply chain, and customer experience. From supporting and launching homegrown solutions to fostering innovation through its Catalyze platform, Lowe’s India plays a pivotal role in transforming home improvement retail while upholding a strong commitment to social impact and sustainability. For more information, visit Lowes India.

About the Team

Lowe's Data Analytics and Computational Intelligence (DACI) organization is seeking a best-in-class Sr Data Engineer with a strong interest in building data services and data pipelines for machine learning applications from scratch for supply chain analytics, preferably with experience leading full-stack Java/Hadoop development teams. DACI’s Supply Chain team drives cost savings and service improvements throughout Lowe’s supply chain using techniques ranging from forecasting to simulation to optimization, applied to problems from receipt of product from the vendor through final-mile fulfillment to the customer and everywhere in between. The team comprises analytic professionals across business analytics, data science, product management, and software engineering, all combining their expertise to stay on the leading edge.

Job Summary:

This role involves executing direction from leadership, delivering results that align with strategic objectives, communicating critical information to other teams, managing vendor relationships, developing processes that align with organizational goals, and applying the specific technical skills required to manage a process.

The primary purpose of this role is to translate business requirements and functional specifications into logical program designs and to deliver modules, stable application systems, and Data or Platform solutions. This includes developing, configuring, or modifying complex integrated business and/or enterprise infrastructure or application solutions within various computing environments. This role facilitates the implementation and maintenance of complex business and enterprise Data or Platform solutions to ensure successful deployment of released applications.

Roles & Responsibilities:

Core Responsibilities:

  • Design and implement pipelines for data ingestion and transformation
  • Manage data pipelines for analytics and operational use
  • Develop and maintain scalable data pipelines and build new API integrations to support continuing growth in data volume and complexity
  • Ensure data accuracy and integrity across multiple sources and systems
  • Collaborate with data scientists to support data science algorithms and with data analysts to support analytics
  • Partner with the product team to help inform priorities within a set of software products, applications, and/or services
  • Implement software and methodologies for data correction, reconciliation, and quality checking
  • Work closely with data science, data analyst, and product teams to drive insights and innovation
  • Work with the platform team to resolve infrastructure-related concerns
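The reconciliation and quality-checking duties above can be sketched as a minimal source-to-target comparison. This is an illustrative sketch only, not Lowe's tooling: the `reconcile` helper and the table and column names are invented for the example.

```python
import sqlite3

def reconcile(conn, source_table, target_table, key_column):
    """Compare row counts and keys between two tables.

    Returns a report dict; an empty missing_keys list means the target
    received every source key. (Hypothetical helper -- a real pipeline
    would also compare per-column checksums and write to an audit table.)
    """
    cur = conn.cursor()
    counts = {}
    for table in (source_table, target_table):
        cur.execute(f"SELECT COUNT(*) FROM {table}")
        counts[table] = cur.fetchone()[0]
    # Keys present in the source but absent from the target
    cur.execute(
        f"SELECT {key_column} FROM {source_table} "
        f"EXCEPT SELECT {key_column} FROM {target_table}"
    )
    missing = [row[0] for row in cur.fetchall()]
    return {"counts": counts, "missing_keys": missing}

# Tiny demonstration with an in-memory database
conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    CREATE TABLE src (id INTEGER, qty INTEGER);
    CREATE TABLE tgt (id INTEGER, qty INTEGER);
    INSERT INTO src VALUES (1, 10), (2, 20), (3, 30);
    INSERT INTO tgt VALUES (1, 10), (2, 20);
    """
)
report = reconcile(conn, "src", "tgt", "id")
```

In practice the same pattern scales up: run the count and key-difference queries on the warehouse engine and alert when the report is non-empty.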

Years of Experience:

  • 5-8 years of experience in Data or Platform Engineering, or Data Warehousing
  • 4 years of experience working on projects involving the implementation of solutions applying development life cycles (SDLC) end to end

Education Qualification & Certifications (optional)

Required Minimum Qualifications:


  • Bachelor's Degree in Engineering, Computer Science, CIS, or a related field (or equivalent work experience in a related field)

Skill Set Required

Primary Skills (must have)

  • 5 years of experience in systems analysis, including defining technical requirements and performing high-level design for complex data engineering solutions
  • 3 years of experience in Hadoop or any cloud big data components
  • Expertise in Java/Scala/Python, SQL, scripting, Hadoop (Sqoop, Hive, Pig, MapReduce), Spark (Spark Streaming, MLlib), Airflow, Kafka, or equivalent cloud big data components
  • Expertise in scripting, any RDBMS, Hadoop (OLAP on Hadoop), and dashboard development

Technical Skills

  • Programming: Python, Java, Scala (Python most common)
  • SQL: Advanced querying, joins, window functions, performance tuning
  • Data Structures & Algorithms: Basics for efficient data handling
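The advanced SQL listed above, window functions in particular, can be illustrated with a running total per store. A sketch using Python's built-in sqlite3 module; the `sales` table and its columns are invented for the example.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    CREATE TABLE sales (store TEXT, day INTEGER, amount INTEGER);
    INSERT INTO sales VALUES
      ('A', 1, 100), ('A', 2, 50), ('B', 1, 70), ('B', 2, 30);
    """
)
# SUM(...) OVER (PARTITION BY ... ORDER BY ...) computes a per-store
# cumulative total without collapsing the detail rows.
rows = conn.execute(
    """
    SELECT store, day, amount,
           SUM(amount) OVER (
               PARTITION BY store ORDER BY day
           ) AS running_total
    FROM sales
    ORDER BY store, day
    """
).fetchall()
```

The same query shape carries over to Hive, BigQuery, or Spark SQL, where performance tuning then centers on partitioning and shuffle behavior.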

  

Data Engineering & Processing

  • ETL/ELT Pipelines: Designing and building data pipelines
  • Big Data Tools: Apache Spark, Hadoop, Hive
  • Batch & Streaming: Kafka, Apache Flink, Spark Streaming
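The extract-transform-load flow named above can be sketched with plain Python generators. Illustrative only: a production pipeline would run on Spark or Airflow and route bad rows to a quarantine table, and the CSV schema here is invented.

```python
import csv
import io

def extract(raw_csv):
    """Extract: parse CSV text into an iterator of dict rows."""
    return csv.DictReader(io.StringIO(raw_csv))

def transform(rows):
    """Transform: cast types, skip malformed rows, derive a field."""
    for row in rows:
        try:
            qty = int(row["qty"])
            price = float(row["price"])
        except (KeyError, ValueError):
            continue  # a real pipeline would quarantine, not drop silently
        yield {"sku": row["sku"], "revenue": qty * price}

def load(records, sink):
    """Load: append records to the destination (a list stands in for a table)."""
    sink.extend(records)

raw = "sku,qty,price\nA1,2,9.99\nB2,bad,1.00\nC3,1,5.00\n"
sink = []
load(transform(extract(raw)), sink)
```

Because each stage is a generator, rows stream through one at a time, which is the same shape a Spark or Flink job gives you at scale.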

Cloud Platforms

  • AWS (S3, Redshift, Glue, Lambda)
  • Azure (Data Factory, Synapse)
  • Google Cloud (BigQuery, Dataflow) - Mandatory

Data Lakes & Lakehouse concepts

DevOps & Tools

  • Git (version control)
  • CI/CD pipelines
  • Docker, Kubernetes (nice to have)

Optional / Advanced Skills

  • Machine Learning pipeline support
  • DataOps practices
  • Real-time analytics systems
  • API integration


Lowe's is an equal opportunity employer and administers all personnel practices without regard to race, color, religious creed, sex, gender, age, ancestry, national origin, mental or physical disability or medical condition, sexual orientation, gender identity or expression, marital status, military or veteran status, genetic information, or any other category protected under federal, state, or local law.
