Overview
Mid-Level

Innovation and Automation Specialist

Accenture Federal Services

Compensation

$93,400 - $176,200/year

Arlington, VA
On-site
Posted April 14, 2026

Job Description

At Accenture Federal Services, nothing matters more than helping the US federal government make the nation stronger and safer, and life better for its people. Our 13,000+ people are united in a shared purpose: to pursue the limitless potential of technology and ingenuity for clients across defense, national security, public safety, civilian, and military health organizations.
 
Join Accenture Federal Services, a technology company within global Accenture. Recognized as a Glassdoor Top 100 Best Place to Work, we offer a collaborative and caring community where you feel like you belong and are empowered to grow, learn, and thrive through hands-on experience, certifications, industry training, and more.
 
Join us to drive positive, lasting change that moves missions and the government forward!
 

As a Data Engineer Specialist on the Innovation and Automation team, you will serve as a subject matter expert, blending deep data engineering expertise with a passion for automation. You will not build individual data pipelines for business users; instead, you will build the factory that produces them. Your mission is to design, develop, and implement the reusable frameworks, automated patterns, and core tooling that our data engineering teams will use to build their own pipelines faster, more reliably, and more consistently. This is a highly technical, hands-on role for a problem-solver who wants to act as a force multiplier for the entire data organization.

Responsibilities:

  • Act as a technical expert on the design and implementation of automated data engineering solutions
  • Develop and maintain a library of standardized, reusable ETL/ELT pipeline templates using Python, SQL, and frameworks like Databricks or Snowflake
  • Engineer and implement robust, automated data quality and testing frameworks (e.g., using tools like Great Expectations) that are embedded within the core pipeline templates
  • Contribute to the development of Infrastructure-as-Code (IaC) modules (Terraform) for the automated provisioning of data infrastructure
  • Enhance and optimize the CI/CD for Data (DataOps) pipelines, ensuring seamless and reliable deployment of data workflows
  • Serve as an escalation point for the most complex data engineering and automation challenges, providing expert-level troubleshooting and guidance to other engineers
  • Mentor other data engineers on automation best practices, code standards, and the use of the frameworks you build
  • Research and prototype cutting-edge data engineering and automation technologies to drive continuous improvement
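To make the "pipeline factory" idea concrete, here is a minimal, hypothetical sketch of what a reusable pipeline template with embedded data-quality checks might look like. The class and function names are illustrative assumptions, not Accenture's actual framework; in practice the checks would be backed by a tool such as Great Expectations rather than plain callables.

```python
from dataclasses import dataclass, field
from typing import Callable

# Hypothetical pipeline template: teams plug in extract/transform/load steps,
# while data-quality checks are embedded in the template itself, so every
# pipeline built from it gets validation for free.
@dataclass
class PipelineTemplate:
    extract: Callable[[], list[dict]]
    transform: Callable[[list[dict]], list[dict]]
    load: Callable[[list[dict]], None]
    checks: list[Callable[[list[dict]], bool]] = field(default_factory=list)

    def run(self) -> list[dict]:
        rows = self.transform(self.extract())
        # Run every embedded data-quality check before loading.
        for check in self.checks:
            if not check(rows):
                raise ValueError(f"data quality check failed: {check.__name__}")
        self.load(rows)
        return rows

# Example: a pipeline instance built from the template with a not-null check.
def no_null_ids(rows: list[dict]) -> bool:
    return all(r.get("id") is not None for r in rows)

sink: list[dict] = []
pipeline = PipelineTemplate(
    extract=lambda: [{"id": 1, "val": "a"}, {"id": 2, "val": "b"}],
    transform=lambda rows: [{**r, "val": r["val"].upper()} for r in rows],
    load=sink.extend,
    checks=[no_null_ids],
)
pipeline.run()
```

The point of the pattern is that downstream engineers only supply the three step callables; quality gates, logging, and deployment wiring live once in the shared template.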

Must have: 

  • 5+ years of hands-on experience in data engineering
  • Expert-level programming skills in Python and advanced SQL
  • Proven, in-depth experience building and optimizing data pipelines in a cloud environment (AWS, Azure) on platforms like Databricks or Snowflake
  • Strong, hands-on experience with Infrastructure-as-Code (IaC) using Terraform
  • Demonstrable experience with CI/CD principles and tools (e.g., GitLab CI, Jenkins, GitHub Actions) applied to data workflows
  • Deep understanding of modern data architecture, data modeling, and software engineering best practices

Nice to have:

  • Experience in a DevOps or Site Reliability Engineering (SRE) role
  • Direct experience developing and operationalizing a "pipeline factory" or similar framework
  • Familiarity with data orchestration tools (e.g., Airflow) and containerization (Docker, Kubernetes)
  • Experience working in a high-security DoD or Intelligence Community environment
  • Proven ability to diagnose and resolve complex performance, data quality, and system-level issues

Security Clearance:

  • Active TS or TS/SCI clearance
