Overview
Mid-Level

Data Engineer / Integrations Specialist - Contract

Tech Holding

Mexico, Remote
Posted March 18, 2026

Job Description

About us:

Working at Tech Holding isn't just a job; it's an opportunity to be part of something bigger. We are a full-service consulting firm founded on the premise of delivering predictable outcomes and high-quality solutions to our clients. Our founders and team members have held senior positions in a wide variety of companies, from emerging startups to large Fortune 50 firms, and we have combined those experiences into a unique approach built on the principles of deep expertise, integrity, transparency, and dependability.

The Role:

We are seeking a Data Engineer to own the data layer of a growing AI-powered platform focused on automating document processing and ERP integrations. This is a hands-on engineering role centered on building reliable, scalable data pipelines and integration systems that move data between AI workflows and customer ERP environments. You will design, build, and maintain ETL processes, data pipelines, and API integrations that ensure data is accurate, consistent, and available across multiple customer environments. The role requires strong ownership, from design through deployment and monitoring, and the ability to work independently in a fast-moving, async-first startup environment.

You will play a critical role in enabling real-time validation and submission of customer data by connecting document intelligence outputs to ERP systems through robust, production-grade integrations.

What You’ll Build:

Short-term (first 60 days):

  • Data sync pipeline for a live customer: crawl ERP products, customers, and pricing data into our
    validation layer
  • ERP connector with REST API integration — auth, retries, timeouts, error handling
  • Durable workflow: validate extracted order data against ERP reference data, submit on approval
  • Data quality checks and monitoring for the sync pipeline
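The posting doesn't include an implementation, but the connector bullet above (auth, retries, timeouts, error handling) maps to a familiar pattern. A minimal sketch of the retry-with-backoff piece, with all names hypothetical, might look like this:

```python
import time


class TransientError(Exception):
    """Raised for retryable failures such as timeouts or 5xx responses."""


def call_with_retries(fn, max_attempts=4, base_delay=0.5):
    """Call fn(); on TransientError, retry with exponential backoff.

    Non-transient exceptions propagate immediately, so permanent
    failures (bad auth, 4xx responses) are not retried.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return fn()
        except TransientError:
            if attempt == max_attempts:
                raise  # retries exhausted; surface the error to the caller
            time.sleep(base_delay * 2 ** (attempt - 1))
```

In practice `fn` would wrap an authenticated HTTP call with a request timeout, and the delay schedule would be tuned to the ERP's rate limits.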

Medium-term:

  • Generalized ERP adapter pattern (we’re building this across multiple ERPs, data models vary
    significantly)
  • Improved validation: confidence scoring, auto-submission rules, exception handling
  • Schema extensions as new customer requirements and ERP platforms surface
  • Data reconciliation tooling across multiple customer environments
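The "generalized ERP adapter pattern" above is not specified further in the posting; one common way to shape it, sketched here with hypothetical names and a hypothetical canonical schema, is a shared interface that each platform-specific adapter implements while mapping its own field names onto a canonical record:

```python
from typing import Protocol

# Hypothetical canonical product shape used by the validation layer.
CanonicalProduct = dict  # e.g. {"sku": ..., "name": ..., "unit_price": ...}


class ERPAdapter(Protocol):
    """Interface every platform-specific adapter implements."""

    def fetch_products(self) -> list[CanonicalProduct]: ...


class ExampleAdapter:
    """Maps one ERP's raw field names onto the canonical schema."""

    def __init__(self, raw_records: list[dict]):
        self._raw = raw_records

    def fetch_products(self) -> list[CanonicalProduct]:
        # Field names here are illustrative, not any real ERP's schema.
        return [
            {
                "sku": r["itemid"],
                "name": r["displayname"],
                "unit_price": float(r["baseprice"]),
            }
            for r in self._raw
        ]
```

Downstream pipeline code then depends only on `ERPAdapter`, so adding a new ERP means writing one new mapping class rather than touching the validation layer.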

Requirements:

  • 5+ years working with Python in a data engineering or backend integration context
  • Hands-on experience building data pipelines and ETL processes: extracting, transforming, and
    loading data between systems
  • Proven experience integrating third-party REST APIs: auth, rate limits, retries, error handling
  • Strong understanding of data quality: validation, deduplication, schema management, error recovery
  • Comfortable owning a data track end-to-end: design → build → ship → monitor
  • Can read API documentation and figure things out independently
  • Strong async Python skills

Nice to have:

  • Experience with durable workflow orchestration (Temporal, Prefect, Celery, Airflow, etc.)
  • Data pipeline frameworks (Dagster, dbt, Airflow, etc.)
  • ERP integration experience, any platform (NetSuite, Epicor, Acumatica, WhereFour, or similar)
  • TypeScript/React at a working level (not the primary need)