Data Architect Delivery (RapidScale)
Cox Enterprises
Job Description
Company
Cox Communications, Inc.
Compensation
Compensation includes a base salary in the range of $163,400.00 - $272,300.00. The base salary may vary within the anticipated base pay range based on factors such as the ultimate location of the position and the selected candidate’s knowledge, skills, and abilities. Position may be eligible for additional compensation that may include an incentive program.
At RapidScale, exceptional technology is powered by exceptional people. We deliver secure, reliable managed and advisory services across private, public, and hybrid clouds, helping organizations innovate, adapt, and grow. As an Elite Broadcom VMware VCSP Partner and a top partner with AWS, Azure, and Google, our solutions focus on business outcomes, with embedded cyber resiliency and AI to protect today and enable tomorrow, backed by the strength of the Cox family of companies.
RapidScale’s Professional Services practice is focused on helping enterprise and SMB clients modernize their data infrastructure, operationalize machine learning, and lay the foundations for AI-enabled operations. In the Data Architect Delivery role, you will own the end-to-end execution of Data and ML platform engagements — from architecture and implementation through client handoff — while carrying technical accountability for the quality and performance of every solution you deliver.
You will also contribute to presales as a technical authority, but that is a support function here, not the primary one. The right candidate is energized by leading delivery teams, solving hard data engineering problems in client environments, and building long-term client trust through consistent execution.
As a Data Architect, you will perform the following:
- Delivery Leadership (~50%)
- Own end-to-end delivery of data platform engagements: managing scope, timeline, budget, team coordination, and client satisfaction from kickoff through handoff
- Lead cross-functional delivery teams that may include project managers, business analysts, data engineers, developers, and change management professionals — ensuring the right skills are in place at each stage
- Manage engagement health proactively: identify delivery risks early, escalate appropriately, and maintain clear communication with client stakeholders throughout
- Serve as the primary client relationship owner during delivery, building trust through reliable execution and transparent progress reporting
- Conduct technical requirements gathering and capability assessments to establish a sound foundation for each engagement
- Translate complex technical architectures into clear business outcomes for non-technical client stakeholders
- Technical Architecture (~45%)
- Design and implement modern cloud data architectures across AWS, Azure, and Google Cloud — including data lakes, lakehouses, data warehouses, and real-time streaming platforms
- Lead migration of legacy on-premises data systems to cloud-native architectures, ensuring performance, scalability, and cost-efficiency
- Build and oversee ETL and data pipelines using cloud-native automation and orchestration tools
- Design and implement machine learning infrastructure on cloud platforms — including feature stores, model training pipelines, experiment tracking, and model serving and monitoring in production
- Architect data foundations that support downstream ML, AI, and agentic workflows, including intelligent document processing, knowledge retrieval, and structured/unstructured data integration
- Establish and enforce data quality, governance, and observability standards across client environments, including ML model performance monitoring and drift detection
- Evaluate and recommend appropriate tooling across the modern data and ML stack based on client context, capability, and long-term roadmap
- Presales Support (~5%)
- Partner with sales teams as a technical authority during discovery calls, client workshops, and solution presentations
- Develop proposals including solution architecture, scope of work, resource requirements, and budgetary estimates
- Build conceptual architectures and executive-level presentations that articulate solution value clearly
Qualifications
Minimum Requirements
- Bachelor’s degree in a related discipline and 8 years’ experience in a related field OR a Master’s degree and 6 years’ experience OR a Ph.D. and 3 years of experience OR 12 years’ experience in a related field
- 1+ years of hands-on experience building and deploying machine learning solutions in a cloud environment — including model training pipelines, feature engineering, and production model serving using platforms such as Google Vertex AI, Amazon SageMaker, or Azure Machine Learning
- 3+ years building and maintaining ETL and data pipelines using cloud-native automation and orchestration tools
- Hands-on experience with cloud data warehousing and lakehouse platforms (e.g., Google BigQuery, Amazon Redshift, Azure Synapse Analytics, Microsoft Fabric, Databricks, Snowflake)
- Proficiency in Python for data engineering tasks; working knowledge of at least one additional language relevant to the data stack (e.g., SQL at scale, Scala, Java)
- Strong communication skills — able to present technical architectures to executive audiences and document solutions clearly for both technical and non-technical stakeholders
Preferred Qualifications
- Experience architecting data solutions specifically designed to support AI and agentic workflows — including intelligent document processing, RAG pipelines, knowledge retrieval, and structured/unstructured data integration for LLM consumption
- Experience applying ML to analytics use cases such as forecasting, anomaly detection, customer segmentation, or recommendation systems in an enterprise context
- Familiarity with prompt engineering or context engineering practices when integrating LLMs into data and ML workflows
- Experience using AI-assisted development tools (e.g., GitHub Copilot, Claude Code, Amazon Kiro, Google Gemini) in a professional engineering context
- Professional certifications from GCP, AWS, and/or Azure (data or solutions architect tracks)