Overview
Mid-Level

Software Development Engineer, Applied AI Solutions

Confirmed live in the last 24 hours

Amazon Development Center U.S., Inc.

Mountain View, CA, USA
On-site
Posted April 20, 2026

Job Description

As part of the AWS Applied AI Solutions organization, we have a vision to provide business applications, leveraging Amazon’s unique experience and expertise, that are used by millions of companies worldwide to manage day-to-day operations. We will accomplish this by accelerating our customers’ businesses through delivery of intuitive and differentiated technology solutions that solve enduring business challenges. We blend vision with curiosity and Amazon’s real-world experience to build opinionated, turnkey solutions. Where customers prefer to buy over build, we become their trusted partner with solutions that are no-brainers to buy and easy to use.

The Physical AI team at AWS is developing infrastructure that enables customers to build digital twins, train autonomous systems, and deploy edge intelligence at scale. As these capabilities grow, the experience layer — how customers discover, interact with, and visualize their physical AI workloads — is a key part of making the platform accessible and useful.

As a Software Development Engineer on the Physical AI Experience team, you will own the full customer interaction surface: the console experience where customers manage spatial data and digital twins, the APIs and SDKs that power programmatic workflows, and the 3D visualization layer that renders OpenUSD-based digital twins and spatial data in the browser. You will work across the stack — from React-based console UIs and CloudScape components to WebGL/WebGPU rendering pipelines to RESTful and streaming API design — building the experience that makes petabytes of spatial data and complex digital twin orchestration feel simple and intuitive.

The Physical AI Experience team is a newly formed group within Applied AI Solutions, responsible for every surface customers touch when interacting with the Physical AI platform. We sit at the intersection of frontend engineering, API design, and 3D visualization — a rare combination that demands both breadth and depth. Our team collaborates closely with the Physical AI Platform team (backend services, SDMA, digital twin orchestration), UX design, the partnership team, and directly with lighthouse customers shaping the product. We value craft in developer experience, obsess over making complex spatial workflows feel simple, and believe the experience layer is what turns infrastructure into an adopted product. You'll have opportunities to define the interaction patterns for an entirely new AWS service category, work with 3D web rendering technologies, and see your work used by customers building the autonomous systems of the future.

Key job responsibilities

- Design and implement the AWS console experience for the Physical AI platform, enabling customers to manage spatial data assets, configure digital twin lifecycles, and monitor autonomous operations through intuitive dashboards built with CloudScape design system components.
- Build and maintain the public API surface and SDKs that customers and partners use to programmatically interact with the Spatial Data Management Architecture (SDMA), including REST APIs for CRUD operations, streaming APIs for real-time spatial data, and CLI tooling for developer workflows.
- Develop the 3D visualization layer that renders OpenUSD scenes, NVIDIA RTX-powered digital twins, and multi-modal spatial data (point clouds, 2D overlays, sensor streams) in the browser using WebGL/WebGPU, enabling customers to inspect, annotate, and collaborate on digital twin environments without local software installation.
- Create interactive coverage dashboards and analytics views that surface operational insights from spatial data pipelines — processing volumes, digital twin health, edge device status, and autonomous system performance metrics.
- Define and enforce API design standards, versioning strategies, and backward compatibility practices across the Physical AI platform, ensuring a consistent and composable developer experience as the service surface grows.
- Partner with Physical AI Platform engineers to translate backend capabilities (SDMA, digital twin orchestration, edge-to-cloud communication) into customer-facing experiences, and with UX designers to validate interaction patterns through customer research and usability testing.
- Surface partner solution integrations within the console experience, ensuring customers can discover, activate, and navigate partner capabilities alongside native workflows without context-switching between platforms.

A day in the life

You'll start your morning reviewing console telemetry — page load times, API error rates, and feature adoption metrics — to identify friction points in the customer experience. You'll join a design review for the digital twin lifecycle management console, debating how to represent maturity transitions visually without overwhelming users.