Engineering Deep Dives  ·  Hardware Intelligence
Hardware Teardowns & Actuator Tech
Inside the machines — the move from hydraulic to electric, harmonic drive supply chains, sensor stacks, and the engineering decisions separating the leaders from the rest.
12 Teardowns · 8 Robots Covered · 4 Tech Deep Dives
⚙ Actuators · 👁 Sensors · 🔋 Power & Thermals · 🏗 Structure & Materials · 🧠 Compute
Boston Dynamics' Atlas ran on hydraulic actuators for nearly a decade — producing extraordinary athletic performance but requiring a tethered hydraulic power unit, complex sealing, and operating temperatures that made outdoor deployment nearly impossible. The April 2024 retirement of the hydraulic Atlas and unveiling of its fully electric successor marked the industry's clearest signal: hydraulics are over for commercial humanoids.

The electric transition centers on three actuator archetypes now competing for design wins: quasi-direct-drive (QDD) motors with high-torque low-gear-ratio setups, series elastic actuators (SEA) that add compliance springs for safe human interaction, and harmonic drive gearboxes that achieve high reduction ratios in compact packages. Each involves fundamental tradeoffs in backdrivability, impact resistance, power density, and cost.
Quasi-Direct Drive · Series Elastic · Harmonic Drive · Backdrivability · Electric
Hydraulic (Legacy)
Pros: exceptional peak force density, proven in Atlas Parkour. Cons: requires HPU, leaks, thermal limits, loud, ~$80K/unit maintenance burden.
QDD Electric
Research examples: MIT Mini Cheetah. High torque at low reduction (≤10:1). Naturally backdrivable — safer for human contact. Limited peak torque vs. geared alternatives. Specific OEM adoption in commercial humanoids is not broadly confirmed publicly.
Series Elastic (SEA)
Confirmed: Agility Digit uses a spring-based leg design derived from their Cassie research platform. Figure 03 uses integrated series-elastic actuators per official spec sheet. Spring element absorbs impacts, enables torque sensing. Reduces bandwidth vs. stiff actuators.
Harmonic Drive
50–160:1 reduction ratio in compact package. Zero backlash. Fragile under shock loads — not recommended for leg joints. Widely used in arm and wrist joints across the industry, though most OEMs do not disclose their specific gearbox suppliers publicly.
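The backdrivability tradeoff above is mostly a gear-ratio effect: the rotor inertia felt at the joint output scales with the square of the reduction. A minimal sketch with illustrative numbers, not any OEM's published figures:

```python
# Reflected inertia grows with the square of the gear ratio, which is why
# low-ratio QDD joints stay backdrivable and high-ratio harmonic-drive
# joints do not. All numbers are illustrative.

def reflected_inertia(rotor_inertia_kgm2: float, gear_ratio: float) -> float:
    """Rotor inertia seen at the joint output: J_out = J_rotor * N^2."""
    return rotor_inertia_kgm2 * gear_ratio ** 2

J_ROTOR = 1e-4  # kg*m^2, a plausible large BLDC rotor

qdd = reflected_inertia(J_ROTOR, 8)         # QDD, <=10:1 reduction
harmonic = reflected_inertia(J_ROTOR, 100)  # harmonic drive, 50-160:1

print(f"QDD (8:1):        {qdd:.4f} kg*m^2")
print(f"Harmonic (100:1): {harmonic:.4f} kg*m^2")
print(f"Ratio: {harmonic / qdd:.0f}x")  # (100/8)^2 = 156x stiffer to backdrive
```

The same quadratic scaling explains why harmonic drives are fragile under shock: impact energy at the output is amplified into the gear train rather than absorbed by a backdrivable rotor.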
🔩
Strain wave gearing — commonly called "harmonic drives" after the original trademark — is the dominant gearbox technology in humanoid arm and wrist joints. Two affiliated groups control the majority of global supply: Harmonic Drive Systems (Japan, with Harmonic Drive SE in Germany) and Nabtesco Corporation (Japan). This creates a concentrated supply-chain risk that every Western humanoid OEM is aware of but few have solved.

The SHD/SHF series from Harmonic Drive achieves zero-backlash, high-reduction-ratio gearing in a package small enough to fit inside a robot's forearm. A typical humanoid uses 12–20 harmonic drives — at $300–$1,200 each depending on size, this is one of the largest single BOM line items. Unitree's cost advantage partly comes from vertically integrating their own cycloidal gearboxes for leg joints and sourcing harmonic drives at scale through volume agreements unavailable to smaller competitors.
Strain Wave Gearing · Supply Chain Risk · BOM Analysis · Nabtesco · Harmonic Drive AG
Typical Count
Industry estimates suggest a full humanoid may use 12–20 harmonic drives across arm and wrist joints. OEMs do not publicly disclose exact counts. The more DOF a robot has, the higher this count.
Unit Cost (Est.)
Industry pricing estimates range from $300–$1,200+ per unit depending on output torque rating and frame size. These are commonly cited estimates — verified pricing requires direct supplier quotation. Actual OEM costs depend heavily on volume agreements.
Lead Time
Japanese precision gearbox suppliers typically quote 12–26 week lead times. Supply constraints were widely reported across the robotics industry during the 2023–2024 humanoid investment surge.
Alternatives
Cycloidal drives (Nabtesco RV series) for high-shock leg joints. Planetary gearboxes for cost-sensitive applications. Several OEMs including Tesla and Figure AI have disclosed plans to vertically integrate actuator components to reduce third-party dependency.
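Using the estimate ranges above (12–20 units at $300–$1,200 each), a quick sketch of the harmonic-drive BOM line item per robot:

```python
# Back-of-envelope range for the harmonic-drive line item in a humanoid BOM,
# using the industry-estimate figures quoted above. Actual OEM costs depend
# heavily on volume agreements and are not publicly verified.

def bom_range(min_units: int, max_units: int,
              min_cost: int, max_cost: int) -> tuple:
    """Best-case and worst-case spend on one component class."""
    return min_units * min_cost, max_units * max_cost

low, high = bom_range(12, 20, 300, 1200)
print(f"Harmonic drive BOM line item: ${low:,} - ${high:,}")  # $3,600 - $24,000
```

Even the low end of this range exceeds the entire retail price of some quadruped platforms, which is why vertical integration of gearboxes shows up repeatedly as a cost strategy.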
🔬
Figure 03 is Figure AI's third-generation humanoid, announced in October 2025. Figure has disclosed a limited set of hardware specifications publicly — the details below are drawn from official Figure announcements and verified third-party spec sheets. Figure vertically integrates their own actuators, batteries, sensors, structures, and electronics, per their own manufacturing announcement. Specific joint torques, gearbox suppliers, and internal control architecture have not been officially disclosed.

The robot weighs 61 kg — 9% lighter than Figure 02 — and uses integrated series-elastic actuators per the official specification. The hands include custom tactile fingertip sensors capable of detecting loads as light as 3 grams-force. The camera architecture delivers twice the frame rate and one-quarter the latency of Figure 02, with a 60% wider field of view per camera.
Figure 03 · Series-Elastic Actuators · 30 DOF · 61 kg / 20 kg Payload · 6-Camera Vision System
Total DOF
30 degrees of freedom per official specification. Breakdown by joint group is not publicly disclosed.
Actuator Type
Integrated Series-Elastic per official spec sheet. Figure has confirmed full vertical integration of actuator manufacturing. Specific gearbox type and torque figures are not publicly disclosed.
Tactile Sensing
Custom tactile fingertip sensors in each hand, detecting loads as light as 3 grams-force. Palm cameras provide real-time visual feedback for grasping in occluded spaces.
Vision System
6-camera array. 2× frame rate, ¼ latency, 60% wider FOV vs Figure 02. Expanded depth of field. Designed for high-frequency visuomotor control via Helix VLA.
Battery / Runtime
~5 hours runtime. 2 kW wireless inductive charging via feet. 10 Gbps mmWave data offload. Achieved UN38.3 battery safety certification.
Weight / Payload
61 kg total weight. 20 kg payload capacity. Speed: 2.4 m/s. Height: 173 cm.
👁
The humanoid robotics industry has split into two camps on perception — and the choice reflects deep assumptions about what AI can and cannot do reliably. LiDAR-equipped robots get precise 3D geometry data at 10–100m range regardless of lighting, but add weight, power draw, cost ($800–$3,000/sensor), and mechanical complexity. Vision-only systems bet that neural networks trained on massive datasets can infer depth, geometry, and object identity from stereo or monocular cameras alone — cheaper, lighter, but brittle in unusual lighting or novel environments.

Tesla's bet on vision-only for Optimus directly mirrors their FSD strategy: scale data, scale parameters, replace sensors with intelligence. Agility, Boston Dynamics, and most others carry at least one depth sensor — whether active stereo (Intel RealSense), structured light, or LiDAR — acknowledging that current vision models still fail in edge cases that geometric sensing handles trivially.
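Whichever camp an OEM is in, depth from a calibrated stereo pair reduces to the same pinhole geometry, Z = f·B/d. A sketch with a hypothetical head-mounted rig (600 px focal length, 8 cm baseline; illustrative numbers only):

```python
# Pinhole stereo depth: Z = f * B / d. Because disparity shrinks as 1/Z,
# depth error grows roughly quadratically with range -- one reason long-range
# geometric sensing still favors LiDAR over short-baseline stereo.

def stereo_depth_m(focal_px: float, baseline_m: float,
                   disparity_px: float) -> float:
    """Depth of a point from its disparity between two rectified cameras."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Hypothetical head-mounted pair: 600 px focal length, 8 cm baseline.
for d in (48.0, 12.0, 3.0):
    z = stereo_depth_m(600.0, 0.08, d)
    print(f"disparity {d:5.1f} px -> depth {z:5.2f} m")
```

At 16 m the example rig has only 3 px of disparity, so a single pixel of matching error shifts the depth estimate by several meters; vision-only systems lean on learned priors precisely to paper over this geometric limit.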
Vision-Only (Tesla) · Active Stereo (Agility) · LiDAR (Boston Dynamics) · Depth Estimation · Multi-Camera (Figure)
Tesla Optimus
Vision-only. ~5 cameras (head stereo pair + peripheral). No LiDAR, no structured light. Depth estimated by neural network. Advantage: no moving parts, lowest BOM. Risk: failure in reflective/dark environments.
Agility Digit
LiDAR + Intel RealSense depth cameras + MEMS IMU. Confirmed by multiple official sources: Digit uses a neck-mounted LiDAR for navigation and 4× Intel RealSense depth cameras for manipulation and environment sensing. Force sensors in arms for compliant manipulation.
Boston Dynamics Atlas
LiDAR + cameras (unconfirmed specific models). Boston Dynamics has not published Atlas Gen 2 sensor specifications. Previous HD Atlas used LiDAR and stereo cameras. The electric Atlas is confirmed to use RGB-D cameras and IMU; specific sensor models not disclosed.
Figure 03
6-camera vision system (confirmed). Figure has confirmed a multi-camera array with palm cameras and tactile fingertip sensors. Specific camera model names and any supplemental depth sensors are not publicly disclosed by Figure AI.
Unitree G1
3D LiDAR (head-mounted) for navigation plus a depth camera for manipulation. Mid-range approach balancing cost and reliability.
🎯
Camera-based manipulation works for rigid, predictable objects. It fails the moment an object is soft, deformable, reflective, or wet — because cameras cannot sense the contact forces, slip onset, and surface texture that human hands detect instinctively. The absence of tactile sensing is the single largest gap between current humanoid hands and human hands for manipulation tasks.

Current approaches to tactile sensing range from strain gauges at each fingertip (used by Figure, Sanctuary AI) to optical tactile sensors like the GelSight family (MIT-derived, used in research) that embed a camera inside a transparent gel finger pad to image surface deformation. Contactile's PapillArray — 3D force-vector arrays across the finger surface — is reported to be entering commercial humanoid trials in 2026, though deployment status is not publicly confirmed.
Strain Gauge · GelSight Optical · PapillArray (Contactile) · Slip Detection · Force Control
Strain Gauge
Thin-film or foil strain gauges embedded in fingertip pads measure normal contact force. Well-established technology, inexpensive per element. Limitation: typically measures force magnitude only — directional slip detection requires additional sensors or inference. Figure 03 confirms tactile fingertip sensing detecting forces from 3 grams; implementation details not disclosed.
GelSight (Optical)
MIT-originated technology: a camera images surface deformation of a transparent elastomer gel fingertip. Provides a 2D contact geometry map including shear direction and texture. Currently used primarily in academic research settings due to form-factor constraints at the finger scale. Pricing and commercial availability vary; no major humanoid OEM has publicly disclosed GelSight deployment.
PapillArray (Contactile)
Array of deformable pillars across the finger surface measuring 3D force vectors per pillar — enables slip onset detection before visible object motion. Contactile is an Australian startup commercializing this technology. Commercial humanoid deployment status as of early 2026 is not confirmed from public sources.
Fourier GR-3
Fourier Intelligence has described a distributed pressure sensor array across the GR-3 body for whole-body contact awareness. The "31-sensor" figure has been reported in coverage of the robot's ICRA 2026 paper submission. Full technical specifications have not been published.
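As a rough illustration of the strain-gauge approach described above, a quarter-bridge readout maps bridge voltage to strain, then to contact force via a calibration constant. All numbers here (gauge factor, excitation, calibration slope) are hypothetical, chosen so the output lands near the 3 gram-force sensitivity quoted for Figure 03; real fingertips are calibrated against known loads.

```python
# Quarter-bridge strain-gauge readout sketch. Small-signal approximation:
# Vout ≈ Vex * GF * eps / 4, so eps = 4 * Vout / (Vex * GF).
# The force calibration constant is a stand-in for a per-sensor calibration.

def bridge_strain(v_out: float, v_excitation: float,
                  gauge_factor: float) -> float:
    """Recover strain from a quarter-bridge output voltage."""
    return 4.0 * v_out / (v_excitation * gauge_factor)

def strain_to_force_n(strain: float, n_per_unit_strain: float) -> float:
    """Linear calibration from strain to contact force (hypothetical slope)."""
    return strain * n_per_unit_strain

eps = bridge_strain(v_out=2.5e-3, v_excitation=5.0, gauge_factor=2.0)  # 1e-3
force_n = strain_to_force_n(eps, n_per_unit_strain=29.4)               # 0.0294 N
print(f"strain={eps:.1e}, force={force_n*1000:.1f} mN "
      f"(~{force_n / 0.00981:.1f} gram-force)")
```

Note that this recovers only a force magnitude, which is exactly the limitation flagged above: direction and slip onset need additional sensing or inference.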
🧠
Every humanoid robot must run perception, state estimation, motion planning, and control — simultaneously, at millisecond latencies, on battery power. The compute stack is often the most power-hungry non-motor system on the robot. NVIDIA Jetson AGX Orin (275 TOPS, 15–60W) has become the de-facto standard for prototype and first-generation commercial humanoids, but is increasingly recognized as insufficient for next-generation vision-language-action (VLA) models.

The compute hierarchy in a modern humanoid typically has three layers: a high-level reasoning node (Jetson Orin or equivalent) running perception and AI inference at 20–100Hz, a mid-level motion controller (ARM Cortex-A or FPGA) running whole-body control at 500–1000Hz, and distributed joint controllers (STM32 MCUs) running individual joint FOC at 20–40kHz.
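The three-layer hierarchy can be pictured as nested rate dividers off a common base tick. Simulating one second at the representative rates from the text:

```python
# The three compute layers as integer subdivisions of one 20 kHz base tick:
# joint FOC runs every tick, whole-body control every 20th tick, and
# perception/AI inference every 400th tick. Rates are the representative
# figures from the text, not any specific OEM's architecture.

BASE_HZ = 20_000   # joint-level FOC loop
WBC_HZ = 1_000     # whole-body motion control
AI_HZ = 50         # high-level perception / AI inference

counts = {"foc": 0, "wbc": 0, "ai": 0}
for tick in range(BASE_HZ):                  # simulate one second
    counts["foc"] += 1
    if tick % (BASE_HZ // WBC_HZ) == 0:      # every 20th tick
        counts["wbc"] += 1
    if tick % (BASE_HZ // AI_HZ) == 0:       # every 400th tick
        counts["ai"] += 1

print(counts)  # {'foc': 20000, 'wbc': 1000, 'ai': 50}
```

In a real robot the layers run on separate processors linked by a fieldbus, but the divider structure is the same: each layer consumes the latest setpoint from the slower layer above it.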
NVIDIA Jetson AGX Orin · 275 TOPS · EtherCAT Bus · Three-Layer Hierarchy · STM32 Joint FOC
High-Level AI
NVIDIA Jetson AGX Orin — 275 TOPS, Ampere GPU, 12-core ARM. Widely used for perception and AI inference in humanoid robots. Power: 15–60W depending on workload. Confirmed users include Agility Robotics (Digit) and Apptronik (Apollo). Many OEMs use Orin or equivalent but do not specify publicly.
Motion Controller
A second compute layer running whole-body control at 500–1000Hz is standard architecture in humanoid robotics. Common choices include AMD Kria SoMs (FPGA + ARM) or ARM-based SBCs. Specific OEM choices are generally not published.
Joint Controllers
STM32 series MCUs (STMicroelectronics) are the dominant choice for real-time motor control loops in robotics broadly. Field-oriented control at 20–40kHz is standard for BLDC joint actuators. EtherCAT or CAN bus for inter-node communication is widely used in the industry.
Tesla Optimus
Tesla has stated Optimus uses compute derived from their Full Self-Driving ASIC. Specific TOPS figures for the robot application have not been disclosed.
Unitree G1 EDU
Confirmed: NVIDIA Jetson Orin NX (100 TOPS) in the EDU Ultimate configuration, alongside an 8-core primary CPU. This is one of the few humanoids where the compute stack is publicly specified by the manufacturer.
Next Gen
NVIDIA has announced Thor for robotics applications. Specific SOM configurations and TOPS figures for humanoid deployments had not been finalized as of early 2026 — check NVIDIA's developer documentation for current specs.
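The TOPS figures above translate into a rough inference-rate budget. The model size and utilization below are hypothetical, and peak TOPS from marketing figures are rarely achieved in practice:

```python
# How often can a model of a given size run on a given accelerator?
# Peak TOPS is a marketing ceiling; sustained utilization of 30% is a
# hypothetical but commonly assumed working figure.

def max_inference_hz(peak_tops: float, model_gops_per_pass: float,
                     utilization: float = 0.3) -> float:
    """Theoretical inference rate for one forward pass of a given op count."""
    effective_ops_per_s = peak_tops * 1e12 * utilization
    return effective_ops_per_s / (model_gops_per_pass * 1e9)

# Jetson AGX Orin (275 TOPS) vs a hypothetical 500-GOP VLA forward pass:
print(f"{max_inference_hz(275, 500):.0f} Hz")  # 165 Hz at 30% utilization
```

This is why larger VLA models push OEMs toward next-generation compute: a 10x larger forward pass on the same module drops the budget to roughly 16 Hz, below useful visuomotor control rates.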
🔋
The average working humanoid is estimated to draw roughly 500 W–2,000 W depending on task intensity and locomotion speed — a commonly cited industry range, though no major OEM has published verified average power-draw figures for their commercial robots. At that range, even a substantial battery pack provides only 1–5 hours of runtime, and runtime remains the most commonly cited deployment limitation.

Several architectural patterns are consistent across the industry: high-voltage battery packs reduce resistive losses in the motor drive chain, hot-swappable packs (confirmed in Digit and Apollo) allow near-continuous operation without plug-in downtime, and fast wireless charging is confirmed in Figure 03 at 2 kW. Specific voltage levels, battery chemistry, and charge times are not publicly disclosed by most OEMs.
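The runtime arithmetic is simple division. The 2.0 kWh pack below is hypothetical, since no major OEM publishes verified pack capacity, but it reproduces the 1–5 hour range quoted above:

```python
# Runtime from pack capacity and average draw. Pack capacity is a
# hypothetical 2.0 kWh; draw figures are the estimate range from the text.

def runtime_hours(pack_kwh: float, avg_draw_w: float) -> float:
    """Hours of operation, ignoring derating and end-of-discharge cutoffs."""
    return pack_kwh * 1000.0 / avg_draw_w

for draw_w in (500, 1000, 2000):
    print(f"{draw_w:4d} W -> {runtime_hours(2.0, draw_w):.1f} h")
```

Real packs derate with temperature, age, and discharge cutoff limits, so published runtime figures generally sit below this idealized division.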
Est. 500 W–2,000 W Draw · Hot-Swap Confirmed (Digit, Apollo) · 2 kW Wireless Charge (Figure 03) · Unitree G1: 9000 mAh 54 V Confirmed · 2–5 hr Runtime Range
Agility Digit
Up to 4 hours runtime per Agility's official product page. Battery chemistry and kWh capacity are not publicly disclosed. Hot-swappable battery packs supported.
Apptronik Apollo
4 hours runtime per battery pack per official Apptronik spec sheet. Hot-swappable packs replaceable in under 5 minutes — enabling near-continuous operation. Battery chemistry and kWh not disclosed.
Boston Dynamics Atlas
Boston Dynamics has not published battery capacity, chemistry, or runtime figures for the electric Atlas. Atlas features self-swappable battery packs per official product page. Runtime data is not available from official sources.
Figure 03
~5 hours runtime per Figure AI. Charges wirelessly at 2 kW via inductive pads at the feet. Battery achieved UN38.3 safety certification. Battery kWh capacity not disclosed.
Unitree G1
~2 hours runtime per Unitree spec sheet. 13-series lithium-ion battery, 9000 mAh, 54V quick-release pack. One of the few robots with confirmed published battery specifications. 54V 5A charger.
Thermal Management
Most commercial humanoids use local air cooling for motors and electronics per available specifications. Liquid cooling is used in some high-performance joints — Boston Dynamics Atlas electric is understood to use active cooling, but specifics are not published.
🏗
The structural design of a humanoid robot is a continuous negotiation between weight, stiffness, cost, and manufacturability. Most OEMs do not publicly disclose their structural material choices in detail. What is confirmed: Boston Dynamics has publicly described 3D-printed titanium and aluminum components in the electric Atlas. Tesla confirms a 57 kg total weight using lightweight materials. Figure 03 uses soft textiles and foam for the outer surface. Unitree G1 achieves 35 kg with a compact frame.

The general engineering logic favoring carbon fiber composites for primary structure, aluminum for machined housings, and titanium for highest-load joints is well-established in aerospace and high-performance robotics — but specific material allocations per robot are proprietary and not independently verified for most platforms.
Atlas: Ti + Al 3D-Printed (Confirmed) · Figure 03: 61 kg (Confirmed) · G1: 35 kg Lightest (Confirmed) · CFRP / Al / Ti: General Engineering Logic
Tesla Optimus
57 kg. Confirmed by Tesla. Lightweight materials including aluminum and polymer components. Specific structural breakdown not publicly detailed.
Figure 03
61 kg per official spec. Figure describes soft textile and foam outer covering replacing hard mechanical shells from Figure 02. Internal structural materials and CFRP usage are not disclosed.
Agility Digit
~65 kg per specification sources. Digit's leg design features structural compliance in the lower leg derived from the Cassie research platform — the spring-like leg geometry provides passive shock absorption without a separate SEA element.
Boston Dynamics Atlas
89 kg per multiple confirmed sources. Boston Dynamics has publicly described use of titanium and aluminum 3D-printed structural components in the electric Atlas. Specific allocation of materials by joint is not published.
Unitree G1
35 kg per official spec — lightest commercially available humanoid. Smaller overall frame is the primary contributor to the weight advantage. Specific structural materials not published by Unitree.
Apptronik Apollo
72.5 kg (160 lbs) per official Apptronik spec. Specific structural materials not published.
Verified Specs: Weight & Runtime
From official manufacturer specs only. Joint torque figures are not published by most OEMs and are omitted here.
Unitree G1: 35 kg · 2 hr
Figure 03: 61 kg · 5 hr
Tesla Optimus: 57 kg · N/A
Digit: 65 kg · 4 hr
Apollo: 72.5 kg · 4 hr
Atlas: 89 kg · N/A
N/A = not publicly disclosed by manufacturer
Sensor Stack by Robot
Tesla Optimus
Vision-only · 5× cameras · No LiDAR · Neural depth estimation
Vision
Boston Dynamics Atlas
RGB-D cameras + IMU confirmed. Specific sensor models not published by Boston Dynamics.
Cameras+IMU
Figure 03
6-camera array + palm cameras + tactile fingertip sensors confirmed. Specific camera models not disclosed.
Multi-Camera
Agility Digit
LiDAR + 4× Intel RealSense depth cameras + MEMS IMU confirmed per official sources
LiDAR+Stereo
Unitree G1
3D LiDAR (head) + stereo RGB + depth cam
LiDAR+Stereo
Fourier GR-3
Stereo cameras + 31-point body pressure array (whole-body tactile)
Tactile+Vision
Key Component Suppliers
Harmonic Drive AG
SHD/SHF strain wave gearing — dominant arm/wrist joint supplier for most OEMs
⚠ Critical
Nabtesco
RV cycloidal gearboxes — high-shock leg and hip joints
⚠ Critical
NVIDIA
Jetson AGX Orin — de-facto standard AI compute module for humanoids
Dominant
Maxon Group
Precision brushless DC motors — finger and small joint actuation
Tier 1
Toray Industries
T700/T800 CFRP — commonly used for structural tubes, torso shells, and limb segments; per-robot usage is rarely disclosed
Tier 1
STMicroelectronics
STM32H7 MCUs — joint-level FOC controllers at 20–40kHz
Tier 1
Samsung SDI / LG ES
21700 NMC / NCMA cells — widely assumed cell format and chemistry; most OEMs do not disclose battery sourcing
Tier 1
Engineering Glossary
Quasi-Direct Drive (QDD)
Motor with low gear ratio (≤10:1). Naturally backdrivable — safe for human contact. Low peak torque but excellent compliance and impact resistance.
Harmonic / Strain Wave Drive
Gear mechanism using flex-spline deformation. Achieves 50–160:1 reduction in a compact, zero-backlash package. Fragile under shock loads.
Backdrivability
Ability to push the output shaft backward through the gearbox to the motor. High backdrivability = safe for human contact. Low = precise positioning but rigid on contact.
Series Elastic Actuator (SEA)
Actuator with a compliant spring element between motor and load. Absorbs impacts, enables torque sensing via spring compression. Reduces control bandwidth.
Field-Oriented Control (FOC)
Advanced brushless motor control algorithm that independently controls torque-producing and flux-producing current components. Maximizes efficiency and torque density.
EtherCAT
Industrial real-time Ethernet bus. Deterministic, with sub-millisecond cycle times and microsecond-level jitter. Standard for inter-joint communication in humanoid robots requiring synchronized whole-body control.
TOPS (Tera Operations/Second)
Measure of AI inference compute throughput. Jetson AGX Orin: 275 TOPS. NVIDIA Thor: announced at roughly 2,000 TOPS. Higher = larger or faster neural networks at real-time latency.
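The FOC entry above can be sketched concretely: the Clarke and Park transforms map balanced three-phase currents to a constant (d, q) pair that plain PI loops can regulate. Illustrative only; production implementations run this at the 20–40 kHz rates noted earlier, on a joint-level MCU.

```python
# Minimal Clarke/Park transform sketch at the heart of field-oriented control.
import math

def clarke(ia: float, ib: float) -> tuple:
    """Amplitude-invariant Clarke transform (assumes ia + ib + ic = 0)."""
    alpha = ia
    beta = (ia + 2.0 * ib) / math.sqrt(3.0)
    return alpha, beta

def park(alpha: float, beta: float, theta: float) -> tuple:
    """Rotate the stationary alpha-beta frame into the rotor d-q frame."""
    d = alpha * math.cos(theta) + beta * math.sin(theta)
    q = -alpha * math.sin(theta) + beta * math.cos(theta)
    return d, q

# Balanced sinusoidal phase currents give d ≈ I, q ≈ 0 at any rotor angle,
# turning a sinusoidal control problem into a DC regulation problem.
I = 10.0
for theta in (0.0, 1.0, 2.5):
    ia = I * math.cos(theta)
    ib = I * math.cos(theta - 2.0 * math.pi / 3.0)
    d, q = park(*clarke(ia, ib), theta)
    print(f"theta={theta:.1f}: d={d:.3f}, q={q:.3f}")
```

The controller then drives q-axis current (torque-producing) to the commanded value and d-axis current (flux-producing) typically to zero, which is the "independent control" the glossary entry refers to.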
Accuracy Disclaimer

The technical information on this page has been compiled from publicly available sources including manufacturer announcements, official product pages, and industry publications. While we make every effort to present only verifiable information and clearly label estimates as estimates, robotics hardware specifications change frequently, manufacturers do not always disclose full technical details, and some figures may be incomplete, outdated, or misattributed.

Do not use this page as a primary source for engineering decisions, procurement, safety assessments, or any mission-critical application. Always verify specifications directly with the manufacturer before making technical, commercial, or operational decisions based on this content. Androids.com accepts no liability for decisions made in reliance on the information presented here.

ANDROIDS.COM · Technical Specs Database · Updated March 2026 · 10 Robots Indexed
// ENGINEERING INTELLIGENCE — HUMANOID ROBOTICS

TECH SPECS
DEEP-DIVE
DATABASE

Complete engineering specifications for every major humanoid robot platform — actuators, compute, sensors, DoF, payload, and performance benchmarks. Built for engineers and technical decision-makers.

10 Robots Indexed · 32 Spec Fields · 8 Companies · 4 Available Now