Nvidia, homebuilding giant PulteGroup and San Francisco startup Span are quietly running a pilot that would turn newly built American houses into nodes of a distributed AI cloud. The partnership, first detailed by CNBC on May 5 and revisited in a May 9 follow-up on stalled hyperscale projects, attaches small, liquid-cooled GPU units to the exterior walls of new homes — a striking response to a U.S. data center build-out that is increasingly bogged down by community opposition and power-grid limits.
How XFRA turns a house into a data center
The hardware at the center of the experiment is Span's XFRA unit, which the company describes as a distributed data center solution announced alongside Nvidia in April. Each node ships with enterprise-grade, liquid-cooled Nvidia RTX PRO 6000 Blackwell Server Edition GPUs, paired with Span's smart electrical panel, a home backup battery and, in some installs, solar panels.
The Span panel pinpoints unused electrical capacity in a home and routes it to power the XFRA unit. Span owns the equipment, installs it for free and sells the resulting compute to hyperscalers and AI cloud providers. Homeowners pay a flat monthly fee of around $150 that covers both electricity and internet, a rate effectively subsidized by Span's compute revenue. As one Span executive told CNBC: "If you are one of the homes that is hosting an XFRA node, XFRA will then give you compensation for energy and internet usage."
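The homeowner-side math can be sketched in a few lines. Only the roughly $150 flat fee is reported; the baseline utility bills below are illustrative assumptions, not figures disclosed by Span or CNBC.

```python
# Sketch of the homeowner economics described above.
# Only FLAT_MONTHLY_FEE comes from the reporting; the baseline
# bills are hypothetical numbers chosen for illustration.

FLAT_MONTHLY_FEE = 150.0      # reported flat fee covering electricity + internet

typical_electric_bill = 140.0  # assumed average household electric bill
typical_internet_bill = 70.0   # assumed broadband bill

baseline_cost = typical_electric_bill + typical_internet_bill
monthly_savings = baseline_cost - FLAT_MONTHLY_FEE

print(f"Without XFRA: ${baseline_cost:.0f}/mo; with XFRA: ${FLAT_MONTHLY_FEE:.0f}/mo")
print(f"Implied homeowner savings: ${monthly_savings:.0f}/mo")
```

Under these assumed bills, the household comes out about $60 a month ahead; the actual gap would depend on local utility and broadband rates.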
PulteGroup, the third-largest U.S. homebuilder, is in early testing. According to CNBC, the system has been deployed in just one home so far, with PulteGroup assessing the unit's capabilities, reliability and economics before any broader rollout.
A workaround for stalled hyperscale builds
The pilot lands at a moment of acute strain for centralized AI infrastructure. CNBC's May 9 piece on data center construction documented how local opposition, water and power constraints, and zoning fights are delaying or canceling projects from Virginia to Arizona — even as forecasts call for trillions of dollars in AI data center spending through 2030.
Span's pitch is that distributing compute across thousands of homes sidesteps that bottleneck. The company says it can install 8,000 XFRA nodes roughly six times faster, and at roughly one-fifth the cost, of building a comparable 100-megawatt centralized data center. By tapping latent residential capacity rather than waiting on new substations, the model could deliver inference and lower-tier training workloads to AI customers months sooner than a greenfield facility.
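A quick back-of-the-envelope check on that comparison: dividing the 100-megawatt benchmark across 8,000 nodes implies the per-home draw Span is counting on. Note that Span has not published a per-node power figure; this is arithmetic on the two reported numbers, nothing more.

```python
# Implied per-node power draw from Span's own comparison:
# 8,000 XFRA nodes standing in for a 100 MW centralized facility.

TARGET_CAPACITY_MW = 100.0  # reported benchmark data center size
NODE_COUNT = 8_000          # reported node count in Span's comparison

kw_per_node = TARGET_CAPACITY_MW * 1_000 / NODE_COUNT
print(f"Implied draw per node: {kw_per_node:.1f} kW")  # prints 12.5 kW
```

That 12.5 kW is in the range of a high-powered Level 2 EV charger, which gives a sense of why Span's panel must first verify that a home has that much unused electrical headroom.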
Implications: Nvidia's stack reaches the front yard
For Nvidia, the deal is another expression of an aggressive strategy to seed demand for its silicon across the entire AI supply chain, from gigawatt campuses to, now, the side of a house. It also extends a pattern of the chipmaker taking equity stakes and supply commitments in adjacent infrastructure players: deals worth $2.1 billion with IREN and up to $3.2 billion with Corning were both struck this week, on top of more than $40 billion in equity bets reported in 2026.
For homebuilders, the model hints at a new revenue stream tied less to a buyer's mortgage and more to the AI economy underneath their roof. For homeowners, it promises lower bills in exchange for hosting industrial-grade compute on the exterior wall — a tradeoff that will likely surface fresh questions about noise, heat, cybersecurity and what happens to a household's AI tenant during a power outage.
If the pilot scales, the AI cloud may stop looking like a fortress in the desert and start looking like the wall outside your kitchen.