From Speculation to Specification - a compound AI system that grounds generative interior design images with real-world material data, transforming AI from visualization tool to decision-support partner.
Generative AI tools produce stunning interior renders in seconds. But a photorealistic “stone wall” could be faux wallpaper, MDF panels, ceramic tiles, or actual stone masonry - each with different cost, carbon footprint, and structural requirements. The most creative phase of design is completely disconnected from the most impactful data.
“I conceptually want lower-carbon designs but when iterating with Midjourney I have zero material data.”
Prompt engineering, ControlNet, RAG - constrain exploration before the designer even starts.
Fine-tuning, model modifications - require 10K+ examples, weeks of GPU time, and produce static knowledge.
Validate AFTER generation - preserves creative freedom, zero retraining, modular updates.
Visual Output + Domain Knowledge = Actionable Insights
IIGenAI operates across three planes: the Interactive Plane captures user intent, the Inference Plane runs the AI pipeline, and the Grounding Plane validates against real-world databases.
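The three-plane flow can be sketched in a few lines; everything here (class names, the stub image URL, the material list, and the CO₂e figures) is illustrative, not IIGenAI's actual API:

```python
from dataclasses import dataclass

@dataclass
class DesignIntent:          # Interactive Plane: captured user intent
    prompt: str

@dataclass
class GroundedRender:        # final output: image plus validated material data
    image_url: str
    materials: dict          # material name -> kg CO2e per kg (or None if unknown)

def inference_plane(intent: DesignIntent) -> tuple[str, list[str]]:
    """Generate an image and identify its materials (stubbed here)."""
    image_url = "render.png"          # stand-in for a gpt-image-1 render
    materials = ["concrete", "oak"]   # stand-in for VLM identification
    return image_url, materials

def grounding_plane(image_url: str, materials: list[str]) -> GroundedRender:
    """Validate identified materials against a real-world database (stubbed)."""
    ICE_CO2E = {"concrete": 0.103, "oak": 0.263}   # illustrative values only
    return GroundedRender(image_url, {m: ICE_CO2E.get(m) for m in materials})

render = grounding_plane(*inference_plane(DesignIntent("a calm stone-walled study")))
```

The planes only touch through plain data (intent in, image and material list through, grounded render out), which is what makes each one swappable.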
Sustainability consideration in design decisions. Baseline designers focused purely on aesthetics. With grounding, every participant explicitly addressed environmental impact using material-specific language like “rammed earth,” “mass timber,” and “low-carbon substitutes.”
Creativity Support Index with grounding (vs 41.7 baseline). Counterintuitively, grounding INCREASED creative exploration. Designers used constraints deliberately and expressed intent more clearly. Satisfaction dropped to 50% - reflecting productive tension, not tool failure.
Cost per full pipeline cycle. Image generation (~$0.015) + 5-pass VLM identification (~$0.008) + CoT retry (~$0.003) + grounding ($0) ≈ $0.026 total. Compare: fine-tuning costs thousands of dollars and weeks of GPU time. The architecture is model-agnostic - swap any component independently.
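The back-of-envelope sum of the per-cycle figures above (the document's own estimates, not live API pricing):

```python
# Per-cycle cost breakdown, using the estimates stated in the text.
costs = {
    "image_generation": 0.015,    # gpt-image-1 render
    "vlm_identification": 0.008,  # 5-pass material identification
    "cot_retry": 0.003,           # chain-of-thought retry pass
    "grounding": 0.0,             # database lookup is effectively free
}
total = sum(costs.values())
print(f"${total:.3f} per full pipeline cycle")   # → $0.026 per full pipeline cycle
```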
“Satisfaction dropped because designers went from optimizing one thing - aesthetics - to balancing aesthetics against environmental impact. That tension is uncomfortable, and that's exactly what we want. The 50% tells us grounding is working.”
“IIGenAI is spell-check for buildability. It does not stop you from imagining - it tells you what is real.”
Designer types a prompt, gets a beautiful interior in seconds. This is table stakes - Midjourney and DALL-E already do this. The experience feels familiar and fast.
Every material in that image is automatically identified with confidence scores and CO₂e data from real engineering databases. This is what no other tool does. This is the moat.
Designer sees concrete at high carbon, types “replace with rammed earth,” and the system edits the same image. The CO₂e drops. They learn, they explore, they come back.
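The substitution step amounts to swapping an entry in a bill of materials and recomputing embodied carbon. A minimal sketch, with made-up emission factors and masses for illustration:

```python
# Illustrative kg CO2e per kg figures - not authoritative ICE values.
CO2E_PER_KG = {"concrete": 0.103, "rammed earth": 0.024}

def swap_material(bill: dict, old: str, new: str) -> dict:
    """Replace `old` with `new` in a {material: mass_kg} bill of materials."""
    updated = dict(bill)
    updated[new] = updated.pop(old)
    return updated

def embodied_carbon(bill: dict) -> float:
    """Total kg CO2e for the bill of materials."""
    return sum(mass * CO2E_PER_KG[m] for m, mass in bill.items())

wall = {"concrete": 5000.0}                        # a 5 t feature wall
before = embodied_carbon(wall)
after = embodied_carbon(swap_material(wall, "concrete", "rammed earth"))
print(f"{before:.0f} -> {after:.0f} kg CO2e")      # the CO2e drops
```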
The grounding layer is a platform, not a feature. Swap the database and you get a new product.
The MCP server means any AI agent - including Claude - can plug into the grounding layer without rebuilding the pipeline.
Post-generative grounding is a responsible AI strategy applicable far beyond architecture. The pattern - Visual Output + Domain Knowledge = Actionable Insights - generalizes to any creative domain where AI generates compelling outputs that lack domain-specific verification. The key insight is timing: validate after generation, not before. This preserves creative freedom while adding real-world accountability.
Claude Code (Sonnet 4.6) · Next.js 14 · FastAPI · OpenAI gpt-image-1 · GPT-4.1-mini · o4-mini · Anthropic MCP SDK · ICE Database V4.1 · Material2050 API
Product design + manufacturing databases = tooling feasibility.
Marketing + brand guidelines = compliance scores.
Museums + provenance databases = verified cultural context.
E-commerce + sourcing APIs = supplier availability.
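The "swap the database" claim reduces to a small interface: the grounding layer depends only on a lookup protocol, and each vertical plugs in its own source. A sketch with illustrative class names and dummy values:

```python
from typing import Protocol

class GroundingSource(Protocol):
    """Anything that can ground an identified item in real-world data."""
    def lookup(self, item: str) -> dict: ...

class CarbonDB:                 # interiors: embodied carbon
    def lookup(self, item: str) -> dict:
        return {"metric": "co2e", "value": 0.103}     # illustrative

class SourcingAPI:              # e-commerce: supplier availability
    def lookup(self, item: str) -> dict:
        return {"metric": "in_stock", "value": True}  # illustrative

def ground(source: GroundingSource, items: list[str]) -> dict:
    """The grounding layer itself never changes; only the source does."""
    return {i: source.lookup(i) for i in items}

print(ground(CarbonDB(), ["concrete"]))
print(ground(SourcingAPI(), ["oak shelf"]))
```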
Miro-style infinite canvas with SAM segmentation for automatic material region detection and spatial annotations.
Interactive sliders for CO₂e, cost, durability, acoustics - designers define their own utility function.
Extend from materials to furniture and fixtures, linked to real supplier SKUs and product databases.
Sketch upload, voice commands, and 3D model integration for a synergistic design dialogue.
Export grounded material data to Rhino/Revit, bridging generative AI and parametric design workflows.
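The slider idea above (designers defining their own utility function) could be a simple weighted score; all weights and per-material scores here are made up for illustration:

```python
def utility(scores: dict, weights: dict) -> float:
    """Weighted average of 0-1 scores (higher is better), normalized."""
    total = sum(weights.values())
    return sum(weights[k] * scores[k] for k in weights) / total

# Slider positions: this designer weights CO2e heavily.
weights = {"co2e": 0.5, "cost": 0.2, "durability": 0.2, "acoustics": 0.1}

candidates = {
    "concrete":     {"co2e": 0.3, "cost": 0.8, "durability": 0.9, "acoustics": 0.4},
    "rammed earth": {"co2e": 0.9, "cost": 0.6, "durability": 0.7, "acoustics": 0.8},
}

ranked = sorted(candidates, key=lambda m: utility(candidates[m], weights), reverse=True)
print(ranked[0])   # → rammed earth
```

Moving a slider just changes `weights`, re-ranking the same candidates without touching the pipeline.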