Projects
Activated assets: how Fiera Milano made warehouse data conversational
Fiera Milano / Nolostand
The breakthrough wasn't merely technical; it was operational. We moved from static extraction to dynamic conversation. By allowing the business to query its own assets without technical mediation, we transformed a legacy database into a responsive team member.
The co-creation initiative between Fiera Milano's Data & AI Office, its operational arm Nolostand, and Elevate evolved a rigid inventory-tracking process into an agentic ecosystem. This systemic shift redefined how stock levels are monitored, forecasted, and utilized. The result is immediate transparency within a high-volume logistics environment, providing tools that bridge the gap between physical assets and digital intelligence.
Key performance indicators - KPIs
Objective: Eliminate the latency between business questions and data-driven answers.
- Democratized access for over 95% of operations, removing dependency on single technical gatekeepers.
- Real-time forecasting capability for stock utilization and procurement.
- 90% reduction in data retrieval time (from hours of manual extraction to seconds).
- Near-zero technical training: onboarding emphasizes natural language and intent over complex syntax.
A physical mandate, a digital strategy - scenario
Fiera Milano operates one of the largest exhibition districts in the world. Its subsidiary, Nolostand, focuses on set-up services for exhibitions: design, construction, and fitting of thousands of square meters of exhibition space. This operational arm manages a massive inventory - roughly 900,000 records tracking everything from structural components to furniture. The mandate is dual: Fiera Milano provides the strategic digital infrastructure, while Nolostand ensures the physical readiness of events ranging from the Milan Furniture Fair to the upcoming Winter Olympics.
We manage a double dimension: the physical magnitude of the fairgrounds and the digital architecture that must optimize it. Our goal was to enable the operational teams to access their own data without friction.
The friction of legacy - challenge
The challenge was not a lack of data, but a lack of accessibility. The inventory information was trapped within a legacy ERP system and extracted via massive, cumbersome Excel files.
The process was defined by latency. An operator would download a zip file, wait for it to open, and then spend hours applying formulas. It was a race against the sheer weight of the file.
The existing workflow contained specific systemic vulnerabilities:
- Manual dependency and bottlenecks: The analysis relied heavily on specific individuals who understood the raw data structure. If a strategic question arose ('How many chairs of this type do we have for December?'), it triggered a manual, time-intensive extraction process.
- Reactive rather than predictive: Because the effort to retrieve current status was so high, the system was used primarily to check past records rather than to forecast future needs or optimize stock rotation.
- Semantic opacity: The raw data lacked business context. A column header in the database did not necessarily match the natural language a business user would use to describe an item, creating a translation gap that only technical staff could bridge.
The objective - expected outcome
The goal was to transition from this high-friction, manual process to an intelligent layer that could:
- Democratize access by allowing non-technical users to query the database using natural language.
- Accelerate decision-making by automating the retrieval and visualization of stock trends.
- Enhance predictive capability by moving from static reporting to AI-assisted forecasting.
- Preserve legacy infrastructure by building an intelligent layer on top of existing systems, avoiding a costly 'rip and replace' scenario. Retaining the ERP was a strategic application of the 'maximum value, minimal disruption' principle. AI acts as a 'resilience layer,' extending the lifespan of legacy investments and allowing the company to self-fund next-generation systems with today’s savings.
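The "intelligent layer on top of existing systems" can be pictured as a thin query wrapper over the legacy inventory tables. The sketch below is illustrative only: the schema, item names, and quantities are invented, and an in-memory SQLite table stands in for the real ERP and Data Lake.

```python
import sqlite3

def setup_demo_db() -> sqlite3.Connection:
    """Build a stand-in for the legacy inventory store (hypothetical schema)."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE stock (item TEXT, category TEXT, quantity INT)")
    conn.executemany(
        "INSERT INTO stock VALUES (?, ?, ?)",
        [("folding chair", "furniture", 1200),
         ("wood panel", "structural", 5400)],
    )
    return conn

def answer(conn: sqlite3.Connection, item: str) -> int:
    """Turn a business question ('How many X do we have?') into a query.

    In the real system an LLM interprets the intent; here a direct lookup
    stands in for that step.
    """
    row = conn.execute(
        "SELECT SUM(quantity) FROM stock WHERE item = ?", (item,)
    ).fetchone()
    return row[0] or 0

conn = setup_demo_db()
print(answer(conn, "folding chair"))  # 1200
```

The point of the pattern is that the legacy store is only read, never modified, so the ERP keeps running untouched underneath the new layer.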
The Agentic Data Layer - the solution
The project was executed through a 'laboratory' approach - a rapid, agile Proof of Concept (PoC) delivered in under three months. Working with the Google Cloud ecosystem, the team developed an agentic AI platform tailored to the specific lexicon of exhibition logistics.
We didn't aim for a monolithic enterprise overhaul. We aimed for a pragmatic, agent-based architecture that could speak the language of the warehouse.
The solution is an intelligent system structured around four specific agents:
- Forecasting Agent: Analyzes historical data to predict stock requirements based on square footage sold for upcoming events.
- Substitution Agent: Identifies available alternative products when primary stock is depleted, ensuring operational continuity.
- Stock Control Agent: Provides instant visibility into current inventory levels across different categories.
- Flexible Reporting Agent: Allows users to ask complex questions ('Show me the consumption trend of wood panels in Q4') and receive immediate visual graphs and data tables.
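The four-agent split can be sketched as a simple intent router that dispatches each question to the agent best placed to answer it. The keyword rules and agent identifiers below are illustrative assumptions, not the production implementation, which would use an LLM for intent classification.

```python
# Hypothetical intent router over the four agents described above.
# Keywords and agent names are invented for illustration.

def route(question: str) -> str:
    """Pick the agent best suited to answer a natural-language question."""
    q = question.lower()
    if any(k in q for k in ("forecast", "predict", "requirement")):
        return "forecasting_agent"
    if any(k in q for k in ("alternative", "substitute", "replace")):
        return "substitution_agent"
    if any(k in q for k in ("trend", "report", "graph", "consumption")):
        return "reporting_agent"
    # Default: current inventory questions ("How many chairs do we have?")
    return "stock_control_agent"

print(route("Show me the consumption trend of wood panels in Q4"))
# reporting_agent
print(route("How many chairs of this type do we have for December?"))
# stock_control_agent
```

A router like this keeps each agent narrow and testable, which is what makes the architecture easy to extend with new agents later.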
Efficiency, autonomy, and semantic alignment - the outcome
Delivered in 3 months by a joint team of 5 from Fiera Milano, Nolostand, and Elevate, the project drove a shift from manual data wrangling to strategic data interaction.

From extraction to interaction: The 'Excel bottleneck' has been removed. Users no longer wait for files to open; they ask questions. The system interprets the intent, retrieves the data from the Data Lake, and presents the answer. This has freed up the R&D and operational teams to focus on planning rather than data cleaning.

Semantic data restructuring: A crucial, invisible outcome was the semantic mapping of the warehouse. The team didn't just dump data into an AI; they tagged and structured the metadata so the AI could understand the difference between a 'chair' and a 'stool' in the context of a fair. This effectively captured the tacit knowledge of senior employees and embedded it into the system.

Enhanced governance and scalability: The system provides a blueprint for broader application. By proving that AI can bridge the gap between a legacy ERP and a modern user interface, Fiera Milano has established a replicable model. The architecture is designed to scale, with plans to expand into a broader 'Data Catalog' agent that will democratize data governance across the entire organization.
Tech Stack & Implementation
The solution was built on the Google Cloud ecosystem, layering the agentic AI platform and Data Lake on top of the existing legacy ERP.
Lessons learned: a model for pragmatic AI
The project established an operational model for introducing GenAI in traditional sectors.
The technology is fast, but the preparation must be deliberate. We learned that AI is only as intelligent as the semantic structure you provide it. The value wasn't in the algorithm alone, but in teaching the algorithm our business context.
- Co-designing for semantic precision: The most critical phase was not coding, but defining the 'prompt catalog' and data tags with the business users. This ensured that when a user asked for 'wood,' the system retrieved the correct material codes. Collaboration was the method for ensuring accuracy.
- Agility as a risk mitigation strategy: By choosing a 3-month PoC timeline, the organization minimized risk and maximized focus. This 'laser' approach allowed them to validate the technology on a specific use case (warehouse) before considering wider adoption, turning skepticism into advocacy.
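The 'prompt catalog' idea - a business term like 'wood' resolving to the correct material codes - might look like the following in its simplest form. The codes, terms, and function names here are hypothetical stand-ins; the real catalog was co-designed with warehouse users and lives inside the agentic platform.

```python
# Hypothetical 'prompt catalog': business terms mapped to ERP material codes.
# All codes below are invented for illustration.
TERM_CATALOG = {
    "wood": ["MAT-0041", "MAT-0042"],  # raw panels, plywood
    "chair": ["FUR-1100"],             # seating with back-rest
    "stool": ["FUR-1105"],             # seating without back-rest
}

def resolve(term: str) -> list[str]:
    """Expand a natural-language term into the ERP codes it covers."""
    return TERM_CATALOG.get(term.lower().strip(), [])

print(resolve("Wood"))  # ['MAT-0041', 'MAT-0042']
print(resolve("beam"))  # [] -> unmapped term, flagged for catalog review
```

Returning an empty list for unmapped terms, rather than guessing, is what keeps the translation layer trustworthy: gaps surface as explicit catalog work for the business users, not as silently wrong answers.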
Key insights for your organization
The experience with Fiera Milano and Nolostand provides a replicable roadmap:
- Democratization requires translation: To make data accessible to everyone, you must build a layer that translates technical database logic into business language.
- Don't wait for perfect infrastructure: You don't need to replace your legacy ERP to get modern insights. An intelligent agentic layer can act as the bridge, unlocking value from old systems immediately.
- Start with high-friction tasks: Identify where your smartest people are doing the most boring work (like opening zip files). That is where AI delivers immediate ROI.
- Structure is a prerequisite for intelligence: GenAI is not magic. It requires a well-governed, semantically tagged data foundation to function reliably in a corporate environment.
- Evolution is iterative: Start with a 'laboratory' mindset. Prove the value in one vertical (inventory), then scale the logic to others (finance, HR, governance).