Managers
Align business and IT through shared understanding. Eliminate costly misalignment with validated business terminology and clear communication.
The business case for Fact Oriented Modeling
"The biggest risk in IT projects isn't technology failure—it's misunderstanding. Different definitions, unclear processes, and hidden assumptions cause costly mistakes before a project even begins."
- Reduce Rework: Catch misunderstandings early, when they're cheap to fix. Validated models ensure IT builds what the business actually needs.
- One Vocabulary: Publish a business glossary that everyone uses. No more "what did you mean by customer?"
- Faster Delivery: Clear specifications mean fewer iterations. Development teams can build with confidence.
- Knowledge Retention: Business knowledge captured in models, not just in people's heads. Protect against turnover.
-
From Ambiguity to Precision
The Journey from 'Customer → Order' to a Formal Semantic Layer

Consider five ways to represent the same business reality:

1. Customer → Order
2. Customer has Order
3. Customer buys Order
4. The person places order.
5. "The person or organisation, identified in our system as customer number 123, placed an order for our products or services identified by order number 20260991."

The final expression isn't just "better documentation": it is the foundation of a semantic layer, a formal, structured representation of business knowledge that sits between your SMEs' expertise and your data systems. This semantic layer is the result of applying FCO-IM (Fully Communication-Oriented Information Modeling), not just an improvement in notation.
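The five expressions above differ only in how much of the communication they preserve. A minimal sketch (not part of any FCO-IM tool; all names are illustrative) showing the same fact instance, customer 123 and order 20260991, rendered at each level of precision:

```python
# Illustrative only: one fact instance, five verbalization templates of
# increasing precision. The fifth is the fully explicit fact expression
# a semantic layer would preserve.

TEMPLATES = [
    "Customer {cust} -> Order {order}",
    "Customer {cust} has Order {order}",
    "Customer {cust} buys Order {order}",
    "The person places order {order}.",
    ("The person or organisation, identified in our system as customer "
     "number {cust}, placed an order for our products or services "
     "identified by order number {order}."),
]

def verbalize(cust: int, order: int) -> list[str]:
    """Render one fact instance with every template."""
    return [t.format(cust=cust, order=order) for t in TEMPLATES]

for sentence in verbalize(123, 20260991):
    print(sentence)
```

Only the last rendering survives a change of context: it still carries its full meaning when read outside the system it came from.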
-
The Agreement That Erases the Meaning
Why harmonized definitions are the problem dressed as the solution, and why agreeing to disagree is the better answer.

The Definition Was Agreed. The Problem Got Worse.

There is a moment in almost every enterprise data governance initiative that feels like progress but isn't. The working group has been convened. The stakeholders from HR, Finance, Legal, and Operations are in the room. The term on the table is "Employee." Everyone agrees that the inconsistency is a problem. Everyone agrees that a shared definition is the solution.

After several sessions, a definition is ratified. It is entered into the business glossary. The initiative is declared a success. Six months later, the systems are still inconsistent. The reports still don't reconcile. The data team is still getting contradictory answers to the same query depending on which source they use.

What went wrong? Nothing went wrong. The process worked exactly as designed. That is the problem.
-
The Data That Doesn't Know What It Means
How entity modeling quietly discarded the meaning AI is now desperately trying to recover.

The Billion-Dollar Semantic Recovery Project

There is a quiet crisis at the heart of enterprise AI adoption, and almost nobody is naming it correctly. Organizations are investing heavily in semantic layers, knowledge graphs, ontologies, and AI-powered data catalogues. The pitch is always some variation of the same idea: we will make our data understandable to machines, to analysts, and to the business. We will surface meaning from our data assets.

What is rarely said out loud is the uncomfortable premise underneath all of that investment: the data doesn't already know what it means. That is not a technology problem. It is not something a better LLM will fix, or a richer graph schema, or a more expressive ontology language. It is a modeling problem: specifically, a problem that was baked in decades ago when organizations chose how to represent information, and decided what to keep and what to throw away.

This article is about what was thrown away, why it is so hard to get back, and why there is a family of modeling approaches that never threw it away in the first place.
-
Bridging the Gap
An interview with Marco Wobben, information modeling expert and creator of CaseTalk.

The Lost Art of Understanding Data

In an era where organizations are drowning in data yet starving for meaning, there's a methodology developed decades ago that addresses a problem more relevant today than ever: how do we ensure that the people building IT systems truly understand what the business needs?

Marco Wobben has been working on fact-based modeling since the early 2000s, when a university professor handed him the source code of a modeling tool and asked him to maintain it. "I had to learn it from the inside out," he explains. "And now, with a lot of professors retired and the young people not having caught on yet, I'm kind of being considered the expert."
-
Governance with DMBOK
The DAMA organization has created an enormous body of knowledge called the Data Management Body of Knowledge (DMBOK). Version 2 introduced Fact Oriented Modeling for the first time; both ORM and FCO-IM are concisely described. The DMBOK Wheel illustrates how central governance is to the broad field of data management.
-
Advantages of Fact Based Modeling
In the realm of data modeling, the FCO-IM (Fully Communication Oriented Information Modeling) method has emerged as a powerful successor to the well-established NIAM (Natural language Information Analysis Method). Both approaches rely on natural language to construct data models. FCO-IM, however, takes this methodology a step further, offering numerous advantages that make it an ideal choice for creating elegant and highly adaptable models, while ensuring ease of validation by both business experts and users.
-
Why Communication Matters
Fully Communication Oriented Information Modeling (FCO-IM) emphasizes modeling the communication aspect of data. Employees constantly exchange facts through communication, such as "Invoice 1238 has an invoice date 1-1-1999". FCO-IM models such communication and facts in natural language. This article highlights the importance of communication in developing a fact-based data model.
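The idea of modeling communication rather than just data can be sketched in a few lines: store each fact type together with its sentence template, so that every stored tuple can be read back as the sentence an employee would actually say. This is an illustrative sketch, not the API of CaseTalk or any FCO-IM tool; the `FactType` class and placeholder syntax are invented for this example.

```python
# Illustrative sketch: keeping the fact-type template alongside the data,
# so a tuple can always be verbalized as a natural-language fact.
from dataclasses import dataclass

@dataclass(frozen=True)
class FactType:
    # Template with numbered placeholders, e.g. "<1>", "<2>" (invented syntax).
    template: str

    def express(self, *values: str) -> str:
        """Fill the placeholders to produce a fact expression."""
        text = self.template
        for i, value in enumerate(values, start=1):
            text = text.replace(f"<{i}>", value)
        return text

invoice_date = FactType("Invoice <1> has an invoice date <2>.")
print(invoice_date.express("1238", "1-1-1999"))
# -> Invoice 1238 has an invoice date 1-1-1999.
```

Because the template travels with the data, the model never loses the sentence that gave the tuple its meaning, which is exactly the communication FCO-IM sets out to capture.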
