BIM Model Data Validation

Validating the quality of model data is as important as it is neglected. Unfortunately, I still don’t hear about many projects introducing it, and that is truly a shame. Meanwhile, hearing about the advantages of BIM for the umpteenth time, you have definitely stumbled upon the phrase “early clash detection”.

Don’t get me wrong: it is important.

However, design digitalization doesn’t stop where the duct clashes with the wall. It can go much deeper: into what both the duct and the wall are made of, their name and type, and who should mount them later on the building site. Having such data (and much more) correct gives us tremendous potential for regaining control over the chaotic and unpredictable construction process.

Yet to use data, it has to be correct. That is where data validation comes in, and in this article I will show you its potential.

Table of Contents

The following article is part of the Data Management in BIM series. If this is the first post you have come across, I encourage you to read the introduction, "What is data? Introduction to Data Management in BIM". There I describe the basic concepts, and at the bottom you will find a table of contents. All that to make sure you get the most out of the series. Have a good read.

Data Validation

Referring to the data categorization I made in the previous article of the series, let me go through each category and define some of the rules and checks worth performing.

As tends to happen with such a practical subject, the list is not exhaustive and is, moreover, heavily project-dependent. I recommend using this article as an idea generator and adjusting it to your project requirements.

Object Basic data

This data exists in every BIM model, alongside all the other properties created during design. Generally, we check whether the data follows project standards: the naming convention and the usage of additional (non-required) properties.

Examples of checks:

  • Objects have a correct category
  • Type, Name, and Classification follow the defined text structure (syntax)
  • If Mark/Description is used on a project, check that the expression follows the rules
  • Object types must be from the agreed list

Classification data validation ruleset: a rule checking whether the classification syntax is correct.
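Such a syntax check is usually set up in a rule-based checker, but the logic itself is simple. Below is a minimal Python sketch; the property names and the classification pattern (two digits, a dot, three digits) are hypothetical stand-ins for whatever convention your project defines:

```python
import re

# Hypothetical project convention: a classification code such as "23.210"
# (two-digit chapter, dot, three-digit item).
CLASSIFICATION_PATTERN = re.compile(r"^\d{2}\.\d{3}$")

def check_classification(objects):
    """Return (name, value) pairs for objects whose 'Classification' breaks the syntax rule."""
    failures = []
    for obj in objects:
        value = obj.get("Classification", "")
        if not CLASSIFICATION_PATTERN.match(value):
            failures.append((obj.get("Name", "<unnamed>"), value))
    return failures

sample = [
    {"Name": "Wall-01", "Classification": "23.210"},
    {"Name": "Wall-02", "Classification": "23-210"},  # wrong separator
    {"Name": "Duct-01", "Classification": ""},        # missing value
]
print(check_classification(sample))  # [('Wall-02', '23-210'), ('Duct-01', '')]
```

The same pattern-matching idea applies to Type and Name syntax checks; only the regular expression changes.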

Design data

These checks tend to be technical and are best performed within so-called disciplinary coordination: the BIM responsible for each discipline checks the model before sending it to multidisciplinary coordination. A structural engineer checks the structural model, an electrical engineer the electrical model, and so on.

It is important that the discipline leader or the discipline’s BIM responsible undertakes this task, since the BIM Coordinator usually doesn’t have in-depth technical knowledge of that given discipline.

Examples of checks:

  • Correct value of properties “Is External” and “Load Bearing”
  • Property values must be from an agreed list (e.g. fire rating class, steel class)
  • Check if each duct in the model has information about its length and diameter/width
  • Gross/Net area ratio analysis (building efficiency)
  • Check that only listed beam and column profiles are used in the model
  • Fire zones need to have the correct types of walls, doors and windows

Example of a rule set checking that walls only have values from the agreed fire rating classes.
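The “values from an agreed list” check is one of the most reusable ones. A minimal Python sketch, where the property name and the list of fire rating classes are hypothetical examples of what a project might agree on:

```python
# Hypothetical agreed list of fire rating classes for walls.
ALLOWED_FIRE_RATINGS = {"EI30", "EI60", "EI90", "EI120"}

def check_fire_rating(walls):
    """Return ids of walls whose 'FireRating' property is missing or not on the agreed list."""
    return [w["id"] for w in walls if w.get("FireRating") not in ALLOWED_FIRE_RATINGS]

walls = [
    {"id": "W1", "FireRating": "EI60"},
    {"id": "W2", "FireRating": "EI45"},  # not an agreed class
    {"id": "W3"},                        # property missing entirely
]
print(check_fire_rating(walls))  # ['W2', 'W3']
```

Swap the set for steel classes or profile lists and the same check covers several of the bullets above.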

Project Management data

The data checks in this category have to be the most extensive. The reason is that these are additional, user-defined properties that have to be filled out manually by the designer (or, preferably, by a script with defined conditions). To my knowledge, no software today can create the correct coding automatically based on an object’s location, category or design progress. Please do prove me wrong.

If you are now wondering why these weird, problematic properties are so important to check, I refer you to our series about VDC and BIM on-site. These parameters are the practical deployment of VDC and Lean Construction principles. It all boils down to quality control of the model and using it on-site. In very, very short:

  • Control Area is for the scheduler to plan work (Takt planning).
  • The MMI/LOD parameter is for the BIM Coordinator to know which objects are ready to check.
  • Responsibility is for contractors to know which objects they are responsible for mounting on-site.

Examples of checks:

  • Control Area properties have the correct syntax
  • Objects within a given control area have the corresponding property value
  • The MMI/LOD property value is from an agreed list
  • MMI/LOD in a given control area has a given value (e.g. 350)
  • Object types and responsibility values match one another (MEP objects can only carry the MEP contractor’s number/name)
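The last check, matching object types against responsibility values, is a cross-property rule rather than a simple list lookup. A Python sketch of the idea, with a hypothetical mapping of categories to contractor codes:

```python
# Hypothetical mapping: which contractor codes are allowed per object category.
RESPONSIBILITY_BY_CATEGORY = {
    "Ducts": {"MEP-01"},
    "Pipes": {"MEP-01"},
    "Walls": {"STR-02", "ARK-03"},
}

def check_responsibility(objects):
    """Return ids of objects whose responsibility value is not allowed for their category."""
    failures = []
    for obj in objects:
        allowed = RESPONSIBILITY_BY_CATEGORY.get(obj["category"], set())
        if obj.get("responsibility") not in allowed:
            failures.append(obj["id"])
    return failures

objects = [
    {"id": "D1", "category": "Ducts", "responsibility": "MEP-01"},
    {"id": "D2", "category": "Ducts", "responsibility": "STR-02"},  # wrong contractor
]
print(check_responsibility(objects))  # ['D2']
```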

In the end, we have to create a rule for each property required in the EIR (Solibri also lacks well-defined templates here). In the video below, I show how to create a custom rule that checks whether every object has the correct Control Area property:

Checking external data

The principles of maintaining high-quality project data that I laid down here are the same when it comes to validating external data connected to our model.

I have chosen the two examples below to illustrate two different situations and methods of checking data quality:

  1. Data inspection checks – what if we are only a middleman for data?
  2. Data entry checks – how can software ease our work?

Data inspection checks

This method relies on our good old friend Excel. It is necessary when we receive data from a third party and use it somewhere else afterwards. Data inspection is the first step in the data cleaning process (described here); therefore, I always perform it when given any data set.

Since the data is filled out manually in an Excel spreadsheet with no rules and no connection to other sources, every file has to be quality-checked before being imported anywhere else. I create a simple pivot table that gives a good overview of the set. Afterwards, I perform the cleaning process in the spreadsheet before importing it into another system.
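The same pivot-style overview can be scripted when the files pile up. A sketch using pandas; the column names are hypothetical, and in practice the frame would come from `pd.read_excel` instead of being built inline:

```python
import pandas as pd

# Normally: df = pd.read_excel("suppliers.xlsx"); built inline here for illustration.
df = pd.DataFrame({
    "Supplier": ["Acme", "Acme", "BuildCo", None],
    "OrgNumber": ["123456789", "123456789", "98765432", "555555555"],
})

# A quick grouped summary exposes duplicates, blanks and odd values at a glance,
# much like a pivot table in Excel.
overview = df.groupby("Supplier", dropna=False)["OrgNumber"].agg(["count", "nunique"])
print(overview)

missing_suppliers = df["Supplier"].isna().sum()
print(missing_suppliers)  # 1
```

Anything surprising in the counts (duplicates, a supplier with two different org numbers, blank rows) becomes the to-do list for the cleaning step.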

Example

On the project we collect and absorb huge quantities of spreadsheets received from different parties. Whether I like it or not, this is still the way the industry works with data.

We gather metadata for Suppliers and Manufacturers connected to each type of element mounted on site, in addition to MOM (Maintenance, Operation & Management) documents delivered in PDF format. The process is as follows:

  1. Data comes as a spreadsheet filled out by a contractor.
  2. I perform data inspection and cleaning within the Excel spreadsheet and correct findings on the spot.
  3. I import the spreadsheet into the database, which rejects cells with incorrect organisation numbers.
  4. An additional log check after import catches and corrects anything that failed.
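The organisation-number rejection in step 3 is a good example of a rule that can also run in the inspection step, before import. Norwegian organisation numbers are nine digits with a modulus-11 check digit; a sketch of that check in Python (the function name is mine, and the valid example below is a public registry number):

```python
# Weights for the modulus-11 check of a Norwegian organisation number.
WEIGHTS = [3, 2, 7, 6, 5, 4, 3, 2]

def valid_org_number(value: str) -> bool:
    """Validate a nine-digit Norwegian organisation number via its mod-11 check digit."""
    if not (value.isdigit() and len(value) == 9):
        return False
    total = sum(int(d) * w for d, w in zip(value[:8], WEIGHTS))
    check = 11 - (total % 11)
    if check == 11:
        check = 0
    # A remainder yielding 10 means the number can never be valid.
    return check != 10 and check == int(value[8])

print(valid_org_number("923609016"))  # True  (a real, public org number)
print(valid_org_number("923609015"))  # False (wrong check digit)
```

Running such a check over the spreadsheet column before import means the database log in step 4 stays short.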

I reckon this is the most common way of checking data quality in today’s construction projects. If you are wondering how to do it, I recommend this entry.

Data entry checks

Generally, it boils down to limiting the values that can be entered at each data point in the database. Usually, this requires separate database software (or a really well-prepared Excel spreadsheet). This method is indeed better for the overall level of data quality; however, it requires working knowledge of additional software and a budget for the license.

This software-driven, automatic data check is very convenient for data managers, since the software does a huge chunk of the quality enforcement by restricting possible erroneous entries. Of course, the data still requires further control, because the remaining errors are potentially more difficult to uncover, and their presence can have a more significant influence on the final result.
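The principle of restricting entries at the point of input can be sketched without any particular database product. In this hypothetical Python example, the tender codes and the `set_tender` helper are illustrative, not any real tool's API:

```python
from enum import Enum

class TenderPackage(Enum):
    """Hypothetical agreed list of tender packages on a project."""
    MEDICAL_IMAGING = "T-101"
    LAB_EQUIPMENT = "T-102"

def set_tender(item: dict, tender_code: str) -> None:
    """Reject any value that is not on the predefined list at entry time."""
    allowed = {t.value for t in TenderPackage}
    if tender_code not in allowed:
        raise ValueError(f"Unknown tender code: {tender_code!r}")
    item["tender"] = tender_code

item = {}
set_tender(item, "T-101")      # accepted
# set_tender(item, "T-999")    # would raise ValueError: the bad value never enters the data
```

The design choice here is that validation happens on write, not on a later audit, which is exactly what makes this method more convenient for the data manager.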

Example

In the New SUS project (about the project in Norwegian), we use dedicated software called dRofus to connect medical equipment data with the project design for tendering.

To connect objects from the database with the design software, we use the database item ID as a unique identifier and send it to Revit parameters. In the database, the equipment planning team specifies both the requirements for each piece of medical equipment and the tendering data. Some of the software’s data entry rules include:

  • Only predefined item types (equipment) can be used
  • Only a predefined tender name and number can be assigned to an item
  • Cost must be given in a currency, together with the date of its last update
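When an ID links the database and the model like this, the link itself is worth validating: every object in the model should carry an ID that actually exists in the database. A generic Python sketch of that round-trip check (the field names are hypothetical, not dRofus's or Revit's actual API):

```python
def check_item_ids(revit_objects, db_item_ids):
    """Return element ids of objects whose database-ID parameter is missing or unknown."""
    return [
        o["element_id"]
        for o in revit_objects
        if o.get("db_item_id") not in db_item_ids
    ]

revit_objects = [
    {"element_id": 1001, "db_item_id": "A1"},
    {"element_id": 1002, "db_item_id": "ZZ"},  # ID not in the database
    {"element_id": 1003},                      # parameter never filled in
]
db_item_ids = {"A1", "B2"}
print(check_item_ids(revit_objects, db_item_ids))  # [1002, 1003]
```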

I have presented how such software works in one of my previous posts. At the bottom of that link, you will also find shortcuts to the full series about designing with the use of an external database.

Summary

As you might have noted, data validation is not only a process that happens within a model and in quality assurance software. It goes well beyond that. The more digitalised our projects are, the more important it is to ensure the delivered data is correct.

Checking data quality within the model is actually the easiest and fastest step in the validation process. The challenge lies with everything that goes beyond the model but is still connected with the data within the design.

I know that many (if not most) construction projects are not there yet, and much of the data is still gathered and exchanged on paper. But digital is where we are heading, and sooner or later it will become the standard.
