A clash report is only as good as the person writing it. Fifteen years on MEP projects means I know the difference between a real conflict, a builder's work opening, and a test result that doesn't matter. You get a report you can act on, not a 400-page PDF nobody opens.
Running the tests is the easy part. Any technician with Navisworks can do it. The hard part is knowing what the results mean and what to do with them — and that's where most reports fall apart.
Builder's work openings flagged as real clashes. Lighting showing up as a conflict with the ceiling it's meant to sit in. Hundreds of results that aren't actually problems, burying the few that are.
Knowing which discipline should move and why takes experience. Without it, every clash gets treated the same, and the report ends up useless.
Lighting usually wins over diffusers in a tight ceiling. Diffusers have placement tolerances that a test engine won't account for. These are the rules you pick up on the job, not from a clash parameter.
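To make the kind of rule described above concrete: here is a minimal, purely illustrative sketch of how a "which discipline moves" precedence could be written down if you were codifying triage notes. Every name here is hypothetical, and in practice these calls depend on the project and the person, not a lookup table.

```python
# Hypothetical sketch only: encoding a site-learned precedence rule.
# Higher number means the discipline usually holds its position in a
# tight ceiling void; the other one moves. Names are illustrative.
CEILING_PRECEDENCE = {
    "lighting": 2,  # fixed positions, usually wins
    "diffuser": 1,  # has placement tolerance, usually moves
}

def who_moves(discipline_a: str, discipline_b: str) -> str:
    """Return the discipline that should relocate, per the precedence table."""
    a = CEILING_PRECEDENCE.get(discipline_a, 0)
    b = CEILING_PRECEDENCE.get(discipline_b, 0)
    return discipline_a if a < b else discipline_b

print(who_moves("diffuser", "lighting"))  # diffuser
```

The point of the sketch is the asymmetry: a clash engine treats both objects equally, while the report has to say which one moves.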
Clashes that look fine on screen but would be a nightmare to actually build. If the person writing the report has never been on a site, this happens a lot.
Send me the model files. I run the tests, work through the results, and send back a report with the real issues flagged and commentary on each one. Most jobs turn around in 48–72 hours.
One pair tested — say mechanical vs structural. You get a report with prioritised findings, annotated images, and notes on each clash.
All MEP disciplines tested against structure and architecture. Priority matrix, commentary per clash, and carry-forward tracking so you can see what's new, active, and resolved between rounds.
Already got a report and not sure whether to trust it? I'll go through it, flag the false positives and anything missed, and tell you what I'd do differently.
For active projects with regular rounds. Three tiers — Essentials, Standard, Intensive — depending on how many projects and how often you need reports.
Ad-hoc clash review, model health checks, or VDC team support. Rush rates if you need something turned around in under 48 hours.
NWC or NWF via whichever cloud share you prefer. Let me know the scope and anything discipline-specific I should watch for.
I filter out the noise, check the real issues against how MEP actually routes on site, and figure out what should move where.
You get a prioritised report with images, locations, and notes on each clash. From round two onwards, I carry the history forward so you can see what's new, what's still open, and what's been resolved.
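The carry-forward idea mentioned above boils down to comparing clash IDs between rounds. Here's a minimal sketch of that comparison, assuming the test engine assigns each clash a stable ID; the function and sample IDs are illustrative, not the actual tooling used.

```python
# Hypothetical sketch of carry-forward tracking between clash rounds.
# Assumes each clash keeps a stable ID from one test run to the next.

def carry_forward(previous_ids: set, current_ids: set) -> dict:
    """Classify clashes as new, active, or resolved between two rounds."""
    return {
        "new": current_ids - previous_ids,       # appeared this round
        "active": current_ids & previous_ids,    # still present
        "resolved": previous_ids - current_ids,  # gone since last round
    }

round_1 = {"C-001", "C-002", "C-003"}
round_2 = {"C-002", "C-003", "C-004"}
status = carry_forward(round_1, round_2)
print(sorted(status["new"]))       # ['C-004']
print(sorted(status["resolved"]))  # ['C-001']
```

Three set operations are the whole mechanism; the value in the report is the commentary attached to each bucket, not the bookkeeping itself.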
Questions on edge cases or re-tests are included in retainer tiers. For one-off jobs, I'll still answer a quick clarification — I'd rather you got it right.
Hi, I'm Nathan. I've spent the last fifteen years in MEP — starting as a technician and working my way up to senior BIM coordinator. I've worked across healthcare, retail, education, hospitality, and manufacturing.
I started ClashReporting because I kept seeing the same pattern: clash reports flagging hundreds of issues but missing the ones that actually caused problems on site. Everything treated the same. Reports that created work instead of saving it.
So the idea behind ClashReporting is pretty simple — the person writing the report needs to know what they're looking at. Whether a clash is real. Which discipline should move. Whether the resolution is actually buildable. That's what I bring.
Fifteen minutes, no pitch deck. We'll go through your current setup, where it's causing headaches, and whether I can actually help. If I'm not the right fit, I'll say so.
Drop a few details about your project or current setup — or just say hi. I'll get back within a business day.
If it's quicker, book a 15-minute call or email me directly.
Thanks — I'll get back to you within one business day.