
August 23, 2011



The need to split up a single-discipline model for the sake of performance would be significantly reduced if there were no penalties for file size/complexity. That said, we would still need to display only parts of the model for ease of task work. We would probably still separate out other disciplines if they are being developed by external consultants. The only way around this - again, if there were no performance penalties - would be to also include a more robust permissions system; for example, to prevent a mechanical engineer from deleting the architect's curtain wall.

I'm typically all for getting everyone at the table, but I would be hesitant to have everyone in the same model all the time. Here are some points that come to mind:
- How would we know what was finished and ready for coordination and what was still in progress? Everyone pointing out every error sounds like a lot of time would be wasted thinking about things that are in progress.
- It helps to be able to split projects and project teams into manageable pieces. You don't want everyone to be able to edit everything.
- What would be the benefit of having everybody working in the same model? I see the benefits of being able to combine models for coordination, but not necessarily for workflows.
- A very highly sophisticated method of controlling access and ownership would be needed. Also, sophisticated communication systems would need to be in place to manage interactions between different team members in the model.

This is a good discussion, lots of interesting ideas come out of this...

At our multidisciplinary practice we split the project into the respective disciplines.

The danger with making any system sophisticated is that... it's sophisticated!

Model management should be simple. It costs too much to train people to a high technical competency!

The big issue that we have with large models is that, by virtue of the software, you can make dramatic changes to this data with a few clicks of a button. This power in the wrong hands can cause catastrophic damage to projects.

My suggestion would be to have better control over linked elements and more secure "locking" systems. E.g., we haven't found a way to lock the shared coordinate system yet! (We have tried every trick in the book to secure their positions.)

Perhaps a secure user/login system that works with the worksets could be used to protect critical items such as base points, grids, the structural model, etc.

Another issue with high density models is analysis. Some models can take days to get results from environmental analysis.

Large files would only work if they were cloud based and the processing of integrating the model was done local to where the model is stored. I would like to make fast changes out in the field/on site, and this would only be possible with either large files in the cloud or small files at a local level.

Good discussion point!

As much as I would love to see all disciplines in the same central model with careful and rigorous use of worksets, we have yet to see a WAN solution that provides enough speed and robustness to try. We frequently share the architectural model across our different offices, but not through replication; instead, we set up Citrix workstations in the local office where the central file resides.

As for file size, we break up buildings on a campus per building, but some files do reach 250-300 MB just for architectural. The linked consultant Revit files are 50-250 MB each. The main issues become lack of worksharing discipline (on the part of users) and opening time (upwards of 10-15 minutes).

Personally, I would like to know if anyone has compared ArchiCAD's BIM Server to Revit's solutions for file size and worksharing, and can give feedback on what could improve Revit (ArchiCAD is not really an option for us, but I can't help but be envious of its Mac compatibility and worksharing methods).

Thanks for bringing up the topic as this affects our profitability.

The major issue we came across while working on large projects and trying to keep everyone in the same model (just M, E & P) was not so much file size as the number of people simultaneously working in the model. At one point we had almost 30 people in one model!

This led to some interesting problems with trying to synchronize to central (especially around lunch time and close of business day). It began to take upwards of 25 minutes to save, even if you had just saved an hour ago.

It became the main reason to split up our models, and we now routinely split a model if we anticipate that more than 10-12 people will work on a particular project.

Apart from all of the very good points brought up in the comments above, this has been our biggest issue on large projects.

We run our multi-discipline models at upwards of 400 MB, which can include all disciplines. We run as many engineering disciplines as are on the job in one file and break the projects up by area, so larger buildings may be broken up into Basement & Tower, with Typicals in linked files.
This reduces our issues with running many files and allows all teams to coordinate across the Revit Server between multiple offices. However, strict adherence to worksets is necessary, along with well-set-up views, sheets, and filters. My only complaint is the inability to folder and filter schedules and legends, as this would aid our disciplines a great deal.

We're working with an M&E consultant that has split their model up by floor range (lower, upper, roof) for performance and worksharing purposes. This is no extra hassle for us in terms of coordination (although it seems to be causing them some issues, as there are regular errors where a pipe transitions from one model to the next). However, it is a pain when we want to alter visibility settings of the linked models in a view. We have to go through each linked model in turn and alter their visibility settings one by one. It would be nice to be able to apply the same settings to multiple links at the same time. Unless it already is possible, in which case, how?

As a multi-discipline firm, we have used both linked files and single models for projects. While there are benefits to both, we still find it necessary to split large models.

For us, decisions on where and how to split are often based on the complexity of the system (M+E) or the size of the model (Architectural) and less often on the project deliverables. The combination of these decisions can make a single project look like a maze when navigating through the files!

Hmmm, no penalty for file size/complexity. Wouldn't that be nice! Then our decision could really be based on other factors like staggered tender packages or internal versus external consultants.

Tom, you may be able to use view templates to control the links, but it is an all-or-nothing proposition in the current implementation.

All, thank you for sharing these comments. They help point to additional areas that need improvement (many known) and also validate our understanding of how the product is being used.

When I'm asked whether to use a linked-model workflow, my first question is always: Are you thinking about it because of performance problems? If the answer is no, then I always suggest reconsidering and perhaps leaving everything in one model, as there are many more issues that stem from linking which do not exist in a single-model environment. I'm mainly talking about separating one discipline into multiple models. Separating different disciplines is not that big of a deal, although that too has its own set of problems.

View templates need further granularity. If I have 10 links, I need to be able to control whether a specific view template controls certain links or not. They cannot just be an all-on or all-off affair.


Our 400-person firm does both, depending solely on project size, which is directly connected to the performance bone.

Your proposals include:

"no penalty for file size/complexity"

How would that work, exactly? Have you invented terabit networking? 20 GHz processors? Quantum-computing-based compression/decompression of the database? Do tell.

You also suggest "sophisticated visual filtering of the model based on element metadata". My observation of Revit is that I spend 20 hours per week designing MEP systems, and another 20 hours per week farking around with visibility issues - EVERY WEEK. An analogy: Imagine a theater with 300 very sophisticated lights for the stage. Then imagine that every light has at least a 4-way switch (some have more, but you can't always know that), and the switches are located all over the building, many in the attic, some in the basement, and an unknown number out in the parking lot behind the dumpster. This is Revit.

One small suggestion for simplifying visibility control: allow the user to select which entities will be automatically assigned to worksets. In the dialog box where we create worksets (say, M-Ducts), add a column where we can assign element categories to be put on that workset when they are created (ducts, duct fittings, duct accessories). A small step, as I said, but in the right direction.

Please work to streamline visibility control.



