I'm going to get a little philosophical with this post.
I recall when I first started working on Revit how it pushed the CPU on my Pentium harder than anything I had ever seen before - much harder than our company mail server pushed its CPU, and I thought that machine was doing quite a bit of work. Later I taught students my Revit theory on the conservation of work. The theory posits that the effort of making a set of drawings consistent after, say, a floor-to-floor change does not disappear; it moves to the processor. The effort shifts from man to machine.
I also remember a conversation with one of our developers. I asked whether Moore's Law might work in our favor: perhaps Revit was just a little ahead of the curve, and processing capacity would soon catch up. He confidently assured me that any new capacity would be eaten up by new product features and/or changes in what users model. He has been right so far, despite several major efforts to enhance performance.
My question to readers: does this ever stop? If so, when? Think ahead to the future. What will you expect to model then, and why? Will it be less or more? Perhaps model more and annotate less? What is the projection, and can we even discern a logical end?