The software needs to store growing volumes of data without causing technical issues or imposing undue financial burden. Hence, the requirement calls for effective data storage management that balances cost-effectiveness, risk mitigation, and system performance.
Scenario: A wholesale business operates in several markets, each experiencing uncertainty. The business creates new what-if scenarios in all of these markets, which in effect creates an entirely new database each time. This slows the product down and doubles database storage each time a what-if is created.
Solution: The user works with the software vendor to analyze their usage. They determine that, by cloning only a subset of the model, they can accomplish the same thing without duplicating the entire database. They also work with the vendor to upgrade to a more enterprise-grade service level, giving them access to more elasticity as they grow.
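For illustration only, here is a back-of-the-envelope sketch of the storage math in the scenario above; the 50 GB base size and 10% subset fraction are assumptions for the example, not figures from the scenario:

```python
# Hypothetical storage growth: cloning the whole database per what-if
# versus cloning only the subset of the model the scenario needs.

base_gb = 50           # assumed base model size
subset_fraction = 0.1  # assume a what-if only needs ~10% of the model

def storage_after(n_scenarios: int, clone_fraction: float) -> float:
    """Total storage when each what-if clones clone_fraction of the base."""
    return base_gb + n_scenarios * base_gb * clone_fraction

for n in (1, 5, 10):
    full = storage_after(n, 1.0)              # clone the entire database
    subset = storage_after(n, subset_fraction)  # clone only the needed subset
    print(f"{n:>2} scenarios: full clone {full:>5.0f} GB, subset clone {subset:>5.0f} GB")
```

Under these assumptions, ten full clones balloon storage to 11x the base model, while subset clones only double it.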
Some systems bloat quickly as more models are created. This can be due to the way they store data: counting millions of empty cells as real data and subsequently charging you for them. The problem is particularly acute in OLAP or cube database environments, where most cell intersections are naturally empty.
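To make the empty-cell problem concrete, here is a minimal sketch of dense versus sparse cube storage; the dimension sizes and 1% fill rate are illustrative assumptions, not figures from any particular product:

```python
import random

# Hypothetical cube dimensions: 1,000 products x 50 regions x 36 periods.
products, regions, periods = 1_000, 50, 36
total_cells = products * regions * periods  # 1,800,000 cell intersections

# Dense storage: every cell is materialized, even when empty.
dense = [0.0] * total_cells  # 1.8M values stored (and billed)

# Sparse storage: only populated cells are kept.
fill_rate = 0.01  # assume just 1% of intersections hold real data
sparse = {
    cell: random.random()
    for cell in random.sample(range(total_cells), int(total_cells * fill_rate))
}

print(f"Dense cells stored:  {len(dense):,}")   # 1,800,000
print(f"Sparse cells stored: {len(sparse):,}")  #    18,000 (100x fewer)
```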
Version creation does tend to make the storage footprint per customer instance large. This is because some systems lack the sophistication to copy just one portion of a version when creating a scenario; they must clone the entire thing, and once cloned, the copies remain fully independent. If you plan to create many what-if scenarios, make sure to ask about this.
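As a rough sketch of what that sophistication can look like (not any vendor's actual implementation), a copy-on-write scenario stores only the cells it changes and reads everything else from the base version, instead of duplicating it outright:

```python
import copy

# A base version keyed by (region, period); values are hypothetical.
base_version = {("north", "2024-01"): 100.0, ("south", "2024-01"): 80.0}

# Full clone: duplicates every cell, adding a whole copy per scenario.
full_clone = copy.deepcopy(base_version)
full_clone[("north", "2024-01")] = 120.0

class Scenario:
    """Copy-on-write what-if: stores overrides, shares the rest."""

    def __init__(self, base: dict):
        self.base = base      # shared, read-only reference to the base
        self.overrides = {}   # only the cells this scenario changes

    def __getitem__(self, cell):
        return self.overrides.get(cell, self.base[cell])

    def __setitem__(self, cell, value):
        self.overrides[cell] = value

what_if = Scenario(base_version)
what_if[("north", "2024-01")] = 120.0

print(len(full_clone))         # 2 cells duplicated outright
print(len(what_if.overrides))  # 1 cell stored; the rest stays shared
```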