Memory increase in 32-bit Excel

For anyone who wasn't aware, my IT guys just told me that in December 2019 the maximum accessible memory for 32-bit Excel running on 64-bit Windows was increased from 2GB to 4GB. My ongoing battle to get 64-bit Excel playing nicely can take a break until next year, when my model has doubled in size again :-)

Source: https://docs.microsoft.com/en-us/office/troubleshoot/excel/laa-capabilit...
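
If you're not sure which combination you're actually running, a quick check from VBA is the Win64 compilation constant (only defined in 64-bit Office) plus the standard WOW64 environment variable for spotting 64-bit Windows from inside a 32-bit process. This is just a sketch - the messages and the interpretation of the limits are mine, so check them against the linked article:

    Sub ReportExcelBitness()
        ' Win64 is a conditional compilation constant that only exists in 64-bit Office.
        #If Win64 Then
            Debug.Print "64-bit Excel - the LAA limits discussed above don't apply."
        #Else
            ' PROCESSOR_ARCHITEW6432 is set by WOW64 when a 32-bit process runs
            ' on 64-bit Windows - the case where LAA raises the cap to 4GB.
            If Environ("PROCESSOR_ARCHITEW6432") <> "" Then
                Debug.Print "32-bit Excel on 64-bit Windows - LAA cap of 4GB applies."
            Else
                Debug.Print "32-bit Excel on 32-bit Windows - a lower LAA cap applies."
            End If
        #End If
    End Sub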

Michael Hutchens A+ 162

Hi Guys,

In my experience the real challenge with larger models is not file size on disk - Excel supports opening 1GB+ files without formulas - it's the number of calculation iterations the model requires each time it calculates.

Most Excel users don't really think about how Excel's calculation engine actually works, or the fact that it's essentially one big iterative loop. As a result, they commonly include a huge number of SUMIF, INDEX and MATCH functions in their models, often when doing so can and should be avoided, and thereby make Excel sweat and become unstable in the process.
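
To make the lookup point concrete (sheet and cell references below are made up for illustration), a block of outputs that all repeat the same INDEX/MATCH pair can usually be restructured so the MATCH is calculated once in a helper cell and every output just reuses it:

    Sub DemoSingleMatchHelper()
        With ThisWorkbook.Worksheets("Report")
            ' Before: every output cell repeats the same expensive MATCH, e.g.
            '   =INDEX(Data!B:B, MATCH($A2, Data!$A:$A, 0))
            ' After: the MATCH lives once in a helper column...
            .Range("Z2").Formula = "=MATCH($A2,Data!$A:$A,0)"
            ' ...and the output cells only do the cheap INDEX part.
            .Range("B2").Formula = "=INDEX(Data!B:B,$Z2)"
            .Range("C2").Formula = "=INDEX(Data!C:C,$Z2)"
        End With
    End Sub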

I've almost never come across a model that can't be made significantly faster via a combination of restructuring to remove repeated calculations - as Jun discusses above - and removing avoidable lookup-based functions. The last model I optimized went from a 9-second calculation time down to 0.6 seconds.
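
If you want to put a number on it before and after restructuring, a rough VBA timer around a forced full recalculation is usually enough (VBA's Timer has limited resolution, so treat the figure as indicative rather than exact):

    Sub TimeFullCalculation()
        Dim t As Double
        t = Timer
        Application.CalculateFullRebuild   ' force every formula to recalculate
        Debug.Print "Full calculation: " & Format(Timer - t, "0.00") & " seconds"
    End Sub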

The big thing to think about is whether or not each large calculation block really needs to recalculate every time Excel calculates. For example, I regularly see financial models that use tens or hundreds of thousands of SUMIF functions to restructure historical financial data into a format consistent with the historical financial statements. This approach is disastrous: the historical data is usually only updated once a month, yet the SUMIF functions end up killing the performance of the model all the time.

A few simple workarounds to this issue are:

  1. Put the filtering of the historical data into a separate workbook, which is only opened when historical data is updated; or
  2. Use a macro to insert and remove the SUMIF formulas before and after historical data is updated (see the sketch after this list); or
  3. Use a macro/API to bring the historical data into the model at the click of a button rather than via a huge bunch of formulas.
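
For what it's worth, a minimal sketch of workaround 2 might look like the following - the sheet names, target range and SUMIF mapping are placeholders for whatever your model actually uses:

    Sub RefreshHistoricalData()
        Dim rng As Range
        Set rng = ThisWorkbook.Worksheets("Historicals").Range("C5:N200")
        Application.Calculation = xlCalculationManual
        ' Write the SUMIF formulas (relative references adjust across the block)...
        rng.Formula = "=SUMIF(RawData!$A:$A,$A5,RawData!D:D)"
        rng.Calculate                 ' ...calculate just this block once...
        rng.Value = rng.Value         ' ...then replace the formulas with static values.
        Application.Calculation = xlCalculationAutomatic
    End Sub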

I use these approaches to aim for workbooks of <= 20MB on disk with a calculation time of under 1 second, but I must admit that can still be hard given the scoping requirements of some of our users...

M.

Michael Hutchens A+ 162

Yes, the really painful thing about memory limits is that the most likely time to crash is during a save, which is also the worst-case scenario.

Ultimately, the only really safe way to mitigate your issues is to run extremely powerful hardware and avoid 32-bit, but at least the Large Address Aware capability change for Excel has given some respite to users who can't easily move from 32-bit to 64-bit operating systems.