quote:
I have also had 'out of memory' problems, although only when running a group analysis with ER-fMRI data (24 subjects, 3 conditions, 30 trials each condition with 4 scans per trial). I've spent a bunch of time playing with my OS (apparently Windows is not so good at getting large pieces of contiguous memory ready for MATLAB to make a huge datamat) and even partitioned my drive and installed a different OS to see if that could help. I was able to tweak things so that I could run a mean-centered PLS on all 24 subjects, but when I tried to run a behavior PLS with the same subjects' datamats (3 behavior variables) it crashed with an 'out of memory' error when it got to the first BS (after 100 perms).
I'm not sure if this is an unusually large data set, whether I should keep fiddling with memory management on my machine, or whether I can do something to make MATLAB better at handling the large matrices. Back when I couldn't even run the mean-centered PLS, I could get it to run to completion on an older Linux machine, but it literally took 24 hours for EACH perm and BS. So I'd love to get my newer machine to handle the behavior PLS so I can peek at my seed analyses.
Hi Katerie,
Your dataset is not tremendously large, so I am not sure why you are getting out-of-memory errors. On a Windows machine it is usually important to minimize overhead as much as possible, so one thing that can help is running MATLAB without the Java virtual machine:
matlab -nojvm
This may help. The latest version of PLS will convert data to single precision (if you are running MATLAB v7+), which also helps a great deal.
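For a sense of why single precision helps so much, here is a rough sketch in Python/NumPy (float64 and float32 correspond to MATLAB's double and single; the matrix dimensions are illustrative assumptions, not taken from the post):

```python
import numpy as np

# Hypothetical datamat: rows = scans stacked across subjects/conditions/trials,
# columns = voxels. Dimensions here are made up purely for illustration.
datamat = np.zeros((8640, 2000))             # NumPy default is double (8 bytes/element)
datamat_single = datamat.astype(np.float32)  # single precision (4 bytes/element)

print(datamat.nbytes // 2**20)         # size in MB as double
print(datamat_single.nbytes // 2**20)  # size in MB as single: half as much
```

Since the permutation and bootstrap steps each work on copies of the datamat, halving the per-element storage roughly halves the peak memory the analysis needs.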
Randy