
out of memory?
archived_post
Posted on 02/02/07 12:43:43
Number of posts: 100
archived_post posts:

Author: Drew (142.20.199.---)
Date:   01-23-07 16:35

PLS newbie here
ER-fMRI
10 runs, each with 490 scans, 7 conditions

When attempting to "Create ST Datamat", the progress bar gets about 85% complete before I get an "out of memory" error (see below).

I have 1GB. Is that not enough? Is there any way I can calculate my memory requirements beforehand?

What do I do? Buy more RAM? Increase virtual memory? Use a smaller data size?

Thanks for any insights


??? Error using ==> vertcat
Out of memory. Type HELP MEMORY for your options.

Error in ==> fmri_get_datamat at 457
datamat = [datamat; single(double(evt_datamat) / num_onsets)];

Error in ==> fmri_create_datamat_ui>RunGenDatamat at 1532
fmri_get_datamat(session_file,i,options.Threshold, ...

Error in ==> fmri_create_datamat_ui at 71
RunGenDatamat;

??? Error using ==> waitfor
Error while evaluating uicontrol Callback.
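
For context, the failing line in the trace grows datamat by concatenation, so every append must find one contiguous block big enough for everything accumulated so far plus the new rows. A minimal sketch of that pattern in MATLAB (sizes here are tiny and illustrative; the real blocks are far larger):

% growing by concatenation: each append copies all previous rows and
% needs one contiguous block for the whole result
datamat = [];
for i = 1:10                              % e.g., one block per run
    blk = zeros(4, 54*64*50, 'single');   % dummy rows; real blocks are much bigger
    datamat = [datamat; blk];             % this is the vertcat that runs out of memory
end
% preallocating the full array up front would avoid the repeated copies:
datamat = zeros(4*10, 54*64*50, 'single');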

Replies:

Re: out of memory?
archived_post
Posted on 02/02/07 12:44:05
Number of posts: 100
archived_post replies:

Author: Jimmy (---.rotman-baycrest.on.ca)
Date:   01-23-07 16:39

Try running it on Matlab 7, which supports single-precision calculation. Otherwise, you may need to resample the data to a smaller size.
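
For a sense of the savings, a quick check in MATLAB (illustrative sizes, not the actual datamat):

x = zeros(100, 100);             % double: 8 bytes per element = 80000 bytes
y = zeros(100, 100, 'single');   % single: 4 bytes per element = 40000 bytes
whos x y                         % the Bytes column shows the 2x difference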

Drew wrote:
> PLS newbie here
> ER-fMRI
> 10 runs, each with 490 scans, 7 conditions
>
> When attempting to "Create ST Datamat", the progress bar gets
> about 85% complete before I get an "out of memory" error (see
> below).
>
> I have 1GB. Is that not enough? Is there any way I can
> calculate my memory requirements beforehand?
>
> What do I do? Buy more RAM? Increase virtual memory? Use a
> smaller data size?
>
> Thanks for any insights
>
>
> ??? Error using ==> vertcat
> Out of memory. Type HELP MEMORY for your options.
>
> Error in ==> fmri_get_datamat at 457
> datamat = [datamat;
> single(double(evt_datamat) / num_onsets)];
>
> Error in ==> fmri_create_datamat_ui>RunGenDatamat at 1532
> fmri_get_datamat(session_file,i,options.Threshold,
> ...
>
> Error in ==> fmri_create_datamat_ui at 71
> RunGenDatamat;
>
> ??? Error using ==> waitfor
> Error while evaluating uicontrol Callback.


Re: out of memory?
archived_post
Posted on 02/02/07 12:44:21
Number of posts: 100
archived_post replies:

Author: Drew (142.20.199.---)
Date:   01-23-07 17:01

I am running Matlab 7(.0.4)

Brain volumes are 54x64x50 voxels. Too big?


Re: out of memory?
archived_post
Posted on 02/02/07 12:44:36
Number of posts: 100
archived_post replies:

Author: Jimmy (---.rotman-baycrest.on.ca)
Date:   01-23-07 17:16

> I am running Matlab 7(.0.4)
>
> Brain volumes are 54x64x50 voxels. Too big?

The size of the datamat depends on the number of onsets and the temporal window size:

datamat_size = sum_over_all_runs_and_conditions(num_onsets * temporal_window * brain_vol_voxels * bytes_per_value)

Let's say you have 10 runs and 7 conditions, each condition with 50 onsets, a temporal window of 8, and 32-bit (4-byte) precision. Your datamat size will be:

size = 10 * 7 * 50 * 8 * 4 * (54*64*50) bytes = about 19 GB

You can recalculate this based on your own situation.
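
The same arithmetic in MATLAB, using the example numbers above (substitute your own onset counts and window):

num_runs   = 10;
num_conds  = 7;
num_onsets = 50;          % per condition (example value)
win        = 8;           % temporal window in scans
nbytes     = 4;           % single precision
brain_vol  = 54*64*50;    % voxels per volume
datamat_bytes = num_runs * num_conds * num_onsets * win * nbytes * brain_vol;
fprintf('datamat size: %.1f GB\n', datamat_bytes / 1e9);   % ~19.4 GB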


Re: out of memory?
archived_post
Posted on 02/02/07 12:44:52
Number of posts: 100
archived_post replies:

Author: Drew (142.20.199.---)
Date:   01-23-07 18:13

Hmm, 19 GB seems kinda large - is this a normal size? How does anyone process a file that huge, with a 19 GB memory requirement? Am I doing this right? Or should I be creating one datamat for each subject (a mere 1.9 GB each, even though the original input data file is only 0.3 GB...)?
Or am I just going to get thwarted later in the run stage when PLS tries to put them all together?

Anyways, thanks for the info.

Drew


Re: out of memory?
archived_post
Posted on 02/02/07 12:45:12
Number of posts: 100
archived_post replies:

Author: Jimmy (---.rotman-baycrest.on.ca)
Date:   01-23-07 18:19

> should I be creating 1 datamat for each
> subject? (a mere 1.9G each?) (even though the original input
> data file is only 0.3G...)

As I mentioned earlier, you can "...resample the data to a smaller size...", or even just analyze a subset of the slices (see the sketch below).

> Or am I just going to get thwarted later in the run stage when
> PLS tries to put them all together?

Yes, it is still possible that you will get an "out of memory" error in the run stage.
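
As a rough sketch of the resampling and slice-selection options (vol here is a hypothetical 54x64x50 volume array, not a PLS variable):

vol = zeros(54, 64, 50, 'single');           % hypothetical brain volume
vol_small = vol(1:2:end, 1:2:end, 1:2:end);  % keep every 2nd voxel: ~1/8 the memory
vol_slices = vol(:, :, 20:30);               % or analyze only a subset of slices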


Re: out of memory?
archived_post
Posted on 02/02/07 12:45:32
Number of posts: 100
archived_post replies:

Author: Randy (---.rotman-baycrest.on.ca)
Date:   01-23-07 18:24

The datamat size estimate that you got assumes all voxels are considered brain, which is not the case: the corners of the image volume are zeroed out during datamat creation.

One thing that I wonder about is your statement about creating a datamat for each subject. Is this not what you are doing already? Are the 10 runs you mention actually 10 subjects? If so, they should be run independently, so there is a session and datamat for each, which are recombined later.

Drew wrote:

> Hmm, 19 GB seems kinda large - is this a normal size? How does
> anyone process a file that huge, with a 19 GB memory
> requirement? Am I doing this right? Or should I be creating one
> datamat for each subject (a mere 1.9 GB each, even though the
> original input data file is only 0.3 GB...)?
> Or am I just going to get thwarted later in the run stage when
> PLS tries to put them all together?
>
> Anyways, thanks for the info.
>
> Drew


Untitled Post
katerimcrae
Posted on 02/13/07 20:10:50
Number of posts: 2
katerimcrae replies:

I have also had 'out of memory' problems, although only when running a group analysis with ER-fMRI data (24 subjects, 3 conditions, 30 trials per condition with 4 scans per trial). I've spent a bunch of time playing with my OS (apparently Windows is not so awesome at getting large pieces of contiguous memory ready for Matlab to make a huge datamat) and even partitioned my drive and installed a different OS to see if that could help. I was able to tweak things so that I could run a mean-centered PLS on all 24 subjects, but when I tried to run a behavior PLS with the same subjects' datamats (3 behavior variables) it crashed, giving me an 'out of memory' error when it got to the first BS (after 100 perms).

I'm not sure if this is an unusually large data set, or if I should keep fiddling with memory management on my machine, or if I can do something to make Matlab better at handling the large matrices. Back when I couldn't even run the mean-centered PLS, I could get it to run to completion on an older Linux machine, but it literally took 24 hours to do EACH perm and BS. So I'd love to get my newer machine to handle the behavior PLS so I can peek at my seed analyses.


Ideas to help with memory overhead
rmcintosh
Posted on 02/13/07 20:29:58
Number of posts: 394
rmcintosh replies:

quote:
I have also had 'out of memory' problems, although only when running a group analysis with ER-fMRI data (24 subjects, 3 conditions, 30 trials per condition with 4 scans per trial). I've spent a bunch of time playing with my OS (apparently Windows is not so awesome at getting large pieces of contiguous memory ready for Matlab to make a huge datamat) and even partitioned my drive and installed a different OS to see if that could help. I was able to tweak things so that I could run a mean-centered PLS on all 24 subjects, but when I tried to run a behavior PLS with the same subjects' datamats (3 behavior variables) it crashed, giving me an 'out of memory' error when it got to the first BS (after 100 perms).

I'm not sure if this is an unusually large data set, or if I should keep fiddling with memory management on my machine, or if I can do something to make Matlab better at handling the large matrices. Back when I couldn't even run the mean-centered PLS, I could get it to run to completion on an older Linux machine, but it literally took 24 hours to do EACH perm and BS. So I'd love to get my newer machine to handle the behavior PLS so I can peek at my seed analyses.
Hi Kateri,

Your dataset is not tremendously large, so I am not sure why you are getting out of memory errors. On a Windows machine, it is usually important to minimize overhead as much as possible, so one thing that can help is running MATLAB without the Java virtual machine:

matlab -nojvm

This may help. The latest version of PLS will convert data to single precision (if you are running Matlab v7+), which also helps a great deal.

Randy
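
After starting with matlab -nojvm, you can confirm inside MATLAB that the JVM really is off (usejava is a standard MATLAB function):

usejava('jvm')   % returns 0 (false) when MATLAB was started with -nojvm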


Untitled Post

nlobaugh
Posted on 02/13/07 22:55:58
Number of posts: 229
nlobaugh replies:

Kateri,
If you are not averaging across trials and runs, that will also increase your memory requirements.
Nancy
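
A quick worked example with Kateri's numbers, per subject:

rows_unaveraged = 3 * 30 * 4;   % conditions x trials x scans per trial = 360 rows
rows_averaged   = 3 * 4;        % trial-averaged: 12 rows, i.e. 30x fewer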


Untitled Post
katerimcrae
Posted on 02/14/07 04:00:47
Number of posts: 2
katerimcrae replies:

quote:
Hi Kateri,

Your dataset is not tremendously large, so I am not sure why you are getting out of memory errors. On a Windows machine, it is usually important to minimize overhead as much as possible, so one thing that can help is running MATLAB without the Java virtual machine:

matlab -nojvm

This may help. The latest version of PLS will convert data to single precision (if you are running Matlab v7+), which also helps a great deal.

Randy
Thanks, Randy! The version I was using was not the very latest. However, when I downloaded the new code and tried to run the same analysis, I got an error from Windows about sgemm ("The specified procedure could not be found") and then Matlab crashed in a serious way and gave me a message about a segmentation violation. A quick Google search shows that sgemm is the BLAS routine for single-precision matrix multiplication -- might the crash be related to the new single-precision code, or is it a horrible coincidence?


