Big matrices in MATLAB

Following my previous question in this post:



I'm trying to build a 128-row matrix with N columns (N up to many millions). The sole purpose of this matrix is to compute the column-wise mean or median of its 128 rows and save the result as a vector with the same number of columns as my data.
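For a matrix that fits in memory, the statistic itself is a one-liner; roughly what I do with the small files, with allChs1 assembled by the code below:

colMean   = mean(allChs1, 1);    % 1-by-N vector of column-wise means
colMedian = median(allChs1, 1);  % 1-by-N vector of column-wise medians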



The size of my data files varies a lot, and with smaller ones I've been able to do exactly this without any issues using the code from the previous post (see above). But with bigger datasets I run out of memory. Keep in mind that the error occurs while concatenating the 128 rows into a new matrix.



EDIT: the code used to concatenate the data from the files is the following:



for k = TTs                                   % TTs to plot
    cd(strcat('TT', num2str(k)));             % enter TT folder
    for w = 1:4
        load(strcat('TT', num2str(k), 'ch', num2str(w), '.mat'));
        allChs1(4*(k-1)+w,:) = data(1,:);     % concatenate into one matrix
    end
    cd ..
end


I've considered averaging the 128 rows column by column and saving the running result as I go, but I have been utterly unsuccessful in implementing it...



Any idea how I could implement this? And might there be a better way of getting the average of the 128 rows on a column-by-column basis?



Cheers,
Oiko

matlab for-loop signal-processing

asked Nov 15 '18 at 15:46 by Oiko (edited Nov 15 '18 at 16:08)

  • Show what code you've tried as a Minimal, Complete, and Verifiable example (use only a small input; we can imagine a bigger one), and point out why it doesn't work.

    – Wolfie
    Nov 15 '18 at 16:01

  • @Wolfie thanks, just added the code for the concatenation of data files

    – Oiko
    Nov 15 '18 at 16:09

  • Are you pre-allocating allChs1? With the code you show here, it gets re-allocated every time you add a row, meaning you're using way more memory than necessary.

    – Cris Luengo
    Nov 15 '18 at 16:41
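
A minimal preallocation sketch along those lines, assuming the column count can be read from a first file (the path here just follows the question's naming pattern):

tmp = load(fullfile('TT1', 'TT1ch1.mat'));    % peek at one file
allChs1 = zeros(128, size(tmp.data, 2));      % allocate once up front
clear tmp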

  • What is "many millions"? A 128x10,000,000 matrix of doubles needs about 9.5 GiB (128 × 10^7 values × 8 bytes). How much memory do you have available?

    – Cris Luengo
    Nov 15 '18 at 16:43

  • @Cris Luengo In fact I am not preallocating... The size of the matrix is not known beforehand, and I haven't managed to adapt the code to allocate it progressively. As for the size, the number of columns ranges from a few dozen to hundreds of millions

    – Oiko
    Nov 16 '18 at 11:42
1 Answer

You can compute the average incrementally, so that only one dataset and the running average are in memory at any time:



mean[n] = value[n]/n + mean[n-1]*(n-1)/n



avg_vector = 0;                        % becomes a vector on the first iteration
for k = TTs                            % TTs to plot
    folder = ['TT' num2str(k)];        % TT folder
    for w = 1:4
        file  = ['TT' num2str(k) 'ch' num2str(w) '.mat'];
        count = 4*(k-1) + w;
        load(fullfile(folder, file));
        avg_vector = (1/count)*data(1,:) + ((count-1)/count)*avg_vector;
    end
end


Up to a little round-off error from the repeated updates, this gives the same result as the direct average. The only large vectors in memory are avg_vector and data.
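
As a quick sanity check, the incremental formula can be verified against MATLAB's mean on a small random stand-in (not the question's files):

X = rand(128, 1000);                    % small stand-in for the real data
m = 0;
for count = 1:128
    m = (1/count)*X(count,:) + ((count-1)/count)*m;
end
max(abs(m - mean(X,1)))                 % on the order of 1e-15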



For the median this is more complicated, as there is no incremental formula. You may have to add another loop over chunks of 1:N and compute the median chunk by chunk.



filename = @(k,w) fullfile(['TT' num2str(k)], ['TT' num2str(k) 'ch' num2str(w) '.mat']);
load(filename(1,1));                % read one file to get the column count
N = size(data,2);
median_all = zeros(1,N);

stride = 1e6;                       % columns processed per chunk

for nn = 1:stride:N
    cols = nn:min(N, nn+stride-1);  % this chunk's columns ('cols' avoids shadowing the built-in rng)
    MAT  = zeros(128, length(cols));
    for k = TTs
        for w = 1:4
            load(filename(k,w));
            MAT(4*(k-1)+w,:) = data(1,cols);
        end
    end
    median_all(cols) = median(MAT,1);
    clear MAT
end


Matrix MAT holds at most 128 million values (128 rows × a stride of 10^6 columns), i.e. about 1 GB if data is a 64-bit type such as double. The downside is that every file has to be read several times: the trade-off shifts from memory consumption to file I/O.
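
If the repeated reads become the bottleneck, one possible alternative (an assumption, not part of the original approach) is MATLAB's matfile object, which can read just the needed columns from disk, provided the .mat files were saved with the -v7.3 flag:

mf = matfile(filename(k,w));   % handle to the file; nothing is loaded yet
chunk = mf.data(1, cols);      % reads only the columns in 'cols' from disk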







answered Nov 15 '18 at 16:58 by Brice (edited Nov 19 '18 at 9:14)

  • Thanks, I'll be trying this out in a moment!

    – Oiko
    Nov 16 '18 at 11:43

  • I've tested your solution for the average, and avg_vector ends up identical to the data from the last file opened in the loop; no average was computed. Any idea where the error might be?

    – Oiko
    Nov 18 '18 at 13:37

  • There is a typo in the formula. It should be avg_vector = (1/count) * data(1,:) + ((count-1)/count) * avg_vector(1,:);. I'll edit the answer

    – Brice
    Nov 19 '18 at 9:14