The Origin Forum
 Multi Table Mean Value, Element by Element


T O P I C    R E V I E W
Sonafab Posted - 09/21/2015 : 2:57:57 PM
OriginPro Ver. 9.0.0
Operating System: Win7

Hello, I'm new to OriginLab, so please bear with me.

I have n tables of size m×m. I want a new table filled with the element-by-element mean of the n tables. For example, the (1,1) element of the new table will be the mean of the (1,1) elements of the n tables.

I want the process to be automated, so that I won't have to change all the variables every time I run it on a different set of tables.

Thanks a lot,
Mike
4   L A T E S T    R E P L I E S    (Newest First)
Sonafab Posted - 09/28/2015 : 5:19:40 PM
Thanks a lot, that's some very useful information! I'll use vector calculation then.
Hideo Fujii Posted - 09/28/2015 : 3:27:44 PM
Hi Mike,

For such a large task, you would be better off using vector calculation rather than scalar calculation.
For example, I tried the following experiment, in which a column of 1,000,000 rows in each of three
worksheets is averaged row by row into an output column:

///////////////////////////////////////
range d1=1!col(1); //input, output in TEST3
range d2=2!col(1); //input
range d3=3!col(1); //input
range d4=4!col(1); //output in TEST1 and TEST2
///////// TEST1 by looping
for(ii=1; ii<=1000000; ii++) d4[ii]=(d1[ii]+d2[ii]+d3[ii])/3;
///////// TEST2 by vector expression
d4=(d1+d2+d3)/3;
///////// TEST3 by vector self-assignment (destructive)
d1+=d2;
d1+=d3;
d1/=3;
////////////////////////////////////////////////////////////

The resulting lap times of the experiments were:

1) By Scalar: 156.3 sec
2) By Vector: 1.8 sec
3) By Self-assigning Vector: 0.61 sec

So the vector method is about 87 times faster than the scalar one, and the self-assigning vector
method is about 256 times faster than the scalar one.

So you should adopt the vector calculation, and if there is no need to preserve the original column,
you can even adopt the self-assigning variant. (In addition, with the vector method
you don't have to know the number of rows in the columns.)
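Since you mentioned Python, the same loop-versus-vector contrast can be sketched with NumPy. This is an analogy only, not LabTalk: the arrays below are illustrative stand-ins for the three worksheet columns, and the names are made up.

```python
import numpy as np

# Illustrative stand-ins for three 1,000,000-row worksheet columns.
n_rows = 1_000_000
rng = np.random.default_rng(0)
d1 = rng.random(n_rows)
d2 = rng.random(n_rows)
d3 = rng.random(n_rows)

# TEST1 analogue: scalar-style loop over rows (slow in Python, too).
out_loop = np.empty(n_rows)
for i in range(n_rows):
    out_loop[i] = (d1[i] + d2[i] + d3[i]) / 3

# TEST2 analogue: one vector expression over whole columns (fast).
out_vec = (d1 + d2 + d3) / 3

assert np.allclose(out_loop, out_vec)
```

The speed gap has the same cause in both environments: the vector form moves the per-element work out of the interpreted loop.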

--Hideo Fujii
OriginLab
Sonafab Posted - 09/21/2015 : 3:07:59 PM
The problem with this is that it will take hours for n = 100 tables of a billion elements each, and I need to run it about 200 times, because the data for my research are huge. So I'm trying to write code that will save a lot of time.

By the way, I don't know the number of elements (i.e., the dimensions of the tables) in advance, so I need to count them.

I also don't know anything about C or C++ (only Fortran and Python).
lkb0221 Posted - 09/21/2015 : 3:02:25 PM
Simply loop through all worksheets to sum up the i-th column (i = 1 … M), then divide by N at the end.
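In Python terms (since you know Python), the idea is just a running sum divided by N. This sketch uses nested lists as a hypothetical stand-in for the worksheets, and it discovers the dimensions from the data instead of hard-coding them:

```python
def elementwise_mean(tables):
    """Mean of element (i, j) across all tables (assumed same shape)."""
    n = len(tables)
    rows, cols = len(tables[0]), len(tables[0][0])  # dimensions discovered, not hard-coded
    return [[sum(t[i][j] for t in tables) / n for j in range(cols)]
            for i in range(rows)]

t1 = [[1, 2], [3, 4]]
t2 = [[3, 4], [5, 6]]
print(elementwise_mean([t1, t2]))  # [[2.0, 3.0], [4.0, 5.0]]
```

The same accumulate-then-divide structure carries over to a LabTalk loop over worksheets.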

The Origin Forum © 2020 OriginLab Corporation
Snitz Forums 2000