The Origin Forum
LabTalk Forum
 Multi Table Mean Value, Element by Element

Sonafab

3 Posts

Posted - 09/21/2015 :  2:57:57 PM
OriginPro Ver. 9.0.0
Operating System: Win7

Hello, I'm new to OriginLab, so please bear with me.


I have n tables of size m×m. I want a new table filled with the element-by-element mean of the n tables. For example, the (1,1) element of the new table will be the mean of the (1,1) elements of the n tables.

I want the process to be automated, so I won't have to change all the variables every time I run it for different sets of tables.

Thanks a lot,
Mike

lkb0221

China
497 Posts

Posted - 09/21/2015 :  3:02:25 PM
Simply loop through all worksheets, adding the i-th column (i = 1 ~ M) into a running sum, then divide by N at the end.
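
A minimal LabTalk sketch of this idea (a sketch only; the sizes below are placeholders, and it assumes the N input tables sit in sheets 1..N of the active book with the result going into sheet N+1, which already exists):

// Scalar sketch of the loop suggested above: for each column jj and
// row ii, sum the (ii,jj) cells across the N input sheets, then
// divide by N to get the mean.
int N = 3;        // number of input worksheets (placeholder)
int M = 3;        // number of columns per table (placeholder)
int nRows = 100;  // number of rows per column (placeholder)
loop (jj, 1, M)
{
    range rout = $(N+1)!wcol(jj);    // output column in sheet N+1
    loop (ii, 1, nRows)
    {
        double s = 0;
        loop (kk, 1, N)
        {
            range rin = $(kk)!wcol(jj);  // input column in sheet kk
            s = s + rin[ii];
        }
        rout[ii] = s / N;            // element-by-element mean
    }
}

Note that cell-by-cell access like this is slow in LabTalk, which matters at the scales discussed later in the thread.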

Sonafab

3 Posts

Posted - 09/21/2015 :  3:07:59 PM
The problem with this is that it would take hours for n = 100 tables of a billion elements each, repeated about 200 times, because the data for my research are huge. So I'm trying to write code that saves a lot of time.

By the way, I don't know the number of elements (i.e., the dimension of the tables) in advance, so I need to count it.

I also don't know anything about C or C++ (only Fortran and Python).

Edited by - Sonafab on 09/21/2015 3:11:39 PM

Hideo Fujii

USA
1582 Posts

Posted - 09/28/2015 :  3:27:44 PM
Hi Mike,

For such a large task, you would be better off using vector calculations rather than scalar ones.
For example, I tried the following experiment, in which three columns with 1,000,000 rows in three
worksheets are averaged row by row:

///////////////////////////////////////
range d1=1!col(1); //input, output in TEST3
range d2=2!col(1); //input
range d3=3!col(1); //input
range d4=4!col(1); //output in TEST1 and TEST2
///////// TEST1 by looping
for(ii=1; ii<=1000000; ii++) d4[ii]=(d1[ii]+d2[ii]+d3[ii])/3;
///////// TEST2 by vector expression
d4=(d1+d2+d3)/3;
///////// TEST3 by vector self-assignment (destructive)
d1+=d2;
d1+=d3;
d1/=3;
////////////////////////////////////////////////////////////

The resulting lap times of the experiments were:

1) By Scalar: 156.3 sec
2) By Vector: 1.8 sec
3) By Self-assigning Vector: 0.61 sec

So the vector method is about 87 times faster than the scalar one, and the self-assigning vector method
is about 256 times faster than the scalar one.

So you should adopt the vector calculation, and if there is no need to preserve the original columns,
you can even use the self-assigning variant. (In addition, with the vector method
you don't need to know the number of rows in the columns.)
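
To automate this for an unknown number of sheets and columns, as the original question asks, the vector method can be wrapped in a loop. A hedged sketch (it assumes all input sheets live in the active workbook with identical layouts, and uses `page.nlayers` and `wks.ncols` to count them; `Mean` is a placeholder sheet name):

// Sketch: element-by-element mean over all sheets of the active book,
// using whole-column (vector) operations only.
int N = page.nlayers;            // number of input worksheets
page.active = 1;
int M = wks.ncols;               // columns per input table
newsheet name:=Mean cols:=M;     // append an output sheet (sheet N+1)
loop (jj, 1, M)
{
    range rout = $(N+1)!wcol(jj);
    range r1 = 1!wcol(jj);
    rout = r1;                   // start from sheet 1's column
    loop (kk, 2, N)
    {
        range rin = $(kk)!wcol(jj);
        rout += rin;             // vector add: one whole column at a time
    }
    rout /= N;                   // divide once at the end
}

Each column then costs roughly N vector operations instead of N scalar additions per cell, which is where the speed-up measured above comes from.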

--Hideo Fujii
OriginLab

Edited by - Hideo Fujii on 09/28/2015 3:29:57 PM

Sonafab

3 Posts

Posted - 09/28/2015 :  5:19:40 PM
Thanks a lot, that's very useful information! I'll use vector calculation then.
The Origin Forum © 2020 Originlab Corporation