Hi Mike,
For such a large task, you would be better off using vector calculations rather than scalar ones.
For example, I ran the following experiment, in which three columns of 1,000,000 rows in three
worksheets are averaged row by row:
///////////////////////////////////////
range d1=1!col(1); //input, output in TEST3
range d2=2!col(1); //input
range d3=3!col(1); //input
range d4=4!col(1); //output in TEST1 and TEST2
///////// TEST1 by looping
for(ii=1; ii<=1000000; ii++) d4[ii]=(d1[ii]+d2[ii]+d3[ii])/3;
///////// TEST2 by vector expression
d4=(d1+d2+d3)/3;
///////// TEST3 by vector self-assignment (destructive)
d1+=d2;
d1+=d3;
d1/=3;
////////////////////////////////////////////////////////////
The resulting lap times of the experiments were:
1) By Scalar: 156.3 sec
2) By Vector: 1.8 sec
3) By Self-assigning Vector: 0.61 sec
So the vector method is about 86.8 times faster than the scalar loop, and the self-assigning vector
method is about 256.2 times faster.
Therefore, you should adopt the vector calculation, and if there is no need to preserve the original
column, you can even adopt the self-assigning variant. (As a bonus, with the vector method you
don't have to know the number of rows in the columns.)
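The same loop-versus-vector gap shows up in other environments too. As a rough analogue (not part of the original LabTalk experiment, and with variable names of my own choosing), here is a NumPy sketch that averages three columns both ways and times them:

```python
import time
import numpy as np

n = 1_000_000
rng = np.random.default_rng(0)
d1, d2, d3 = (rng.random(n) for _ in range(3))

# Analogue of TEST1: scalar loop, element by element
t0 = time.perf_counter()
out_loop = np.empty(n)
for i in range(n):
    out_loop[i] = (d1[i] + d2[i] + d3[i]) / 3
t_loop = time.perf_counter() - t0

# Analogue of TEST2: one vector expression over whole arrays
t0 = time.perf_counter()
out_vec = (d1 + d2 + d3) / 3
t_vec = time.perf_counter() - t0

print(f"loop: {t_loop:.3f} s, vector: {t_vec:.3f} s")
# Both approaches produce the same averages
assert np.allclose(out_loop, out_vec)
```

The exact speedup ratio will differ from the LabTalk numbers above, but the vector form should be dramatically faster for the same reason: the per-element work is dispatched once for the whole array instead of once per row.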
--Hideo Fujii
OriginLab