tdh10
USA
1 Posts
Posted - 11/13/2018 : 02:52:07 AM
For my thesis I ran into a problem that is causing more trouble than I expected. I measured the deviation of self-produced parts, and now I have a large number of files that need to be compared.
Every file contains many points indicating the deviation of one specific part from the CAD model. The mean for parts produced the same way should be compared to the target state. To do that, I have to sort, copy, sort again, take a mean, take another mean, and plot the statistics of, e.g., three deviation matrices.
For example, in file H-2.00-1-L I want to pick 25 points with the same y-value. So I sort by column G (the y-values), then copy out the fixed x-values (F1-F25), the target z-values (H1-H25) and the actual z-values (O1-O25). Together with the corresponding actual values from worksheets H-2.00-2-L and H-2.00-3-L (also in O1-O25), I can calculate their mean and plot it against the target values.
I understand basic programming but have never worked with Origin, so I am stumbling quite a lot here. I wrote some pseudo-code, but that's as far as I got; roughly, what I want looks like the sketch below.
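In rough LabTalk terms, I imagine it would be something like the following (untested, pieced together from what I found in the documentation; Book1, the column letters, and the output column P are my guesses based on my file layout):
=======
// with worksheet H-2.00-1-L active: sort rows by the y-value column (G = 7)
// (the same sort would have to be repeated for H-2.00-2-L and H-2.00-3-L)
wsort bycol:=7;
// pull the first 25 rows of x, target z, and actual z
range rx  = [Book1]"H-2.00-1-L"!col(F)[1:25];  // fixed x-values
range rzt = [Book1]"H-2.00-1-L"!col(H)[1:25];  // target z-values
range rz1 = [Book1]"H-2.00-1-L"!col(O)[1:25];  // actual z, part 1
range rz2 = [Book1]"H-2.00-2-L"!col(O)[1:25];  // actual z, part 2
range rz3 = [Book1]"H-2.00-3-L"!col(O)[1:25];  // actual z, part 3
// element-wise mean of the three parts, written to a spare column
range rmean = [Book1]"H-2.00-1-L"!col(P)[1:25];
rmean = (rz1 + rz2 + rz3)/3;
=======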
Maybe someone understands what I am looking for and can give me some helpful clues? Worst case, I can invest a week or two doing everything manually, but I don't like the thought of that.
Cheers to anyone who read my problem; if you have any questions or suggestions, feel free to ask. In the meantime I'll try to get a bit further along the scripting path.
YimingChen
1630 Posts
Posted - 11/13/2018 : 2:08:09 PM
Hi,
If you want to extract the X, target Z, and actual Z values for a certain Y value, run the following LabTalk script in Window->Script Window:
=======
// Collect the values of ds2 at every row where ds1 matches dd.
function dataset dsSearch(dataset ds1, dataset ds2, double dd)
{
    dataset ds3;
    int iRow = 1;
    for(int ii = 1; ii <= ds1.GetSize(); ii++)
    {
        // compare with a small tolerance; exact == is fragile for doubles
        if(abs(ds1[ii] - dd) < 1e-9)
        {
            ds3[iRow] = ds2[ii];
            iRow++;
        }
    }
    return ds3;
}

// ds1 = y-values to search (column 2), ds2 = column to extract from (column 1),
// r3 = output column in another book
range r1 = [Book2]1!2;
range r2 = [Book2]1!1;
range r3 = [Book3]1!1;

// extract all column-1 values whose y-value equals 0.5
r3 = dsSearch(r1, r2, 0.5);
=======
You can change the range definition to work on your own data. Thanks.
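If you run dsSearch once per part so the three sets of actual Z values land in, say, columns 1-3 of [Book3], you can then average them element-wise and plot the mean against the target. A minimal follow-up sketch (the book, sheet, and column positions here are placeholders; adjust them to wherever your extracted data ends up):
=======
range ra1 = [Book3]1!1;   // actual Z, part 1 (from H-2.00-1-L)
range ra2 = [Book3]1!2;   // actual Z, part 2 (from H-2.00-2-L)
range ra3 = [Book3]1!3;   // actual Z, part 3 (from H-2.00-3-L)
range rt  = [Book3]1!4;   // target Z values
range rm  = [Book3]1!5;   // column to receive the mean
rm = (ra1 + ra2 + ra3)/3; // element-wise mean of the three parts
// scatter plot: target Z as X, mean actual Z as Y
plotxy iy:=[Book3]1!(4,5) plot:=201;
=======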
James