TOPIC REVIEW

udhaya
Posted - 07/22/2023 : 09:08:59 AM

I am trying to fit a dataset to an exponential decay function (nonlinear fit). When I start the fit with rough estimates of the parameters, the first few iteration cycles give a fitting curve that looks visually acceptable. The error values are low (< 5% of each of the three fitting parameters) and COD (R²) > 0.985. As soon as I increase the iterations by hitting "fit until converged", the resulting final parameters have error values > 100% of the fitting parameters, with COD (R²) > 0.99. I understand that the default criterion of the fit is to bring COD (R²) as close to one as possible.

Is there a way to achieve low error in the fitting parameters at the cost of COD (R²)? Is it possible to tell Origin to iterate so that the errors stay within a limit while still bringing COD (R²) as close to unity as possible? Will appreciate any help.

Thanks
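
[Editor's note] For reference, the "error" reported for each parameter in this kind of fit is its standard error, taken from the diagonal of the parameter covariance matrix, while COD (R²) comes from the residual and total sums of squares. Below is a minimal sketch of these quantities using SciPy's curve_fit, assuming a three-parameter model y = y0 + A*exp(-x/t1) (the ExpDec1 form) and synthetic data standing in for the actual dataset:

import numpy as np
from scipy.optimize import curve_fit

# Assumed single-exponential decay model (ExpDec1 form): y = y0 + A*exp(-x/t1)
def exp_decay(x, y0, A, t1):
    return y0 + A * np.exp(-x / t1)

# Synthetic data standing in for the real dataset
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 100)
y = exp_decay(x, 0.5, 2.0, 1.5) + rng.normal(0.0, 0.05, x.size)

# Nonlinear least-squares fit from rough initial guesses
popt, pcov = curve_fit(exp_decay, x, y, p0=[0.4, 1.8, 1.2])

# Parameter "errors": standard errors from the covariance diagonal
perr = np.sqrt(np.diag(pcov))

# COD (R^2) from residual and total sums of squares
res = y - exp_decay(x, *popt)
r2 = 1.0 - np.sum(res**2) / np.sum((y - y.mean())**2)

for name, p, e in zip(("y0", "A", "t1"), popt, perr):
    print(f"{name} = {p:.4f} +/- {e:.4f}  ({100 * e / abs(p):.1f}%)")
print(f"COD (R^2) = {r2:.5f}")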
               
              
1 LATEST REPLIES (Newest First)
               
              
YimingChen
Posted - 07/24/2023 : 08:41:12 AM

If the errors of the fitted parameters are large, it means the fitting function is over-parameterized. You may consider reducing the number of parameters in the model. Please refer to this page: https://www.originlab.com/doc/Origin-Help/The_Reason_Why_Fail_to_Converge#Over-parameterized_functions

James
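
[Editor's note] To illustrate the over-parameterization point, here is a sketch (SciPy, with synthetic data generated from a single exponential decay, not the poster's dataset) that fits both a one-exponential and a redundant two-exponential model. The extra term leaves R² essentially unchanged but makes the parameters strongly correlated, so their relative standard errors blow up, which matches the behaviour described in the question:

import numpy as np
from scipy.optimize import curve_fit

# Synthetic data generated from a SINGLE exponential decay
rng = np.random.default_rng(1)
x = np.linspace(0, 10, 100)
y = 0.5 + 2.0 * np.exp(-x / 1.5) + rng.normal(0.0, 0.05, x.size)

def one_exp(x, y0, A, t1):            # adequate model
    return y0 + A * np.exp(-x / t1)

def two_exp(x, y0, A1, t1, A2, t2):   # over-parameterized model
    return y0 + A1 * np.exp(-x / t1) + A2 * np.exp(-x / t2)

def report(label, model, p0):
    popt, pcov = curve_fit(model, x, y, p0=p0, maxfev=20000)
    perr = np.sqrt(np.diag(pcov))                 # parameter standard errors
    res = y - model(x, *popt)
    r2 = 1.0 - np.sum(res**2) / np.sum((y - y.mean())**2)
    rel = 100 * perr / np.abs(popt)               # errors as % of each parameter value
    print(f"{label}: R^2 = {r2:.5f}, relative errors = "
          + ", ".join(f"{r:.0f}%" for r in rel))

report("one-exp", one_exp, [0.4, 1.8, 1.2])
report("two-exp", two_exp, [0.4, 1.0, 1.0, 1.0, 3.0])

Dropping the redundant term (or fixing one of the correlated parameters) is what brings the errors back down, which is the remedy the linked page recommends.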
               
             