Unexpected surprises in archiving


Peter Humaj

June 29 2021, 8 min read

As the person who has been responsible for maintaining and developing the D2000 Archive process for several years, and for fixing whatever errors turn up, I theoretically should not be surprised by how the archive behaves. Reality, however, differs from theory, and today I would like to share one such surprise.

First, an introduction for those who work less with the D2000 Archive or have not attended any of the training offered by Ipesoft.

The D2000 system distinguishes four types of archived objects, or four purposes of archiving:

· Archiving of object values: these archive objects, also called primary archives, are used for archiving the values of other D2000 objects (measured points, evaluated points, user variables, etc.). Primary archives can be periodic or on-change.

· Calculation of archived values by a statistical function: these archive objects, also referred to as statistical archives, are used to calculate statistics over other archives (e.g. primary or statistical ones). A number of functions are available - some standard (average, weighted average, sum, maximum, minimum), others going beyond the standard set (increment, delta, number of local maxima/minima, sum/arithmetic average of positive/negative numbers, time slice and others). Statistical archives are always periodic.

· Calculation of archived values by a user-defined statement: these archive objects, also called calculated archives, allow the value of an archive object to be calculated from other archive objects (which can again be of any type). Calculated archives can be periodic or on-change.

· Archive filled from a script (Value storage): these archive objects are intended purely for storing data written from scripts. Again, they can be on-change or periodic. To avoid confusion - it is possible to write into the other types of archives from a script as well (e.g. for manual corrections or when reading values after a communication failure).

Figure 1: The four types of archived objects. The configuration of the primary archive is displayed - it archives the value of the calculated point P.H_M_S when the value changes. The size of the change to be archived is defined on the filter tab.

Today I would like to focus especially on calculated archives. Unlike other SCADA or MES systems, the D2000 offers the option of defining an archive that combines other archive objects in a virtually arbitrary way and performs various mathematical operations on them.

A textbook example would be a power calculation as the product of the archived current and voltage:

Figure 2: A simple power calculation.
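Conceptually, the expression in Figure 2 boils down to multiplying the two archived inputs at each evaluation time. A minimal Python sketch of that idea (this is not D2000 expression syntax; the sample values are invented):

```python
# Power as the product of an archived voltage and current: P = U * I.
def power(voltage: float, current: float) -> float:
    return voltage * current

print(power(230.0, 5.0))  # 1150.0 W
```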

However, the calculation can be more complicated, e.g. using conditions, loops, and other conveniences allowed by the extended syntax of expressions.

Figure 3: A power calculation that handles invalid values: an invalid voltage is replaced by 220 (V), an invalid current by 3 (A), and if neither the voltage nor the current is valid, the result is 0.
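A hedged Python sketch of the substitution logic described in the caption of Figure 3 (None stands in for an invalid D2000 value; the defaults 220 V and 3 A come from the caption):

```python
def power_with_fallback(voltage, current):
    # Neither input is valid -> the result is 0.
    if voltage is None and current is None:
        return 0.0
    u = 220.0 if voltage is None else voltage  # invalid voltage replaced by 220 V
    i = 3.0 if current is None else current    # invalid current replaced by 3 A
    return u * i

print(power_with_fallback(None, 5.0))   # 1100.0 - voltage fell back to 220 V
print(power_with_fallback(None, None))  # 0.0
```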

One may ask why they should use calculated archives when there is already a D2000 Calc process that handles calculated points. The basic advantage of the arithmetic implemented in the archive is that the D2000 Archive maintains a consistent state of calculated and statistical archives - e.g. after the arrival of delayed values from communication or after the connection to the D2000 KOM process is re-established using the KOM archive functionality.

Another strength of the D2000 Archive is that calculated and statistical archives can themselves be the source archives for other calculated and statistical archives. Thus, for example, minute averages can be calculated from on-change values, and from these the 15-minute, hourly and daily averages. Or the power outputs of generators can be summed within units, power plants or the entire electricity system. Or the quality of delivered ancillary services can be evaluated by comparing the required power with the real power.
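To illustrate such cascading (plain Python with simple arithmetic means rather than the time-weighted statistics a real archive would use; all numbers are invented):

```python
from statistics import mean

# On-change samples grouped by minute; each inner list feeds one minute average.
on_change_per_minute = [
    [3.1, 3.4],        # samples received during minute 1
    [3.0],             # minute 2
    [2.9, 3.2, 3.3],   # minute 3
]

minute_averages = [mean(samples) for samples in on_change_per_minute]
# A higher-level archive can then be built on top of the minute level,
# e.g. a 15-minute (here 3-minute, for brevity) average of the averages.
higher_level_average = mean(minute_averages)

print(minute_averages, higher_level_average)
```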

So what was it that surprised me last time?

Somehow, I intuitively assumed that the continuous calculation of a calculated archive should give the same result as a later recalculation - for example, via the RECALC tell command. Now I want to show you an example where this is not the case.

The D2000 offers the ARC_GetValue function for use in calculated archives, which allows the value of an archive object to be determined at any time. When specifying the time, the "constant" @EvalTime can be used, indicating the time for which the value of the archive object is being calculated.

The following figure shows a calculated archive that computes the difference between the value of the source archive H.Counter_Sum_1m at the computation time (@EvalTime) and its value one minute earlier (@EvalTime-60). If the difference is at least 0.5, the result is 1; otherwise, the result is 0.

Figure 4: A calculation of a pump run using the extended syntax (INIT, FINALLY) and ARC_GetValue function

This calculated archive was used in a specific application to find out whether the pump had been running in the last minute (and thus the flow meter had counted some change). The source archive H.Counter_Sum_1m contains the total number of counted pulses from the pump. It is a periodic minute archive - and I would like to pause at its time parameters.
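The idea behind Figure 4 can be sketched in plain Python as follows. The arc_get_value helper is only a stand-in for ARC_GetValue: it returns the last value stored at or before the requested time, which matches the behaviour described in the walkthrough below, but it is not the documented API; the counter data are invented.

```python
# Hedged sketch of the pump-run detection from Figure 4. The dictionary plays
# the role of the H.Counter_Sum_1m archive; arc_get_value mimics ARC_GetValue
# by returning the last value stored at or before the requested time.
def arc_get_value(archive: dict, t: int):
    stored = [ts for ts in archive if ts <= t]
    return archive[max(stored)] if stored else None

def pump_ran_last_minute(archive: dict, eval_time: int) -> int:
    now = arc_get_value(archive, eval_time)                 # value at @EvalTime
    a_minute_ago = arc_get_value(archive, eval_time - 60)   # value at @EvalTime - 60
    return 1 if (now - a_minute_ago) >= 0.5 else 0

# Counter totals keyed by second-based timestamps (made-up numbers):
counter = {0: 25000, 60: 25010, 120: 25010}
print(pump_ran_last_minute(counter, 120))  # 0 - no pulses in the last minute
print(pump_ran_last_minute(counter, 60))   # 1 - 10 pulses counted
```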

A quick quiz for the reader: when calculating any minute statistic, what timestamp should it get - the start of the minute or its end?

To satisfy users of both opinions, the configuration of time parameters makes it possible to store either the start or the end time of the period.

Figure 5: The stored timestamp in the configuration of statistical archives as well as the calculated periodic archives.
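To make the two options concrete (a tiny Python illustration with invented tuples, not D2000 data structures): the same minute value computed over the interval 07:52:00 - 07:53:00 ends up with one of two timestamps.

```python
# The statistic computed over the interval 07:52:00 - 07:53:00 is one value;
# the configuration only decides which timestamp it is stored with.
period_start, period_end = "07:52:00", "07:53:00"
minute_value = 25371

stored_with_start_time = (period_start, minute_value)  # ("07:52:00", 25371)
stored_with_end_time   = (period_end, minute_value)    # ("07:53:00", 25371)
print(stored_with_start_time, stored_with_end_time)
```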

What happens to our calculated archive if its source archive H.Counter_Sum_1m stores the beginning of the period? The following figure shows the situation at 07:53.

· The H.Counter_Sum_1m archive was recalculated at 07:53; the value 25371 was stored with the timestamp 07:52.

· The H.Counter_Changed archive was also recalculated at 07:53. The value of H.Counter_Sum_1m at the current time (the last stored value 25371 with the timestamp 07:52) and the value one minute earlier, i.e. at 07:52 (which is the same value), were subtracted. Their difference is 0, which is less than 0.5, so the result of the calculation is 0.

Figure 6: The data of source statistical archive (left) and calculated archive (right) at 07:53.

But what happens later (after 07:54) if we repeat the H.Counter_Changed calculation? At 07:55 we started a recalculation for the interval 07:50 to 07:53. For the time 07:53, the calculation subtracts the values of the source archive at the calculation time (07:53) and one minute earlier (07:52), i.e. 25372 - 25371 = 1. This difference is greater than 0.5, so the calculation returns 1.

The difference between the initial calculation and the later one is that by the time of the later calculation the value of the source archive at 07:53 is already available - and it differs from the value at 07:52 (since the pump is running).
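The two evaluations can be reproduced numerically in Python (timestamps as HH:MM strings so that lexical comparison works; last_value_at is again only a stand-in for ARC_GetValue):

```python
def last_value_at(archive: dict, t: str):
    """Return the last value stored at or before time t (stand-in for ARC_GetValue)."""
    stored = [ts for ts in archive if ts <= t]
    return archive[max(stored)] if stored else None

# First evaluation at 07:53: the sample for 07:53 does not exist yet.
archive_at_0753 = {"07:52": 25371}
diff = last_value_at(archive_at_0753, "07:53") - last_value_at(archive_at_0753, "07:52")
print(diff)  # 0  -> the calculated archive stores 0

# Recalculation at 07:55: the sample for 07:53 (25372) has meanwhile arrived.
archive_at_0755 = {"07:52": 25371, "07:53": 25372}
diff = last_value_at(archive_at_0755, "07:53") - last_value_at(archive_at_0755, "07:52")
print(diff)  # 1  -> the recalculation stores 1
```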

Figure 7: An interval recalculation can also be initiated from the Cnf tool using the context menu Extended actions -> RECALC (it is possible to select several objects and have them recalculated at once).

The result after recalculation can be seen in the following figure. The values in the recalculated interval 07:50 to 07:53 have changed to 1.

Figure 8: The data of the source archive (left) and the calculated archive (right) after the interval recalculation at 07:55.

How can this problem be solved so that the initial calculation already gives us the desired result? (I deliberately did not write "the correct result", because the result is correct given the available data.)

There are two simple ways:

· modifying the source archive H.Counter_Sum_1m to store the result with the end time of the interval. Then, during the first calculation of the H.Counter_Changed archive, values with the start and end time of the interval are already available.

· modifying the calculated H.Counter_Changed archive to subtract the values one minute before and two minutes before the evaluation time, which are also available during the first calculation.

Both ways work; however, the first one requires a modification of the source archive - older data would remain stored with the start time of the interval while newer data would use the end time. Therefore, when using the first solution, it would be appropriate to have the H.Counter_Sum_1m archive object recalculated as well.
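Using the same kind of step-back lookup as in the earlier sketches, the second fix could look roughly like this (Python, illustrative values only; the 07:51 value is invented):

```python
def last_value_at(archive: dict, t: str):
    """Return the last value stored at or before time t (stand-in for ARC_GetValue)."""
    stored = [ts for ts in archive if ts <= t]
    return archive[max(stored)] if stored else None

def pump_ran(archive: dict, one_minute_before: str, two_minutes_before: str) -> int:
    # Compare the values at @EvalTime - 60 and @EvalTime - 120: both samples
    # already exist even when the source archive stamps periods with their start time.
    newer = last_value_at(archive, one_minute_before)
    older = last_value_at(archive, two_minutes_before)
    return 1 if (newer - older) >= 0.5 else 0

# State at 07:53 (first evaluation):
archive_at_0753 = {"07:51": 25370, "07:52": 25371}
print(pump_ran(archive_at_0753, "07:52", "07:51"))  # 1 already at the first calculation
```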

At the same time, a deeper analysis of this behaviour reveals one peculiarity that you need to be aware of when using the ARC_GetValue function. Above, I wrote that the D2000 Archive maintains a consistent state of calculated and statistical archives. This rule has one exception, and that is the ARC_GetValue function. If a source object changes, the D2000 Archive recalculates the dependent archive objects - but only for the time at which the source object changed. The ARC_GetValue function, however, allows a value to be read at an arbitrary time (which, moreover, may not even be constant). This breaks the consistency rule.

This means that:

· statistical archives and calculated archives that do not use ARC_GetValue are consistent

· calculated archives that use ARC_GetValue are consistent if there are no changes to the source data (delayed values from communication, manual corrections, etc.)

· if corrections do occur, the application must address consistency "manually" - e.g. if we know that corrections happen only within the current hour (or day), then after this period ends it is possible to invoke a recalculation of the relevant period from a script (a conceptual sketch follows below)
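As a conceptual sketch of that last bullet (plain Python; the recalculate callback is hypothetical - in a real application this would correspond to issuing a RECALC request for the given interval to the D2000 Archive, in whatever way the application does that):

```python
from datetime import datetime, timedelta

def recalc_closed_hour(recalculate, now: datetime) -> None:
    """Ask for a recalculation of the hour that has just closed.

    recalculate(archive_object, time_from, time_to) is a HYPOTHETICAL callback;
    it stands for sending a RECALC request to the D2000 Archive.
    """
    hour_end = now.replace(minute=0, second=0, microsecond=0)
    hour_start = hour_end - timedelta(hours=1)
    recalculate("H.Counter_Changed", hour_start, hour_end)

# Example: shortly after 08:00 the corrections for 07:00 - 08:00 are considered final.
recalc_closed_hour(lambda obj, t0, t1: print("RECALC", obj, t0, "->", t1),
                   datetime(2021, 6, 29, 8, 5))
```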

The D2000 Archive is a powerful tool. In addition to primary and statistical archives, it offers the possibility of creating calculated archives - something I have not yet found in any competing product (if you know of any, please let us know :) ). Our customers use its capabilities in applications addressing the energy balances of entire companies or evaluating the quality of ancillary services in the electricity industry. Unlike today's simple example, these are entire "trees" of archives, which are usually structured (you can read about structured archives in the older blog about the D2000 Archive).

In this article, I wanted to point out a peculiarity of the ARC_GetValue function that you may encounter in practice - and that can surprise you. Such surprises, however, should be taken as a tax on the wide range of possibilities that the D2000 Archive offers.

Ing. Peter Humaj, www.ipesoft.com
