Saturday, September 29, 2007

Cancelling Events

http://blogs.streamsql.org/streamsql/2006/08/handling_revisi.html

Mitch Cherniak poses the same question about "compensating events" that I posed a few weeks ago.

Mitch seems to have been involved in the same project that produced StreamBase. I am not sure if Cherniak is involved with StreamBase in any way, but it would be interesting to know whether the StreamBase folks have thought about this same topic.

©2007 Marc Adler - All Rights Reserved

3 comments:

Richard Tibbetts said...

Mitch is involved with StreamBase as one of our academic advisors (see StreamBase Technical Advisory Board). He is also an active participant in the research community and a leader of the Borealis project. The automated processing of revision records (what you call compensating events) is one of the goals of the Borealis project.

At StreamBase, we find that fully-automated processing of revisions can be difficult for developers to work with. Applications with side-effects (e.g., trading apps) must generally handle revisions in an application-specific manner. For realtime apps, then, the focus is on enabling the application developer to handle revisions and other new information that might arrive using StreamSQL logic.

For historical data archiving, in the StreamBase Chronicle product, handling of corrections is more straightforward and is generally part of the solution. For data archives, corrections can generally be applied in a batch a few times per day, each night, or on the weekend.

We are obviously keeping a close eye on our academic partners, and will include more automated processing of revisions when the theory has developed enough that it can work for our customers.

Richard
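
A minimal sketch of the application-specific handling Richard describes, assuming a hypothetical trading application that has already acted on an event when a revision arrives. The class names and downstream calls are illustrative assumptions, not StreamBase or StreamSQL APIs:

```python
# Hypothetical sketch: TradeEvent, RevisionEvent, and the downstream calls are
# illustrative assumptions, not part of any StreamBase or StreamSQL API.
from dataclasses import dataclass


@dataclass
class TradeEvent:
    order_id: str
    quantity: int
    price: float


@dataclass
class RevisionEvent:
    order_id: str
    quantity: int   # corrected quantity
    price: float    # corrected price


class TradingApp:
    def __init__(self):
        self.executed = {}  # order_id -> TradeEvent already acted upon

    def on_trade(self, event: TradeEvent):
        self.executed[event.order_id] = event
        self.send_order_downstream(event)  # side effect: cannot be silently undone

    def on_revision(self, rev: RevisionEvent):
        corrected = TradeEvent(rev.order_id, rev.quantity, rev.price)
        original = self.executed.get(rev.order_id)
        if original is None:
            # No side effect has happened yet, so the corrected event can
            # simply be processed as if it were new.
            self.on_trade(corrected)
            return
        # A downstream side effect already happened; the application itself
        # must decide how to compensate -- here, an explicit cancel/correct.
        self.send_cancel_correct(original, corrected)
        self.executed[rev.order_id] = corrected

    def send_order_downstream(self, event):
        ...  # e.g., route an order to an execution venue

    def send_cancel_correct(self, original, corrected):
        ...  # e.g., send a cancel/correct message for the earlier order
```

The point of the sketch is that the revision path is application logic, not something the engine can decide automatically once side effects are in play.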

Anonymous said...

Only in financial services can people belittle time rifts by creating fancy terminology like 'Revision Records'.

The effect that changing history will have on your models will be the same as going back in time and altering the past. You have to deal with the grandfather paradox, and so on.

In the end, 'Revision Records' are nothing more than new events in the stream that compensate for bad data transmitted earlier on... at least until someone works out 1) what bad data is and 2) how to undo all the external events it generated.

c.
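
A minimal sketch of the compensating-event idea described above, assuming the revision carries both the bad value and the corrected one so that downstream state can be patched in place rather than by replaying history. All names are illustrative:

```python
# Hypothetical sketch: a compensating event carries the old and new values so a
# running aggregate can be corrected without reprocessing the whole stream.
class RunningVolume:
    """Keeps a running total of traded volume per symbol."""

    def __init__(self):
        self.totals = {}

    def on_trade(self, symbol, quantity):
        self.totals[symbol] = self.totals.get(symbol, 0) + quantity

    def on_compensation(self, symbol, old_quantity, new_quantity):
        # Undo the effect of the bad tick and apply the corrected one, as if
        # history had always contained the new value.
        self.totals[symbol] += new_quantity - old_quantity


agg = RunningVolume()
agg.on_trade("IBM", 500)             # bad tick: should have been 50
agg.on_trade("IBM", 200)
agg.on_compensation("IBM", 500, 50)  # compensating event arrives later
assert agg.totals["IBM"] == 250
```

This only corrects internal state; as the commenter notes, undoing the external side effects the bad data already triggered is the harder problem.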

Anonymous said...

The article linked to is from over a year ago... it would be interesting to know if the author still has the same issue, or has just moved on with option 1.