Thanks to Steve Barber and some of the other members of the Streambase tech support team. I could not get an out-of-process, .NET-based adapter to communicate with the Streambase engine on my laptop, although it worked fine on my office machine.
The Streambase people worked during the slow holiday season to diagnose the problem. It was caused by a DLL called BMNET.DLL, installed by my Cingular/ATT Communications Manager. I have a Cingular broadband card that I use to connect to the internet when I am on the road and don't have a free wireless connection to tap into. BMNET.DLL provides data acceleration for Internet connections.
Microsoft references this problem here: http://support.microsoft.com/kb/910435
©2007 Marc Adler - All Rights Reserved
Saturday, December 29, 2007
Aleri Evaluation
Just a small word about the Aleri evaluation. Several of you have repeatedly pinged me to find out what I thought of Aleri, so I am going to write down some of my impressions. The usual disclaimers apply, such as this is my opinion and does not necessarily represent the opinions of my employer.
My impressions were formed after a superficial evaluation of Aleri against some of the other vendors. I have not gotten to the point yet where I am testing the latency and throughput of the CEP engines. I have not soaked and stressed the CEP engines to see if any of them are leaking memory. I have not tried them on a variety of processors, and I have not examined their performance under multicore processors.
In a nutshell, my biggest area of concern with Aleri was the "spit-and-polish" of the product. They very well might have the fastest CEP engine out there. However, I was stymied by the quality of the documentation and by my perceptions of their Aleri Studio. It also seemed that they were more of a "system integrator" than some of the other CEP firms, taking separate products like OpenAdaptor and JPivot and trying to fit them into a holistic offering.
An example of this was reflected in the difficult time I had in getting Aleri integrated with SQL Server 2005 through Open Adaptor. The documentation was non-obvious, and it took many hours with their sales engineer to finally get it connected. I compare this to Streambase and Coral8, where it took all of 5 minutes to hook up an SQL Server 2005 database to their systems (disclaimer: there is a problem getting Streambase, Vista and SQL Server to work together, although Streambase has not yet released an official version for Vista).
That being said, the salient points are:
1) Aleri's management team (Don DeLoach and Jeff Wootton) fully realize their shortcomings in the department of aesthetics, and have promised me that they are actively addressing them. I would highly recommend that Aleri look at Streambase, whose total package is good with regard to documentation and tutorials. (However, I still find a lot of pockets of geekiness in the Streambase documentation.)
2) The Aleri sales engineering team, led by Dave, got a variant of my initial use case to work. However, there are features that Aleri does not yet have, such as jumping windows and pattern matching, that make Coral8 and Streambase stand out.
3) Going through Open Adaptor is not fun. Streambase and Coral8 make it simple to write adapters in C#. The Aleri sales engineer told me that he usually has to help clients get adapters to work. That is really not the message that a company wants to hear if it has many legacy systems to interface with.
4) Aleri has real-time OLAP, using JPivot. To my knowledge, they are the only CEP company to offer real-time OLAP. I did not really get to see this feature, but true real-time OLAP is something that a lot of financial companies are interested in. We want to be able to slice and dice our order flow in real time over different dimensions.
5) The Aleri Studio uses Eclipse, just like Streambase, and the icons even look exactly like Streambase's icons. However, the user interaction seemed a bit shaky at times, and there were moments when I got myself into "trouble" with the Aleri Studio by clicking on one thing before clicking on another thing. Again, Streambase seems more solid. And, Coral8 does not try to impose a GUI builder on the developer. The guys at Aleri told me that they are addressing these issues.
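To make the jumping windows of point 2 concrete: a jumping (sometimes called tumbling) window partitions a stream into fixed-size, non-overlapping windows and aggregates each one. A minimal Python sketch of the concept, not any vendor's actual API:

```python
from collections import defaultdict

def jumping_window_avg(events, window_size):
    """Group (timestamp, value) events into fixed, non-overlapping
    windows of `window_size` time units and average each window."""
    buckets = defaultdict(list)
    for ts, value in events:
        buckets[ts // window_size].append(value)
    return {w: sum(vals) / len(vals) for w, vals in sorted(buckets.items())}

# Events at t=0..4; windows of 2 units are [0,2), [2,4), [4,6)
events = [(0, 10.0), (1, 20.0), (2, 30.0), (3, 50.0), (4, 60.0)]
print(jumping_window_avg(events, 2))  # {0: 15.0, 1: 40.0, 2: 60.0}
```

The CEP engines express the same idea declaratively in their SQL dialects; the sketch only shows the windowing semantics that, at the time, Aleri lacked.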
I was really pulling for Aleri, since the development center is about 15 minutes from my house in New Jersey. They are right down the block from the hallowed halls of Bell Labs in Mountainside, and some of the developers live in the next town over from me. You couldn't meet a company of nicer guys, and the CEO is a very low-key guy compared to other CEOs that I have met. I was impressed by the fact that, at the Gartner conference on CEP, he stood up in front of the audience and exhorted us to try different CEP products.
I eagerly look forward to the 3.0 version of Aleri's offerings, and to seeing tighter, easier integration between their various components, enhanced documentation, enhanced support for .NET, and a cleaner version of the Aleri Studio. Given the quality of the developers there, I am sure that this version will kick some butt.
Where are the CEP Customers?
It seems that, lately, every blog posting I make on CEP generates further blog postings from the vendors and the subject-matter experts in the CEP space. It's great to see the CEP blogs become more active, so that I can tap into the collective wisdom of people like Tim, Marco, Mark, etc.
However, where are the postings from other customers? Isn't there someone from Goldman, Lehman, Merrill, etc., who wonders about the same things that I do? Or do these people mainly purchase the pre-packaged algo trading packages that the vendors have to offer?
One thing that was very interesting was the copy of the Aite Group report on Complex Event Processing that somebody had forwarded to me. It seems that most of the CEP companies number their customers in the dozens, rather than the hundreds or thousands. Either we are at the very beginning of the explosive growth that Aite predicts, or not many companies are finding a use for CEP, relying instead on their legacy apps to deal with streaming events. Certainly, in my own company, we have hand-written code for the various algo and HFT trading systems, code that I am sure the developers have profiled the hell out of in order to get the max performance. We would be hard pressed to replace this code with a generic CEP system.
If you are a customer or potential customer of CEP, I offer the opportunity to send me private comments at magmasystems at yahoo.
Wednesday, December 26, 2007
Visualizations Update
Stephen Few is rapidly positioning himself as the guru of business visualizations. His name has been brought to my attention several times over the past few weeks as someone to pay attention to ... "a new Edward Tufte", if you will.
Few has an online library with a lot of free articles to read. Right now, I'm reading Multivariate Analysis using Heatmaps. This is especially worthwhile reading following last week's visit by Richard and Markus of Panopticon, who showed more reasons why we should graduate from the free Microsoft heatmap control to the more feature-laden, doubleplusunfree, Panopticon product. As Panopticon adds more features in the value chain, it will be increasingly difficult to justify using a free product.
------------------------------------
Which brings me to another point that I have been thinking of ... a point that I raised on my previous blog posting. In the field of Enterprise Software, where do the responsibilities of a vendor begin and where do they end?
Take Panopticon, for instance. You can bind a streaming "dataset" to Panopticon, and Panopticon will render a realtime updating Heatmap to visualize that dataset. Of course, you ask how you get data into Panopticon, and you come back with the concept of input adapters.
Then, gradually, you wonder if their input adapters cover KDB, Wombat, Reuters, Vhayu, OpenTick, generic JMS, sockets, etc.
Then you wonder if Panopticon has input adapters that take the output of CEP engines, like Coral8 and Streambase. Or, you have a crazy thought like Panopticon embedding a copy of Esper/NEsper inside of itself.
Then, you get really greedy and wonder if Panopticon provides built-in FIX adapters that will devour a FIX 4.4 stream of orders and executions and show you what exchanges are slow today.
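To give a flavor of what such an adapter chews on: a FIX message is just a string of tag=value fields separated by SOH (0x01) characters. A toy Python parser, purely illustrative (a real FIX adapter needs checksum validation, repeating groups, and session management):

```python
def parse_fix(msg, sep="\x01"):
    """Split a raw FIX message into a tag -> value dict.
    Illustrative only: ignores checksums, repeating groups, sessions."""
    return dict(field.split("=", 1) for field in msg.split(sep) if field)

# Tag 8 is BeginString, 35=D is New Order Single, 55 is Symbol, 38 is OrderQty
raw = "8=FIX.4.4\x0135=D\x0155=IBM\x0138=100\x01"
print(parse_fix(raw))  # {'8': 'FIX.4.4', '35': 'D', '55': 'IBM', '38': '100'}
```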
Then you wonder what kinds of analytical tools Panopticon might interface with ... since Panopticon is doing parsing and analysis of the streaming data anyway, can't it just take an extra step and analyze the silly data?
But, then if you are demanding all of these things of Panopticon and Coral8, how do you hook them together? Does the dog wag the tail or does the tail wag the dog?
Or, do we just consider Panopticon a simple visualization tool, demanding nothing more of it than the ability to display brightly colored rectangles of streaming data, and likewise, do we ask nothing more of Coral8 than to do what it does best ... recognize patterns and perform filtering and aggregations?
As Dali-esque as these thoughts may appear, these are the kinds of things that I need to consider. In my quest for an ecosystem around the CEP engine, do we ask the CEP engine vendors to expand outwards, or do we take the outer layer of components (i.e., the visualization and analysis tools) and ask them to expand inwards to meet the CEP engine? Whatever it is, my wish would be for a true plug-and-play architecture between the CEP engine, its input components, and its output components.
CEP Vendors and the Ecosystem
While my wife and son cavort around Australia and New Zealand for the next few weeks (I get to stay home and watch my daughter, who only has one week off from high school), I hope to be able to catch up on some of the blog posts that I owe people.
One of the things that is most important for me in choosing a CEP vendor is the ecosystem that surrounds the CEP engine. In a company such as mine, we need to interface with many different legacy systems. These legacy systems can hold crucial data, such as historical orders, customer trades, market data, volatility curves, customer and security reference data, etc. This data may reside statically in a database, be published out as flow over some kind of middleware, or interfaced with an object cache or data fabric. We have every color and shape of database technology in our firm, whether it be more traditional relational databases like Oracle, SQL Server, and Sybase, or newer tick databases like KDB+.
From the input and output points of the CEP engine, we need seamless integration with all sorts of systems. Most CEP engines have the concept of in-process and out-of-process adapters. In-process adapters are more performant than out-of-process adapters. We would love to see as many in-process adapters as possible delivered out-of-the-box by our CEP vendor. We do not want to spend time writing our own in-process adapters.
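The distinction can be sketched with a pair of hypothetical adapter classes (illustrative Python, not any vendor's actual API): the in-process adapter hands each event to the engine by direct method call, while the out-of-process one pays for serialization and a socket hop on every event.

```python
import json
import socket  # used only to show the out-of-process cost

class InProcessAdapter:
    """Runs inside the engine process: events reach the engine via a
    direct callback, with no serialization and no network hop."""
    def __init__(self, engine_callback):
        self._push = engine_callback

    def on_event(self, event):
        self._push(event)  # plain function call, same address space

class OutOfProcessAdapter:
    """Runs in its own process: every event must be serialized and
    written to a socket, which is why it is slower."""
    def __init__(self, sock):
        self._sock = sock

    def on_event(self, event):
        payload = json.dumps(event).encode() + b"\n"
        self._sock.sendall(payload)  # IPC cost paid on every event
```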
So far, none of the CEP vendors support KDB+ as an out-of-the-box solution. In fact, many of the CEP vendors did not even know what KDB+ was. (Is the same true for Vhayu as well?) My feeling is that, if a CEP vendor is going to be successful on Wall Street, then they must support KDB+. Is it even feasible for the CEP vendors to provide an abstraction layer around KDB+, and let the CEP developer write all queries in SQL instead of writing them in K or Q?
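As a toy illustration of what such an abstraction layer would do, here is a hypothetical translator (not a real product) that maps one simple SQL aggregate shape onto q's select-by-from syntax. Real q is far richer (asof joins, attributes, temporal types), so this only hints at the idea:

```python
import re

def sql_to_q(sql):
    """Toy sketch of the kind of translation an abstraction layer
    might do: maps 'SELECT f(col) FROM t GROUP BY g' to q's
    'select f col by g from t'. Handles only this one pattern."""
    m = re.match(
        r"SELECT\s+(\w+)\((\w+)\)\s+FROM\s+(\w+)\s+GROUP BY\s+(\w+)",
        sql, re.IGNORECASE)
    if not m:
        raise ValueError("unsupported query shape")
    func, col, table, group = m.groups()
    return f"select {func.lower()} {col} by {group} from {table}"

print(sql_to_q("SELECT AVG(price) FROM trade GROUP BY sym"))
# select avg price by sym from trade
```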
One of the most important things that I would like to see from the CEP vendors is a set of tools to enable the analysis of all of the data that pass through the CEP engine. Many groups might not have the budget to hire a specialized mathematician or quant to perform time-series analysis on the data. Learning specialized languages like R or S-Plus might not be possible for smaller groups that do not have a mathematical bent. The same goes for packages like Mathematica and Matlab.
Would it be worth it for the CEP vendors to come out with a pre-packaged "stack" for various financial verticals that incorporates analysis tools? Or would writing a detailed cookbook be better? And where does the responsibility of the CEP vendor end? Should we expect the CEP vendor to provide a one-stop shop for all of our needs, or should we just expect the CEP vendors to provide strong integration points?
Better yet, does this open up an opportunity for a third party company to provide this service? Like the many laptop vendors who buy a motherboard (the CEP engine), and slap together a disk drive, CD drive, screen and keyboard to make a complete system?
In examining the various CEP vendors, I have come to the conclusion that the offerings from Streambase, Coral8 and Aleri are very similar. Given another year, I might expect each vendor to fill in the gaps with regard to their competitors' offerings, and at that point, we might have practically identical technologies from 3 different vendors. In my opinion, the real win for these CEP vendors will come in the analysis tools they provide.
Saturday, December 22, 2007
At Eaton Airport, October 2007
Wednesday, December 19, 2007
Blog Momentum
On December 18th, I looked at my ClustrMap and saw that I had 239 visitors on the previous day. While this is nothing compared to the big blogs out there, it is definitely an improvement from the 30-40 per day that I had last year. I am not sure if my involvement with CEP has anything to do with it, or if the blog is simply becoming better known across the Wall Street community (I believe it's the latter).
The level of comments has increased too, and so has the number of private emails I get, as a lot of you share your opinions about working on Wall Street, evals of CEP vendors, etc.
Some of you wonder if I can get into trouble with my company for the blog. Let me tell you that there are a HUGE number of people from my company who read my blog, not only from the tech side, but from the business side as well. The former CTO of our company was a big blog reader of mine. The CIO of our company has his own internal blog, and I know that people have recommended my blog to him.
Take a look at what other big financial companies are doing with blogs and wikis. Enlightened management at financial institutions will embrace social networking and a more open community, not run scared at the thoughts of what a competitor might pick up from a blog. Wall Street and The City are very close-knit communities, and more information passes between the employees at Lehman and Goldman during Happy Hour than whatever can be revealed from a blog.
Tuesday, December 18, 2007
Bonus Season
Word is trickling across the Street about Goldman's bonuses. Everyone knows the headline number: an average bonus of over $600,000 per employee. But the whispers that I have heard from a number of people are that the IT employees below Managing Director level did not fare very well.
In conversations that I have had with a number of current and ex-colleagues, we are speculating whether most Wall Street companies will turn off the bonus spout and admonish employees to "Just try to go anywhere else! Nobody is hiring now!"
The likely scenario is a 10% across-the-board reduction of the lowest performers. I am going long in the steak-knife manufacturers.
What you don't want to see is the de-motivation of the normal, punch-the-clock worker in the IT field. These are the people who score in the middle of the road on their yearly evaluations, the people who have been at a company long enough to know one system inside and out, the people who don't blog nor read blogs, the people who don't work on weekends and who don't think about the next great product that they want to write for the traders. These are the people that usually hold the keys to the systems and the data, the people who you need to convince to open the gates to your applications. Giving these people the doughnut bonus will slow down processes, resulting in further roadblocks for the people who do want to get things done.
Nevertheless, I still have open headcount for the CEP project and for the .NET client framework team.
Farewell Kaskad
A player in the CEP space, Kaskad, is no more. Colin Clark writes in to say that, since the Boston Stock Exchange (BSX) has ceased to function, he had to disband Kaskad. I assume that, since Kaskad did the work for the BSX as consultants, the BSX retained all or most of the IP rights. Or perhaps Kaskad felt that, since its only client was no longer funding development, it could not gain further VC money in this financial environment.
According to the Aite report on CEP vendors, Kaskad had 16 employees. Since they are based up in Boston, I wonder if Streambase is looking at adding some additional talent.
Colin is looking for new opportunities in the CEP space, so contact him if you have anything that might interest him. Colin was involved in the old NEON (New Era for Networks) back in the dotcom boom, so he has the entrepreneurial streak running through him.
Friday, December 14, 2007
Streambase (yet again ...)
After vowing to bypass Streambase in my CEP engine evaluation, I may be forced to eat crow. I agreed to let Streambase into the evaluation process because I need to have two CEP engines in my project ... one as primary and one as "cold backup". And, for various reasons, Aleri and Esper did not pan out for me.
The new CEO of Streambase, Chris Ridley, came down to New York to meet with me, with his chief architect, Richard Tibbetts, in tow. They acknowledged some of the errors of their over-aggressive marketing, and told me about their sharpened focus on the financial industry marketplace.
They also let me have an eval version of Streambase that is not constrained by any license key, and in the interests of expediency, they graciously allowed me to bypass their eval agreement (which would have taken weeks to make it through my company's legal processes at this time of the year).
I installed Streambase on my laptop. My first impressions are ..... "slick". In other words, all the superficial, glossy stuff that gives the initial impression to a prospective customer is all there. Nice documentation with plenty of graphics, a great interactive tutorial, etc. I was "warned" that Streambase puts a lot of time into their studio and help system, and I can definitely concur. Nice job, guys.
I am going through the tutorials now. Several things jump out at me right away:
1) They use Eclipse as the foundation of their Streambase Studio. I am quickly becoming a fan of Eclipse, especially the way that you can automatically update Eclipse plugins.
2) The development methodology is more "file-based" than that of the other products. A familiar paradigm for Java/C# developers.
3) There are two ways to develop apps. The Event Flow approach is GUI-based; you can also program in StreamSQL. Unfortunately, there is no tie-in between the Event Flow and StreamSQL files. In other words, unlike Coral8, if you make a change in the Event Flow, it does not get reflected in the StreamSQL file. In your project, you can have multiple Event Flow files and multiple StreamSQL files. I would love to be able to develop in either system and have changes automatically translated to the other.
4) There are certain things that you need to do in the Event Flow system that you cannot do in StreamSQL. There are comments in their demo programs to this effect. I would welcome a document that outlined these differences.
5) I noticed that the icons used in the tool palette are identical to the ones that Aleri uses. Interesting. Someone looked at the other company's product.
6) Richard Tibbetts and Mark Tzimelson are very respectful of each other's work. Nice to see that kind of respect at the technical level.
©2007 Marc Adler - All Rights Reserved
Labels:
CEP,
Complex Event Processing,
Coral8,
Streambase,
Apama,
Bile
Tuesday, December 11, 2007
Acropolis Shrugged
http://blogs.msdn.com/gblock/archive/2007/12/07/if-acropolis-is-no-more-what-s-our-commitment.aspx
CAB? Acropolis? CAB? Acropolis? CAB? Acropolis? CAB? Acropolis? CAB? Acropolis?
It looks like Acropolis may be going by the wayside, and the Patterns & Practices group has decided that, for today, they will refocus on CAB.
This is precisely why we build our own .NET frameworks at my investment bank: we need a high-quality framework that has some domain knowledge of what capital markets need. Goldman, Wachovia, and Morgan Stanley have done the same.
And, this time last year, Microsoft came in and tried to get us to adopt CAB, and the week after that, they told us that Acropolis was the new flavor of the day. Thank the lord that we were focused on a mission to build what we wanted and needed, without all of the background noise from Microsoft. (Sorry Joe...)
©2007 Marc Adler - All Rights Reserved
Get Rich Quick with KDB+
I am convinced that the world needs more KDB+ consultants. The supply of these creatures is so small that, if you end up needing one in a hurry, you probably have to go through First Derivatives.
KDB+ is used by most of the Wall Street companies --- I can name Citigroup, Barclays, Bank of America, and Lehman as big KDB+ users --- to store tick and order data. KDB+'s biggest competitor is probably Vhayu.
The main obstacle to learning KDB+ is its programming languages, K and Q, which can make APL look verbose!
If you are affected by the upcoming layoffs on Wall Street, and if you are looking for a new, exciting career change, and you don't relish the idea of selling steak knives door-to-door, then there is room in this world to be a KDB+ consultant.
©2007 Marc Adler - All Rights Reserved
Sunday, December 09, 2007
OpenAdaptor, SQL Server 2005, and Aleri
This posting was made in the interests of any Aleri or OpenAdaptor users who are trying to connect to a named SS2005 instance.
I have multiple "instances" installed of SQL Server 2005. According to the documentation at http://msdn2.microsoft.com/en-us/library/ms378428.aspx, this JDBC connection string should have worked:
jdbc:sqlserver://MAGMALAPTOP\RPT;databaseName=MyDatabase;integratedSecurity=true;
In Microsoft's JDBC Driver 1.2 for SQL Server 2005, there is a sample Java app called connectURL. With the connection string above, this sample app worked fine, and was able to connect to the RPT instance of my SS2005 database.
However, I could not get OpenAdaptor to work with this connect string. In case you are wondering why I was messing around with OpenAdaptor, it is because this is what Aleri uses for its adapters to external data sources.
After spending several hours this weekend trying to get Aleri to connect to SQL Server using the connection string above, I finally stumbled upon an alternative syntax for the connection string.
The new connection string is:
jdbc:sqlserver://MAGMALAPTOP;instanceName=RPT;databaseName=MyDatabase;integratedSecurity=true;
Notice that the instanceName is specified with a separate parameter.
So, there may be an issue with OpenAdaptor. Or, another theory of mine is that the backslash character in the connection string is being treated as an escape character.
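If OpenAdaptor reads that string out of a Java properties file (an assumption on my part — OpenAdaptor is Java-based and configured via text files, but I have not traced its loading code), the escape-character theory is easy to demonstrate in isolation: `java.util.Properties` silently drops a backslash that precedes an unrecognized escape character.

```java
import java.io.IOException;
import java.io.StringReader;
import java.util.Properties;

public class EscapeDemo {
    // Load the "url" key from properties-file text, the way a Java
    // adapter framework might read its configuration.
    static String loadUrl(String fileContents) {
        Properties props = new Properties();
        try {
            props.load(new StringReader(fileContents));
        } catch (IOException e) {
            throw new RuntimeException(e); // cannot happen for a StringReader
        }
        return props.getProperty("url");
    }

    public static void main(String[] args) {
        // "\\" in Java source is a single backslash in the file contents.
        String url = loadUrl("url=jdbc:sqlserver://MAGMALAPTOP\\RPT;databaseName=MyDatabase\n");
        // Properties.load treats '\' as an escape and silently drops it,
        // mangling the instance name:
        System.out.println(url); // jdbc:sqlserver://MAGMALAPTOPRPT;databaseName=MyDatabase
    }
}
```

If this is what is happening, the `instanceName=RPT` form works simply because it contains no backslash for anything in the toolchain to eat.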
©2007 Marc Adler - All Rights Reserved
Saturday, December 08, 2007
Getting intermediate results in Streams
Let's say that we want to keep a running total of the number of shares that we have traded, and at 4:00 PM every day, we want to dump out the total. In Coral8, we can do something like this:
CREATE LOCAL STREAM Totals (TotalShares INTEGER);
INSERT INTO Totals
SELECT SUM(shares)
FROM TradeInputStream KEEP EVERY 1 DAY OFFSET BY 16 HOURS
OUTPUT EVERY 1 DAY OFFSET BY 16 HOURS;
This looks pretty straightforward. The Totals stream retains the totals until 4:00PM. At 4:00 every day, it outputs the total shares to any other stream that is "subscribed" to Totals, and then resets itself to start accumulating new totals.
This is something that CEP engines are good at, whether it be Coral8, Aleri, or Esper.
Now, let's enhance this a little bit.
Let's say we give the traders a .NET GUI application, and on this GUI is a "Status" button. The traders can press this button any time they want to know how many shares have been traded so far that day. So, at 2:00, a trader pushes a button on the GUI and we need to return to him the number of orders seen so far that day, the number of shares seen, the notional value of all orders, etc.
So, there are two questions:
1) How can we "dump out" these accumulators on demand? In other words, is there a way to tell these CEP engines to give me the contents of an aggregation stream AS OF THIS MOMENT?
2) How can we "call into" our CEP engine to retrieve these values? Do the CEP engines support an API that I can use from within the GUI to say "Give me the current value of a certain variable in my module"? Something like
IntegerFieldValue field = Coral8Service.GetObject("ccl://localhost:6789/Default/SectorFlowAnalyzer", "sum(Shares)") as IntegerFieldValue;
int shares = field.Value;
In a standard C# application, this would be as simple as putting a getter on a variable and just calling the getter. If I were using Web Services, then I could call into a Web Service and just ask for the values of some variables or for some sort of object. But, from a C# app, how can I get the current value of a stream that is aggregating totals?
Another way of accumulating the total number of shares in a CEP engine is to step into the procedural world, and just define a variable. In Coral8, it would be something like this:
CREATE VARIABLE TotalShares INTEGER = 0;
ON TradeInputStream
SET TotalShares = TotalShares + TradeInputStream.shares;
Then, we would need a "pulse" to fire at 4:00PM every day, and upon this pulse firing, we could send the TotalShares to another stream.
I am sure that there are patterns in every CEP engine for accessing intermediate results, but something that is a no-brainer in a procedural language may not be so easy in a CEP vendor variant of SQL.
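To make the contrast concrete, here is what the procedural "no-brainer" looks like, sketched in Java (the class and method names are my own invention, not any vendor's API): an accumulator that trade events update, that the GUI's Status button can read at any moment, and that a 4:00 PM pulse can flush and reset.

```java
import java.util.concurrent.atomic.AtomicLong;

// Hypothetical in-process accumulator: trade events update it,
// and the GUI's Status button reads it on demand.
public class ShareTotals {
    private final AtomicLong totalShares = new AtomicLong();
    private final AtomicLong orderCount  = new AtomicLong();

    // Called once per incoming trade event.
    public void onTrade(long shares) {
        totalShares.addAndGet(shares);
        orderCount.incrementAndGet();
    }

    // The Status button: intermediate totals AS OF THIS MOMENT.
    public long currentShares() { return totalShares.get(); }
    public long currentOrders() { return orderCount.get(); }

    // The 4:00 PM pulse: emit the day's total and reset for tomorrow.
    public long flushDailyTotal() {
        orderCount.set(0);
        return totalShares.getAndSet(0);
    }
}
```

The whole question is how to get the equivalent of `currentShares()` out of a stream that the CEP engine owns, rather than out of an object that my own process owns.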
©2007 Marc Adler - All Rights Reserved
Friday, December 07, 2007
Coral8 and Transparency
I just tried to get some info on the Apama Event Processing solution (not their Algo Trading platform, just the simple ESP platform). I filled out a form, and now I have to wait for a Progress sales rep to call to arrange a demo. Even if I want to see an Apama webcast, I need to fill out a form.
Let's contrast this with what Coral8 has to offer. Coral8 lets you download the entire developer platform, with all of the documentation included. Everything is included .... there are no important packages missing from the eval version. There is no 30-day license key that you have to get. There is no waiting for a salesperson to get in touch. As far as I know, everything is ready to go from the moment you download the package.
I fail to understand why certain vendors make it so difficult to evaluate a package. In a big financial institution like the one I work for, if you use software in production and this software is not properly licensed and paid for, it is grounds for termination of your job.
Coral8 has the right attitude. Just get it into the hands of as many people as possible and spread the word around.
©2007 Marc Adler - All Rights Reserved
NEsper Docs
Good, solid docs on the Java version. Unfortunately, the NEsper version only comes with an auto-generated CHM file and otherwise references the Java docs.
Looks like I will have to compile the examples and dig through their source code in order to see how to use NEsper. Thomas, it may make sense to outline the differences between Esper and NEsper in the master documentation ... maybe using highlighted text boxes.
It also may make sense to include the PDFs in the NEsper distribution.
©2007 Marc Adler - All Rights Reserved
Per Se
One of the perks of working for Mega-Bank is that you get invited to sales and marketing functions that were previously out of reach to me when I was a consultant. Such was the case this past Wednesday, when I was invited to a marketing function by Autonomy at Per Se.
If you have never heard of Per Se, then maybe you have heard of The French Laundry, a Napa Valley eatery run by the renowned chef Thomas Keller. Per Se is Keller's New York City version of The French Laundry, and is one of the most difficult reservations to get in New York.
Autonomy has quarterly executive briefings at Per Se, where they bring together Autonomy's sales management, various Autonomy business partners, current customers, and future prospects. In addition to a fantastic lunch, we got to see how Standard & Poor's uses Autonomy to help their analysts get through the millions of pages of regulatory filings. S&P has a team of PhD mathematicians who have developed some fairly sophisticated models in Autonomy to help them extract the "meat" out of their stream of documents.
Autonomy seems to be positioning itself as a major add-on in the SharePoint marketplace, adding very sophisticated document searching. It would be interesting to compare Autonomy with things like Google Search and X1.
©2007 Marc Adler - All Rights Reserved
On to Esper/NEsper
I have had to expand the CEP evaluation process. I am going to start looking at NEsper, and maybe Apama (after some recommendations by some of our Asia folks, who seemed pleased with Apama's performance under heavy loads).
I just downloaded NEsper and I am starting to go over some of the docs. I am sure that Thomas and Aaron will correct me if I say anything incorrect about Esper. Two things stand out about the Esper offering:
1) No GUI builder or visual tools, probably because .....
2) Esper/NEsper is a component designed to be incorporated into your application ... in other words, it is treated as a third-party .NET assembly, just like Syncfusion, Log4Net, etc.
So, unlike Coral8, where you run a separate Coral8 server process, you need to write an application that "contains" Esper/NEsper. While this solution does not favor quick, out-of-the-box prototyping the way Coral8 does, it gives you more control over the CEP actions. Everything with Esper/NEsper is "in-process" to your application.
Also, Esper/NEsper is Open Source. I downloaded it, and I have the full source code sitting on my hard drive. I have to talk to the financial guys at my company, but I don't think that we would have to do the same amount of financial due diligence with an Open Source effort as we would with a company that does not follow the Open Source model. Maybe we will have to count the number of moths that fly out of Thomas' wallet.
©2007 Marc Adler - All Rights Reserved
Sunday, December 02, 2007
Reducing Lock Contention
Here is a good blog posting on 10 Ways to Reduce Lock Contention. Even though our .NET-based client-side framework behaves well, I should take these hints and examine our framework with a fine-tooth comb.
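To pick one hint from that list, lock striping: instead of one hot counter that every thread hammers, spread the updates across several slots and sum them on read. A toy Java sketch of the idea (mine, not code from the linked post):

```java
import java.util.concurrent.atomic.AtomicLong;

// A toy striped counter: updates are spread over several slots so that
// concurrent writers rarely contend on the same CAS target.
public class StripedCounter {
    private final AtomicLong[] stripes;

    public StripedCounter(int nStripes) {
        stripes = new AtomicLong[nStripes];
        for (int i = 0; i < nStripes; i++) stripes[i] = new AtomicLong();
    }

    public void increment() {
        // Pick a stripe from the current thread's id; each thread tends
        // to stay on its own slot, reducing contention.
        int idx = (int) (Thread.currentThread().getId() % stripes.length);
        stripes[idx].incrementAndGet();
    }

    // Reads sum all stripes; exact once the writers are quiescent.
    public long sum() {
        long total = 0;
        for (AtomicLong s : stripes) total += s.get();
        return total;
    }

    // Helper: hammer the counter from several threads and return the sum.
    static long parallelCount(int nStripes, int threads, int perThread) {
        StripedCounter c = new StripedCounter(nStripes);
        Thread[] ts = new Thread[threads];
        for (int i = 0; i < threads; i++) {
            ts[i] = new Thread(() -> {
                for (int j = 0; j < perThread; j++) c.increment();
            });
            ts[i].start();
        }
        for (Thread t : ts) {
            try { t.join(); } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        }
        return c.sum();
    }

    public static void main(String[] args) {
        System.out.println(parallelCount(8, 4, 100_000)); // 400000
    }
}
```

The trade-off is that reads are only exact when the writers have quiesced, which is usually fine for stats counters.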
I am also following all of the recent developments in Parallel Programming coming out of Microsoft. I wonder how much Joe Duffy's team interacted with the Accelerator guys from Microsoft Labs. I am also very interested to see if our Derivatives Analytics team, which is very strong in .NET, can leverage this new technology instead of, or in addition to, some of the proprietary technology offered by hardware acceleration vendors.
By the way ... I just started using Google Reader. This blog is one of the blogs that Google Reader automatically recommended.
©2007 Marc Adler - All Rights Reserved
Some Random Comments
1) Go out and see the new Coen Brothers' film, No Country for Old Men. Absolutely startling. I saw it over a week ago, and I still can't stop thinking about it.
2) Please pay careful attention to the Comments section of each post here. Many of the CEP vendors are responding with interesting and important comments.
3) I need to start thinking about a notification framework for the CEP project. Notifications are a bit more complicated in financial firms, where Chinese Walls have to exist and compliance officers are constantly looking over your shoulder. It's a good idea to involve the compliance guys right from the start, and to have THEM tell you where the Chinese Walls should be. Nothing will get their attention more than discovering that your prop traders are getting notifications about your customer order flow!
4) I just subscribed to Alex's blog. Some interesting posts. Takeaways from skimming through his blog include:
a) I need to narrow down CPU pricing from the CEP vendors. Let's assume quad-core machines.
b) I am more curious about NEsper than I was before.
c) How will Esper and BEA stay in sync? Seems a bit annoying that BEA has seen fit to change some of the syntactic sugar of the original product.
©2007 Marc Adler - All Rights Reserved