Friday, November 30, 2007

Consultant Wanted - C# and Market Data Expertise

A vendor who we do business with (not a CEP vendor!) asked me to help them find an independent consultant for a few months. This consultant should be a C#/.NET expert, know market data APIs, and probably (I am guessing) have decent communication skills. The consultant will be working with developers from the vendor, all of whom know C# at a basic level.

I will help them identify this consultant. The resulting work will be consumed by my team, as well as other financial firms that rely on real-time market data.

You can contact me at

XXXXX magmasystems XXXXXXX

©2007 Marc Adler - All Rights Reserved

Coral8 Update

We are finishing up the first phase of the Coral8 evaluation. This week, I met with Terry Cunningham (who flew his Falcon 10 out to meet us), and I had a great session with Henry, their pre-sales engineer. Terry was the creator of Crystal Reports, and later, the head of Seagate Software. I always have a soft spot in my heart for a fellow pilot .... even if his plane can go faster and higher than mine!

We validated that Coral8 was putting out the same output as our custom app, and I was enlightened on some of Coral8's capabilities that were not so easy to find in their wads of documentation. Although there is much good in Coral8, there were also some gotchas.

- Documentation needs to be consolidated a bit. There are a lot of separate manuals plus technical articles. There needs to be a "cookbook" on their CCL language.

- You cannot test a simple user-defined function without writing an intermediate stream. There is no simple way to dump a variable to the console. In other words, I would like to do this simple thing:

SET commission = CalculateCommission(); -- this is my user-defined function

- We managed to get the Coral8 Studio to freeze consistently. Luckily, no work was lost. The Coral8 Studio is written using wxWidgets, so I wonder how they do unit-testing on the studio.

My opinion is that, although it is great to have the advanced features, you still need to pay attention to the everyday, little tasks that developers need to do. Henry tells me that, in the future, Coral8 will move to a more Visual Studio-like, file-based way of developing. I certainly welcome this. Henry spent 8 hours watching me drive. When I was having problems, I verbalized the issues so that Henry could see what I was going through and bring them back to his management.

On the plus side :

- I have been reading about Coral8's pattern matching capabilities. We will definitely be exploring this.

- Coral8 has a relatively low barrier to entry. Even if we need production, development, and COB servers (all dual- or quad-core machines), it won't break our budget.

- Their software does not have any time limits on the evaluation versions. One thing that I do not like is a license key that is only good for 30 days. Given the nature of financial companies, we often get pulled into a lot of side projects. I don't want to be in the thick of things, only to find out that the license key has expired. Coral8 is very friendly to the evaluator.

Now, on to Aleri. I will be using their new 2.4 release.

©2007 Marc Adler - All Rights Reserved

Financial Due Diligence

In my super-mega Investment Bank, we work with all kinds of vendors. In fact, in our old group, one of the things that we were charged with was investigating all kinds of esoteric technologies that could give us an edge in the trading world.

If you use a vendor for a "bet the farm"-type application, then you want to make sure that the vendor behind the product is rock solid. My boss, who is the Global Head of Equities Technology for the firm, asked me who the "800-lb gorilla" is in the CEP space. He wanted to make sure that we were safe in our choice, and that, no matter how good the technology is, we were not putting all of our eggs into the basket of a guy working nights in his basement (although those kinds of companies usually make the best software!).

There is really no 800-lb gorilla in the "pure" CEP space. By "pure" CEP players, I am talking about guys like Aleri, Coral8, Streambase, Esper, Apama, Truviso, Kaskad, etc. In this space, most of these companies number their customers in the dozens rather than the thousands. These companies are all competing for that big reference customer, the one that lets them stand back and say, "This investment bank doubled their trading revenues because of our product."

Compounding this is the whole credit crunch and subprime mess. The New York Times had an article yesterday describing the effects that the credit crunch is starting to have on all sorts of companies ... even software companies and web-design shops were mentioned. So, we need to make sure that the CEP vendor we choose is not, and will not be, affected by the credit crunch. Coincidentally, an ex-employee of a CEP vendor sent me a private email mentioning that his company was undergoing a round of layoffs.

No matter which CEP vendor you choose, you should always have one backup. This is standard practice when dealing with small vendors. You should have the vendor's source code in escrow. You should have your financial people do a deep dive on a vendor's financials. Is the vendor self-funded or are they VC funded? Does the VC have the appetite to wait 5 to 7 years for a good return on their investment? What is the past behavior of the VC firms with regards to startups? What happens if the chief architect of the product leaves the company? Is the product written in a mainstream language, in case you need to take possession of the source code that's in escrow?

These are all questions that you should ask before making the final selection of your vendor.

©2007 Marc Adler - All Rights Reserved

Saturday, November 24, 2007

First Use Case Done with Coral8

I now have Coral8 detecting when a sector has abnormal activity, and I have a Coral8 Output Stream publishing into a .NET application for visualization. If I want to, I can take the data from the Coral8 alert, transform it into a JMS message, and publish it out on the Tibco EMS bus for other applications in the organization to consume. Or, I can publish it out to another Coral8 stream.

Well done, Coral8 team!

Now, it's on to the Aleri evaluation. The good people on Aleri's sales engineering team have done most of this first use case, but now that I am armed with more Coral8 knowledge, I need to try to rebuild the Aleri use case from scratch by myself.

©2007 Marc Adler - All Rights Reserved

Thursday, November 22, 2007

More on the first CEP Use Case

Yesterday, I had a great two-hour session with Henry and Bob from Coral8 in which most of the use case was done.

Henry is our designated pre-sales engineer. His job is to do what it takes to make sure that the prospective customer is happy with the product before making a decision to purchase the product. Bob is the head architect of Coral8, and his job (as he described it) is to make sure that the product is as easy to use as possible.

Between Henry and Bob, two solutions were offered. I will go into the first solution in this blog entry. The second solution revolves around custom timestamping of messages by the input adapter, and this topic deserves a blog entry of its own.

The main problem was to analyze the order flow for each sector over a one minute timeslice, and determine if any sectors showed abnormal activity. The problem that I was faced with was that the concept of “time” was determined by the TransactTime field in the FIX message, and not by the “clock on the wall”. So, if for some reason, I received two FIX messages in a row, one whose TransactTime field was 14:24:57 and one whose TransactTime field was 14:25:01, then the receipt of the second FIX message should cause a new timeslice, regardless of what the wall clock said.
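Concretely, the event-time bucketing can be sketched in a few lines of Python. (TimeToTimeBucket is the name used in the CCL later in this post; this implementation, and the 9:30-16:00 trading day it assumes, are my own illustration.)

```python
from datetime import datetime

def time_to_time_bucket(transact_time: str) -> int:
    """Map a FIX TransactTime ("HH:MM:SS") to a 0-based one-minute
    timeslice within the 9:30-16:00 trading day (0 to 389)."""
    t = datetime.strptime(transact_time, "%H:%M:%S")
    return (t.hour - 9) * 60 + (t.minute - 30)

# Two orders whose TransactTimes straddle a minute boundary fall into
# different timeslices, regardless of the wall clock:
print(time_to_time_bucket("14:24:57"))  # 294
print(time_to_time_bucket("14:25:01"))  # 295
```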

The solution that Henry came up with was to use a pulse in a stream. Although the concept of raising an event is very common in programming, it is not really something that you tend to do in SQL stored procedures. The thing is that programming in Coral8’s CCL (as well as the SQL-like dialects that many of the CEP vendors have) is a combination of procedural and SQL programming, and the trick is to find the correct “pattern” to solve your problem. This is where many of the CEP vendors can improve: they can publish a catalog of patterns, they can come up with FAQs, etc. I mentioned this to Bob of Coral8, so expect to see some movement on this front from the Coral8 folks.

Here is what the pulse stream looks like in Coral8’s CCL:

-- LastTimeSlice holds the maximum timeslice (0 to 389) of the order stream.
-- When we see an order with a TransactTime greater than the current max timeslice,
-- then we set the new max timeslice. We also use this as a signal (pulse)
-- to one of the streams below.

INSERT INTO stream_Pulse
SELECT TimeToTimeBucket(FlattenNewOrder.TransactTime) AS epoch
FROM FlattenNewOrder
WHERE TimeToTimeBucket(FlattenNewOrder.TransactTime) > LastTimeSlice;

-- When we insert a new timeslice into the stream_Pulse stream, we also
-- set the new maximum timeslice.
ON stream_Pulse
SET LastTimeSlice = stream_Pulse.epoch;

We have a global variable that keeps the maximum timeslice flowing through our system. Since there are 6.5 hours in the trading day, there are 390 minute-sized timeslices that we want to consider.

In the INSERT statement, if the timeslice from the incoming FIX message is greater than the current maximum timeslice, then we insert a new record into the pulse stream.

The ON statement functions like a trigger. When a new record is inserted into a stream, you can have one or more ON statements that react to the event of inserting a record into the stream. Here, we set the new maximum timeslice.
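For comparison, the pulse-plus-trigger pattern collapses to a few lines in a procedural language. A Python sketch of the same logic (the class and method names are my own):

```python
class PulseDetector:
    """Procedural analogue of the stream_Pulse INSERT plus the
    ON ... SET trigger: track the maximum timeslice seen so far
    and fire a pulse when a later one arrives."""

    def __init__(self):
        self.last_timeslice = -1

    def on_order(self, timeslice: int) -> bool:
        if timeslice > self.last_timeslice:
            self.last_timeslice = timeslice  # the ON-statement side effect
            return True                      # pulse: a new timeslice began
        return False

p = PulseDetector()
assert p.on_order(294) is True    # first order opens timeslice 294
assert p.on_order(294) is False   # same timeslice: no pulse
assert p.on_order(295) is True    # later TransactTime: pulse fires
```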

We need to maintain a Window that contains all of the orders for the current timeslice. The order information includes the stock ticker, the sector that the stock belongs to, the number of shares in the order, and the current timeslice. In Coral8, a Window provides retention of records. You can specify a retention policy on a Window, whether it be a time-based retention policy (keep records in the window for 5 minutes) or a row-based retention policy (keep only the last 100 rows). What is missing here is a retention policy based on a boolean expression or on a certain column value changing. Streambase has this, and Coral8 knows that this feature should be implemented down the road.
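To make the distinction concrete, here is an illustrative Python sketch of a window with a row-based cap plus the missing predicate-based eviction (this is toy code of my own, not any vendor's API):

```python
from collections import deque

class Window:
    """Toy record window: a row-based cap plus predicate-based
    eviction (the retention policy noted as missing above)."""

    def __init__(self, max_rows=None):
        self.rows = deque()
        self.max_rows = max_rows

    def insert(self, row):
        self.rows.append(row)
        # row-based retention: keep only the last max_rows rows
        while self.max_rows is not None and len(self.rows) > self.max_rows:
            self.rows.popleft()

    def evict_where(self, predicate):
        # predicate-based retention: drop every row matching the
        # condition, e.g. rows belonging to an older timeslice
        self.rows = deque(r for r in self.rows if not predicate(r))

w = Window(max_rows=100)
w.insert({"ticker": "IBM", "timeslice": 294})
w.insert({"ticker": "MSFT", "timeslice": 295})
w.evict_where(lambda r: r["timeslice"] < 295)
assert [r["ticker"] for r in w.rows] == ["MSFT"]
```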

-- The TickerAndSector window holds all FIX orders for the current timeslice.
-- Each row of the window contains the FIX order and the sector information.
-- When we see a new timeslice, the TickerAndSector window is cleared
-- using a DELETE statement.
CREATE WINDOW TickerAndSector
SCHEMA (Ticker STRING, SectorName STRING, SectorId INTEGER, Shares INTEGER, TransactTimeBucket INTEGER)

INSERT INTO TickerAndSector
SELECT FlattenNewOrder.Ticker, TickerToSectorMap.SectorName, TickerToSectorMap.SectorId,
       FlattenNewOrder.Shares, TimeToTimeBucket(FlattenNewOrder.TransactTime)
FROM FlattenNewOrder, TickerToSectorMap
WHERE TickerToSectorMap.Ticker = FlattenNewOrder.Ticker
AND TimeToTimeBucket(FlattenNewOrder.TransactTime) >= LastTimeSlice;

Now that we have a list of orders that occur for the current timeslice, we need to know when a new timeslice occurs. At this point, we need to analyze the orders for the current timeslice, find out which sectors are showing abnormal activity, and clear out the TickerAndSector window so that new orders can be accumulated for the new timeslice.

-- The OrdersPerSectorPerMinute window contains the aggregated totals
-- for each sector for the previous timeslice. The aggregated totals include
-- the number of orders for each sector and the total number of shares for each sector.
-- The interesting part of this is the join between the TickerAndSector window
-- and the stream_Pulse. The stream_Pulse will be triggered when we see a new
-- timeslice.
-- When we insert rows into the OrdersPerSectorPerMinute window, we will trigger
-- a deletion of the old info in the TickerAndSector window.
CREATE WINDOW OrdersPerSectorPerMinute
SCHEMA (SectorName STRING, SectorId INTEGER, OrderCount INTEGER, TotalShares INTEGER, Timeslice INTEGER)

INSERT INTO OrdersPerSectorPerMinute
SELECT tas.SectorName, tas.SectorId, COUNT(*), SUM(tas.Shares), stream_Pulse.epoch
FROM TickerAndSector tas, stream_Pulse
GROUP BY tas.SectorName, tas.SectorId, stream_Pulse.epoch;

ON OrdersPerSectorPerMinute
DELETE FROM TickerAndSector
WHERE TransactTimeBucket < LastTimeSlice;

As you can see from the above code, when a new timeslice appears, we aggregate the number of orders and the total number of shares that are in the TickerAndSector window. The interesting thing here, and the thing that I might not have figured out on my own, was that we need to join with the pulse stream that we talked about before. The pulse stream here is being used to “kick start” the calculating and dumping of the records in the current timeslice.

Finally, since we have aggregated the information for each sector for the current timeslice, we want to see if any sector exceeded the maximum “normal” number of orders.

-- This output stream will alert the user when a sector exceeds the
-- max orders for that timeslice.
INSERT INTO AlertStream
SELECT R.SectorId, R.SectorName, R.OrderCount, R.TotalShares
FROM OrdersPerSectorPerMinute AS R, NormalOrdersPerSectorPerTimeslice AS H
WHERE R.SectorId = H.SectorId AND R.Timeslice = H.Timeslice AND R.OrderCount > H.MaxOrders;

And, that’s it! If we attach a JMS output adapter to the AlertStream, we can generate a new, derived event, put that event back on the EMS bus (or we can send it into another Coral8 stream), and alert some kind of monitoring application.

Thanks to the Coral8 guys for helping me slog my way through the learning process.

©2007 Marc Adler - All Rights Reserved

Tuesday, November 20, 2007

Our First CEP Use Case (and thoughts on Coral8 and Aleri)

For the Complex Event Processing (CEP) engine evaluation, we have chosen a very simple use case. This use case is:

Tell us when orders for a sector show a greater-than-normal level.

Even though this use case seems very simplistic, and would not normally be an ideal test of a CEP engine, it is ideal for our environment. Why? It forces us to get at various data streams that have previously been inaccessible to most people, and it forces the owners of those streams to make their data clean.

(Note: this use case is a very generic use case and test for CEP. I am not giving away any special use cases that would give my company a competitive edge, nor will I ever do so in this blog.)

At the Gartner CEP Summit last September, Mary Knox of Gartner mentioned that one of the obstacles to successful CEP projects at large organizations is the process of liberating all of the data sources that you need and getting the various silos to talk to each other. We have found this to be the case at our organization too. We figure that if we can get this simple use case to work, then we have won 50% of the battle.

What kind of data do we need to implement this use case?

  • We need to tap into the real-time order flow. Order flow comes to us through FIX messages, and for older systems, through proprietary messages that will one day be deprecated. Luckily, we have found a system that provides us this information. Although this system is a monitoring GUI, we have identified its importance to our company, and we are working with the product owner to split his app into a subscribable order service and a thinner GUI.
  • We need historical order data in order to determine what “normal activity” is for a sector. Luckily, we have this data, and we are in the process of getting access to it. We also need to decide what we mean by “abnormal activity”. Does it mean “2 standard deviations above the 30-day moving average for a sector”?
  • We need to be able to get a list of sectors, and for each order, we need to map each ticker symbol to its sector. Sectors are signified by something called GIC codes, and there are 4 levels of GIC’s. The important thing that we need is to ensure that all corporate actions get percolated down to these mapping tables. So, if a company changes its ticker symbol (like SUNW to JAVA), then the new ticker symbol needs to be automatically added to these mapping tables.

Let’s say that we are able to get all of the data that we need, and that the stream of data is pristine. We still have to get it into the CEP engine for analysis.
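As a concrete stake in the ground, the "2 standard deviations above the 30-day moving average" definition floated above could be computed like this (illustrative Python; the statistic itself, and the sample history, are my own assumptions):

```python
from statistics import mean, stdev

def abnormal_threshold(history):
    """history: order counts for one sector and timeslice over the
    trailing trading days. Abnormal = above mean + 2 std deviations."""
    return mean(history) + 2 * stdev(history)

# Hypothetical trailing history for one sector/timeslice:
history = [100, 110, 95, 105, 90, 108, 102, 98, 112, 100]
threshold = abnormal_threshold(history)   # roughly 115.8 for this history
assert 100 < threshold < 120              # a count of 120 would be flagged
```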

If you think of writing a normal, procedural program (i.e., a C# app) to do this analysis, the steps are pretty easy.

    1) Read in all of the reference data. This includes the ticker-to-sector mappings and the list of normal activity per sector per time-slice. We will consider a timeslice to be a one-minute interval. In a 6.5 hour trading day, there are 390 minutes. There are also 11 “GIC0” sectors. So, a timeslice will be an integer from 0 to 389.

    2) Subscribe to a stream of FIX orders.

    3) As each order comes in, extract the ticker and map it to a sector. We are also interested in the number of shares in the order and the time that the order was placed. For each order, increment a running total for that sector and for that timeslice.

    4) Any orders that come in that are past the current timeslice are ignored. Also, any orders that come outside of the normal trading day are ignored. This way, we don’t consider any orders that may have been delayed through our systems.

    5) If we detect a new and later timeslice, then examine all of the sectors for the previous timeslice. If any of the sectors show heightened activity, then alert the user. Then, clear the totals for all of the sectors, and start accumulating new totals for all of the sectors.

    This looks pretty easy. I would assign this to a good C# developer, and hope to get a finished program in one or two days.

    Now, the task is to map this into a CEP engine.

    Most of the CEP engines have a language that is based on SQL. So, you can imagine all of the processing steps above passing through multiple streams in the CEP engine. For step 1) above, we would have two input streams, one for the ticker-to-sector mapping data and the other for the “normal sector activity” data. You can imagine two simple SELECT statements in SQL that read this data from some external database, and construct two in-memory tables in the CEP engine.

    For step 2, you need to write a specialized input adapter that subscribes to a communications channel (sockets or JMS) and reads and decodes the FIX orders. Most orders come through as NewOrderSingle messages (FIX message type = ‘D’). There are various versions of FIX, but let’s say that everything comes in as FIX 4.2 messages.
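The decoding itself is mostly splitting tag=value pairs on the SOH delimiter. A minimal Python sketch for a FIX 4.2 NewOrderSingle (only the standard tags 35, 55, 38 and 60 are used; all the adapter plumbing around it is omitted):

```python
SOH = "\x01"  # the FIX field delimiter

def parse_fix(msg: str) -> dict:
    """Split a raw FIX message into a {tag: value} dict."""
    return dict(field.split("=", 1) for field in msg.strip(SOH).split(SOH))

raw = SOH.join([
    "8=FIX.4.2",             # BeginString
    "35=D",                  # MsgType D = NewOrderSingle
    "55=IBM",                # Symbol
    "38=500",                # OrderQty
    "60=20071120-14:24:57",  # TransactTime
]) + SOH

fields = parse_fix(raw)
if fields["35"] == "D":      # only NewOrderSingle interests us
    order = (fields["55"], int(fields["38"]), fields["60"])
    print(order)             # ('IBM', 500, '20071120-14:24:57')
```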

    Most of the CEP vendors support in-process and out-of-process adapters. In-process adapters are faster than out-of-process adapters, but out-of-process adapters are usually easier to write. An out-of-process adapter will read data from some kind of communications bus (or even from a database table or a flat file), and will write a data stream to the CEP engine. It would be ideal to have the CEP vendors support FIX in in-process input and output adapters.

    Step 4) is easy. We calculate the 0-based timeslice for an order, and if it is below 0 or above 389, then we ignore this order in the stream. This can be done with a simple WHERE clause in the SQL statement.

    We also need to record the “current timeslice” and ignore any orders that come before the current timeslice. So, we need the concept of a “global variable” and when we see an order with a later timeslice, we need to update this variable. This is something which is easy to do with a procedural language, but what is the best way to do this in SQL?

    Steps 3) and 5) are interesting. We need to keep a one minute window per sector. This window should only keep running totals for the current timeslice. When a new timeslice comes in, we need to analyze the sector activity in the current timeslice, do any alerts, and then clear out the totals in all sectors. Again, this is something that is extremely easy to do in a C# application, but translating it into SQL is a bit of a challenge.

In step 3), the mapping of ticker to sector is very easy. It’s just a join of the ticker in the order with the ticker in the mapping table. The interesting thing is the choice of window type for the stream. Do we accumulate all orders for all sectors for the one-minute timeslice, and then, when we see a new timeslice, just take a COUNT() of the number of orders for each sector? Or do we simply have a window with one row per sector, and keep running totals for each sector as orders come in?
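The second option, one running-total row per sector, is the cheaper of the two and is trivial to express procedurally (illustrative Python; the names are my own):

```python
from collections import defaultdict

# One running-total row per sector for the current timeslice:
# sector -> [order_count, total_shares]
totals = defaultdict(lambda: [0, 0])

def on_order(sector: str, shares: int):
    totals[sector][0] += 1
    totals[sector][1] += shares

on_order("Energy", 500)
on_order("Energy", 300)
on_order("Financials", 1000)
assert totals["Energy"] == [2, 800]

# On a pulse (new timeslice): inspect totals, alert on any sector
# over its threshold, then totals.clear() to start the next minute.
```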

    Coral8 supports the concepts of sliding and jumping windows. Aleri supports only sliding windows right now. With Coral8, we can set a window that will hold one minute’s worth of data, and we can also tell a stream that it should dump its output after one minute. However, we don’t want to tie the TransactTime in a FIX order message to the actual clock on the computer. We need a stream that will produce output on a certain value in a column, and neither Coral8 nor Aleri seem to have this yet.

    Here is some Coral8 code that shows windows and streams:

CREATE WINDOW TickerAndSector
SCHEMA (Ticker STRING, Sector STRING, SectorId INTEGER, Shares INTEGER,
TransactTimeBucket INTEGER)
KEEP 1 MINUTE;

INSERT INTO TickerAndSector
SELECT FlattenNewOrder.Ticker, TickerToSectorMap.Sector, TickerToSectorMap.SectorId,
       FlattenNewOrder.Shares,
       TimeToTimeBucket(FlattenNewOrder.TransactTime, 'HH:MI:SS AM')
FROM FlattenNewOrder, TickerToSectorMap
WHERE TickerToSectorMap.Ticker = FlattenNewOrder.Ticker;

    The first statement defines a window that keeps one minute’s worth of order data. After one minute, the window will empty its contents.

    The second statement will insert a new row into the window whenever we get a new order. After one minute, the window will send its output to another stream further down the pipeline. (We hope that the data will be sent to the next stream before the window clears itself. Otherwise, we will lose all of the data.)

So far, in my brief evaluation, I have found step 5) difficult to implement in Coral8. Aleri has implemented it by using a FlexStream, a stream that has procedural logic attached to it. Aleri has a custom C-like programming language that you can use to implement that logic. But, if you write too much logic in FlexStreams, wouldn’t you be better off just writing a nice C# application?

    To validate some of the CEP engines, I ended up taking a day and writing a C# application that implements this use-case. For grins, I added a tab that showed some animated graphics using the very excellent ChartFX package. The head of the trading business was so excited by this eye candy that he started to bring over various traders for a look at my simple app. So, in addition to this little app giving the traders information that they did not have before, it provided them a flashy way to see real-time movement across sectors.

    In addition to having SQL skills, a good CEP developer needs to readjust their way of thinking in order to consider pipelined streams of SQL processing. There is a big debate going on in the Yahoo CEP forum as to whether SQL is a suitable language for CEP processing. So far, with this use case, I see the suitability of SQL, but I also need to step out of the SQL way of thinking and apply some procedural logic.

    One of the things that I still need to be convinced of is that CEP engines can do a better job than custom code. I am all ears. Any CEP vendor (even Streambase) is invited to submit public comments to this blog to tell me how this use case can be implemented with their system.

    ©2007 Marc Adler - All Rights Reserved

    Saturday, November 17, 2007

    CEP Vendor Thoughts

    Recently, I came across an article on Streambase in Windows in Financial Services magazine. One of the questions to the head of Streambase went like this:

    WFS: Does StreamBase have any competitors?

    BM: The major players have not yet delivered anything in this space. IBM, for example, does not have a project to build a technology like this. We are IBM’s solution in this space.

In my opinion, this answer totally evades the question. What happened to companies like Aleri, Coral8, Esper, Apama, Skyler, Truviso, Kaskad, etc.? How about the IBM offering that Opher is working on? All of these companies freely acknowledge Streambase as a worthy competitor, and rightly so. It would be nice to see Streambase acknowledge the same. Brown University certainly was not the only university doing CEP research, and not the only one to commercialize its offerings.

    And shame on Microsoft and Windows in Financial Services magazine for letting this slip by. Are you a journalistic effort or a fluff rag?

In our evaluation of CEP vendors, we chose not to evaluate Streambase, for various reasons. Streambase might have the best technology of all of the CEP vendors (for example, look at Tibbets' comment from a few weeks ago on a question about cancelling events), but we will never get to find out. The people I feel bad for at Streambase are the dedicated development and support staff, who have probably come up with a really good product.

    (In the interest of fairness, Bill from Streambase told me recently that they had reduced the price of their offering, which was one of our concerns.)

    And, if anybody from Streambase reads this blog ---- doing an end-run around me and trying to market directly to the business will not earn you any points. The business people rely on me to make the right decision, and all of your email to the business side (as is any email from information technology vendors to the business side) gets forwarded directly to me. And, I guess that we will end up paying real dollars to your imaginary competitors.

Meanwhile, consider the attitudes of Coral8 and Aleri. One of these companies JUST hired its first salesperson. Their mantra was that the product should be the best it can be before it is pushed by a salesforce. The other company has a low-key sales approach too. They have gone beyond the call of duty to incorporate our suggestions into their product and to come up with a POC that really impresses us.

Both vendors have come up with FIX input adapters at our behest. Aleri has incorporated some of our suggestions into their FlexStreams, and has cleaned up some of their visual development studio. (With FlexStreams, you can use a procedural programming language to create custom processing for streams.) I am impressed by what these companies have done to earn our business. I feel that, in exchange for doing some of what we want, they get to expand their offerings for the capital markets community and move beyond the narrow focus of algorithmic trading and pricing engines.

Kudos to Mark, John, Henry and Gary of Coral8, and to Don, John, Jerry, Jon, David, etc. of Aleri. All very nice people, and all trying to compete honestly for a piece of the pie.

In my opinion, the Coral8 and Aleri offerings are so close that we will eventually be choosing one vendor as primary and the other as hot backup. What remains is performance evaluation: pushing multiple streams of fast-moving data into the CEP engine and seeing how it performs under heavy load. Let's see if they can handle the data rates that come at 2:15 PM on a Fed decision day.

One message that we have been hearing from the CEP and messaging vendors is that they perform better under Linux than Windows Server 2003. This is probably not a surprise to most people on Wall Street. But I wonder what Windows Server 2008 has to offer in comparison to Linux. The November 8, 2007 article at Enhyper has some interesting things to say about Microsoft's marketing of the London Stock Exchange deal. We will most likely be running our CEP engine on Linux unless Microsoft comes up with a really compelling reason to the contrary.

    ©2007 Marc Adler - All Rights Reserved

    Wednesday, November 07, 2007

    IBM's ManyEyes

    (Thanks to Jules)

    ManyEyes is a community-based visualization project from IBM, run by the guy who did the Stock Market heatmaps for Smart Money.

    You can upload your own datasets and apply some pre-made visualizations to it. People in the community have contributed other visualizations.

    Pretty cool.

    ©2007 Marc Adler - All Rights Reserved

    Algorithm for Implementing Treemaps

    ©2007 Marc Adler - All Rights Reserved

    Sunday, November 04, 2007

    Help Wanted for the Complex Event Processing Project

    I have open headcount for about 4 or 5 people for 2008 for the Complex Event Processing project that I am running.

    I realize that it is foolhardy to advertise for people who have prior experience in CEP. What I am looking for are smart developers who have a great passion to learn a new, interesting technology. The team that I envision will consist of:

1) Visualization developer - come up with new, interesting ways to visualize events and data. The work may entail working with the .NET framework that my team has built, integrating visualizations with existing Java/Swing-based trader GUIs, or even exploring WPF (as the company gradually embraces .NET 3.x). You could be investigating visualization tools like heatmaps, and you will definitely be evaluating third-party tools (both commercial and open-source). You will be involved in OLAP to some extent. There will also be involvement in building out a notification and alerting framework.

    2) CEP developer. You will be building out the analysis part inside the CEP engine. Most of the CEP engines use a variant of SQL, so you should be fairly comfortable with SQL concepts. It would be nice if you had previous experience with tools like Coral8, Aleri, Streambase, Esper, etc, but even if you haven't, you should be willing to learn these tools. You may also be interacting with consultants from these companies.

    3) Networking, messaging, and market data specialist. Help us decide if we should migrate to a new messaging infrastructure (like RTI or 29West). Experience with Tibco EMS is a big plus, as well as experience with working with high volumes of data and low latency. Interact with Reuters and Wombat infrastructures, as well as internally-built market data infrastructures.

4) Data specialist. You will be the person responsible for breaking down silos and getting good data into the CEP engine. Experience with SQL Server 2005 and Sybase is important. Experience with tick databases like KDB+ and Vhayu would be nice to have.

    Everyone will be doing a bit of everything, so everyone on this team will be intimately aware of what everyone else is doing.

This is a highly visible position in an investment bank that has promised me that it will reward good talent that comes to us from the outside.

    In addition to the positions mentioned above, I have two or three open headcount for people who want to work on the Ventana team. Ventana is the client-side .NET framework that is being used by various groups in our Investment Bank.

    ©2007 Marc Adler - All Rights Reserved