Wednesday, December 28, 2011

Brazil and Other Closing Thoughts for 2011

Last week, I came back energized from the first two weeks of my engagement in Sao Paulo, Brazil. I am amazed at how Brazil is exploding, and there seems to be a lot of evidence to support my observations.

But, before I go into Brazil, I want to backtrack a few weeks.

After the demise of the investment banking arm of Citadel Securities (search Google News for the term "Citadel Securities" to reconstruct the happenings), I rejoined some old friends in the world of capital markets consulting. It feels a bit strange to be sitting on the other side of the table from where I sat for the past 5 years, but since I came from a consulting background, the transition was pretty smooth. Now I am the vendor, and my former colleagues are clients, and I expect to get beaten up by them in the same way that I used to micro-examine the vendors who courted me. But, a challenge is a challenge, and I hope that my Wall Street experience can serve my new clients well.

The first engagement that I had was in Miami Beach (yeah, life sucks, right?) for a few weeks in November. A client of ours has a product that is widely used in its industry, but it needs to scale out massively in order to handle tens of thousands of simultaneous requests and to deal with increasingly complex products. The current product is a mixture of some new .NET technology and a lot of legacy VB6, ATL, C++, and MFC, and it is designed to handle one request at a time.

My job was to propose a future-state architecture that would allow them to scale up, handle failures gracefully, and allow pluggable components for pipelined processing of their data. One restriction was that their clients would incur little or no additional cost. Without going into much detail, I ended up implementing a typical Wall Street compute-grid-like architecture using RabbitMQ (hence my previous posts on Rabbit) and Microsoft's Velocity cache. This was an example where I was able to bring the kind of architecture that is typically found on Wall Street to a client who had never heard of grid computing before.

For my current engagement, I am traveling down to Sao Paulo, Brazil, for two-week stretches over the next several months. I am the head technologist on a small team that is analyzing the current state and proposing a new future-state architecture for our capital markets client. I am not going to talk about what we are doing for the client, but I want to give my impressions of Brazil and Sao Paulo.

Every day, it seems that articles like these are appearing in my news feed:

Brazil overtakes UK as sixth-largest economy

Miami Has a Hearty Oi (Hello) for Free-Spending Brazilians

Brazil Offers Opportunities and Risks For Wall Streeters On the Move

Traveling through Sao Paulo, you can feel the excitement of the city, like they know that they are the next big thing. It's New York, without the tension and angst of New York. There are new office buildings springing up everywhere. Brookfield has an incredible new building under construction by the Faria Lima financial center. The Grand Hyatt's executive lounge is crowded with all sorts of foreigners trying to get in on the action before others do.

Sao Paulo is huge ... bigger than NYC ... but the office buildings are spread out all over the city. I did not get the feeling of a single financial district as you would find in NYC, Miami, Charlotte, or San Francisco. The subway (metro) is not as extensive as it is in NYC, so odds are that you need to take a taxi to go from office to office. Taxis are convenient and relatively inexpensive, but the traffic ....

You really need to be able to speak Brazilian Portuguese. Whoever told me that everyone under 30 speaks English was lying. Your high school Spanish might bring you some comfort, but not much. Brazilians do not like speaking Spanish, but if you speak Spanish with a strong American accent, they might be willing to forgive you.

Not everything is sunshine and lollipops .... Crime is something tangible and is definitely the thorn on the rose. I was told never to show a hint of my laptop while in a taxi, because roving gangs of motorcycle thieves are everywhere, and they will surround your taxi and take your laptop at gunpoint. A Brazilian I was having dinner with took a taxi for the five-block trip to his hotel because he was afraid of being robbed ... and this was in a good section of Sao Paulo. Luxury high-rise apartments look beautiful from your hotel room, but when you view them from the sidewalk, you see the 30-foot barbed-wire walls and teams of black-booted security guards patrolling the grounds.

Traffic is a mess too. Stand-still traffic at almost all hours of the day. If you are traveling between offices and you choose to go by taxi, give yourself an hour to go two miles.

The traffic and the fear of crime combine to make Sao Paulo the city with the greatest number of helicopters per capita. Almost every office building has a helipad on the roof, and for a pilot like me, it's exciting to see the constant flow of helicopters 500 feet above your head.

It will be interesting to see if Sao Paulo and Rio can get their acts together before the World Cup in 2014 and the Olympics in 2016. Lots of problems to be solved. But, if Athens could do it, then Brazil should be able to pull it off too.

All of this makes Sao Paulo an interesting challenge. But I think that they are very receptive to Wall Street and City experience. A bunch of people have been eager to hear about my experience in Brazil, because it seems that the mood on Wall Street has never been gloomier, and people want a bit of a change-up in their lives. My opinion is that a bit of international experience in Brazil will be very helpful to a career in capital markets, especially in the IT world. Certain investment banks are willing to send their folks to Brazil on a 3-year program. Why not roll the dice!

Have a happy 2012. As always, comments and questions are welcome.


©2011 Marc Adler - All Rights Reserved. All opinions here are personal, and have no relation to my employer.

Thursday, December 01, 2011

Hanselman's List of 2011 Dev and Power User Tools is Out

One of my favorite recurring blog posts can be found here.

Don't forget to read all of the user comments, as they often refer to great tools that Hanselman has skipped over.

Unless you have admin access at work, you will most likely need to lobby your Chief Architect to get some of these tools approved. Or, if you have a TechArch committee, be prepared to abandon all hope.

©2011 Marc Adler - All Rights Reserved. All opinions here are personal, and have no relation to my employer.

Tuesday, November 29, 2011

I am going to be in Sao Paulo for Two Weeks

The market in Brazil seems to be exploding, and I am going to be in Sao Paulo from December 4 to December 16 to kick off work for a major new client.

If there are any Brazilians (or expats living in Brazil) who might be interested in a coffee, please feel free to contact me.

©2011 Marc Adler - All Rights Reserved. All opinions here are personal, and have no relation to my employer.

Saturday, November 26, 2011

HFT Firm Fined $850K

(Thanks to John Bates for the pointer)

According to this article, Infinium Capital Management was fined $850,000 for three separate incidents related to High Frequency Trading. It seems that they lost control of their algos at various times over the past 2 years.

I was thinking about this article in relation to a friend of mine who was interviewing at various HFT firms over the past few months (no, it's not me, and yes, he found a great job). He was asked every little question about C++ minutiae, questions about manhole covers, questions about networking trivia, and questions about the timing of instructions. Maybe that's valuable stuff for an HFT developer to know. But not once did these interviews touch on testing, quality assurance, or development practices.

This is probably the first step by the exchanges toward monitoring their HFT partners. As technology improves, and as it becomes easier to recognize errant algos (thanks, in part, to the same kind of monitoring that Nanex uses), you will see these kinds of fines become more commonplace. It will also become necessary for HFT firms to come up with better simulation scenarios for their algo testing.

It would be interesting to me if Infinium hired Nanex to do a post-mortem on these three situations so that Infinium can start fixing their algos. Maybe this could grow into a sideline business for a Nanex-like startup, where other HFT firms could outsource the post-mortem and QA work. At the very least, the exchanges would be able to outsource this kind of real-time surveillance work.

©2011 Marc Adler - All Rights Reserved. All opinions here are personal, and have no relation to my employer.

Friday, November 18, 2011

There seems to be a shortage of .NET Developers

As an aside, my company is looking for a few great .NET developers who can work at one of our hedge fund clients, so if you are really good and you are available, let me know. The skills required are WPF, WF, WCF, and Entity Framework.

©2011 Marc Adler - All Rights Reserved. All opinions here are personal, and have no relation to my employer.

Wednesday, November 16, 2011

Microsoft Dropping Dryad and Concentrating on Hadoop

Read about it here.

From what I have seen from my new vantage point in the consulting world, there are a number of Wall Street companies who are making investments into the Hadoop infrastructure. Good move by Microsoft.

©2011 Marc Adler - All Rights Reserved. All opinions here are personal, and have no relation to my employer.

Wednesday, November 09, 2011

Time Series Foundation released by Microsoft Research

Time Series Foundation

Time Series Foundation (TSF) is an open, .NET platform for exploring and prototyping new algorithms in time series analysis and forecasting. TSF is based on state space model methodology that includes all types of exponential smoothing, some autoregressive algorithms, and innovative algorithms for event detection and calendar event impact prediction.

TSF relies on Excel charting and presentation APIs by implementing an Excel interop layer. Numerical and graphical results of time series analysis and forecasting can be put in programmatically generated workbooks with the help of this layer. TSF also offers an Excel add-in that exposes a large subset of the platform's functionality through the Excel ribbon UI.
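To ground the jargon a little: "exponential smoothing" is just a recursive weighted average of the series, and simple exponential smoothing is the most basic of the state space methods that TSF wraps. Here is a minimal sketch (written in plain Java for illustration; this is my own toy code, not TSF's API):

```java
public class ExpSmoother {
    // Simple exponential smoothing: s_t = alpha * x_t + (1 - alpha) * s_{t-1}.
    // alpha in (0, 1]; a higher alpha weights recent observations more heavily.
    public static double[] smooth(double[] series, double alpha) {
        double[] smoothed = new double[series.length];
        smoothed[0] = series[0];                      // seed with the first observation
        for (int t = 1; t < series.length; t++) {
            smoothed[t] = alpha * series[t] + (1 - alpha) * smoothed[t - 1];
        }
        return smoothed;
    }

    public static void main(String[] args) {
        double[] s = smooth(new double[] {10, 12, 11, 15}, 0.5);
        System.out.printf("%.3f%n", s[3]); // the last smoothed value
    }
}
```

With alpha = 0.5, the series 10, 12, 11, 15 smooths to 10, 11, 11, 13, and the last smoothed value doubles as the one-step-ahead forecast.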

©2011 Marc Adler - All Rights Reserved. All opinions here are personal, and have no relation to my employer.

Saturday, November 05, 2011

Caching for .NET Apps

The goal ... find a simple distributed cache that can be used by a .NET enterprise app, supports the storage of large objects, and will result in no additional cost to the client. We don't care about cache notifications, and it's OK (but not desirable) to incur the cost of serialization from native .NET objects to XML strings.

Memcached for Windows is here. I used the Linux version at Citadel for our new multi-asset trading platform, and it was fairly easy to use, except it doesn't support the enumeration of cached objects.

A management console GUI for Memcached is here.

It has been over 2 years since I considered Microsoft Velocity (see past blog entries here). However, it seems that Microsoft's Velocity Cache has been put into the Azure AppFabric, and the client is not considering the cloud at this time. Does anyone know if you can use Velocity (and the new Azure Service Bus) independent of Azure?

It seems that the new System.Runtime.Caching namespace in .NET 4 doesn't support distributed caching.

There is a project on Codeplex called SharedCache, but it looks like it has not been updated in a while. In addition, some of the comments seem to indicate that it hogs the CPU.

NCache has a free version.

Has anyone used any other solution? So far, it looks like NCache or Memcached is the way to go.
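Whichever product wins, the calling code ends up using the same cache-aside idiom: check the cache, fall back to the data source on a miss, and populate the cache for the next caller. A sketch of the pattern (in Java for brevity; the C# version is a straight transliteration, and the in-process map here is just a stand-in for a real Memcached or NCache client, which would serialize values and talk to remote nodes):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

public class CacheAside {
    // Stand-in for a distributed cache client (Memcached, NCache, etc.).
    private final Map<String, Object> cache = new ConcurrentHashMap<>();

    // Cache-aside: return the cached value on a hit; on a miss, invoke the
    // loader (e.g., a database call) and remember the result for next time.
    @SuppressWarnings("unchecked")
    public <T> T getOrLoad(String key, Function<String, T> loader) {
        return (T) cache.computeIfAbsent(key, loader);
    }

    public static void main(String[] args) {
        CacheAside positions = new CacheAside();
        // First call misses and runs the loader; second call is served from cache.
        String v1 = positions.getOrLoad("IBM", k -> "loaded:" + k);
        String v2 = positions.getOrLoad("IBM", k -> "should-not-run");
        System.out.println(v1 + " / " + v2);
    }
}
```

The serialization cost mentioned above lives inside the client library in the real version; the pattern itself doesn't change.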

©2011 Marc Adler - All Rights Reserved. All opinions here are personal, and have no relation to my employer.

Friday, November 04, 2011

Installing RabbitMQ on Your PC

For the past 7 years, I have been running an old copy of Tibco EMS on my laptop, and I've been using this whenever I need to do some prototyping and development of new message-based apps. But in these cost-conscious times, many of my clients are looking for low-cost solutions. So, time to seriously consider RabbitMQ.

This is a quick guide that I wrote to get RabbitMQ up and running on a local PC in preparation for some .NET development.

Download the RabbitMQ Bits

1) RabbitMQ requires the Erlang runtime. You can download the Erlang/OTP runtime from the Erlang website. You need to install Erlang before downloading and installing the RabbitMQ server.

2) Download rabbitmq-server-2.6.1.exe for Windows from the RabbitMQ site.

2a) Install RabbitMQ. It will install a Windows Service and will automatically start it.

2b) From the Windows Start Menu, run services.msc and verify that the RabbitMQ service has started.

3) All of the .NET downloads for RabbitMQ are located on the RabbitMQ site.

3a) Download the .NET 3.0 version of the RabbitMQ client libraries and samples (a ZIP file).

3b) Extract the ZIP file to C:\Program Files\RabbitMQDotNetClient (you can pick any location that you want).

3c) In C:\Program Files\RabbitMQDotNetClient\bin, there is the main assembly called RabbitMQ.Client.dll. In the same directory is the assembly for WCF integration called RabbitMQ.ServiceModel.dll.

4) From the Windows Start Menu, go to the RabbitMQ Server item, and then run the RabbitMQ Command Prompt.

4a) rabbitmqctl.bat is the command-line utility that lets you control RabbitMQ and list various objects. A sample command to list the RabbitMQ exchanges is
 rabbitmqctl list_exchanges
However, we will use the management console, not the command line.

5) Download and install the RabbitMQ management plugins from the RabbitMQ site.

5a) Download all of the EZ files associated with rabbitmq_management_visualiser

5b) Drop these files into the RabbitMQ Server\plugins directory, which on my system is C:\Program Files (x86)\RabbitMQ Server\rabbitmq_server-2.6.1\plugins

5c) Restart the RabbitMQ server. You can do this through the Services.msc applet on your Windows machine.

Test It Out Using the Management Console

Now we will test out RabbitMQ by sending a message into an exchange and have a wildcard subscriber read that message.

6) As a test of the RabbitMQ management console, open the console's URL in your browser.

Use the user id "guest" and password "guest"

7) Go to the Exchanges tab and add a new exchange called CalcExchange that is a non-durable (transient) topic exchange. Fill in the name, type, and durability as follows:
Name: CalcExchange
Type: topic
Durability: Transient

8) Go to the Queues tab and add a new durable queue called Calc.Queue.1

9) Stay in the Queues tab. In the table of queues, click on Calc.Queue.1

9a) Add a binding. In the Exchange and Routing Key fields, add the following:
Exchange: CalcExchange
Routing Key: Calc.Data.*.1

This binds any message whose routing key matches Calc.Data.*.1 to the Calc.Queue.1 message queue. So, if you publish a message with the key Calc.Data.Foobaz.1, it will be routed to this queue, where the subscriber will pick up the message for processing.
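For reference, the topic wildcard rules are: * matches exactly one dot-delimited word, and # matches zero or more words. The matching logic itself is tiny; here is my own sketch of it (in Java, and obviously not RabbitMQ's real implementation):

```java
public class TopicMatcher {
    // AMQP topic semantics: '*' matches exactly one dot-delimited word,
    // '#' matches zero or more words. We translate the binding pattern into
    // an equivalent regular expression and match it against the routing key.
    public static boolean matches(String bindingKey, String routingKey) {
        String regex = bindingKey
                .replace(".", "\\.")        // literal dots
                .replace("*", "[^.]+")      // exactly one word
                .replace("#", ".*");        // zero or more words, dots included
        return routingKey.matches(regex);   // matches() anchors the whole string
    }

    public static void main(String[] args) {
        System.out.println(matches("Calc.Data.*.1", "Calc.Data.Foobaz.1")); // true
        System.out.println(matches("Calc.Data.*.1", "Calc.Data.Foobaz.2")); // false
    }
}
```

Note that Calc.Data.*.1 will not match Calc.Data.A.B.1, because * refuses to span a dot; you would need # for that.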

Now we will try to send a sample message using the management console.

10) Go back to the Exchanges tab and click on the CalcExchange item

10a) Go down to the Publish Message section.

10b) Type in the following entries for the Routing Key and Payload fields:
Routing Key: Calc.Data.ABC123.1
Payload: This is a test message for Calc Node 1

10c) Press the Publish button. When the status message pops up, just close it. The status message should have a green background, indicating success.

11) Go back to the Queues tab, and click on Calc.Queue.1

11a) Go down to the Get Messages section and press the "Get Message(s)" button. You should see the message that you just sent.

12) You can also go to the last tab of the management page, which should be the Visualizer tab, and view the topology.

As an aside, the sample programs that come with the .NET Client need a bunch of mods before you can get them to load into Visual Studio 2010.

©2011 Marc Adler - All Rights Reserved. All opinions here are personal, and have no relation to my employer.

Sunday, October 30, 2011

Siri in Capital Markets

My wife has been having all sorts of fun with her new iPhone 4s, constantly testing the NLP capabilities of the Siri servers. If you ask Siri "What is 2 plus 2", she (yes, Siri is my new daughter!) won't speak the answer, but she will show you the appropriate page in Wolfram Alpha.

I don't know if Siri will ever be used for trading. With any mobile platform, the main worries are connectivity and security. There have been reports about security gaps within the 4s, especially where Siri is concerned.

But, the Siri platform, combined with Wolfram Alpha, can be the starting point for powerful, on-the-fly analytics. Combine this with an alerting platform, and you can begin to see the possibilities for Capital Markets.

Apple needs to open up the Siri platform for integration with 3rd party apps. I can imagine someone entering a simple command like "Siri, tell me when IBM trades past its 52-week high" or something more complex like "Siri, tell me when people are rotating into the Utilities sector". This kind of custom semantic processing has been around for a while, but it's crucial that Apple opens up an API for Siri that is available to iOS developers.

Another thought - If Siri integration will only be available through a Cocoa API, does that mean that people will move away from developing HTML5 apps for the iPhone? Or will technologies like PhoneGap be more popular?

©2011 Marc Adler - All Rights Reserved. All opinions here are personal, and have no relation to my employer.

Thursday, October 27, 2011

One of the Best and Simplest Explanations of HFT


Looks like BAC has replaced C as the high-frequency trade of choice.

©2011 Marc Adler - All Rights Reserved. All opinions here are personal, and have no relation to my employer.

Wednesday, October 26, 2011

Free Visualization Controls for Silverlight

From the Microsoft Research site:

Dynamic Data Display adds interactive visualization of dynamic data to your Silverlight application. The download contains a set of UI controls for creating line graphs, bubble charts, heat maps, and other complex 2-D plots. 

The heat map part looks most interesting.

©2011 Marc Adler - All Rights Reserved. All opinions here are personal, and have no relation to my employer.

Tuesday, October 25, 2011

Interesting Topic at STAC Research Conference in New York on Nov 2nd

I am not going to be in New York that day, but STAC has a few topics of interest at their upcoming conference on November 2nd.

"Hadoop for Trading Firms: Beyond the Science Project," by Jack Norris, VP Marketing, MapR

"Standard Technology Benchmarks for Market Risk Platforms," by Lars Ericson, Director, Citi

It's nice to see someone from Citi presenting :-) Wish they had more people from the client side of things presenting.

©2011 Marc Adler - All Rights Reserved. All opinions here are personal, and have no relation to my employer.

Monday, October 24, 2011

Windows 8 Registered I/O Networking Extensions

From Len

The Registered I/O Networking Extensions, RIO, is a new API that has been added to Winsock to support high-speed networking for increased networking performance with lower latency and jitter. These extensions are targeted primarily for server applications and use pre-registered data buffers and completion queues to increase performance.

Is Microsoft moving toward the day when people will consider them for an HFT stack?

©2011 Marc Adler - All Rights Reserved. All opinions here are personal, and have no relation to my employer.

Tuesday, October 18, 2011

Raptor - Real-Time Analytics on Hadoop

I am starting to get interested in what Hadoop and "Big Data" can do for capital markets. This talk at the upcoming Hadoop World in NYC looks pretty interesting, especially the part about predictive trading.

Raptor combines Hadoop & HBase with machine learning models for adaptive data segmentation, partitioning, bucketing, and filtering to enable ad-hoc queries and real-time analytics.
Raptor has intelligent optimization algorithms that switch query execution between HBase and MapReduce. Raptor can create per-block dynamic bloom filters for adaptive filtering. A policy manager allows optimized indexing and autosharding.

This session will address how Raptor has been used in prototype systems in predictive trading, times-series analytics, smart customer care solutions, and a generalized analytics solution that can be hosted on the cloud.
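If the bloom filter part is unfamiliar: it is a compact bit array that can answer "definitely not present" or "probably present," which is exactly what you want for cheaply skipping blocks before touching disk. A toy sketch (my own illustration in Java, not Raptor's implementation; real filters use murmur-style hashing and sized-to-error-rate bit arrays):

```java
import java.util.BitSet;

public class BloomFilter {
    private final BitSet bits;
    private final int size;
    private final int numHashes;

    public BloomFilter(int size, int numHashes) {
        this.bits = new BitSet(size);
        this.size = size;
        this.numHashes = numHashes;
    }

    // Derive k hash values by mixing the key's hashCode with a per-hash seed.
    private int hash(String key, int seed) {
        int h = key.hashCode() * 31 + seed * 0x9E3779B9;
        return Math.floorMod(h, size);
    }

    public void add(String key) {
        for (int i = 0; i < numHashes; i++) bits.set(hash(key, i));
    }

    // false => definitely absent; true => probably present (false positives possible).
    public boolean mightContain(String key) {
        for (int i = 0; i < numHashes; i++)
            if (!bits.get(hash(key, i))) return false;
        return true;
    }

    public static void main(String[] args) {
        BloomFilter filter = new BloomFilter(1024, 3);
        filter.add("block-0042");
        System.out.println(filter.mightContain("block-0042")); // true
    }
}
```

The asymmetry is the whole point: a "no" is guaranteed, so a per-block filter lets the query planner skip HBase blocks with certainty.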

If anyone attends this talk, please feel free to post a comment here and let me know how it went. Hopefully, there will be some slides up on SlideShare afterwards.

©2011 Marc Adler - All Rights Reserved. All opinions here are personal, and have no relation to my employer.

Thursday, October 13, 2011

.NET Parallelism and a Worthy Cause

A fascinating case study is available from Microsoft on how Massachusetts General Hospital (MGH) greatly reduced the time (and cost) to analyze the results from a colonoscopy. For men over 50, a colonoscopy is practically mandatory, and the process is somewhat disruptive and uncomfortable. The use of parallelism has reduced the analysis time by many orders of magnitude, and has made the concept of "virtual colonoscopies" even more real.

Microsoft had pushed MGH to port its code to C#, but for a variety of reasons, this was impossible. I wonder whether there would have been even greater improvements if the code had been ported.

Bringing this to capital markets, companies might find that their real-time pricing and risk (and stress tests) might benefit from a rearchitecture to the .NET/C#/TPL stack.
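The pattern behind that kind of speedup is plain data parallelism: thousands of independent positions or scenarios priced across all cores. In .NET that's Parallel.ForEach or PLINQ over the portfolio; the same shape in Java's parallel streams (a sketch only, with a toy pricing function standing in for the real Monte Carlo or stress-test work):

```java
import java.util.stream.IntStream;

public class ParallelRisk {
    // Pretend per-position pricing work; in real life this would be a
    // Monte Carlo run or a stress scenario, not a one-line formula.
    static double price(int positionId) {
        return positionId * 1.5;
    }

    // Price n independent positions across all cores and sum the results.
    static double totalRisk(int n) {
        return IntStream.range(0, n)
                .parallel()
                .mapToDouble(ParallelRisk::price)
                .sum();
    }

    public static void main(String[] args) {
        System.out.println(totalRisk(10_000));
    }
}
```

The only precondition, in TPL or in streams, is that the per-position work really is independent; shared mutable state is what kills these rearchitectures.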

©2011 Marc Adler - All Rights Reserved. All opinions here are personal, and have no relation to my employer.

Monday, August 29, 2011

Git and Jenkins working

This blog post helped me get running with Git and Jenkins:

The key was finding the Plugins Management page within the local Jenkins monitoring page.

So, my build environment is now set up, and consists of

  • Eclipse Helios (IDE)
  • Maven (Build and dependencies)
  • Git (Source control)
  • Jenkins (Continuous Integration)
Next step is to install JUnit and write a few unit tests.

©2011 Marc Adler - All Rights Reserved. All opinions here are personal, and have no relation to my employer.

Maven and m2Eclipse

The continuing saga of the ".NET Guy" exploring the world of Java....

After doing some cleanup from Hurricane Irene (my two 300-foot trees remained rock-steady during the storm), I sat down with my task for the day, which was to get familiar with Maven. In order to shield myself from command-line arguments and XML configuration files, I downloaded a very wonderful tool called m2eclipse. This tool integrates Maven and Eclipse pretty nicely, and it looks like I can do all of the basic dependency management through it.

Maven and m2eclipse are two more things that a .NET developer has to get used to, since Visual Studio manages most of the dependency and build process. However, one nice thing about the entire open-source movement is that you can declare a Maven dependency, automatically download the source code for that dependency, and download the source for all of its sub-dependencies. Also, the Maven repositories provide a nice, centralized way of locating frameworks and tools from within Eclipse.

I also have Git working nicely, and the Eclipse integration is being provided by the EGit tool. Much nicer than ClearCase.

Now, I need to download and learn Jenkins for my Continuous Integration needs.

Are there any other Eclipse-based tools that you recommend? I would love to hear about them.

©2011 Marc Adler - All Rights Reserved. All opinions here are personal, and have no relation to my employer.

Sunday, August 28, 2011

Excursions Back into the World of Java

I did some Java development when it first came out. Honest I did. I even taught Java classes in the late 1990's at Bellcore, which was a division of AT&T.

I have managed teams that have Java-based systems. And, as part of the CEP system that I did, I had to go and write Java-based web services and deliver a Java API so that other groups could hook into the Notification Services.

Despite that, and because of my long history with Microsoft technologies, I am often viewed as "The Dot Net Guy".

So, this week, I broke out the Java manuals again, and I decided to write the Exchange Simulator portion of my little project in Java 1.6. Writing the simulator was fairly easy, and in little time, I had FIX messages being passed back and forth between my Java-based Exchange Simulator and my .NET-based OMS.

Java is not that much different from C#. It even has reflection, something that I sorely missed when I had to do C++ in the past year. I wish that Java supported C# auto-properties and C# events (is this stuff coming in Java 1.8?), but other than that, it feels like programming in C# 1.0.
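To make the auto-property gripe concrete: what C# writes in one line, Java spells out as a field plus a getter plus a setter. On the plus side, Java's reflection happily finds that boilerplate at runtime, just like C#'s would (a toy example; the try/catch is only there to keep the helper checked-exception-free):

```java
import java.lang.reflect.Method;

public class Quote {
    // What C# writes as: public double Price { get; set; }
    private double price;
    public double getPrice() { return price; }
    public void setPrice(double price) { this.price = price; }

    // Set the price via reflection and read it back; returns NaN on failure.
    static double roundTrip(double p) {
        try {
            Quote quote = new Quote();
            // Look up and invoke the setter by name at runtime.
            Method setter = Quote.class.getMethod("setPrice", double.class);
            setter.invoke(quote, p);
            return quote.getPrice();
        } catch (Exception e) {
            return Double.NaN;
        }
    }

    public static void main(String[] args) {
        System.out.println(roundTrip(101.25)); // 101.25
    }
}
```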

However, the Java language is only one small part about being a true Java developer and architect. There is an entire ecosystem that surrounds Java development. I got a list of tools and frameworks to learn from one of my friends, and there is a lot of stuff in there. This is a list of the basic tools that one needs:

  • JDK 1.6
  • Eclipse 3.6 (Java EE Edition)
  • JTA/JTS (transaction services)
  • Spring 3.0 or 3.1
  • Hibernate 3.6
  • Open JMS or ActiveMQ
  • JBoss, Jetty, or Apache Geronimo
  • Subversion
  • Jenkins or Hudson
For the past year, I have been using Eclipse for my C++ development, editing on Windows and compiling on Linux. Editing, building and debugging Java apps using Eclipse seems to be a breeze. Eclipse has a lot of the real-time error detection built into it that I am used to with Resharper in Visual Studio. The incremental compilation helps, and it takes almost no time to go from editing to running an application.

I have experimented with Spring in various forms over the years (mostly Spring.NET), in addition to writing my own dependency injection framework. Plus, my recent experiences with Prism have gotten me back into the world of IoC (although I am annoyed that Prism/UnityContainer does not support any kind of IDisposable pattern for unloading your modules when an app closes). I cannot integrate Spring 3.1 into my app just yet because of some documented logging dependency issues (QuickFIX/J uses SLF4J, which is not compatible with Spring).

(N)Hibernate is another ORM that I have looked at over the years.

As far as source control, I have always been a TFS/Perforce user. The past year and a half was spent in Clearcase, which seems a bit ancient, although it seems to work. Many people seem to prefer Subversion, so this is a good opportunity for me to learn it.

The same goes for Continuous Integration servers. I have been using Cruise Control .NET for all of my CI needs, and Hudson/Jenkins is something that I have heard mentioned for a while now.

The only piece of this stack that is totally alien to me is the Java Application Server. The whole concept of a Java Application Server is a bit new to me, as we really don't have an equivalent in the .NET world. I am not interested in the web server capability of these containers, so I need to find out the other capabilities that I can leverage. This morning, I am going to download JBoss as a first step.

(Update - It looks like you need to purchase a $99 subscription for JBoss. So, I will download Jetty, even though I am not sure that Jetty is a complete Java Application Server.)

(From a Stack Overflow post, here is a great summary of what a Java application server is:
A Java EE app server is a transaction monitor for distributed components. It provides a number of abstractions (e.g., naming, pooling, component lifecycle, persistence, messaging, etc.) to help accomplish this. Lots of these services are part of the Windows operating system. Java EE needs the abstraction because it's independent of the operating system.)

I will keep everyone posted on my progress.

©2011 Marc Adler - All Rights Reserved. All opinions here are personal, and have no relation to my employer.

Monday, August 15, 2011

First Steps in WPF and Prism

As I mentioned in my last posting, I want to spend a little time to better educate myself on WPF and Microsoft Prism. I see both of these being used more and more in Capital Markets applications, and I want to see if these two will truly allow me to build better applications faster.

I spent the last few days on a little sprint to come up with a prototype of a "Quote Workbench" that I could eventually expand into an algo trading platform or a backtesting platform. All of this code was written from scratch. My only tool was Visual Studio 2010. There are no third-party commercial controls involved, so that if you wanted to download this code and build it yourself, you would not have to purchase any additional software.

I ended up using two small DLLs that I found on CodeProject and CodePlex. One was a set of themes for the Microsoft WPF DataGrid, and the other was a color picker control that I needed to enhance in order to support WPF Databinding.

I am also curious to see if the Microsoft DataGrid can keep up with a high volume of real-time quotes. Most of the time, there is some sort of quote throttling that goes on before the quotes ever reach the grid. The general rule of thumb is that human eyes can only follow two updates per second. And, unless your GUI has some sort of CEP system or business rules embedded in it, you can usually get away with throttling quotes to the GUI. Nevertheless, I am curious to see if WPF-based grids can keep up with a high volume of quotes. From what I have heard, some of the commercial grids (DevExpress, Syncfusion, Infragistics, Xceed) are more performant than the out-of-the-box Microsoft DataGrid, but I do not have these commercial UI toolkits at my disposal right now.
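The throttling itself usually means conflation: between GUI repaints you keep only the latest quote per symbol, and a timer (say, two flushes per second) drains that cache into the grid. A bare-bones sketch of the latest-value cache (my own illustration, in Java; the C# version with a ConcurrentDictionary is nearly identical):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class QuoteConflater {
    // Latest-value cache: a fast producer overwrites the pending quote per
    // symbol; a slow consumer (the GUI refresh timer) drains it.
    private final Map<String, Double> pending = new ConcurrentHashMap<>();

    public void onQuote(String symbol, double price) {
        pending.put(symbol, price);   // conflate: later ticks replace earlier ones
    }

    // Called from the GUI refresh timer: snapshot and clear. (A production
    // version needs an atomic swap so ticks arriving mid-drain aren't lost.)
    public Map<String, Double> drain() {
        Map<String, Double> snapshot = new ConcurrentHashMap<>(pending);
        pending.keySet().removeAll(snapshot.keySet());
        return snapshot;
    }

    public static void main(String[] args) {
        QuoteConflater conflater = new QuoteConflater();
        conflater.onQuote("IBM", 180.10);
        conflater.onQuote("IBM", 180.25);  // overwrites the earlier tick
        System.out.println(conflater.drain()); // only the latest IBM quote survives
    }
}
```

However fast the feed runs, the grid only ever repaints one row per symbol per timer tick.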

From my recent experience, it seems that most developers are wrapping Winforms grids in a WindowsFormsHost control, and are not going to native WPF grids. If you have any opinions on the performance of Winforms vs WPF grids, please post a comment here.

The idea for this sprint was to write two applications, a quote server and a quote client. The quote server spits out two types of quotes, Level 1 and Level 2 equity quotes. The quote client consumes these quotes and displays the Level 1 quotes in a WPF DataGrid and displays the Level 2 quotes in a standard Depth-of-Book control.

Since the overall theme of this exercise was to learn as much WPF and Prism as possible in the shortest amount of time, I did not pay attention to the way that the GUI looks. Nor did I try to optimize the internal plumbing. A lot of the application was designed to explore as much of WPF as possible, especially aspects surrounding Data Binding. My goal for Prism was to have as much loose coupling as possible.

The first task was to write a Level 1 Equity Quote simulator. This is always a useful thing to have in your bag of tools. I preloaded a quote cache with a few symbols, and implemented a few strategies with which I could pump out quotes. Right now the strategies are limited to Round Robin and Random. A next step would be to download a list of the S&P 500 and corresponding Average Daily Volumes (ADV) for each instrument. A new strategy would be to generate quotes based on the weighted ADVs.

I also put GUIs in both the quote server and quote client so that I can see if there is any noticeable latency in delivering quotes from the server to a GUI. Of course, since I am delivering it between processes on my laptop, there shouldn't be any latency. But, if I decide to experiment with Comet and deliver the quotes across the Internet to a web-based GUI, then it's useful to have a monitoring GUI attached to the quote server.

The Level 1 quote simulator is fairly dumb right now. All it does is push out a lot of quotes at a user-specified rate. Additional improvements to the quote simulator would be:

  • Let the client subscribe and unsubscribe to specific quotes
    • Use the FIX QuoteRequest message for this
  • When no clients are interested in a symbol, automatically stop publishing
  • Use something like Google Finance or Yahoo Finance to seed the initial quotes with realistic values
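The first and third bullets above boil down to a subscription registry that the publisher consults before pumping a symbol. A minimal Python sketch (hypothetical names; a real version would need locking and session cleanup):

```python
from collections import defaultdict

class SubscriptionManager:
    """Tracks which clients want which symbols; the publisher calls
    is_active() and stops publishing a symbol nobody is watching."""

    def __init__(self):
        self.subscribers = defaultdict(set)   # symbol -> set of client ids

    def subscribe(self, client, symbol):
        self.subscribers[symbol].add(client)

    def unsubscribe(self, client, symbol):
        self.subscribers[symbol].discard(client)
        if not self.subscribers[symbol]:
            del self.subscribers[symbol]      # publisher goes quiet

    def is_active(self, symbol):
        return symbol in self.subscribers
```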

Once I got the quote simulator working, and the quotes updating the server-side DataGrid in real-time, I wanted to send the quotes from the server to the client. For that, I decided to publish the quotes using the FIX Protocol. I downloaded a copy of QuickFIX.Net, and created a Prism module that would take a stream of internally generated quote objects and push them across the wire to a FIX subscriber. Generally, you would not use FIX to push across a high volume of equity quotes. But, since this is a toolbox for experimentation, and since we want to stick with some free tools that implement a quasi-standard, let's just use FIX.
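For readers who have not seen FIX on the wire: it is tag=value pairs separated by the SOH character, with a computed BodyLength (tag 9) and CheckSum (tag 10). Here is a back-of-the-envelope Python sketch of how a minimal Quote message (MsgType 35=S) gets serialized; QuickFIX does all of this for you, and a real session message also carries comp IDs, sequence numbers, and timestamps, which I omit here:

```python
SOH = "\x01"

def fix_quote(symbol, bid, offer, quote_id="1"):
    """Serialize a stripped-down FIX 4.2 Quote (35=S) message.
    BodyLength (9) counts the bytes after its own field up to the
    CheckSum field; CheckSum (10) is the byte sum mod 256."""
    body = SOH.join([
        "35=S",                 # MsgType: Quote
        f"117={quote_id}",      # QuoteID
        f"55={symbol}",         # Symbol
        f"132={bid}",           # BidPx
        f"133={offer}",         # OfferPx
    ]) + SOH
    head = f"8=FIX.4.2{SOH}9={len(body)}{SOH}"
    checksum = sum((head + body).encode()) % 256
    return f"{head}{body}10={checksum:03d}{SOH}"
```

The verbosity of this encoding is exactly why you would not push a firehose of equity quotes over FIX in production.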

Note - The QuickFIX assembly is built against .NET 2.0, so we need to add useLegacyV2RuntimeActivationPolicy="true" to the app.config file of any application that uses QuickFIX.Net.

  <startup useLegacyV2RuntimeActivationPolicy="true">
    <supportedRuntime version="v4.0" sku=".NETFramework,Version=v4.0"/>
  </startup>

Now that I had Level 1 equity quotes flowing between the client and server, I wanted to start publishing Level 2 quotes. I wrote a simple Book Generator that, given a certain symbol, generates an entire new book at a specified interval. Of course, this is not how the real world works. Generally, when you receive a book, you receive the initial image of the book and then incremental updates. The job of the book builder module would be to build up a complete order book for all of the subscribed instruments, and ship out deltas to the GUI. The GUI would then need to insert new rows in the book, delete rows in the book, or modify rows in the book. I will leave this for a later exercise.
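The image-plus-deltas pattern is worth sketching, since it is how virtually every exchange feed works. Here is one side of a book in Python (illustrative names only), seeded from an initial image and then mutated by new/change/delete updates:

```python
class OrderBook:
    """One side of a depth-of-book, built from an initial image
    plus incremental updates (new / change / delete) - the way an
    exchange feed typically delivers Level 2 data."""

    def __init__(self, image=None):
        self.levels = dict(image or {})       # price -> size

    def apply(self, action, price, size=0):
        if action in ("new", "change"):
            self.levels[price] = size
        elif action == "delete":
            self.levels.pop(price, None)

    def bids_sorted(self):
        """Rows for the GUI: best (highest) bid first."""
        return sorted(self.levels.items(), reverse=True)
```

The GUI-side work is then a row-diff: each delta maps directly onto an insert, update, or removal in the grid.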

How can I transmit the book from the server to the GUI? I coded up a quick WCF PubSub mechanism using the NetTCP binding. The book was transmitted across the wire as a SOAP message. Not very good for performance, but OK for interoperability in case I want to write an iOS-based or web-based GUI.
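Stripped of the WCF plumbing, the pub-sub mechanism has a very simple shape: clients register a callback per topic, and the publisher fans each message out to the current subscribers. A toy in-process Python version (not WCF, just the mechanics):

```python
class PubSubHub:
    """Minimal topic-based pub-sub: clients register a callback per
    topic, publishers fan messages out to every current subscriber."""

    def __init__(self):
        self.topics = {}                      # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self.topics.setdefault(topic, []).append(callback)

    def publish(self, topic, message):
        for callback in self.topics.get(topic, []):
            callback(message)
```

In the WCF version, the "callback" is a duplex channel back to the client, and the topic string would carry the security identifier (see the improvement list below).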

Improvements for the Level 2 Simulator are similar to the improvements I need to make to the Level 1 simulator, and include:

  • Improve performance - use binary encoding where possible, and investigate WCF-specific improvements
    • Investigate UDP
  • When a client subscribes to a symbol, send the full image followed by incremental deltas, instead of republishing the entire book on every update.
  • The pub-sub "topics" should contain information about the specific security
  • Let the client subscribe and unsubscribe to specific quotes
  • When no clients are interested in a symbol, automatically stop publishing
  • Use something like Google Finance or Yahoo Finance to seed the initial quotes with realistic values
  • In the GUI, let the user switch between aggregated and non-aggregated views of the book. In an aggregated view, we aggregate the volume at each price point.
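The aggregated view in the last bullet is just a group-by on price. A one-function Python sketch of what the depth-of-book control would do when the user flips to aggregated mode:

```python
from collections import defaultdict

def aggregate_book(orders):
    """Collapse individual orders into one row per price point,
    summing the volume - the 'aggregated view' of a depth-of-book
    control. `orders` is a list of (price, size) tuples."""
    totals = defaultdict(int)
    for price, size in orders:
        totals[price] += size
    return sorted(totals.items(), reverse=True)   # best bid first
```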

A screenshot of version 0.0001 is included below. Just a reminder that the GUI was coded so that I could explore different aspects of WPF and Prism, and is intended to be a test bench for learning.

What's next?

Probably a very light exchange simulator. I don't have access to a high-quality commercial product like the Aegis Exchange Simulator, but several years ago, a reader of this blog sent me the binary of a small exchange simulator that he wrote.

To keep on the track of investigating new technology, I would like to explore Microsoft-specific frameworks like TPL and perhaps Rx. I need to go over Matt Davey's blog and get some ideas :-)

The depth-of-book view could use some new kind of visualization in aggregated mode.

There is some unfinished business that I have with Microsoft StreamInsight. I would like to provide a GUI that allows the user to create CEP queries, and have StreamInsight monitor the quote stream and provide alerting. I haven't looked at SI in two years, and I don't even know if I have access to it without an MSDN subscription. I can always use NEsper as a fallback (is Esper still around?)
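The kind of alerting query I have in mind is a time-windowed aggregate over the quote stream. A hand-rolled Python stand-in for what a CEP engine expresses declaratively (the threshold/window parameters and class name are hypothetical): alert when a symbol moves more than a threshold within a sliding time window.

```python
from collections import deque

class PriceMoveAlert:
    """Hand-rolled stand-in for a CEP time-window query: fire when
    max(price) - min(price) over the trailing `window` seconds
    exceeds `threshold`."""

    def __init__(self, threshold, window):
        self.threshold = threshold
        self.window = window
        self.ticks = deque()                  # (timestamp, price)

    def on_tick(self, ts, price):
        self.ticks.append((ts, price))
        # Evict ticks that have slid out of the window.
        while self.ticks and ts - self.ticks[0][0] > self.window:
            self.ticks.popleft()
        prices = [p for _, p in self.ticks]
        return max(prices) - min(prices) > self.threshold
```

A CEP engine gives you this in one line of query language, plus the hard parts: out-of-order events, query composition, and scale.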

©2011 Marc Adler - All Rights Reserved. All opinions here are personal, and have no relation to my employer.

Sunday, August 14, 2011

On WPF, Prism, Grids, and Wall Street

I am taking this time off to try to learn some technology that I have been planning on learning, but have not had much time to dive into. Two of these technologies are WPF and Prism. For the past few years, I have been managing teams that have been using WPF and Prism, but when I have had to write GUIs myself, I have fallen back to the old comfort zone of Winforms.

When I was Chief Architect for Equities at Citi, I was asked to make a call on the future GUI technology of the Citigroup Investment Bank. After a lot of thought, I recommended the use of WPF. The recommendation was certainly not made for performance reasons, and it was not made because of any extra features that WPF had over Winforms. It was made so that Citi would be able to retain its developers. As techies, we want to be involved in the latest and greatest of technologies. We don’t want to be perceived as dinosaurs when it comes time to change jobs. I have seen the effect on developers at companies that do not move to the latest technologies. Developers will naturally migrate to companies that will challenge them technically. So, in order to preserve the “forward moving outlook” that I was trying to promote in Equities, I chose WPF for the future direction. But I wasn’t convinced.

However, now I see that almost everyone in Capital Markets has moved to WPF. A lot of the time, you have a WPF shell that contains Winforms-based grids. WPF-based grids just can’t match the performance of Winforms grids. And, grids are everything.

I have never been totally comfortable with the thought of marrying WPF and trader workstations. Most trader workstations that I have seen are just a bunch of grids. Traders use the grids to monitor the market and put in the trades. Research and analytics are mostly done on Bloomberg. If you want to look at some charts, you will use your Bloomberg workstation.

The main difficulty in writing quote blotters is how to handle a large number of quotes. If you have a trader who is dealing with ETFs, the trader needs to be able to monitor the ETFs as well as the components in the underlying basket. This means that a quote blotter could potentially have hundreds of quotes in each tab, and several thousand quotes being monitored simultaneously. It would be interesting to see if a totally WPF-based workstation could be written that would keep up with this kind of flow.

In order to learn WPF and Prism, I have started to write a POC from scratch. I wrote a Level 1 quote simulator that can push quotes out at varying speeds, and I have the quote collection data-bound to a standard Microsoft WPF DataGrid control. Over a small range of quotes, the speed seems acceptable. I have not yet explored ways to speed up the grid and to handle a high volume of quotes. I will need to start using unbound columns, implement a ring buffer of last-value caches, use more intelligent updating, use virtual grids, and start examining the use of small objects and garbage collection. All of these improvements are standard when writing GUIs to process real-time data. But I want to dive deep into WPF before going horizontally and writing these optimizations.
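Of the optimizations listed above, the last-value cache with "intelligent updating" is the biggest win, so here is the shape of it in a short Python sketch (my own names; a production version would guard both methods with a lock, since the feed and GUI run on different threads):

```python
class LastValueCache:
    """Last-value cache with a dirty set: the feed thread overwrites
    values at full speed, and the GUI timer drains only the symbols
    that actually changed since the last repaint."""

    def __init__(self):
        self.values = {}
        self.dirty = set()

    def update(self, symbol, quote):          # called by the feed thread
        self.values[symbol] = quote
        self.dirty.add(symbol)

    def drain(self):                          # called by the GUI timer
        changed = {s: self.values[s] for s in self.dirty}
        self.dirty.clear()
        return changed
```

The grid then repaints only the dirty rows on each timer tick, which decouples the repaint rate from the quote rate entirely.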

Next up is to write a Level 2 quote simulator and to write a depth-of-book control.

Should Wall Street give up on WPF? No, not at all. I have seen some beautiful applications written totally in WPF. The old Lighthouse CEP system at Citi is one example of a beautiful and compelling GUI that is totally WPF-based. And enough challenges were overcome in writing that GUI that I feel the developers on the Lighthouse team are some of the very best WPF developers on Wall Street.

I would love to find out what everyone’s experiences with WPF have been in your work in Capital Markets. Please take the time to send some comments here.

©2011 Marc Adler - All Rights Reserved. All opinions here are personal, and have no relation to my employer.

Monday, August 08, 2011

New Blog - Bootstrapping a Low Latency Trading Firm

Is this the same blog that was created and quickly shut down last year?

©2011 Marc Adler - All Rights Reserved. All opinions here are personal, and have no relation to my employer.

Thursday, August 04, 2011

I am Free to Blog Again

Hello???? Is there anybody out there??? Does anyone still have my blog on their RSS reader?

Yesterday was my last day at Citadel. As everyone knows, when you join a secretive company such as Citadel, you are not free to blog and disclose any inkling of what you were working on. This is what I agreed to when joining Citadel. Now that these constraints have been removed, I will hopefully be resurrecting this blog and will start to talk about some topics which may be of interest to the readers here.

I am grateful to Citadel for the opportunities that they gave me, and as always, you come out of a situation knowing more than you came in with. I ran the desktop apps team there, and morphed into leading the effort to build a new web-based real-time multi-asset trading system. We did great work there, and came up with a beautiful system. Even though I ran the team and, as always, did the management-BA-architecture roles, I had to do a lot of the coding on the server side. After about 10 years, I got back into the C++/Boost/Linux world, and it was a lot of fun. Unix hasn't changed very much in 30 years. People still use vi, and the best way to debug is to use lots of logging statements. This gives me an even greater appreciation of the entire Microsoft Visual Studio development stack.

So, now I will embark on new adventures. It looks like the entire CEP industry disappeared while I've been gone, and people do not write too much about CEP anymore. My old CEP team at Citigroup seems to still be going strong, and keeps on delivering new functionality all the time. My friend HH did a great job in taking over the team and taking the architecture to the next level, pushing Coral8 into a background role and implementing a custom server in C# (remember that I theorized a long time ago that the best CEP engines were the bespoke ones, written for a custom purpose?).

I have been intrigued by the iPad. I was never an Apple person. I bought an iPad because we were strongly considering coming up with some products on a mobile platform, and in order to get in front of the technology, I borrowed my wife's MacBook Pro, installed Xcode, and started plugging away at Objective-C.

All of the banks seem to be rapidly moving to develop mobile platforms. Are they targeting just the iPad/iPhone, or are Android and WindowsPhone in their gunsights as well? Are they doing Objective C development for the iPad versions, or is everyone moving towards HTML5, using technologies like PhoneGap where necessary? Can the new iCloud compete in any way against Azure? Can iCloud be leveraged in trading apps (ie: can you use it for backtesting services?)

If anyone would like to contact me about opportunities, I can be reached at my Gmail address (magmasystems). Meanwhile, I am going to crack open some books, do a lot of puttering around with different technologies, and try to learn some of the things that I have been too busy to learn.

©2011 Marc Adler - All Rights Reserved. All opinions here are personal, and have no relation to my employer.

Thursday, February 03, 2011

A New Blog on the topic of HFT Development

(Thanks to JOS for the pointer)

©2011 Marc Adler - All Rights Reserved. All opinions here are personal, and have no relation to my employer.