Saturday, September 13, 2008

The London Stock Exchange Crash and .NET/Windows

I have been on the receiving end of some good-natured ribbing inside my company in the wake of the LSE crash and the suspected involvement of .NET and Windows. Our Complex Event Processing system, one of the most strategic projects in our company, runs entirely on Windows 2003/64-bit and .NET 3.5. It is one of the highest-profile uses of .NET at one of the largest financial institutions in the world.

Microsoft has been very public in touting the fact that the LSE is a heavy user of .NET. But, from what I can tell, there hasn't been a peep out of Microsoft in defense of its platforms in the wake of the LSE debacle. Speculation has pointed to problems with Cisco, problems with SQL Server 2000, problems with the code written by Accenture consultants, and problems with Sarah Palin.

I have been vocal on this blog in the past about the viability of Windows and .NET as a platform for real-time trading applications. I have never seen Microsoft come out with a definitive roadmap for using its platforms for real-time trading apps. No benchmarks against Realtime Linux. No attempts to throw OPRA feeds at a Microsoft server.

Sure, there is some good stuff going on at Microsoft with their grid efforts, but that work will benefit risk and analytics the most. Things like the Velocity object cache and SSDS are a lot of fun. Excel is everywhere. But the core operating system is Microsoft's bread and butter.

We need Microsoft to defend the LSE's use of Windows and .NET, and we need Microsoft to address the viability of its platform for low-latency, high-bandwidth trading apps.

At the STAC Council meeting in July, the companies that gave presentations about using their platforms for trading apps included IBM, Sun, and HP. Where was Microsoft? Why isn't it involved in the STAC Council?

©2008 Marc Adler - All Rights Reserved.
All opinions here are personal, and have no relation to my employer.


Anonymous said...

Mark, .NET is fighting an uphill battle when it comes to realtime apps. Interpreted code just cannot possibly compete with code compiled to work directly with Win32. The whole .NET subsystem has significant overhead in startup speed, deployment footprint, execution speed (although mitigated by precompilation), versioning, and other issues. The garbage collector cutting in just when the market has a big shift is a standard complaint which still holds water. .NET is basically Microsoft's answer to Java, because they missed the boat on that one. At the end of the day, Microsoft is full of highly skilled techies who like to get close to the metal, and Office and SQL Server are still written in C++, so they don't really believe in the whole .NET idea for their own core bread-and-butter apps.

Sure, there is a huge need for a common framework written by a proper software company. But banks will simply continue to reinvent wheels (badly) because they are just trying to satisfy the short-term expectations of business people, and they are not selling shrinkwrapped software to outside companies, which would require much higher software development and QA standards. The quality of programmers (and programming) is much lower in banks, in general, than in software houses, where the pay is maybe 40% less. That is a fact and will not change anytime soon.

IBM took a big hit trying to convert all their systems to run on Java, and Corel nearly went bust trying to rewrite their office suite for Java.

The best trading systems would be written in DOS, because it has minimal overhead and traders are just worried about numbers and can do without the GUI niceties. The closest thing to DOS is Linux, so that is one reason why it is still heavily in use. JMHO.

Let's all become Excel contractors... :)


Mark Palmer said...

Completely agree, Mark - at Apama, we formed a partnership with Microsoft's capital markets practice, and working with them revealed why they just can't get an enterprise mindset. Microsoft, as an organizational construct, simply doesn't think like an enterprise software company. The group we partnered with was great, but they were stranded on the enterprise software island by the corporate-desktop way of thinking.

Although they have a great base operating system and all the market presence you could want, they just don't do the "little things": sales, marketing, benchmarks, building the right partnerships, and focusing on enterprise software components like middleware (and CEP :)).

Although I think your plea is a good one, it seems to me that they should be coming to their own defense rather than relying on users to do it for them.

- Mark Palmer, StreamBase

Charles Young said...

A couple of comments. First, I really didn't understand Anonymous's comments about 'interpreted code'. .NET IL was never designed for interpretation, and I know of no mainstream interpreters for .NET code. The stuff that runs on your processor is compiled x86 or 64-bit code, and actually, the optimisation is pretty good, given that .NET can get a degree of synergy from different types of optimisation at both the IL and native-code levels. Interestingly, the pre-compilation Anonymous mentions (I assume he means pre-generation of native code rather than JIT compilation) is problematic in that the code cannot always be as well optimised as JITted code. Throughput performance of pre-compiled .NET code generally lags behind that of JIT-compiled code, although the gap has closed somewhat in .NET 2.0.

Anonymous is much more accurate in identifying non-deterministic garbage collection as a real problem for real-time processing. Absolutely. Technologies like .NET and Java are always going to face that challenge.
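The non-determinism is easy to observe empirically. Below is a small sketch in Java (standing in for C#, since the JVM and CLR share the same generational garbage-collection design) that allocates a steady stream of short-lived objects, the way a tick-processing loop would, and records the worst gap between loop iterations. On a stock collector that worst gap is typically orders of magnitude larger than the median iteration time, precisely because a collection can cut in at any point. The class name and iteration count here are illustrative, not taken from any system mentioned above.

```java
import java.lang.management.GarbageCollectorMXBean;
import java.lang.management.ManagementFactory;

public class GcJitterDemo {
    public static void main(String[] args) {
        long sink = 0;           // consumed below so the allocations are not optimised away
        long worstGapNanos = 0;
        long prev = System.nanoTime();
        // Allocate roughly 2 GB of short-lived objects to force collections,
        // tracking the longest gap between iterations as a proxy for pauses.
        for (int i = 0; i < 2_000_000; i++) {
            byte[] tick = new byte[1024]; // becomes garbage immediately
            sink += tick.length;
            long now = System.nanoTime();
            worstGapNanos = Math.max(worstGapNanos, now - prev);
            prev = now;
        }
        // Ask the runtime how many GC cycles actually ran during the loop.
        long collections = 0;
        for (GarbageCollectorMXBean gc : ManagementFactory.getGarbageCollectorMXBeans()) {
            collections += Math.max(0, gc.getCollectionCount());
        }
        System.out.println("GC cycles observed: " + collections);
        System.out.println("Worst iteration gap: " + (worstGapNanos / 1_000)
                + " microseconds (bytes allocated: " + sink + ")");
    }
}
```

The point is not the absolute numbers, which vary by machine and collector, but that the worst-case gap is unpredictable from the code alone; that is exactly the property a low-latency trading loop cannot tolerate.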

I do also agree that Windows does not seem like a natural fit for high-performance real-time processing. You want a very clean, minimal build with an absolute minimum of 'noise'. Windows earns money for Microsoft by being packed full of noisy features that make management and deployment easier (a controversial statement, I realise).

Finally, I also have a fair amount to do with certain groups within Microsoft, and my own view is that blanket statements that characterise this fairly large company never quite hit the mark. What is true for one product group, or even an entire division, is not necessarily true elsewhere. There is always a danger of living in ivory towers in a company like that, but a lot of time and effort is expended by some groups in ensuring that they remain as close as possible to the real world their customers inhabit. I'm not excusing the mistakes that are made, but I do think the real situation is a little more complex and nuanced. In my experience MS tries hard to evolve approaches which best meet the needs of their core customer base, even if that means taking a maverick approach compared with their competitors. They don’t always get it right, and sometimes the conclusions they come to are not ones I appreciate. But they do try.