Monday, 19 November 2012

A GOOD way to do Bring Your Own Device...

Continuing my series on Post-PC / Enterprise vs. Consumer IT, I want to look at the trend towards Bring Your Own Device. As a recap - previously I wrote about how Consumer and Enterprise IT have historically been very separate worlds in terms of both the type of product and how it's sold. I then talked about how this means that very few vendors have been able to bridge the gap between those worlds.

However as part of the shift away from traditional PC architectures towards a cloud/Post-PC world I believe these two different worlds are now starting to converge. This is going to have massive implications for how we consume IT and how tech companies make money out of it.

I now want to explore the two ways in which vendors are trying to bridge the gap - first, by running remote services on pure consumer devices via Bring Your Own Device; and second, by evolving new categories of dedicated crossover products - Post-PC devices.

Let's get going...


What is Bring Your Own Device?


Bring Your Own Device (BYOD) is the catchily titled idea of letting employees access corporate IT and services via their personal computers and smartphones - in other words, letting workers directly consume enterprise IT services like email on their own consumer devices.

This has received a lot of attention, but bear in mind this is nothing new. After all we've been logging into our office machines from home for years. Back in 2002 JPMorgan were even nice enough to pay for broadband for my entire house - something my three other housemates were enormously pleased with. What's different is that the rise of personal smartphones and tablets has put a BYOD-able general-purpose computing device within much easier reach of employees.


Positives:
BYOD makes this a thing of the past...


    • Better device: We get to use our personal iPhone for work email. Compared to a standard-issue Blackberry this has the advantage of being a) smaller and sleeker b) not making us look like a corporate dinosaur and c) being capable of running Angry Birds.
    • Less device congestion: This saves us having to lug around ridiculous combinations like personal smartphone + work mobile + work Blackberry. This also stops you looking like a complete mug at the airport x-ray machine (although as an alternative you could just get a SCOTTEVEST with a built-in iPad pocket...).
    • It saves your employer money: As you supply the device, your employer saves not only on the capex bill for the initial device, but also on the cost of ongoing hardware support for it (remember I said before that support and maintenance is where the big bucks are in enterprise IT). This is particularly attractive, of course, in the current belt-tightening environment - in essence, by having you supply the device they are getting you to pay them to do your job. :-x

    Negatives:
    • Security/lack of control: This is the biggest issue with BYOD - enterprises spend billions of dollars locking down their machines and ensuring they are secure. BYOD opens a whole can of worms in terms of added complexity and thus security risk. The answer is usually some sort of walled garden or sandbox which restricts what data resides on the consumer device.
    • Limited functionality versus native: The downside of a sandboxed environment is that it tends to limit functionality, either by denying apps direct access to the hardware (e.g. sensors, file systems) or by making software slower because of the additional computing overhead of the sandbox.
    • Consumer hardware constraints: One point I made in my first post is that enterprise hardware is built-to-last whilst consumer hardware is built-to-break. There are a number of trade-offs in terms of product reliability which are undesirable for an enterprise road-warrior. The most important one for me is limited battery life - even devices with best-in-class power efficiency will not last a whole day with heavy data usage. That is a major problem if you are on the road and your iPhone is your only link back to base.
    • Data/contract constraints: Consumer devices are also tied to personal consumer contracts - particularly for data. This presents issues around quality of service (not all users have the same operator/coverage or are permissioned to use all services) and cost (e.g. users are unlikely to use data when travelling abroad unless they get reimbursed for stiff roaming fees).
    • Platform risk: Another difference between enterprise and consumer IT I highlighted is the lack of a long-term roadmap in consumer IT. Consumer IT companies like Apple and Twitter can make significant changes to their product architecture with little or no warning. For an enterprise IT user used to having years of product roadmap this is unacceptable. Imagine Apple changes the API permissions on iOS and your sales guys can no longer access their CRM system. Even if a quick fix were implemented it could take months to test and roll out - during which time you couldn't perform basic business functions. Yeah...
    So far, so good. But how does this work in practice? As an example I want to look at Good Technology. Firstly because they are one of the most successful standalone BYOD vendors, and are doing some very interesting things around developing a platform (I LOVE platforms). Secondly because, despite their success in the banking industry, I've seen very little written about them in public.



    Good Technology - the lucky company



    One of Visto's earlier consumer-focused products; the tech equivalent of the crazy old grand-aunt you never talk about.
    Like most good tech stories, Good Technology's involves a good slice of luck (several, in fact).

    The company was founded as Visto in 1996 by a bunch of ex-Javasoft employees. For years it puttered along offering push e-mail, burning cash, and suing anyone who infringed its patents. Think Blackberry crossed with a patent troll, minus the hardware business. Historically its focus was on consumer email but, to be frank, the business was going nowhere fast.

    Then it got its first slice of luck. At the time it was suing Motorola for patent infringement (as it had done with RIM and Microsoft). And at the same time Motorola was also looking to sell non-core assets such as Good Technology, an enterprise-focused email provider it had bought for a cool $438m just a couple of years earlier. It looks like someone put two and two together, settled the litigation and threw Good Technology into the deal to boot. The deal was so good (no pun intended) that after closing the acquisition in early 2009 Visto renamed itself after the target.

    Acquiring Good gave Visto, which had always been consumer-focused, a much stronger position in the enterprise market. But then again that wasn't necessarily a great place to be - Blackberry obviously dominated enterprise email with its corporate-friendly NOC architecture, and mobile leader Nokia was hot on its heels after its 2006 acquisition of Intellisync.

    But then it got its second slice of luck - just as Good started to push (again, no pun intended!) into the enterprise market it found its biggest competitors, RIM and Nokia, being decimated by the onset of the iPhone. While they remained tied to their own inferior hardware, unable to offer credible enterprise email on the iPhone, Good could.

    So what does Good do?


    Good's BYOD armoury



    Good email on iPad. What the Playbook should have been...
    Good offers two key products. Its bread-and-butter is its push email solution, Good for Enterprise. Effectively this lets you access your corporate email, calendar and intranet web browsing in a completely secure, sandboxed app on your iOS, Android or Windows Phone 7 device. You simply download the app from the relevant App Store, log in with your PIN and off you go.

    This means that everything you can do on your Blackberry in terms of getting push email and operating within the corporate firewall can be done on your smartphone.

    The architecture isn't rocket science. Basically they have their own servers embedded on both sides of your firewall to provide a secure link for your data. They then forward the data over an encrypted mobile connection to the Good App, which runs in a secure container on your smartphone. Data stored on the smartphone is, of course, encrypted as well:


    It's pretty much the same as what Blackberry did all those years ago - just substitute "Blackberry NOC" for "Good NOC" and "Blackberry Enterprise Server" for "Good control/proxy server" and you could be in Waterloo, Ontario.
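
    To make that data path concrete, here is a minimal, purely illustrative sketch in Python - my own toy model, assuming nothing about Good's actual protocol or code. The idea is simply that the payload is encrypted behind the firewall, relayed through a NOC that never sees the plaintext, and only decrypted inside the sandboxed client app:

```python
# Toy model of the control-server / NOC / sandboxed-app relay pattern.
# This is NOT Good's implementation - just the general shape of it.
# pip install cryptography
from cryptography.fernet import Fernet

# A symmetric key shared between the control/proxy server (inside the
# corporate firewall) and the sandboxed client app - imagine it being
# provisioned when the user first activates the app with their PIN.
shared_key = Fernet.generate_key()

def proxy_server_send(message: bytes) -> bytes:
    """Runs inside the firewall: encrypt before anything leaves the building."""
    return Fernet(shared_key).encrypt(message)

def noc_relay(ciphertext: bytes) -> bytes:
    """The NOC in the middle just routes bytes - it holds no key."""
    return ciphertext

def sandboxed_app_receive(ciphertext: bytes) -> bytes:
    """Runs in the secure container on the phone: decrypt (and re-encrypt at rest)."""
    return Fernet(shared_key).decrypt(ciphertext)

email = b"Subject: Q3 numbers (internal only)"
assert sandboxed_app_receive(noc_relay(proxy_server_send(email))) == email
```

    The real thing obviously layers on device management, certificate exchange and remote wipe, but the principle - end-to-end encryption with the NOC acting as a dumb pipe - is the one Blackberry pioneered.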

    The difference is that while Blackberry's email services are only available on its hardware, Good will give you clients which run on Android, iOS and Windows Phone 7. This means as the end user you don't need to worry about ensuring compatibility/security with all these devices. Good will handle that for you.

    Good's second product is its more interesting one, an app platform called Good Dynamics launched just over a year ago. This takes the same infrastructure used for Good for Enterprise, but exposes it to third party apps. This means that an independent software vendor (ISV) can build an app which runs on an iPad but plugs directly into a customer's internal app servers via the same secure pathways which the Good email service uses (obviously Good will charge for this, most likely by taking a cut of the ISV's sales and then charging the customer a stiff premium for the control/proxy server).

    Much like the Bloomberg App Portal I discussed last week, this offers small software vendors a way to plug into heavy-iron enterprise IT systems in ways they wouldn't have been able to do on their own. In return Good receives both the direct financial benefits, and also the network effect of having its service (hopefully) become the go-to platform for enterprise mobile apps.

    Apps based on Good Dynamics are already up and running - check out this link for a list of ISVs or simply fire up iTunes and search for "Good Technology" in the app store. Here are a few examples:
    Roambi lets you take pointless charts on the road!

    • Box: Allows users to share files with each other. Also offers security & mgmt features such as remote-wiping of files.
    • Splashtop: Remote desktop client which lets you plug into your computer at work via your mobile.
    • Roambi: Analytics platform which lets you pull data from your business intelligence system back at base and display it on your iPhone. (NB - sounds a lot like the mobile analytics stuff SAP has been doing with the Sybase Unwired Platform)
    • Breezy: Cloud printing app which lets you print documents to any networked printer.

    You get the idea... Nothing earth-shaking at the moment but a lot of potential if they can get the platform right.



    How big is Good Technology?



    Good has had strong momentum with its core email/calendar/browser app over the last few years, particularly amongst banks. You can see this in the chart below which shows device activations split by industry. According to the company they now sell to 4,000 organisations worldwide, including over 50 of the Fortune 100 US companies.

    Of course all such statistics should be taken with a pinch of salt but I've heard first hand of a number of big Wall Street banks who are users - these guys have clearly got over the hump of credentialising themselves with corporates.
    Source: Good Technology Device Activations Report, Q2 2012









    They also seem to have a jump on the competition at the moment - as I said I've heard of several banks using Good but haven't heard of anyone using an alternative solution. At the moment they seem to be in the sweet spot of being bigger than the other fast-moving startups (e.g. Enterproid's Divide platform or Android specialist 3LM), and faster-moving than the bigger established vendors (e.g. SAP's Unwired platform - a clear competitor for Good Dynamics - or Citrix).

    Note: Numbers in blue are hard data from annual reports.
    But how big are they? Unfortunately financial data is hard to come by, but it's clear from what's out there that they are a force to be reckoned with.

    Let's start with Motorola. Remember Motorola paid a (not inconsiderable) $400m+ for the business in 2007. Not a bad starting point. Then two years later they sold it on to Visto. They didn't disclose how much Visto paid them, but you can do some interesting detective work around Motorola's 2009 annual report.

    Footnote 2 on page 87 discloses that MOT recorded a gain of $175m on disposals in that year. In that period they sold two assets - Good to Visto and their Printrak biometrics business to Safran. Safran's 2009 annual report says they paid $181m for this (p82). We know that at the end of 2008 Good Technology was carried at a book value of roughly $300m (they paid $438m and took a write-down of $123m on the asset in 2008). So even if we assume Printrak had been held on the balance sheet at zero cost, the gain on disposal implies that Visto paid roughly $300m for Good.
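
    If you want the arithmetic spelled out, here is the back-of-envelope version in a few lines of Python (same figures as above, rounded; treat it as a sanity check rather than gospel):

```python
# Back-of-envelope maths for the implied price Visto paid for Good.
total_gain_on_disposals = 175        # $m, Motorola 2009 annual report, footnote 2, p87
printrak_sale_price     = 181        # $m, per Safran's 2009 annual report, p82
good_book_value         = 438 - 123  # $m: 2007 purchase price less the 2008 write-down

# Worst case for Good: assume Printrak sat on the balance sheet at zero cost,
# so the whole $181m counts as gain, leaving Good to account for the rest.
implied_good_gain  = total_gain_on_disposals - printrak_sale_price   # about -$6m
implied_good_price = good_book_value + implied_good_gain             # about $309m

print(f"Implied price Visto paid for Good: ~${implied_good_price}m")
```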

    PS Yeah I'm showing off my forensic accounting skills here. And yeah I'm probably completely wrong due to exogenous factors (patent litigation settlements?) :-p

    Note: Numbers in blue are hard data from annual reports.
    NB the Motorola report also discloses $19m of revenues from the two disposals in Q1 (Good was sold end-Feb, Printrak sold end-Apr). Safran reported €32m ($44.5m) of revenues for the remaining 9 months of Printrak in 2009, which implies a $14.8m run-rate for the Q1 reported by Motorola. That leaves roughly $4m of revenues for Good in Jan-Feb, pointing to a ~$25m annualised run-rate. Note of course this was before the whole iPhone/BYOD schtick took off, so I wouldn't say that's representative of the current business. Plus there's also the revenues Visto had before it bought Good - consumer push email (probably negligible) and patent royalties (likely more significant, and very high margin).
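
    Again, the same logic in a couple of lines (all approximate, using the disclosed figures above):

```python
# Rough revenue run-rate implied for Good just before the Visto deal closed.
q1_revenue_both_units = 19.0            # $m disclosed by Motorola for Q1 2009
printrak_9m_revenue   = 44.5            # $m (EUR 32m) from Safran's 2009 report
printrak_q_run_rate   = printrak_9m_revenue / 3          # ~$14.8m per quarter

good_jan_feb_revenue  = q1_revenue_both_units - printrak_q_run_rate   # ~$4m over two months
good_annual_run_rate  = good_jan_feb_revenue / 2 * 12                 # ~$25m a year

print(f"Implied Good run-rate at the time of the deal: ~${good_annual_run_rate:.0f}m a year")
```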

    So how big is Good Technology now? To be honest I haven't a clue. But if I had to hazard a guess I would say somewhere between $100m and $1bn of revenues (probably the lower half of that range). Pass me a pin please, I think there's a donkey I need to affix a tail to... :-x



    Larry Ellison's next acquisition?



    I think the broader point to make is this: Good Technology is not some fly-by-night startup. It's got a set of assets which people have been prepared to pay triple-digit millions for in the past, and since then it has taken significant share in an explosive market. I would be very surprised if this was not a hot IPO candidate in the next couple of years.

    What's much more likely though is that they will be bought out by a larger enterprise software vendor. It's just a matter of whether this happens before or after the IPO (my bet is before - in this fast-moving market no-one can afford to wait). Good Technology is owned by VCs - they will sell for the right price.

    There's already been scuttlebutt about McAfee kicking the tires (dumb idea in my view as that would tie them into Intel at a time when the core smartphone market is going all ARM). The more likely acquirer is one of SAP, Oracle or IBM - i.e. a big-iron enterprise software vendor which doesn't own its own smartphone ecosystem (i.e. not MSFT).  My strongest hunch is that they end up going to Oracle - Larry Ellison has an obvious gap in his armoury on the mobile front, particularly vis-a-vis SAP which made a smart move buying Sybase's assets. Something like the Good Dynamics platform would plug that very nicely.


    Why I don't think BYOD is the answer


    Okay, after that brief jaunt into the BYOD-osphere, back to the program. I'll be frank - I do not think that BYOD is the answer. I think at best it's an interim solution while tech moves towards a genuine converged device (the Post-PC device) which runs cloud-based consumer and enterprise apps side by side.

    There are big issues I have with BYOD which I'm not sure even Good has grappled with.

    The first is that architecturally speaking it's a kludge, a piece of middleware which sticky-tapes a window through the enterprise firewall onto your consumer device. It's a short-term patch rather than a long-term answer. What you basically have is the same old enterprise spaghetti-ware back at base, except this time you've layered a few hundred more not-quite-thin client apps on the smartphone side to crunch the data Good has fetched for them.

    I think that's a compromise which reflects the fact that mobile networks are (still) not fast enough and ubiquitous enough to run genuine cloud apps straight into a secure mobile browser. But as networks continue their exponential improvement (hey we've just gotten LTE in London!) you will see the need for this patchwork solution fall away.

    The second is that there is a great deal of platform risk associated with Good's efforts. Take the Good Dynamics platform for example. It reminds me a lot of what Facebook is trying to do to encourage developers to build "social apps" which run on iOS but share data via Facebook. The problem with this model is that if Apple decides that Facebook's or Good's platforms are freeloading on (or competing with) its own APIs, all it has to do is change the terms of service and ban those platforms. That's bad news for Facebook and Good, but it's horrific news if you're an enterprise which is accessing critical IT via the Good infrastructure.

    That is a risk no sane CIO will want to take.

    Okay that's all for today (it's 4pm and all I've had to eat today is a boiled egg). Next up I want to expand on why a genuine Post-PC device is the solution to the Consumer / Enterprise divide. A bientot!

    Friday, 16 November 2012

    The Hacker Way (More Cookbooks Than Sense!)


    A fun bagatelle for a Friday. On my off days I run a Cookbook blog. Not a vast amount to do with tech but occasionally, just occasionally, the two worlds collide. So I thought I'd cross-post the review I put up today of Jeff Potter's fantastic Cooking for Geeks: Real Science, Great Hacks, and Good Food.

    Here's to the Hacker Way!


    One thing to Like about Facebook


    Down to his last 11 bil...
    For those of you who follow the stock markets, Facebook's recent IPO hasn't been its finest moment. Since debuting at a heady valuation of $104bn, the company is now worth 42% less than it was in May. I understand poor Mark Zuckerberg is now down to his last $11bn. Lawsuits are already flying.

    But if one good thing did come out of the whole mess it was Zuckerberg's Founder's Letter, which was included in the initial IPO filing. In it he laid out his vision for Facebook and the culture which brings it about. Just as Google used its 2004 Founder's Letter to set out the mantra "Don't be Evil", Zuckerberg believes in "The Hacker Way".

    Setting aside your view on whether Zuck is a privacy-snatching scumbag (I tend to sit in the "yes" camp), there's much to admire here. As he says:
    The word “hacker” has an unfairly negative connotation from being portrayed in the media as people who break into computers. In reality, hacking just means building something quickly or testing the boundaries of what can be done. Like most things, it can be used for good or bad, but the vast majority of hackers I’ve met tend to be idealistic people who want to have a positive impact on the world. 
    The Hacker Way is an approach to building that involves continuous improvement and iteration. Hackers believe that something can always be better, and that nothing is ever complete. They just have to go fix it — often in the face of people who say it’s impossible or are content with the status quo.
    There's more, but in a nutshell the Hacker Way is a questioning, meritocratic, "can-do" attitude which is always trying to push the boundaries. Which believes "Done is better than perfect", and that "something can always be better, and that nothing is ever complete".

    So what's this got to do with cookbooks?

    Well the answer is Jeff Potter's slug of culinary hacksomeness: Cooking for Geeks: Real Science, Great Hacks, and Good Food.



    Don't ask what. Ask how and why


    This isn't your usual celebrity cookbook. Jeff Potter doesn't have a posh restaurant or a Michelin-starred diffusion chain (although he did get a TV gig on the back of this book). He's an old-fashioned IT geek and food nerd who decided one day to write a book.


    And he didn't go through your usual publisher either. Rather than take his book to a culinary powerhouse like Artisan, Quadrille or [a curse on all their houses] Phaidon, Potter went to O'Reilly Media. This is a specialist tech publisher best known for publishing haute-geekologie texts like The Cathedral and the Bazaar (the canonical gospel of the open-source movement) or gripping blockbusters like Learning PHP, MySQL, JavaScript, and CSS, 2nd Edition (and yeah I'm sure if I read it this blog would look a lot better!).

    And finally he doesn't think like your usual cookbook writer either. As he says:
    At our core, though, all of us geeks still share that same inner curiosity about the hows and whys with the pocket-protractor crowd of yesteryear. This is where so many cookbooks fail us. Traditional cookbooks are all about the what, giving steps and quantities but offering little in the way of engineering-style guidance or ways of helping us think.
    What you get is a book which doesn't just follow the recipe, but wants to understand where the recipe came from, why it works and how it can be improved. That is the Hacker Way.

    Thinking not only about what works together, but why it works together
    How does he do it? The book is laid out in three main sections. The first section deals with the stuff you should know before you turn on the oven: What sort of cook are you? What is your basic kitchen setup? How does the physiology (and psychology) of taste work and why do flavour combinations come together? This is probably the weakest section of the book, but a necessary evil.

    The second section is where he really gets going, analysing the key Variables which affect cooking - time, temperature and air (many chefbooks are full of hot air, but this is the first one which devotes a whole chapter to it...).

    But it's in the final two chapters where Potter really kills it, as he addresses the more, er, "creative" things you can do in the kitchen. He splits this into two chapters - one on chemicals ("software", as he calls it), and one on equipment and gadgets ("hardware"). This contains the stuff most recognisable from the Heston/El Bulli/Noma world of molecular gastronomy. With a vengeance.

    But that's not all. Potter also gives dozens of recipes to demonstrate the principles. Note this isn't primarily a cookbook - the recipes standalone are distinctly uncheffy (although I am quite taken with the Calamari Crackling on p202). But what they do is practice what he preaches, by introducing startling new angles on old favourites. A chocolate cake is microwaved in 30 seconds flat. Duck confit is made without any duck fat. A Tiramisu recipe is repurposed as an engineering time/activity chart (via Cooking for Engineers)...

    A new way to Tiramisu...

    But that's not all. The text is also broken up by over twenty interviews giving expert insight on a variety of topics. Food science demi-god Harold McGee opines on Solving Food Mysteries. Le Bernardin patissier Michael Laiskonis chips in on Pastry Chefs. And don't miss Jeff Varasano's eye-popping digression on Pizza (if you haven't heard of him before, this is a man whose iconic pizza recipe runs to over fifteen thousand words). So as well as Mr Potter's wisdom you basically get a culinary boot camp thrown in for free.


    Great hacks


    But it's the hacker mentality that's at the heart of this. And this is a book full of great hacks.

    Hacking is a mindset more than anything else. As Zuckerberg said, it's the result of combining constant questioning with continuous iterative improvement. Potter also throws in the idea of breaking "functional fixedness" - mentally restructuring your world so you can use your tools in ways their designer never dreamed of.

    This can be something as simple as slapping a few rubber bands on each end of a rolling pin to allow you to roll a pizza dough out to a uniform thickness, or roasting peppers in a toaster. Or it can be as wild as clipping the lock off your oven and short-circuiting the electronics so you can use its 800c cleaning cycle to bake pizza (it worked, but Potter had to upgrade the oven door to missile-grade PyroCeram glass to keep the heat in).

    This book is full of great hacks. If you don't feel like overclocking your oven, he explains how to make a Lego Ice Cream Maker. Or if a $450 Sous Vide Supreme is out of your price range he gives step-by-step instructions about how to lobotomise a slow-cooker with a thermocouple to create your own home-made sous-vide rig.

    Julia Child eat your heart out.
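
    For the terminally curious, the brains of that DIY sous-vide rig boil down to a simple thermostat loop. Here is a toy simulation of the control logic in Python - my own illustration of the concept, not Potter's build, which is real hardware rather than code:

```python
# Toy sous-vide controller: hold a water bath at a target temperature by
# switching the heating element on and off (bang-bang control with hysteresis).
TARGET_C, HYSTERESIS = 60.0, 0.5    # medium-rare territory, +/- half a degree

def simulate(minutes=120, start_temp=20.0):
    temp, heating = start_temp, False
    for minute in range(minutes):
        # Crude thermal model: the element adds heat, the bath leaks a little to the room.
        temp += (1.0 if heating else 0.0) - 0.02 * (temp - 20.0)
        if temp < TARGET_C - HYSTERESIS:
            heating = True            # too cold: relay on
        elif temp > TARGET_C + HYSTERESIS:
            heating = False           # too hot: relay off
        if minute % 10 == 0:
            print(f"t={minute:3d}min  temp={temp:5.1f}C  heater={'ON' if heating else 'off'}")

simulate()
```

    Swap the simulated bath for a real thermocouple reading and a relay on the slow-cooker's plug, and you have, roughly, the guts of the hack.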

    But what's also refreshing is that the hacks aren't just there for shock value. There are also simple things. For example Potter shows you how to calibrate your oven with a bowl of sugar (sugar melts at 177c, giving a precise reference point for oven temp). He outlines how to mill your own flour. And he sagely points out that the most overlooked but useful thermometer in the kitchen... is nothing more complicated than your hand.


    Real science


    The hacks go hand in hand with exposition. Everything Potter does is underpinned by hardcore food science (I'd expect nothing less from an engineer and a nerd). And this is a great book on food science.

    The middle section, on Variables, gives one of the clearest explanations I've seen of how temperature affects food. And more importantly, it isn't only how hot the food gets, but how long it stays hot. The idea of a time-temperature curve, and how it affects different cooking methods, is beautifully laid out in Chapter 4:



    And he doesn't shy away from the nasty stuff. There's a whole section on foodborne illness for example, helpfully split out into sections on "Bacteria" and "Parasites". He gives great advice on how to avoid Bacillus Cereus and tapeworm, although unfortunately to nail both of them you need to both freeze your food and heat it above 60c. Tricky.

    (And while not quite food science, he also throws in a brilliant game-theory inspired cake cutting algorithm to make sure no-one complains about getting short changed.)

    The highlight of the book though is the last two chapters. As I mentioned already, Chapter 6 deals with "software" (chemicals and additives) and Chapter 7 with "hardware" (food gadgets FTW).

    The section on food chemistry goes through all the usual suspects you've seen popping up on a Heston Blumenthal TV show, with clear explanations and practical examples. Potter is careful to put everything into a clear context.

    Take colloids for example. While they may sound like a species of alien parasite, in fact they are simply a mixture of any two substances - gas, liquid, or solid - uniformly dispersed in each other but not dissolved together. Basically a suspension of A in B, or as he helpfully summarises:

    Attack of the Killer Colloids...
    There. Now you know. Chocolate is a Colloid.

    If you don't know your Methylcellulose from your Maltodextrin then this is the place to come (Methylcellulose melts as it cools. Maltodextrin melts in the mouth). But what's also great is that Potter doesn't get carried away with his rocket science. He makes the very helpful point that using chemicals in food is nothing new, and backs it up by showing how salt, sugar, acids and alcohol are equally important in food science (Bacon-Infused Bourbon anyone?).

    What a shockingly good recipe!
    The last chapter on Hardware is the one with the really fun hacks - the overclocked pizza oven and DIY sous-vide machine both feature here. But there is also a comprehensive twenty-page teach-in on the techniques behind sous-vide cooking, ranging from "standards" like 48-hour low-temp beef brisket to cute applications I haven't seen elsewhere, like using sous-vide to temper chocolate (one of the trickiest jobs in the pastry kitchen). Plus there's a bit of stuff on rotary evaporators, foam guns and anti-griddles, but I guess that's pretty much par for the course.



    Better than Modernist Cuisine?


    Of course when you have any book which deals with food science the elephant in the room is Nathan Myhrvold's five-volume, 24 kilogram opus, Modernist Cuisine (the only item in my collection that works better as a bedside table than a cookbook). While Cooking for Geeks covers much of the same ground, at 412 pages versus 2,438 for MC it's hopelessly outgunned in terms of depth.

    But the funny thing is I think that Cooking for Geeks is actually the better book on food science.

    You see it's the Hacker Way in action. Modernist Cuisine represents Myhrvold's set-piece assault on the subject, where he brute-forces the problem with sheer weight of resources. To write his book he set up a fully staffed lab, including a hundred-ton hydraulic press, a rotary evaporator and an ultrasonic welder.

    In contrast Jeff Potter had two feet of counter space plus a 2" x 4" board hanging across the sink. So rather than throwing money at the problem he falls back on his wits and his hacks. It reminds me of the (apocryphal, alas) story about NASA spending millions of dollars designing the absolute perfect Space Pen, and the Russians just using a pencil.

    Reading them both, I actually find Cooking for Geeks gives a simpler, clearer and above all more fun explanation of what makes cooking tick. Our Nathan may have billions of dollars, dozens of experts and an autoclave, but Jeff has "Real Science, Great Hacks and Good Food".

    I know which one I'd rather have...


    Afterword: Strategy consultants could, I imagine, have all sorts of fun with this analogy of plucky agile newcomer vs. lumbering giant. I'm definitely on the side of the underdog here, not least because in his day job Nathan Myhrvold doubles as CEO of Intellectual Ventures, one of the more egregious patent trolls currently blighting the tech world. Grrr don't get me started... :-)

    Tuesday, 13 November 2012

    Bloomberg takes its next step into the cloud

    Back on the Bloomberg Beat


    Regular readers will know that I blogged a lot in September about Bloomberg. In a nutshell it's the $45bn cloud computing giant which everyone forgets about, and one that's increasingly competing against its biggest customers - the investment banks. If you missed it I've compiled the full series into a free 20-page report available to download here.

    A key part of my thesis was that Bloomberg are one of the underappreciated pioneers of cloud computing. They were doing Software-as-a-Service for some of the world's most demanding enterprise customers decades before the Cloud existed. What is even more interesting is that they are now turning into a credible Platform-as-a-Service (PaaS) by opening up their API (BLAPI). This is big news, as becoming the industry-standard PaaS is pretty much the holy grail for any cloud computing vendor.

    And yesterday they took the inevitable next step by announcing their version of the iTunes App Store, the Bloomberg App Portal:



    Bloomberg opens the door to third party apps...


    This portal (accessed by the snappy Bloomberg function APPS) lets users purchase and install third party apps, running on top of the Bloomberg platform. The developer is free to set the price (obviously this is paid on top of your $20k Bloomberg terminal fee), of which Bloomberg takes a 30% cut.

    At the moment Bloomberg says there are 45 apps available, expected to ramp up to 50 by the end of the year. They say they are working with 300 app developers. At this stage app curation sounds quite heavy-handed (again not surprising given this is a v1.0 product), with Bloomberg working directly with the developers and vetting apps to make sure they do not duplicate existing Bloomberg functionality.

    The initial selection of apps is relatively narrow. It focuses largely on analytics apps for traders (the sort of things which allow you to draw funky trend lines on your stock charts, albeit with a dash of PhD level maths thrown in). These are relatively static "dumb" apps. Bloomberg say they are deliberately shying away from more complex apps involving live trading execution until the platform is more mature (sounds sensible). However they do say apps will be able to access data integration and collaboration functions within the Bloomberg platform.

    To get a flavour, here are some apps I uncovered after a quick scout round the web. As I said, the majority of these seem to be technical analysis tools aimed specifically at traders:

    The Market Map visualisation app
    • Market Map: A market visualisation app which lets you view stocks/markets in the form of heat maps.
    • Stealth Analytics: Software which drills down into current trading of a given security such as the bid-ask flow rate to give a real-time view of market sentiment.
    • ExtremeHurst: A technical analysis app for traders.
    • Kase Statware: Gives trading indicators to allow traders to time entry and exit points into stocks.
    • DeMark Indicators: Technical indicators for traders ($500 /month).
    • Updata: More technical indicators for traders.
    • Forecastis: Trend prediction for traders.
    • Cycleintelligence: Quant-based forecasting system.


    Why this matters


    At first sight little about this seems groundbreaking.

    Most of the apps seem to be repackaged versions of apps previously available on a standalone basis, rather than new offerings. As such it is unclear how much these are genuinely operating on Bloomberg's platform. My suspicion is that the majority of them will be downloaded and running locally, but making data calls to Bloomberg through the API. In essence these programs are mainly using the App Portal as a glorified click-to-download button and billing system.
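
    To illustrate what I mean, here is a purely hypothetical sketch of that pattern - a made-up stand-in class rather than the real BLAPI, since the point is the shape of the app, not the actual API:

```python
# Hypothetical sketch of the "runs locally, calls the terminal for data" pattern.
# TerminalSession is invented for illustration - it is NOT Bloomberg's API.
class TerminalSession:
    """Stand-in for the terminal connection: data, entitlements and billing
    come from the platform; the analytics stay in the vendor's local code."""
    def history(self, ticker: str, field: str) -> list[float]:
        return [100.0 + 0.1 * i for i in range(250)]   # canned prices for illustration

def moving_average(prices: list[float], window: int = 20) -> list[float]:
    # The vendor's "secret sauce" runs locally on the user's machine.
    return [sum(prices[i - window:i]) / window for i in range(window, len(prices) + 1)]

signal = moving_average(TerminalSession().history("VOD LN Equity", "PX_LAST"))
print(f"Latest 20-day average: {signal[-1]:.2f}")
```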

    However for a smaller vendor (and most of these seem to come from small single-solution companies) the distribution platform alone is a massive step forward. A big hurdle for any startup software provider is credentialising yourself with gorilla-like enterprise buyers. As the CEO of Updata says "One of the massive issues for us as a company is deployment. When you're dealing with a lot of large institutions, as a smaller company, getting your application approved and into the operation is quite a big issue, so that was really key for us".

    In theory a Bloomberg PaaS platform offers the seller that credentialisation, and the buyer the promise that it will work seamlessly with the existing infrastructure. If I had a bright idea for a killer trading software startup, I would be beating a path to Mr Bloomberg's door to sign a distribution deal.

    Also, if Bloomberg has got their PaaS architecture right, then it offers scope for developers to do really interesting things if they can integrate with the data, security and customer credentialisation that Bloomberg offers at the back end. This, hopefully, lies ahead of us.

    Finally for Bloomberg if they can get a flourishing ecosystem to work this is a massive win for them, not only against competitors such as Reuters and Markit, but also in terms of embedding themselves more deeply into their customers' business processes. As I wrote previously, Bloomberg's biggest challenge is that the majority of their customers only use basic functionality (news, quotes, charts) which makes a Bloomberg terminal an expensive luxury rather than an essential tool (bad place to be in a downturn). Adding a pack of hired guns via the App Portal helps shift it towards the "essential" category.

    If Bloomberg are serious about being a cloud leader, the App Portal was a step they had to take.


    Reality check - what happened to the SAP Store?


    Of course I'm not going to get too starry eyed about this. The portal is in way-early stages and much of the content could (uncharitably) be called "reheated Excel plugins". Also technical detail on the platform remains scant - I suspect because Bloomberg themselves haven't figured out where they are taking it yet.

    The SAP Store: Not setting the download charts on fire.
    I'm also aware that it's easy to open an App Store but hard to fill it. Just look at SAP, which launched its SAP Store offering apps running on its Business ByDesign cloud platform a good year or so ago. At launch: a bunch of sexy looking apps. Since then: deafening silence (such that even bloggers on SAP's own website are making rude comments about it). There is clearly a chance that the Bloomberg App Portal ends up in the same place twelve months from now.

    I'd like to think that won't happen, for three reasons.

    Firstly the terminal platform which the Bloomberg App Portal runs on is Bloomberg's key business. This contrasts with Business ByDesign, which is a relatively new product that only accounts for a tiny part of SAP's revenues.

    Secondly the functions Bloomberg are offering (at least at this stage) are relatively simple ones which customers are already using, whereas SAP was pitching new categories of apps such as cloud ERP plugins and mobile analytics.

    Lastly the trump card is Bloomberg's relatively Darwinian internal culture, where product managers compete (often against each other) to champion their respective offerings. It is interesting to note that the point man for the Bloomberg App Portal (which I would have assumed should be a relatively major corporate initiative) is actually a thirtysomething ex-McKinseyite (and purveyor of soccer gear) who's only been with the firm for two years. The impression I get is he's been handed this baby and told that he has to either make it work, or look for new work.

    In this sort of culture I suspect that this product will either succeed big time, or cease to exist.

    Good luck mate!

    Sunday, 11 November 2012

    A History Lesson: How enterprise firms get consumer tech wrong (and vice versa)

    A continuation of my thoughts on Enterprise vs. Consumer IT. After sketching out the theoretical framework in my first post, a few more practical examples of how - and why - companies fail at bridging the divide.


    Recap - A Tale of Two Cities - Enterprise and Consumer


    During my undergraduate studies I once went to a lecture. This was an unusual occurrence because, as I was doing a Modern History degree, lectures were strictly optional.

    This lecture was on St Augustine - I presumed it was Augustine of Canterbury, early English missionary and highly relevant to my paper on Anglo-Saxon history. The amusing thing was that it actually turned out to be about Augustine of Hippo, guilt-tripped North African theologian and originator of the idea of Two Cities (subsequently swiped by Charles Dickens).

    And it's an idea which sort of sums up my previous post on Enterprise vs. Consumer IT. Like Augustine's Cities of Man and God, there are two parallel but completely separate worlds of Enterprise and Consumer IT. While they are raised on the same technological foundations, the products they sell, who sells them and how they sell them are utterly different. Here's a summary of what I wrote:



    How (not) to sell to both enterprise and consumer - a brief history lesson


    Another point I raised in my previous post is that very few IT companies are successful at selling to both halves of the market. You sort of have this corporate groundhog day where a company does very well in one segment, gets delusions of grandeur and thinks they can shift their bulk to take on the other side.

    To flesh out that point, I thought I'd (finally) make use of my undergraduate studies, and give a brief history lesson (see, those three years weren't entirely wasted after all!):

    • IBM tries to crash the consumer market with the PC Jr: In 1984, having built up a handy lead selling PCs to business, Big Blue turned its beady eye on the consumer market with a stripped-down PC called the PC Jr. It was one of Big Blue's more memorable flops - overpriced, underpowered and rather shown up by Apple's sexier-looking Mac. And just to show that they hadn't learned their lesson, IBM tried again six years later with the PS/1, an ugly duckling of a desktop hobbled by a 16-bit 286 processor. One good thing did come out of the episode though - the original King's Quest game was funded by IBM to show off the PC Jr, the first of many excellent games to come from the pen of Ken and Roberta Williams.
    The PC Jr, and King's Quest (the only good thing to come out of the whole mess!)


    • Amstrad fumbles the enterprise: In the UK Amstrad tried to do the opposite. Having built up a handy business selling home computers and bargain-basement PCs for the masses, it then tried its hand at genuine enterprise-class PCs. The result was the PC2000 series, launched in 1988 against IBM's sexy new PS/2 architecture. Unfortunately while CEO Alan Sugar's ambitions were enterprise-class, his quality control wasn't. Numerous hardware failures - notably a spate of dodgy Seagate hard drives - did it in for him and his PCs. Makes you wonder how exactly Sugar (now Lord Sugar) got his reputation as a "business guru"!
    Amstrad's 2000 series - quite good if you don't need a hard disk...

    • Microsoft's game of two halves: For the first two decades of its history Microsoft had very little to do with the consumer. With the exception of the excellent Flight Simulator series (RIP), I struggle to think of a decent consumer highlight. The lowlight would obviously be the MSN walled garden - their Canute-like attempt to pretend the Internet didn't exist. However since then their fortunes have turned around with the success of XBox and XBox Live - although I note that after ten years this business still remains tiny when set alongside their enterprise offerings. And I shan't mention the Zune. Much.
    Look, could you stop picking on the Zune. The hardware wasn't actually that bad (but shame about the rest...).

    • RIM's successes (and failures): Blackberry are probably the most successful crossover product to date (but see what's happened since!). They took an original enterprise hit product and used it to ride the initial stages of the smartphone boom. They then extended their consumer lead with their BBM product. However since then consumer success has proved transitory, even though the enterprise side remains resilient (so far).
    Open the door... Get to the floor... Everybody walk the dinosaur!


    • Google vs. Office: Google have obviously been a consumer smash, pretty much immolating the world of online advertising and listings. However their enterprise offerings (does anyone remember the Google Search Appliance?) have been nascent. Google Docs for example is a great product and suits the needs of the majority of Office users, but has simply failed to make a dent in the risk-averse world of IT purchasing managers.
    A rack-mounted Google Search Appliance. Don't think they quite cracked the whole corporate colour-scheme thing yet!


    • Amazon into the Cloud: Amazon has probably had the most success of recent crossovers, branching out from its core online business into cloud hosting with Amazon Web Services. They have done a great job taking core competencies in data-centres and distributed computing and using them to pretty much start a whole new business. Perhaps this is the exception that proves the rule...
    Amazon Web Services. For once, no sarky comments. Respect is due.

    Why companies fail to cross the divide


    So what lessons can we learn from these failures and (limited) successes? When I look at the failures, I think there are a few key lessons that stand out:


    1) Consumer companies misunderestimate the enterprise buyer: When consumer-focused companies try to sell to enterprises, they often underestimate how demanding the buyers are. There's a reason "Enterprise-grade" is an adjective - it means that corporate customers have much more exacting standards. If something doesn't work, an apology and a product recall will not do. Amstrad's botched attempt to sell PCs to business was an example of this - their consumer business was built around making machines just-good-enough but more-than-cheap-enough. However when they tried to cut corners with their PC2000 series they were soon found out.

    Some of Google's attempts also touch on this. The Google Search Appliance, for example, was a useful product which wasn't pitched correctly at the enterprise buyer ("What do we use it for? Where is the ROI case? And who's going to support the sodding thing?"). Contrast that with Autonomy which has had great success selling million-dollar search appliances, because they knew exactly what buttons to press in the enterprise.


    2) Enterprise companies get caught out by the brutal competitiveness of consumer tech: For enterprise companies trying to sell to the consumer, it can be like a panda jumping into a pool of piranhas. Coming from a world of 5-6 year product cycles they are ill-equipped to deal with the fast-moving world of consumer IT. RIM is a case in point - their initial product was great for where the market was. However the iPhone uprooted the whole landscape, and fast-followers like Samsung moved in swiftly behind it. Meanwhile RIM was left with a pretty static portfolio and simply couldn't keep up with the ruthless pace of change. What competitive advantage they had was rapidly eroded.


    3) Companies try to bridge two markets with the same product: If you've had a hit product on one side of the fence, it's tempting to try and tweak it to sell to the other side. After all it means you can leverage your current investment and market position, which seems very attractive versus having to start again from scratch. However this path can lead to fatal compromise - the IBM PC Jr was a good example. They tried to wrangle their enterprise PCs into a Procrustean bed to serve the home market. At the same time they had to brain-damage it enough so that it wouldn't cannibalise enterprise sales. The result was a sad compromise.

    Admittedly it can work at times - the Blackberry was the right product at the right time. However once the consumer world moved to touch-screens they were conflicted, because their enterprise customers demanded they keep the keyboard just as the consumer market was leaving it behind. In the end they pleased no-one.


    And how a few have succeeded


    Of course it's not all bad news. There are, as I have said, a few beacons of hope: Microsoft with XBox and Amazon with AWS/EC2 are two good examples of this. How do they do it?

    XBox motherboard - PC in a box.
    One key point is how they've taken core technology from their existing business, but used it to create completely new products. Rather than falling into the trap of trying to modify an existing product for a new market, they have started with a blank slate.

    For example the original XBox was built on a solid WinTel platform, but ditched all signs of PC compatibility (at least at the end-user level). In essence it was a very, very ordinary Pentium III / GeForce 3 based PC in a silly box. Similarly Amazon's EC2/Web Services are light years away from their core consumer-facing shopping site, but under the hood share a lot of the infrastructure know-how.

    The advantage of this approach is that they leveraged existing skills, but their strategy in the new market wasn't constrained by baggage from the old one. This avoids the big trap I've highlighted above of trying to bridge enterprise and consumer with the same product.


    There is one big disadvantage however, which is if you are starting with a new product you are by definition launching from a low revenue base. This means it can take a long time for you to build the new product to a stage where it makes a meaningful impact on group revenues. XBox for example has been around for over a decade, yet still contributes only 13% of group revenues.


    Then again 13% is better than nothing.


    Right that's all for tonight (it's 2am at the moment. I think I need some sleep!). When I come back I want to continue this series by thinking about how vendors are finally trying to unite the two worlds - first by using Bring Your Own Device as a compromise (e.g. Good Technology), and secondly how Post-PC devices could finally bring them together.

    Also plotting a secret left-field book review, which will be ready when it's ready. :-)

    Wednesday, 7 November 2012

    Why Apple doesn't do big business (and IBM doesn't do consumer)

    Right, after a brief election-related diversion (actually I thought the election night TV show was a bit dull - if we're paying $6bn for it I'd have wanted some better special effects!), back to my series on the Post-PC world. This week I want to think about the yawning gap between enterprise and consumer IT, and how Post-PC devices help to fill it.


    A tale of two cities?


    It is one of the truisms of the tech business that it's impossible to be good at both enterprise and consumer IT.

    Not Big Blue's finest hour.
    Apple? A behemoth in the consumer world. But even five years into the iPhone era they still barely make a dent in the enterprise world (and yes I'm including BYOD/Good - more on that later). IBM? The world's biggest IT services company and the world's second biggest software company. But I can't think of a consumer product from Big Blue since they sold their PC business to Lenovo. And even before that their gear didn't exactly set the world on fire <ahem> PC Jr <ahem> PS/1.

    Companies that bridge the divide are few and far between. Microsoft has arguably done it with XBox (although if you look at the financials, it's still pretty much an enterprise OS and Office company with a tiny games console business bolted onto its arse). Blackberry managed it for a few years by taking its enterprise communications devices to the masses (but look what's happened since...). But by and large the grungy white-collar world of IBM/Oracle/SAP is a very different place from the rainbow polo-necked world which Apple/Facebook/Google live in.

    What I want to do is highlight three big differences between the consumer and enterprise IT, and then show how the Post-PC world is bringing them together.



    1) Consumer IT = shorter product cycles


    Sixty bucks for a roster update? Money for old rope!
    Catch me while you can: The standout feature of consumer IT is short product cycles, typically twelve months but sometimes as short as six. There are a couple of reasons for this. First, the modern consumer is trained to want "this year's model" (or ideally this season's model), which means older SKUs are quickly deprecated (EA Sports' annual updates are a good example of this). Second, because consumer tech is a growth category it offers excess economic returns which attract extreme levels of competition (just look at this year's race to have a "flagship" smartphone).

    In contrast enterprise IT typically moves in longer product cycles, typically 5-6 years. This reflects the slow-moving and complex nature of an enterprise IT organisation. Windows 7 is a great example of this - for many enterprises the Windows 8 launch has come just as they begin their Windows 7 roll-out. Also remember companies like SAP or Oracle make much more money from annual support than upfront licences - they are masters at elongating the 5-6 year product cycle.


    What were they thinking??
    Built to last vs. built to break: This means that consumer IT products are built to become obsolete. Consumer devices are meant to conk out every 18 months, preferably just as a new model hits the market. This is manifested in a number of ways. The obvious one is lower build quality (arguably one of Nokia's early faults was producing phones like the Nokia 6310i which were too damn durable!). In contrast enterprise devices are mission critical so have to be built to last - the old logic is that if your IT system is down then you can't open up shop in the morning.

    Another way this is done is by making devices more closed and appliance-like (cf. the lack of memory/storage expandability in MacBook Air/Retina Display Macs). After all there's no point building an expansion port into a product that's going to be junked in a few months, especially if you can use that lack of memory/storage/speed to encourage users to buy a new <insert shiny new Apple product here>. Enterprise devices are more open and upgradable, e.g. swappable batteries in Blackberrys - one issue with having an iPhone if you are a corporate road warrior is that if you blow through your battery life by 2pm you have a serious workflow issue.


    Cutting edge features vs. trailing edge reliability: Another consequence of shorter product cycles is that there's always a rush to include cutting-edge features in an attempt to one-up the competition (even if they're not ready for prime time; Siri anyone?). This means consumer IT is more likely to be at the bleeding edge - a good example I've used before is the "Wing Commander 2 Factor" - how cutting-edge video games drove the increasing power of desktop PCs from 1990-2005.

    In contrast enterprise IT tends to have fewer, but more robust features, with evolutionary rather than revolutionary products the order of the day. That's not to say that revolutionary disruptions don't happen in the enterprise IT world, but what normally happens is that they come from a small start-up rather than from one of the mainstream behemoths (said start-up is usually acquired two months later by Oracle).


    2) Solutions versus products


    Not just for Christmas... The second important difference between enterprise and consumer IT is that historically consumers have bought a product, whereas for the enterprise you are buying a wider solution which not only includes a device or software CD, but also a maintenance contract and a bunch of systems integration around it. Whereas a consumer device is like buying a puppy for Christmas, an enterprise IT solution is more like having a child (in more ways than one - I'll warrant more than a few corporates still have 18-year old Fortran databases gnashing away in the basement!).


    Instant gratification versus product roadmaps: One important consequence of this is that enterprise users need a roadmap. An IT manager needs to be able to schedule his deployments months or years in advance. If you've built your IT around a particular product you don't want the risk that your supplier turns around without warning and completely changes its external interface. In contrast consumer IT companies are all about springing a short-term surprise to drive the fashionistas.

    This sort of roadmap gives enterprise IT strategists a warm fuzzy feeling. Especially the bit which says "Risk-Free"!

    Support matters for enterprise buyers: Similarly enterprise users need a package of lasting, reliable support. Unfortunately consumer devices don't always deliver that. Take the iPad. This device has been much hyped for enterprise adoption, but the bald fact is that Apple announced it was dropping support for the original iPad in its iOS6 announcement this June. That's barely two years after the device was released and only 14 months after they stopped selling it. For an enterprise IT buyer that is completely unacceptable (after all it's likely to take 14 months to get the damn thing certified for your organisation!).


    "Yep we'll buy it. The users will just how to figure out how
    it works later. Oh and How much did you say that optional
    training contract cost would cost me?"
    In enterprises, the buyer is not the end-user: Because a solution is large and complex, the purchasing decision is made by the organisation rather than the end-user. This makes purchasing decisions larger and slower and also impacts the product being sold. One thing that's pretty obvious is that successful consumer devices have simple, attractive user interfaces. In contrast enterprise solution interfaces are more difficult to use, sometimes horrifically so (three words: Blackberry Web Browser). This is partly because enterprise IT is more complex, but also I think because in a big corporate the guy making the purchasing decision is in the IT department, rather than being the end-user who has to use the damn thing.


    Enterprise IT needs to play ball with the existing stack: For an enterprise buyer you can have the whizziest, sleekest product, but if it doesn't play ball with your existing IT stack it's next to useless. Interoperability (and ensuring a new product doesn't bork the existing IT stack) is a critical part of testing. In the consumer world interoperability is becoming more important (that's the whole point of Apple and Google's eco-system plays), but it's nowhere near as much of a deal breaker. You can still sync your iTunes to a Windows PC, or your Android phone to a Mac. Apple hasn't figured out how to block that yet (although once they take Macs to an ARM-based iOS platform I'm sure they will!).



    3) Different business models


    Upfront sale versus ongoing maintenance: As I mentioned already, SAP and Oracle are masters at getting the customer to buy into not only a product but a whole support package. This has important financial implications as they make much more money over a product's life-cycle from high-margin recurring maintenance (typically about 20% of the initial licence fee per year) than they do from the initial licence. That's a complete inversion of, say, a consumer video game where the overwhelming majority of profits are delivered in the first month of sales (before piracy manages to kick in).
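
    To put rough numbers on that, here's a minimal back-of-envelope sketch (made-up figures, assuming a 20%-a-year maintenance rate and a ten-year product life - not any vendor's actual economics):

```python
# Back-of-envelope: upfront licence fee plus recurring maintenance at ~20%
# of that fee per year (illustrative assumptions, not real vendor figures).
def lifetime_revenue(licence_fee=100_000, maintenance_rate=0.20, years=10):
    upfront = licence_fee
    maintenance = licence_fee * maintenance_rate * years
    return upfront, maintenance

upfront, maintenance = lifetime_revenue()
print(f"Upfront licence: ${upfront:,.0f}")
print(f"Maintenance over 10 years: ${maintenance:,.0f}")
# After a decade the (high-margin) maintenance stream is double the original
# licence fee - the mirror image of a video game's front-loaded profits.
```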


    Source: Asymco
    Volume and fixed cost models: If you sell to standalone consumers you will by definition have a higher volume of (lower value) sales than if you sell to a deep-pocketed corporate. This drives consumer businesses more towards the classic high fixed cost/high gross margin business model, where if you get above a certain number of sales you can rake in massive profits, but if you fall short of break-even your margins plummet (greatest exponent: McDonalds). This is particularly obvious in the smartphone world where Apple and Samsung capture the majority of volume and virtually 100% of industry profits, leaving Nokia, LG and Sony with virtually nothing.

    In contrast enterprise IT is much more of an a la carte affair. For example most enterprise software companies should be able to make a decent 20%+ margin, so long as they are over the initial sales & marketing hump. It's much less of a winner-takes-all world.
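
    If it helps, here's a toy illustration of that fixed cost dynamic (the price, unit cost and fixed-cost figures are invented purely for the shape of the curve, not anyone's actual numbers):

```python
# High fixed cost / high gross margin: profit swings violently with volume.
def operating_profit(units, price, unit_cost, fixed_costs):
    return units * (price - unit_cost) - fixed_costs

for units in (5_000_000, 20_000_000, 80_000_000):
    profit = operating_profit(units, price=600, unit_cost=400,
                              fixed_costs=5_000_000_000)
    print(f"{units:>11,} units -> operating profit ${profit / 1e9:+.1f}bn")
# Below the ~25m-unit break-even you burn money; comfortably above it the
# profits are enormous - hence a couple of winners taking virtually all of them.
```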


    "We.. are.. the.. ghosts.. of.. IT past. On your left is
    ADABAS, the lady on the right is ASE and that's Ms
    Fortran and Little Ms Cobol in the middle...."
    Enterprise IT is full of zombies: Because of the short product cycles and lack of ongoing maintenance, when a consumer IT company stops selling new products its margins go into sharp decline (cf Nokia, RIM). That's very different in enterprise IT, where the stickiness of many heavy-duty products and the requirement to pay ongoing maintenance means companies can stay in business (and make good money) for decades after they go into decline. Software is a particularly zombie-friendly zone. German company Software AG continues to make millions in maintenance from ADABAS, its 1970s-era database. Sybase was another example: it parlayed cashflows from its 1980s ASE database into smart mobile middleware investments and eventually a $6bn bid from SAP.


    Right that's all for today (to be honest I'm still slightly groggy after my election night all-nighter). Hopefully I've outlined some of the contours of why these are like "two nations divided by a common language". What I want to do in my next post is build on this and show how the Post-PC world brings them together, and what companies are poised to succeed (or fail!).

    Stay tuned!

    Monday, 5 November 2012

    An update on the election night arb...

    The big night looms...


    As with the last time I blogged about this, this isn't exactly tech focused - just a bit of fun ahead of the Big Night tomorrow (I plan on staying up to watch the show... just need to find some cooked pork products in the deli section to keep me company).

    I wrote about the arbitrage between the two leading prediction markets, the Iowa Electronic Market and Intrade back at the end of September. Interestingly at the time you could make a $14.45 arbitrage profit trading the two.

    The evolution of an arbitrage

    You can see this on the charts below. The one on the left shows the actual price offered on the two candidates by the two markets (Obama quotes are the upper two lines, Romney quotes the lower two). The one on the right shows the spread between the two candidates as offered by each market:



    As you can see, from August through to about the end of September when I wrote about it, a noticeable difference opened up between how the two markets priced the candidates. It then closed right up in October, partly I think because the difference had started to attract some mainstream coverage (e.g. this article, which came out before I'd posted my piece - although I'd come up with much the same thinking on my own - as well as FT Alphaville and PBS).

    On 15th October when I updated the analysis the spread had come right in, and actually there was a slight cost to the trade (around 1% annualised, which makes sense as a trading cost for a relatively illiquid market).

    Now as we roll into election day the spread has gone right out again. Here is the updated analysis - in theory (if you can get your check in the mail to the University of Iowa fast enough) the trade now yields a whopping $44.68 (9%) one-day return:


    There are a couple of reasons for this. One is a technical one - the Intrade $4.99 monthly fee is payable on the 1st of the month, so if you sign up now and then close your account before 1st Dec you don't have to pay it.

    The second one is more of a bummer - technically IEM pays out based on the winner of the popular vote and Intrade on the winner of the electoral college. While Obama retains a thin but material lead in the battleground states, the national vote is closer, raising the prospect of a 2000-style split verdict where he wins the college but not the popular vote. In which case, of course, the strategy isn't as risk-free as it first appears! I suspect the risk around this unlikely but hard-to-value event is the main reason why the spread has gone up.
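
    To make the mechanics concrete, here's a minimal sketch of how the lock works (the prices are made up for illustration - they're not the actual IEM or Intrade quotes - and it assumes both markets settle on the same winner, which is exactly the assumption the split-verdict scenario breaks):

```python
# Each contract pays $1 if its candidate wins. If candidate A is cheap on one
# market and candidate B is cheap on the other, buying one of each for a
# combined cost under $1 locks in the difference - provided both markets
# crown the same winner.
def arb(price_a_mkt1, price_b_mkt2, budget=500.0):
    cost_per_pair = price_a_mkt1 + price_b_mkt2
    profit_per_pair = 1.0 - cost_per_pair          # exactly one leg pays out $1
    n_pairs = int(budget // cost_per_pair)
    return profit_per_pair, profit_per_pair / cost_per_pair, n_pairs * profit_per_pair

per_pair, pct, total = arb(0.66, 0.25)             # hypothetical quotes
print(f"Locked in: ${per_pair:.2f} per pair ({pct:.0%}), ${total:.2f} on a $500 stake")
```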


    What's the chance of that?

    Of course this leads us onto a different trade. Effectively Mr Market is now saying there is a roughly 9% chance of a split vote occurring (making the simplifying assumption of zero trading cost - in reality it's probably a tad higher once you strip out trading costs). For context NYTimes pundit Nate Silver assigns that outcome a 7.6% probability in his influential FiveThirtyEight blog. Although I would add the safety warning that Silver's analysis is derived from analysing past trends (albeit with adjustments) - one of the great bugbears (IMHO) of stock analysis.
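
    As a back-of-envelope check (under the same simplifying assumption, plus the assumption that a split verdict wipes out the whole payout on the pair), the implied probability falls straight out of the price:

```python
# If a pair of contracts costs C and pays $1 unless the markets split (assumed
# worthless in that case), fair pricing implies (1 - p) * 1 = C, i.e. p = 1 - C.
cost_per_pair = 0.91               # illustrative, consistent with the ~9% spread
implied_split_prob = 1.0 - cost_per_pair
print(f"Implied probability of a split verdict: {implied_split_prob:.0%}")  # ~9%
```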

    Now what's the chance of that? I guess we'll find out in around thirty hours' time...

    Anyhow no earth-shattering conclusions here. Just a cute little episode which illustrates some of the mechanics - and pitfalls - of arbitrage, as well as providing an example of how the market can struggle to value illiquid, esoteric assets. Which is probably how we got into this whole credit crunch mess in the first place!

    Right I'm off to the shops now to hunt down some pork pies!