Thursday, August 28, 2008

Distributed Identity in Virtual Worlds

Thinking about managing identity across various distributed grids of virtual worlds, I am of course drawn immediately to OpenID as the current preeminent solution to that problem. Let me explain the problem a little bit, and discuss how OpenID might fit in.

The Problem

VWs suffer from the same problem as websites with regard to multiple sign-on. However, due to the central role identity plays in VWs, the problem is greatly magnified in its implications. Having a separate identity for each website you visit is merely a clerical annoyance -- remembering each user name and password -- and one that can be quite acceptably fixed by a smart enough user agent keeping track of each password, as Firefox now does.

However, in a VW, having a new identity for each grid means having to recreate yourself each time you change worlds. You have to redesign your avatar's appearance, rebuild your inventory, rebuild your social network, and so on. That is an unacceptably large burden, and it turns the "metaverse" into a series of "walled gardens".

What we need is a means of

  1. asserting a unique identity agreed to by all observers
  2. mapping of that identity to a common specification for appearance, inventory, etc.

For this post, we will only be considering how to enable item 1.

Thinking about solutions

The obvious means of asserting identity, at least for those of us who have followed any cryptographic developments within the past decade, is for the user agent in question to generate an asymmetric key pair and use it to present a digital signature to the VW.

However, as we well know, an endpoint directly asserting a signature without a public third party is vulnerable to a man-in-the-middle attack. If there were a public third party from which the VW could retrieve your public key, that attack could be avoided (so long as the third party is actually publicly trusted).
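To make the idea concrete, here is a toy sketch of signature-based identity assertion using textbook RSA numbers. This is purely illustrative -- the key is laughably small and the hashing shortcut is insecure, and all names are my own invention; a real system would use a vetted cryptographic library and full-size keys.

```python
import hashlib

# Toy RSA with textbook-small primes (p=61, q=53): utterly insecure,
# for illustration only. Real systems use vetted libraries and large keys.
n, e, d = 3233, 17, 2753  # public modulus, public exponent, private exponent

def digest(message: str) -> int:
    # Reduce a SHA-256 digest modulo n so it fits the toy key size.
    return int(hashlib.sha256(message.encode()).hexdigest(), 16) % n

def sign(message: str) -> int:
    # The user agent signs a challenge with its private exponent d.
    return pow(digest(message), d, n)

def verify(message: str, signature: int) -> bool:
    # The VW verifies with the public key (n, e), ideally fetched from a
    # publicly trusted third party, closing the man-in-the-middle hole.
    return pow(signature, e, n) == digest(message)

challenge = "nonce-issued-by-the-virtual-world"
sig = sign(challenge)
assert verify(challenge, sig)  # a genuine signature checks out
```

The key point is the last step: verification only means something if the VW obtained `(n, e)` from somewhere other than the party presenting the signature.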

Also, the tools for generating, maintaining, and using an asymmetric key set are far from ubiquitous and user friendly. If we really want something that everyone can agree on and use, it has to be practicable for the average computer user. If we wanted to use PKI directly, we would have to create a standard protocol for use in VWs, write a standard implementation, then convince everyone to use it. Otherwise we'd have nothing more than cryptographically secure walled gardens.

The simplest way to take the burden of maintaining an identity off the end user is to outsource it to a third party as much as possible. Outsourcing also increases the odds that the various VWs will standardize on a common third-party solution.

An interesting security property of a three-way negotiation is that if any one party suspects another party is lying to it, it can always verify with the third. It therefore takes two parties colluding with each other to create a deception.

With special consideration toward a protocol that could be standardized on, and that already has existing implementations, let us look at OpenID to see how it measures up to our criteria, and what flaws it possesses.

OpenID Protocol

I recommend you read the specification for yourself; however, I will provide a brief overview of how it works.

Let us define:

User Agent: The end user's Web browser, which we will denote AliceFox.

Relying Party: A Web application that wants proof that the end user controls an Identifier, which we will denote BobBlog.

OpenID Provider: An OpenID Authentication server on which a Relying Party relies for an assertion of identity, which we will call CarolCert.

Identity: A URI that represents an identity that AliceFox will assert and CarolCert will corroborate. Notice that this identity is "for" AliceFox, but cannot exist outside the context of CarolCert. The two are inextricably tied. We will denote AliceFox's identity with CarolCert as I.


AliceFox (A) wishes to assert her identity (I) to BobBlog (B) for the purpose of submitting a comment under that identity. She POSTs I to B through the comment form.

B GETs the URL I, which includes a "link" tag pointing to (C). B redirects A to C, passing a "return to" parameter (R) and a nonce (N) to C as GET parameters.

A and C confer amongst themselves in a manner undefined by the protocol in order to verify that A is in fact the entity known as I.

When C is satisfied of this fact, it prepares a reply to B by creating a secret key (K), a "handle" (H) that publicly identifies that secret key, and a signature (S) over certain fields to allow detection of tampering. C redirects A back to B with H and S attached as GET parameters.

If B carries state, B can negotiate the exchange of K with C out of band, using Diffie-Hellman key exchange, and use K to verify S.

If B is stateless, he merely echoes A's message back to C to verify that A did not tamper with it en route.

When B or C verifies S, B considers the identity successfully asserted, and associates A with I for the duration of the session.
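The signing and verification steps above can be sketched as follows. This is a simplified illustration, not the exact OpenID wire format: the field names, the key-value encoding, and the use of HMAC-SHA256 are my own approximations of what the specification defines.

```python
import hashlib
import hmac

def kv_encode(fields: dict) -> bytes:
    # Newline-delimited key:value encoding of the signed fields
    # (an approximation of OpenID's key-value form encoding).
    return "".join(f"{k}:{v}\n" for k, v in fields.items()).encode()

def make_signature(secret_k: bytes, fields: dict) -> str:
    # C signs the fields with the shared secret K; HMAC-SHA256 stands in
    # here for whatever MAC algorithm the association actually uses.
    return hmac.new(secret_k, kv_encode(fields), hashlib.sha256).hexdigest()

# K is the secret key B and C share (negotiated, e.g., via Diffie-Hellman);
# H is the public handle that names it.
K = b"secret-negotiated-between-B-and-C"
H = "assoc-handle-1"

fields = {
    "identity": "http://carolcert.example/alicefox",  # I (hypothetical URL)
    "return_to": "http://bobblog.example/comments",   # R (hypothetical URL)
    "response_nonce": "2008-08-28T12:00:00Zabc123",   # N
    "assoc_handle": H,
}
S = make_signature(K, fields)  # attached to the redirect as a GET parameter

# B, holding K for handle H, recomputes the signature to verify S.
assert hmac.compare_digest(S, make_signature(K, fields))

# Any field A alters in transit produces a different signature.
tampered = dict(fields, identity="http://carolcert.example/mallory")
assert make_signature(K, tampered) != S
```

Note that A merely carries the signed message between B and C; since A never holds K, A cannot forge or alter the assertion undetected.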

What does OpenID Give

With that understanding in mind, what can we say about A and B, and what does I really mean to them?

It's important to point this out: all OpenID says, from B's perspective, is "A is an entity known to C as I". The delimiting context here is important to understand. A could have any number of identities with C, or with any other OpenID provider. Moreover, C could be anyone at all, even an untrustworthy party. OpenID says nothing about trust.

However, it is easy to create a trust-based system if one were to whitelist OpenID providers based on public reputation. It's certainly trivial to create a blacklist of known untrustworthy providers. This is certainly enough for our purposes as a distributed VW authentication system.

That said, OpenID gives us much more. As we said above, it's not hard to create our own simpler (and probably more secure) protocol. However, that would entail writing an implementation, drafting a standard, ensuring it was practically deployable in working systems, and building critical mass for its adoption by all VW players. OpenID lets us skip that work, because it's already been done for us, and get straight to solving our own domain-specific problems.

What does OpenID not Give

There is a well-known flaw in OpenID that can be used for phishing. If B is malicious, he can redirect A to his own copycat OpenID provider C', and if the user doesn't notice, C' can steal A's credentials and allow a malicious party A' to impersonate A.

There are other means of misdirection as well. While a computer would always be able to tell that I and I' are different identities, humans might be confused if A and C are sufficiently similar looking to A' and C'.

The only solution to this is to whitelist known providers.

Another known problem with OpenID is the common reliance on passwords as the means of validating A with C. While that is no different from the password schemes currently in use on VWs or websites, I am sure a digital-signature-based protocol would be vastly superior from a technical standpoint.

My humble alternative protocol

Often the best way to judge a design is to consider how you would do it yourself, if you were tasked with making an alternative. I am happy to say that while what I started out with was quite different from OpenID, as I improved my protocol, I noticed that my decisions became more and more like those taken by OpenID, which is a good sign.

Warning: I am neither a security nor web expert, nor do I claim to be. Be gentle with your corrections.

This design also relies on cookies from one site being inaccessible from another. I have no idea how true this is in practice.

When A registers an identity I with C, A downloads a JavaScript implementation of an asymmetric cipher, which writes the private key K into a cookie and the public key P into C's database.

A asserts identity I to B. Like OpenID, this is a URL that ties A to C.

Like OpenID, I contains a link to C, to which B redirects A, sending a return-to R and a nonce N as GET parameters. Unlike OpenID, I also contains a link to P, which B downloads and associates with N.

C delivers to A a JavaScript program which reads K from the cookie, encrypts N into a digital signature DS, then creates a message signature S to detect tampering. C then redirects A back to B, with N, DS, and S as GET parameters.

B verifies S, then uses P to decrypt DS and checks that it matches N. If so, B considers I successfully asserted.

Essentially, C is nothing more than a repository for 1) public keys, and 2) a trusted JavaScript implementation of the cryptographic algorithms.
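B's verification step in this alternative protocol can be sketched with the same toy textbook-RSA numbers as before. Again, this is insecure and purely illustrative, and the helper name `to_int` is my own; the post proposes doing the signing in JavaScript in A's browser.

```python
import hashlib

# Toy sketch of B's check (insecure textbook RSA; illustration only).
# K is the private exponent A's browser keeps in a cookie;
# P = (n, e) is the public key B downloaded from C.
n, e = 3233, 17   # P, the public key published by C
K = 2753          # the private exponent stored in A's cookie

def to_int(nonce: str) -> int:
    # Map the nonce to an integer small enough for the toy modulus.
    return int(hashlib.sha256(nonce.encode()).hexdigest(), 16) % n

# A's browser: sign the nonce N using the private key from the cookie.
N = "nonce-issued-by-bobblog"
DS = pow(to_int(N), K, n)

# B: "decrypt" DS with the public key P and check it matches the nonce.
assert pow(DS, e, n) == to_int(N)
```

Because B fetched P directly from C rather than from A, a forged DS from an impostor who lacks K will not decrypt to N.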


Thursday, August 21, 2008

Disruptive Products, or Disruptive Technology doesn't apply to Software

Disruptive Technology

For a long time I've heard people use the phrase "Disruptive Technology" in conjunction with one idea or another, as if the invocation of the phrase was supposed to lend some weight to the discussion that I never quite fathomed. The people who tended to use the phrase also tended to be the sort who use such phrases in a manner that renders them meaningless, so I never paid much attention to exactly what they meant -- because, strangely enough, whether an idea was declared "disruptive" or not, it made no difference as far as I could tell!

I've heard half-coherent explanations of the phrase, but I always took it to mean "technology that makes managers and business types no longer understand how to make money", which I thought was nothing special, considering I assume most business types don't understand technology anyway, and their understanding of how to make money with it was always suspect in the first place.

However, after Clayton Christensen's "The Innovator's Dilemma" made its way to the bookshelf at work, I thought I would see just what it was I had been missing the whole time. It turns out there is far more to the idea than the throngs of excessive catch-phrase users would imply.

Firstly, I object to the phrase "Disruptive Technology", since it implies to me that there is something inherent in the engineering or science that causes a technology to become "disruptive", which is just not the case. The things labeled disruptive technologies often have traits in common, but those traits are really all about how a market perceives them, and have nothing to do with the technology itself. I prefer "Disruptive Products", because that captures the fact that this is a book about business management, not engineering. The idea of "disruptive technology" as presented by Christensen cannot be separated from the fact that it is a product destined to be sold on a market.

I recommend you read the entire book yourself, because there are lots of noteworthy ideas for anyone interested in the business of technology, but I will give a brief summary of the major themes as I saw them.

The author draws on his extensive research into a collection of industries in which certain technological innovations happened, characterizes those innovations as either "sustaining" or "disruptive", and examines how they affected the health of the businesses in each industry.

A Sustaining Technology is one where the market for the products made with the technology is well known, and the firm in question can use the technology to increase its profit margin by adding new features that increase the product's performance in the market.

Higher profit margins mean the company can grow further and invest that growth in tuning the entire operation of the firm toward servicing the high-margin market. In many ways the firm becomes captive to its high-margin customers, because failing to invest in sustaining the growth in performance of its products, as its existing customers demand, will hamper its own growth in profit and cost it market position. The author notes that a firm's quest for continued growth forces it to "trend upward" from the low-end, low-margin area of the market to the high-end, high-margin area, and that trend is essentially inevitable if a firm wants to satisfy its investors.

A Disruptive Technology is one where the technology has no proven market at the time of development, but eventually develops a market outside the expectations of its developers.

That in and of itself is hardly surprising. The interesting thing about the book is that Christensen does his research and finds a historical trend in disruptive technologies which, if known, can help businesses manage disruptive technologies and succeed in turning them into profitable, and often industry-changing, products.

He discovered that DTs have common traits, and the following traits serve as a definition of a DT:

  • DTs often appear as innovations stemming from an existing industry.
  • DTs provide value in areas that the current market places no value in. Historically this means they are low-performing and low-cost, but provide benefits such as reduced size or improved flexibility.
  • DTs, due to their perceived lack of value in the existing market, begin their lives as products by taking the low end of the market, left underserved as established firms climb to the high end in pursuit of growth.
  • Existing firms are completely unable to capitalize on DTs because their high-end cost structure means they cannot grow on low margins, and thus development of DTs into products for the emerging market is left entirely to small firms.
  • DTs improve in performance on traditional metrics over time, while retaining the benefits that gave them access to their first emerging market; they thus begin to move into the high end and cannibalize the previous market (for which they initially appeared ill-suited), and the existing firms, caught flat-footed, find themselves in serious trouble.

To make that more concrete, the classic example used throughout the book is the hard drive industry. The story there is:
  1. the market demands higher capacity and speed for its existing computers, which existing well-managed firms deliver, and thus they grow and move upscale
  2. someone develops a smaller form factor drive, with dramatically reduced capacity, and the existing market soundly rejects it as unsuitable for its needs
  3. new markets emerge for smaller computers (mainframe -> minicomputer -> pc -> notebook -> appliance (ipod, car-navi)), which find the smaller form factor indispensable
  4. density on the small form factor drives increases to the point where it cannibalizes the market for large high-margin drives, and the once-small companies who were able to market the small drives grow larger and displace the incumbents
  5. repeat (14 -> 8 -> 5.25 -> 3.5 -> 2.5 -> 1.8 etc. inch drives)

Disruptive Products

However, while reading the book, I kept trying to see how its concepts applied to what I do, which is programming in the software industry. I couldn't help but come to believe that the analysis is a little dated, having been published in 1997, and only seems to apply to hardware products.

Hardware products have easily quantifiable manufacturing and distribution costs, and thus a clear relationship between price and profit. That is, expensive hardware is expensive because the physical bits involved in making it are expensive, and the overhead of moving those bits to market is expensive.

Software, however, has essentially no per-unit cost; its costs are fixed: programmer salaries (where programmer talent and pay can vary widely) and some minor managerial overhead. For a shrink-wrapped software product, you could charge $1000 one day and $1 the next, and it would cost you nothing but your profit margin to do so.

And here is the controversial point:

I don't believe that, ultimately, software is really a product at all. It's a service. The days of being able to hand over an application in exchange for $X will soon be gone for good.

Ultimately, there will be two models for commerce in the software industry:
1. Custom whole application development
2. Customization of existing open source solutions to emerging, specific, business needs

Although in the first case there is an integrated deliverable, which you might consider a "product", I would assert that what the developer is being paid for is to solve a problem, and so the compensation is proportional to the problem, not to the cost of manufacturing the good.

I foresee that so long as there exist new and different niche businesses, there will always be a demand for new and niche software: things not completely served by any pre-existing software package, which have to be built from scratch.

The second case makes a lot of assumptions in excluding proprietary software from the picture. Let me try to follow the history of open source in the software industry, and use that to justify my extrapolation.

  • Most open source projects are born out of existing industries, either because a firm needed some software for its core business and found that, once it was completed, there was no value in keeping it secret, or because engineers trained in an industry felt like putting their skills to use outside of a corporate setting.
  • Open source solutions at first held no value for anyone outside hobbyists and hackers. They were hard to use, poorly supported, and lacking important features. However, the one value proposition they did provide, access to the source code, gave them a flexibility that made them absolutely indispensable for certain applications, such as research or highly customized niche applications, where the code would have to be heavily modified but there was no justification for starting entirely from scratch.
  • Due to the perceived inappropriateness of open source for business or personal use, its adoption often started out in skunkworks-style projects running under the radar of management, in small low-end roles where other existing solutions were inappropriate.
  • Existing software firms (Apple, MS, Adobe, etc.) at first treated open source with derision, then fear, and tried to ignore it for as long as possible, because their dependence on high-margin software sales could never allow them to move downscale and compete with something free. It would be corporate suicide. This allowed smaller shops to capitalize on the commercialization of open source.
  • While at first constrained to only the low end, open source has been on a trend of steadily increasing quality over time, and is now at the point where MacOS has significant open source portions, Adobe is open sourcing the basis of its strategic Flash platform, and Linux is seen as a credible alternative to Windows on low-end machines.
  • I predict this trend will continue to the point where open source has so completely dominated the software world, from the bottom up, that firms like Apple are constrained to selling their OS as a value-added perk on top of a mostly open OS for their hardware, Adobe becomes a seller of boutique software, and MS becomes a systems integrator and software-as-a-service provider.

Open Source is a Disruptive Product

Surely, I'm not the first person to think of this...

Sunday, August 10, 2008

The Next Chapter

While I was at 3Di I had the pleasure of working with many real and potential partner companies from across the globe. Many times I succeeded only in embarrassing myself professionally; but I always tried to hold true to a sense of professional competence, ethics, and behaviour that stood for the kind of person I want to be. Not unselfishly, I hoped to be seen as a stand-out guy, even if only to serve my own conceited self-image.

Part of my professional ethic is that companies, as abstract entities with no more life than that which the whims of their investors endow them, deserve no great respect or loyalty in and of themselves. Companies demand obedience, whether to laws or to directives from superiors; but it's people who deserve respect and loyalty. And many times it's people outside your company, whose respect you've cultivated and earned, who go furthest in helping you accomplish your goals.

I hope that doesn't sound too shocking, because superficially it appears contradictory to the notions of belonging and identity that most people hold. However, if you look closer you might find merely a different definition of the same.

For someone like me, joining a company is never about getting rich; it's always about creating something larger than yourself. Money is just a means to play the game, not the game in and of itself. On the other hand, for many people, joining a company is about keeping your head down so you can continue to earn a wage to fund whatever it is you'd rather be doing. People like that aren't going to join you in building new things.

I am happy to say that the result of my connections is several strong relationships with people from many different companies, industries, and backgrounds -- some of whom I haven't even met in person yet. When I informed those people that I would no longer be able to act as a liaison with 3Di due to my decision to leave, I was impressed with the level of interest in continuing to work with me after 3Di. It didn't take very long for someone to offer me a job, but I am somewhat surprised at the quality of the opportunity I have been given a shot at.

I have tentatively accepted a role as senior engineer with RealXtend, an emerging leader in open 3D virtual worlds, working with them directly in Oulu, Finland. I will likely be involved in working with the open source community, general programming, and helping plot technical direction using the knowledge and insight I've cultivated during my short but eventful career in virtual worlds.

That means picking up the family and heading to the equivalent of a small town north of Edmonton. While Oulu is quite a bit smaller than the greater Tokyo area and its some 35 million people, I am actually relishing a chance to leave this mass of humanity behind for a while. I needed a new direction, under leadership that I can respect, and quite frankly, I am impressed beyond words at the professionalism and attitude that RealXtend has shown in their work to get me over to Finland. These guys are great, and as I said at the start, it's the people there who have earned my respect and loyalty.