Is Desktop Software Dead?

When was the last time you were impressed by desktop software?

Really impressed?

After seeing (in chronological order) Steve Jobs, Al Gore and Tim Bray make use of Apple Keynote, I absolutely had to give it a try. And impressed I was, and to some extent still am. That revelation happened about a year ago, and I cannot recall the previous time desktop software truly impressed me.

Although I may be premature, I can’t help but ask: Is desktop software dead?
A few data points:
  • Wikipedia states: “There is no page titled ‘desktop software’.” What?! I suppose you could argue I’m hedging my bets by choosing an obscure phrase (not!), but seriously, it is remarkable that there is no Wikipedia entry for “desktop software”!
  • Microsoft, easily the leading purveyor of desktop software, is apparently in trouble. Although Gartner’s recent observations target Microsoft Windows Vista, this indirectly spells trouble for all Windows applications as they rely heavily on the platform provided by Vista.
  • There’s an innovation hiatus. And that’s diplomatically generous! Who really cares about the feature/functionality improvements in, e.g., Microsoft Office? When was the last time a whole new desktop software category appeared? Even in the Apple Keynote example I shared above, I was impressed by Apple’s spin on presentation software. Although Keynote required me to unlearn habits developed through years of using Microsoft PowerPoint, I was under no delusions of having entered some new genre of desktop software.
  • Thin is in! The bloatware that is modern desktop software is crumbling under its own weight. It must be nothing short of embarrassing to see this proven on a daily basis by the likes of Google Docs. Hardware vendors must be crying in their beers as well, as for years consumers have been forced to upgrade their desktops to accommodate the latest revs of their favorite desktop OS and apps. And of course, this became a self-reinforcing cycle, as the hardware upgrades masked the inefficiencies inherent in the bloated desktop software. Thin is in! And thin, these days, doesn’t necessarily translate to a penalty in performance.
  • Desktop software is reaching out to the network. Despite efforts like Microsoft Office Online, the lacklustre results speak for themselves. It’s 2008, and Microsoft is still playing catch-up with upstarts like Google. Even desktop software behemoth Adobe has shown better signs of getting it (network-wise) with recent entrées such as Adobe AIR. (And of course, with the arrival of Google Gears, providers of networked software are reaching out to the desktop.)

The figure below attempts to graphically represent some of the data points I’ve ranted about above.

In addition to providing a summary, the figure suggests:

  • An opportunity for networked, Open Source software. AFAIK, that upper-right quadrant is completely open. I haven’t done an exhaustive search, so any input would be appreciated.
  • A new battle ground. Going forward, the battle will be less about commercial versus Open Source software. The battle will be more about desktop versus networked software.

So: Is desktop software dead?

Feel free to chime in!

To Do for Microsoft: Create a Wikipedia entry for “desktop software”.

Evolving Semantic Frameworks into Platforms: Unpublished ms.

I learned yesterday that the manuscript I submitted to HPCS 2008 was not accepted 😦
It may take my co-authors and me some time to revise and resubmit this manuscript.
That anticipated latency, along with our belief that the content needs to be shared in a timely fashion, motivates posting the manuscript online now.
To whet your appetite, the abstract is as follows:

Evolving a Semantic Framework into a Network-Enabled Semantic Platform
A data-oriented semantic framework has been developed previously for a project involving a network of globally distributed scientific instruments. Through the use of this framework, the semantic expressivity and richness of the project’s ASCII data is systematically enhanced as it is successively represented in XML (eXtensible Markup Language), RDF (Resource Description Framework) and finally as an informal ontology in OWL (Web Ontology Language). In addition to this representational transformation, there is a corresponding transformation from data into information into knowledge. Because this framework is broadly applicable to ASCII and binary data of any origin, it is appropriate to develop a network-enabled semantic platform that identifies the enabling semantic components and interfaces that already exist, as well as the key gaps that need to be addressed to completely implement the platform. After briefly reviewing the semantic framework, a J2EE (Java 2 Enterprise Edition) based implementation for a network-enabled semantic platform is provided. And although the platform is in principle usable, ongoing adoption suggests that strategies aimed at processing XML via parallel I/O techniques are likely an increasingly pressing requirement.
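
Though the manuscript itself isn’t reproduced here, the representational ladder the abstract describes (ASCII to XML to RDF to OWL) is easy to sketch. The following minimal Java fragment is purely illustrative; the record layout, element names and URIs are all invented rather than drawn from the manuscript:

```java
// A toy walk up the representational ladder: ASCII, then XML, then RDF.
// The record layout, element names and URIs are invented for illustration.
public class SemanticLadder {
    public static void main(String[] args) {
        // Stage 0: a raw ASCII record from a hypothetical instrument.
        String ascii = "STATION-A 2008-03-01T12:00:00Z 45.4 -75.7 18.2";
        String[] f = ascii.split("\\s+");

        // Stage 1: XML. The structure of the record is made explicit.
        String xml = String.format(
            "<observation station=\"%s\" time=\"%s\">%n"
          + "  <latitude>%s</latitude>%n"
          + "  <longitude>%s</longitude>%n"
          + "  <value>%s</value>%n"
          + "</observation>", f[0], f[1], f[2], f[3], f[4]);

        // Stage 2: RDF, serialized as N-Triples. The relationships between
        // the pieces are made explicit, turning data into information.
        String rdf =
            "<urn:ex:obs1> <urn:ex:observedBy> \"" + f[0] + "\" .\n"
          + "<urn:ex:obs1> <urn:ex:value> \"" + f[4] + "\" .";

        System.out.println(xml);
        System.out.println(rdf);

        // Stage 3, OWL, would classify urn:ex:obs1 against an ontology of
        // observations, completing the data-to-information-to-knowledge arc.
    }
}
```

Each rung adds machine-interpretable semantics that the rung below leaves implicit, which is the sense in which representation and meaning are transformed together.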

Cyberinfrastructure: Worth the Slog?

If what I’ve been reading over the past few days has any validity to it at all, there will continue to be increasing interest in cyberinfrastructure (CI). Moreover, this interest will come from an increasingly broader demographic.

At this point, you might be asking yourself what, exactly, cyberinfrastructure is. The Atkins Report defines CI this way:

The term infrastructure has been used since the 1920s to refer collectively to the roads, power grids, telephone systems, bridges, rail lines, and similar public works that are required for an industrial economy to function. … The newer term cyberinfrastructure refers to infrastructure based upon distributed computer, information, and communication technology. If infrastructure is required for an industrial economy, then we could say that cyberinfrastructure is required for a knowledge economy. [p. 5]

[Cyberinfrastructure] can serve individuals, teams and organizations in ways that revolutionize what they can do, how they do it, and who participates. [p. 17]

If this definition leaves you wanting, don’t feel too bad; everyone I’ve ever spoken to on the topic feels the same way. What doesn’t help is that the Atkins Report, and the other sources I refer to below, also bandy about terms like e-Science, Grid Computing, Service Oriented Architectures (SOAs), etc. Add newer terms such as Cooperative Computing, Network-Enabled Platforms and Cell Computing, and it’s clear that the opportunity for obfuscation is about all that’s being guaranteed.

Consensus on the inadequacy of the terminology aside, there is also consensus that this is a very exciting time with very interesting possibilities.

So where, pragmatically, does this leave us?

Until we collectively sort out the terminology, my suggestion is to immerse yourself immediately in what cyberinfrastructure and the like actually are and feel like. In other words, I highly recommend reviewing the sources cited below, in order:

  1. The Wikipedia entry for cyberinfrastructure – A great starting point with a number of references that is, of course, constantly updated.
  2. The Atkins Report – The NSF’s original CI document.
  3. Cyberinfrastructure Vision for 21st Century Discovery – A slightly more concrete update from the NSF as of March 2007.
  4. Community-specific content – There is content emerging on the intersection between CI and specific communities, disciplines, etc. These frontiers are helping to define the transformative aspects and possibilities of CI in a much more concrete way.

Frankly, it’s a bit of a slog to wade through all of this content for a variety of reasons …

Ultimately, however, I believe it’s worth the undertaking at the present time as the possibilities are very exciting.

CANARIE’s Network-Enabled Platforms Workshop: Follow Up

I spent a few days in Ottawa last week participating in CANARIE’s Network-Enabled Platforms Workshop.

As the pre-workshop agenda indicated, there’s a fair amount of activity in this area already, and much of it originates from within Canada.

Now that the workshop is over, most of the presentations are available online.

In my case, I’ve made available a discussion document entitled “Evolving Semantic Frameworks into Network-Enabled Semantic Platforms”. This document is very much a work in progress and feedback is welcome here (as comments to this blog post), to me personally (via email to ian AT yorku DOT ca), or via CANARIE’s wiki.

Although a draft of the CANARIE RFP funding opportunity was provided in hard-copy format, there was no soft-copy version made available. If this is of interest, I’d suggest you keep checking the CANARIE site.

Finally, a few shots I took of Ottawa are available online.

CANARIE’s Network-Enabled Platforms Workshop

Next week, I’ll be attending CANARIE’s Network-Enabled Platforms Workshop: Convergence of Cyber-Infrastructure and the Next-Generation Internet in Ottawa. Although the workshop is described elsewhere, to provide a little context consider that:

The purpose of CANARIE’s Network-Enabled Platforms Workshop is to explore the development of and participation in network-enabled platforms by Canadian researchers and other interested parties. The workshop will be an important step towards the launch of a CANARIE funding program in this area.

Based on the agenda, I expect this will be a highly worthwhile event, and I am looking forward to it.

My contribution to the workshop will be a short presentation described by the following abstract:

Evolving Semantic Frameworks into Network-Enabled Semantic Platforms

Ian Lumb
Manager Network Operations
Computing and Network Services
York University

A semantic framework has been successfully developed for a project involving a network of globally distributed scientific instruments. Through the use of this framework, the semantic expressivity and richness of the project’s ASCII data is systematically enhanced as it is successively represented in XML (eXtensible Markup Language), RDF (Resource Description Framework) and finally OWL (Web Ontology Language). In addition to this representational transformation, there is a corresponding transformation from data into information into knowledge. Because this framework is broadly applicable to ASCII and binary data of any origin, it is appropriate to develop a network-enabled semantic platform that (i) facilitates integration of the enabling languages, tools and utilities that already exist, and (ii) identifies the key gaps that need to be addressed to completely implement the platform. After briefly reviewing the semantic framework in a generic way, a J2EE (Java 2 Enterprise Edition) based, work-in-progress proposal for a network-enabled semantic platform is put forward.

I expect to be sharing more on this thread as it develops …

PR Agency Deems My Article Cynical: Internal OGF Communication Leaked via Google Cache

I’m a huge fan of WordPress and Google.

While perusing my blog’s WordPress stats recently, I noticed that my opinion piece on the creation of the Open Grid Forum (OGF) was receiving interest.

On Googling “open grid forum”, my GRIDtoday article rated as the number three result. In first place was the OGF’s Web site itself, and in second place a breaking news article on the OGF in GRIDtoday. Not bad, given that Google reports some 17.7 million results (!) for that combination.

This prompted me to Google “open grid forum lumb”. Not surprisingly, my GRIDtoday article rated first out of some 822 results. Following four results pointing to my blog, and one more to a Tabor Communications teaser, is the seventh result:

[gfac] FW: Final OGF Coverage Report
Harris also discusses a cynical article contributed by Ian Lumb of York University (formerly of Platform Computing Inc.), “Open Grid Forum: Necessary…but

http://www.ogf.org/pipermail/gfac/2006-July/000171.html – 12k – Cached – Similar pages

Somewhere between “… cynical article …” and a subject line that betrays an internal communication, my attention was grabbed!

So I clicked on the link and received back: “The requested URL /pipermail/gfac/2006-July/000171.html was not found on this server.” Darn!

Then I clicked on “Cached” … and:

This is G o o g l e‘s cache of http://www.ogf.org/pipermail/gfac/2006-July/000171.html as retrieved on 30 Sep 2006 05:14:59 GMT.
G o o g l e‘s cache is the snapshot that we took of the page as we crawled the web.

Excellent!

Below is an extract from the cached version of the page:

[gfac] FW: Final OGF Coverage Report

Linesch, Mark mark.linesch at hp.com
Thu Jul 6 16:15:25 CDT 2006

fyi mark
—–Original Message—–
From: Hatch, Marcie [mailto:Marcie.Hatch at zenogroup.com]
Sent: Thursday, July 06, 2006 1:08 PM
To: Linesch, Mark; Steve Crumb; tony.dicenzo at oracle.com; John Ehrig; Don Deutsch; Toshihiro Suzuki; robert.fogel at intel.com
Cc: Maloney, Nicole
Subject: Final OGF Coverage Report

Hi Team,

There have been nine pieces of total coverage resulting from the EGA/GGF merger announcement. The coverage has remained informative and continues to reiterate the key messages that were discussed during the press conference. Please note, the expected pieces by Patrick Thibodeau of Computerworld and Elliot King of Database Trends and Applications have not appeared, to date.

GRIDToday has featured four different pieces as a result of the announcement. Editor Derrick Harris summarized the various stories in an overview, providing the details of the announcement and points to the overall importance of grid computing. Harris also discusses his Q&A with Mark regarding the next steps for the OGF, the pace of standards adoption and how the OGF plans to balance the concerns of the commercial community with those of the research community.

Harris also discusses a cynical article contributed by Ian Lumb of York University (formerly of Platform Computing Inc.), “Open Grid Forum: Necessary…but Sufficient?” Lumb uses his experience working for Platform as a basis for his pessimistic outlook on grid computing. He states, “I remain a grid computing enthusiast, but as a realistic enthusiast, I believe that grid computing sorely needs to deliver definitive outcomes that really matter.”

Please let us know if you have any questions.

Kind regards,

Marcie Hatch

According to their Web site: “ZENO is a new-style communications company.” (Indeed!) And presumably, Marcie Hatch is one of their representatives. In this internal communication of the OGF’s Grid Forum Advisory Committee (GFAC), Ms. Hatch relays to OGF president and CEO Mark Linesch and colleagues her assessment of the coverage on the Enterprise Grid Alliance / Global Grid Forum merger announcement.

In the first paragraph of Ms. Hatch’s message, it is revealed that there have been nine items on the merger, although at least two more items were anticipated. The second paragraph introduces the coverage in GRIDtoday, and in the third paragraph, explicit reference to my GRIDtoday article is made. Before commenting on Ms. Hatch’s assessment of my article, let’s review how GRIDtoday editor Derrick Harris contextualized it originally:

However, not everyone is wholly optimistic about this new organization. Ian Lumb, former Grid solutions manager at Platform Computing, contributed an opinion piece questioning whether the OGF will be able to overcome the obstacles faced by the Grid market. While most in the Grid community are singing the praises of the OGF — and for good reason — it is nice to have a little balance, and to be reminded, quite honestly, that it will take a lot of work to get Grid computing to the place where many believe it should be.

Even with the benefit of hindsight, and Ms. Hatch’s assessment, I remain very comfortable with Harris’ contextualization of my article. And because it’s difficult to take a cynical spin from his words, I must assume that the cynical assessment derives from Ms. Hatch herself. For a variety of reasons, it’s very difficult for me to get through Ms. Hatch’s next sentence, “Lumb uses his experience working for Platform as a basis for his pessimistic outlook on grid computing,” without laughing hysterically. I’m not sure how Ms. Hatch arrived at this assessment, as I appended the following to my GRIDtoday article in my bio:

Over the past eight years, Ian Lumb had the good fortune to engage with customers and partners at the forefront of Grid computing. For all but one of those eight years, Lumb was employed by Platform Computing Inc.

Now that’s a fairly positive spin for a cynic, and one that can be attested to by the Platform colleagues, customers and partners I interacted with. In re-reading my article, and indeed the earlier allusion to Platform in it, I believe it’s fairly clear that Ms. Hatch was unable to appreciate the Platform context. To reiterate: I needed to step away from the community so that I could appreciate the broader business and technical landscape. Ironically, even the OGF has acknowledged this broader landscape directly through the first of their two strategic goals. Ms. Hatch concludes her paragraph on my GRIDtoday article by quoting me directly. Not only is the quote not an entirely cynical one; it expresses sentiment that was conveyed by numerous others around the recent GridWorld event.

Not too surprisingly, I suppose, my GRIDtoday article did not make the “OGF News” page. Ironically, however, Globus Consortium president Greg Nawrocki’s blog post did:

July 2006 InfoWorld.com Blog, “A Broader Scope Needed for Grid Standards Bodies”
http://weblog.infoworld.com/gridmeter/archives/2006/07/a_broader_scope.html

Greg’s blog entry starts off: “There is a great article in a recent GRIDtoday from Ian Lumb detailing the Open Grid Forum’s necessity but questioning its sufficiency.”

For those of you who’ve read this far, I feel I owe you some lessons learned in closing, so here goes:

  • PR companies may position what they think you want to hear, but not necessarily what you need to hear – Engage in your own due diligence to ensure that their assessment matches your assessment, especially on matters that have any technical content.
  • OGF’s tagline is “Open Forum, Open Standards” – Hm?
  • Google results may inflate perspective, but Google cache delivers the goods – Semantics aside, is there any credibility in 17.7 million results for an entity created this past July? (I just re-ran the query and we’re up to 19.1 million results. Not bad for a few hours!) Google cache allowed me to view a mailing-list archive that, I expect, should’ve been off limits.

Cynically yours, Ian.

Gridness As Usual

In the wake of GridWorld, Intel’s Tom Gibbs writes in GRIDtoday:

The people toiling in the trenches in the Grid community have long stopped caring what it’s called or how folks outside the community think about it. They’re too busy making it work and haggling through standards needed to make it work better.

Understandable. However, if Grid Computing is to rise out of Gartner’s “Trough of Disillusionment”, a customer-centric perspective is needed. Based on their strategic priorities, even the Open Grid Forum acknowledges this.

NIST’s Guide to Secure Web Services

NIST has recently released a Guide to Secure Web Services. Their Computer Security Division describes the document as follows:

NIST is pleased to announce the public comment release of draft Special Publication (SP) 800-95, Guide to Secure Web Services. SP 800-95 provides detailed information on standards for Web services security. This document explains the security features of Extensible Markup Language (XML), Simple Object Access Protocol (SOAP), the Universal Description, Discovery and Integration (UDDI) protocol, and related open standards in the area of Web services. It also provides specific recommendations to ensure the security of Web services-based applications.

Writing in Network World, M. E. Kabay extracts from the NIST report:

Perimeter-based network security technologies (e.g., firewalls, intrusion detection) are inadequate to protect SOAs [Service Oriented Architectures] … SOAs are dynamic, and can seldom be fully constrained to the physical boundaries of a single network. SOAP … is transmitted over HTTP, which is allowed to flow without restriction through most firewalls. Moreover, TLS [Transport Layer Security], which is used to authenticate and encrypt Web-based messages, is unsuitable for protecting SOAP messages because it is designed to operate between two endpoints. TLS cannot accommodate Web services’ inherent ability to forward messages to multiple other Web services simultaneously.
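
Kabay’s extract is the crux: TLS protects a single connection between two endpoints, whereas a SOAP message may pass through several intermediaries, so the protection has to travel with the message itself. The sketch below shows the underlying mechanism, an enveloped XML digital signature, using the javax.xml.crypto.dsig API that ships with Java SE; the input file name and the throwaway key pair are placeholders, and a production Web service would layer WS-Security on top of this rather than hand-rolling it:

```java
import java.security.KeyPair;
import java.security.KeyPairGenerator;
import java.util.Collections;
import javax.xml.crypto.dsig.*;
import javax.xml.crypto.dsig.dom.DOMSignContext;
import javax.xml.crypto.dsig.keyinfo.KeyInfo;
import javax.xml.crypto.dsig.keyinfo.KeyInfoFactory;
import javax.xml.crypto.dsig.spec.C14NMethodParameterSpec;
import javax.xml.crypto.dsig.spec.TransformParameterSpec;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;

public class MessageLevelSigner {
    public static void main(String[] args) throws Exception {
        // Parse a hypothetical SOAP message; XML-DSig requires namespace awareness.
        DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance();
        dbf.setNamespaceAware(true);
        Document doc = dbf.newDocumentBuilder().parse("soap-message.xml");

        // A throwaway key pair for the sketch; real deployments use a managed keystore.
        KeyPair kp = KeyPairGenerator.getInstance("RSA").generateKeyPair();

        XMLSignatureFactory fac = XMLSignatureFactory.getInstance("DOM");

        // Reference the whole document (URI "") with an enveloped transform,
        // so the signature is carried inside the message it protects.
        Reference ref = fac.newReference("",
            fac.newDigestMethod(DigestMethod.SHA1, null),
            Collections.singletonList(
                fac.newTransform(Transform.ENVELOPED, (TransformParameterSpec) null)),
            null, null);

        SignedInfo si = fac.newSignedInfo(
            fac.newCanonicalizationMethod(CanonicalizationMethod.INCLUSIVE,
                (C14NMethodParameterSpec) null),
            fac.newSignatureMethod(SignatureMethod.RSA_SHA1, null),
            Collections.singletonList(ref));

        KeyInfoFactory kif = fac.getKeyInfoFactory();
        KeyInfo ki = kif.newKeyInfo(
            Collections.singletonList(kif.newKeyValue(kp.getPublic())));

        // Attach the signature to the document element and sign.
        DOMSignContext ctx = new DOMSignContext(kp.getPrivate(), doc.getDocumentElement());
        fac.newXMLSignature(si, ki).sign(ctx);
    }
}
```

Because the signature is embedded in the document, any hop along the way can verify it; that end-to-end property is exactly what point-to-point TLS cannot offer.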

The NIST document includes a number of recommendations, five of which Kabay highlights:

  • Replicate data and services to improve availability.
  • Use logging of transactions to improve accountability.
  • Use secure software design and development techniques to prevent vulnerabilities.
  • Use performance analysis and simulation techniques for end-to-end quality of service and quality of protection.
  • Digitally sign UDDI entries to verify the author of registered entries.

The NIST document definitely warrants consideration by anyone developing Web services.

Licensing Commercial Software for Grids: A New Usage Paradigm is Required

In the Business section of last Wednesday’s Toronto Star, energy reporter Tyler Hamilton penned a column on power-based billing by datacenter services provider Q9 Networks Inc. Rather than bill for space, Q9 chief executive officer Osama Arafat is quoted in Hamilton’s article stating:

… when customers buy co-location from us, they now buy a certain number of volt-amps, which is a certain amount of peak power. We treat power like space. It’s reserved for the customer.

Power-based billing represents a paradigm shift in quantifying usage for Q9.

Along with an entirely new business model, this shift represents a calculated, proactive response to market realities; to quote Osama from Hamilton’s article again:

Manufacturers started making the equipment smaller and smaller. Customers started telling data centre providers like us that they wanted to consolidate equipment in 10 cabinets into one.

The licensing of commercial software is desperately in need of an analogous overhaul.

Even if attention is restricted to the relatively simple case of the isolated desktop, multicore CPUs and/or virtualized environments are causing commercial software vendors to revisit their licensing models. If the desktop is networked in any sense, the need to recontextualize licensing is heightened.

Commercial software vendors have experimented with licensing locality in:

  • Time – Limiting licenses on the basis of time, e.g., allowing usage for a finite period with a temporary or subscription-based license, or time-insensitive usage in the case of a permanent license
  • Place – Limiting licenses on the basis of place, e.g., tying usage to hardware via a unique host identifier (a minimal sketch of both localities follows this list)
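
As promised above, here is a minimal sketch of what those two localities amount to in code. The license record, its field names and the use of a MAC address as the “unique host identifier” are all invented for illustration:

```java
import java.net.InetAddress;
import java.net.NetworkInterface;
import java.time.LocalDate;

public class LicenseCheck {

    // Hypothetical license record: an expiry date (locality in time) and a
    // MAC address the license is tied to (locality in place).
    record License(LocalDate expires, String hostMac) {}

    static boolean isValid(License lic) throws Exception {
        // Locality in time: a temporary or subscription-based license stops
        // working once the window has passed.
        if (LocalDate.now().isAfter(lic.expires)) {
            return false;
        }

        // Locality in place: compare against this machine's MAC address,
        // one common stand-in for a unique host identifier.
        NetworkInterface nic =
            NetworkInterface.getByInetAddress(InetAddress.getLocalHost());
        if (nic == null || nic.getHardwareAddress() == null) {
            return false; // locality can't be established, so fail closed
        }
        StringBuilder mac = new StringBuilder();
        for (byte b : nic.getHardwareAddress()) {
            mac.append(String.format("%02X", b));
        }
        return mac.toString().equalsIgnoreCase(lic.hostMac);
    }

    public static void main(String[] args) throws Exception {
        License lic = new License(LocalDate.of(2008, 12, 31), "001122334455");
        System.out.println("license valid on this host: " + isValid(lic));
    }
}
```

Both checks break down the moment the licensed software migrates to another node, which is exactly what happens in a grid.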

Although commercial software vendors have attempted to be responsive to market realities, there have been only incremental modifications to the existing licensing models. Add to this the increased requirements emerging from areas such as Grid Computing, as virtual organizations necessarily transect geographic and/or organizational boundaries, and it becomes very clear that a new usage paradigm is required.

With respect to licensing, commercial software vendors find themselves in a situation not unlike Q9’s prior to the development of power-based billing. What’s appealing about Q9’s new way of quantifying usage is its simplicity and, of course, its usefulness.

It’s difficult, however, to conceive of such a simple yet effective analog in the case of licensing commercial software. Perhaps this is where the Open Grid Forum (OGF) could play a facilitative role in developing a standardized licensing framework. To move swiftly towards tangible outcomes, however, the initial emphasis needs to be on a new way of quantifying the usage of commercial software, one that is not tailored to idealized and/or specific environments.

Licensing Commercial Software: A Defining Challenge for the Open Grid Forum?

Reporting on last week’s GridWorld event, GRIDtoday editor Derrick Harris states: “The 451 Group has consistently found software licensing concerns to be among the biggest barriers to Grid adoption.” Not surprisingly then, William Fellows (a principal analyst with The 451 Group) convened a panel session on the topic.

Because virtual organizations typically span geographic and/or organizational boundaries, the licensing of commercial software has been topical since Grid Computing’s earliest days. As illustrated below, traditional licensing models assume a single organization operating in a single geography (lower-left quadrant). Any deviation from this, as illustrated by any of the other quadrants, creates challenges for these licensing models, as multiple geographies and/or multiple organizations are involved. Generally speaking, the licensing challenges are most pronounced for vendors of end-user applications, as middleware needs to be pervasive anyway, and physical platforms (hardware plus operating system) have a distinct sense of ownership and place.

[Figure: licensing quadrants for virtual organizations – single versus multiple organizations against single versus multiple geographies]

The uptake of multicore CPUs and virtualization technologies (like VMware) has considerably exacerbated the situation, as it breaks the simple, per-CPU licensing model employed by many Independent Software Vendors (ISVs), as illustrated below.

[Figure: per-CPU licensing models broken by multicore CPUs and virtualization]
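
The breakage is easy to demonstrate: ask a runtime how many processors it sees. In the minimal Java sketch below, the entitlement figure is hypothetical:

```java
public class PerCpuLicense {
    public static void main(String[] args) {
        // The JVM reports logical processors (cores, possibly times hardware
        // threads), not physical CPU sockets; inside a virtual machine it
        // reports whatever the hypervisor chooses to expose.
        int logical = Runtime.getRuntime().availableProcessors();

        int licensedCpus = 2; // hypothetical entitlement under a per-CPU model

        System.out.printf("licensed for %d CPUs, host reports %d logical processors%n",
            licensedCpus, logical);

        // A naive per-CPU check treats one quad-core chip, or a VM granted
        // four virtual CPUs, as four licensable CPUs:
        if (logical > licensedCpus) {
            System.out.println("license check fails, though only one socket may be present");
        }
    }
}
```

In other words, per-CPU licensing counts something that neither the hardware nor the hypervisor reports unambiguously any longer.
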
In order to make progress on this issue, all stakeholders need to collaborate towards the development of recontextualized models for licensing commercial software. Even though this was apparently a relatively short panel session, Harris’ report indicates that the discussion resulted in interesting outcomes:

The discussion started to gain steam toward the end, with discussions about the effectiveness of negotiated enterprise licenses, metered licensing, token-based licenses and even the prospect of having the OGF develop a standardized licensing framework, but, unfortunately, time didn’t permit any real fleshing out of these ideas.

Although it’s promising that innovative suggestions were entertained, it’s even more interesting to me how the Open Grid Forum (OGF) was implicated in this context.

The OGF recently resulted from the merger of the Global Grid Forum (GGF) and the Enterprise Grid Alliance (EGA). Whereas Open Source characterizes the GGF’s overarching culture and philosophy regarding software, commercial software more aptly represents the former EGA’s vendor-heavy demographics. If OGF takes on the licensing of commercial software, it’s very clear that there will be a number of technical challenges. That OGF will also need to bridge the two solitudes represented by the former GGF and EGA, however, may present an even graver challenge.