on User Experience design

Opinions on a subject close to one's heart can be quite difficult to express. UX certainly is one I've procrastinated on for a long time and probably will never be able to capture in a short neat post. However, I feel compelled to start boiling it down hoping that in time a well distilled elixir of love will emerge. When unable to precisely define a notion of what one stands for, it is always best to eliminate the superfluous and the opposites from the spectrum of likely candidates for core interests and beliefs.
Rather than the “look and feel” of landing pages or navigation issues, which appear to dominate many UX debates these days, transitions of interactive devices and mechanisms in user workflows, e.g. content authoring tasks, are of particular interest to me. The nature of users' dialogue with the contraptions facilitating task-oriented actions within the system is an increasingly important aspect, especially when we consider their impact on the extent of engagement in creating content.
Not intending to discount the significance of navigation, wayfinding and landing page design, I'll focus on the wizards enabling authoring, ongoing maintenance and organisation of ad-hoc created content on the web from a user perspective. A few issues immediately come to mind and given time I'll attempt to attribute those to aspects of success and failure in the context of user adoption and capability to evolve with changing needs. The cumbersome authoring process of this system (blogspot.com) and blogging systems at large should provide ample opportunity to discuss failed workflows in contrast with later generations of Social Media sites.
I suspect that much of the failing User Experience can be directly attributed to system design based on code engineering tasks as opposed to usability principles and the User Centred Design process in particular. This leads me to the role and the general fit of UX practice in business culture and an interesting paper by Zafer Bilda on this very subject. The report comprises a number of propositions Bilda presented to a group of 15 UX practitioners in order to solicit their views. I felt motivated enough to respond to the issues raised myself and in part to address the answers he obtained from the group. Read the original paper and my comments on “Chris Khalil’s Musings” blog.

agile or fragile software development

Agile methodology is widely accepted and has been adapted to many specific applications. The motivation to develop “lightweight” processes for accelerated production of software led to the formulation of the very ambitious Agile Manifesto in 2001. 1999 was a low point for the software development community: not only did revisiting old code to hunt for Y2K bugs seem like punishment at the peak of the .com boom, but more critically, the process exposed fundamental weaknesses of the documentation heavy processes of the past. It was time for a disruptive change; the pragmatism behind a general reluctance to adopt new approaches had evidently failed us.
The appeal of Agile was by no means a discovery of new techniques, we've seen or done it all before, but this time a methodology was elegantly distilled into a few pages rather than a book. The Agile Manifesto offered a convincing vision of preferred values and principles focusing on collaboration, responsiveness, the individuals and their roles instead of rigid processes and documentation. Other benefits ascribed to Agile include satisfaction through rapid delivery and high ROI (return on investment).
______________________
more agile
The informal and often cuddly nature of the face-to-face workflow has at times been vulnerable to parody in its less successful applications, but the wide acceptance speaks for itself. The long emphasised focus on the user and the problem domain is finally an integral part of the process and not an 11th hour contractual necessity. From a design point of view, this aspect alone opens up the scope for exploration of truly creative solutions. Shifting from the closed shop research of the problem space towards stakeholder interaction, enables the developers to go beyond delivering 'what the client asked for' to gain the insights which increase the likelihood of discovering innovative and disruptive outcomes.
______________________
but still fragile
But isn't all this bonding and hugging amongst stakeholders just as fragile as other partnerships we form in everyday life? Well, yes it is, and either party may shoulder the blame. On the developer side, the integrity of the routine daily progress/problem updates can be compromised for various reasons, including the reliance on interpersonal relationships. However, the client representative role is far more vulnerable to failure. The mantra and reality of the partnership and the sense of engagement will inevitably lead to emphatic attachment to the notion of validity and value in the project, without the safety net of ongoing checks and balances from peers. Further, the client-side stakeholder isn't necessarily representative of the end-user or, worse, the prospective user may for now be just a speculative 'persona' profile.
The methodology falls short of full coverage in a number of key scenarios of the product development lifecycle and delivery situations. The Agile stereotype of client–developer relationships, typical of the last century, e.g. building business process software, is not universal. The above-mentioned vendor as a commissioning party and the direct-to-public deployment paradigms are increasingly common in today's market, yet neither seems to fit the magical stakeholder interaction scheme. My deliberations on those two examples revealed the most glaring omission. The inherent rush to produce application code, due to the value placed on software as a primary measure of progress, appears to overlook the need to integrate the critical early stages of the process: research, idea generation, design exploration and prototyping.
______________________
what about design?
It is one thing to declare catchy principles like openness to changing requirements and regular adaptation to changing circumstances, yet it is an absolute mistake to embark on code without design and prototype validation. Under those circumstances, the unexpected manifestation of changes and the necessary adaptations are almost guaranteed and consequently would compromise the expected delivery timeline, ROI and ultimately result in poorly designed and unmaintainable code. Yes, the core aspect of Agile – iterations and the prescribed code refactoring – may address the latter problem to some degree, but surely not without further compromises of the primary objectives.
I'd like to entertain the notion that Agile methodology can be readily mapped to the design and prototyping stages, just as it has been adapted to other processes. However, besides the strong appeal of the core values and user-centric principles, I fail to see further complementary features. Perhaps it is time for the early stages of development to be specified as additional steps extending the Agile iteration methodology at a cost of diminished elegance and simplicity.
______________________
is there a fix?
Many designers agree that the correct way to accommodate design in Agile is to define “an initial phase zero that takes a long-term and holistic view of the project needs before moving into agile process” [Bernard C., Summers S., Dynamic Prototyping with Sketchflow in Expression Blend]. A common approach, in real-life application, is to embellish Agile with attributes of a familiar or industry recognised process, for example, see “Kanban Development Oversimplified” or “How Scrum Induced Sub-Optimization Kills Productivity (And How To Fix It With Kanban)”. However, none of those examples cover the User Centric Design model. Unfortunately, all attempts at pimping up Agile with 'process' simply water down and often compromise its core principles and create a patchwork of less appealing prescriptions rather than a crisp well defined methodology.

to app or not to app, that is the smartphone question

______________________
cool code in the cold
At present, there seems to be a lot of dispute about the appropriate technology to implement smartphone oriented RIAs (rich internet applications). The debate is especially valid in terms of the 'me too' rush to develop iPhone and iPad apps instead of mobile device optimised websites. This picture is muddy indeed when we consider the decision-making behind the tendency to favour the native app paradigm and especially the iOS platform. Let's look at a similar credibility dilemma in the recent past: the emphasis on deploying Flash websites, against the better judgement of many internet programmers.
In the days when the 'suits' had no idea of what internet technology objectives were, and many felt quite proud of this since they had secretaries to type and print emails, decisions were all too often left to the young 'propeller heads' with little experience and even less expertise. Delegating to the technologists aware of the big picture in terms of network applications or User Experience seemed just too hard, especially when the request was to “show something by Friday”.
The cool “look and feel” designers stepped in without hesitation. Many had Macromedia Director skills and felt quite comfortable with applications that didn't print and broke the fundamental interactive UI event behaviours. And so, the user got the 'look' in the form of “Wait.. loading Flash” and 'feel' trying to work out how to scroll the information clipped in a tiny fixed size box.
Luckily, by the time the legacy Netscape was buried and the browsers imitating the IE DOM provided a nearly uniform client-side scripting environment, the industry had begun to exploit the dynamic possibilities of HTML in combination with the XMLHttpRequest object. The 'suits' got a new phoney moniker to talk about; “Web2” and the web became a hub of Social Networks. The ECMAScript committee was shaken up by Microsoft, Google and Yahoo developers who intended to adhere to a deliverable standard, forcing the committee to abandon the ES4 utopia and accept a more realistic roadmap of critical revisions to ES3, which bore the ES5 release in 2009.
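The dynamic HTML plus XMLHttpRequest combination boils down to a very small pattern. The sketch below is illustrative only: `loadFragment`, the injected `makeXhr` factory and the `render` callback are invented names, and the transport is passed in so the snippet can also run outside a browser; in a real page one would pass `() => new XMLHttpRequest()` and have `render` assign to an element's `innerHTML`.

```javascript
// The classic "Web2" pattern: fetch a fragment asynchronously and
// inject it into the live page without a full reload.
// The XHR factory is injected (a hypothetical name) so this sketch
// can be exercised outside a browser as well.
function loadFragment(makeXhr, url, render) {
  const xhr = makeXhr();
  xhr.open("GET", url, true); // true = asynchronous
  xhr.onreadystatechange = function () {
    // 4 = DONE; only inject on a successful response
    if (xhr.readyState === 4 && xhr.status === 200) {
      render(xhr.responseText);
    }
  };
  xhr.send(null);
}
```

The significance is not the half-dozen lines but the fact that the page stays live while the request is in flight, which is precisely what made the 'Web2' sites possible once the DOM behaved uniformly across browsers.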
The final nail in the coffin for Flash is being hammered in by Steve Jobs. The 'special' relationship between Apple and Adobe (now the owner of Flash) reached a high point when Jobs expressed the commitment not to support Flash on iOS.
______________________
ecosystems deja vu
For a while it seemed that we had the future direction mapped out. But suddenly, the suits began to notice the outstanding growth figures as Apple introduced the new status symbol on the block. The techs lamented – a PDA with GSM not even 3G, UI designers cried – the icons all have the same shape, women frowned – the fingernails get in the way and the calls drop. And yet, the iPhone boosted street credibility in a prolific way and at the same time proved to be the most disruptive technology in a long time.
Tempted by Apple's market success (despite, or more likely due to, the device's high cost) and the hefty 30% margin on third party app sales and subscription revenues, the manufacturers are busy clambering up the greasy pole building their own ecosystems. Quite apart from my semantic objections to using the term ecosystem (the original meaning is a topic more alive now than ever before), the concept of proprietary device or system oriented marketplaces for digital products bluntly contradicts the principles of User Centric Design. The notion that consumer choice and their needs are the primary objectives simply falls by the wayside when you lock them to a proprietary procurement and service channel.
The choice between native apps and cell phone optimised websites (web apps) should be based on technical requirements and user tasks, but in reality it is anything but. For one, when a company director or CEO asks for an iPhone app, very few employees have the guts to ask if the decision was purely influenced by the 12-year-old technology guru at home. A truly 'smart' phone screen could be a combination of 'gadgets' for live data feeds and links to commonly used web URLs, not unlike a fusion of iGoogle and the Chrome opening page. However, since the phone manufacturers suffer from amnesia when it comes to failures of locked-in online environments, e.g. Apple's eWorld or the Microsoft Network, we have to re-learn the lessons of 1995. Thank God that Steve Jobs is on-board this time, even if his health isn't quite what it was in 1996 when he returned to Apple and scrapped the Newton MessagePad.
I realise that statements like this are very controversial and deserve a lot more than an off-the-cuff post. However, in the interest of brevity I want to limit it to just a few key points. So besides the “customer is king” argument, what is the major ecosystem problem?
The business culture relating to Intellectual Property law. Just look at Apple; with 250,000 apps on offer it is apparent that some of them may violate registered patents and Apple is in no position to verify the legitimacy of every offering it sells. So far, when an app becomes a subject of an IP claim Apple pulls the app from the store and in many cases seeks to be admitted as intervenor in the legal case. There are countless reasons for Apple to be involved considering the business size and immaturity of many developers behind these apps. The question is, can and should Apple's (Google and Microsoft are in the same boat) IP people deal with the growth in patent assertions by licensing companies and patent trolls? My guess is obvious I hope.
______________________
the silver lining behind proprietary technology
The native app development environments aim to streamline the implementation and delivery, as well as enhance UI presentation on the respective platform. These intentions are very worthy, but are they effective considering the need to deploy three or more app versions and engage multiple vendors? Microsoft is the last to join Apple, Google and Nokia in fielding its own ecosystem and back-end cloud. The Windows Phone could become more important now that Nokia adopted it for their products. Even if this choice was strongly influenced by monetary benefits of the partnership, its technical advantages, if any, will constitute the difference the user may experience. Perhaps sharing a few thoughts about WP7 technology will help to illustrate my point.
Personally, I don’t believe that the new focus on the phone as an entertainment device, in preference to the earlier emphases on business/enterprise communications, is anything but an iPhone panic. The new architecture appears to be aimed at single user games and trivial commerce apps. The Windows Phone 7 platform is based on Silverlight, and XNA for applications requiring 3D support. These two development paths, 2D or 3D, are quite distinct and don't allow for mid-stream migration. The single-tasking runtime environment (tombstoned when the user switches to another app) is also limited to self-contained executables, sand-boxed (exclusive memory and local storage of persistent data) to enhance security and stability, but as a result severely crippled in scope. In plain English, this paradigm hardly looks capable of enabling the next big thing beyond Web2.
Further, just like the Apple, Google and Nokia ecosystems, Windows Phone is totally reliant on the Microsoft controlled delivery channel via the Windows Azure back-end cloud. By and large, I dismiss the commonly expressed 'big brother' fears regarding cloud services; however, the platform locked user experience objections expressed above remain.
______________________
clouds on the horizon
The cloud paradigm will no doubt be the basis for the next generation of RIAs and provide ample opportunities for new innovative network applications beyond Social Networking and spanning across all forms of devices we happen to have on hand. I believe that in the near term future, HTML5 will become the favoured technology for mobile oriented RIAs over deployment of multiple native apps. For now, the wish list priority focuses on wider HTML5 and SVG support.
As far as the more mature cloud application candidates for the smartphone boom - Microsoft Windows Live or Gmail and Google Apps/Docs hold a lot of promise and yet in business settings I’m still a little dubious at best. As expected, Microsoft manages to meet some formidable design objectives, e.g. 'simplicity' - least effort in adoption since that is the scarcest resource and yes the 'price' is right. But Google services provide a better example for a number of reasons; the claimed high adoption rate, more complete functionality and above all, a business model that doesn't compete with a core legacy revenue source. However, the monetisation model based on advertising isn't free from hindrances when one considers the targeting mechanism. Using context derived from seemingly private content constitutes an obvious problem for anyone with Intellectual Property concerns (clients’ or one's own).
Commercial contract based cloud computing (SaaS) is another story, especially for companies culturally struggling with IT departments presenting a minefield of 'yes men' too afraid to raise an issue or too lazy to address problems.
Office 365, introduced recently (June 28 2011) as the replacement for the Business Productivity Online Suite (communication and collaboration tools for MS Office), now includes Office Web Apps allowing online document viewing (including on smartphones) and editing (albeit limited), at a reasonable price. This package will be a more attractive option for business startups than for companies with legacy in-house data stores, which will no doubt find the migration a daunting task. Moving the data from office servers to the cloud provides countless management shortcuts; however, the process implies a substantial shift in the IT culture and of course requires a 'big decision' to commit to that change. Further, there are additional costs and effort accompanying the deployment, e.g. business process related manuals, documentation and training.
To be fair, all these meta issues are equally relevant to Google Apps/Docs, in addition to some basic user skill quirks. So far, the feature-set of the Google platform constitutes a superior candidate for a replacement of local desktop apps. But, for many users, the close coupling of local MS Office Applications with Web Apps may provide the 'least pain' path in moving towards the cloud paradigm, e.g. supporting teleworking from home.
The collaborative capabilities of the two offerings are very hard to compare. The MS check-in / check-out workflow model is easier to adopt and understand through an analogy with the 'track changes' function in regular apps, and yet Google provides real-time editing by multiple users – a more powerful feature; however, in many cases it may take some time to capitalise on these strengths. My experience with Google Wave (same capability) proved that exposing users to tools which are far removed from their familiar ground introduces hard to overcome barriers and often a sense of disempowerment, resulting in a reluctance to learn.
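The check-in / check-out model is easy for users to reason about precisely because it can be stated as a trivial lock. The toy sketch below is purely illustrative; `SharedDocument` and its methods are invented names and bear no relation to the actual SharePoint or Office 365 API.

```javascript
// A toy model of the check-in / check-out workflow: one editor at a
// time holds the document, everyone else reads the last checked-in
// version -- much like a 'track changes' file passed around by one
// author at a time. All names here are invented for illustration.
class CheckedOutError extends Error {}

class SharedDocument {
  constructor(text) {
    this.text = text;   // last checked-in content, visible to all
    this.holder = null; // user currently editing, if any
  }
  checkOut(user) {
    if (this.holder) throw new CheckedOutError(`locked by ${this.holder}`);
    this.holder = user;
    return this.text;   // a private working copy
  }
  checkIn(user, newText) {
    if (this.holder !== user) throw new CheckedOutError("not the holder");
    this.text = newText; // changes become visible to everyone
    this.holder = null;
  }
}
```

Real-time co-editing, by contrast, has no lock to point at: concurrent changes must be merged continuously, which is both its power and, for users raised on 'track changes', its unfamiliarity.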
We will see the future before long.

For those interested in the technicalities discussed here, Anthony Franco's blog posts “Mass Confusion: The hysteria over Flash, Silverlight, HTML 5, Java FX, and Objective C” and “Mobile Strategy Best Practices” are well worth a read. Also see the mobiThinking article “Mobile applications: native v Web apps – what are the pros and cons?” or these slide presentations (both are on slideshare.net so you can read text outlines instead of flicking slides) on platform specific HTML apps: “HTML5 is the Future of Mobile, PhoneGap Takes You There Today” by Davy Jones and “Building HTML based mobile phone applications” by Mikko Ohtamaa, plus a general Mobile RIA overview, “Smart phone development”, by Myles Eftos. A year has passed since writing this post and I finally found a programmer's view on this dilemma, very well expressed in a slide presentation and an article describing a concept of TransMedia Applications by Martin Fowler of ThoughtWorks.

doodling and sketching

The act of sketching is an integral part of the creative process at the early stages of concept development. Three aspects of sketching constitute salient and unique characteristics.
- First, intuitive externalisation – subconscious exploration of the problem space and dynamic idea generation.
- Second, ambiguity of components and their relationships – implying fluidity of the captured structure and inviting ongoing interpretation.
- Third, expression of an unfinished/rough state – a form still open to changes and likely to stimulate further exploration and one which clearly isn't indicative of the aesthetic intentions for the final design.
The first is by far the most critical aspect and the one least understood. The physical aspect of sketching is often recognised as a bridge between the idea 'in the head' and its representation in the world. The process of informally expressing uncertain ideas appears to have an energising quality: one experiences the emergence of new insights and a dialogue with the resulting form.
I'll offer an analogy which may help to explain this phenomenon: doodling on paper when one is waiting or bored presents a comprehensible scenario. The activity begins without specific intentions and is predominantly a physical act, but in time it transforms towards an explicit refinement of the emerging features. The later intentions are motivated by subliminal exploration of the rough form, which itself provokes or suggests possible interpretations.
An issue related to this analogy arises in context of a widely accepted hypothesis that conceptual sketching with computer tools takes on a different and an inferior work-flow pattern. For example, Vinod Goel of UC Berkeley studied designers generating ideas by sketching on paper or using a drawing program. He noted that the designers sketching freehand quickly followed the initial idea with several versions. However, those who used a drawing program tended to focus on refining the initial design, instead of generating additional variations.
Therefore, the question is; do we have any evidence that people doodle with their computer tools? I've never observed this to be true in my extensive experience with graphic and design professionals who are constantly exposed to an environment most suitable to facilitate behaviours like doodling. Perhaps my analogy confirms the hypothesis of the diminished productivity in computer based idea generation and innovative problem solving tasks.

effective tools for creative design

A well designed tool is in effect 'invisible' in its application. The user embodies the utility (physical or cognitive) and performs their task in synergy with the tool, unaware of the actual artefact. A simple example may be a hammer: the user doesn't observe the constantly changing physical situation, but rather hammers away as if their arm naturally included the tool's utility. Subtle feedback of the actions performed with the tool is critical to support that state. In the ideal situation, a level of trust arises in the process, allowing for a degree of risk inherent in the operation to be readily overlooked. As a result, the intent of the high level task constitutes the only concern for the user.
In the context of creative design, the only such tool remains the pencil/pen. Unlike routine production work (CAD, image manipulation, etc.), the tasks of generation, exploration and interpretation in the design process are not well supported by computer based tools. Even the most basic and essential act of sketching feels stilted and non-expressive using a computer. It seems that a sense of ambiguity, avoided in the final product, is essential to the intuitive process of externalising and exploring early ideas. These pursuits appear to be severely compromised in the on-screen paradigm, irrespective of the additional capabilities on hand, e.g. versioning, undo or attribute editing.
A likely exception, in my experience, may be the creation of music. In its traditional form, it is focused on motor skills and dexterity directed at instrument interfaces and these can be readily integrated into a computer environment. However, even this proposition is debatable if we consider that the gestalt state of musical creation and performance appears to exist in the head and muscles on an intuitive level, quite apart from the creative problem solving paradigm. Rhythm and melody tend to adhere to various rules, similar to grammar in language, and these have no equivalents in common sense reasoning or general intelligence which underscore creative design. Consequently, while the design processes and methodologies can be readily taught, creativity just like Artificial Intelligence encounters impassable boundaries. Common-sense knowledge continues to elude our attempts to formulate, let alone formalise, a general set of context-free rules beyond discrete specialised heuristics. Similarly, the creative or inventive endeavours remain an elusive challenge as far as tools are concerned.

the dilemma of UI design based on metaphors

Originally published in 1996 on my personal website at infomania.net and www.thepla.net/~piotr

______________________
the problem
My first design of a user interface (UI) was the Sydney Water Control Database front-end implemented by Applied Control Systems in 1992. This demanding project confronted me with a significant dilemma, namely: metaphor based user interfaces, like the Sydney Water database, promise an immediate advantage of defining the system’s functions, purpose and current state to the user through a graphic illustration of the process involved. This methodology may seem appropriate to a broad range of UI implementations, but isn't always the optimal solution.
The problem continues to constitute one of the most important decisions in UI design projects. In my experience, the metaphor solution is only viable for systems whose actions are substantially reinforced or literally represented by the given metaphor. In many cases however, the user is destined to be burdened and fatigued by the superfluous visual information, especially in prolonged or recurrent practice. Additionally, the metaphor may not lend itself to accommodate all functions of the interface, a problem readily illustrated by the design of this website, see the logo and additional navigation elements it incorporates. (screen capture below)

A tabbed notepad metaphor website UI showing a detail from the Sydney Water Control Database front-end implemented as a symbolic metaphor representing the system structure.

______________________
the approach
The challenge therefore, is to define orientation and navigation methods specific to the context and to employ an appropriate level of abstraction with high visibility of the interactive elements within it. A critical objective in the design process is to identify correct idioms and meaningful symbolism for visual and verbal representation of the system at hand.
Every aspect of the interface must be aimed at sign-posting and guiding the user towards meeting their goals with minimum effort. Meeting these criteria, in my opinion, is more integral to the success of the concept than the often debated ‘look and feel’ or aesthetic characteristics and gimmicks which only attract the attention of first time users.

(C) 1996 Piotr Kulaga aka Charlette Proto.

the medium shapes the message

This essay was originally published in 1997 on my personal website at infomania.net and www.thepla.net/~piotr
______________________
more channels
Two dominant trends are currently shaping the ‘voice’ of our culture. On one hand, the previously clear market boundaries between computer software and service providers, media/entertainment industry and the telecommunication companies are constantly being eroded and on the other, the multiplicity of sources is escalating beyond expectations. While new technologies and media delivery routes are being introduced with varying degrees of acceptance we are witnessing an explosion in volume and diversity of source material produced.
When we examine the new ‘on-demand’ media delivery systems like the Internet, this argument takes on exaggerated proportions. Whether in pursuit of entertainment, personal interests or in research activities, Internet users establish millions of private channels in a combination of media environments at any given moment.
The logistics of the network-based media distribution platform allows information publishers and in particular those catering to marginal interest groups, to explore the opportunities for development of specific audience focused content. This as realised, further increases the choice, leading to growth in specialised services offered by mainstream publishing houses and stimulates the emergence of dedicated market-segment media providers.
______________________
a new kid
In my opinion, the new entrants amongst content and technology providers will continue to gain a substantial share of the changing entertainment market at the expense of the established media and software houses. This will be inevitable during the impending transformation and re-positioning of the respective industries.
The majority of creative output from the mainstream media organisations is currently driven by budgetary constraints synonymous with the established high cost production and concept development methods. As a result, many publishers rely on proven ‘formula’ productions and repackaging of material whilst adding value or re-focusing for the projected audience. These restrictions, more often than the creative talent, have defined existing media/entertainment markets, but they will not shape the less rigid network-based media operations.
Many problems will need to be overcome before the adaptation of existing programs for on-line access is at all possible. Obvious complications include copyright restrictions, royalty fees and moral rights arising from the specific contracts entered into for the production of the existing media material. More importantly, the very business structure and practices established in the media and publishing industry at large will hinder their initial ability to dominate the on-ramp marketplace.
______________________
making copies
Excluding TV broadcasting and cinema, the majority of media and software publishers rely on mass manufacturing, printing/packaging processes and distribution of 'copied' material as core infrastructure components of their operations. This commercial model is contrary to the ideals of online business practice.
In the virtual environment, ‘copies’ are generated on-demand and essentially do not represent a major part in the overall structure of the revenue generating processes. Whilst online delivery requires substantial resources like network bandwidth and server capacity to meet the demand for product (synonymous with the process of handling retail copies), these can be efficiently tailored as necessary, in complete transparency to the user.
A system geared to respond to the on-demand characteristics of the network can achieve the highest levels of operating revenue optimisation. Needing only to satisfy the total volume of requests for new material, regardless of the product mix or success of any individual item, gives publishers and resellers the most cost-effective model for distribution.
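The cost asymmetry argued above can be put in toy numbers. Everything below, the per-copy and per-download costs as well as the two-title catalogue, is invented purely for illustration.

```javascript
// Toy comparison of the two distribution models. All figures are
// hypothetical and chosen only to make the asymmetry visible.

// Physical distribution: every stocked copy is pressed, packaged and
// shipped up front, whether or not it ever sells.
function physicalCost(titles, costPerCopy) {
  return titles.reduce((sum, t) => sum + t.stocked * costPerCopy, 0);
}

// On-demand delivery: capacity only has to cover the copies actually
// requested; the product mix and individual flops are irrelevant.
function onDemandCost(titles, costPerServe) {
  const requests = titles.reduce((sum, t) => sum + t.sold, 0);
  return requests * costPerServe;
}

// A hit and a flop (hypothetical numbers):
const catalogue = [
  { name: "hit",  stocked: 10000, sold: 9500 },
  { name: "flop", stocked: 10000, sold: 500 },
];
```

With, say, $2 per pressed copy the physical model pays for all 20,000 stocked units regardless of demand, while at $0.50 per download the on-demand model pays only for the 10,000 copies actually requested; the flop's unsold stock simply never exists.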
Unfortunately, this scenario doesn’t easily translate to the traditional business practices and/or skill/employment structures established in the industry. The current workforce evolved around a warehouse of stock and a costly distribution chain and may soon be an obsolete link.
Additionally, the established pricing structures for software and entertainment titles reflect the inherent costs of manufacturing copies, packaging, distribution and the losses associated with unsold and returned stock, rather than the real cost of development and user support needed by the given product.
Due to the above factors, we should not expect the media or software ‘majors’ to deliver the revolutionary content in the immediate future, since they too are starting from scratch, if not outright constrained by their inherited infrastructure.
______________________
online cowboys
Some of the recent technology blunders have already become legendary. It is worth recalling the Apple eWorld and Microsoft Network episodes, which help us realise the extent to which the computer industry underestimated the impact of the Internet on its future. Put aside Microsoft’s ability to survive a 180-degree turn in its Net policy to become a monopoly player on all levels, and you will find that the thrust behind the major success stories has been the very nature of the medium itself.
The 1991 ‘High Performance Computing Act’ and the memorable address to the US Congress by the then Senator Al Gore on ‘The Data Superhighway’ set plans for the development of fibre-optic computer networks and stimulated many technologically competent entrepreneurs to start up companies devoted to the new medium. In an unforeseen grab for credence and recognition, small and not so accomplished service providers all over the world, able to capitalise on the policy push for ‘information-age’ technology, penetrated the established ‘blue chip’ IT business elite. The amazing rise of Netscape Corp. and Yahoo!, or the later examples of Sausage Software and Oz-Email in Australia, set a dangerous precedent for exaggerated expectations among market observers. These expectations, in fact, remain the major credibility hurdle for the online industry at the end of the 1990s.
The potential for commercial gain in the ‘virtual world’ is not obvious or by any means guaranteed. Outside of the US, where the necessary security technologies are exclusively available and home shopping has long been part of the culture, the majority of users shy away from commercial participation on public networks. At the same time, a number of international niche-market operators are successfully attracting immense public attention and avalanches of financial-market interest.
Most businesses pioneering sales on the Web, however, have been unable to demonstrate profit in their operations, and in some cases appear absolutely focused on attaining market presence alone. The Internet indeed offers a perfect backbone for the virtual shop-front but, in my opinion, the benefits of online commerce will never be realised by the likes of amazon.com and the online CD retailers, or their investors.
The full advantage of the network’s retail value and the efficiencies it promises can only be capitalised on by providing ‘soft’ product. The network offers a unique ability to conduct transactions and deliver on request any form of media which can be digitally encoded. This alters the operating criteria, especially for the software and entertainment publishers able to respond to the changing perspective.
However, this won't occur until products like .MP3 audio files and .MPG video are accepted by the industry as viable formats for distribution, and suitable commerce standards and regulations are enforced internationally. Only in this context will we see the dominant media publishers make any viable inroads on the network.
At present, it is predominantly providers working outside of traditional market restrictions, specialising in services catering to the esoteric needs of the public, who continue to explore new commerce opportunities and, together with the participating audience, constitute a large part of the overall growth of activity on the networks.
______________________
net rules ok!
In this new global playing field, it is increasingly difficult to evaluate your supplier; often nothing beyond their advertising or network presence is within reach. This creates an ideal climate for a budding industry in services and associated technologies, as well as in the content development and production areas.
A new marketing platform has emerged, setting its own standards. Online technical resources are fast replacing the traditional telephone and on-site product support once available from locally based suppliers. Practices like ‘download before you buy’ promise to improve efficiency by eliminating a substantial portion of misguided purchases, where incompatible or inappropriate product would otherwise have been selected. Whilst providing substantial savings for the publisher, this regrettably also shifts the responsibility for competency onto the consumer.
This trend is readily evident in the software industry, which increasingly chooses to provide only a ‘pay per minute’ help-desk for its product. The consumer may find this a cold reality, yet all publishers will follow Microsoft in the practice of ‘pay as you use’ support on freely distributed product. I'm not suggesting that anyone will go broke giving away software; possibly the opposite. We’ve known this argument since Henry Ford postulated: "I'll give everyone in the world a Model T if they guarantee they will buy the spare parts from me", and while this was far from practical with Ford vehicles, it presents a new vision for software delivered at the user’s expense.
______________________
lonely rider
The very nature of hyper-media empowers users to play a more active role as customers, both in the selection process and, more importantly, in on-demand participation with access to services at all times. These include net broadcasts, hypermedia, newsgroups, feedback and real-time activities like Chat and IRC.
Online participants define the extent and pace of their own interest, making the experience an almost exclusively ‘one-on-one’ environment. In a group situation, the inherently interactive characteristics of hyper-media are lost or, in fact, become an immediately apparent hindrance, just like someone switching TV channels in spite of the other participants. Any situation other than a demonstration or a training session, where the audience passively engages with a presentation, is bound to encounter these limitations.
Yet, this brings us to the very strength of the hypertext world where the audience and consumers are able to control the selection of source material with an unprecedented degree of opportunity to individually evaluate the product.
______________________
me too
It is debatable whether there is any evidence of an unprecedented rise in creativity and expression throughout the communicating world, yet the data entering the global information systems seems to be taking on explosive growth patterns.
Some of this deluge is merely the countless ‘me too’ responses to the attention the ‘new media’ has generated in the established public forums. These will diminish as the limelight of popular media moves on to the next attraction.
By far the majority of content on digital networks today is created and survives in an entirely new paradigm. The contributions keep coming, just because now they can. Like any space, if it is there, it will be filled, guaranteed... and if it grows?
Virtual space has opened opportunities for many to publish their own data and to explore the traditional boundaries of public access to information. Whilst the real value of the material remains unrealised, it is this very phenomenon which dramatically accelerates the state of play in the IT industry and creates new consumer markets for technology, e.g. the search and indexing systems which are suddenly so prominent.
______________________
garbage in – garbage out
The majority of data is simply accumulated to the maximum available capacity. Once committed to storage, data will usually survive for the life of the system and end up on the scrap pile if it is not transferred to the next generation of computers.
For the most part, however, information remains hidden behind the indexing regime, if indeed one is observed at all. There will never be enough time to clean up the discs or evaluate the contents, since this process is not as transparent as it is with conventional junk. For instance, old copies of documents don’t get scuffed corners, and often display the date of the last system transfer or upload instead of the actual record.
Unlike anything in the material world, data can be produced with relatively negligible requirements of source material or energy. Even less effort and fewer resources are required to make copies of existing data and additional versions. These will forever hog the storage and indexing space of our computer systems, which hopefully will just cope with the bottom-line expectations of the users.

(C) 1997 Piotr Kulaga aka Charlette Proto.