Design, Develop, Create

Wednesday 31 August 2011

Updated Scrum Guide (v July 2011)

News that Ken Schwaber and Jeff Sutherland have released an updated Scrum Guide (see www.infoq.com).

See the actual guide at

www.scrum.org/scrumguides/

Thursday 25 August 2011

How and why to fake the design process

Parnas & Clements' (1986) observations on the practicalities of design conclude that the design process is not equivalent to the SDLC or rational engineering method. They address this contradiction between formal process and practical process by suggesting that the formal process be 'faked'. Allow the practical process to proceed as it inevitably must, as a chaotic flux of learning and refinement, but avoid burdening it by documenting each and every change in minute detail. Instead, revise a minimal documentation set as needed or, better still, produce it at the end alongside the end product so that the two are in sync. That is, the product drives the documentation rather than the other way around.

Crucially, Parnas & Clements recommend that if a design document is used it must remain open to revision as the problems and solutions it describes become better understood, refined, and clearer.

If we think of design work as a kind of learning process then I think their pragmatic solution to managerial formalism can be understood by considering Dreyfus & Dreyfus's (2000) theory of learning. Dreyfus & Dreyfus consider the ways that virtuosity and expertise are expressed by the skilled practitioner, not as an extensive manifold of logico-deductive formalisms, but through insightful performance in a situation. They claim that although the graduation of knowledge and competence can be roughly systematised from novice to expert, the process of expert accomplishment does not (and cannot) proceed along the same lines. The process of design must therefore be iterative and incremental, producing usable parts that can be tested and give feedback so that the design can evolve towards what is ultimately wanted (not necessarily the same as what might be stated at the outset of the project).

References
Dreyfus, H. L., Dreyfus, S. E. & Athanasiou, T. (2000) Mind Over Machine: The Power of Human Intuition and Expertise in the Era of the Computer, Free Press.
Parnas, D. L. & Clements, P. C. (1986) A rational design process: How and why to fake it. IEEE Transactions on Software Engineering, 12, 251 - 257.

Programming as theory building

This is a classic reflection on the essence of software design in teams, from one of the greats of programming. Peter Naur is the 'N' in BNF notation. In this paper Naur presents the idea that software is rarely finished, instead it evolves as its authors' understanding of the program and the problem changes. Software's essential structure, or architecture, undergoes constant revision and evolves according to the theories and ideas held by its authors. Furthermore, the community of people involved in a design 'own' that design and carry it forward in a way that a new group with no prior knowledge of the code could not. The capability to evolve a design inheres with the people who created it as the code manifests their ideas, working assumptions, and 'theories' of the situation.

Reference
Naur, P. (1985) Programming as theory building. Microprocessing and Microprogramming, 15, 253-261.

Analysing the experience of software design in teams

The "Studying Professional Software Design" workshop at the University of California, Irvine, set up an interdisciplinary observational interactive research experiment to revisit how design knowledge unfolds in a software development team.

See collectiveworks.blogspot.com

The cathedral and the bazaar

Eric Raymond's classic web-book, available on-line and published by O'Reilly (Raymond, 1999). He takes a radically alternate stance on the culture and profession of programming, which he feels is deeply entangled with the open source movement in software. The dichotomy between commercial and open source affiliations among programmers is in many ways artificial, as documented in Steven Levy's "Hackers: Heroes of the Computer Revolution" (Levy, 2001). Raymond considers that there are two polar opposite ways to organise and engage in design work, the Cathedral way and the Bazaar way. Open source is the Bazaar way, where design emerges through negotiation and use, a deeply bottom-up approach to design. Closed, in-house, architecture-driven design (e.g. Apple, Oracle, Microsoft) is the Cathedral way, caricatured as top-down.

The book has been influential in terms of capturing an alternate view of software design culture, one that roughens the polished smoothness of high-road approaches to software engineering like RUP, CMM/CMMI, and other SDLC methodologies. XP, Scrum and the broad move to agile or improvisational models resonate with the Bazaar way. These modern practitioner-generated and practice-oriented methods cast software development as a flux of social communicative involvement, where design evolves around the programmers' common commitment to act on and respond to early, frequent feedback from others (customers, users, testers, test frameworks, and so on).


www.catb.org/~esr/

References
Raymond, E. S. (1999) The Cathedral and the Bazaar, O'Reilly.
Levy, S. (2001) Hackers: Heroes of the Computer Revolution, New York, N.Y., Penguin Books.

Requirements, design and other thoughts on programming

T. Verhoeff compiled a selection of insightful quotes from Michael Jackson's "Software Requirements & Specifications: a lexicon of practice, principles and prejudices" (1995).

They're nuggets of wisdom that transcend the current state of the art in programming languages, computers and technologies.

"To develop software is to build a Machine, simply by describing it."
"... good graphic descriptions are hard to make. For any description, graphic or textual, you have to decide what to show and what to ignore."
"for inventing or designing new things to be built ... top-down enforces the riskiest possible ordering of decisions."

For further reading see www.win.tue.nl

Joel on software's indicators for better programming

The Joel Test distills a bit of wisdom and a mountain of management structure into these 12 key indicators of good programming practice. Some of these are practices, some are technological infrastructure that implies other practices, and some are social practices that mark the culture of a programming team. A minimal sketch of what a one-step build script might look like appears at the end of this post.


In brief (quote):

The Joel Test
  1. Do you use source control?
  2. Can you make a build in one step?
  3. Do you make daily builds?
  4. Do you have a bug database?
  5. Do you fix bugs before writing new code?
  6. Do you have an up-to-date schedule?
  7. Do you have a spec?
  8. Do programmers have quiet working conditions?
  9. Do you use the best tools money can buy?
  10. Do you have testers?
  11. Do new candidates write code during their interview?
  12. Do you do hallway usability testing?

For more see www.joelonsoftware.com
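
As a hedged illustration of question 2 (and of the daily-build habit in question 3), here is a minimal sketch of a one-step build script. The commands are placeholders for whatever your project actually uses, not a prescribed toolchain:

  #!/usr/bin/env python3
  # Hypothetical one-step build: run the whole pipeline with a single command,
  # e.g. "python build.py". A scheduler or CI job calling this script every day
  # would also cover the daily-build question. Replace the placeholder commands
  # with your project's real test and packaging steps.
  import subprocess
  import sys

  STEPS = [
      ["python", "-m", "pytest"],   # run the test suite (assumes pytest is installed)
      ["python", "-m", "build"],    # package the project (assumes the 'build' package)
  ]

  def main() -> int:
      for step in STEPS:
          print("running:", " ".join(step))
          if subprocess.run(step).returncode != 0:   # fail fast on a broken step
              return 1
      print("build OK")
      return 0

  if __name__ == "__main__":
      sys.exit(main())

The detail of the script matters less than the property it gives you: anyone on the team, or an automated job, can produce a complete build without tribal knowledge.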

Tuesday 23 August 2011

Exercise: What is good design?

Goal
Produce a personal definition of 'what constitutes good design'.

Instructions
1. Ask individuals or groups to work together to prepare one definition or statement of 'good design' and two or more examples of good high-tech designs. (3 minutes)
2. Snowball discussion. Ask for and write up keywords from definitions on one half of the white board. Write up named examples on the other half of the white board.

Definitions

  • ...
  • ...
  • ...

Examples

  • ...
  • ...
  • ...

Comments
What is good design?
What is bad design?
Is the distinction obvious?
Can the distinction change over time?

References
Buxton, B. (2007) Sketching User Experiences: Getting the Design Right and the Right Design, San Francisco, Morgan Kaufmann.
Cooper, A. (2004) The Inmates are Running the Asylum: Why High-Tech Products Drive Us Crazy and How to Restore the Sanity, Indianapolis, Indiana, Sams Publishing.
Rooney, B. (2011) Renault Opens Up the 'Car', WSJ Tech Europe (article).
Miller, P. (2011) The condescending UI (2011), The Verge (blog link).
Norman, D. A. (2002) The Design of Everyday Things, Basic Books.

Wednesday 17 August 2011

Pretotyping - a way to make sense of your high tech ideas

Alberto Savoia has shared his work-in-progress book on what he terms 'pretotyping', a way of testing and learning about nascent or conceptual product ideas, bringing them beyond 'I wish I had something that could do this' and into the real world where we can make reasonably valid decisions about what is right and what is wrong about that big idea.

Alberto summarises the process in two steps:
1. make sure you are building the right product,
2. before you build it right.

Two introductory cases 'cast' as pretotypes are engaging and readable (IBM speech-to-text, the Palm Pilot). Alberto then describes a number of types and strategies for pretotyping.

Overall a useful and actionable body of work that can only get better.

Reference:
Pretotyping: A Different Type of Testing by Alberto Savoia

Tuesday 9 August 2011

Theoretical bases for technology objects

What underlying basis in theory allows us to truly understand how digital media, ICT and other technology objects structure and 'become' intrinsic to social or organisational infrastructure?

Martin Heidegger's idea of 'Gestell' is one way of theorising generative processes involving technology with experience, and subsequently social experience, such that we may account for both the seemingly obvious and trivial explanations for technological agency (tools do jobs they were designed to do) and the more intractable situations and conditions where technology fails to achieve what was intended.

Review the following precis of Heidegger's essay on The Question Concerning Technology (Heidegger, 1977) and reflect on its implications for creating and deploying ICT in organisations as instrumental infrastructures for shaping employee and customer behaviour.

An overview of the Question Concerning Technology by John Zuern at the University of Hawaii.

Reference
  • Heidegger, M. (1977 [1954]) The Question Concerning Technology and Other Essays, Harper Perennial.

Further reading
Hubert Dreyfus, Phil 185: Heidegger's Being & Time (socrates.berkeley.edu)

And Hubert Dreyfus's other courses at Berkeley (socrates.berkeley.edu/~hdreyfus)



Analyse Struan Robertson's Guidelines for Making Games

Take a look at Struan Robertson's guidelines for making games on www.gamesbrief.com.
Robertson has distilled '51 random guidelines' from his experience developing video games. How do you make sense of them against the management or engineering methods used in your own organisation?
For example, consider these guidelines in comparison with Winston Royce's 1970 paper (Royce, 1970), often credited as the definitive description of the Waterfall method.
Alternatively consider the guidelines in relation to the principles of the Agile Manifesto (Beck et al, 2001).

(Struan Robertson is Product Director at NaturalMotion Games)

References
  • Royce, W. W. (1970) Managing the Development of Large Software Systems. IEEE WESCON, TRW, 1-9. (link)
  • Beck, K., Beedle, M., van Bennekum, A., Cockburn, A., Cunningham, W., Fowler, M., Grenning, J., Highsmith, J., Hunt, A., Jeffries, R., Kern, J., Marick, B., Martin, R. C., Mellor, S., Schwaber, K., Sutherland, J. & Thomas, D. (2001) Manifesto for Agile Software Development [Online]. Available: http://agilemanifesto.org/ [Accessed August 2011].

Wednesday 3 August 2011

Is implementation success due to design superiority?

Is implementation success due to design superiority or is it due to successfully managing user buy-in and product marketing? The question is central to the process of technology and ICT implementation. It usually results in two different answers: claims that success is down to design superiority, or that success is achieved by co-opting the user (conditioning individuals' acceptance of the technology). These two positions represent two contrasting theories of agency: that technological objects determine use (technological determinism), or that social interpretation defines use (social determinism) (Grint & Woolgar, 1997). Technological determinism characterises the belief that technology's intrinsic properties are objectively superior (or inferior), lending it the capacity to satisfy users' goals and the power to succeed against other similar technologies in a competitive environment. Social selection of technology assumes the user has the final say as to whether the technology works well, is usable, even desirable. User perceptions can be managed or conditioned by others, and so technology implementation becomes a matter of social agency rather than technology fitness.

I'm not going to make a balanced argument here in favour of one or the other; instead I'll pitch the idea that aspects of both are present, but that characterising the problem in terms of a duality is yet another error. ICT implementation is neither 'either/or' nor a synthesis of both. It is something rather more complex, something that employs technology as material resources, involves experiential unfolding for individuals, includes the social and political processes of groups, and encompasses aspects of aesthetics, usability, and design performance.

The history of leading-edge technology implementations offers valuable lessons for current developers. Much analysis characterises the performance of high-tech elements, standards, products and services in terms of economic performance (Moore, 1998, Shy, 2001). Classic examples of high-tech adoption dynamics include fax machines, video standards, and word processor markets (Shapiro & Varian, 1998). High-tech products and services are network industries, network markets in which small positive or negative effects amplify demand and supply characteristics. These are markets which can be understood at the macro level by thinking in terms of network externalities, where external factors beyond individual control drive or depress demand for goods and services. However, macroscopic analyses offer poor guidance for product designers and software architects. The producer's concern should be to create something that is more than just a combination of necessary features; a high-tech product or service must be usable, valuable, viable, and desirable if it is to have a chance of being successful: "success and failure are driven as much by consumer expectations and luck as by the underlying value of the product." (Shapiro & Varian, 1998:181)
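
To make the amplification idea concrete, here is a toy simulation of my own (an illustrative sketch, loosely in the spirit of the increasing-returns adoption models discussed in this literature, not a model taken from Shapiro & Varian or Shy). Two otherwise identical products compete for adopters; each adopter's choice combines a random intrinsic preference with a payoff that grows with the installed base, so small early leads are reinforced until the market tips towards one product.

  import random

  # Illustrative sketch only: increasing-returns adoption of two competing products.
  # Each adopter's utility = random intrinsic preference + a network term proportional
  # to the product's installed base, so an early lead becomes self-reinforcing.
  def simulate(adopters=5000, network_benefit=0.01, seed=7):
      rng = random.Random(seed)
      installed = {"A": 0, "B": 0}
      for _ in range(adopters):
          utility_a = rng.random() + network_benefit * installed["A"]
          utility_b = rng.random() + network_benefit * installed["B"]
          installed["A" if utility_a >= utility_b else "B"] += 1
      return installed

  print(simulate())  # typically a lopsided split despite identical intrinsic quality

Re-running with different seeds shows that which product wins is largely an accident of early history, which is precisely the point about expectations and luck made by Shapiro & Varian.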

Screen-display pointing devices
This story highlights the practical development of the computer mouse, illustrating how technological performance is linked to user perception and that the development process must therefore include cycles of prototyping, user assessment, and new development (Moggridge, 2006).

Doug Engelbart talks about the creative variety that led to the development of the first-generation computer mouse.
Doug Engelbart (in Designing Interactions by Bill Moggridge)

Stu Card recalls the design values and aesthetics that became the 'human factors' of optimal designs for hand-operated display pointing devices.
Stu Card (in Designing Interactions by Bill Moggridge)

Paul Bradley recalls the prototype-refinement process that was involved in the design of the Microsoft mouse which set a standard for both human factors and manufacturing refinement.
Paul Bradley (in Designing Interactions by Bill Moggridge)

The case of RTC enabled groupware
An illustration of the complexity associated with developing new high-tech product categories, drawn from a review of peer-to-peer web telephony in the early 2000s.

Where was internet telephony heading in the early 2000s? VOIP saw very slow adoption and demand in the 1990s in spite of hype and positive commentary. It was impossible to select a technology and say with any certainty that it would succeed in the market, or to what degree it would be successful. This holds for technology elements (e.g. WAP, SMS, GSM, HTTP, VRML, EDI) and for systems built using underlying elements (e.g. VOIP and precursor services such as ICQ, AIM, IM, Skype, Google Talk, Lotus Sametime, and Fring).

The precursor to web telephony was chat and messaging. My own organisation's first reaction to the rise in internet chat via IM and ICQ services was to actively block or restrict their use, thinking of them as leaky information channels, external services hosting and storing the firm's internal communications. This fear of exposure was compounded when we discovered our engineers using messaging to exchange code fragments when troubleshooting software among themselves or when supporting clients. Once we accepted the inevitability, and indeed the value, of messaging as a necessary element of our IT infrastructure, we had to consider which clients could be installed, what external servers to use and what limitations to place on the acceptable use policy. While as a management team we really struggled to accommodate and manage messaging services, VOIP was easier to understand: cheap or free phone calls, simple! The questions were simply how much bandwidth was consumed and what data charges were entailed.

By 2003 Skype appeared to have burst upon the stage as an overnight success. One of the things that Skype got right early was ease of use and the introduction of a new way of thinking about people's availability. While 'presence status' features predated Skype's green or red availability icons, Skype successfully popularised the concept of passively broadcasting an individual's virtual availability. Users could establish a 'feeling' for the presence of friends and colleagues via the status icon. More broadly, Skype's early success can also be argued to be due to its innovative peer-to-peer architecture, which coincided with greater broadband availability, media-enabled PCs, weakened POTS monopolies, and a host of other external factors, but usability and social appropriation were significant determinants of Skype's early success. Skype's minimalist offering was instantly appealing, word-of-mouth recommendations generated interest, and the number of Skype users online at any time around the world reached 3 million by 2005. Presence-based information and real-time collaboration tools had now become basic entry-level features for all kinds of groupware products or services. The era of social networks had dawned.
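
As a rough sketch of the presence concept only (hypothetical code, not Skype's actual peer-to-peer protocol), availability boils down to a small piece of state that is pushed to subscribed contacts whenever it changes:

  from dataclasses import dataclass, field
  from typing import Callable, Dict, List

  # Hypothetical illustration of 'presence': a user's availability is a small piece
  # of state; contacts subscribe to it and are notified whenever it changes.
  @dataclass
  class PresenceService:
      status: Dict[str, str] = field(default_factory=dict)      # user -> "online" / "away" / "offline"
      watchers: Dict[str, List[Callable[[str, str], None]]] = field(default_factory=dict)

      def subscribe(self, user: str, callback: Callable[[str, str], None]) -> None:
          self.watchers.setdefault(user, []).append(callback)

      def set_status(self, user: str, new_status: str) -> None:
          self.status[user] = new_status
          for notify in self.watchers.get(user, []):             # passive broadcast to contacts
              notify(user, new_status)

  service = PresenceService()
  service.subscribe("alice", lambda user, s: print(user, "is now", s))  # a contact's client
  service.set_status("alice", "online")                                 # prints: alice is now online

The point of the green or red icon is precisely this push model: contacts get a low-effort, ambient sense of someone's availability without having to ask.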

References
Grint, K. & Woolgar, S. (1997) The Machine At Work: Technology, Work And Organization, Polity Press.
Moggridge, B. (2006) Designing Interactions, Cambridge, Massachusetts, MIT Press.
Moore, G. A. (1998) Crossing the Chasm: Marketing and Selling High-Tech Products to Mainstream Customers, Oxford, Capstone Publishing Ltd.
Shapiro, C. & Varian, H. R. (1998) Information Rules: A Strategic Guide to the Network Economy, Boston, Mass., Harvard Business School Press.
Shy, O. (2001) The Economics of Network Industries, Cambridge, U.K. ; New York, Cambridge University Press.