
Guest Lecture The Audience Analysis Problem



Presentation Transcript


  1. FIT2043 Technical Documentation for Software Engineers Lecturer: Dr Carlo Kopp, MIEEE, MAIAA, PEng Guest Lecture The Audience Analysis Problem

  2. Defining Information – Shannon and Weaver • Shannon – ‘…that which reduces uncertainty…’ • Shannon & Weaver – ‘The quantity which uniquely meets the natural requirements that one sets up for “information” turns out to be exactly that which is known in thermodynamics as entropy.’ • Shannon & Weaver – ‘Information is a measure of one’s freedom of choice in selecting a message. The greater this freedom of choice, the greater the information, the greater is the uncertainty that the message actually selected is some particular one. Greater freedom of choice, greater uncertainty, greater information go hand in hand.’ (Sveiby 1994)

  3. Defining Information – Krippendorff • Krippendorff – ‘Literally that which forms within, but more adequately: the equivalent of or the capacity of something to perform organizational work, the difference between two forms of organization or between two states of uncertainty before and after a message has been received, but also the degree to which one variable of a system depends on or is constrained by another. E.g., the DNA carries genetic information inasmuch as it organizes or controls the orderly growth of a living organism. A message carries information inasmuch as it conveys something not already known. The answer to a question carries information to the extent it reduces the questioner's uncertainty.’…

  4. Defining Information - Hornung • Bernd Hornung - ‘Information is the meaning of the representation of a fact (or of a message) for the receiver.’

  5. Key Issues in Definition • Information is a means via which the state of uncertainty in an entity can be reduced or changed. • Entropy in thermodynamics is a measure of the state of disorder in a system; entropy in information theory is a measure of the state of disorder in an information processing system. • For information to have effect the entity must understand the message it receives; if the message has no meaning to the entity receiving it, it cannot alter the state of uncertainty in that entity. • If a message which is understood contains information, it will alter the system by changing the state of uncertainty. • Information can be measured.

  6. State Changes

  7. Meaning in Information • A key issue which is often not considered in definitions is the matter of meaning – can the message be understood? • A work of Shakespeare written in English will be rich in information content, but only to an English speaker. • An English speaker with good knowledge of Elizabethan English will perceive greater information content than a reader without; a reader with better knowledge of period history will perceive greater information content than a reader without; and so on … • In mathematical terms, the receiver of the message must be capable of decoding the message, to determine what information it contains.

  8. Example • DNA encodes the definition of an organism’s structure and function. • Alter the DNA chain of an embryo and the resulting organism will be different, possibly in many ways. • Does this mean that we can splice DNA in any manner we choose? • For the DNA to be properly decoded, it needs to be inside a biological entity which can process (understand) what the DNA tells it to do. • If the species between which the DNA is being spliced are too different, the DNA is not likely to be decoded in the manner intended, resulting in a non-viable organism.

  9. Shannon’s Channel Capacity Model (1)

  10. Shannon’s Channel Capacity Model (2) • The model has five key components: • The ‘information source’ which generates messages containing information. • The ‘transmitter’ which sends messages over the ‘channel’. • The ‘channel’ and associated ‘noise source’; this could be any number of physical channel types including copper or optical cable, radio link or acoustic channel. • The ‘receiver’ which detects and demodulates messages received over the ‘channel’. • The ‘destination’ or ‘information sink’ which responds to messages by changing its internal state. • It is implicitly assumed that messages sent by the ‘information source’ can be understood by the ‘sink’.
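The five components above can be sketched as a toy simulation. The function name, message contents and flip probability below are purely illustrative, not part of the Shannon model itself:

```python
import random

def noisy_channel(bits, flip_prob, rng):
    """The 'channel' plus 'noise source': each bit sent by the
    'transmitter' is flipped independently with probability flip_prob."""
    return [b ^ 1 if rng.random() < flip_prob else b for b in bits]

rng = random.Random(42)
message = [1, 0, 1, 1, 0, 0, 1, 0]                  # from the 'information source'
assert noisy_channel(message, 0.0, rng) == message  # noiseless: received intact
corrupted = noisy_channel(message, 0.25, rng)       # noisy: some bits may differ
```

A real receiver would then demodulate and decode `corrupted`; here the point is only that the noise source sits between transmitter and receiver.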

  11. Shannon vs Documentation Writing • The writer of a document is performing the functions which in the Shannon model are represented by the information source and the transmitter. The message is the idea the writer intends to communicate. • The channel is the medium used to convey the message to the reader, for instance a hardcopy or electronic document. • The reader of a document is performing the functions which in the Shannon model are represented by the receiver and the information sink. • The aim of proper error-free communication between the writer and the reader can only be achieved if the message is transmitted correctly and its meaning understood.

  12. Shannon vs Documentation Writing (2) • In practical terms, a writer of a document is thus confronted with two problems in implementing proper communication. • The first problem is ensuring that what is intended to be seen by the reader is actually seen by the reader. This means that the document is properly structured, formatted, is free of typographical and spelling errors, avoids any ambiguity, and is an exact representation of what the writer intends to state. • The second problem is the more difficult one, which is ensuring that the reader understands the intended message in the intended manner. • This is the audience analysis problem.

  13. The Audience Analysis Problem (1) • When a human being reads a passage of text, the words and syntax in that text are parsed and associated with specific ideas. • The meaning of the text is then interpreted. • Difficulties may arise in several critical areas: • The vocabulary in the text is understood in a manner different to that intended, or not at all. • Assumed prior knowledge by the writer is absent, incomplete, or quite different to that assumed. • Assumed prior understanding of context is absent, incomplete, or quite different to that assumed. • If any or all of these three conditions arise, the reader will understand the text in a manner different to that intended.

  14. The Audience Analysis Problem (2) • Most failures we observe in communications and documentation arise because the writer made incorrect assumptions about the reader, resulting in failures of: • Vocabulary understanding. • Relating meaning to prior knowledge. • Understanding the context of the document. • Unless the writer has an accurate model of the reader’s vocabulary, prior knowledge and understanding of the context, partial or complete failure to communicate accurately will occur. • Empirical experience shows this is the most frequent cause of communication failures in documentation.

  15. Dealing with the Audience Analysis Problem • Producing an accurate model of an audience is not always an easy task. • Experienced writers often use ‘test audiences’ comprising individuals who have similar backgrounds to the intended audience. • For specific documentation types a shortcut exists – we can usually make reasonably accurate assumptions about the skill sets and backgrounds of the readers. • As a result, we can write a document to fit this notional ‘generic reader type’ and achieve reasonably good effect. • Difficulties will however arise if the audience is mixed or does not fit the model.

  16. Product Proposal • The audience for a product proposal can vary widely, depending on the nature of the product and the organisation proposing it. • Large products which may require long and very expensive development times usually lead to decisions at a very high level within the organisation. • Therefore the people reading the proposal are very likely to be shareholders, financiers, senior managers, directors. The audience may therefore not understand technical detail well, and the language must be simple. • Technical detail is placed into an appendix or annex to the document. • Small products which require short and cheap development times usually lead to decisions at a low level within the organisation.

  17. Short Product Proposals • Therefore the people reading the proposal will be senior programmers, project leaders or managers. • Such an audience is highly technically literate, allowing the use of technical language and detail in the body of the document. • In general, it is important that a product proposal is very clear about the intended capabilities of the product. • Since the product does not exist as yet, you should use modal auxiliary verbs such as ‘should’, ‘could’ and ‘might’ in describing proposed product capabilities. • It is a good practice to outline future development paths for the product, to provide the reader with an idea of what other things the product could be made to do easily.

  18. Product Specification • The audience for a software product specification almost exclusively comprises programmers, software engineers, project leaders or other expert readers. • As a writer you can safely assume the readership will understand all technical detail you incorporate into the document. • The language in a specification must always be exact. • Ambiguity in a specification, or opportunities for a reader to interpret the contents in any way other than intended, are not acceptable in such a document. • Ambiguity is one of the most frequent causes of failure in code development.

  19. User Manuals • The audience for a user manual can vary widely, depending on the nature of the product and the experience level of the intended user. • Large and complex products will frequently be provided with ‘basic user manuals’ and ‘advanced user manuals’. • The basic manual will be written for beginners, who may be very weak in computing skills, the advanced manual for experienced and highly literate users, who may have expert level computing skills and knowledge of related or similar software products. • It is a good practice with all categories of user manual to include reference information in appendices or dedicated reference chapters at the end of the manual.

  20. Short User Manuals • Small or simple products, or products with highly specialised technical usage, such as development tools, frequently have only one user manual. • Such a manual must be ‘all things to all readers’, which is a demanding requirement to meet. • The audience may vary between the highly technically literate and the virtually technically illiterate. • In general, it is important that a user manual is very clear about the features and usage of the product. Short and simple sentences are needed.

  21. Product Technical Manuals • The audience for a technical manual comprises almost exclusively software developers, system integrators, project leaders and managers. • Therefore the writer of a technical manual can safely assume that the document will only ever be read by knowledgeable users. • A technical manual, like a specification, must provide exact and unambiguous explanations of the product’s inner function. • Whereas a specification will refer to features in the future tense, and may discuss possible features, a technical manual must always be written in the present tense.

  22. Journal Articles • The audience for an article will vary widely, depending upon the nature of the journal. An article written for Scientific American or New Scientist can make very few assumptions about the reader having prior knowledge, while an article written for an IEEE or ACM scientific or academic journal can assume a very high level of reader knowledge. • The language must not only be clear and concise, but must also flow and follow the practice of essay writing. • Most journals employ competent ‘copy editors’ who can polish the language used in an article. Nevertheless, it pays not to annoy the editor!

  23. Marketing Literature • Marketing literature can be broadly divided into that which is targeted at lay readers, and that targeted at specialist technical readers. • In effect these are two distinct audiences and very different assumptions must be made about readers. • Much the same model for the audience will apply as is used in a product proposal – this is a byproduct of the basically identical aim of such a document. • Failure to understand the audience can produce commercially disastrous effects in marketing literature production.

  24. Advanced Audience Analysis Considerations • Many documents may be of such importance as to require a very detailed and meticulous application of audience analysis. • An example may be a submission to an inquiry, an affidavit, or a critical commercial proposal. • In such situations a deeper analysis of the audience, however time consuming, will be justified. • Such an analysis involves the specific ‘profiling’ of individual readers who are expected to read the document. • A profile of such a reader will typically be a checklist of known personality traits, idiosyncrasies, prior education and experience, specific biases or prejudices etc. • Every line in the draft document is then critically tested.

  25. Summarising the Key Points (1) • Understanding the abilities, needs and expectations of the intended readership is vital to successful document writing. • Specific types of document will always be written for one or more categories of reader. Within these categories of reader, you may see a considerable variation in reader skills. • The best technique for systematically analysing the intended audience is the use of a “checklist” or “grid”. This provides a framework for understanding the readership. • It pays to be very thorough when analysing the audience, since getting it wrong can largely defeat the purpose of writing the document.

  26. Summarising the Key Points (2) • Who will be reading the document? Are they programmers, managers, end users, customers etc? • What do they need from the document? Is it to explain how something works, is it a tutorial, is it a reference document? • What is their experience or skill level? Are they experts, users with some experience, complete beginners? Do they have a postgraduate degree, undergraduate degree or only a high school education? • What prejudices or biases do they have? Are they “wedded” to particular vendors’ products and technical culture? Does their education or training predispose them to prefer particular solutions?

  27. Summarising the Key Points (3) • What environment will the document be read in? Is it to be read while trying to debug a system under time pressure or otherwise? • What is the readers’ job function? Are they technical workers, managers, administration workers or home users? • What related documents may the readers be familiar with? Will they have read the manual for the previous release, or the beginner’s user manual, or a specification document?
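The checklist or ‘grid’ the lecture recommends can be captured as a simple profile record before drafting begins. The field names and sample answers below are purely illustrative; the lecture prescribes the questions, not this particular layout:

```python
# One row of a hypothetical audience-analysis grid for a technical manual.
audience_profile = {
    "readers": "senior programmers and project leaders",
    "needs": "reference document explaining the product's inner function",
    "skill_level": "expert; undergraduate CS education or higher",
    "biases": "accustomed to a particular vendor's toolchain",  # illustrative
    "reading_environment": "debugging a system under time pressure",
    "job_function": "technical workers",
    "related_documents": ["previous release manual", "specification"],
}

# Every checklist question should have an answer before drafting starts.
assert all(audience_profile.values())
```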

  28. FIT2043 Technical documentation for software engineers Lecturer: Dr Carlo Kopp, MIEEE, MAIAA, PEng Information Theory Concepts - Reading

  29. Reference Sources and Bibliography • There is an abundance of websites and publications dealing with basic information theory. • Examples include: • http://cm.bell-labs.com/cm/ms/what/shannonday/paper.html • http://okmij.org/ftp/Computation/limits-of-information.html • http://www.sveiby.com/articles/information.html • http://pespmc1.vub.ac.be/ASC/INFORMATION.html • http://www.mtm.ufsc.br/~taneja/book/node5.html • http://www.mtm.ufsc.br/~taneja/book/node6.html


  31. Defining Information - Wiener • Shannon and Wiener define information differently – Shannon sees it as measured by entropy, Wiener by the opposite of entropy. • Wiener – ‘The notion of the amount of information attaches itself very naturally to a classical notion in statistical mechanics: that of entropy. Just as the amount of information in a system is a measure of its degree of organisation, so the entropy of a system is a measure of its degree of disorganisation.’ (Sveiby 1994)

  32. Wiener Continued • ‘One of the simplest, most unitary forms of information is the recording of choice between two equally probable simple alternatives, one or the other is bound to happen - a choice, for example, between heads and tails in the tossing of a coin. We shall call a single choice of this sort a decision. If we then ask for the amount of information in the perfectly precise measurement of a quantity known to lie between A and B, which may with uniform a priori probability lie anywhere in this range, we shall see that if we put A = 0 and B = 1, and represent the quantity in the binary scale (0 or 1), then the number of choices made and the consequent amount of information is infinite.’


  34. Krippendorff – Cont • … ‘A telephone line carries information only when the signals sent correlate with those received. Since information is linked to certain changes, differences or dependencies, it is desirable to refer to them and distinguish between information stored, information carried, information transmitted, information required, etc. Pure and unqualified information is an unwarranted abstraction. Information theory measures the quantities of all of these kinds of information in terms of bits. The larger the uncertainty removed by a message, the stronger the correlation between the input and output of a communication channel, the more detailed particular instructions are, the more information is transmitted.’ (Principia Cybernetica Web)



  40. Shannon’s Entropy • Shannon defines entropy as a measure of uncertainty, where H is entropy and p_i is the probability of the i-th symbol or message (Theorem 2): • H = − Σ_i p_i log2(p_i) • The logarithm is base 2. • Shannon’s proof is well worth reading – refer: ‘Properties of Shannon's Entropy’. • This is based on the paper ‘A Mathematical Theory of Communication’.
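A minimal numerical sketch of the entropy formula, assuming a discrete distribution given as a list of probabilities (the function name is illustrative):

```python
import math

def shannon_entropy(probs):
    """H = -sum(p_i * log2(p_i)), skipping zero-probability symbols."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # fair coin: 1.0 bit, maximum for two symbols
print(shannon_entropy([0.9, 0.1]))  # biased coin: less than 1 bit per toss
```

As the slide's definition implies, entropy peaks when all symbols are equally likely, ie when the uncertainty about the next message is greatest.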

  41. Thermodynamics – Entropy • The second law of thermodynamics, ie ‘the total entropy of any isolated thermodynamic system tends to increase over time, approaching a maximum value’, is often represented as: • S = k_B ln W, where W is the number of microstates of the system. • k_B is Boltzmann’s constant, k_B = 1.3806505 x 10^−23 [joules/kelvin]. • This form is for all intents and purposes the same as that proven by Shannon in the Entropy Theorem.

  42. Information in a Message • The Entropy Theorem presents the total entropy (information) in the system, across all of the possible messages in the system. • For many problems we are interested in the information in one of the N messages. This can be represented thus: • I_i = − log2(p_i) [bits] • As is evident, highly probable messages contain little information and vice versa.
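The per-message quantity can be computed directly (the function name is illustrative):

```python
import math

def self_information(p):
    """I = -log2(p): bits of information in a message of probability p."""
    return -math.log2(p)

print(self_information(0.5))    # one of two equally likely messages: 1.0 bit
print(self_information(1/256))  # one of 256 equally likely messages: 8.0 bits
```

A certain message (p = 1) carries 0 bits, consistent with the slide's point that highly probable messages contain little information.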

  43. Information vs Likelihood of Message

  44. Information vs Likelihood of Message


  47. Shannon’s Channel Capacity Model (3) • Shannon demonstrated that for a ‘noisy’ channel, ie one in which random noise could additively contaminate the messages flowing across the channel, the capacity of the channel (amount of information it could carry) is defined by (Theorem 17): • C = W log2(1 + P/N) • Where C is channel capacity, W is channel bandwidth, P is signal power, and N is noise power. • This equation is most commonly used in the following form, as P/N is the widely used measure of ‘signal to noise ratio’ or ‘SNR’: • C = W log2(1 + SNR)
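A quick check of the capacity formula, using an illustrative voiceband-style channel (3100 Hz bandwidth, SNR of 1000, ie 30 dB):

```python
import math

def channel_capacity(bandwidth_hz, snr):
    """Shannon's Theorem 17: C = W * log2(1 + SNR), in bits/sec."""
    return bandwidth_hz * math.log2(1 + snr)

print(channel_capacity(3100, 1000))  # roughly 30.9 kbit/s
```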

  48. Shannon’s Channel Capacity Model (4) • Assumption (1) – the additive noise is ‘white thermal noise’ ie it has a normal distribution in time/power; • Assumption (2) – the power of the signal (message) in the channel is the average, rather than peak, power; • Assumption (3) – the power is not limited by the transmitter’s peak power rating; • Metrics (1) – channel capacity is defined in bits/sec; • Metrics (2) – bandwidth is defined in Hertz (cycles/sec); • Metrics (3) – signal and noise power are defined in Watts; • In numerical applications which compute capacity, it is customary to use the following form, as log2 is often not available: C = W*(1/log(2))*log(1 + P/N);
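The natural-log form on the slide can be checked against a direct base-2 computation; the bandwidth and SNR values below are illustrative:

```python
import math

w, snr = 3100.0, 1000.0  # illustrative bandwidth (Hz) and signal-to-noise ratio

# Natural-log form, as on the slide, for environments lacking log2...
c_natural = w * (1 / math.log(2)) * math.log(1 + snr)
# ...agrees with the direct base-2 form, since log2(x) = ln(x) / ln(2)
c_base2 = w * math.log2(1 + snr)

assert abs(c_natural - c_base2) < 1e-6
```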

  49. What Does This Model Tell Us? • It is possible to trade between bandwidth and signal to noise ratio to achieve an intended capacity – an example is in spread spectrum communications; • Channels with severely limited bandwidth but very high signal to noise ratio can still achieve high capacity – examples are voiceband modems running over digital switches; • Where the SNR >> 1, the second term approximates the logarithm of SNR; • Where the SNR << 1, the second term tends to 0, and bandwidth becomes the dominant means of improving capacity; • By manipulating the bandwidth and SNR of a channel we can manipulate its capacity, and thus how much information it can carry.
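The two regimes described above can be verified numerically (the bandwidth and SNR values are illustrative):

```python
import math

def capacity(w, snr):
    return w * math.log2(1 + snr)

w = 1.0  # unit bandwidth, so capacity equals the second term

# High SNR: log2(1 + snr) is well approximated by log2(snr)
print(capacity(w, 1000.0), math.log2(1000.0))

# Low SNR: log2(1 + snr) ~ snr / ln(2), so capacity shrinks toward 0
# and grows only linearly with SNR; bandwidth dominates instead
print(capacity(w, 0.001), 0.001 / math.log(2))
```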

  50. Channel Capacity Example
