Problemistics - Problémistique - Problemistica
The Art & Craft of Problem Dealing
Chapter 5
Learning and Communication Guidelines
Introduction
Having singled out the components of the researcher's skill (Chapter 3) and having surveyed/analysed them in terms of definition-composition-causation (Chapter 4), it is now possible to formulate a list of guidelines for building a learning experience in research methods.
The guidelines will indicate both what has to be known (content) and how it can be made knowable (form).
Learning guidelines
A learning experience in research methods should provide subjects with access to general/specific domains of knowledge (content) and should allow them to perform personalized activities and operations in approaching and dealing with this content (form).
Consequently, some of the following twelve guidelines refer to the content and some to the form of learning. (The division between content and form is made for analytical purposes only.)
A learning experience in the area of research methods should, then, involve the following aspects:
Content
1 - Knowledge of signs
2 - Knowledge of facts
3 - Knowledge of rules
4 - Knowledge of categories
5 - Knowledge of relations
6 - Knowledge of principles
Form
7 - Varieties of coding
8 - Levels of processing
9 - Effort after meaning
10 - Fitting on to prior cognitive structures
11 - Freedom of organization (exploration-modification)
12 - Flexibility for manipulation (simulation-extrapolation).
A learning experience is deemed to take place within a communication process. For this reason it seems necessary to examine, if only briefly, what a communication process is and how it can be effectively implemented.
This examination will contribute to refining the learning guidelines listed above, either by confirming them or by providing grounds for their modification.
The communication process
The process of communication has been variously viewed as (i) the sending of information, (ii) the exchange of meaning, or (iii) the attempt at influencing behaviour; each view stresses, respectively, the role of the sender, that of the receiver, or the interaction between the two.
Various models have been put forward to account for the dynamics of this process (D. McQuail, 1982).
In this paper a specific model, the Shannon-Weaver Theory of Communication (1949), has been selected and will be examined in its main points.
This model has been chosen because of its simplicity, its elegance and its relevance to some aspects expressed in the guidelines.
The Shannon-Weaver Model
According to Shannon and Weaver, the communication process presents problems at three different levels:
Level A. The Technical Problem.
It is concerned with the accuracy of the transference of the symbols of communication from sender to receiver, this accuracy depending on the technical efficacy of the medium (e.g. a telephone set) and of the channel (e.g. a telephone cable).
Level B. The Semantic Problem.
It is concerned with the identity, or satisfactorily close approximation, between the meaning intended by the sender and the meaning interpreted by the receiver.
Level C. The Effectiveness Problem.
It is concerned with the extent to which the received meaning affects the conduct of the receiver in the way desired by the sender.
As remarked by Lin (N. Lin, 1973), the problems of communication in the Shannon-Weaver model can be seen as presenting three aspects:
i) syntactics (means)
ii) semantics (meanings)
iii) pragmatics (motives).
These three aspects involve three entities, namely:
i) a sender (source): the encoder of the message;
ii) a medium-channel (support): the transmitter of the message;
iii) a receiver (interpreter): the decoder of the message.
In order to achieve effective communication (level C), the medium-channel (level A) and the sender-receiver (level B) need to possess a satisfactory degree of capacity.
Capacity means the ability of:
- the medium-channel to successfully transmit the message;
- the sender-receiver to successfully encode/decode the message.
This capacity could be impeded by two factors:
- Technical Noise, which refers to the distortions originating from the medium-channel in transmitting the message;
- Semantic Noise, which refers to the distortions in meaning (encoding/decoding) originating from the sender and/or the receiver.
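To make these notions concrete, here is a minimal sketch (ours, not Shannon and Weaver's; all names are illustrative) of the sender / medium-channel / receiver chain. Technical noise is modelled as random bit flips in the channel, and redundancy as a simple repetition code which lets the receiver recover the message by majority vote.

```python
import random

def encode(bits, repeat=3):
    """Sender: add redundancy by repeating each bit."""
    return [b for b in bits for _ in range(repeat)]

def transmit(bits, noise=0.1, seed=42):
    """Medium-channel: technical noise flips each bit with probability `noise`."""
    rng = random.Random(seed)
    return [b ^ (rng.random() < noise) for b in bits]

def decode(bits, repeat=3):
    """Receiver: majority vote over each group of repeated bits."""
    groups = (bits[i:i + repeat] for i in range(0, len(bits), repeat))
    return [int(sum(g) > repeat // 2) for g in groups]

message = [1, 0, 1, 1, 0, 0, 1, 0]
received = decode(transmit(encode(message)))
print(message == received)  # usually True: redundancy defeats moderate noise
```

With repeat=1 (no redundancy) the same level of noise corrupts the message far more often; raising repeat trades channel capacity for reliability, which is precisely the redundancy/entropy trade-off discussed below.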
In order to reduce noise (technical, semantic) and increase capacity (medium-channel, sender-receiver), Shannon and Weaver introduce and explain the function of the concept of redundancy (predictability) as opposed to that of entropy (uncertainty).
Redundancy refers to messages (or parts of them) which convey to the receiver conventional or highly predictable information.
Entropy refers to messages (or parts of them) which convey to the receiver unconventional or highly unpredictable information.
Redundancy and entropy in messages vary from receiver to receiver and, for an individual receiver, from message to message, in relation to the type-form of information conveyed and the type-level of his/her cognitive background.
Shannon and Weawer state that, in order to achieve effective communication in transmitting a message, it is necessary to find an appropriate balance between redundancy and entropy.
As far as the receiver is concerned, if entropy is too high (i.e. if there are too many highly unconventional and unpredictable messages) the effort required by the receiver to decode the message is too great (with respect to his/her capacities) to be performed effectively or at all. On the other hand, if entropy is too low (i.e. if there are too many highly conventional and predictable messages) the effort required to decode them is almost nil, because there is not really much information to be decoded.
It has been remarked (N. Lin, 1971) that in many languages (e.g. English, Spanish) current written or spoken messages show an approximately even balance of entropy and redundancy (0.5 each, assuming entropy + redundancy = 1.00). When this balance is achieved, effective communication is taking place.
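The 0.5/0.5 balance can be expressed with Shannon's own measures: the entropy of a source is H = -Σ p·log2(p) over the symbol frequencies, and its relative redundancy is 1 - H/Hmax, where Hmax = log2(N) for an alphabet of N symbols, so the two quantities sum to 1.00. The sketch below (illustrative only, with a made-up sample string) computes both for a piece of text.

```python
import math
from collections import Counter

def entropy_and_redundancy(text):
    """Relative entropy and redundancy of a text, from symbol frequencies."""
    counts = Counter(text)
    total = sum(counts.values())
    h = -sum((c / total) * math.log2(c / total) for c in counts.values())
    h_max = math.log2(len(counts))   # maximum entropy for this alphabet
    return h / h_max, 1 - h / h_max  # the two always sum to 1.00

rel_h, red = entropy_and_redundancy("written messages show a balance of the two")
print(f"relative entropy {rel_h:.2f}, redundancy {red:.2f}")
```

A text in which every symbol is equally frequent scores relative entropy 1.0 (and redundancy 0.0); ordinary prose, with its uneven letter frequencies, falls somewhere in between, in line with the balance Lin refers to.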
Communication Guidelines
The brief survey of the Shannon-Weaver model has highlighted three interrelated aspects that can serve as communication guidelines for setting up a learning experience. They are:
1) The importance of noise reduction. Referring in particular to semantic noise (distortions in meaning), this means that a very accurate/appropriate encoding (i.e. clear explanation of signs, multiple code representation of facts and concepts, well defined specifications for the discovery of rules and relations) is likely to lead to a very accurate/effective decoding (that is to a successful "effort after meaning").
2) The relevance of redundancy. Redundancy can be present as:
i) conventional redundancy, which refers to the style of presenting a message;
ii) code redundancy, which refers to the numerous ways (U. Eco, 1975) the content of a message can be represented;
iii) content redundancy, which refers to the iteration (repetition, reformulation) of the same message.
Effective communication is based on presenting/representing a message with variable types and amounts of redundancy, and allowing each individual to find his/her required level in terms of form presentation, code representation and time iteration. With respect to learning guidelines this means flexible organization-exploration and personalized depth of processing.
3) The necessity of balancing redundancy and entropy. This means that messages, in order to be accepted and successfully decoded, need to present points of contact with messages already known (background knowledge) or to be in tune with the capacity (type-level) of the individual to deal with them (i.e. to fit on to prior cognitive structures).
These three points, taken as communication guidelines, bear an affinity with the learning guidelines previously listed, and this affinity supports the plausibility and validity of both.
The General Frame
The integration of learning and communication guidelines leads to the outlining of the features (activities, resources) of a learning experience as a product of interactive communication and, also, to the identification of the needs this experience should take into consideration.
The guidelines taken as a whole underline three basic requirements which constitute the general frame within which effective learning experiences can take place and learning needs can appropriately be satisfied.
Although these requirements are implicit in the guidelines, they need to be made clearly explicit. They are:
- Individualization. This means the possibility of using resources and of being engaged in activities on an individual basis (as well as on a group basis). Individualization satisfies the need to achieve a subjectively required level of redundancy (e.g. repetition, reformulation).
- Personalization. This means flexibility in the use of resources and in engagement in activities, in order to suit personal preferences towards certain types of (code) representation, in order to grant the reaching of personal levels (depths) of processing, in order to allow freedom of access and exploration, and in order to facilitate the fitting on to personal prior cognitive structures. Personalization satisfies the need to reduce noise in a way and in a measure appropriate to the decoding capacity of each individual.
- Integration. This means the fast and flexible availability of resources variously coded (text, video, audio) and the doing of various activities (exploration, manipulation, extrapolation, simulation, etc.) through the use of the same learning tool and in the course of the same learning experience. Integration as the overcoming of boundaries not only in knowledge representation but also in knowledge exploration, satisfies the need for the right balance of redundancy and entropy insofar as it allows the linking and building of new knowledge on the basis of prior personal cognitive elements and structures.
In conjunction, individualization, personalization and integration, by reducing noise, permitting appropriate redundancy and effectively balancing redundancy and entropy, maximize the capacity of the decoder.
The learning resources available until recently (teachers, books, audio-visual aids, etc.), while performing well in certain regards (e.g. a human tutor as a problem stimulator, a book as an information supplier), fall short in one way or another with respect to the simultaneous satisfaction of the three requirements previously expounded.
For example, a learner has to take into account the rigid time availability of a human tutor and his/her general reluctance to repeat or reformulate the same message over and over again, or the lack of flexibility for personal exploration (e.g. jumping from point to point) of an audio-visual aid.
For this reason, new and more advanced learning resources must be employed if the guidelines are to be implemented.
The new tools
During the '80s, a series of communication resources (hardware and software) emerged which promise, at last, to achieve a satisfactory level of individualization / personalization / integration.
There are, for example, electronic lexicons-thesauri (explanation of signs), knowledge bases (representations of facts, concepts), expert systems (manipulation of inferences) and multimedia devices (text, graphics, sound).
Towards the end of the '80s and the beginning of the '90s, a new tool which sums up the features of these resources has been spreading and is being used more and more within learning experiences.
The features of this new tool, Hypertext, will now be examined briefly to see if and to what extent it can be taken as an answer to the needs previously noted.
Hypertext: Origin and Development
The term "Hypertext" comes from T. H. Nelson and refers to "a combination of natural language text with the computer's capacity for interactive branching or dynamic display ... of a non-linear text ... which cannot be printed conveniently on a conventional page" (T. Nelson in J. Conklin, 1987).
Vannevar Bush, director of the United States Office of Scientific Research and Development (1941), is credited with being the originator of the idea of a hypertext tool. He first expressed this idea in an article in the "Atlantic Monthly" (1945) where, after criticizing as inadequate the current methods of transmitting and receiving research information, he called attention to the need for a device (which he called "Memex") in which "an individual stores his books, records and communications and which is mechanized so that it may be consulted with exceeding speed and flexibility" (V. Bush in I. Ritchie, 1989).
The rationale behind the proposed device was the fact that "the human mind ... operates by association" building intricate webs of trails. In contrast, the conventional methods of data finding (alphabetical or numerical) are hierarchical; that is "information is found by tracing it down from subclass to subclass" and "one has to have rules as to which path will locate it, and the rules are cumbersome". Moreover "having found one item ... one has to emerge from the system and re-enter a new path". Instead, the "Memex" would allow the user to freely join items of knowledge, to build personal trails and to recall them instantly, "merely by tapping a button" (V. Bush in I. Ritchie, 1989).
The "Memex" conception began to be implemented during the '60s, following the work of Douglas Engelbart at the Stanford Research Institute.
The NLS (oN-Line System) he developed "allowed users to create electronic documents based on connected concepts, to build hierarchies of information and to collaborate with others on the joint development of documentation" (I. Ritchie, 1989).
Another important contribution came from Ted Nelson, who coined the term Hypertext to refer to "non sequential writing" and spread it through his self-published book "Computer Lib/Dream Machines" (1974). His later project, called "Xanadu" after the magical place of Coleridge's poem, aims at building a hypertext server of the world's literature based on a web of links that can be extended by the user, forming new pathways of access and exploration.
Other people and research centres involved in developing the hypertext concept have been: Xerox PARC, with the leading figure Alan Kay (PCW, Dec. 1987), producing "NoteCards" (F. G. Halasz, 1988); Carnegie Mellon University, designing first "ZOG" and then "KMS" (R. M. Akscyn, D. L. McCracken, E. A. Yoder, 1988); Brown University, devising "Intermedia" (N. Yankelovich, B. J. Haan, N. K. Meyrowitz, S. M. Drucker, 1988); and the University of Kent (U.K.), shaping "Guide" (G. Einon, 1990).
Hypertext: Philosophy and Features
The aim behind the development of Hypertext is that of simulating the way the human mind works and stimulating (enhancing) its working power.
For this reason Hypertext has been described also as "a computer-based medium for thinking and communication" (J. Conklin, 1987).
Thinking. The process of thinking is a complex, multi-faceted one in which many ideas, at various levels, from heterogeneous sources, undergo many different operations (i.e. are examined, linked, compared, weighed, assessed, criticized and so on).
Communication. Communication is a process which, at its fullest, involves all the physical and mental faculties in the effort of organizing (encoding/decoding) messages in a multiplicity of ways (linear, non-linear; hierarchical, non-hierarchical), variously suitable to a plurality of subjects.
As a tool for thinking and communication, hypertext shows features that give rise to a series of possibilities, namely:
- Flexibility of organization. People can organize (transmit/access/explore) messages in a fast and highly flexible way which allows for linearity as well as non-linearity, for hierarchical as well as non-hierarchical or multi-hierarchical structuring.
- Modularity of composition. The message is arranged on a support of nodes from which links depart, producing nets; the whole is a structured web of potentially accessible/practicable knowledge paths (see the sketch below).
- Variety of representation. The message can be represented using different codes, such as written text, sound, fixed images, graphics and animation.
For this reason the word Hypermedia has been introduced to denote Hypertext programmes which integrate different representational codes.
Besides that, the new generation of Hypertext programmes (1990) offers 'active' hypertext-hypermedia systems which support inference engines and rule-based reasoning (D. A. Carlson and S. Ram, 1990), making it possible to combine in the same tool the properties of a multimedia knowledge base with those of a multimedia expert system.
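A minimal sketch can make the node-link modularity and the 'active' behaviour concrete. The fragment below is purely illustrative (the classes, node titles and rule are ours, not taken from NoteCards, KMS, Intermedia or Guide): nodes hold content, labelled links join them into a web, and a simple rule attached to a node fires when a reader's trail reaches it.

```python
class Node:
    """A hypertext node: some content, labelled links, optional rules."""
    def __init__(self, title, content):
        self.title = title
        self.content = content
        self.links = {}   # label -> target Node
        self.rules = []   # callables fired when the node is visited

    def link(self, label, target):
        self.links[label] = target

def visit(node, trail):
    """Record the reader's trail and fire any 'active' rules on the node."""
    trail.append(node.title)
    for rule in node.rules:
        rule(node, trail)

# A tiny web of three nodes.
intro = Node("Introduction", "What is a hypertext?")
memex = Node("Memex", "Bush's associative trails.")
xanadu = Node("Xanadu", "Nelson's docuverse.")
intro.link("history", memex)
memex.link("successor", xanadu)

# An 'active' rule: once the trail is long enough, suggest where to go next.
def suggest(node, trail):
    if len(trail) >= 2 and node.links:
        print("Suggested next:", ", ".join(node.links))

memex.rules.append(suggest)

trail = []
for step in (intro, memex):           # the reader jumps from node to node
    visit(step, trail)
print("Trail:", " -> ".join(trail))   # a personal trail, recallable at will
```

Non-linear exploration is then just a choice among a node's links, and a trail is itself data that can be stored and replayed, in the spirit of Bush's Memex.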
For a basic survey of Hypertext-Hypermedia philosophy and related programmes, the following magazine issues are particularly worth referring to: IEEE Computer (September '87 and January '88), Byte (October '88), The Computer Journal (June '89), Communications of the ACM (July '88, July '89, March '90).
Guidelines and Hypertext
The general frame of Learning and Communication Guidelines has stressed the need for experiences, in learning and communication, based on individualization, personalization and integration.
The CAL tools surveyed in Chapter 2 already show a trend towards the achievement of these requirements.
The new Hypertext-Hypermedia tools go further in the same direction and, matched with current hardware development, seem to be leading, in the future, towards:
- Hyper-individualization. The cost, in real terms, of computers has continuously fallen, especially considering the cost/performance ratio. This means that computers have become and probably will continue to become more and more economically affordable not only by small educational institutions but also by individuals and families with an average income.
- Hyper-personalization. The increasing availability of programmes (coursewares), their ease of use (friendliness), and their versatility (responsiveness) make personalization achievable in four respects:
i) Personal Selection. Freedom to choose the most appropriate (personally suited) program amongst an increasingly vast collection.
ii) Personal Exploration. Possibility of freely navigating (in a personally tailored way) through the information contained in coursewares (i.e. by inquiring, browsing, gaming, simulating, extrapolating, etc.).
iii) Personal Modification. Freedom to introduce additions and changes (e.g. notes, updating, etc.) for personal use.
iv) Personal Production. Availability of easy-to-use authoring languages which allow the production of self-customized coursewares.
- Hyper-integration. The introduction of optical disk and laser technology, with its capacity to store massive amounts of information and to access it very fast and flexibly, makes the full integration of representational codes (text, image, animation, sound) and the interlinking-intercrossing of previously distinct domains of knowledge easy to achieve and very efficient to perform.
Conclusions
The above guidelines for learning and communication, and the tools deemed capable of implementing them, together allow for the more precise formulation of the working hypothesis already put forward, in general terms, in Chapter 3.
The reformulated hypothesis is as follows: the use of a hypertext-based courseware in the area of research methods will promote individualization, personalization and integration of learning experiences, and this would generally (i.e. in most cases) lead to increased efficiency in learning performances and to an enhancement of the researcher's skill (i.e. memory, mastery, consistency, creativity).
Starting from this hypothesis and on the basis of the considerations advanced so far, a courseware has been produced on research, design and planning methods under the title "Problemistics : the art and craft of problem dealing."
Only extensive use of the courseware and continuous feedback from its users can corroborate or refute the hypothesis.
References
- [1945] Vannevar Bush, As We May Think, Atlantic Monthly, July 1945, pp. 101-108
- [1949] Claude E. Shannon (with an essay by Warren Weaver), The Mathematical Theory of Communication, University of Illinois Press, Urbana, 1963
- [1971] Nan Lin, The Study of Human Communication, The Bobbs-Merrill Company, Indianapolis, 1973
- [1974] Ted Nelson, Computer Lib/Dream Machines, Tempus Books of Microsoft Press, Redmond, Washington, 1987
- [1975] Umberto Eco, Trattato di Semiotica Generale, Bompiani Editore, Milano, 1978
- [1982] Denis McQuail and Sven Windahl, Communication Models for the Study of Mass Communication, Longman, London, Second Edition 1993
- [1987] Byte, Heuristic Algorithm, October 1987
- [1987] Interview, Alan Kay: Face Values, PCW, December 1987
- [1987] Jeff Conklin, Hypertext: An Introduction and Survey, Computer, September 1987
- [1988] Communications of the ACM, Special Issue: Hypertext, July 1988
- [1988] Frank G. Halasz, Reflections on NoteCards: Seven Issues for the Next Generation of Hypermedia Systems, Communications of the ACM, Vol. 31, nº 7, July 1988
- [1988] Nicole Yankelovich, Bernard J. Haan, Norman K. Meyrowitz and Steven M. Drucker, Intermedia: The Concept and the Construction of a Seamless Information Environment, Computer, January 1988
- [1988] Robert M. Akscyn, Donald L. McCracken and Elise A. Yoder, KMS: A Distributed Hypermedia System for Managing Knowledge in Organizations, Communications of the ACM, Vol. 31, nº 7, July 1988
- [1989] Communications of the ACM, Interactive Digital Video, July 1989
- [1989] Ian Ritchie, Hypertext - Moving Towards Large Volumes, The Computer Journal, Vol. 32, nº 6, 1989
- [1990] David A. Carlson and Sudha Ram, HyperIntelligence : The Next Frontier, Communications of the ACM, Vol. 33, nº 3, March 1990
- [1990] Geoffrey Einon, Hypertext. Behind the Hype, Practical Computing, March 1990
- [1990] Jakob Nielsen, The Art of Navigating through Hypertext, Communications of the ACM, Vol. 33, nº 3, March 1990