TC03 - ICT tools in education

Site: Tenegen
Course: TC03 - Educational ICT tools
Book: TC03 - ICT tools in education
Printed by: Guest user
Date: Thursday, 18 April 2024, 8:09 PM

Description

-

Multimedia

Learning Objectives

When you have completed this session, you should be able to

  • meaningfully define multimedia systems,
  • identify different stages of a multimedia system.

Reading

Media is the plural form of the Latin noun medium originally meaning a person able to mediate between people and the spiritual world.
In contemporary everyday English language medium refers to a tool suitable for conveying a message, while the word media is a collective term for the tools and institutions of mass communication (TV, radio, the printed press).


Multimedia (i.e. several media) conveys a message through several channels at the same time and is able to integrate several media (texts, audio elements, pictures, videos, animations) into one communication system.

Methods of digitizing text are as old as PCs, but analog-to-digital conversion has not been easy, even since the widespread use of personal computers.

The technological novelty of the first Multimedia PCs (at the beginning of the 1990s) was their ability to display analog film on a monitor under the control of digital technology. Ever since, we have defined a computer system as a multimedia system when it is able to integrate at least one discrete (time-independent) and one continuous (time-dependent) medium.

The widespread use of multimedia dates back to the beginning of the 1990s, when the capacity and speed of personal computers supported the storage, transmission and playback of memory-hungry media elements (pictures, audio materials, videos). 1992 was an important year in the history of multimedia, as the introduction of the World Wide Web made the use of multimedia elements a near necessity.

In this age of technological convergence, analog technology was largely discarded in favour of digital solutions. In parallel, there are now more and more devices - like mobile phones - that are multimedia capable. In the 90s a multimedia PC was considered a curiosity. By 2009 all personal computers and laptops were able to play multimedia presentations without the need for special IT support to create presentations or to transmit multimedia messages (that is, messages containing videos, text, pictures, music or human voices) over the Internet.

The development of communication tools based on computer technology - technological convergence - has significantly speeded up the spread of multimedia, and the areas of application are becoming ever broader. Interactive information points in museums, satellite positioning and navigation systems (GPS), street surveillance camera systems, and digital televisions - which will replace traditional televisions within a few years - can all be regarded as multimedia. The next generation will see us borrow books from a multimedia “world library” and watch digital TV, though the actual appliance will not necessarily be a TV as we know it.

Virtual reality is a special field of multimedia development which is no longer restricted to the entertainment industry. It is spreading into every field of science and the arts.

Multimedia and the Internet have fundamentally changed our understanding of human communication and of the ways knowledge is created and distributed - and at the same time they have had a significant impact on how we now think about learning and teaching.


Exercises

"Advances in technology have powered paradigm shifts in education (Frick, 1991)...the video from the award winning authors of Fluency in Distance Learning (www.tdsolutionsonline.com) encourage educators to use technology in new ways to support and transform erstwhile ways of teaching and learning." (YouTube)

Do you agree with the author?

Watch this video, and write your opinion about its message in your learning diary!

Multimedia in education

Learning objectives


When you have completed this session, you should be able to

  • define the term "multichannel mediation",
  • list the work phases in multimedia development,
  • list the competences needed for multimedia development,
  • estimate the costs associated with educational multimedia,
  • identify your own potential role in educational multimedia development.

Reading

One of the most frequently cited features of multimedia is that it conveys a message through several channels at a time, using information aimed at several sense organs (through listening and seeing, for example) simultaneously.

Research has shown that if we use several organs of perception at the same time, we are able to process more “data” per unit of time, so the intensity of learning through multimedia can improve. On average, people are able to remember 20% of information heard, 30% of information seen and 50% of information simultaneously heard and seen, but the best result (80%) can be achieved if we see, hear and need to “act” during a lesson.

Educational multimedia - multimedia software - offers instructional designers a range of opportunities to involve the learner intensively in the learning process, including the potential of choosing a personal learning path. It is individual vision and ability that limits the range of possible integrated interactions, including quizzes, problem-solving activities, special simulations and animations, etc. All should, however, also require action on the part of the learners. The term 'interactivity' originated in computer technology, referring to human-computer communication through specific user interfaces and interactions with software systems. The term has a special, extended meaning in relation to educational multimedia, because interactivity - shifting the learner's role from observer to participant - is a dominant factor in improving the effectiveness of the learning process.


In spite of the overhyped expectations of e-learning developments in the 90s - focused mainly on educational multimedia - the expected impacts have not been realized: e-learning based on multimedia solutions has not been able to revolutionize education. The educational world is now over this first period of euphoria and over the 'e-learning hype', but many teachers are now skeptical about the real demand for educational multimedia, and about its effectiveness for improving learning processes in schools. Multimedia CD development -- heavily promoted in the 1990s -- did not manage to integrate e-learning methods into the pedagogical practices of schools, even in front-runner countries. (2)

What is the problem with multimedia?

In a Hungarian comparative study (Nádasi, 2002) the efficiency of different media in learning produced results worth noting:

  • Neither 'traditional' nor electronic media proved to be significantly better than the other in terms of efficiency of learning/teaching.
  • Each medium carries a specific (additional) opportunity, but it can be exploited only in a well-determined learning environment -- one appropriate to the given medium.
  • The efficiency of learning depends to a great extent on how much the teaching material is in harmony with the specific features of the delivery medium.
  • Not all teaching materials can be presented effectively through every medium.
  • When selecting the delivery medium, and from an efficiency viewpoint, it is essential to consider the learner’s age, abilities and cognitive level, as well as the way a teacher uses the given tools.
  • A perfectly elaborated medium, which has already proven to be effective, can be used badly.


According to a national survey (K. Radnóti, 2006), only 54% of Hungarian teachers think that it is worth integrating multimedia into lessons; 21% think it may be useful at times, and 20% of teachers said there was no justification at all to use multimedia in education.

Why is this? Are there specific problems with educational multimedia?

Pro


"Multimedia will provoke radical changes in teaching during the coming decades, particularly as smart students discover they can go beyond the limits of traditional teaching methods. Indeed, in some instances teachers may become more like guides and mentors along the learning path, not the primary providers of information and understanding - the students, not teachers, become the core of the teaching and learning process. This is a sensitive, highly politicized subject among educators, so educational software is often positioned as "enriching" the learning process, not as a potential substitute for traditional teacher-based methods." (Tay Vaughan, 1994)

Contra


“After an initial period of enthusiasm, often described as ‘hype’, there are growing doubts about the real demand for educational e-content, and about its relevance for improving learning” (European Commission, 2002). "Learning is based on motivation, and without teachers that motivation would cease to exist." (Educating the Net Generation, 2005) “Despite the considerable efforts undertaken, the eLearning sector is still fragmented and there are many open questions on how to exploit the potential of ICT in education and training. A broad partnership between the various stakeholders of industry, education and training, public sector and civil society is needed for Europe to reap the full benefits of ICT and learning in the knowledge society.” (A review of studies of ICT impact on schools in Europe, European Schoolnet, 2006)

Pro


“Open-endedness and flexible combinations of text, graphics, video and audio therefore are the key stepping stones to enabling e-Learning design to become easier. Designers need to be the creators envisioned by Weizenbaum in the early 1980s, exploring with curiosity and supported by multidimensional ways of working with information.” (B. Holmes and J. Gardner)

Contra


"Obviously it is easier for a student to understand the conformation of cyclohexane in the chemistry lesson if it is presented in 3D format. However, by doing this we might as well remove one small “brick” from the student’s development instead of “building” and promoting understanding. Instead of trying to imagine and understand the position of the binding angles in space, the student sits and waits for feeding his brain with an easily understandable pulp. If the student’s space perception does not develop and the ability to imagine the position of atoms in space regresses, his chances are considerably reduced to be able to imagine the structure of a crystal lattice (just to take another example from chemistry in order to draw your attention to the problem). Naturally students can survive by using several mass-produced mental “crutches” and artificial legs until the end of their high school studies, but this way of learning and teaching cannot be considered normal…” (G. Hanczár, 2007)


Pedagogical and psychological considerations are given special emphasis in the development of educational multimedia, and multimedia software should also pay particular attention to ergonomic requirements. It is relatively simple to determine whether particular material deserves to be classified as “educational multimedia”: if it does not utilize the opportunities offered by modern technology to promote understanding, it is merely a simple demonstration tool. This poses the question: how difficult is it to meet learning expectations for educational multimedia?

Development of educational multimedia - what does it mean?

Proper development of multimedia needs the collaboration of several experts within a well-managed project framework. Depending on the pedagogical aims, the project's size and the subject, the project work is generally a team effort that may require the participation of instructional designers, content developers, editors, multimedia designers (graphic artists, animators), interface designers, video producers (camera operators, film editors), musicians, audio engineers and software engineers.
So the development may require experience of producing traditional textbooks, but could also require the craft of the motion picture industry and the software industry. The main stages of an average multimedia project are:

  • planning and costing,
  • designing and producing,
  • testing,
  • delivery.


Similarly to the production of a film, in the design phase creative plans, manuscripts and storyboards should be prepared. The media elements (pictures, graphics, music, sound effects, narration, video clips) should be produced to a high quality. When integrated, the resulting system -- which may be quite complex -- should be an interactive presentation with high visual and semantic consistency.

"Now computers can be television-like, book-like and 'like themselves'. Today's commercial trends in educational and home markets are to make them as television-like as possible. And the weight of the billions of dollars behind these efforts is likely to be overwhelming. It is sobering to realize that in 1600, 150 years after the invention of the printing press, the top two bestsellers in the British Isles were the Bible and astrology books! Scientific and political ways of thinking were just starting to be invented. The real revolutions take a very long time to appear, because as McLuhan noted, the initial content and values in a new medium are always taken from old media." (Alan Kay, 1996.)


To answer the question posed at the start, some of the problems can be explained by the fact that in early developments the concept of 'multimedia' was not thoroughly elaborated, particularly as the complexity of the theme was greater than first thought. There is no doubt that educational multimedia -- with attractive, visually consistent animation, simulation games and pedagogically validated interactivity -- will play an important role among the toolkits that teachers use in this information era, but there is a need to be aware of some facts:

  • development is very expensive,
  • low-quality multimedia can do more harm than good,
  • not all teaching subjects are suitable for presentation as multimedia.


In this 'e-learning 2.0' period we are over the misconception that only educational multimedia can provide relevant pedagogical tools that meet the expectations of the information society, and hence that we should develop all learning content in the form of multimedia animations and simulations.

While web 2.0 tools for networked learning and online collaboration are coming to the fore, multimedia remains important: not as the 'king' of the e-learning space, but as one of the educational tools which can be applied in relevant situations by teachers.

"This new kind of "dynamic media" is possible to make today, but very hard and expensive. Yet it is the kind of investment that a whole country should be able to understand and make. I still don't think it is a real substitute for growing up in a culture that loves learning and thinking. But in such a culture, such new media would allow everyone to go much deeper, in more directions, and experience more ways to think about the world than is possible with the best books today. Without such a culture, such media are likely to be absolutely necessary to stave off the fast-approaching next Dark Ages." (Alan Kay, 1996.)

The role of teachers in development

As we have seen, the roles within multimedia development generally involve professionals from the motion picture industry, from the software industry, and from other creative areas. Multimedia development is not a personal venture (albeit Leonardo da Vinci was scientist, architect, creative designer and poet folded into one!).

However, educational multimedia cannot be successful without the experience of teachers. Teachers should be present in the development as instructional designers, authors, and pedagogical or methodological experts.

In e-learning 2.0 the focus has moved to applications for sharing small electronic educational resources - learning objects - prepared for special pedagogical aims. Learning objects can be published and shared through social networks and among the self-organized online communities of teachers. Web 2.0 tools offer a very different paradigm to that of large multimedia systems with their complicated development and publication paths. Now teachers can construct valuable repositories storing large numbers of small educational elements, offering teachers and students an outlet for their creativity.

Exercises

1. Please record your opinion, in your learning diary, about educational multimedia you have already used in your pedagogical work!

For this course we have established social bookmarks for sharing links related to the topics discussed in the lessons. The website of the bookmarks: http://www.diigo.com/user/tenegen. Please extend the lists with links to online educational multimedia you think is worth sharing with us. (To enter a new bookmark you need to login with the account details of user: tenegen, password: netgen555.)

2. Do you have creative students who are experienced in creating digital pictures, editing videos and digital sounds? If yes, would you (could you) involve them in your development work? What do you think about the future: could members of the net generation ever take part in teachers' pedagogical work as creative partners?

References

1. A. Nádasi: Educational technology and tools, ELTE, Budapest, 2002.
2. Werner B. Korte, Tobias Hüsing: Benchmarking Access and Use of ICT in European Schools, Empirica, 2006, http://www.empirica.biz/empirica/publikationen/documents/No08-2006_learnInd.pdf
3. Z. Kerber: Bridges between the subjects, National Institute for Public Education, Budapest, 2006.
4. K. Radnóti: What kind of assessment methods are preferred by the teachers in the Hungarian schools? 2006.
5. G. Hanczár: What is the problem with the Multimedia? New Pedagogical Journal, 2. issue, Budapest, 2007.
6. Theodore Roszak: The Cult of Information: The Folklore of Computers and the True Art of Thinking,1986.
7. B. Holmes and J. Gardner: e-Learning- Concepts and Practice, SAGE Publications, London, Thousand Oaks, New Delhi, 2006.
8. Alan Kay: Revealing the Elephant: The Use and Misuse of Computers in Education, Educom review, 1996.
9. Educating the Net Generation, edited by Diana G. Oblinger and James L. Oblinger, 2005 EDUCAUSE, http://www.educause.edu/educatingthenetgen

Hypermedia in communication

Learning objectives

When you have completed this session, you should be able to

  • differentiate between the concepts of hypertext, hypermedia and multimedia,
  • understand the historical significance of changes in human communication.

Reading

Hypertext, hypermedia, multimedia

The three concepts listed in the title are interrelated; trying to define one of them will inevitably lead to the second or the third.

Social scientists rely on written materials, so it is easy to understand their excitement over the features that text on the World Wide Web can have: the possibility of following innumerable links which lead in various directions and to many mental adventures, seemingly infinitely expandable and able to be searched, edited and modified. Ever since the Internet was opened up and made available to higher education institutions, many analyses have been published predicting fundamental changes in the history of human communication.

Though first advocated over 60 years ago, the newly exploited concept of hypertext is one of the mainstays of the technology and offers a new paradigm for the organisation of content.


Hypertext consists of electronically stored documents interconnected through nodes, or links. Texts built this way can be extended through linking without obvious limit, and can be found, studied and read by following the linkages. When reading a text, it is easy to follow new text 'nodes', offering new branches, because each uniquely identifies the next document in the chain. The identifiers used in the nodes are called “hyperlinks”, or simply links.
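The node-and-link structure described here can be modeled in a few lines of code. The following Python sketch is purely illustrative (the document names, their texts and the `follow` helper are invented for this example); each node stores its text together with the identifiers of the documents it links to:

```python
# A minimal, illustrative model of hypertext: documents (nodes)
# holding text plus hyperlinks (identifiers of other nodes).
documents = {
    "multimedia": {
        "text": "Multimedia integrates several media into one system ...",
        "links": ["hypertext", "hypermedia"],
    },
    "hypertext": {
        "text": "Hypertext interconnects documents through links ...",
        "links": ["hypermedia"],
    },
    "hypermedia": {
        "text": "Hypermedia extends hypertext with pictures, audio, video ...",
        "links": ["multimedia"],
    },
}

def follow(start, path):
    """Read a chain of documents by following hyperlinks from `start`."""
    current = start
    visited = [current]
    for link in path:
        if link not in documents[current]["links"]:
            raise ValueError(f"{current!r} has no link to {link!r}")
        current = link
        visited.append(current)
    return visited

# A reader is never forced onto a single linear route: at each node,
# any of the listed links may be chosen.
print(follow("multimedia", ["hypertext", "hypermedia"]))
# → ['multimedia', 'hypertext', 'hypermedia']
```

The same three documents could be read in a quite different order by choosing other links, which is exactly what distinguishes hypertext from a fixed, linear sequence of pages.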

„Electronic linking shifts the boundaries between one text and another as well as between the author and the reader and between the teacher and the student. It also has radical effects on our experience of author, text, and work, redefining each. Its effects are so basic, so radical, that it reveals that many of our most cherished, most commonplace, ideas and attitudes toward literature and literary production turn out to be the result of that particular form of information technology and technology of cultural memory that has provided the setting for them. This technology - that of the printed book and its close relations, which include the typed or printed page - engenders certain notions of authorial property, authorial uniqueness, and a physically isolated text that hypertext makes untenable.” (George P. Landow, 1991)

Hypertext does not force the reader to follow a strictly linear route. Any path can be followed by clicking on a link in the text, and it allows for a relatively easy return to the original text. Within the confines of a single document, we often do this when we read a footnote or follow a reference to the back of the book. We do something similar when we pick up a completely different book referenced by the one we started to read. The only difference is that in the case of a real book we have to physically find it on the shelves of the library.
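The "easy return to the original text" behaves like the back button of a modern browser, which can be modeled with a simple stack. This Python sketch is hypothetical (the `Reader` class and its method names are invented for illustration):

```python
class Reader:
    """A hypothetical model of hypertext reading with a back button."""

    def __init__(self, start):
        self.history = [start]       # stack of visited documents

    def follow_link(self, target):
        self.history.append(target)  # push the newly opened document

    def back(self):
        if len(self.history) > 1:
            self.history.pop()       # return to the previous text
        return self.history[-1]

# Following a footnote-like link and then returning:
r = Reader("main text")
r.follow_link("footnote")
print(r.back())  # back to "main text"
```

The stack is what makes the return "relatively easy": no shelf-searching, just popping the most recent document off the reading history.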

Hypermedia is an extension of the concept of hypertext: in addition to text documents, different kinds of media elements -- pictures, audio recordings or videos -- can be found in the referenced nodes. Hypermedia started to take off at roughly the same time as the World Wide Web, which was also when hardware limits on the transmission and presentation of media elements eased. Hypermedia is now becoming dominant on the Internet.

„Hypermedia is a holistic world and knowledge model, for it intends to include theoretically everything, the “whole”, and everything is inevitably related to everything. Specialization, which used to save the life of civilization, has become life-threatening, because the parts do not communicate with each other and we forget to think about the consequences. The storing capacity and the operation speed of computers allow us to approach the world in a complex, universal way again.” (J. Sugár)

Did multimedia or hypermedia come first? Hypermedia was certainly postulated before multimedia; however, multimedia CDs came into existence before real hypermedia. This was soon followed by hypertext on the then largely text-based Internet. Now the use of hypermedia on the World Wide Web -- based on multimedia elements -- is firmly established. Multimedia can be considered a set of media established for conveying a particular message – for instance an electronic syllabus – with finitely many predetermined access routes. It is more definite than hypermedia in that it is developed for established goals, but this fixed design limits the potential for following unexpected and unanticipated routes.

The history of hypertext

Working with the first computerized word processors was not particularly easy; even today we tend to need some instruction in the use of word-processing software to write even the simplest of letters. However, the potential to search electronically stored texts on the Internet is enormously valuable to researchers, authors, philosophers, librarians and others: no longer do they have to physically search through thousands of pages of books in a library.

The names hypertext and hypermedia were first used by Theodor Holm Nelson -- the American philosopher and sociologist(*) -- in 1963, when he was thinking about designing a universal, computerized word processor. With the prefix “hyper” he intended to emphasize that this is a kind of electronically stored text with a fundamentally different structure from traditional texts.
He presented his thoughts in 1965 to the ACM (Association for Computing Machinery) conference: a word processor capable of supporting “non-sequential” writing, which could compare different versions of texts page by page and return to any of the previous versions. With the help of so-called “connecting lists”, any unit of a given text could be linked to a similar unit of another text, with a link created between them.

Nelson’s idea was, in turn, a result of his having attended his teacher’s (Vannevar Bush's) presentation about a computerized document processing system called Memex. Vannevar Bush was working as the scientific advisor of President Roosevelt in 1945. In this role he had to coordinate the work of many thousands of scientists, and by reading linearly printed texts he was unable to make progress at the rate his work required. In 1945 (a few months before the first computer, ENIAC, was announced at a news conference) he argued in his now-famous article that, since the human mind is more associative than linear, there would be great advantage in creating a machine that could support this arrangement of information.

"The human mind does not work that way. It operates by association. With one item in its grasp, it snaps instantly to the next that is suggested by the association of thoughts, in accordance with some intricate web of trails carried by the cells of the brain. It has other characteristics, of course; trails that are not frequently followed are prone to fade, items are not fully permanent, memory is transitory. Yet the speed of action, the intricacy of trails, the detail of mental pictures, is awe-inspiring beyond all else in nature. Man cannot hope fully to duplicate this mental process artificially, but he certainly ought to be able to learn from it. In minor ways he may even improve, for his records have relative permanency." The tool Memex imagined by Bush is referred to today as the intellectual ancestor of hypertext, though the appearance of “non-linear” text structures dates back much earlier. (**)

Nelson's presentation in 1965 did not create much interest, because he was unable to support his ideas properly from a technical point of view; computer scientists pronounced his ideas a daydream. In spite of this Nelson looked for sponsors and partners (computer programmers) and established the “Xanadu” project. Over the next 20 years he and his colleagues worked towards the realization of this dream. When the technical solutions and underlying methods did not prove viable, in 1988 they convinced the company Autodesk to support the project. The software was completely redesigned and the programs were rewritten, but deadlines slipped badly. In 1992, months from completion, Autodesk dropped the project and financial support ended. Though he spent 27 years, from 1965 to 1992, on the development, this was not enough for Nelson to prove that his idea could be fully realized, though he made many advances.

Meanwhile the English information technologist Tim Berners-Lee (today the leader of the World Wide Web Consortium) devised a simple plan for a world wide web of information, and presented it in 1989 to fellow physicists at CERN, in Geneva. His plans were accepted, and as a result of the developments the World Wide Web (WWW) appeared in 1992. In 1994 CERN and MIT established the W3C consortium (Tim Berners-Lee became its director); shortly before, the first graphical browser software -- MOSAIC -- had been developed.


Ted Nelson did not like the WWW at all. In his criticism he pointed out that URL-based identification is a much poorer solution than the one they had designed in XANADU, which could have identified every unit, every letter, every picture fragment and every sound scrap of every document stored in the system, or on the 'net'. Perhaps it was this over-ambition that held up progress?

Finally...

Marshall McLuhan (1911-1980), a Canadian literary historian, categorized technical media into generations. The first-generation media are simple extensions of the biological sense organs. The second generation can be connected to the appearance of alphabetic handwriting, while the third is connected to the printing of books. The foundation of the fourth-generation media is analogue signal transformation; this covers the development of the radio, the telephone and the camera. Finally, with digital technology came the improvement of electronics and the creation of computers, and hence the establishment of the fifth-generation media.

In his book published in 1962 (The Gutenberg Galaxy) he postulated relatively heretical ideas that caused great controversy: he dared to question whether the written text or the printed book was really the educating medium closest to human nature.

His other revolutionary statement was that technical media have such a great effect on society that they can change the forms and habits of human production, consumption and contact; in this way they are able to fundamentally influence the course of social development. “The medium is the message” that reformulates the world.

McLuhan did not fully back up his opinions, as would be expected of a scientific work, yet through his avant-garde ideas he generated a mass reaction from his contemporaries – both pro and contra – and the resulting disputes about his thoughts are not over yet. Prior to his work, no one had dared to question the positive aspects of writing, and no one was thinking about how book printing had locked mankind into a visual world for centuries (pushing the other sense organs into the background in the process of knowledge acquisition). He also postulated the “typographic man”, trapped in an artificial system of signs for centuries.

How is this all related to multimedia? Pedagogy today is mostly concerned with a necessary and inevitable paradigm change. Education researchers have determined that, at this stage in their development, neither multimedia nor e-learning can meet the levels of educational expectation claimed for them. The last word here goes to the respected scholar George P. Landow. Though he does not give any recipe for the future, he does comfort us with the fact that our ancestors were not any better at transforming education than we are.

"First of all, such transitions take a long time, certainly much longer than early studies of the shift from manuscript to print culture led one to expect. Students of technology and reading practice point to several hundred years of gradual change and accommodation, during which different reading practices, modes of publication, and conceptions of literature obtained. According to Kernan, not until about 1700 did print technology "transform the more advanced countries of Europe from oral into print societies, reordering the entire social world, and restructuring rather than merely modifying letters". How long, then, will it take computing, specifically, computer hypertext to effect similar changes? How long, one wonders, will the change to electronic language take until it becomes culturally pervasive? And what byways, transient cultural accommodations, and the like will intervene and thereby create a more confusing, if culturally more interesting, picture?" (George P. Landow, 1992)

Exercises

Practical studies that assess the impact of hypertext and hypermedia on learning are relatively rare. Do any of your students use this technology when preparing their homework? If yes, do you think it is useful or not?

Share your opinion with others on the forums.

References

1. J. Sugár: The medium of thinking, 1998 http://artpool.hu/hypermedia/index.html
2. Dr. Vannevar Bush: As We May Think, Atlantic Monthly, July 1945.
3. Marshall McLuhan: The Gutenberg Galaxy. The Making of Typographic Man. University of Toronto Press, 1962.
4. Georg P. Landow: Analogues to the Gutenberg Revolution, Johns Hopkins University Press, 1992. (http://www.cyberartsweb.org/cpace/ht/jhup/contents.html)

Notes

*Ted Nelson's website: http://xanadu.com.au/ted/

**"Selection by association, rather than by indexing, may yet be mechanized. One cannot hope thus to equal the speed and flexibility with which the mind follows an associative trail, but it should be possible to beat the mind decisively in regard to the permanence and clarity of the items resurrected from storage. Consider a future device for individual use, which is a sort of mechanized private file and library. It needs a name, and to coin one at random, "memex" will do. A memex is a device in which an individual stores all his books, records, and communications, and which is mechanized so that it may be consulted with exceeding speed and flexibility. It is an enlarged intimate supplement to his memory." (Vannevar Bush: 1945)

Evaluation

Learning Objectives

When you have completed this session, you should be able to

  • list the criteria for evaluating educational multimedia,
  • do an evaluation with respect to your own pedagogical aims.

"A good rule of thumb for curriculum design is to aim at being idea-based, not media-based. Every good teacher has found this out. Media can sometimes support the learning of ideas, but often the best solutions are found by thinking about how the ideas could be taught with no supporting media at all. Using what children know, can do, and are often works best. After some good approaches have been found, then there might be some helpful media ideas as well." (Alan Kay, 1996)

It is not easy to devise a list of evaluation criteria for judging the quality of educational multimedia, but even if a particular solution meets all the technical requirements, it is not certain that it will also satisfy our pedagogical aims.

Determining the pedagogical value of educational multimedia may be as difficult as explaining why we like, or do not like, a painting. Some important points for evaluation are listed below, but this list contains only the "necessary" conditions, which may not always be sufficient to decide whether a given multimedia application is appropriate. They are also minimum requirements, applicable to all educational multimedia.

An evaluation must take into consideration aspects of general educational content, which could equally be satisfied by textbooks or other, more traditional educational materials. Multimedia goes well beyond this set of content: it includes user software, which has to be examined for conformity and generality. Attention also has to be paid to the overall presentation, i.e. the consistency of the media elements within the whole multimedia application.

Pedagogy, didactics, psychology

At a minimum an evaluation of electronic teaching material must consider the following.

  • The structure and the actual content must comply with the set aims; the material should be adaptable to the learner’s individual learning style, and it should allow the learner to plan the overall learning activity independently (including the possibility of skipping certain units).
  • The material should attract and maintain the learner’s interest; it should be interactive and should exploit the presentation opportunities of the computer without shifting the emphasis from the content to the way of presentation.
  • The material should offer opportunities for practice through examples, should offer opportunity for self-assessment, should motivate by rewarding correct answers, and should analyze and evaluate the learner’s results at certain points.

Assessment criteria

  • Does the teaching material meet the set of learning objectives?
  • Does it fulfil the target group’s expectations?
  • Does it maintain the learner’s interest in the material, i.e. is the principle of maintaining attention achieved?
  • Is the principle of patient waiting/expectation achieved?
  • Is the principle of confirmation achieved?

Ergonomy

Ergonomics requires that human behaviour, abilities, limits and other human characteristics be taken into account when designing tools, machines, systems, work tasks and work environments, in order to achieve efficient operation and to provide a safe and convenient way of use. This is true of multimedia as well.

It should be simple to use, should ensure easy navigation, and the layout and the use of icons should be logical. The images and colours used should support, not hinder, the handling of the material. The proportion of pictures and text should be well balanced, and font sizes should be selected to make the text easy to read.

Assessment criteria

  • Image layout, general impression, originality of image design.
  • User friendly (adjusted to the age group) work environment.
  • The quality and systematic layout of the navigation elements.
  • Occurrence of errors.
  • The simplicity of the instructions for use (how much has to be memorized to learn it).
  • Incidents of exhaustion, tension, frustration during usage (appropriate setting of action-reaction time, waiting time).

Media elements

Since media elements affect all the previously set requirements, it is practical to highlight the criteria governing their application. A basic principle is that media elements and sound effects should be used in moderation in electronic teaching materials, and only when their use is justified.

  • Video clips should be at most 1–1.5 minutes long, and they should genuinely add information.
  • Animations should be used only in justified cases, and they should not be too fast.
  • Sound quality should be appropriate, the narrator’s voice and speed of speech should be comfortable, and any spoken text should be clear.

Assessment criteria concerning media elements

Texts

• Simplicity
• Legibility
• Clear structure
• Conciseness
• Eye-friendly image

Symbols-logos

• Simplicity, clarity
• Aesthetic appearance
• Relevance to the symbolized object, phenomenon
• How well they help to highlight the main points

Audio materials

• Coherent integration (something relevant in the right place)
• Quality, integration of narration
• Originality, appropriate application of background music

Images

• Coherent integration (something relevant in the right place)
• Colours, colour combinations
• Quality of images
• Optimization (size, quality)
• Quality of figures

Videos

• Coherent integration (something relevant in the right place)
• Quality of the video clips
• Optimization (size, quality)

Animations

• Coherent integration (something relevant in the right place)
• Dynamism (quick, well-balanced, slow)
• Promoting understanding, drawing learner’s attention
• Graphics

Summary

Creating a textbook requires considerable expertise to compile comprehensive, high-quality material. Using the right amount of high-quality illustrations, selecting the right typefaces, etc. are all important considerations in the process. Educational multimedia, however, has to harmonize many more elements. In many cases the material may look really attractive, yet the information the creator intended to convey is lost, simply because the multimedia elements are not well co-ordinated and the emphasis is shifted to unimportant information.

Let's try it! Exercises

Choose one of the multimedia elements that you have already used in the classroom, or, if you have never used any, find one on the Internet appropriate to your subject. Evaluate it according to the criteria listed above and publish the results in your learning diary.

References

Alan Kay: Revealing the Elephant: The Use and Misuse of Computers in Education, Educom review, 1996. (http://net.educause.edu/apps/er/review/reviewArticles/31422.html)

Media elements

Learning objectives

  • you will know (be confident in) the special properties of digital text, pictures, sound, video and animation.

    Reading
    This module contains many readings as well as exercises. None of them is compulsory, and we find it important to emphasise this here! Not every teacher wants to retouch a picture, reduce the noise in a sound recording, edit a video, apply effects and build a whole interactive multimedia application from these files – or has the time to do so. Some teachers are interested and do have the time. Even if they do not create every part of the electronic material themselves, they can gain an insight into multimedia and thus co-operate more effectively with the professionals. One of the main reasons is that not only pedagogical purposes count, but many other factors, too.
    Those who feel up to it should be brave and try it (we warn you in advance that it will not be easy); the others may simply read through the books or articles that interest them.

    We got to know the idea of multimedia in the first (TC01) module. Now we go on and look at the details, doing exercises not only in theory but in practice. We will learn what a media element is and what properties the various elements have. It will become clear how to integrate the various elements and even how to create each of them. One of the most important skills we will acquire is how to plan an electronic course and, later, how to create one!
    Above all, among the multimedia elements we will get to know digital text, images, motion pictures (video, animation) and sound, together with their properties.

    After reading the theoretical parts you will find practical knowledge, useful exercises and small tasks in the additional books.


    Bibliography:

    [1] Tay Vaughan: Multimedia: Making It Work, Berkeley, USA, McGraw-Hill, 1996.


    Text

    Learning objectives


    When you have completed this session, you should be able to

    • list the basic rules of digital text,
    • estimate the quantity and quality of digital text,
    • identify your role in choosing the appropriate style that aids understanding.

    The Digital Text

    Written text remains the standard medium for transmitting educational content, even in learning materials that use the most advanced visual techniques.

    We may think that text is the simplest element of a multimedia composition; however, only a few multimedia educational materials or educational contents published on the Internet combine the typographic possibilities (font, font size, colors) in a way that keeps the text comprehensible, well readable and suitable for studying.

    It must not be forgotten that following the typographic rules is just as important for electronic learning materials as it is for printed book design.

    Technology makes it very tempting to over-stylize digital text. We can experience this while editing text on a computer or preparing a presentation. In many word-processed texts it is obvious that even the basic editing and writing rules were not clear to the authors. What are these basic rules?

    You will find some rules below!

    The Basic Writing Rules of Electronic Texts Management

    Do not insert a space in front of punctuation marks (period, comma, question mark, exclamation mark); after them, a space is mandatory.

    Parentheses must fit closely around the enclosed text: there is a space before the opening parenthesis but none after it, and there is no space before the closing parenthesis but a space after it.

    Do not use a minus sign instead of a dash!

    After a colon, the text should continue with a lowercase letter.

    The text should not be "over-ornamented". A word processor offers numerous formatting possibilities; if we give in to the temptation to use them all, the result becomes tasteless, "kitsch".

    Text should not be laid out using spaces. Paragraphs or text segments aligned with spaces usually do not line up precisely under one another. The proper tools for formatting text are tabs, paragraph alignment (left, right or centered), and the rulers used for setting the indentation of the paragraph or its first line.

    Before and after a dash, special "non-breaking" spaces must be inserted from the symbol set. Otherwise an automatic line break may separate the dash from its text, spoiling both the aesthetic appearance and the comprehensibility.

    Styles should be used for formatting the text, as they make later modifications remarkably easy. Keeping to the typographic rules is especially important in a projected presentation. We have all seen presentations where "ornaments" made the projected content impossible to read. (The presenter could read it well at home on his or her own computer, but did not take into consideration that the projected material would look different.)

    Pages where the background pattern and the colour of the letters are not in harmony can often be seen on the web. The text on such pages becomes unreadable.

    Hardly readable text

    In both cases you can see a very disturbing composition: the texts are hard to read and not aesthetic. Another formatting mistake (perhaps not visible at first sight) is that there are so-called "tunnels" – rivers of white space – running through the text. Why is that? Because the text is justified but not hyphenated, so the word processor stretches the spaces between words. We often meet this problem in printed media, too.

    The minimal expectation for text appearing in any electronic educational context is that the font size should remain readable at every resolution.

    The readability of a text is determined together by the following factors:

    • font type (Times New Roman, Arial, Verdana, etc.)
    • font style (bold, italics, underlined, etc.)
    • the background of the text
    • the selected colors

    The Standard Rules of Text Styles

    • Text set in all capital letters should not be used in many places,
    • No more than two types of font should be used on one page,
    • A given page should not be overcrowded with text,
    • Readability can be improved by highlights and frames (shading).

    The Background

    The background of the screen changes the mood of the page. For a descriptive text it is practical to choose a simple background in a soft colour. If the aim is to attract attention, a brightly coloured or perhaps patterned background can be used.

    Colors

    Colors communicate atmosphere and impressions, and determine the overall effect. According to experts, saturated colors should be avoided and no more than four colors should be applied on one page. When using different colors, do not forget about colour-blind and colour-deficient users: picture elements should therefore be distinguished not only by color.

    In the above we have tried, without any intention of being exhaustive, to highlight the text-related aspects of educational materials that are important to all teachers who use computer-processed materials in class.

    However, it must be emphasized that in a large-scale development project it is worth relying on the opinions of experts (typographers, designers).

    Let's try it! Exercise

    Emphasize the keywords in a short text using a word processor, such as Word. Take care of readability and clarity, but do not neglect the aesthetic look of the text.


    What do you think?

    Which font and size do you prefer? What sort of style do you use for emphasizing a word or expression (color, underline or bold)?

    Please reflect on these in your blog!


    Reference Works:

    [1] Tay Vaughan: Multimedia: Making It Work, Berkeley, USA, McGraw-Hill, 1996.

    [2] Csánky Lajos: Multimédia PC-s környezetben. LSI Oktatóközpont, Budapest, 2000.

    Images

    Learning objectives


    When you have completed this session, you should be able to

    • define the term "image".

    Images, Illustrations in the Electronic Syllabus

    Comprehension of electronic learning material can be aided by the integration of visual elements (motion pictures, still pictures and animations). The possibilities offered by technology are nearly unlimited; proper application, however, is not a simple task. The quality of electronic learning material is fundamentally determined by the correct and moderate application of visual elements.

    This chapter is a brief overview of the theoretical basics of digital image construction. This may be important to anyone taking part in the development, even as a content editor or methodological expert rather than as an information technologist.


    Let's try it! Exercise

    Have a look at your favourite website. Then turn off the display of pictures in your web browser. Which version is better?

    Displaying Images on the Screen

    In order to decide what sort of picture or illustration should be used for a given part of a lesson, many aspects must be taken into consideration during planning. First of all: how will it look on the screen or in print? We must be aware that the picture appearing on the screen basically depends on two things:

    • the quality of the graphics card,
    • the quality of the monitor.

    Picture editing

    The graphics card tells the monitor what to display and how. The monitor draws the picture from individual pixels arranged in a grid; we can say that the screen of the monitor is a mosaic. The picture on the left shows the same image zoomed 16 times.

    Picture Resolution

    The density of the dots determines the fineness of the picture: smaller but more numerous pixels give a finer picture, while larger and fewer pixels give a rougher one. The number of rows and columns of the bitmap gives the resolution of the picture.

    The most widespread screen resolutions are the following:

    • 640x480
    • 800x600
    • 1024x768
    • 1152x864
    • 1280x1024
    • 1600x1200

    There are higher resolutions than those listed above, but they can only be displayed on very good quality monitors.

    In order to see a picture at the proper resolution on the screen, both the graphics card and the monitor must be able to produce it. There is no use in the graphics card sending a 1280x1024 picture to a monitor that cannot display it, and the same is true the other way around.

    These days almost all graphics cards are capable of a 1600x1200 resolution; it is usually the monitor that is the bottleneck.

    The higher the resolution, the smaller the pixels and the more information can be displayed on the screen, so images look smaller. Compare the same picture fragment at a 1024x768 and at a 640x480 resolution.

    Color Depth and Picture Refreshing

    The picture shown on the screen is affected by two more things:

    • the number of colors,
    • the frequency of picture refreshing.

    The number of colors tells how many different colors a pixel can take as its value. Accordingly, a picture can be:

    • 2 color (black and white, 1 bit),
    • 16 color (VGA, 4 bit),
    • 256 color (SVGA, 8 bit),
    • 65 536 color (16 bit),
    • 16 777 216 color (24 bit).

    The number of colors that can be displayed is called the color depth, and it is usually given in bits (8-bit color depth, etc.). This comes from the fact that 8 bits are needed to store 256 different values (see 1. The measurements of data storage).
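    The relation between colour depth and the number of representable colours is easy to check. A short sketch (Python is used here purely as an illustration; it is not part of the course material):

```python
# Each pixel stores `bits` bits, so 2**bits distinct colors are possible.
for bits in (1, 4, 8, 16, 24):
    print(f"{bits:2d}-bit color depth -> {2 ** bits:,} colors")
```

    Running it reproduces the list above: 1 bit gives 2 colors, 8 bits give 256, and 24 bits give 16 777 216.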

    The quality of a photographic picture is highly influenced by the number of colors applied.

    The characteristics of the different types of pictures:

    2-color picture:

    In this case the only possibility we have is to define, for each dot, which of the two colors it should be. This type of picture is simply called a bitmap picture.

    Monochrome picture:

    Typically, black and white photos are made with this method. Each dot has some shade of gray; altogether there are 256 shades from white to black.

    256 color picture:

    There are 256 colors altogether, which can be selected by the user. For each pixel it can be defined which of the 256 colors is to be used.

    True color picture:

    This is the perfect solution for displaying color pictures, because practically all shades can be displayed. Today most pictures are made this way.


    The displayed image fades after a short time and must therefore be redrawn continuously. The refresh rate tells how many times per second the monitor redraws the image. The higher this value, the more stable the image and the less it flickers.

    The number of colors that can be displayed and the refresh rate are given together with the resolution as follows:

    640x480 24 bit 100Hz,

    which means that at a 640x480 resolution the device (a monitor or a graphics card) can operate with a 24-bit color depth and a 100 Hz refresh rate.
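    From such a specification the raw video-data demand can be estimated. The sketch below (an illustration of ours, not from the text) computes the uncompressed frame size and the per-second data rate for the 640x480, 24-bit, 100 Hz example:

```python
width, height = 640, 480
bits_per_pixel = 24
refresh_hz = 100

# One uncompressed frame: 3 bytes per pixel at 24-bit color depth.
frame_bytes = width * height * bits_per_pixel // 8
print(frame_bytes)               # 921600 bytes, i.e. 900 KiB per frame
print(frame_bytes * refresh_hz)  # 92160000 bytes redrawn every second
```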

    Summary:

    Images are produced by a graphics card and displayed by a monitor. The monitor displays the picture as dots arranged in a grid; the density of the dots determines the fineness of the picture. Each pixel can have a different color. The quality of the image is also determined by the number of colors and by the refresh rate.


    Let's try it! Exercise

    1. Look up the resolution, color depth and refresh rate of your screen. You can change the values to see the differences.

    2. Search for a picture (a photo, e.g. on the Internet) and zoom in as far as you can with the help of a picture viewer application. Can you see the pixels?

    Types of images

    Learning objectives


    When you have completed this session, you should be able to

    • distinguish between the two main types of images,
    • calculate the various sizes of an image.

    Types of Images

    The computer stores images in the form of numbers that describe the image data by some method. Basically, there are two ways of doing this, so images can be classified into two big groups:

    • Vector-graphic and
    • Pixel-graphic images.

    Vector-graphic Images

    Vector-graphic images (vector images for short) contain elements that can be well described by mathematical formulas: lines, rectangles (especially squares), ellipses (especially circles), Bezier curves, texts, etc. For these images the only stored information is where the given elements are located in the picture, what color they are, what kind of contour line they have, and so on. The computer builds the image from these pieces of information. Pictures containing geometrical shapes, texts and drawings are mainly created this way; among drawings, cartoon-like figures belong here.

    Pixel-graphic Images

    Pixel-graphic images (pixel images) are made of dots located in a bitmap (a grid), just like the picture that appears on the screen. For these types of images the color of each pixel must be stored. For a 500x300 pixel image this means the colors of 150 000 pixels must be stored!
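    That pixel count, and the raw storage it implies at different color depths, is plain arithmetic (an illustrative sketch, not part of the original text):

```python
width, height = 500, 300
pixels = width * height
print(pixels)  # 150000 pixels whose color must be stored

# Uncompressed storage at a few color depths:
for bits in (1, 8, 24):
    print(f"{bits:2d} bit/pixel -> {pixels * bits // 8:,} bytes")
```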

    Pixel-graphic Image

    A pixel-graphic image is basically a huge grid. When the image is constructed, the color of each dot (pixel) in the grid is determined.

    Another name for these images is bitmap images, due to the fact that the image is similar to a huge map. (The name bitmap is precisely applicable only to two-color pictures, but today its usage has expanded.)

    As a result of the two different ways of data storage, vector-graphic images take up less space in memory (and on the hard disk); however, for pictures with shaded colors only pixel-graphic technology is suitable.

    Summary:

    There are two types of computer images:

    • Vector-graphic and
    • Pixel-graphic images.

    The data of vector-graphic images are stored in the form of mathematical formulas, while for pixel-graphic images the color of each pixel must be stored.

    Vector-graphic images need less memory space and are mainly used for creating figures, in contrast to pixel-graphic images, which are used for creating photo-quality pictures.

    Pixel Image

    The question may arise: which kind of image should be applied, vector-graphic or pixel-graphic?

    To answer the question, let us quote a sentence from the end of the chapter Creating Web-Graphics by Deke McClelland (the author of the Photoshop 5 Bible):

    "I close the introduction with a basic statement: bitmap graphics reign the Web."

    Summary:

    Images can be purchased, scanned, created with picture editor software, taken with a camera or downloaded from the Internet. In every case, copyright must be taken into consideration!

    Measurements and Sizes

    Defining the size of an image causes great confusion among those who are involved in image editing only at a hobby level. At least four different sizes of an image have to be considered:

    • the image size taken on the disk,
    • the image size taken in the memory,
    • the image size on the screen,
    • the image size when printed.

    Most images are ultimately created for printing. For this reason, image editor software by default shows the size of the printed image, measured in inches (1 inch = 2.54 cm). In most cases this has very little to do with the size observed on the screen: a full-screen image may be only the size of a stamp when printed.

    The size of an image appearing on the screen is determined by how many pixels wide and high it is. This is the size of the image measured in pixels.

    The Size of Printed Images

    The size of printed images is determined by two things:

    • the size of the image measured in pixels,
    • the typographic resolution of the image (referred to simply as resolution from now on).

    In order to understand the resolution of an image, the principles of how printers create images must be known. Printers build images from tiny dots. For a printer, the specification gives how many dots it can use to draw a one-inch horizontal and a one-inch vertical line. These two values define the resolution of the printer: the higher the resolution, the better the image that can be printed. An average (household) inkjet printer has a resolution of around 600x600 dpi, where dpi is the abbreviation for dots per inch. A typographic scanner can reach a value of 2450 dpi.

    Let us return to the image resolution.

    The image resolution defines how many of its dots are printed in one inch. From this, the size of the printed image can be calculated. Let us look at an example:

    The picture size in pixels is 800x600. The resolution is 300 dpi, both horizontally and vertically. What will be the size of the printed picture?

    Let us take the horizontal direction first. The printer must print 800 dots altogether, and 300 dots make up 1 inch. So the 800 dots will fit in a length of 2.66 inches, which is about 6.7 cm.

    In the vertical direction we get exactly 2 inches, since 600 dots fit in 2 inches; this is about 5.1 cm. So the printed image will have a size of 2.66x2.0 inches, which is about 6.7x5.1 cm.
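    The calculation above generalizes to a few lines (an illustrative sketch; the function name is ours, not from the text):

```python
INCH_CM = 2.54

def print_size_cm(width_px, height_px, dpi):
    """Printed size in centimetres for a given pixel size and print resolution."""
    return (width_px / dpi * INCH_CM, height_px / dpi * INCH_CM)

w, h = print_size_cm(800, 600, 300)
print(round(w, 1), round(h, 1))  # about 6.8 x 5.1 cm
```

    The slight difference from 6.7 cm in the worked example comes from truncating 800/300 to 2.66 inches before converting; computed exactly, the width is 6.77 cm.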

    Fortunately, when images are designed for the Web only, the resolution is not so important. Only the size measured in pixels counts, because every monitor builds the image from pixels and the print resolution plays no role in it.

    Do not forget, however, to set the measurement unit of the image editor to pixels! Otherwise it will be more difficult to find your way around the picture.


    Let's try it! Exercise
    Imagine that you need a photo for your identity card. If you have a printer and a picture of good quality, you can prepare it on your own. Now calculate the size and resolution of the picture needed for the best result.

    Creating, editing, searching

    Aims of learning Learning objectives


    When you have completed this session, you should be able to

    • collect pictures from different "places",
    • list the color depth values and file formats of an image,
    • estimate the size gained by using lossy compression.

    • Purchasing

    Several companies are engaged in the business of producing and selling images. One of the largest sources of vector-graphic images is the Corel Company’s clipart collection, which today offers more than a million images to its users.

    • Scanning

    A scanner is a device for digitizing texts and images; it is data input hardware. With this device, images and texts on paper can be read in and transformed into digital data.

    • Image Creation

    Images can be created with the help of image editor programs. One of the most popular vector-graphic editors is CorelDraw, while one of the best pixel-graphic programs is Photoshop from Adobe. Graphic designers can work miracles with these programs.

    • Downloading from the Internet

    The advantage of images downloaded from the Internet is that they seldom require additional work, though it should not be forgotten that for any further publication the permission of the owner is needed (as is also true of any other type of source downloaded from the Internet).

    • Taking Pictures

    Digital cameras have greatly eased the storage and use of still (or motion) pictures, because the created images can be copied directly to the computer via a USB connection.

    Storing Images

    Color Depth, Memory Size, Size of Image Files

    The data of images are stored in files on a disk. When the image is displayed on the screen, the image data are loaded into memory with the help of an image viewer or editor program. The size of the file on the hard disk may differ from the size of the image loaded in memory (in most cases they are different). The reason is the following:

    When the image is in memory, every pixel of the image has to be displayed, so the color of each pixel must be known: all pixel values are needed. On the other hand, when the image is stored on the hard disk, various data compression techniques can be used, since the image does not have to be shown. For instance, when there are a hundred white pixels next to each other, it is not necessary to save the precise color data of each one in the file, only the information that one hundred identical white pixels follow. This way the size can be drastically decreased. When the image is loaded from the disk into memory, the display software decompresses the data and reconstructs the real image.
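    The run of identical white pixels mentioned above is exactly what generic lossless compressors exploit. A small sketch, using Python's zlib purely as a stand-in (image file formats use their own compression schemes):

```python
import zlib

# One hundred adjacent white pixels, stored naively as 100 bytes.
row = bytes([255]) * 100
compressed = zlib.compress(row)

print(len(row), len(compressed))  # the compressed form is far smaller
```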

    How Color Depth Comes into Play

    Let us look at how a 256-color image is stored. Each pixel can have one of 256 colors, so the value of a pixel is a number between 0 and 255. This number can be represented in one byte, so we can say that one byte is needed to store a pixel.

    How is it done with true color images?

    In this case each pixel can take any of 16 777 216 colors as its value, so the value of each pixel is a number between 0 and 16 777 215. One byte is not enough to represent such a large number; three bytes are needed. Briefly, three bytes are required to store one pixel.

    From the above, the following conclusion can be drawn: if we have two images with the same number of pixels, one a 256-color image and the other a true-color image, the true-color image takes three times as much space in memory (though not necessarily on disk).

    What about images with only two colors? One bit is enough to store a pixel, since a pixel can take only two values, just like a bit. This way one byte is enough to store 8 pixels.
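    The in-memory sizes discussed above follow directly from width x height x bits per pixel; a small sketch (the function name is ours):

```python
def image_memory_bytes(width, height, bits_per_pixel):
    """Uncompressed size of an image in memory, in bytes."""
    return width * height * bits_per_pixel // 8

# The 945x716 example image at different color depths:
print(image_memory_bytes(945, 716, 24))  # true color: 2029860 bytes
print(image_memory_bytes(945, 716, 8))   # 256 colors:  676620 bytes
print(image_memory_bytes(945, 716, 1))   # two colors:   84577 bytes (rounded down)
```

    Note that only the memory size is this predictable; the size on disk depends on the compression scheme.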

    Compare the sizes of the same image stored with different color depths and compression schemes:

                               24-bit color depth,        8-bit color depth,
                               average JPEG compression   GIF compression
    Picture size on the screen 945x716 pixels             945x716 pixels
    Picture size in memory     2 029 860 bytes            676 620 bytes
    Picture size on the disk   115 110 bytes              374 337 bytes


    File Formats, Compression Schemes and the Results

    Images on the Internet are saved in compressed files so that they take up as little space as possible. There are basically two compression schemes:

    • the lossy and
    • the lossless.

    With lossy compression the quality of the image deteriorates, but the compression ratio is higher: the size of the image data on disk can be one tenth of the size it occupies in memory.

    With lossless compression the quality of the image is preserved (there is no loss of quality), but the compression is less effective.

    There is, however, a limitation: the traditional lossless web format can only handle 256-color images. So if we compress a true-color image with it, we do in fact lose quality, unless the image contains 256 colors or fewer.

    More recently a new scheme has been developed for lossless compression of true-color images as well. Older browsers did not support it, however, so for a long time it was not widely used (though many predicted a prosperous future for it).

    Images made with lossy compression have the JPEG format. 256-color images made with lossless compression have the GIF format. True-color images made with lossless compression have the PNG file format.

    Each compression scheme has several adjustable settings that influence the size and the quality of the compressed image.

    Which Compression Scheme to Use?

    Three image formats are widespread on the Internet: JPEG, GIF and PNG. Which one should be used in a given situation?

    A few guidelines:

    The JPEG format gives the best compression ratio and is perfectly suited to true-color images, especially ones containing many shades of color; it was developed specifically for compressing photographs. Apart from the good compression ratio it offers no extras (but that is advantage enough).

    GIF is the right choice when the picture contains large areas of uniform color; it is typically useful for compressing figures (e.g. diagrams). The GIF format also allows some parts of the image to be transparent. When using this feature, do not forget to convert the image to a 256-color image first.

    The advantage of PNG is lossless compression even for true-color images. Furthermore, the transparency of the image can be controlled freely: images can be created that shade continuously from fully transparent to fully opaque. (In my experience, browsers could not really exploit this feature.) On photographs, however, PNG does not come close to the compression ratio of JPEG.

    Lossless Compression Schemes

    Lossless compression schemes work by substituting shorter symbols for recurring parts of the data stream. The substitutions are listed in a table at the beginning or end of the file. For example, if the numerical sequence 0, 1, 2, 3, 4 occurs at several places in the file, it can be replaced everywhere by the single number 1, and the table records that 1 stands for the sequence 0, 1, 2, 3, 4. Huffman coding is based on a related idea: the most frequent values get the shortest codes.
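    The DEFLATE algorithm in Python's standard zlib module combines exactly these two ideas (substitution of repeated parts plus shorter codes for frequent values), so it is a quick way to see lossless compression in action:

```python
import zlib

# Highly repetitive data, like a row of identical pixels, compresses very well.
data = bytes([255] * 10_000 + [0] * 100)
packed = zlib.compress(data)

print(len(data), "->", len(packed))      # 10100 bytes shrink to a few dozen
print(zlib.decompress(packed) == data)   # True: lossless, the original is restored exactly
```

    On data with little repetition (e.g. an already-compressed photograph) the same call would achieve almost no reduction.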

    Lossy Compression Schemes

    JPEG is the most widespread lossy compression scheme for pictures. The method divides the picture into 8x8-pixel blocks and calculates the average color of the pixels in each block. Then, for every pixel, only its difference from the average is stored; if the difference is small, it is set to zero. The goal is to zero out as many values as possible.

    This is where the loss comes from: what is stored is obviously not the original color but one close to it. The larger the difference allowed to be zeroed, the greater the loss, but also the smaller the resulting file.

    After most values have been zeroed, a lossless compression scheme is applied to the file.
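    A toy sketch of this average-and-zero idea (a deliberate simplification: real JPEG works on frequency coefficients from a cosine transform, not raw differences; the names and threshold here are ours):

```python
def compress_block(block, threshold):
    """Store a block as (average, diffs), zeroing differences below the threshold."""
    avg = sum(block) // len(block)
    diffs = [(v - avg) if abs(v - avg) > threshold else 0 for v in block]
    return avg, diffs

def decompress_block(avg, diffs):
    """Rebuild the block; zeroed pixels come back as the block average."""
    return [avg + d for d in diffs]

block = [100, 102, 99, 101, 100, 103, 98, 100]   # one 8-pixel run of nearly equal values
avg, diffs = compress_block(block, threshold=3)
print(diffs)                         # [0, 0, 0, 0, 0, 0, 0, 0] -- compresses extremely well
print(decompress_block(avg, diffs))  # all 100s: close to, but not exactly, the original
```

    The run of zeros is then what the subsequent lossless pass shrinks so effectively; the price is that the restored block only approximates the original.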

    Considerations in Image Editing

    Images are typically files that take up a great deal of space, both in memory and on disk. Data flow on the Internet, however, is not exactly fast (especially over a modem), so a very important criterion when creating a web image is its size in bytes. Of course, keep the other important consideration in mind as well: the aesthetics of the image. In short, a web image should be small and attractive.

    The question arises: which size of the picture should be small, the size in memory or the size on disk? Since it is the file that must travel across the net, the file should be made as small as possible. When the file arrives at the computer (is downloaded), it is the browser's task to unpack and display it, and this happens very fast compared with the time of downloading.

    Whether an image is aesthetic is of course a subjective matter, but a few basic principles should be kept in mind:

    • not everyone views our image on the same monitor on which we created it, and this affects the colors and the brightness,
    • at a different resolution the image may appear smaller or larger than designed, which can be disturbing, especially with text,
    • if the image is scanned, disturbing interference patterns may arise; these can be eliminated by scanning at the highest resolution,
    • if an image downloaded from the Internet is edited further, its quality may deteriorate greatly,
    • if JPEG compression is used, do not re-save the image so many times that its quality becomes inadequate.

    What to Use Images for?

    Since downloading images takes a long time, consider carefully how many images you place on your site. Larger pieces of display text are often inserted as images too, because this gives a more polished result. These days almost every web page has some sort of menu with buttons made of images. Moreover, such buttons have several states (they look different when the mouse is over them and when they are activated), each of which must be designed separately.

    The backgrounds of web pages are also a kind of image. Usually they are small pictures that the browser tiles like a mosaic to fill the screen; this is why the backgrounds of many web pages look like wallpaper.

    Search the Internet for websites where you can download good-quality images and photos, preferably for free. Share the links you have found useful!


    You can find a detailed description of picture editing in the book TC03 - Picture editing!


    Reference Works:

    [1] Tay Vaughan: Multimedia, Published by Osborne, Berkeley, California, USA, 1996.

    [2] Csánky Lajos: Multimédia PC-s környezetben, LSI Oktatóközpont, Budapest, 2000.

    Color theory, color models

    Learning objectives


    When you have completed this session, you should be able to

    • define the term "color",
    • judge which colors to choose.

    Color Theory, Color Models

    The Visible Light

    Visible light is electromagnetic energy with wavelengths between 400 and 700 nanometers, comprising the spectrum perceivable by the human eye.

    The spectrum of visible light

    Light outside this range (ultraviolet and infrared) is invisible to the human eye.

    Three things are needed to generate the sensation of color: a light source, an object to be seen, and a sensor. The perceived color of an object depends on the color of the light source: the same object looks different in sunshine and in neon light. Light sources are classified according to their spectral emission. Sunlight has an approximately even distribution across every wavelength of the visible spectrum. Artificial light sources cannot provide this, so their light is not neutral. The color shift, or purity, of white light is measured by color temperature. (On better-quality monitors the color temperature can be adjusted.)

    The Color of Objects

    Objects can be characterized by their spectral reflection or transmission, depending on whether reflected or transmitted (e.g. slides) light reaches the detector.

    Neutral-colored (grey) objects reflect (or transmit) an equal amount of radiant energy at every wavelength of the visible spectrum (in every color).

    Colored objects absorb light of certain wavelengths, so the reflected light becomes colored.

    Color Mixing

    In practice, different colors are created by color mixing: we have a few basic colors and all other colors are mixed from them.

    There are two types of color mixing procedures:

    • the additive and
    • the subtractive.

    Additive Color Mixing

    Additive color mixing has three basic colors: red, green and blue. Every color can be produced by adding them together in different ratios. (This was observed by Newton in 1730.) After the English names of the colors this method is called RGB color mixing.

    Imagine additive color mixing the following way:

    RGB Color Mixing

    We are sitting in a completely dark room with white walls. If there is no light source at all we cannot see anything, so the walls are black (this is why, in physical terms, black is not a color but the absence of light). Let us project red, green and blue light onto the walls. Where all three colors mix we see white. Mixing blue and red gives magenta, blue and green gives cyan, and red and green gives yellow.
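    The mixtures described above can be checked with a few lines of code (a sketch; the function name is ours, assuming 8-bit channels):

```python
def mix_additive(*lights):
    """Add RGB light sources channel by channel, clipping at the 8-bit maximum."""
    return tuple(min(255, sum(channel)) for channel in zip(*lights))

RED, GREEN, BLUE = (255, 0, 0), (0, 255, 0), (0, 0, 255)

print(mix_additive(RED, GREEN))        # (255, 255, 0)   -> yellow
print(mix_additive(RED, BLUE))         # (255, 0, 255)   -> magenta
print(mix_additive(GREEN, BLUE))       # (0, 255, 255)   -> cyan
print(mix_additive(RED, GREEN, BLUE))  # (255, 255, 255) -> white
```

    With no lights at all the walls stay at (0, 0, 0): black as the absence of light.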

    The cathode-ray tubes of televisions and monitors also use additive color mixing. Looking at the screen through a magnifier, we can see that the picture is built from tiny red, green and blue dots. In our eyes these colors blend, creating the proper sensation of color.

    Subtractive (CMY) Color Mixing

    The three basic colors of subtractive color mixing are cyan, magenta and yellow. By subtracting these colors from white (which contains every color), all other colors can be produced. The method is called CMY, after the first letters of the English names of the colors.

    Demonstrating subtractive color mixing

    In this case the light source must emit white light, with equally intense components across the whole spectrum. Objects illuminated by this light absorb certain wavelengths, thereby subtracting them from the white light; what remains reaches the eye of the observer and creates the corresponding color sensation. A cyan object absorbs red, a magenta one absorbs green, and a yellow one absorbs blue.

    Printers, for instance, work according to the subtractive model.

    In a sense, RGB and CMY color mixing are each other's opposites: mixing two base colors of one method yields a base color of the other. Mixing red, green and blue light gives white, while mixing cyan, magenta and yellow pigments gives black.
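    In code, the opposition of the two models is simply complementing each channel (a sketch, assuming 8-bit channels; the function name is ours):

```python
def rgb_to_cmy(r, g, b):
    """Each CMY component is the complement of the corresponding RGB component."""
    return 255 - r, 255 - g, 255 - b

print(rgb_to_cmy(255, 255, 255))  # white -> (0, 0, 0): no ink at all
print(rgb_to_cmy(0, 0, 0))        # black -> (255, 255, 255): full C, M and Y
print(rgb_to_cmy(255, 0, 0))      # red   -> (0, 255, 255): magenta + yellow ink
```

    Applying the function twice returns the original RGB triple, which is the "opposites" relationship in miniature.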

    The Characteristics of Colors

    Colors have three basic characteristics: hue, saturation and brightness.

    Hue defines the actual color. Saturation expresses the strength of the color as its distance from grey: the smaller this distance, the duller the color; the greater the distance, the more saturated it is. Brightness determines the lightness of the color, from black to white.
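    Python's standard colorsys module converts between RGB and this hue/lightness/saturation description; a small illustration (the describe helper is ours):

```python
import colorsys

def describe(r, g, b):
    """Return (hue in degrees, lightness %, saturation %) for an 8-bit RGB color."""
    h, l, s = colorsys.rgb_to_hls(r / 255, g / 255, b / 255)  # colorsys works on 0..1
    return round(h * 360), round(l * 100), round(s * 100)

print(describe(255, 0, 0))      # (0, 50, 100): pure red, fully saturated
print(describe(128, 128, 128))  # (0, 50, 0): grey has zero saturation, hue is meaningless
```

    Note that grey comes out with saturation 0, matching the "distance from grey" description above.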

    HLS Color Mixing

    Colors on the Computer

    The computer monitor (and display equipment in general) uses the RGB color-mixing process to display colors.

    The cathode in the monitor's tube emits electron beams. These beams hit the phosphor layer on the inside of the tube, making it emit red, green and blue light; the strength of the emitted light depends on the intensity of the electron beams. Observing a monitor from very close up, we can see that the image is actually made of red, green and blue dots of different intensity.

    Why are monitors unable to create every color?

    In theory every color component (red, green and blue) can have an infinite number of intensities, so an infinite number of colors could be created. In information technology, however, the intensity of each component is represented digitally, so the number is finite. Since each component is stored in one byte, each can take 256 intensity levels, and their combinations give the 16 777 216 colors the computer is able to produce. This is called the RGB color palette. It is a very large number of colors, more than the human eye can differentiate. Still, it is not fully satisfactory, because many colors are missing from this palette: some colors have numerous shades in the RGB palette while others are represented by only a few. Typically there are many shades of blue and green but only a few of brown and yellow.

    Even more colors are missing from the CMY color palette, so fewer colors can be used there. This is why images are usually edited in RGB and only converted to the CMY palette at the end of the work.

    Other color palettes have been developed as well. The LAB color palette is worth mentioning because it contains the most colors; it is the most complete one. Using it, however, requires considerable experience, so it is generally used only by professionals.

    The CMYK Color Palette

    With color mixing, every color can be produced only in theory. In practice it is useless to try to mix black from cyan, magenta and yellow: it never quite works out, because the basic colors and the tools used are not perfect. This is why, in printing, black is added to the three basic colors of the CMY model (the K comes from the last letter of the word black).

    Nor can monitors create every color from red, green and blue.

    Sometimes we have to switch from one color model to the other, especially when an image edited on the computer in RGB is later to be printed. Since printers work in the CMYK color system, the colors must somehow be mapped onto one another; most image-editing software helps with this. Images created for the web should in practice be composed in the RGB model, since they are intended for monitors, which operate in that system.


    What do you think?

    What is the importance of colors in teaching? What color do you use for correcting, and why? What colors are present in the classroom? Which one dominates? What color were boards long ago? Reflect on these questions in the forums and in your blog!


    Reference Works:

    [1] Tay Vaughan: Multimedia, Published by Osborne, Berkeley, California, USA, 1996.

    [2] Csánky Lajos: Multimédia PC-s környezetben, LSI Oktatóközpont, Budapest, 2000.

    Computer animation

    Learning objectives


    When you have completed this session, you should be able to

    • define the term "animation",
    • find animations on the Internet.

    Computer Animation

    Computer animation has had a great career in the last decade. A technology that interested only computer programmers a few decades ago has gradually penetrated the world of art. In the movie industry, the traditional hand-drawn method of making cartoons has been succeeded by computer animation, and computer-made three-dimensional scenes filled with artificial characters have become omnipresent in modern films. The current speed and storage capacity of processors have created a world of unbelievably rich images, and sometimes we are unable to tell computer-generated scenes from real ones. What do the two have in common, and how do they differ?

    The starting point and the procedure are the same: when still pictures are displayed one after another at sufficient speed, our eyes can no longer separate them and we perceive a moving picture. In the cinema, 24 frames are projected every second. Computer animation can be a little more economical: playing 15 images per second is already enough to create the illusion of motion.

    Creating moving pictures

    One of the basic ideas of computer animation has been known for a long time. We have all played with little figures drawn in the corner of a notebook: when we flipped the pages, they became "animated".

    Picture animated

    There are unlimited possibilities for creating the illusion; the only question is which method to choose. The background or the foreground of the pictures can be changed; the position or the shape (or both) of an object can be changed; in successive phases we can move along a chosen track. The variety is wide, and imagination is the only limit.

    Graphics programs significantly simplify the production of the images. A permanent background, for instance, has to be drawn only once and can then be copied from one picture to the next. For some types of animation there is hardly any need for drawing at all, because the software contains algorithms that transform the objects automatically. For example, if we would like a butterfly to fly across the screen, we only have to input the trajectory and draw a few of the butterfly's postures at key positions along it; the phases in between are generated by the software automatically.
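    The automatic generation of in-between phases ("tweening") can be sketched as linear interpolation between two keyframe positions (a minimal sketch; the names are ours):

```python
def tween(start, end, steps):
    """Generate evenly spaced in-between positions from one keyframe to the next."""
    (x0, y0), (x1, y1) = start, end
    return [(x0 + (x1 - x0) * t / (steps - 1),
             y0 + (y1 - y0) * t / (steps - 1)) for t in range(steps)]

# Butterfly keyframes at the two ends of the trajectory; the software fills in the rest.
frames = tween((0, 0), (100, 50), steps=5)
print(frames)  # [(0.0, 0.0), (25.0, 12.5), (50.0, 25.0), (75.0, 37.5), (100.0, 50.0)]
```

    Real animation software interpolates shapes and other properties the same way, often along curves rather than straight lines.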

    The effects can be combined: an object can be influenced by other effects during its motion. It can grow smaller or larger, change its shape, and, for instance, transform itself from a butterfly into a dragon.

    Animations with size variant

    The finished product can be saved as a digital file in a standard format and later inserted into any educational multimedia material or uploaded to a web page. An important point about the final result is that in most cases the graphics programs do not store the individual pictures; instead they create a small program containing the mathematical algorithm that generates them.

    Animation plays a key role in electronic learning material. Animations are tools for attracting attention, illustrating, and creating interactivity, and they are also the proper tools for demonstrating processes, simulating the operation of machines, and creating lifelike 3D presentations.

    Animations are the most critical part of electronic learning material with respect to planning and production. The planning requires methodological, ergonomic and psychological considerations first of all, followed by graphic design of artistic quality and a proper technical implementation. Animations must always be budgeted as the largest item among the production expenses.

    Editing Animation

    The best-known and most popular animation editor is Macromedia Flash, which is mainly used for creating web animations and web pages. Flash is basically a vector-graphics program, so its animations need less space and take less time to download. It uses the same elements as other vector-graphics programs (e.g. CorelDRAW), but it is supplemented with a high-level scripting language for programming interactive components (games, tests, etc.). Besides software specialized in animation, almost every graphics program (e.g. Photoshop, CorelDRAW) includes a tool kit for making simple animations.

    Software used for creating stereoscopic (3D) animations deserves a separate mention. Among these the most widely used is 3D Studio MAX, with which 3D animations can be created that give the impression of vivid virtual reality. The VRML language (Virtual Reality Modeling Language) was created in 1995, on the initiative of Tim Berners-Lee and Dave Raggett, for delivering stereoscopic animation on the Internet. According to the aims of its designers, VRML is platform-independent, open, and needs relatively little bandwidth. It is constantly being improved; the newest versions support the programming of interactivity.

    Finally, it should be mentioned that mathematics teachers can choose among software packages that demonstrate sophisticated functions with 2D or 3D animations, such as Maple and Mathematica. (The figure shows a 3D torus generated by Apple's Grapher.)

    Animations have different file formats depending on the program they were made with. The most widely used animation programs are Macromedia Director (.dir, .dcr), Flash (.swf), 3D Studio MAX (.max), etc. In most cases the completed animation can be converted into a standard format (AVI, MOV, MPEG, GIF), which makes publishing on the Internet and integration into multimedia presentations easier.


    Exercise

    Search the Internet for animations. Can you find animations that were made not as advertisements but for demonstration or illustration purposes? Think about what kind of animation you could use in your own subject!


    Reference Works:

    [1] Tay Vaughan: Multimedia, Published by Osborne, Berkeley, California, USA, 1996.

    [2] Csendes Béla: Ablak az aktuális világra, Budapest, 1998.

    [3] Csánky Lajos: Multimédia PC-s környezetben, LSI Oktatóközpont, Budapest, 2000.

    Digital audio technology

    Learning objectives


    When you have completed this session, you should be able to

    • define the basic terms of sound.

    Digital Audio Technology

    Sound (an audio wave) is an analog signal. With a microphone, sound can be converted into an electric signal, and in this form it can be processed, transmitted and stored electronically. To represent sound on the computer, a binary code must be assigned to the sound waves (they must be digitized); this is done by the sound card. An audio wave is a sum of sine waves, and the change of its amplitude in time gives the curve that has to be coded.

    The Characteristics of Sound:

    - speed of sound: 330-340 m/s in air

    - frequency: the number of oscillations per unit of time, measured in oscillations per second (Hz). This determines the pitch.

    - period: the duration of one oscillation (one period of a sine wave)

    - intensity: the acoustic power, or sound pressure, per unit of surface. It is directly proportional to the amplitude of the audio wave.

    - timbre (tone): the character of the sound signal in the frequency domain (the amplitude and phase relations of its different frequency components)

    - volume: the amplitude of the sound oscillation, measured in acoustic decibels. The threshold of hearing is 0 dB; the threshold of pain is 120 dB.

    Digitizing is done by sampling: at regular intervals the amplitude of the curve is measured, and the measured amplitudes are rounded to whole numbers. The resulting sequence of numbers represents the sound. This transformation (and its reverse) is performed by the sound card.

    An analog audio signal can take any amplitude value, and the value can vary continuously in time. To store sound digitally, samples must be taken from the signal at fixed time intervals (sampling in time). When the sampled amplitudes are then divided into a fixed number of steps, each sample can be represented by a number (amplitude quantization). The numbers obtained can be stored in digital form.
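    Sampling and quantization can be sketched in a few lines (a toy illustration, not a real sound-card driver; the names and parameters are ours):

```python
import math

def sample_and_quantize(freq_hz, sample_rate, bits, n_samples):
    """Sample a sine wave at fixed time steps and quantize each amplitude to an integer code."""
    levels = 2 ** bits
    codes = []
    for n in range(n_samples):
        # sampling in time: evaluate the wave at moments n / sample_rate
        amplitude = math.sin(2 * math.pi * freq_hz * n / sample_rate)   # -1 .. 1
        # amplitude quantization: map -1..1 onto the integer codes 0 .. levels-1
        codes.append(round((amplitude + 1) / 2 * (levels - 1)))
    return codes

# One cycle of a 1 kHz tone sampled at 8 kHz with 8-bit quantization
print(sample_and_quantize(1000, 8000, 8, 8))
```

    The rounding step is exactly where quantization discards information: the reconstructed amplitude can differ from the original by up to half a step.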

    Analog Recording and Playing of Sound

    Recording

    1. conversion of the sound signal into an electric signal whose frequency and amplitude correspond to the original sound (the sound is picked up by a microphone and converted into an electric signal),
    2. analog recording of the electric signal (the amplified signal is transmitted to the cutting head, which creates the tracks on the disc).

    Playing

    1. detection of the recorded signal and its conversion into an electric signal,
    2. amplification of the electric signal and its conversion back into sound.

    Digital Recording of Sound

    Compared with analog recording, only the recording process differs.

    1. conversion of the sound signal into an electric signal whose frequency and amplitude correspond to the original sound,
    2. conversion of the analog electric signal into a digital sequence by sampling and quantization,
    3. recording of the digital sequence.

    Sampling

    Sampling is the conversion of the analog electric signal, continuous in time and value, into a sequence of discrete impulses. The original analog signal can be restored from the impulse sequence without distortion only if the sampling frequency is at least double the highest frequency occurring in the original signal (the Nyquist-Shannon sampling theorem).

    Quantization

    The amplitudes of the sampled impulses are converted into binary numbers. A fixed number of bits is available to represent the amplitude values; this is the quantization length. It can be 8, 16 or 24 bits, which allows 256, 65 536 or 16 777 216 different amplitude values.
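    The quantization lengths above translate into value counts as powers of two (the helper name is ours):

```python
def quantization_levels(bits):
    """Number of distinct amplitude values for a given quantization length."""
    return 2 ** bits

for bits in (8, 16, 24):
    print(bits, "bits ->", quantization_levels(bits), "levels")
# 8 bits -> 256, 16 bits -> 65536, 24 bits -> 16777216
```

    Each extra bit doubles the number of levels, halving the maximum quantization error.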

    Sound Cards:

    SoundBlaster

    SoundBlaster: introduced sampling, quantization and the wave table.

    Roland MT-32: integrated a ROM on the sound card containing the wave table.


    You can find a detailed description of sound editing in the book TC03 - Sound editing!


    Digital video technology

    Learning objectives


    When you have completed this session, you should be able to

    • define the basic terms of video.

    Digital Video Technology

    Digital video and audio technology helps us archive our recordings on CDs or DVDs. We can publish them on the Internet, and we can easily create special effects that would be very difficult and expensive to achieve with analog equipment.

    Since digitally stored video and audio data are transmitted over the Internet, it is important to compress them as much as possible. Various algorithms have been developed for this purpose; they are discussed in the next chapters.

    Audio and video files are edited with suitable software. The best-known video editor is Premiere from Adobe. Well-known audio editors include Sound Forge (from Sonic Foundry) and Adobe Audition (earlier called Cool Edit).

    Analog and digital video technology is based on television broadcasting, so it is worth knowing some of the basic principles of television.

    Television broadcasting

    A TV picture is made of a series of still pictures. The pictures are transmitted to the receivers divided into lines, which are drawn by electron beams onto the phosphor coating of the tube's surface. To get a flicker-free picture, and to increase the refresh frequency of the screen, interlaced scanning is used: the electron beam first draws the odd lines, then returns to the top of the screen and draws the even lines between the odd ones. Half a TV picture is called a field, while a full TV picture is called a frame.

    To transmit a color image, the three basic colors (RGB: red, green, blue) must be conveyed along with the brightness value (Y). Knowing the brightness value and two of the basic colors, the third can be calculated; therefore in practice the Y value and the R-Y and B-Y differences are transmitted. This method is called YUV coding. Three color TV standards are used in the world; their most important specifications are the following:

    NTSC (National Television System Committee)

    The oldest norm, used in the United States and Japan.

    Number of lines forming the picture: 525
    Field frequency: 59.94 Hz
    Frame frequency: 29.97 Hz
    Line frequency: 15 734.27 Hz
    Number of visible lines: 475
    (Only 475 lines are visible because during the vertical retrace (25 line times per field) no video signal can be displayed.)

    PAL (Phase Alternating Line)

    It was created by modifying the NTSC norm in order to eliminate its sensitivity to phase distortion during transmission.

    Number of lines forming the picture: 625
    Field frequency: 50 Hz
    Frame frequency: 25 Hz
    Line frequency: 15 625 Hz
    Number of visible lines: 575

    SECAM (Séquentiel Couleur à Mémoire)

    A norm introduced in France. The effect of phase distortion is reduced by transmitting only one color-difference signal per line, alternating from line to line. Its other parameters are the same as those of the PAL norm.
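    The YUV coding shared by all three standards (transmit Y plus the R-Y and B-Y differences, recover the third color at the receiver) can be sketched as follows, using the standard luminance weights (a simplified illustration; the function names are ours):

```python
def rgb_to_yuv_differences(r, g, b):
    """Return brightness Y plus the two transmitted color differences (R-Y, B-Y)."""
    y = 0.299 * r + 0.587 * g + 0.114 * b   # standard luminance weights
    return y, r - y, b - y

def reconstruct_green(y, r_minus_y, b_minus_y):
    """The receiver recovers G from Y and the two transmitted differences."""
    r = r_minus_y + y
    b = b_minus_y + y
    return (y - 0.299 * r - 0.114 * b) / 0.587

y, dr, db = rgb_to_yuv_differences(200, 120, 40)
print(round(reconstruct_green(y, dr, db)))   # 120: green was never transmitted, yet it is recovered
```

    This is why only three signals (Y, R-Y, B-Y) need to be broadcast instead of four (Y, R, G, B).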

    Interlaced Scanning

    The full TV image is made of 625 lines (according to the CCIR and OIRT standards). After the 625th line, the electron beam returns from the bottom of the screen and the process starts again. Drawing the 625 lines takes 40 ms, which is why the refresh frequency is 25 Hz.

    Let us stop here for a second. The perception speed of the human eye is limited: it has significant inertia. This inertia made it possible to create TV norms with relatively slow scanning and therefore narrow bandwidth: if the pixels following one another in time, and the whole picture, are flashed at sufficient speed, our eyes perceive a steady image.

    Interlaced Scan
    Source : http://www.anchorbaytech.com

    There is no problem with the individual lines, refreshed every 64 µs. Our eyes are not so imperfect, however, that they cannot detect the 25 Hz flicker of the repeatedly refreshed screen: the threshold of disturbing flicker is around 40 Hz. Accordingly, if the display time were reduced to 20 ms (50 Hz), the 625-line interval would have to be halved, which would mean doubling the scanning speed and widening the bandwidth. This requirement would have caused serious difficulties when the TV norms were established.

    This problem was solved ingeniously by interlaced scanning. The essence of the method is the following (see the figure): in 20 ms only half of the 625 lines are drawn, but the electron beam scans the entire screen from top to bottom during this time; it then returns and fills in the empty lines between the ones already drawn, completing the second half of the picture.

    Digitizing Analogue Video Signal

    Digitizing an analog video signal, like digitizing audio, is done by sampling: the picture is decomposed into basic units (pixels), and the brightness and color-difference values are quantized. Audio is usually digitized simultaneously with the video. Images and sounds are mostly stored in the AVI (Audio Video Interleave) format.

    During video digitizing, a series of 16-bit color depth, YUV-coded bitmaps is recorded, which means a huge amount of data: a 25 frame/s recording at 768 x 576 pixel resolution produces 22 MBytes of data every second.
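    The 22 MByte/s figure can be checked directly (the function name is ours):

```python
def uncompressed_rate(width, height, bytes_per_pixel, fps):
    """Raw video data rate in bytes per second."""
    return width * height * bytes_per_pixel * fps

rate = uncompressed_rate(768, 576, 2, 25)   # 16-bit (2-byte) YUV samples, 25 frames/s
print(rate)                                  # 22118400 bytes/s, roughly 22 MBytes every second
```

    Numbers like this are exactly why the compression algorithms mentioned above are indispensable for digital video.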

    Video cameras

    The development of video systems began with television broadcasting, and broadcast TV programs are still the most popular medium worldwide. However, video systems are more and more frequently used in industry, commerce, education, and culture as well.

    The units of the simplest video system are the camera, the video recorder and the monitor; more sophisticated systems use several cameras, video recorders and monitors. Video cameras convert the image they detect into electric signals that can be recorded and/or displayed afterwards.

    The Theoretical Sketch of How Video Cameras Work


    The object to be recorded is projected by the objective (1) onto the light-sensitive layer (2). The distribution of electric charge on this layer, determined by the image (the charge image), is scanned by an electron beam (3) line by line. The signal created during this scanning produces a voltage that varies according to the light detected; if the sensitive layer receives no light, the output signal (4) does not change.

    In practice this simple principle can only be realized with sophisticated electronic circuits. The black-and-white cameras of the early days of TV broadcasting weighed several hundred kilograms; today there are portable colour cameras weighing only a few kilos.

    When recording black-and-white images a single pickup tube is enough. For colour television, besides the brightness signal, the signals of the three basic colours must be produced too.

    At the end of 1953 a three-tube camera compatible with black-and-white TV was successfully developed. The three basic colours R, G, B are separated with an optical colour decomposer made of prisms and mirrors with filters.

    The Y brightness signal is derived from the colour signals. There are also four-tube video cameras, in which the brightness signal is produced by a separate tube; these, however, are mainly used in studios.


    You can find a detailed description of video editing in the book TC03 - Video editing!


    Integrate

    Learning objectives


    When you have completed this session, you should be able to

    • define the term "authoring system",
    • list the most widespread systems,
    • differentiate between the software packages.

    Reading
    In this chapter you will see how to integrate the prepared or already made media elements into an e-learning system.

    Authoring Systems

    Authoring systems are software tools for designing, creating, and testing multimedia materials. The media elements created according to the synopsis and script have to be integrated in the authoring system to become an interactive electronic syllabus. The basic services of an authoring system are:

    • it provides a proper graphic editing surface for placing multimedia elements (text, graphic, audio or video windows, animation sequences and interactive tools – buttons),

    • it provides modules suitable for displaying and playing media elements (e.g. video and audio material),

    • it contains synchronizing tools for combining and simultaneously playing different types of data (text, picture, audio, video, etc.),

    • it contains tools for user interaction, from a simple start/stop/pause control to a touch-screen interface that triggers loops or conditional procedures.
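    The synchronizing service in the list above can be pictured as a timeline on which every media element has a start time and a duration, and the player asks which elements should be active at a given moment. A minimal Python sketch of that idea (all names here are invented for illustration):

    ```python
    # Toy model of an authoring system's timeline: elements placed on a
    # common time axis, queried for what should be playing at time t.
    from dataclasses import dataclass

    @dataclass
    class MediaElement:
        name: str
        kind: str        # "text", "image", "audio", "video", ...
        start: float     # seconds from the beginning of the timeline
        duration: float

    def active_at(timeline, t):
        """Return the names of elements that should be playing at time t."""
        return [e.name for e in timeline if e.start <= t < e.start + e.duration]

    timeline = [
        MediaElement("title", "text", 0, 5),
        MediaElement("narration", "audio", 2, 10),
        MediaElement("clip", "video", 4, 6),
    ]
    print(active_at(timeline, 4.5))   # ['title', 'narration', 'clip']
    ```

    Real authoring systems add frame-accurate clocks and drivers for each medium, but the scheduling idea is the same.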

    There are about 150 software products whose manufacturers call them "authoring tools". All of them fulfil some or all of the criteria described above; there is, however, a great difference in the user friendliness and the environment they provide. Some are suitable only for creating business presentations, while others are good for developing booth-type applications as well ("booth type" refers here to an information booth: dedicated hardware and software communicating with the public). Although no comprehensive classification exists, a few criteria which characterize the software and help in making a decision are listed here.

    • User interaction: its presence says something about the quality of the software; it is a basic part of all CBT/CAI applications.

    • Full video: the software contains the entire control interface and playback environment. Such software is very close to the most advanced level.

    • Animation: not all authoring software can handle animation, usually because the proper drivers are missing or the data format is unknown. In several cases separate, special software is needed for animation.

    Authoring System Types

    Authoring systems belong among the most advanced application development tools. The following classification is generally accepted:

    Integrated Systems. Integrated authoring systems have all the tools and services needed for developing multimedia applications. Some parts of these systems are programming languages, but most of them use the tools of visual programming (diagrams, icons). Integrated systems contain a wide variety of data processing tools, support many multimedia formats and can be used for a large range of applications.

    Professional Graphic Systems. Professional graphic systems are created for multimedia developers who are professionals in their own fields and know the innovative techniques, but are not professional programmers. Most of these systems provide the following:

    + a standard graphical user interface (GUI) for Apple System 7, Windows and Motif systems

    + user-defined graphic playback with dynamic modification options

    + GUI-based editing with icon display of events, "drag-and-drop" editing technique, graphic display of control, etc.

    + insertion of GUI objects for user interaction (push buttons, tools, etc.) which can be activated with a mouse or touch screen

    + simple animation on the screen, graphic and text display and management

    + an object-oriented structure which describes the profile of objects (time, position, etc.) in a data structure; many systems are capable of transferring profiles too

    + playback options for parts or the whole of the graphics, video and audio.

    Though applying graphic tools is much simpler than using authoring or programming languages, the condition of effective use is that the developers clearly understand what they are doing.

    An important characteristic of a system is what the created program is capable of on the target platform. Some solutions can integrate still pictures or digital sound into an executable program file, but this is not possible with, for instance, NTSC video or CD audio. Another question is whether the authoring system is able to produce run-time modules for controlling them.

    Another important consideration is which data formats the system supports.

    Simple Graphic Systems

    Some software producers sell simplified, cheap versions of their professional systems. These are similar in function to the original systems, but do not support all the data structures needed for professional development.

    Professional Authoring Languages

    These languages do not provide graphic support, but in terms of data input and output, screen design, logical functions and interactivity they are equal to the graphic systems; in many cases they are even suitable for creating special graphic effects. Using them does not necessarily require programming knowledge, and they have the advantage of a very low or no run-time licence fee. With many authoring languages, however, really fine, creative material can only be produced with serious programming knowledge: learning such a language is like learning a structured programming language, and in many cases the drivers for the devices have to be written by the author too. Nevertheless, for very demanding applications these languages are a real alternative to the graphic tools.

    To sum up, it is practical to purchase the best authoring software you can, as it can greatly simplify the author's work; the purchase price is refunded several times over by the reduced development time. When purchasing, take the following into consideration:

      • Is it suitable for creating material that plays in good quality, at the required speed, on the target platforms without difficulties?
      • Does it handle external equipment (e.g. DVD)?
      • What data formats can it read and create?
      • What audio and video editing functions does it have?
      • How object-oriented is it?
      • Do you have to pay a licence fee for the developed material?

    There are several authoring systems on the market which fulfil most of these criteria to a very high standard.

    Classifications of Authoring Systems

    Authoring systems are most often classified according to the structure they have.

    Page Oriented Systems

    PowerPoint (Microsoft)

    With the Microsoft Office PowerPoint program, effective and dynamic presentations can be created quickly.

    With the program, company and product descriptions, presentations, smaller e-learning lessons, and class supplements can be created. A presentation is assembled from slides, which can be projected, presented on the computer's monitor or printed out. Texts, diagrams, pictures, sound effects, animations, and transitions can be added, making presentations more spectacular and easier to follow.

        • Development is similar to writing a book:
          • associative linking of the pages, in the spirit of the hypertext philosophy, creates a complex web structure.
        • Objects are placed on the slides.
        • Program code can be attached to the basic units.
          • This way it can be determined what should happen after certain events (e.g. a click).
        • Predefined events can be selected,
          • e.g. a click or the movement of the mouse pointer; there is even an option for recording macros.
          • If something beyond the built-in components is needed, programming skills are required.
        • An open system can be created which the user can extend.
        • New slides can easily be made with the help of ready-made templates.
        • Media files are embedded in the presentation file, except audio and video files, which are linked and therefore do not increase the size of the file.
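    The list above notes that program code can be attached to slide objects and triggered by predefined events such as a click. A minimal, hypothetical Python sketch of this event-to-handler binding (the event names and handlers are invented for illustration):

    ```python
    # Toy model of binding handlers to slide events, as page-oriented
    # authoring systems do (e.g. code attached to a click on an object).
    handlers = {}

    def on(event_name, handler):
        """Register a handler to run when the named event fires."""
        handlers.setdefault(event_name, []).append(handler)

    def fire(event_name):
        """Run every handler bound to the event and collect their results."""
        return [h() for h in handlers.get(event_name, [])]

    on("button_click", lambda: "jump to next slide")
    on("button_click", lambda: "play sound effect")
    print(fire("button_click"))   # ['jump to next slide', 'play sound effect']
    ```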

    ToolBook Assistant/Instructor (SumTotal)

    ToolBook is an e-learning development system from SumTotal Systems. The authoring system comes in two variants, Assistant and Instructor. The greatest difference between them is that Assistant does not require much programming knowledge: working with it is easy for an average user, with all commands selected from menus. The basic concept of the ToolBook authoring system is built on learning objects. Several pre-made catalogs of learning objects are available, and the system offers a user-friendly environment. Learning objects can be customized, and a two-level option is offered for developing new objects, which can be added to the existing catalog, enlarging it according to need.

    So, on the one hand, the ToolBook authoring system offers an easy-to-use tool for everyone who intends to create quality e-learning material from the existing components without developing new learning objects. On the other hand, it has an open architecture for IT professionals, giving the option of further development.

    Multimedia Builder (Mediachance)

    Multimedia Builder belongs to the page-oriented authoring systems. The editing environment is made up of well-separated parts. The screen pages, reminiscent of book pages, are located at the bottom, and the user can choose which one to edit. Objects, i.e. multimedia elements such as pictures or sounds, can be added from the left-side column; all multimedia elements, which can be re-used, are listed on the right side. The centre of the program window is filled with the chosen page, and above it there are further editor icons and menus to make editing easier; for instance, predefined effects can be added to certain units from these menus.

    Multimedia Builder

    Icon Oriented Systems

    Macromedia Authorware

    Authorware is a flow-controlled, or in other words icon-oriented, system. The two names come from the fact that icons have to be fitted into a flow diagram which determines what should happen in a given phase.

    Running the program means executing the operations belonging to the icons placed in the flow diagram, from top to bottom.

    This software, developed specifically for publishing learning, multimedia and Internet material, is still the most popular development system of its type. Thanks to its icon-based control its operation is simple, while the quality of the published materials matches the most demanding multimedia editions.

    With this software the structure of the multimedia presentation, the organization of the interactive links and the structure of the database can be developed; that is, the whole multimedia subject can be designed. During the work, any files made with graphic and audio software, as well as images, video and audio files selected from the source materials, can be used.

    Advantages: Excellent tools for developing content. Very good options for creating animations and special effects.

    Disadvantages: High-level programming knowledge is required. A separate software package is needed for administering the course and following the progress of students.

    Authorware

    Other authoring systems

    Adobe (formerly: Macromedia) Director, Flash

    Flash is the best-known system of all the categories. It belongs to the family of timeline-based authoring systems. Director's architecture is almost the same; the main difference is that Director is the best tool for offline (CD, DVD) media while Flash is the best tool for online (Internet) media. There are other important differences, but both are suitable for e-learning material.



    Object-oriented systems, e.g. Visual Basic, Visual C++, …

    These authoring systems are mostly for programmers with an object-oriented outlook. They can be handled easily by a programmer, since they are visual in a manner of speaking, but they are extremely hard for a newcomer. They can also be used for special purposes.


    Exercise
    Try the most frequently used software, Adobe Flash. You can download a trial version from the website of Adobe and use the software for 30 days without restrictions.


    You can find a detailed description of making a presentation and integrating it into PowerPoint in the book TC03 - PowerPoint!


    Reference Works

    1. Gross, Phil: Director 8 and Lingo, Macromedia Press, Berkeley, 2001.

    2. Vaughan, Tay: Multimedia, Osborne/McGraw-Hill.

    Interactivity

    Learning objectives


    When you have completed this session, you should be able to

    • define the term "interactivity",
    • list the advantages of interactivity.

    Why is it useful?

    Interaction

    From the very beginning, computers have offered the possibility of play. Educational films, affecting several senses with the power of sound and image, make new knowledge more of an adventure, but multimedia means a real breakthrough in this area.
    Films can hold attention for a long time and may be a real adventure, but they can never "involve" students in the subject as much as a tool that communicates and reacts to their interactions. Multimedia may give the answer to the everlasting problems of mentoring and teaching, such as:

    1. How to arouse the interest of students for the importance of new knowledge?
    2. How can permanent attention be kept?
    3. How to persuade students to practice?

    Even the first teaching programs (in the 70s and 80s) could captivate teachers (as developers) and students alike, although all they could do was ask questions, react to the students' answers, and calculate and store scores as percentages on monochrome monitors.
    Today's software technology offers practically unlimited possibilities; fantasy and creativity are the only limits. This can be just as dangerous as it is advantageous: no wonder that, in spite of the technological capabilities, very few good-quality educational materials are created.
    Synchronizing interactivity with the educational goals is one of the most difficult tasks a teacher can face. Besides pedagogical talent, it takes a sense of balance, artistic vision, musical talent, knowledge of ergonomic and engineering rules, and great professional preparedness to create a product that really fits its purpose.
    Despite this, I recommend to all my teacher colleagues to participate in such developments if there is a chance. The development itself is a great adventure and the feedback is an enormous joy.

    Every teacher with several years of experience gathers an immense treasure over the years. Teachers (in Hungary) have lots of creative ideas. How good it would be if even a small proportion of these could survive after they retire! Interactive multimedia provides a chance to save these pedagogical treasures.

    Programming
    Interactivity can be integrated into the electronic syllabus by programming. With today's software development systems, graphic elements and audio and video material can be integrated without limitation, and the development tools support interactive solutions with hundreds of built-in commands. Authoring systems have their own command sets for programming tests, simulations, and practice tasks: such languages are Lingo in Macromedia Director, or ActionScript, the script language of Flash, for educational materials published on the Internet. Most teachers (including many IT teachers) think that getting acquainted with these tools is impossible because programming is another profession. This is not even necessary; however, it would be worth every teacher's while to become familiar with the possibilities, because only in this way can teachers participate as members of development teams designing multimedia lessons. Precisely this sort of abstention is one reason why there are so few really good multimedia syllabuses.
    In this lesson I try to present the possibilities with numerous examples. I think that by looking at these examples my colleagues will realize that they cannot afford to be left out of this!


    Reset This!

    A simple Flash program: a computer version of an old but very fine little game.

    Only the pictures need to be changed and it can be used for practice in many subjects! If you want to try again, just click on the button in the bottom-left corner labelled "Újra!" ("Again!").

    In the picture below you can see the final solution:

    The solution

    Do you have any further thoughts for this idea?


    Program Sample in Flash

    The program code is attached for those who have some programming knowledge. Authoring systems usually use object-oriented script languages; the Lingo language of Macromedia Director and the script language of Flash are such languages. The program is built of functions that are executed when some event (e.g. a user interaction such as a mouse click) happens. The syntax of the Flash script language is similar to the C programming language, but during programming Flash objects must be used, and the programmer must know the Flash environment. The Flash code of a puzzle is attached here as an example; those who have programming experience will see that the coding is not so difficult once the algorithm has been designed.
    Certainly, writing programs cannot be expected from teachers undertaking the development of electronic syllabuses; the primary task of the developing mentor is the design of the interactive elements.
    We hope, however, that we have caught the interest of IT teachers familiar with programming!
    // Setting start parameters
    var hole = 1;
    var point = 0;
    var end = 0;
    var maxprobe = 5000;
    var places:Array = new Array();
    for (i = 2; i < 10; i++)
    {
    places[i] = i;
    }
    for (i = 2; i < 10; i++)
    {
    tile = eval("tile" + i + "_mc");
    tile.onPress = fnMozgasd;
    }
    again_btn.onPress = fnNewGame;
    // Start new game when clicking on this button!
    fnNewGame();
    function fnNewGame()
    {
    point = 0;
    end = 0;
    fnMixing();
    possibility.text = maxprobe;
    score.text = 0;
    result_mc._visible = false;
    }
    // Moving – when the player clicks on a picture
    function fnMozgasd()
    {
    if (!end)
    {
    which = this._name.substring(4, 5);
    point += fnMove(which);
    score.text = point;
    possibility.text = maxprobe - point;
    if (fnReady())
    {
    loadMovie("grat.jpg", result_mc);
    end = 1;
    }
    else if (point == maxprobe)
    {
    loadMovie("unsuccesful.jpg", result_mc);
    end = 1;
    }
    score.text = point;
    }
    }
    // Moving pictures
    function fnMove(unit)
    {
    direction = fnCanmove(unit);
    obj = eval("tile" + unit + "_mc")
    actUnit = unit;
    if (direction == 1)
    {
    places[unit]--;
    hole++;
    obj._x -= 90;
    return 1;
    }
    if (direction == 2)
    {
    obj._x += 90;
    places[unit]++;
    hole--;
    return 1;
    }
    if (direction == 3)
    {
    obj._y -= 90;
    places[unit] -= 3;
    hole += 3;
    return 1;
    }
    if (direction == 4)
    {
    obj._y += 90;
    places[unit] += 3;
    hole -= 3;
    return 1;
    }
    return 0;
    }
    // Checking if the chosen picture can be moved?
    function fnCanmove(unit)
    {
    place = places[unit];
    //is the hole to the left?
    if (place != 1 && place != 4 && place != 7)
    {
    if (place - 1 == hole)
    {
    return 1;
    }
    }
    //is the hole to the right?
    if (place != 3 && place != 6 && place != 9)
    {
    if (place + 1 == hole)
    {
    return 2;
    }
    }
    //is the hole above?
    if (place > 3)
    {
    if (place - 3 == hole)
    {
    return 3;
    }
    }
    //is the hole under?
    if (place < 7)
    {
    if (place + 3 == hole)
    {
    return 4;
    }
    }
    return 0;
    }
    // The function returns a logical value which is true when the puzzle is done
    function fnReady()
    {
    good = 1;
    for (i = 2; i < 10; i++)
    {
    if (places[i] != i)
    good = 0;
    }
    return good;
    }
    // Random ordering of pictures when starting a new game
    function fnMixing()
    {
    pcs = random(1000);
    for (i = 0; i < pcs; i++)
    {
    unit = random(9) + 1;
    fnMove(unit);
    }
    mixing = 0;
    }
    function fnSliding()
    {
    obj = eval("tile" + actUnit + "_mc");
    if (actDirection == 1) //left
    {
    if (obj._x > actEnd)
    {
    obj._x -= 10;
    }
    else
    {
    obj._x = actEnd;
    delete onEnterFrame;
    }
    }
    }
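    For readers who do not know Flash, the heart of the algorithm above (deciding where a tile may slide and detecting the solved state) can be followed in a language-independent way. Below is a rough Python transcription of the fnCanmove and fnReady logic; the dictionary-based places and the function names are our own simplification:

    ```python
    # Rough Python transcription of the puzzle's core logic. Positions 1..9
    # form a 3x3 grid; tiles 2..9 sit at places[tile], 'hole' is the empty
    # position. Return codes match the Flash code: 1/2/3/4 = left/right/up/down.

    def can_move(places, hole, tile):
        """Return 1/2/3/4 if the tile can slide left/right/up/down, else 0."""
        place = places[tile]
        if place not in (1, 4, 7) and place - 1 == hole:   # hole to the left
            return 1
        if place not in (3, 6, 9) and place + 1 == hole:   # hole to the right
            return 2
        if place > 3 and place - 3 == hole:                # hole above
            return 3
        if place < 7 and place + 3 == hole:                # hole below
            return 4
        return 0

    def is_ready(places):
        """The puzzle is solved when every tile i sits on place i."""
        return all(places[i] == i for i in range(2, 10))

    places = {i: i for i in range(2, 10)}   # solved layout, hole at place 1
    print(is_ready(places))        # True
    print(can_move(places, 1, 2))  # 1 -> tile 2 can slide left into the hole
    ```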

    Publishing online

    Learning objectives


    When you have completed this session, you should be able to

    • define the term "Internet",
    • list some important dates in the history of the Internet.

    Reading
    To view a curriculum online, you need to integrate the learning object (the multimedia element) into, for example, a web page. This chapter shows you how. But first, some history.

    Internet history

    In 1962, a nuclear confrontation seemed imminent. The United States (US) and the Union of Soviet Socialist Republics (USSR) were embroiled in the Cuban missile crisis. Both the US and the USSR were building hair-trigger nuclear ballistic missile systems, and each country pondered post-nuclear-attack scenarios.

    US authorities considered ways to communicate in the aftermath of a nuclear attack. How could any sort of "command and control network" survive? Paul Baran, a researcher at RAND, offered a solution: design a more robust communications network using "redundancy" and "digital" technology.

    The most important basic principle was that the network must not have any kind of centre. The other principles were also very simple: to build robust, fault-tolerant and distributed computer networks.

    The ARPANET was one of the ancestor networks of today's Internet. In an independent development, Donald Davies at the UK National Physical Laboratory also discovered the concept of packet switching in the early 1960s, first giving a talk on the subject in 1965.

    The early ARPANET ran on the Network Control Program (NCP), a standard designed and first implemented in December 1970 by a team called the Network Working Group (NWG). To respond to the network's rapid growth as more and more locations connected, Vinton Cerf and Robert Kahn developed the first description of the now widely used TCP protocols during 1973.

    The opening of the network to commercial interests began in 1988.
    Although the basic applications and guidelines that make the Internet possible had existed for almost two decades, the network did not gain a public face until the 1990s. On 6 August 1991, CERN, a pan European organization for particle research, publicized the new World Wide Web project. The Web was invented by British scientist Tim Berners-Lee in 1989.

    By 1996 usage of the word Internet had become commonplace, and consequently, so had its use as a synonym in reference to the World Wide Web.



    References:
    Wikipedia

    www

    Learning objectives


    When you have completed this session, you should be able to

    • define the terms "HTML", "URL" and "HTTP",
    • define the term "World Wide Web".

    HTML, URL, HTTP

    HTML, which stands for HyperText Markup Language, is the predominant markup language for web pages. It provides a means to create structured documents by denoting structural semantics for text such as headings, paragraphs, lists, links, quotes and other items. It allows images and objects to be embedded and can be used to create interactive forms. It is written in the form of HTML elements consisting of "tags" surrounded by angle brackets within the web page content. It can embed scripts in languages such as JavaScript which affect the behavior of HTML webpages. HTML can also be used to include Cascading Style Sheets (CSS) to define the appearance and layout of text and other material. The W3C, maintainer of both HTML and CSS standards, encourages the use of CSS over explicit presentational markup.

    In computing, a Uniform Resource Locator (URL) is a Uniform Resource Identifier (URI) that specifies where an identified resource is available and the mechanism for retrieving it. In popular usage and in many technical documents and verbal discussions it is often incorrectly used as a synonym for URI. The best-known example of a URL is the "address" of a web page on the World Wide Web, e.g. http://www.tenegen.eu
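    As an illustration of the parts a URL is made of, the Python standard library can split one apart. The URL below is a made-up example, not a real Tenegen address:

    ```python
    # Splitting a URL into its components: the scheme names the retrieval
    # mechanism, the netloc says where the resource lives, the path says
    # which resource, and the query carries parameters.
    from urllib.parse import urlparse

    url = "http://www.tenegen.eu/moodle/course/view.php?id=3"
    parts = urlparse(url)
    print(parts.scheme)   # http
    print(parts.netloc)   # www.tenegen.eu
    print(parts.path)     # /moodle/course/view.php
    print(parts.query)    # id=3
    ```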

    The Hypertext Transfer Protocol (HTTP) is an Application Layer protocol for distributed, collaborative, hypermedia information systems.

    HTTP is a request-response protocol standard for client-server computing. In HTTP, a web browser, for example, acts as a client, while an application running on a computer hosting the web site acts as a server. The client submits HTTP requests to the responding server by sending messages to it. The server, which stores content (or resources) such as HTML files and images, or generates such content on the fly, sends messages back to the client in response. These returned messages may contain the content requested by the client or may contain other kinds of response indications. A client is also referred to as a user agent (or 'UA' for short). A web crawler (or 'spider') is another example of a common type of client or user agent.

    In between the client and server there may be several intermediaries, such as proxies, web caches or gateways. In such a case, the client communicates with the server indirectly, and only converses directly with the first intermediary in the chain. A server may be called the origin server to reflect the fact that this is where content ultimately originates from.
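    To make the request-response idea concrete without touching the network, the following Python sketch builds a sample HTTP request and splits a canned response into its status line, headers and body. Both messages are invented examples in the standard HTTP/1.1 message format:

    ```python
    # The shape of an HTTP exchange, as raw text (no network involved).
    # A client request message:
    request = (
        "GET /index.html HTTP/1.1\r\n"
        "Host: www.tenegen.eu\r\n"
        "\r\n"
    )

    # A server response message: status line, headers, blank line, body.
    response = (
        "HTTP/1.1 200 OK\r\n"
        "Content-Type: text/html\r\n"
        "\r\n"
        "<html><body>Hello</body></html>"
    )

    status_line, rest = response.split("\r\n", 1)
    headers_text, body = rest.split("\r\n\r\n", 1)
    print(status_line)    # HTTP/1.1 200 OK
    print(headers_text)   # Content-Type: text/html
    print(body)           # <html><body>Hello</body></html>
    ```

    A real browser and server exchange exactly this kind of text, with the blank line separating the headers from the content.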

    World Wide Web

    The World Wide Web, abbreviated as WWW and commonly known as the Web, is a system of interlinked hypertext documents accessed via the Internet. With a web browser, one can view web pages that may contain text, images, videos, and other multimedia and navigate between them by using hyperlinks. Using concepts from earlier hypertext systems, British engineer and computer scientist Sir Tim Berners-Lee, now the Director of the World Wide Web Consortium, wrote a proposal in March 1989 for what would eventually become the World Wide Web. He was later joined by Belgian computer scientist Robert Cailliau while both were working at CERN in Geneva, Switzerland. In 1990, they proposed using "HyperText [...] to link and access information of various kinds as a web of nodes in which the user can browse at will", and released that web in December.

    "The World-Wide Web (W3) was developed to be a pool of human knowledge, which would allow collaborators in remote sites to share their ideas and all aspects of a common project." If two projects are independently created, rather than have a central figure make the changes, the two bodies of information could form into one cohesive piece of work.


    Reference:

    Wikipedia

    Html

    Learning objectives


    When you have completed this session, you should be able to

    • list the most important elements of an HTML document,
    • create your first simple webpage.

    HTML history

    In 1980, physicist Tim Berners-Lee, who was a contractor at CERN, proposed and prototyped ENQUIRE, a system for CERN researchers to use and share documents. In 1989, Berners-Lee wrote a memo proposing an Internet-based hypertext system. Berners-Lee specified HTML and wrote the browser and server software in the last part of 1990. In that year, Berners-Lee and CERN data systems engineer Robert Cailliau collaborated on a joint request for funding, but the project was not formally adopted by CERN. In his personal notes from 1990 he lists "some of the many areas in which hypertext is used" and puts an encyclopedia first.

    An HTML document, HTML elements

    An HTML element is an individual component of an HTML document. HTML documents are composed of a tree of HTML elements and other nodes, such as text nodes. Each element can have attributes specified. Elements can also have content, including other elements and text. HTML elements represent semantics, or meaning. For example, the title element represents the title of the document.

    In the HTML syntax, most elements are written with a start tag and an end tag, with the content in between. Tags are composed of the name of the element, surrounded by angle brackets. An end tag also has a slash after the opening angle bracket, to distinguish it from the start tag. For example, a paragraph, which is represented by the p element, would be written as

    <p>In the HTML syntax, most elements are written ...</p>

    However, not all of these elements require the end tag, or even the start tag, to be present. Some elements, the so-called void elements don't have an end tag. A typical example is the br element, which represents a significant line break, such as in a poem or an address. For example, the address of the dentist in Finding Nemo would be written as

    <p>P. Sherman<br>42 Wallaby Way<br>Sydney</p>

    Attributes are specified on the start tag. For example, the abbr element, which represents an abbreviation, expects a title attribute with its expansion. This would be written as

    <abbr title="Hyper Text Markup Language">HTML</abbr>
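    The start tags and attributes described above can also be read programmatically. Here is a small sketch using Python's built-in HTML parser on the abbr example from the text:

    ```python
    # Reading tags and their attributes with Python's built-in HTML parser.
    from html.parser import HTMLParser

    class TagCollector(HTMLParser):
        def __init__(self):
            super().__init__()
            self.found = []
        def handle_starttag(self, tag, attrs):
            # attrs arrives as a list of (name, value) pairs.
            self.found.append((tag, dict(attrs)))

    p = TagCollector()
    p.feed('<abbr title="Hyper Text Markup Language">HTML</abbr>')
    print(p.found)   # [('abbr', {'title': 'Hyper Text Markup Language'})]
    ```

    Every web browser performs this kind of parsing, building the tree of elements from the stream of tags.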


    Here is a simple HTML document:

    <HTML>
    <HEAD>
    <TITLE>"Hello world!" HTML document</TITLE>
    </HEAD>
    <BODY>
    <P>Hello world!</P>
    </BODY>
    </HTML>

    Let's try it: enter the text above into a plain-text editor, like Notepad, and save it as first.htm (it is important that the extension is htm!). Then open this file in a web browser and check the result!

    We have used five tags in this sample: HTML, HEAD, TITLE, P, BODY.

    The HTML tag encloses the whole document. HEAD contains the head of the page; BODY is the real content, the text that appears. Inside the head there is the TITLE, whose text appears in the title bar of the browser application.

    Elements (tags) generally consist of two parts which enclose the text that appears on the webpage, like parentheses (brackets).

    Rules

    There are multiple kinds of HTML elements: void elements, raw text elements, and normal elements.

    Void elements only have a start tag, which contains any attributes. One example is the link element, for which the syntax is

    <link rel=stylesheet href=fancy.css type="text/css">

    This link element points the browser at a stylesheet to use when presenting the HTML document to the user. Note that in the HTML syntax, attributes don't have to be quoted. When using the XML syntax (XHTML), on the other hand, all attributes must be quoted, and a trailing slash is required before the last angle bracket:

    <link rel="stylesheet" href="fancy.css" type="text/css" />

    Raw text elements are constructed with:

    * a start tag (<tag>) marking the beginning of an element, which may incorporate any number of attributes;
    * some amount of text content, but no elements (all tags, apart from the applicable end tag, will be interpreted as content);
    * an end tag, in which the element name is prepended with a forward slash: </tag>. In some versions of HTML, the end tag is optional for some elements. The end tag is required in XHTML.

    Normal elements usually have both a start tag and an end tag, although for some elements the end tag, or both tags, can be omitted. It is constructed in a similar way:

    * a start tag (<tag>) marking the beginning of an element, which may incorporate any number of attributes;
    * some amount of content, including text and other elements;
    * an end tag, in which the element name is prepended with a forward slash: </tag>.

    Attributes define desired behavior or indicate additional element properties. Most attributes require a value. In HTML, the value can be left unquoted if it doesn't include spaces (name=value), or it can be quoted with single or double quotes (name='value' or name="value"). In XML, those quotes are required. Boolean attributes, on the other hand, don't require a value to be specified. An example is the checked attribute for checkboxes:

    <input type=checkbox checked>

    In the XML syntax, though, the name should be repeated as the value:

    <input type="checkbox" checked="checked"/>

    Informally, HTML elements are sometimes referred to as "tags" (an example of synecdoche), though many prefer the term tag strictly in reference to the markup delimiting the start and end of an element.

    Element (and attribute) names may be written in any combination of upper or lower case in HTML, but must be in lower case in XHTML. The canonical form was upper-case until HTML 4, and was used in HTML specifications, but in recent years, lower-case has become more common.
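This case rule can be observed directly with a parser. As a small illustration, Python's standard html.parser module normalizes tag names to lower case, so <HTML> and <html> are reported identically (a minimal sketch; the sample document is made up):

```python
from html.parser import HTMLParser

class TagCollector(HTMLParser):
    """Collects the names of all start tags in a document."""
    def __init__(self):
        super().__init__()
        self.tags = []

    def handle_starttag(self, tag, attrs):
        # html.parser reports every tag name in lower case,
        # regardless of the case used in the source.
        self.tags.append(tag)

collector = TagCollector()
collector.feed("<HTML><HEAD><TITLE>Hi</TITLE></HEAD><BODY><P>Hello</p></BODY></HTML>")
print(collector.tags)  # ['html', 'head', 'title', 'body', 'p']
```

The mixed-case tags in the input all arrive lower-cased, which is why HTML treats <P> and <p> as the same element.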



    You will find practical information about editing a webpage in Moodle in the book TC03 - Website editing!

     


    References:

    [1] Bócz Péter: A világháló lehetőségei, ComputerBooks, Budapest, 2001.

    [2] Paczona Zoltán: HTML technikák a gyakorlatban, Computer Panoráma, 2001.

    [3] Wikipedia

    E-learning curriculum

    Learning objectives


    When you have completed this session, you should be able to

    • define the term "synopsis",
    • create one of the most important documents in planning an e-curriculum.

    Synopsis

    Drawing up the synopsis is the first step of curriculum development. The word synopsis (from the Greek synopsesthai, "to see together") is also used in film production, where it means a brief summary of the plot of the film. Today's meaning is: overview, summary, abstract, a content outline of a bulky piece of writing. In relation to electronic curriculum development it is a short but comprehensive presentation of all the material of the curriculum: it defines the approximate estimated development costs and makes the development concept transparent.
    It is appropriate for teachers to show their didactic objectives and to verify that the digital learning material offers more to the selected target group than traditional methods do.
    For example, suppose we would like to win support for a proposal for electronic curriculum development. Based on the synopsis the evaluators can decide whether the proposed expenditures are in accordance with the expected results.

    Synopsis structure

    1. Objectives, define topics


    At this point the teacher sets out the subject, the topic within it, the age group for which the electronic curriculum is produced, and the purpose of the development.
    The content outline contains at least the following:
    a. objective, target group (ages)
    b. subject
    c. topic
    d. „sub” topic

    It is advisable to choose a unit from the topic which is difficult to teach with traditional methods, where using a computer is justified from a methodological and didactic point of view and where interactive elements could help understanding and learning. For example: learning new words in a foreign language, or transformations of functions in mathematics.

    2. The number and size of learning units


    A learning unit is a part of the e-learning material which can be learned in one sitting (20-40 minutes). The synopsis should describe how many learning units are needed for the topic and estimate the extent of the text material. The standard unit of measurement of schoolbook writing is the "author's sheet": 16 A4 pages, where an A4 page contains 25 rows of 60 characters (1500 characters per page).
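Under this definition one author's sheet is 16 × 25 × 60 = 24,000 characters. The arithmetic can be sketched as follows (the sample manuscript length is invented for illustration):

```python
# Conventions of the "author's sheet" as defined above.
ROWS_PER_PAGE = 25
CHARS_PER_ROW = 60
PAGES_PER_SHEET = 16

CHARS_PER_PAGE = ROWS_PER_PAGE * CHARS_PER_ROW      # 1500 characters
CHARS_PER_SHEET = CHARS_PER_PAGE * PAGES_PER_SHEET  # 24000 characters

def text_size(char_count):
    """Return the size of a manuscript as (A4 pages, author's sheets)."""
    pages = char_count / CHARS_PER_PAGE
    sheets = char_count / CHARS_PER_SHEET
    return pages, sheets

# e.g. a manuscript of 36,000 characters:
pages, sheets = text_size(36_000)
print(pages, sheets)  # 24.0 pages, 1.5 author's sheets
```

Such a helper makes it easy to report the extent of each learning unit's text in the synopsis.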

    Learning unit | Short content | Size
    1.            |               |

    For the content summary of each unit it is required to estimate the number of the

    a. sample tasks,

    b. practice tasks,

    c. tests (with keys),

    d. concept definitions.

    Learning unit | sample task | practice task | test | definition
    1.            |             |               |      |
    2.            |             |               |      |
    ...           |             |               |      |
    all           |             |               |      |

    The table provides information for estimating the necessary work. The requirements of the interactive and multimedia-based tasks (programming, design, editing) are covered in the next block of the synopsis. In summary: in the second part of the synopsis we plan the required authoring work, that is, the textual learning objects and their number.
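The totals row of such a table can be produced mechanically; a small sketch, where the per-unit task counts are invented for illustration:

```python
# Task counts per learning unit:
# (sample tasks, practice tasks, tests, definitions).
# The numbers below are invented example data.
units = {
    "1.": (2, 3, 1, 4),
    "2.": (1, 2, 1, 3),
}

def totals(table):
    """Sum each task column over all learning units."""
    return tuple(sum(column) for column in zip(*table.values()))

sample, practice, test, definition = totals(units)
print(sample, practice, test, definition)  # 3 5 2 7
```

The resulting row corresponds to the "all" line of the table and gives a quick measure of the authoring work.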

    3. Pedagogical methods


    In this section we write briefly about our pedagogical and didactical concept, for example:

    • Is the learning unit intended for individual learning or for practice?
    • Is it meant for frontal or for group learning?
    • How will it be applied?
    • How can others use the finished material?
    • How will the acquired knowledge be controlled and evaluated?
    • How will the digital unit be built into the educational process?
    • A methodological summary proves that we have a thought-out pedagogical outline: our concept is complete, and the development is not an end in itself.



    Methodological outline

    Learning unit | Processing method (proposed time period) | Method of control and evaluation
    1.            |                                          |

    Examples:
    a. Within a subject, for some special topics the lesson takes place in the computer lab. During the digital lesson the students process the learning material on their own, under the teacher's supervision.
    b. A presentation system (projector, laptop) is used in the classroom, working frontally or in groups.
    c. The electronic unit is used in addition to the classroom lessons, to let the students practise and to check them after the lessons.
    d. The students write the closing test of the topic in the computer lab. The system stores the results for the teacher and shows them to the students immediately.
    e. If we publish the curriculum on CD, the students can take it home.

    These possibilities can be combined.

    Note: It is not necessary to use a table form!

    4. Learning Objects


    In high-quality e-learning material development a significant part of the costs comes from designing and producing the media elements. In the synopsis it is required to define the type (picture, sound, video) and number of the LOs in each unit.
    Inaccurate planning and resource estimation can cause serious problems, because external expert work will probably be needed during the process, and the external orders must be prepared on the basis of the synopsis. It is particularly dangerous to underestimate the IT and graphic design work, because then one is forced either to cut the production costs (at the expense of quality) or to take on additional work that is not covered by the low calculated costs. If you do not prepare the media objects yourself, ask an expert for a bid.


    Quantification is possible in a simple table if the images are of approximately the same quality (size), the images and animations are simple, and the lengths of the sounds and videos are similar. Even in this case you need to attach an explanation about the length of the sounds and their type (narration, music or actor's narration).

    Learning unit number | Graphic (digital drawing) | Digitized image | Animation | Video | Sound | Interactive elements
    1.                   | 2                         | 3               | 1         | 0     | 3     | 10
    2.                   |                           |                 |           |       |       |
    ...                  |                           |                 |           |       |       |
    All                  |                           |                 |           |       |       |

    In the column of interactive elements, count the elements that require programming work (for example a test exercise with animation, a puzzle with graphic elements, or matching tasks). In multimedia systems (e.g. Flash) small programs (scripts) provide the interactions, the checking of solutions and the evaluation.

    For planning real costs, further additions to the table are needed if we plan a more complex 3D simulation or animation, or a short film clip produced in a film studio. Always ask for a specific bid to avoid under- or overestimating the costs.
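The cost logic above can be sketched as a small calculation. Every rate below is an invented placeholder, standing in for the specific bids the text recommends:

```python
# Media element counts for one unit (the row "1." of the table above).
counts = {"graphic": 2, "digitized_image": 3, "animation": 1,
          "video": 0, "sound": 3, "interactive": 10}

# Assumed effort in hours per element -- placeholder values only;
# real figures should come from an expert's bid.
hours_per_element = {"graphic": 1.0, "digitized_image": 0.5,
                     "animation": 4.0, "video": 8.0,
                     "sound": 1.5, "interactive": 3.0}

def estimated_hours(counts, rates):
    """Total production effort, assuming linear per-element costs."""
    return sum(counts[kind] * rates[kind] for kind in counts)

print(estimated_hours(counts, hours_per_element))  # 42.0 with these made-up rates
```

Note that the linearity assumption is exactly what breaks for 3D simulations or studio film clips, which is why those always need an individual bid.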

    5. Format of media elements, software and hardware requirements for the production


    It is very important to use standard digital formats that fit several framework systems. Consult an IT expert when choosing the software and hardware required for the production; this is especially useful if we would like to sell our end product. Look around for free software, or calculate the software costs.

     

    Media element | Format | Software | Hardware
    photo         | JPG    |          | digital camera
    graphic       |        |          |
    narration     |        |          |

    6. Source, copyright information


    The sources your curriculum relies on should be listed, like the bibliography in a book. Always observe the copyright laws and, if needed, ask permission from the author.

    If you want to sell your electronic curriculum, you may have to pay royalties for some elements. Do not forget to calculate with them and to settle the royalties in advance.

    7. Publication


    Write here about the software and hardware environment of your curriculum. The selected framework determines the format in which the LOs should be prepared (e.g. image size due to transfer limits). The curriculum may be published, for example, on CD with a well-known authoring software, in an open-source system on the Internet, or both.

    8. Notes


    1. Take care that the synopsis does not become too long, but contains all the essential elements needed for the evaluation of the concept.

    2. It is possible that some modifications will be needed during the implementation, so it is worth calculating with some margin (max. ±10%).


    Bibliography:


    [1] Tay Vaughan: Multimedia, Osborne McGraw-Hill, Berkeley, 1999.

    [2] Rakaczkiné, dr Tóth K., Szabó J., Szentpétery Zsolt: Az e-tananyag fejlesztésének pedagógiai-távoktatási alapjai, SZIE, GTK Közép-Magyarországi Regionális Távoktatási Központ, Gödöllő, 2002.

    Synopsis model plan

    Synopsis

    for the e-learning curriculum titled "Scotland"

    1 AUTHOR:

    name:

    school: primary school of Szigetújfalu

    2 OBJECTIVES, DEFINE TOPICS

    Defining the learning unit:

    learning objective: Scotland's profile
    target group: high school students
    subject: English language
    topic: Scotland

    3 STRUCTURE, NUMBER AND SIZE OF LEARNING UNITS

    Curriculum

    Learning unit | Content
    1. | Introduction: population, main characteristics, symbols
    2. | Brief historical summary
    3. | Culture: literature, dance, music
    4. | Interests: food, drinks, clothing, legends
    5. | Summary, other / celebrities

    4 MEDIA ELEMENTS

    Multimedia elements to help understanding

    Learning unit | Text | Graphic, picture, animation | Video | Sound | Interactive elements
    1.    | 2  | 3  | 1 | 1 | 1
    2.    | 5  | 2  | 2 | 1 | 2
    3.    | 3  | 3  | 1 | 2 | 4
    4.    | 2  | 1  | 1 | 1 | 1
    5.    | 3  | 2  | 2 | 2 | 3
    Total | 15 | 11 | 7 | 7 | 11

    Note: It is very important to use a multimedia elements table, because the type and number of elements have a significant impact on the budget. In e-learning courses it is not always essential, so it may be enough to indicate their type without their number.

    5 COURSE COMPONENTS – MOODLE- RESOURCES, -ACTIVITIES

    In Moodle there are two categories of content: 1. resources; 2. activities. A resource is a static element carrying the learning material and new knowledge, while an activity is a dynamic content element built on the participation of the students.

    Used Moodle resources

    Type | Learning unit | Topic | Pedagogical aim
    Text page | 1 | Scotland residents | Comprehension, knowledge processing
              | 2 | William Wallace | New vocabulary, self-learning topic
              | 3 | Robert Burns | Reading, understanding
              | 4 | Clothing, haggis | New knowledge
              | 5 | HTML, web site |
    Book | more units | changeable | New knowledge, self-learning text
    Attached file (pps, pdf, image …) | all units | | Illustration, new knowledge

    Used Moodle activities

    Activity | Topic | Pedagogical objective
    Glossary | any of them | Expansion and organization of knowledge
    Diary | |
    Blog | continuously | Self-learning
    Questionnaire | course start and end | Getting information
    Wiki | continuously | Getting knowledge
    Database | |
    Forum | occasionally | Repetition
    Chat | |
    Test | after the last topic | Practising, control

    6 SOURCE, COPYRIGHT INFORMATION

    • Free images
    • skocia.hu
    • Schoolbooks

    7 PUBLICATION

    School server

    Synopsis model

    Synopsis

    ..................................................................

    title for e-learning curriculum

    1 Objectives, define topics

    Define learning unit

    learning objective

    target group

    subject

    topic

    "sub" topic 1, 2, …

    2 Structure, Number of learning units, size

    learning units

    learning unit

    content

    size

    1.

    2.

    learning units

    learning unit

    number of tasks

    example task

    practising task

    test

    concept

    1.

    2.

    3 Teaching methods

    methodical outline

    learning unit

    proposed time frame

    processing method

    control, evaluation method

    1.

    2.

    4 Number of LO-s

    Multimedia elements to help understanding

    learning unit

    graphic (digital drawing)

    digitized images

    animation

    video

    sound

    interactive elements

    1.

    2.

    all

    5 Producing media elements

    media element

    format

    software

    hardware

    photo

    graphic

    narration

    6 Source, bibliography

    7 Publication

    Storyboard

    Learning objectives


    When you have completed this session, you should be able to

    • define the term "storyboard",
    • create one of the most important documents in planning an e-curriculum.

    Storyboard

    After the approval of the synopsis the detailed design work may start. The key planning tool for multimedia systems is the storyboard. First of all, let's review the work phases, the development tools and the experts required.

    Activity | What is it about? | Who? | Tools (software)
    Design | Planning the screens: composition, colours (background, text and navigation elements), font, font size, etc., template pages | graphic artist, teacher | graphic software, e.g. CorelDraw, Photoshop (or manual work)
    Storyboard compilation | Flowchart describing the structure of the learning units; describes the elements displayed on each screen, their properties and their interactive steps | teacher, e-learning methodology expert | word processor, special flowchart editor (may be prepared manually!)
    Collecting and editing media elements | Creating pictures, sounds, narrations, videos, animations | media editor, teacher | media editor software
    Writing the curriculum | Writing the text of the curriculum, the practice and illustrative tasks and their solutions, the definitions and concepts | author, teacher(s) | word processor, HTML editor
    Proofreading I. | Professional, linguistic and stylistic control | expert | word processor
    Integration, implementation [1] | Editing the e-learning unit from the elements, based on the storyboard, in a framework system | teacher, media editor, programmer | multimedia authoring system (e.g. Macromedia Director, Authorware, Adobe Flash)
    Data entry | In online framework systems the LOs (definitions, tests etc.) must be uploaded | teacher, administrator | framework system
    Testing, modification, documentation | Testing the completed e-learning unit in several environments (configuration, browser) | teacher, student, programmer | runtime environment (hardware + software)
    Proofreading II. | Methodological, pedagogical evaluation | expert, student | framework system
    Modification | Possible changes based on the proofreader's proposals | programmer, media editor | framework system
    Master copy | Creating a master multimedia CD for reproduction | programmer, media editor | framework system
    Documentation, archiving | Archiving the sources | teacher, programmer | word processor

    There is no strict order for the work phases; some of them can run in parallel (e.g. writing the curriculum, preparing the design plans, collecting the elements).

    Teachers can create a digital repository from their LOs. This allows pictures, audio, video and other materials to be reused in several lessons.

    Storyboard

    Storyboards provide a complete picture of what the final program will look like; it is a kind of construction manual. It contains all the information needed for the development, without having to ask the authors. An annex, or part of the revised text developed by the authors, contains the text elements (explanations, tasks, concepts).

    Storyboard has the following major elements

    There is no standard; the point is to produce transparent documentation for the production. It is worth making a sketch even if the teacher alone is the designer, the author and the programmer. You can use flowcharts, for example. The more people work together on the development, the more important it is that each item is identified clearly.

    For larger developments it is worth designing a template library, uniform tables and pre-defined content worksheets.
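When several people work from the same storyboard, even a simple machine-checkable representation helps keep every item clearly identified. A sketch (all screen names are invented) that models the flowchart as screens with navigation links and verifies that every link points to an existing screen:

```python
# Hypothetical storyboard: each screen lists the screens it links to.
storyboard = {
    "intro":      ["unit1"],
    "unit1":      ["unit1_test", "unit2"],
    "unit1_test": ["unit2"],
    "unit2":      ["summary"],
    "summary":    [],
}

def broken_links(board):
    """Return (screen, target) pairs whose target screen does not exist."""
    return [(screen, target)
            for screen, targets in board.items()
            for target in targets
            if target not in board]

print(broken_links(storyboard))  # [] -> every navigation link resolves
```

The same dictionary can later drive the construction of the navigation elements, so the storyboard and the implementation cannot drift apart.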

    Storyboard - general description:

    • basic information (author's data, topic, unit, structure)
    • navigation tools („scroll”, „search”, help, start test)
    • description of the orientation marks

    Description of construction (building from the elements):

    Screen planning

    You should describe all the different screen structures and their elements: the physical location, role and content of each element. You also need a list of the media elements of each screen to identify them.

    Flowchart

    Represents the curriculum structure, progression of the different levels and navigation options.

      Screen plans

      Graphical presentation of the curriculum (colour, background, font, navigation elements). Based on the design plan it is worth creating several templates, which significantly simplifies the design and the construction as well.

      The constructor needs the exact plan of each screen (place of elements, functions, interactivity). You can use a simple form where you mark the text, image and navigation elements.

      Place of different elements

      Form for mediaelements

      The annex of the screen plans describes the media elements with their metadata:

      Text elements: the whole text (file, path), font, size, type

      Graphic elements: name, size, format

      Sound, video: length, size

      Some additional data, if needed:

      Unit:
      Screen:
      Element:

      Metadata:
      • description
      • file
      • format
      • size
      • media
      • creator
      • author
      • copyright
      • interaction
      • notes
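The metadata form above maps naturally onto a record type. A sketch using a Python dataclass, with field names taken from the list above and example values invented:

```python
from dataclasses import dataclass

@dataclass
class MediaElement:
    """Metadata record for one media element of a screen."""
    unit: str
    screen: str
    element: str
    description: str = ""
    file: str = ""
    format: str = ""
    size: str = ""
    media: str = ""        # text, graphic, sound, video ...
    creator: str = ""
    author: str = ""
    copyright: str = ""
    interaction: str = ""
    notes: str = ""

# Example entry (invented data):
logo = MediaElement(unit="1", screen="1/1", element="logo",
                    file="img/logo.jpg", format="JPG", media="graphic")
print(logo.format)  # JPG
```

A list of such records doubles as the media-element inventory of the whole storyboard, and can be exported to a table for the external orders.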


      Bibliography:

      [1] Tay Vaughan: Multimedia, Osborne McGraw-Hill, Berkeley, 1999. Hungarian edition: Panem Könyvkiadó, Budapest, 2002.

      [2] Rakaczkiné, dr Tóth K., Szabó J., Szentpétery Zsolt: Az e-tananyag fejlesztésének pedagógiai-távoktatási alapjai, SZIE, GTK Közép-Magyarországi Regionális Távoktatási Központ, Gödöllő, 2002.

      [3] Farkas Róbert: Az ipari forradalom elektronikus tananyag, Szakdolgozat, Oktatásinformatikus képzés, Prompt Oktatóközpont, Gödöllő, 2005.

      [4] Hutter O., Magyar G., Mlinarics J.: E-learning 2005, Műszaki Könyvkiadó, Budapest, 2005.

      [5] Kovács Ilma: Az elektronikus tanulásról, Holnap Kiadó, 2007.

      Planning and Implementation

      Learning objectives


      When you have completed this session, you should be able to

      • plan an e-learning curriculum.

      Planning and implementation

      The development of a multimedia curriculum is a complex project with a number of work phases. Key stages:

      Project preparation

      Analysis: needs, objectives

      Content, pedagogical, methodical planning – synopsis

      Feasibility study: spending plans, schedules

      Project start

      Project establishment, distribution of tasks, deadlines, responsibilities

      Content design

      writing the curriculum, proofreading

      Preparing graphic plans

      Preparing graphics – design elements

      Storyboard

      Implementation

      media elements

      LO-s integration, collating

      Documentation

      Testing, modification

      E-method proofreading, evaluation, modification

      master preparation, documentation, archiving

      Project end

      Final reports, project evaluation and project closure


      1. Analysis

      First of all, you have to consider the following:
      • Learning environment
      • Pedagogical objectives
      • Available tools and equipment
      • Resources
      • Pedagogical programme of the school

      2. Content, pedagogical and method planning

      In this section you should break down the general objectives of the development to learning-unit level. Formulate specific objectives and requirements, and build up each unit "screen by screen".

      Content planning

      E-curriculum structure

      Topic

      "sub"topic

      Learning unit

      LO-s

      • text

      • media elements (image, sound, movie)

      • animation (e.g. Flash)

      Learning unit structure

      Learning units should be designed so that a student can work through one in a single learning session, without a break: not more than 20-40 minutes, whether for self-learning or for classroom teaching.

      Learning unit elements:

      1) Introduction: one or two brief sentences describing the specific topic

      2) Motivation: motivating the student (for example, introducing the benefits of the new knowledge)

      3) Description: the new knowledge and skills

      4) Practising: tests, simulations or other interactive elements

      5) Summary: a brief summary of the topic

      Synopsis

      The synopsis is a short but comprehensive presentation of all the material of the curriculum: it defines the approximate estimated development costs and makes the development concept transparent.
      It is appropriate for teachers to show their didactic objectives and to verify that the digital learning material offers more to the selected target group than traditional methods do. This document is the first step in e-learning development.

      3. Writing curriculum, proofreading

      In the third phase the professional content work starts: writing the curriculum and preparing the assessment materials. Authors and assistants work together, and the finished curriculum is assessed by professional, linguistic and methodological proofreaders. Despite the many similarities, this differs from traditional curriculum writing: don't forget to think in screen-sized units.

      4. Preparing graphics – design elements

      Design is an important element of an electronic curriculum. It decides whether the texts will be readable and whether the graphics, images and illustrations truly serve the desired effect or, on the contrary, distract the learner's attention from the essence.

      5. Storyboard

      Storyboard is one of the most important step in planning. In this work phase you should define the final design, content, function elements.

      The storyboard comes from the film industry, where it describes the production in small details, shot by shot (here: screen by screen). The storyboard is based on the synopsis, but while the synopsis focuses on the content and the teaching objectives, the storyboard is a technical manual for the implementation.

      Based on the storyboard, the production staff (graphic designers, programmers) can carry out the production step by step without any further instruction.

      6. Preparing, editing media elements

      In this work phase the necessary graphics are produced and the images, narrations, music, sound, video etc. are digitized. The quality of the media elements is very important.

      An important part of this work phase is programming, because the interactive elements (tests, exercises, simulations) require programming work.

      During the preparation and collection of media elements, do not forget the copyright issues!

      7. Implementation, LO-s integration, documentation

      Building the electronic learning material from the elements, following the instructions of the storyboard, in a software development environment or framework system is called implementation. Integration - depending on the environment - often requires programming work, and in the final work phase some "fine tuning" may be needed: synchronizing the media objects with each other.

      The programmers (even if that is yourself!) are required to document the source code for subsequent amendments. At the end of the development it is important to produce the product documentation.

      8. Testing, e-method proofreading

      Testing is an important phase. All your efforts above are of little value if the product is not accessible and usable. Consideration of usability factors actually begins in the planning phase, but usability should be formally tested during prototyping and then again following full production. The importance of testing and of considering the usability factors cannot be over-stressed.
      These steps require:
      • knowing what standards should be aimed for (technical compliance and usability of the product being developed);
      • establishing means by which to measure or test that the standards and usability objectives have been achieved;
      • considering when to measure, and how the information from this will feed back into the development process to achieve the best outcomes most efficiently.

      9. Master preparation, archiving

      Finally, you should archive all the sources, for safety reasons and for subsequent amendments.

      A master copy should always be prepared for reproduction and later reuse, regardless of whether the product is a CD-ROM, a DVD, or published on the Internet.

      If we use an integrated framework system, archiving (regular backups) is automatic, but it is still important to preserve the sources.

      Finally…

      This chapter was about professional development work. If the development is well prepared, the result will be more than the sum of the contributors' work: it can become an effective product.


      Bibliography:

      [1] Tay Vaughan: Multimedia, Osborne McGraw-Hill, Berkeley, 1999. Hungarian edition: Panem Könyvkiadó, Budapest, 2002.

      [2] Rakaczkiné, dr Tóth K., Szabó J., Szentpétery Zsolt: Az e-tananyag fejlesztésének pedagógiai-távoktatási alapjai, SZIE, GTK Közép-Magyarországi Regionális Távoktatási Központ, Gödöllő, 2002.

      [3] Hutter O., Magyar G., Mlinarics J.: E-learning 2005, Műszaki Könyvkiadó, Budapest, 2005.

      [4] Kovács Ilma: Az elektronikus tanulásról, Holnap Kiadó, 2007