Comparing learner interactions and media use in two groups working in parallel within an audio-graphic environment

Sunday 5 September 2010 by Marie-Laure Betbeder, Maud Ciekanski, Thierry Chanier

"We recommend that research both mediate social interaction in the community, and also observe the community. For the former we need to explore student interactions distributed across space, time and media, and with data in a variety of formats." (Woolf et al., 2010: 53).

In this practical work, part of the Eurocall Workshop, we invite you to explore learners' interactions distributed across media, the relationships between individual and group usage of modes and modalities, and the relationships between multimodal strategies and the organization of collaborative work.

Learners have to complete a collaborative writing task in a foreign language within an audio-graphic environment: they can talk using their microphones, type in a text chat or in the shared word processor, vote using icons, etc. They work online at a distance (each in a different location, generally at home). During the online course, the tutor divided the whole class into two small groups and gave them the same task. They worked in two independent virtual rooms before coming back to the same one for the debriefing session. This practical work is organized around these two parallel sub-sessions.

Working environment: an Audio-Graphic Synchronous Environment (AGSE), namely Lyceum, which includes communication tools and shared editing tools.
Subjects: two small groups of French-speaking students (age 25-50), learners of English (false-beginner level), and one English-speaking tutor.
Learning goals: the course aimed at developing vocational English and competences in open distance teaching through spoken and written English. The task examined here, from the last (8th) online session: write a short report in the shared text editor evaluating Lyceum.
Data: two corpora, one per group, each with a screen-capture video and transcriptions of the multimodal interactions. These two corpora are part of the Learning and Teaching Corpus (LETEC) Copéas, where the data of the whole course are structured. All corpora are open access through the Mulce databank.
Analysis tool: Tatiana software.
Task recommended for the practical work: we provide you with extra data for displaying, in Tatiana, learners' usage of modes and modalities in one group. Apply similar display rules to the second group and try to answer these questions:

  • During this task, can you observe patterns in the usage of modes and modalities when participating in the writing task at an individual level? Do they correspond to monomodal or multimodal strategies?
  • If yes, can you notice individual variation?
  • Can you observe patterns at the group level? Do they correspond to monomodal or multimodal strategies?
  • If yes, are both group patterns the same?
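To make these questions concrete, here is a minimal Python sketch (our own illustration, not part of the corpus tooling) of how individual modality profiles could be tallied, assuming the transcript has been exported as (participant, modality) pairs; the 0.2 threshold is an arbitrary choice for the example:

```python
from collections import Counter

def modality_profile(acts):
    """acts: list of (participant, modality) pairs taken from a transcript.
    Returns, for each (participant, modality), the share of that participant's
    acts performed in that modality."""
    counts = Counter(acts)
    totals = Counter(p for p, _ in acts)
    return {(p, m): n / totals[p] for (p, m), n in counts.items()}

def is_multimodal(profile, participant, threshold=0.2):
    """Rough criterion: a participant counts as multimodal if at least two
    modalities each account for more than `threshold` of their acts."""
    shares = [v for (p, _), v in profile.items() if p == participant]
    return sum(1 for v in shares if v > threshold) >= 2
```

Comparing such profiles between the two groups is one possible route to the group-level questions.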


1) Create a new folder on your disk

Create a new folder on your disk (let us name it "mulce-workshop") where you will put everything: the Tatiana software and our data.

2) Install the Tatiana software

Download it from Tatiana (2010). Let us suppose you have a Windows system (Tatiana also works fine on a Mac, but names may slightly change). The file to download is named "". Unzip it. Keep only the "TatianaWin" folder and delete the other folders. Tatiana can be launched just by clicking "Tatiana.exe", which is inside "TatianaWin" (hence mulce-workshop\TatianaWin\Tatiana.exe). At this stage you will not be able to see much, since you have not yet included data for the analysis, but check that it launches (you need Java).
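If you prefer scripting steps 1 and 2, a hedged Python sketch follows; the archive name "tatiana.zip" is a placeholder of ours, since the actual file name is given on the download page:

```python
from pathlib import Path
import zipfile

def install_tatiana(workshop: str, archive: str = "tatiana.zip") -> Path:
    """Create the workshop folder and unzip the Tatiana archive into it."""
    root = Path(workshop)
    root.mkdir(parents=True, exist_ok=True)   # step 1: the "mulce-workshop" folder
    with zipfile.ZipFile(archive) as zf:      # step 2: unpack the software
        zf.extractall(root)
    return tatiana_exe(workshop)

def tatiana_exe(workshop: str) -> Path:
    """Where Tatiana.exe should sit after unzipping (Windows layout)."""
    return Path(workshop) / "TatianaWin" / "Tatiana.exe"
```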

See further explanations on Tatiana in another part of the workshop.

3) Collect corpora from Mulce platform / databank

  • Access the Mulce databank at Mulce-pf (2010). Register (it is free and quickly done, "User space" tab). In the "Browsing" tab, choose the link "Object selection". In the "Analysis Tools" frame, select "synchronized multimodal layout (Tatiana)". A list of corpora will appear. Download "mce-copeas-T8_lobby_s101-all" and "mce-copeas-T8_s102_lobby-all".
  • As you can see in the detailed description (accessible via the link on the right, near the download links), the videos are not included in the corpora since they are quite large (over 400 MB each). You have to download them separately. Come back to the "Browsing" tab and choose the link "Identified Resources". Then type the ids of both videos and download them: "mce-copeas-T8_s102_lobby-mov" and "mce-copeas-T8_lobby_s101-mov". Make sure that your video files, which have the MOV type, are associated with QuickTime and no other player. To check, just double-click on either video and see whether it is played with QuickTime. If not, change the file type associations (see the Windows help).
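If you want to verify that a downloaded file really is a QuickTime movie (and not, say, a truncated download), a quick header check can help. This heuristic is our own, not part of the workshop tooling:

```python
# QuickTime (.mov) files start with a 4-byte atom size followed by a 4-byte
# atom type such as b"ftyp" or b"moov".
KNOWN_TOP_ATOMS = {b"ftyp", b"moov", b"mdat", b"wide", b"free", b"skip"}

def looks_like_mov(header: bytes) -> bool:
    """True if the first 8 bytes of a file plausibly open a QuickTime movie."""
    return len(header) >= 8 and header[4:8] in KNOWN_TOP_ATOMS
```

Read the first 8 bytes with `open(path, "rb").read(8)` and pass them to the function.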

More explanation on the website interface is available in (Mulce, 2009).

4) Collect our first Tatiana analysis on these corpora

  • Just download the ZIP archive (1 MB) named "". Once unzipped, you get a folder named "TatianaWorkspace".
  • Replace the folder of the same name included in "TatianaWin" with this one (i.e. replace "mulce-workshop\TatianaWin\TatianaWorkspace" with "mulce-workshop\TatianaWorkspace").
  • Place video "" in the folder "mulce-workshop\TatianaWin\TatianaWorkspace\T8_1\traces\" and video "" in the folder "mulce-workshop\TatianaWin\TatianaWorkspace\T8_2\traces\".
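The folder layout above can also be scripted. A sketch, assuming the paths described in this section (the video file names are left as parameters, since they are given on the databank):

```python
from pathlib import Path
import shutil

def traces_dir(workshop: str, session: str) -> Path:
    """Target folder for a session's video, following the layout above."""
    return Path(workshop) / "TatianaWin" / "TatianaWorkspace" / session / "traces"

def place_video(workshop: str, session: str, video: str) -> None:
    """Move a downloaded video into the traces folder of a session (T8_1 or T8_2)."""
    dest = traces_dir(workshop, session)
    dest.mkdir(parents=True, exist_ok=True)   # create the folder if it is missing
    shutil.move(video, str(dest / Path(video).name))
```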

5) Explore our analysis in Tatiana

Well done! Now you can play with Tatiana. Launch it. Expand the corpus "T8_1" and double-click on the replayable T8_1. You should see the transcription in the right-hand part of the window.

  • Discover part of the way the first group worked: double-click on "" in the Tatiana window (folder "traces"); click on the "Synchronize" button located in the bottom-left part of the Tatiana window; the remote "Control Panel" appears, as well as the screen-capture video. With this panel you can play the video and see its alignment with the transcription. Look at (and listen to, on your audio system) the interactions at different times, for example around 48:40 (i.e. 48 min 40 s on the remote control panel, which corresponds to 12:14:00 in the transcription, the time of day at which this session occurred). Try several times in order to get a feel for what happened.
  • Discover the visualization rules associated with modalities. Select the replayable T8_1 (left part of the Tatiana window). In the "File" menu choose "Visualise as > Score sheet". Colored lines should appear in the right frame of the Tatiana window. You can see details or get an overview by zooming (right-click in the right frame and a choice for zooming in or out will appear). To discover the visualization rules associated with this display, click on the "Visualisation Rules" tab located at the bottom of the Tatiana window. They are quite straightforward to interpret:

    • The tutor's actions are displayed on the second line, 50 pt below the first line.
    • Then come learner AT1's actions (100 pt below the first line), followed by AT3's and AT6's.
    • Audio acts appear in blue, acts in the word processor in green, and acts in the text chat in red.
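These rules amount to a simple mapping from (participant, modality) to a position and a color. A Python sketch of that mapping, for readers who want to reason about it outside Tatiana; the 150 and 200 pt offsets for AT3 and AT6 are our assumption, extrapolating the 50 pt spacing stated for the tutor and AT1:

```python
ROW_OFFSET_PT = {"tutor": 50, "AT1": 100, "AT3": 150, "AT6": 200}
MODE_COLOR = {"audio": "blue", "word_processor": "green", "textchat": "red"}

def style(participant: str, mode: str):
    """(vertical offset below the first line in pt, color) for one act;
    participants not listed fall back to the first line (offset 0)."""
    return ROW_OFFSET_PT.get(participant, 0), MODE_COLOR[mode]
```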

Note that the first line corresponds to the actions of other participants, as well as to a special "participant" named "sil". When looking at the transcript you may have noticed "sil" acts. They correspond to long silences (from 3 seconds to several minutes) in the audio modality. We introduced this notation in order to have a continuous audio flow. Generally, during these silences, participants act in other modalities (Ciekanski & Chanier, 2008; Chanier & Vetter, 2006).
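The "sil" convention can be reproduced programmatically. A sketch, assuming the audio acts are available as sorted (start, end) pairs in seconds:

```python
def insert_silences(audio_acts, threshold=3.0):
    """audio_acts: sorted (start, end) pairs in seconds for the audio channel.
    Returns the same acts with ("sil", start, end) entries filling every gap of
    at least `threshold` seconds, so the audio flow becomes continuous."""
    out = []
    for i, (start, end) in enumerate(audio_acts):
        out.append(("act", start, end))
        if i + 1 < len(audio_acts):
            next_start = audio_acts[i + 1][0]
            if next_start - end >= threshold:
                out.append(("sil", end, next_start))
    return out
```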

6) Do your practical work

It is now time for you to look at what happened in the second group (corpus T8_2 in Tatiana), build your own visualization rules for the replayable T8_2, and answer the questions above.

7) Discover further analyses

In the Tatiana window, folder "analyses", double-click on "Categorie Principale" and "sous-categories". The Garrison, Anderson and Archer categories for interactions related to the critical inquiry process will appear. When you come back to the T8_1 transcript, you will see that almost every act has been categorized, in both T8_1 and T8_2.
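Once the categorized transcript is exported, the distribution of acts over the Garrison, Anderson and Archer (2000) phases of cognitive presence can be tallied with a few lines of Python; this sketch is our own illustration, assuming acts are available as (actor, phase) pairs:

```python
from collections import Counter

# The four phases of cognitive presence in Garrison, Anderson & Archer (2000).
PHASES = ["Triggering Event", "Exploration", "Integration", "Resolution"]

def phase_counts(acts):
    """acts: (actor, phase) pairs; returns how many acts fall in each phase."""
    c = Counter(phase for _, phase in acts if phase in PHASES)
    return {p: c.get(p, 0) for p in PHASES}
```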

You could also build visualization rules in order to display these categories and see who did what.

Finally, a small subset of acts categorized in this manner has been selected, and a very small graph built relating "Triggering Event" acts with "Exploration" and "Integration" ones. Double-click on "Graph1" on the left to make them appear. You can see the result when coming back to the window where the score sheet is displayed (also, confusingly, named "T8_1").

Thierry Chanier, 4th September, 2010


Part of this practical work, including the analysis, has been prepared with the help of Mario Laurent, first-year student in the Master Recherche Linguistique et Informatique.


Arnold, N. & Ducate, L. (2006). Future foreign language teachers’ social and cognitive collaboration in an online environment. Language Learning & Technology, 10 (1), pp. 42-66.

Chanier, T. & Vetter, A. (2006). "Multimodalité et expression en langue étrangère dans une plate-forme audio-synchrone" [Multimodality and foreign-language expression in an audio-synchronous platform]. Apprentissage des Langues et Systèmes d'Information et de Communication (Alsic), vol. 9, pp. 61-101.

Ciekanski, M. & Chanier, T. (2008). Developing online multimodal verbal communication to enhance the writing process in an audio-graphic conferencing environment. ReCALL, vol. 20 (2), Cambridge University Press, pp. 162-182. doi:10.1017/S0958344008000426

Garrison, D. R., Anderson, T., & Archer, W. (2000). Critical inquiry in a text-based environment: Computer conferencing in higher education. The Internet and Higher Education, 2(2-3), pp. 87-105.

Mulce (2009). Accès aux corpus et objets de Mulce-pf [Access to the corpora and objects of the Mulce platform]. Documentation on the Mulce project [website].

Mulce-pf (2010). Mulce platform, access to the databank of corpora [website].

Tatiana (2010). Code and executable of Tatiana [software]

Woolf, B.P., Shute, V., VanLehn, K. et al. (2010). A Roadmap for Education Technology. Report of the Global Resources for Online Education (GROE) [report], 80 p.


Creative Commons License