
Report on delegate feedback ‘Institutional Pragmatics’ – JISC Institutional Innovation/ Lifelong Learning & Workforce Engagement Programme SSBR Online Event, 12 November 2009

Compiled by Patsy Clarke
December 2009

Introduction

This summary report follows the same format as the reports of earlier events, drawing on feedback from the submitted forms. It covers how the platform was experienced, the types of equipment used, the nature of participation, what delegates valued, and what they would like to be different at future events. It concludes with a list of recommendations arising from selected feedback.

Feedback collection

Of the 60 delegates attending the event (excluding the support team and JISC managers), 13 submitted responses to the feedback form, a return rate of approximately 21.7%. These figures are similar to those from the SSBR July online event, when 62 attended and 16 responded. The feedback form that collected the information summarised in this report is included in the Appendix and can be accessed online at: http://ssbr1109.inin.jisc-ssbr.net/feedback-form
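For reference, the return rates quoted above follow directly from the attendance and response figures reported for each event:

\[
\text{November 2009: } \frac{13}{60} \approx 21.7\%, \qquad
\text{July 2009: } \frac{16}{62} \approx 25.8\%
\]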

Representation of programme phases

Table 1: Feedback forms submitted and attendance totals

Group                       Feedback forms   Totals attended
IIN (Phase 2 projects)      8                37
LLLWFD (Phase 3 projects)   5                23
TOTAL                       13               60

Booking and joining instructions

All responded favourably about booking, though one respondent left this item blank.

Only one submission responded negatively about joining instructions, and this referred to the instructions for joining Elluminate: they were easy to miss if delegates expected to receive them by email and were not checking the SSBR page.

Other comments provided about booking and joining instructions were favourable:

  • ‘Well organised and easy to use’
  • ‘Easy to understand and access’
  • ‘Excellent’

Using the Elluminate audiographic platform

‘Very good impression from my first online event. Thought it was great’.
As for the online event in July, a series of familiarisation sessions was offered to meet the needs of delegates who might be unfamiliar with the Elluminate audiographic platform. Of the 60 delegates who attended the event, 5 signed up for a familiarisation session. Drawing only on the feedback forms, the figures in Table 2 show the levels of familiarity with Elluminate together with the proportion who signed up for familiarisation sessions. They indicate that, of those who responded, only one delegate new to the platform took up the opportunity for practice in a hands-on interactive session, while two delegates took up the opportunity despite previous experience with the platform.

The proportion of Elluminate novices has reversed since July: only 3 of the respondents were new to the platform this time, compared with 11 novices in July.

Table 2 Elluminate platform familiarity and sign ups to training sessions

Level      No.
Novice     3 (1 signed up for a training session)
Familiar   10 (2 signed up for training sessions)
Total      13

Based on their experience of the event, 10 of the 13 who submitted feedback would consider using Elluminate for their future project meetings or assemblies, including the three Elluminate novices who responded.

Favourable comments about the experience of using the platform in this event included:

  • the experience was much better than previous attempts to use the platform
  • there was less sound interference and echo than at the previous event
  • it was a good tool and would be recommended for future university events and meetings
  • while it was more difficult than a ‘round the table’ conversation, it helped prevent the conversation being dominated by a few
  • there was a friendly, encouraging atmosphere

Despite comments about improved experiences, including with sound, the adverse comments on the platform focused mainly on sound quality. These problems may also depend on individual equipment and bandwidth, with some resulting from ‘lag’.

At one extreme, one delegate described the experience as ‘mostly a waste of time’, given the difficulty of following the event owing to audio quality and continuity issues. Another delegate, whose sound kept breaking up, found it difficult to continue beyond half a day with the sound cutting out as it did.

Discouraging the use of external speakers was a recommendation made after the July event; implementing this, together with reminding delegates to switch off their microphones when not in use, helped reduce some of the more extreme audio problems previously experienced.

Although more delegates made use of webcams at this event than at the July event, there was only one negative comment on video, namely that it was ‘less than useless’, though no explanation was given as to whether this related to receiving or transmitting.

Tables 3–5 outline the types of connectivity as well as the range of computer, audio and video equipment used by delegates. Some delegates used more than one type of connectivity or equipment, and others did not report fully what they used, so the totals are not consistent with the total number who responded.

Table 3 Audio and video equipment types

Equipment   Type           No.
Speakers    Built-in       3
            External       3
            Mini-earbuds   1
Mic         Built-in       3
            USB            2
            Mini-jack      2
Headset     USB            5
Webcam      Built-in       3
            USB            3

Table 4 Internet connectivity

Connectivity   Location     No.
WiFi           University   4
               Home         1
Wired          University   5
               Home         1

Table 5 Computer type

Computer   No.
Laptop     7
Mac        5
Desktop    4

Participation methods and level

Table 6 illustrates that all of those who submitted feedback forms had participated via text chat during the event, with more than half contributing via voice. Close to half presented slide sessions. All five of the Phase 3 project delegates who responded used a webcam; one of the Phase 2 project delegates who responded used one.

Table 6 Participation methods

Participation method   No.
Text chat              13
Voice                  7
Presented              6
Webcam                 6
Other                  2

Session participation

In the tables that follow all totals exclude support team members and JISC managers.

Table 7 includes the total number of delegates who attended the various sessions, as well as the number of those who submitted feedback forms. Sessions are presented in the order in which they occurred during the event.

The totals reflecting participation on the day in the break-out sessions are illustrated in Table 8. LLLWFD (Phase 3) projects were encouraged to gather in a specific break-out room (Room 4), where the session emphasis specifically targeted LLLWFD projects; attendance in the other break-out rooms was organised by random allocation.

The morning break-out session provided opportunities for presentations by projects, with feedback and discussion, while the later one focused on contributors to change; the LLLWFD session included an introduction to the newly developed and populated database of emerging project themes.

Table 7 Session participation

Session                                              Total attended   No. submitting feedback
Keynote: Managing effective change in institutions   56               13
Unlocking the Potential of JISC Projects                              11
Coffee Break with radio                              ?                5
Assembly Bazaar: project feedback & discussion       55               8
Radio Interviews                                     ?                3
Managing JISC Programmes                             51               12
Workshop: Analysing change                           54               10
Plenary session                                      45               12
Wrap up & close                                                       10
Social event                                         6?               1

Table 8 Attendance at break-out sessions

Session 1: Assembly Bazaar: project feedback & discussion

Room     No. attended
Room 2   27
Room 3   14
Room 4   14

Session 2: Workshop: Analysing change

Room     No. attended
Room 1   16
Room 2   16
Room 3   9
Room 4   13

Level of engagement

To provide an approximation of the overall level of engagement as perceived by delegates, they rated this on a 5-point scale from 1 (very weak) to 5 (very strong). Figure 1 indicates that far more delegates reported stronger rather than weaker levels of engagement, with a median score of 4 compared with a median of 2.5 in response to the same question after the July online event.

Figure 1: Level of engagement reported by delegates in feedback forms

Most valued at the event

‘I thought it was a great intro to an online conference. Look forward to the next one’.
Much of the value reported by those who submitted feedback focused on the opportunity to ‘meet’ other projects: hearing about the range of experiences and activities and finding commonalities, including common challenges, that could lead to further linkages.

Other valued aspects reported were:

  • The workshop guiding questions, which helped focus thinking and generated more involvement
  • Interactivity, including ongoing information from facilitators who were able to answer questions in the chat during discussions
  • The opportunity to present, which resulted in more active involvement and opened up links for follow-up communication
  • The opportunity to build confidence with the technology
  • Input from keynote speakers
  • Clarification from JISC, including guidance from Lawrie Phipps
  • Useful and interesting information about the final report

Not everyone gained value from the event. The delegate who found the day a ‘poor use of time’ considered it unhelpful that the event was aimed at ‘fulfilling abstractions of problems seemingly just for JISC reports’.

Another delegate acknowledged the difficulty of designing for a balance of activity, as ‘too much interaction is hard and no interaction is simply boring’.

Suggestions for future events

In response to what delegates would like to be different at future events, the requests in the feedback included:

Content

  • More information regarding final and completion reports

Format

  • More opportunities/areas for ad hoc/established interest groups to meet and discuss relevant topics, ‘e.g. flocking!’
  • More time for breakout sessions

Presentations

  • More and better graphics
  • The ability to move through the slides

Processes

  • An agreed procedure for moving participants from break-out rooms

Media

  • Higher quality audio

Conclusions and recommendations

The predominantly positive feedback from those who submitted feedback forms confirms that the event organisers successfully incorporated suggestions from earlier events. In addition, despite remaining audio quality issues for some delegates, there is evidence of increased confidence and familiarity with Elluminate as an event facilitation platform. Structuring discussion sessions around guiding questions also contributed to favourable feedback.

As opportunities for networking and making connections across projects continue to be the most highly valued aspect of events, online and face-to-face events should continue to incorporate both structured and informal opportunities for such activities. Although one phase of projects is reaching completion, new connections may still contribute to post-programme collaborations with respect to future funding opportunities.

Another valued topic for future events, particularly as projects head towards completion, is managing outputs, including diverse reporting mechanisms.

Appendix: Online feedback form
(Available online at http://ssbr1109.inin.jisc-ssbr.net/feedback-form)