
Deaf and Hard-of-Hearing Assistive Technology in the Classroom

DARLENE: Welcome, everyone. It is Darlene McLennan here. On behalf of ADCET and ATEND I would like to welcome you all to this webinar, Using Technology in the Classroom for Deaf Students. It is fantastic to have my friend and colleague, Gary, presenting to us today. Gary, as many of you would know, is a national disability coordination officer based at Deakin University in Victoria. Gary is deaf and has been in the profession for over 30 years. It is fantastic to have his expertise online today.

A few housekeeping items before we start. We are very fortunate, today we are trialling having an Auslan interpreter on screen as well as live captioning, which we normally have. So I would like to say thank you to Rebekah from Bradley Reporting for the live captioning and also to Ryan from Auslan Services for the Auslan interpreting. We hope this provides people with a first-hand experience of some of the things that Gary will be talking to us about today.

A couple of other issues. Tasmania is experiencing a bit of an Internet issue at the moment. I am hoping we won't disconnect, but if we do, we won't be far away. It normally just takes a minute to reconnect, so please stay with us. Also, unfortunately, the GoToMeeting webinar platform is not as accessible as we would desire for screen readers. If you have questions or comments, please e-mail jane.hawkeswood@utas.edu.au. All participants have been muted. This is to ensure as little background noise as possible during the webinar. The presentation will run for around 50 minutes. We received questions prior to this webinar and Gary is going to answer those at the beginning. But please feel free, throughout the presentation, to enter your questions into the question pod or chat box and I will choose a few at the end of the presentation if we have time. If you have any difficulties throughout the webinar, please feel free to e-mail Jane at jane.hawkeswood@utas.edu.au and she will try to sort you out. Alright, with great pleasure I would like to hand over to Gary. Thanks, Gary.

GARY: Excuse me, I have been sitting here for five minutes waiting to cough. I didn't want to interrupt Darlene's talk. But welcome to this talk. Just to explain, I am deaf. You probably can hear that in my voice. Sometimes when I get excited I start to speak a bit loud, I start to speak a bit high pitched as well. If any of you suffer a hearing loss as a result of listening to me today, at least the presentation will give you some tips on how to deal with it.

Moving on to the first slide. You will hear from me, from time to time, I'll say next slide, next slide, next slide. Darlene controls the slides. What I thought I would do today is talk about deafness, talk about the classroom, talk about some of the technologies that can help deaf and hard-of-hearing people in the classroom. I have also got 40 questions here which I received yesterday. I will try to incorporate those questions into my talk.

As an example, the first question somebody asked me was: when people using hearing aids or cochlear implants require additional services such as transcriptions and captions, is there a criteria we can use to assess them? Now, I will answer that in just a second, because when I am talking about deafness in this talk, I am talking about deaf people . . . I am talking about hard-of-hearing people who use hearing aids or cochlear implants, I am talking about people who use a combination of Auslan and speech, and some prefer captioning, some prefer sound, some prefer a combination of all of those. No-one is the same. They all have their own preference. If you hear me pause, that is only because I am giving the interpreter a bit of time to catch up.

So, in answer to that question: if there is any criterion, the only criterion you can use is to talk to the deaf or hard-of-hearing person, and they will tell you what they need. Some people prefer transcripts over captioning, some prefer captioning, some prefer Auslan; the only criterion is to put the deaf or hard-of-hearing person in control of deciding what works best for them.

Next slide. Thank you. Talking about technology, what would I know? I was thinking about this last night, because for the last couple of months I basically had it set in my mind what I was going to talk about, and then last night I was thinking about a time when I started a program in Adelaide. It was a mental health program for young people who are deaf or hard of hearing. The goal of that program was to promote positive mental health in young people who are deaf or hard of hearing. We had an issue because a lot of these young people were living in the country and were isolated. The young people said they didn't have a lot of social outlets and found it hard to be included. We wanted to find a way to include those young people living in the country in the activities we had in the city. So we developed a relationship with Adelaide TAFE. They had a video conferencing unit. We wanted to link up the people in the country, through a TAFE in the country, to the TAFE in Adelaide so we could video conference with them. We could do mentoring, youth planning, teach Auslan to families if we wanted to, all that sort of stuff. But back then, we couldn't do what we are doing now. The Internet was too slow. So you had to do video conferencing through the phone, basically. So we spoke to Adelaide TAFE, and what happened is they wanted a big TV for their video conferencing, so we agreed to buy them a big TV if they gave us 12 months of free video conferencing. And they did. But back then, video conferencing ran over phone lines. For it to work properly, at anything like the quality you see with Ryan now, you needed three phone lines, and even then the picture was very, very jerky. And three phone lines for an hour was about $600 worth. So it was very expensive. Luckily, we had a . . . and we experimented with doing this over two phone lines. And although it was a bit jerky, we found that if the people at each end adapted, we could communicate.

So we did all sorts of things. We had a youth group that planned a social event at the end of the year through video conferencing, we had a family in Port Pirie where we taught them Auslan to communicate better with their daughter, and we mentored young people in the country through this video conferencing. So this was the first time I began to realise that, yes, we can use technology to provide access to people who are deaf or hard of hearing.

Now, moving along, when I moved to Ballarat in 2003, we discovered that it was very difficult to pay for interpreters, because for interpreters to travel to Ballarat, which is one hour from Melbourne, it was double the cost of getting an interpreter in Melbourne. Then if I moved out to Warrnambool or Portland or anywhere like that, it was triple or quadruple the cost. So I did a bit of work with Auslan Services to work out if we could do the interpreting remotely, because by doing that you can cut the cost of interpreting. You only have to pay for the interpreting itself - no driving, no time on the road - and you can get two interpreters, or one interpreter, for the same cost as what you were paying in Melbourne. So again, I started to experiment to find out the best way to do it, and did it with the Internet - fixed-line Internet and, what do you call it, the dongle, 3G. We were successful with that as well.

And then the final part is - I will let the interpreter catch up - the final part. One day I was exploring the Internet and I found the Bristol University website. This is 2004, long before we knew what the Internet was going to do, what it does now, and on the Bristol University page they had the news, and they basically had deaf people on the screen, filming themselves, signing the news. And it showed very clearly. Then I started to think, yes, there are things you can do: you can teach people by doing this, you can add captions to this, and even if you can't get interpreters out into the country, you can prerecord stuff and make it accessible.

So - my head going all over the place, I know - I thought technology has a place in providing access to people who are deaf and hard of hearing. And basically, it just spiralled from there.

I would like you to remember that access is not an option. It is compulsory. So that's my message. I don't like to hear, "Can't afford it" because right now at this point in time there is absolutely no reason why access cannot be provided to people who are deaf and hard of hearing. Years ago, maybe, but not now.

Just adding on, I had a question here about technology yesterday. Somebody asked me about the Echo 360 system. I have to tell you, up until yesterday I knew absolutely nothing about the Echo 360 system and I had to talk to Jason and Barney from Bradley Reporting to find out what it was. The question was basically about how you could get stuff on Echo 360 captioned. Jason said it was about sending your audio files to a captioner for captioning. What that told me, although I can't answer the question very well, is that technology is constantly evolving. I don't know everything. There is a lot that I don't know. So I encourage you today, for anything I talk about, to talk about it among yourselves and share the information. Alright.

Next slide, please. Now, I want to define the classroom. A lot of people think the classroom is a fixed thing: the teacher talking, people listening and learning - the teacher tells, you listen, you learn. But, of course, we all know a classroom isn't just that. We have an example here of an online virtual classroom, and we are providing access through technology to an Auslan interpreter and captioning at the same time. I have a mobile phone, and if somebody asks questions, Darlene will send the questions to my mobile phone. There is a lot of technology in place. The way the classroom operates is very different from what it was in the past.

Now, a very important message I want to give you is that it's not just about hearing the instructor. A lot of the stuff I will talk about today is about making sure that the deaf student gets access not only to the instructor but also to the discussions that are happening, to their peer learning as well. That demands a lot of flexibility; we have to be innovative and creative. So in our heads, just think about the classroom and the different ways we learn. We have lecture-type settings, we have professional development with people and tables all over the room, we have online learning, all of that sort of stuff.

Next slide. I want to talk a bit about Auslan through VRI. Now, there are two things people talk about. One is VRI, which means video remote interpreting. What you see Ryan doing is video remote interpreting. The other is VRS, the video relay service. That is a service offered by the National Relay Service where you can use Auslan to access the phone. But you can't use it for this sort of thing; you can only use it for phone calls. So with VRS and VRI, there is a difference - I just wanted to make that clear. Obviously, VRI is done through different platforms: it is done through Skype, it is done through GoToMeeting, and I am sure there is other webinar software you can use as well. We've lost the captioner? Should I stop? It has come back.

With VRI, it can be on your computer, so you can be sitting at your big desktop computer and use VRI that way. You can do it on a mobile device, as I said. You can do it one on one, or you can do it in groups. We are talking about the classroom here, so think about when you are in a classroom: there might be four deaf people using Auslan in the classroom, maybe you are in Warrnambool and you have decided to use VRI. It is not really a good idea to have four or five people gathered around one computer; it is very inconvenient. So what you can do is connect a data projector to the computer the VRI is coming through and beam the interpreter onto a screen so it is large and everyone can see it.

Now, I want you to also think that sometimes within a teaching session people have breakout groups. They have to go to their table and talk, and it is a bit hard to take your computer everywhere with you. So if you can get your mobile, log in on your mobile, use the wireless connection and then hold your mobile, or put your mobile or iPad on a stand, and within that group discussion you can access the interpreter as well. So there are different ways to use VRI. Think about the classroom, teaching techniques, all of that sort of stuff. And also, say for example the teacher decides to take everybody outside: if you can get a good 4G Internet connection it can work, but that can be variable. So I don't always recommend it, but if you have to, you can.

Remember, with the Internet we have in Australia, which I think we are ranked number 60 in the world at the moment, it can be . . . So we need an NBN. So vote Labor at the next election; and Bill Shorten didn't pay me for that.

Next slide. Talking about live remote captioning, you can do that in two ways. You can have a captioner present in the room, typing on a stenograph machine, with the captioning going up on a big screen; or you can do it like we are doing now, where Rebekah, the captioner, is based somewhere else - could be in Sydney, could be in the Barossa Valley, could be in Brisbane - and she listens in on the phone or through the GoToMeeting platform and she captions. That is basically what she is doing now. Again, live remote captioning works through GoToMeeting, and I am sure there are other platforms as well, and, like VRI, you can use a data projector to put it on screen. It needs good reliable Internet, it needs good audio - all the things you need for VRI.

But you can use this in group settings as well, because you can log in to GoToMeeting on your phone, on your iPad, or whatever. So if the session has breakout groups or whatever, you can organise for that person to log into GoToMeeting on their iPad or their mobile phone, so when they are in that group, listening to people, they have access to the captioning as well. That gives you flexibility in how you use the captioner. A lot of people think captioning is only on a screen at the front, very one way, but you can use live remote captioning in different situations as well.

Next slide. Now, with live remote captioning, it is very similar to VRI: a fixed computer, one on one, more than one deaf person, group discussion, using it on your mobile devices, and you can use 4G. But I can tell you, Bradley Reporting have nightmares about 4G. They don't like it, it is not reliable, and like I said with VRI, things can go wrong - we need an NBN.

Next slide. Now, I am talking about VRI and captioning. It is important that you prepare. So if you are a teacher, or anybody preparing a learning activity for people who are deaf or hard of hearing, you need to think about everything that is going to happen. The first thing you need to do is test. Make sure the Internet is reliable, make sure the audio is reliable; before you do anything, you test and you trial and you make sure it can happen. That can be time consuming, but in the long run everybody benefits, because if you just go in cold you will get caught out. Yesterday, for example, Darlene, we were testing it, couldn't get the speaker going, and I had to get my son to come in and fix it up. You do all the testing beforehand to iron out all of the bugs so that when you come in on the day everything goes smoothly - it all comes with good preparation. So check the Internet, check the audio, have a trial run. Get the material to the interpreter and the captioner so they can see what is going on; it helps them to sign better and caption better.

Also, one of the things you must do is source another screen. One of the biggest headaches is when the slides, video or teaching material take up the only screen, because then there is nowhere to put the captioning or the interpreter. So have a second screen they can go on. And most of all, know how things are being taught: are they going outside, are there group discussions, where are people located around the room, how many tables, all of that sort of thing. You have to be able to work out what barriers there might be and problem-solve to overcome them - the layout of the room, the audio, the Internet all need to work to make it happen.

Technology will let you down sometimes. If anything will let you down, it will be technology, so always have a plan B. There are lots of plan Bs. I won't elaborate on them. I know it is easier said than done, but have a plan B.

Next slide. I am going to get a drink. I am getting a bit froggy, so give me a second.

With VRI and captioning, the sound system is very, very important. A lot of modern offices, conference facilities, schools and universities have really, really good sound systems: they have speakers in the room, microphones at the tables, microphones at the students' desks, all of that sort of thing. Those sorts of things mean there is a good feed to the interpreter or the captioner. Now, you have to remember that mostly the feed comes through one laptop. When you have got that sort of sound system, the volume is the same all over the room, which is great for live remote captioning and VRI. But it doesn't always happen that way. Sometimes things happen in school halls, sometimes people book community centres, and there is no technology there. There are lots of differences between venues which can cause problems, so you need to have a look at the sound system and work out what you need to overcome some of those barriers. If the interpreter can't hear properly, or the captioner can't hear properly, everything is going to fall down. So make sure the sound system, the audio feed, is good.

Now, I have a solution for that, one solution. I am going to explain it in the next slide.

The next slide. Now, the Roger pen. Some of you who are deaf or hard of hearing will know about the Roger pen. Basically, the Roger pen is an FM system that has bluetooth capability as well. That means the Roger pen can be programmed or synced to different devices. So what can happen is that the student wears the pen around their neck. The teacher, traditionally, has a microphone that goes on their . . . and when they talk the sound goes directly to the Roger pen and then through the hearing aid or cochlear implant, so they hear things very clearly. It cuts out the background noise. It is very good for people who are bimodal - a cochlear implant on one side, a hearing aid on the other - and it is very good for people who have two different hearing aids, say a Phonak in one ear and a Siemens in the other. So a lot of hard-of-hearing people use a Roger pen. They might not use Auslan or whatever; they like to listen. But the Roger pen also allows you to provide a feed to interpreters and captioners who are being delivered through a laptop or mobile device.

Now, I will explain. I want you to imagine that you are in a school hall. There is Internet there. People are sitting at tables, five or six to a table. At the front you have a number of speakers talking about different things - it might be about employment, it might be about funding, it might be instruction in some professional development session - and of course there are activities that happen. So people ask questions; the people at the back are asking questions, answering questions, etc. The microphone traditionally is at the front where the main speakers are, and that means when the people at the back are asking questions, the captioner and the interpreter can't hear them. And that's frustrating for a deaf or hard-of-hearing person, because they can't get access to all of the learning and all of the discussions. So what you can do with the Roger pen is program it to work with your computer. You can get a little sound card which goes into the USB port of your computer, you place the Roger pen near that sound card, and then you can have roving microphones, or a number of microphones placed on the tables or passed around the tables, and the sound from the back goes directly to the laptop through the Roger pen. That gives the interpreter and the captioner a good audio feed from anywhere in the room. I worked out this system with Word of Mouth Technology, and I encourage you to have a look at the Word of Mouth Technology website and talk to Andrew Willis there about this system and how you can set it up.

Next slide. Now, I am going to talk a bit here about assistive listening devices. Some people like to listen. Some like to listen and receive captioning at the same time, or listen and receive interpreting at the same time. Say there is a video being shown - there is music, there are explosions or whatever. They like to have access to that sort of sound, they like to have access to the sound that is going on around the room, to get a feel of the atmosphere, but they might not necessarily be able to hear everything that is being said, so they need captioning or interpreting to make sure they get full access to communication. But some people just prefer to listen. They can't sign, they don't want captioning, and they hear quite okay with assistive listening devices. So I am going to talk about a few of them.

Next slide. Now, FM systems. A basic FM system is simply a microphone worn by the speaker, while the listener has a receiver which is worn around their neck or attached to their hearing aids, so when the speaker speaks, the sound goes directly to the listener. The good thing about an FM system is you can control the sound - how loud you want it - it cuts out the background noise and it allows you to hear well. The complication is when you have group situations; a traditional FM system is not so good there. It is merely one-way, teacher to student, and that misses the discussion.

Next slide. Now, there are bluetooth systems as well. The Roger pen system I told you about, with bluetooth to the cochlear implant, is an example of a bluetooth system. With bluetooth, as I explained, it can give you access to group situations, because you can use a roving microphone and whoever is speaking into that microphone is synced or linked to the Roger pen, so that allows people to follow both the presenter and the discussion. Now, I can tell you, a Roger pen is not always easy to set up, it is not always user friendly and it takes a bit of time to learn. But once you get it going well, it is a very effective system. Sometimes you may only have one or two microphones, so it means you are running all over the room handing people microphones, and it takes a bit of coordination.

Next slide. Now, traditionally we have the loop system. You know the loop - lecture theatres and movie theatres have a loop system. The old hearing aids had a T-switch on the back; you turn on the T-switch and the loop system feeds the sound directly to the hearing aid. It allows you to hear the presenter, and if you have got a roving microphone, it means you can hear the people talking as well. But modern technology today is making it almost obsolete. Also, many hearing aids today don't have a T-switch. So there will be a time when loops won't be there. They need maintenance, and they are often not turned on or not working. With that old system, everyone used to scream out, "Is the loop working? It's not working!" It used to be a nightmare. It served us well in years gone by, but I don't think we will see much of it in the future.

Next slide. Talking about soundfield amplification: traditionally, you will see speakers all over the room. On the tables there are little microphones, so when you speak, that little microphone feeds the speakers all over the room, and it means anywhere you sit the sound is the same. It is great for live remote captioning and video remote interpreting, because you can set up anywhere in the room and the captioner or interpreter will receive the sound at the same volume as everybody else. You can't control the amplification individually, so people with a mild to moderate hearing loss can hear it well wherever they sit in the room, but if you have got a more profound hearing loss it is not so effective. But those soundfield systems that are part of venues can be really effective, and they can help some hard-of-hearing people as well. It is a good idea, whenever you book a venue, to find out if they have that system.

Next slide. Key pointers, as I bring it to an end. At the end of these I will talk a bit - because I probably should have put this into the presentation - about captioning of online material, like YouTube videos and so on. So I will go through these key pointers and then I will talk about captioning of online material. The first thing to remember is every person is different; everybody will have a preference. Do not, in your head, think that there is one solution for everybody - there is not. There are standard things that you can do to make things better, but the process should be guided by the person who is deaf or hard of hearing. They know what works best for them. Sometimes they may not know about stuff, but refer them to me or other people and we can talk to them about what those solutions might be. But everybody is different; there is no set way of doing things. Then there are combinations of solutions: a bit of captioning, a bit of interpreting, a bit of sound, all different stuff, so keep your mind open. Good reliable Internet, good reliable audio - I cannot emphasise how important that is. You need to really investigate how something is being taught. You have to have a look at all of the strategies being used - one on one, group discussions, videos, online stuff - because all of that will have access issues for a person who is deaf or hard of hearing. Technology offers solutions, but unless you really know how something is being taught it is very hard to plan how to use that technology. You can't solve every communication problem. You can improve inclusion, you can improve access, but there will always be some barriers. So if you can't make everything perfect, don't kill yourself, don't pull your hair out, don't stress. You can make it really, really good, but there will always be some barriers. And test beforehand, trial beforehand.

The assistive listening devices I told you about are very, very good, but their effectiveness depends on the level of the hearing loss. It also depends on the ability of the person to recognise speech. Some people with a profound or severe hearing loss can, with amplification, hear quite well with assistive listening devices. Some people with cochlear implants, for example, hear better than others - the implant is more effective for some than others - so always trial the assistive listening devices. Some people will want them with captioning at the same time, or with an interpreter at the same time.

And the last bit of advice I can give you is that 4G, a mobile modem can work, but use it only as a last resort.

Moving on, just quickly, I want to talk about captioning of online material, because this is something which I probably should have put into this presentation. We know that more and more education is going online and this year I cannot tell you how many deaf or hard-of-hearing people have contacted me and said, "I've got videos that are part of my assessment. There are no captions. I have no access. I am behind with my work. I have asked the university or the TAFE to give me captioning or transcripts or so on or whatever", and they're not getting them, or they're getting them a long time after and that means they are under a lot of stress to keep up with work. So there are solutions to provide captioning.

The first of them that I can recommend is that if you are going to use a video, make sure it's captioned. If it's not captioned, don't use it. Make it a policy - and we are pushing this, I am pushing this - that any video that goes online must be accessible. I see no reason why a deaf or hard-of-hearing person should have to go through this stress of trying to get access to stuff that everybody else does when the simple solution is to make sure that the videos are captioned.

But if you must use a video that isn't captioned, there is a way to get it captioned very quickly. There are issues with it - issues like copyright, who owns the video, all of that sort of stuff - so it is not always straightforward, but if you need to, you can get things captioned reasonably quickly, within 24 hours, perhaps.

Now, with Bradley Reporting, for example, you can basically take the link for a video or a YouTube video and send it off to Bradley Reporting. They get one of their stenographers to type out a transcript of what is happening on the video, with timing, and you pay Bradley Reporting for the stenographer's time. Then you get that transcript, and you can either put it on the video yourself using the YouTube captioning tool or Amara, and I believe Echo 360 has the ability to put captions on as well. It can be done, and it can be done quickly, and no deaf or hard-of-hearing person should be waiting two, three, four, five weeks to get captions. I have had one example of a deaf student who has 18 videos - 18 - and not one of them is captioned. And I have had one situation where the student has been asked to go online to SBS, for example, and there is a video they want them to watch, and SBS online streaming doesn't have captions. So we have a lot of work to do in terms of quality, quantity and making sure all of this online stuff is accessible.
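To make that step concrete, here is a minimal sketch, in Python, of what sits between receiving a timed transcript and uploading captions: turning a list of timed segments into a standard SRT caption file, a format that the YouTube caption uploader, Amara and most video players accept. The segment times and text below are made up purely for illustration; a real transcript from a captioning service would supply them.

```python
# Minimal sketch: turn a timed transcript into an SRT caption file.
# Assumes you already have (start_seconds, end_seconds, text) segments,
# for example from a transcript supplied by a captioning service.

def srt_timestamp(seconds: float) -> str:
    """Format seconds as the SRT timestamp HH:MM:SS,mmm."""
    millis = int(round(seconds * 1000))
    hours, rest = divmod(millis, 3_600_000)
    minutes, rest = divmod(rest, 60_000)
    secs, millis = divmod(rest, 1_000)
    return f"{hours:02d}:{minutes:02d}:{secs:02d},{millis:03d}"

def write_srt(segments, path):
    """Write [(start, end, text), ...] out as an SRT file."""
    with open(path, "w", encoding="utf-8") as f:
        for index, (start, end, text) in enumerate(segments, start=1):
            f.write(f"{index}\n")
            f.write(f"{srt_timestamp(start)} --> {srt_timestamp(end)}\n")
            f.write(f"{text}\n\n")

# Hypothetical segments for illustration only.
segments = [
    (0.0, 3.5, "Welcome, everyone, to this webinar."),
    (3.5, 7.0, "Today we are talking about access in the classroom."),
]
write_srt(segments, "lecture_captions.srt")
```

Because the captions live in a separate .srt file uploaded alongside the video rather than being burned into the picture, they stay easy to correct later.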

This has been a beef of mine for a very long time, because we recognised way back in 2005 and 2006, as people started to put things online - audio files, videos - that none of it was accessible to a deaf person. So we need, as a collective, to really work hard on this to make it better. And on that note, I will end my presentation. Thank you.

DARLENE: Thank you, Gary. That's fantastic. So you answered the questions that were received prior to the webinar.

GARY: Yes, in several aspects.

DARLENE: That is fine. I don't know if anybody has any other questions; if so, please put them in the chat pod and I will ask Gary. While we are waiting for that: we are, with the support of Gary and another disability advisor within the university sector, working to develop some content for the ADCET website around captioning videos and so forth, so that will be out in the ADCET newsletter in the near future. So I thank Gary for his contribution to that.

GARY: You're welcome. I've enjoyed it.

DARLENE: Yes. I hope today has been valuable - getting the wisdom from Gary is really important, but so is the practical example that we've shown today. For us, we do live caption every webinar we host. We receive the transcript from Bradley Reporting very soon after the webinar ends. We are then able to put that transcription into the video, so all our webinars are put up online with the captioning as part of the videos. It is a very easy process for us. We lacked a bit of confidence at the beginning, but it has become very easy.

Today was a bit of a stress too - we had something new again, having the Auslan interpreting online - but once again we hope we have shown it can be done, and done effectively. There is just one question: what is considered a reasonable time for the return of transcriptions? Gary, have you got any ideas on what would be a reasonable turnaround for a transcription?

GARY: I have spoken to a few people who provide this service. They say they can get it done in 24 to 48 hours. Once you have got that transcript, you then need to add it to the actual video. That might take another 24 hours, but with Bradley Reporting, for example, you can get them to do the transcript for you and they will add the transcript and captions to the video as well. And there are other companies that might do that - Ai Media might do it, Captioning Studio might do it. I would say no more than one week, and 48 hours is possible.

DARLENE: Okay. Thank you. Do you know of any free or easy way to caption videos yourself? We know there is the YouTube talk and type, but is there any other software that is available? Do you know of any?

GARY: Yes, you've got YouTube. There are two tools on YouTube. One is auto-generated captions - leave it well alone unless you want a comedy show. They are very funny to watch, because of the way things get translated . . . But YouTube also has a manual tool. If you have got the time for it, you can cut and paste the transcript in and do the timing yourself. You can use . . . Amara, that is another tool to add captions. If you want to do a basic thing you can use Movie Maker or iMovie - if you can use a word processor, if you can use Word, you can learn how to do captioning on any basic software like Movie Maker or iMovie. And on the ADCET website I have a booklet which talks about preparing accessible media; it shows you how much text to put on each line for captioning, all that sort of stuff, and it talks about YouTube captioning. It tells you how to do that. Contact Darlene and have a look.
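As a rough illustration of that line-length advice, here is a minimal sketch that breaks a plain transcript into caption-sized blocks before you paste them into a manual captioning tool and time them by hand. The figures of roughly 37 characters per line and two lines per caption are common captioning guidelines assumed here for the example, not requirements of YouTube, Amara or the booklet mentioned above.

```python
# Minimal sketch: break a plain transcript into caption-sized chunks
# to paste into a manual captioning tool (timing is still done by hand).
import textwrap

MAX_CHARS_PER_LINE = 37   # common guideline, assumed for illustration
LINES_PER_CAPTION = 2     # common guideline, assumed for illustration

def chunk_transcript(transcript: str):
    """Yield caption blocks of at most two short lines each."""
    # Collapse whitespace, then wrap to short caption-friendly lines.
    lines = textwrap.wrap(" ".join(transcript.split()), width=MAX_CHARS_PER_LINE)
    for i in range(0, len(lines), LINES_PER_CAPTION):
        yield "\n".join(lines[i:i + LINES_PER_CAPTION])

transcript = ("Welcome, everyone. Today we are talking about using "
              "technology in the classroom for deaf and hard-of-hearing students.")
for block in chunk_transcript(transcript):
    print(block)
    print()
```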

DARLENE: Okay. For people, that website is adcet.edu.au. So, thank you, Gary.

GARY: If you can't find it on there, contact me and I will get it to you.

DARLENE: Okay. Thank you, Gary. We will finish up now. Just being aware of the time. I want to thank everybody for joining us. This has been recorded and we will have it up on ADCET in the next week. It will be captioned. We also send out a survey, and we really value your feedback, including any ideas on future webinars. The next webinar we will be hosting is called Exploring the Retention and Success of Students with Disability and that will be held on 25 May at 1p.m. Australian eastern standard time, so hopefully many of you will be able to join us. Once again, I want to thank Bradley Reporting and Auslan Services for testing this out with us today, and most importantly, thank you to Gary for his time and also for everybody else that has joined us today. So thank you, and goodbye. I couldn't think - - -

GARY: Thank you, Darlene, for making this possible with this experiment. Good job. Goodbye, everybody. Thank you for your time. I've enjoyed it.