Transcript for presentation:

The TimeMachine - inclusive nature and museum trails


Charlotte Magnusson


B5. Art, entertainment and play

Date and Time

2014-06-16, 17:20 - 17:40


MA 7


Short oral presentation

Transcript of the talk

>> The originally scheduled presentation has been canceled. Apparently the speaker was unable to make it, but Charlotte is filling in with The TimeMachine: inclusive nature and museum trails. This will be our last presentation in this session today.
>> Okay, sorry, it’s on now. I invite you to try the system, because it is set up upstairs. It is a screen, so we brought it with us, and you can look at a film and hear different sound channels and audio description on your smartphone. It is where we had the lunch and dinner area.
>> Can you hear me? Is it good? You can see some of the logos here because this is actually a finished project. It came out of an EU project, and it involved a lot of interaction design, but we have moved on with it after the project ended. I am showing this because we had a keynote saying you need to learn to talk to different people about different things, and with mobile devices, I think we have this excellent motivation for inclusion in mobile use.
It depends on the audience: now I am at a conference about design, so I really don’t need this argument, but at other conferences you can talk about all of these different situations where you might want to use such an app, and it pushes inclusion as well. So this is an app, but the underlying idea is that we wanted to make an app for, quote, “everyone”.
Of course, there are people who won’t be able to use this, but we wanted to make sure that even if you couldn’t see the screen, you would still have a good experience, and if you can see the screen but can’t hear the sounds, you can still read the material, so it should be a nice experience for a lot of people. We actually presented early work on this in 2012, so I am going on to the things we did after that, which was looking more at how you can explore. The early work was just the trail: you followed the trail and there was no choice. Still, since I have that video, I am not going to show the whole thing, but I am hoping I can give you an idea of how it works.
The guiding is quite simple. When you point the phone in the right direction, the direction you are supposed to go, it vibrates, and when you get further along you will hear
(Music) You can see how it acts like a guide. For exploring, we wanted to look at how you could not just follow the trail but find all of the different things that are around you. So we implemented scanning: you can scan for the different things around you and then select one of them. Of course, in a historical medieval city you have lots and lots of things, so the problem is how to filter them. You filter on the screen: with your thumb far from your body you look for things far away, and with your thumb closer to your body you get things closer to you, so you select a distance. We had near, medium and far; we did not have continuous filtering. We tested this in 2012 with visually impaired persons. Previously we had tested with elderly users, some of whom had visual problems, but this time all of the participants were severely visually impaired: some were blind or close to it. You don’t use a cane if you have good vision.
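The scan-and-filter interaction described above can be sketched in code. This is purely an illustration of the idea, not the app’s actual implementation: the three distance bands, the thresholds, and all function names are assumptions. Points of interest are grouped into near/medium/far bands (the talk notes there was no continuous filtering), and the device vibrates when it is pointed roughly toward the bearing of the selected target.

```python
NEAR, MEDIUM, FAR = "near", "medium", "far"

def distance_band(distance_m):
    """Map a distance in meters to one of three coarse bands
    (thresholds are illustrative assumptions)."""
    if distance_m < 50:
        return NEAR
    if distance_m < 200:
        return MEDIUM
    return FAR

def points_in_band(points, band):
    """points: list of (name, distance_m) tuples.
    Keep only the points falling in the chosen band."""
    return [name for name, d in points if distance_band(d) == band]

def should_vibrate(heading_deg, bearing_deg, tolerance_deg=15.0):
    """Vibrate when the device heading is within a tolerance of the
    bearing to the selected point (angles in degrees, 0..360)."""
    diff = abs((heading_deg - bearing_deg + 180.0) % 360.0 - 180.0)
    return diff <= tolerance_deg
```

The modular arithmetic in `should_vibrate` handles the wrap-around at north, so a heading of 350° still counts as pointing at a target bearing of 5°.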
There was a range of ages, and among the participants there was also someone in a wheelchair. For the wheelchair, the pointing is a problem, because you need your hands for running the wheelchair, so we had a special design where you could point with your head instead. These are more of the observations and quotes from the results.
As for the guiding: once you had selected a point, you were guided to it, and that was easy. It was the first-time use for many people, so it took a little bit of practice, getting to know it, but generally it was quite easy. The problem is that this is a handheld thing. You may have a lot of stuff: you might have a dog, a cane and a phone, and you might have a challenge juggling all of these things.
One thing several users mentioned is that visually impaired people were afraid of theft, and one person had actually had a phone stolen from their hand, which makes you think, well, how low can you go, but yes. The scanning design, which we had tried before as well, worked nicely, but you have to keep your finger in touch with the screen, and that turned out to be a problem.
Also, some of the scanning was difficult in the wheelchair, and while the scanning is generally kind of intuitive, it is new to the users, so you have to introduce it. One of the problems is also that if you had places close to each other, with varying GPS positions and noise from the compass, because compasses are not very stable, you had the problem of the selection jumping between points, so it was hard to actually get the one you wanted.
We also compared routed guidance with guidance pointing straight toward the goal, and there seemed to be some preference for the routed version. Of course, crossing the square is not okay if you see very little; as a visually impaired person you would go along the edges instead, so just being able to cross was kind of a novelty. We had a lot of problems with the GPS, and because of that it is built into the design of the app that within ten meters of a point you are considered to have arrived.
So you will never get a very precise location, but on the other hand you really can’t, because you can never trust the reported position to actually be the position. Generally, the sound was appreciated, though you might want to be able to turn it off; there was one person who also had hearing problems. You also want information about the sounds, because previously, with the trails, you would get the information that related to a sound at the guidance points, but now you can select any point, so you might pass a sound and really not know why it was there. For example, in the central park we had pig sounds, because in the old days the park was enclosed, and you can see in the old documents that they had a problem with pigs getting in there. But today that is of course really strange: why would you have pigs there?

Also, if you are visually impaired, the real-life sounds are very, very important, so you have to be careful about how the audio relates to them, so you don’t occlude the real sounds. And, somewhat surprisingly, very nice sounds can make you forget to actually attend to things, in this case the guidance. We had one participant walking off in one direction, and we asked, why are you walking that way? And she said, I followed the music; the music was so beautiful that I forgot to think about where I was going.

So these were some suggestions for improvements. You do want information about the sounds. You also want information about the present time: in this case the information in the system was historical, and there was a place called the small square where they used to have a cattle market, but there is no small square these days, so a current Lund person would not know it at all. Actually, if you are facing the University building with the fountain in front of you, the small square is kind of to the left of the University.
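The ten-meter arrival rule mentioned above can be sketched with a standard great-circle distance calculation. The haversine formula is the usual way to compute distances between GPS coordinates at this scale; the function names and the exact check are illustrative assumptions, not the app’s code.

```python
import math

ARRIVAL_RADIUS_M = 10.0        # within 10 m you count as having arrived
EARTH_RADIUS_M = 6371000.0     # mean Earth radius

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two points
    given as (latitude, longitude) in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def has_arrived(user, target):
    """user, target: (lat, lon) tuples in degrees."""
    return haversine_m(*user, *target) <= ARRIVAL_RADIUS_M
```

Since consumer GPS error is often of the same order as this radius, a much smaller threshold would make points effectively unreachable, which is exactly the trade-off the talk describes.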
You want the distance to the target. You also maybe want more navigation information in speech: turn, ahead, and so on.
And if you are guided towards one point, you might want to get back to the starting point. So the conclusion we brought with us from that test was: yes, the scanning is still great and it would work to have this kind of thing, but we were not really happy with the filtering and how we did that. After this test we have been working with the local museums, and they very much like these trails, so we have been trying to move it to Google Play. As a researcher, moving a prototype app into something that you feel comfortable releasing to the public is quite a lot of work. It was quite a lot of work making it more immediate, because before, during a test, you could explain to people how it worked, and now all of a sudden it has to be self-explanatory, and parts of it were new interactions, like the scanning. So there is an app available, and I think you can search for TimeMachine. I am sorry to say we have restricted the access for this app, so it is in Sweden and Norway so far; originally it was only Sweden, because we only have the material in Swedish.
It is a goal to start making it in other languages, because there is nothing inherent that needs to change; you just need to create all of the material in the other language, and the app itself has English and Swedish.
So if you are Swedish-speaking, you can download some demo trails in Lund and try it out, because there is some stuff we don’t know. And this is the logo, just so that you can search for it, because on Google Play you might have more than one app with the same name.
As for the changes from before: since we were not really happy with the scanning, we did not include explicit scanning.
What we did instead was to make it a lot easier to select a point: you have the trail there with all of the different points in it, and we made it very easy to just select one point, go to it, and then select another point and go to it, so you don’t have to follow the trail; you can follow the points. Since this is an app that can work in nature environments and all sorts of environments, even though routing was appreciated, we can’t really trust that there is anything to route on. So what we have done is to have via points: when you create the trail, you can put in little points that people shall pass by. For example, out in nature, you can put a point by the bridge to make sure they pass the bridge before they go on to wherever the goal is.
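The via-point idea can be sketched as follows. A trail is an ordered list of points, some marked as mandatory via points (like the bridge) that guidance should route through before moving on. The data layout and function name are assumptions for illustration, not the released app’s code.

```python
def next_guidance_target(trail, visited):
    """trail: ordered list of (name, is_via) tuples.
    visited: set of names the user has already reached.
    Guidance always goes to the earliest unvisited via point first,
    so users cannot skip a mandatory pass-by; otherwise it goes to
    the first unvisited point. Returns None when the trail is done."""
    for name, is_via in trail:
        if is_via and name not in visited:
            return name
    for name, _ in trail:
        if name not in visited:
            return name
    return None
```

For a trail like bridge (via), oak tree, goal, the guidance target is the bridge until it has been passed, after which the remaining points are offered in order.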
When you get to a point, you see an image and hear someone talk, and if you want to read what is said, you press the text button and you will instead see the text, so you can read it instead of listening, and of course you can play it again and again if you want. So I think that was it, already.
>> Do we have any questions or comments, or were things crystal clear?
>> With the same user group, were you facing problems getting along with Android?
>> Actually, since this is an app that started development quite a few years ago, this was a big problem, because back then Android was not accessible at all. Well, it was a little, but it had a really crappy screen reader. We actually implemented one that works slightly differently, but now, after Android 4, I think the accessibility is not a big problem; it is sort of the same, and we could get rid of a lot of the stuff we had done before.
>> Anything else? Thank you for coming. If you came in mid-session: there was a change from the original printed program, so Charlotte presented instead.
This is the end of the presentations of this session today, thank you very much!


Rough edited copy by AVA AB and Certec, LTH

Remote CART provided by: Alternative Communication Services, LLC

This text is being provided in a rough draft format. Communication Access Realtime Translation (CART) is provided in order to facilitate communication accessibility and may not be a totally verbatim record of the proceedings.