Tuesday, November 30, 2010
We seem to have wound up back in Australia, for no reason in particular, except that I like platypuses (and they're also the occasional mascot of my alma mater) and I thought the footage in this video was pretty cool. If you want to know the results of the study they were conducting, some preliminary data seems to be in.
So that's the end of this year's NatCapVidMo! 30 captioned videos in 30 days, plus one video with audio description and one video with subtitles in both English and French. Not bad, all in all. I had a great time this month, and I wouldn't be surprised if I decided to do it again when next November rolls around. In the near future, I'll try to get links to these videos collected on the articles page of my website, for easy reference. Thanks for following along, and for all your encouraging comments along the way. This blog will now resume its relatively low-traffic status.
Oh, one more thing. I captioned all 30 of this month's videos using Universal Subtitles. It's a fantastic solution for quick and easy captioning in a variety of online video formats. It's versatile, simple to use, and free/open source. Check them out, and if you like them as much as I do, think about getting involved. Thanks, Universal Subtitles. I couldn't have done it without you guys.
Monday, November 29, 2010
It's the penultimate day of NatCapVidMo! If you guys have any more video captioning requests, you'd better get them in soon, because this is your last chance. Here's a classic skit from Sesame Street which originally aired in 1993. By that time, Linda Bove had already been on Sesame Street for 22 years. I remember her well from when I watched the show as a little kid in the early '80s. In fact, I'm pretty sure that she was the first Deaf person I was ever consciously aware of, and that the first time I ever saw anyone use Sign Language was on Sesame Street. I've been studying ASL for about two and a half years now, so I realized while watching this video that she altered her signing a bit for the show so that it was closer to English word order than to ASL syntax. (For instance, she signs the sentence "I want to buy a hat" as "I-WANT BUY HAT" rather than the more correct "HAT I-WANT BUY" or "HAT BUY I-WANT".) But I still really enjoyed watching it, and the language is simple enough that even an intermediate signer like me can translate it with a pretty high degree of confidence. I think it's funnier if you can understand both sides of the conversation, so I decided to caption it. Now Deaf/HoH signers and non-signing Hearing, deafened, and HoH English speakers can all get in on the joke.
Sunday, November 28, 2010
Mike Paciello tipped me off to this video today on Twitter. It's about how to use a Refreshable Braille Display (The APH Refreshabraille 18) with an iPod Touch or iPhone. I don't have any of these devices; my phone is a Blackberry 8330, my mp3 player is a Toshiba Gigabeat F, and I don't know braille, though I'm certainly very open to providing CART for someone who uses a refreshable braille display. I really liked getting an inside look at both the display itself and the iPod's VoiceOver interface (which I had heard about but had never seen in action), and I was extremely impressed by Chase's facility with the interface and clear way of describing everything he was doing. Below I've attached a transcript of the video, because I'm not sure how accessible Universal Subtitles' "download subtitles" interface is for deaf-blind people using braille displays or screen readers. This also reminded me to attach a transcript to my post for Day 16, Out of Sight (with Audio Description).
CHASE: Hi, this is Chase again. And today I want to show you how to pair a Refreshabraille 18 with the iPod Touch or iPhone running iOS4. As you might know, the Refreshabraille 18 can only connect with the iPod Touch or iPhone via Bluetooth, as they do not have a USB port. And I have an iPod Touch here, which is already on, so what we need to do next is to turn the Refreshabraille on. I will hold down the power button. (cells vibrating) And you heard the cells vibrate. And now we see my display's name. Now, by default, Bluetooth is automatically turned on, so we don't need to do anything to enable Bluetooth. What we need to do to get braille working on the iPod touch is to go to settings, so I'll tap my touchscreen near settings.
iPOD: Page one of three, iTunes. Settings.
CHASE: I did that by flicking left. I'll double tap here.
iPOD: Settings, settings, music, button.
CHASE: And now my cursor was placed on the music item in settings. But I'll flick left.
iPOD: General, button.
CHASE: And that's general, which is where we need to go. I double tapped.
iPOD: General, settings, back button.
CHASE: And now we need to go to accessibility, which is where you can set all of the iPod's and iPhone's accessibility features.
iPOD: General, restrictions, up, date and time, keyboard, international, accessibility, reset, accessibility, button.
CHASE: Now, what I did to get to accessibility was I touched near the bottom of the screen, and then flicked right 'til I got to accessibility. Now I'll double tap.
iPOD: Accessibility, general, back button. Accessibility. VoiceOver, on, button.
CHASE: Now we need to go into VoiceOver. VoiceOver, if you do not know, is the built-in screenreader on these devices, and that is what the Refreshabraille 18 will connect with.
iPOD: Zoom, off, button.
CHASE: But we also have things like the monaural audio and the magnifier, called Zoom.
iPOD: VoiceOver, on, button.
CHASE: But we'll go into VoiceOver, by double tapping.
iPOD: VoiceOver, accessibility, back button.
CHASE: And now we'll need to flick right 'til we get to braille.
iPOD: VoiceOver, VoiceOver, VoiceOver speaks items, to select, to activate the selected item, double tap, to scroll, flick three fingers, practice VoiceOver gestures, button, speak hints, off, speaking rate, 22%, typing feedback, words, use phonetics, on, use pitch change, off, braille, button.
CHASE: And there's the braille button. We'll double tap.
iPOD: Alert, Bluetooth is turned off. Bluetooth is required to use a braille device. Would you like to turn on Bluetooth? Bluetooth is turned off.
CHASE: And now we need to turn on Bluetooth. You don't need to enable Bluetooth before you get into the braille settings, as, as you heard, it informs me that Bluetooth is turned off.
iPOD: Bluetooth is required to -- cancel -- okay.
CHASE: I'll double tap okay, which will open the braille settings and turn on Bluetooth.
iPOD: Braille, VoiceOver, back button.
CHASE: And now we'll have some braille settings, as well as the displays that are in range of the iPod.
iPOD: Braille, contracted braille, on.
CHASE: That is to show Grade 2 braille. I have turned that on.
iPOD: Status cell, off.
CHASE: Status cell is off.
iPOD: Choose a braille device. Refreshabraille chase. Not paired, button.
CHASE: Now, the iPod's searched for my Refreshabraille, which is called Refreshabraille chase, as I have named my display "chase". And it has already found it. What we need to do to pair it is to double tap.
iPOD: Refreshabraille chase. Refreshabraille, Refreshabraille chase, PIN, secure, 2, 1, 2, 3, 4. PIN, 1, done, button, done, end, braille, VoiceOver, back button.
CHASE: My display just went blank, and now we see VoiceOver, back, and then if I pan to the right, BTN, which stands for button. Now, I want to explain what I did when it asked for the pairing code. You only have a certain amount of seconds to pair this and put in that code, so I did that before I discussed it, so I'd have enough time. The PIN code, or personal identification... This code allows you to securely connect to your Refreshabraille 18. By default, the Refreshabraille 18's PIN code is 1234. So I typed that in, and then pressed the "done" button. So now we have braille. Now, what I want to do is to go to the practice VoiceOver gestures, which is in the VoiceOver settings. So I'll flick to the left, until I get to the VoiceOver back button.
iPOD: VoiceOver, accessibility.
CHASE: And now we're back in the VoiceOver settings screen.
iPOD: Back button, VoiceOver, VoiceOver, on. VoiceOver speaks items on the screen. To select an item, touch it.
CHASE: Now on the braille display I can use the joystick to flick right, by pushing it to the right.
iPOD: To activate the selected item, double tap. To scroll, flick three fingers. Practice VoiceOver gestures, button.
CHASE: Now, this is what I want. I can either double tap on the touch screen or press down on the joystick. I pressed down on the joystick.
iPOD: Practice gestures.
CHASE: And I'll touch on the screen.
iPOD: Practice VoiceOver gestures in this area. Select the done button in the top right corner and double tap to exit.
CHASE: Now we can do gestures on the screen, like...
iPOD: Rotate clockwise. Select next rotor setting.
CHASE: Or, on the display, we can press any of the display keys, like...
iPOD: Joystick pressed. Activates the selected item.
CHASE: Or... I'm panning to the right, and it's showing what it just said, but if I get to the end...
iPOD: Pan braille to the right.
CHASE: That's the pan braille to the right. But you can also use chorded commands. That is, letters and symbols, with the space bar. And they'll tell us what they do. So H-Chord, for example...
iPOD: Dot-1, Dot-2, Dot-5, space bar. Activates the home button. Practice VoiceOver gestures in this area.
iPOD: Select the done button in the top right corner and double tap to exit.
CHASE: Or, if I press E-Chord...
iPOD: Dot-1, Dot-5, space bar. Activates the return key.
CHASE: Activates the return key. Now, I'll go ahead and get out of this.
iPOD: Done, button. 20% battery... Practice VoiceOver gestures in this area.
CHASE: Now, I touched too high.
iPOD: Select the done button in the top right corner and double tap to exit.
CHASE: I tapped too high on the screen, so I got into the status bar. So...
iPOD: 20%... Practice VoiceOver, done, button.
CHASE: There's the done button. I'll double tap. And we're back in settings.
iPOD: VoiceOver, accessibility, back button.
CHASE: Now I'll press H-Chord to get to the home screen, or I could press the home button on the iPod.
iPOD: Home, settings.
CHASE: And we'll see... Home, settings, on the braille display. Now, what I want to show you is... We can go to an application, for example, Notes, that allows you to input text and use the Refreshabraille 18's braille keyboard to type.
iPOD: Calculator, Clock, Notes.
CHASE: There's Notes. I used the joystick left function to get to that. I'll press down on the joystick to open Notes.
iPOD: Notes, new note, text field, is editing, character mode.
CHASE: And we're automatically placed into the new note field. So if I type "this is a test" in grade 2 braille...
iPOD: This... Is... A...
CHASE: This is a test. And it didn't speak the last word because I put a period and then a space. I kind of interrupted it. But I have "This is a test" in grade 2 braille. But if you typed in, in an email, for example, in grade 2 braille, and you sent it, that email would be translated into regular text, so you can type in any text fields on the iPod using the Refreshabraille 18's keypad. I'll now return to the home screen with H-Chord.
iPOD: Home, Notes.
CHASE: That is all that I want to show you. I'll be back in a future video, talking more about connecting the Refreshabraille 18 with other devices.
Saturday, November 27, 2010
A video from Clifton Bieundurry, via the Enduring Voices Project.
From Bieundurry's website: Clifton Jungurrayi Bieundurry belongs to the Walmajarri people of the Central Kimberley. [...] He speaks several traditional and contemporary Indigenous languages, including Walmajarri, Kukatja, Jaru, and Kriol.
I thought it was interesting to learn about people who communicate using their voices most of the time, but who have a set of hand signs they use to communicate something silently or inconspicuously when speaking isn't possible.
Friday, November 26, 2010
This video was suggested by my brother William. I think it's pretty interesting stuff. If they actually figure out how to make organic and polymeric semiconductors into efficient solar cells, they could transform the face of this planet. Best of luck to 'em! Well, there are only four more days left of NatCapVidMo, and the requests are still coming in. Be sure to send yours to email@example.com or comment on this post before time runs out.
Thursday, November 25, 2010
I've had a wonderful warm, cozy Thanksgiving, cooking at home and eating glorious food with my partner and cat here in New York City. I've been in this city for six years now, and I love it with all my heart. It's huge, mysterious, vital, complex, and unexpectedly kind. Living here is the adventure I dreamed of all through my childhood, and I wouldn't trade it for anything. But this is also the time of year that I find myself thinking about the rest of my family back home in Missoula, Montana. (There's also an outpost of Knights in Seattle.) I thought it might be nice to put up a captioned glimpse of where I came from, and I figured that among the more interesting ways to do it would be to show a bunch of reckless youths risking their hides as they jump around on various Zootown landmarks. I always enjoy watching Parkour videos, though I've never done it myself -- partly because I get winded if I run more than half a block or so, and partly because I'm my family's primary breadwinner, so if I screw up my hands, we don't eat. That means I have to get my leaping and scrambling thrills vicariously. Enjoy the energetic lads of the Missoula Parkour Group, and if you can take your eyes off their flailing limbs, check out the scenery while you're at it. That's where I grew up. It's a peculiar little city, but it's home.
ETA: One formatting note. It's generally conventional when doing captioning for people with hearing loss that you don't repeat words in captions that are already written on the screen, such as the names of the vaults in this video. The reason why I captioned them is because Universal Subtitles also allows translation of captions, and if someone wanted to translate this video into another language, they'd be able to translate the names of the parkour moves as well.
Wednesday, November 24, 2010
I found this pair of videos while looking for footage of Peter Cook, the famous ASL comedian. I thought it was fascinating for a number of reasons. For one, I really enjoyed the linguistic content. I knew that there were two different signs for "poetry", but I'd never heard about the international Deaf conference that inspired the more modern one. I always love watching Peter Cook sign, whether he's being silly or serious. I also thought these videos were a timely thing to post on the day before Thanksgiving. It's important to think about things like poverty and privilege year-round, but especially when we're giving thanks for all the good things in our life, we need to remember what it's like for people who haven't gotten the help, support, education, and security that many of us take for granted. As it says in the video, poverty is much more than just a lack of money. I'm going to try to be aware of my many privileges this week and in the weeks to come, and I'll try to keep thinking about people affected by poverty and what I can do to help. One peculiarity of these videos: Even though they're announcing a contest that ran from March through April, 2008, I haven't been able to find any entries for it on YouTube, much less the announcement of the winning entry which they said would happen in May 2008. Columbia College's website has documents about the beginning of the contest, but nothing about the end of it. And when I searched YouTube for "ASL" and "poverty", I didn't find anything connected to Columbia College, Peter Cook, or Signs of Our Ideas. Were there really no entries? If so, that's really a shame. It was a wonderful idea, but perhaps it wasn't publicized enough. Maybe sometime in the future the contest will be revived. Meanwhile, I hope my captions help to make these ideas accessible to new groups of people.
I just donated to the New York City Food Bank. Click here to make your donation today.
Tuesday, November 23, 2010
As I stated in the molecular gastronomy video on Day 15, which was, like this video, more or less an advertisement, I receive no perks from this company. In fact, if Greubel Forsey somehow found this video and decided my captioning job was so brilliant that they just had to give me a complimentary Quadruple Tourbillon wristwatch, I don't think I'd know what to do with it. I haven't worn a watch in five years; whenever I need to check the time, I just pull out my cell phone. It's gauche but convenient. I didn't enjoy watching this video because I covet their final product. I just really like watching the illustration of their process in designing the watches, from the hand drawings to the CAD visualizations to the intricate hand-tooling under magnification with dazzling precision. To me -- and call me a philistine if you want -- the actual watch is almost beside the point. I just really like watching the combination of advanced modern technology with ancient mechanical techniques. (See also: Day 14: The Antikythera Mechanism.)
Monday, November 22, 2010
A friend tipped me off to this fascinating story on NPR: Why Can't We Walk Straight? I chose to caption this for three reasons. First, I thought the animation was beautiful and it was the sort of video I wouldn't mind watching the four or five times it takes to go through the captioning process. Second, I thought it was strange that NPR didn't caption it, even though they offered a transcript for the audio link from Morning Edition at the top of the page. Third, I've got a simply terrible sense of direction even when I'm not blindfolded, so I like to think that my frequent habit of doubling back on myself and turning in circles when trying to find my way in an unfamiliar place is just a natural extension of this apparently inborn human tendency to walk in spirals, circles, and loop-de-loops.
Sunday, November 21, 2010
Following up on the recent theme of ultra-nerdy videos about science stuff, this video was requested by a reader of the blog who would like to remain anonymous.
My own cat, Alcibiades, practicing his steno skills. (Cat-sized steno machine not pictured.)
Saturday, November 20, 2010
Friday, November 19, 2010
Thursday, November 18, 2010
Breaking Consumer Report: Ostensible "Shake Flashlight" From Dollar Store Not What it Appears to Be! Inventor and Product Tester Zachary Snyder has the story.
I admit that the main reason I decided to caption this video is because I'm jealous of his action figure collection. I found it randomly while searching YouTube for how-to videos and it amused me. I also kind of covet an illuminated toothbrush, though I think if you're gonna do it at all, do it properly and spring for the $30 Faraday model.
Wednesday, November 17, 2010
40 years ago today, Lunokhod 1 landed on the surface of the Moon and began taking pictures. It was only expected to last 3 months, but wound up staying operational for 11 months, and even today, though it hasn't transmitted any electronic signals back to earth since 1971, its reflector is still being used by earth-based astronomers. Well done, little lunar robot! Maybe someday soon we'll come back up there to thank you for all your good work in person.
Tuesday, November 16, 2010
I've placed my usual embedded video with captions by Universal Subtitles above, but if you watch it, you'll notice that the only dialogue is the name "Gogo", and other than that the captions are just sound effects. Out of Sight is a beautiful short animation by Ya-Ting Yu, Ya-Hsuan Yeh, and Ling Chung, three students at the National Taiwan University of the Arts. I decided that since the captioning job was such an easy one, I should try my hand at audio description.
My audio described version can be found here. You can also paste the URL (http://www.youtube.com/watch?v=cQXD6jkv4hQ) into EasyTube, an accessible YouTube interface that allows for keyboard navigation. I've seen a few audio described videos and have read some guidelines on audio description, but this is my first time actually attempting it, so I'd be grateful for any feedback from blind or low-vision people who give it a listen. It was really fun to do, and even though I've now seen this video probably a dozen times, I'm still not sick of it. Bravo to the Taiwan University Team! For more information on the video, go to their website here.
Edited to Add: I've attached a transcript of the audio description below.
Animated footprints of a dog.
The rest of the dog is gradually sketched in, then filled with color.
The legs and feet of a little girl, Chico, holding the dog's leash and walking.
The words "out of sight" painted on a white fence.
The legs and feet of a man walking past the fence.
A sunny day. The little girl is walking her dog and smiling.
The man approaches from behind and snatches her purse.
Her dog chases after the thief, dragging Chico along.
She slams into a tall fence and the dog breaks free, running through a ragged opening in the boards.
She feels a draft coming from the opening.
She bends down and feels around the edges of the opening.
She crawls through it into a pitch dark, cave-like space.
She gropes around and finds a small twig.
A drop of water briefly illuminates the darkness as it falls into a small puddle.
She waves the twig and the window of a house emerges out of the darkness.
The twig has transformed into a magic wand with a star on the end.
She taps her head with the wand, and her colorful clothes transform into a black dress with a witch's hat.
She runs her fingers along the hat and smiles at her transformation.
She traces the wall with her wand, heading toward a small patch of light in the distance.
She collides with a tree and falls down.
The darkness is gone and she is in front of a blue house with yellow shutters.
She continues to walk, waving her magic wand in front of her.
Her wand taps against a metal rod with a pleasant chime.
She smiles and runs her wand along the row of metal rods, which become a metal fence as she passes by.
Birds fly up and out of view as a plain brick wall is revealed.
She pauses and taps her wand against the wall, then sniffs the air.
The blank wall becomes a bakery window with pies, cakes, and a swinging sign reading "bread".
The bakery door opens. A woman walks out and past Chico, carrying a bag filled with bread.
Chico continues walking, tapping her wand against the wall.
She sniffs again, and a woman walking past sprouts large pink flower petals that encircle her face.
As Chico inhales deeply, she begins to cough.
She's encircled by a cloud of smoke from a passing man with a pipe.
She passes an alleyway.
A beige, vaguely spherical shape is rolling around on a garbage can.
The shape jumps to the ground, rolls towards her, and transforms into a kitten.
The kitten rubs happily against Chico, who smiles.
The kitten hears Gogo's bark in the distance and runs away.
As she walks, houses and shops spring into being when she taps them with her wand.
A stork drills a manhole in the road with its beak.
A church springs up, with a bell ringing in its steeple.
Cars that look like fish swim by in shallow water, smoking pipes.
A fish bus picks up several strange, animal-headed passengers, then moves on.
Chico looks up.
A giant whale blimp with 10 rippling fins floats benignly over the city.
Chico cups her ears and enjoys the sound of the whale blimp.
Gogo runs up to Chico carrying her purse in his mouth.
She drops the magic wand and it transforms back into a twig.
She hugs Gogo and feels the purse carefully before realizing what it is.
She smiles happily and picks up Gogo's leash.
They walk off together along the bright and colorful city street.
The camera pans upward to show the contrail of a passenger jet moving through the sky.
Credits. Still sketch of the dog barking at the thief up in a tree.
Another sketch of a man coming out of the bakery while Chico's dog navigates the opening door.
A sketch of Chico squishing Gogo's face and grinning while he wags his tail.
Monday, November 15, 2010
Disclaimer: I take no kickbacks from Moto. Heck, I've never even eaten there. I sure would love to, though. I love eating weird stuff, especially weird stuff made using Science, so Molecular Gastronomy is one of those things I've been wanting to try for ages. If I ever find myself in Chicago, I'll definitely give it a try. I thought it was pretty funny that one of the dishes in the video was frozen cooked pureed pancakes. I used to have to cook pancakes, then puree them, at the group home where I used to work, for the clients who had swallowing disorders. Never thought to freeze the puree on a plate, though; I think it might sort of have defeated the purpose.
Well, that's day 15 of NatCapVidMo. Half the month gone, half the month left. I'm having a blast so far. Thanks again to Universal Subtitles and all the people who upload interesting stuff on YouTube.
Sunday, November 14, 2010
The Antikythera mechanism is a fascinating piece of machinery, and Michael Wright has built a fully functional working model, which he demonstrates in this video. The narration is a bit dry, but the thing itself is just beautiful.
This video was very easy to caption, but it took me several hours to find something appropriate for today's NatCapVidMo. Now that I'm reaching the middle of the month, my ready sources of captioning material (especially since I'm trying to limit the number of music videos somewhat) are running dry. If there's a video you'd like to see captioned, please comment on the blog or email firstname.lastname@example.org. As long as it's under 10 minutes and safe for work, I'll do my best to oblige. And even if you don't need anything captioned, tell your friends about the NatCapVidMo project, and maybe they'll come up with something. Thanks!
Saturday, November 13, 2010
Universal Subtitles isn't just for captioning English to English! I took this song from the brilliant television show Flight of the Conchords, captioned it in French (with a little help from the internet), then, using my magnificently advanced French skills (two semesters in college nine years ago) translated it into English for your viewing pleasure. You can select either language from the Universal Subtitles menu.
Friday, November 12, 2010
Thursday, November 11, 2010
So Universal Subtitles isn't just fun and games. The above video was shown in one of the classes I CARTed today. Yesterday the class's instructor picked out the video and informed the student, the disability office transcribed it, I got the transcript, and then I captioned it on my lunch hour so that the student could watch it in class. Usually when there's a video shown in class I'm not given any warning, so I have to CART it on the fly. It's not usually a big problem, except that day when the professor decided to show this video (warning: salty language) out of the blue, which is not the easiest thing to write, let me tell you. But even when I'm able to CART the video accurately, it can often be a frustrating situation for the student, because they're forced to constantly look back and forth between the video screen and the laptop's CART display, which not only gives them a crick in the neck but can also result in them missing some of the words, some of the video images, or both. Captions are of course ideal for this, but before Universal Subtitles it wasn't generally possible to get a captioned video up in time, especially since professors don't tend to choose the videos they show in class more than a day or two ahead of time. Now if I have even a little advance notice, I can transcribe the video (or have the disability department do it for me), caption it in minutes, and give the instructor the link to the Universal Subtitled version instead of the original one. Easy as anything!
Because I did this video on my lunch hour, have done a total of nine hours of CART today (seven in the Bronx and two in Manhattan, including a mad dash from the former to the latter via Metro North), and have to do 24 audio minutes of paid transcription work before going to bed tonight, this video is not as finely polished as some of my previous ones. For instance, there are a few captions with hanging/orphaned words, which I usually try to avoid. It would be great if Universal Subtitles offered a "line break" option within a single caption block, but I suppose that's not ultra high priority. Anyway, I hope you like it. I know my client was pretty happy with the results.
Wednesday, November 10, 2010
Tuesday, November 9, 2010
I know there are a ton of terrible "signed songs" out there; it's all too easy for a hearing person with a tiny bit of ASL to pick a song, look up each word of the lyrics in their Signing Dictionary, and string them together in one ungrammatical chain of nonsense. When ASL music is done well, though, it's an amazing thing to watch. I'm no expert, but this interpretation of Gnarls Barkley's Crazy by B Storm is one of the best productions I've seen, not just in sensitivity of interpretation, but in characterization, makeup, narrative design, and video editing. Storm clearly paid close attention to the rhythm of the song in making his camera cuts, and I tried hard to duplicate that timing in the captions, leaving them on throughout each of Cee Lo Green's gorgeous melismas, but blanking them as soon as he cut off. It's a fine balance; you don't want to clear the captions too quickly, because then people don't have time to read them, and the constant blinking on and off can be very distracting. On the other hand, you don't want the captions just hanging there when no one is speaking or singing. You need to pay attention both to what's going on in the soundtrack and to the video itself. It's best to blank a caption when the video switches cameras or when someone makes a gesture indicating that they're moving on.
Two things I wish Universal Subtitles had, which would make these finer points of captioning a bit easier: 1) A manual "blank" key, which wipes a caption on command, rather than having to drag the caption's box on the timeline during the second go-around. The commercial software I used to use in my old offline captioning job had one, and it was invaluable. 2) A "nudge" function. Even the quickest captioner has to account for the delay between mind and hand. If they anticipate the caption before it's spoken, they risk triggering it too soon, but if they hit the button right when they hear the first word, it's already too late. The only way I've found to fix this problem with Universal Subtitles is to caption the whole video normally, hitting the button as soon as I hear the first word, and then going back through the syncing process again, paying close attention to the timeline and hitting the caption button about five little tick marks before each caption I placed on my first runthrough. That compensates for my reflex delay, but it also necessitates an additional runthrough, which I wouldn't need to bother with if I could tell Universal Subtitles to timeshift each caption, say, half a second earlier. Then if there are any mistimed captions I could catch them and tweak them on my final review rather than having to devote an entire runthrough to retiming my initial caption placement. It's not a big deal on a three-minute video, of course, but it would be a helpful feature to have somewhere down the line. I'm still thoroughly enjoying the process of captioning a video per day via Universal Subtitles, though, and a few of my colleagues seem to have gotten in the spirit as well. It's a wonderful tool, and I encourage anyone who's contemplated putting captions on their videos to play around with it and see how truly simple and intuitive the interface really is.
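In the meantime, the batch timeshift I'm wishing for is easy enough to do by hand outside the tool. Here's a rough sketch, assuming the captions have been downloaded in the common SubRip (.srt) format; the `shift_srt` helper and the half-second default are my own illustration, not a Universal Subtitles feature.

```python
import re

# SRT timestamps look like 00:01:23,456 (hours:minutes:seconds,milliseconds).
TIMESTAMP = re.compile(r"(\d{2}):(\d{2}):(\d{2}),(\d{3})")

def _shift_timestamp(match, offset):
    """Shift one matched timestamp by `offset` seconds, clamping at zero."""
    h, m, s, ms = (int(g) for g in match.groups())
    total = h * 3600000 + m * 60000 + s * 1000 + ms + int(offset * 1000)
    total = max(0, total)
    h, rem = divmod(total, 3600000)
    m, rem = divmod(rem, 60000)
    s, ms = divmod(rem, 1000)
    return f"{h:02d}:{m:02d}:{s:02d},{ms:03d}"

def shift_srt(srt_text, offset=-0.5):
    """Shift every cue in an SRT file by `offset` seconds.

    A negative offset moves captions earlier, compensating for the
    captioner's reflex delay without a second syncing runthrough.
    """
    return TIMESTAMP.sub(lambda m: _shift_timestamp(m, offset), srt_text)
```

Run the downloaded .srt file through `shift_srt`, re-upload the result, and every caption lands half a second earlier than where my reflexes put it; any stragglers can then be tweaked individually on the final review pass.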
Monday, November 8, 2010
Sunday, November 7, 2010
Two of my favorite British comedians, Rowan Atkinson and Hugh Laurie, in a very silly sketch about Shakespeare and his editor. You'll notice that the video and the soundtrack are a little out of sync; this is an occasional artifact of YouTube's uploading system, and there wasn't anything I could do about it, since Universal Subtitles just takes someone else's YouTube video and puts the captions on top. I tried a few versions that various people had uploaded, and they all seemed to have this syncing problem, so I just decided to synchronize the captions with the audio and let the action lag behind a bit. My other dilemma was how to spell "King Canute". The page where I found the script for this sketch had it Canute, but the Wikipedia article had it Cnut, and I'd always thought it was properly spelled Knut, like the polar bear. I decided to let Google play the judge, and since "king canute" got 117,000 hits, "king cnut" 26,000 hits, and "king knut" 14,000 hits, I let the original spelling stand. The relevant portion of his legend, incidentally, is here.
Saturday, November 6, 2010
DISCLAIMER: Yes, I am a freelancer, but I do not subscribe to the business practices shown in the above video. Don't try this at home, kids.
I found Life of a Freelancer via Freelance Switch's Twitter feed a few months ago. Even though I'm not a graphic designer, I've hung around enough of them to get most of the industry in-jokes, and I definitely sympathize with the need to portray oneself as a legitimate business enterprise when in reality it's just one person in their boxer shorts on a cell phone. I'm always careful to point out that I'm a sole proprietor rather than a multi-employee CART firm, but I admit that back before I started renting coworking space I used to say things like "I'm sorry, I've been out of my office all day," when by "office" I meant "the corner of my living room next to the fire door." If you're not self-employed, you might not enjoy this video quite as much as I do, but I hope you'll get a laugh out of it anyway.
By the way, I transcribed this video in steno using a $45 keyboard and Plover, the open source steno software I've been developing with a (freelance!) Python programmer for the last year. If you've ever been interested in learning steno but don't want to pay $1,500 just to get started, check Plover out! It's completely free and works like a dream.
Friday, November 5, 2010
I found this one delightful, but also pretty poignant. It's from the great mind of Dr. Seuss, of course, and was voted #9 of the 50 Greatest Cartoons of All Time. I really enjoyed doing the sound effects on this one. Well, day five is in the bag. I saw a woman reading No Plot? No Problem!, the book written by the founder of NaNoWriMo, but even though I felt a little twinge at not being part of the madcap noveling crowd, I have to say I'm really enjoying this so far. What'll I caption tomorrow? No idea! I guess I'll find out. As always, requests are welcome.
Thursday, November 4, 2010
As someone who came of age in the '90s, I've got a soft spot for Neo-Swing, cheesy as so much of it was. This music video, released in 2010 by Caro Emerald, is a standout, I'd say. It's peppy, it's catchy, and the animation is supremely well crafted. It's even got a bit of kinetic typography to help jazz up the captions. I don't want to do too many music videos this month, because in a way they're just too easy, but this one was hard to resist.
Wednesday, November 3, 2010
I was hoping to post this before tonight's Keith Wann Show, but the first time I tried to caption it I discovered that I'd been working on the Vimeo version, and Vimeo files don't yet support Universal Subtitles. Oh well. After captioning the show (guest hosted this week by the superb Windell Smith, whose Peter Sellers-esque comedy stylings you see in this video) I dug up the YouTube version and was able to caption it without any problem. So this is coming about an hour after midnight, but I haven't gone to bed yet, so I'm still counting it as Wednesday's NatCapVidMo entry. I love captioning ASL videos. I'm not yet fluent enough to caption them by sight, but when they're voiced it's a pleasure to link up the three modes of communication -- two visual, one auditory; two English, one Sign -- so that the video becomes accessible to anyone who knows either ASL or English, whether they can hear or not. Someday I'd like to do more ASL captioning without having the voiceover to guide me, but I need to become much more nuanced in my understanding of the language first.
Tuesday, November 2, 2010
The transcript from my interview on That Keith Wann Show last Wednesday is now online. You can read it here. When he uploads the ASL video (probably in a month or so), I'll caption it and link it here as well. It was a really fun evening; Keith and Windell and I talked about all sorts of stuff, from theater captioning to the issue of how to build connections between the Deaf and deafened/hard of hearing communities, to the virtues of New York City diner food. Here's the audio file, and once again, here's the transcript. Special thanks to my awesome colleague Cory, who captioned the show last Wednesday while I was busy talking.
I was watching this wonderful documentary, Attenborough in Paradise, just a few weeks ago. It's available on Netflix Streaming (though not with captions, tsk-tsk) and it's an amazing view into the lives of the birds of New Guinea. This is my favorite part of the documentary, the segment on the Vogelkop bowerbird.
Monday, November 1, 2010
I fooled you with that title, didn't I? I was especially tickled by the mention of the transcription happening in realtime. (CART, you'll remember, stands for Communication Access Realtime Transcription.) But strangely enough, I didn't find this video by searching for "realtime" or "transcription". I went through the archives of some of my favorite videos and found a link to a 7-minute video by the same animation company, and decided that a 4-minute video was more my speed for the first day of NatCapVidMo. So I clicked this one, got it all down on my steno machine, and captioned it in just a few minutes using Universal Subtitles. Easy as anything. This video goes out to my brother William (last seen in What Is Steno Good For Part Four: Mobile and Wearable Computing), who has a PhD in biochemistry and a special love for ribosomes. I'll have another captioned video for you tomorrow. As always, requests are welcome. 'Til then!