
The Third Angle
Best Business Podcast (Gold), British Podcast Awards 2023
How do you build a fully electric motorcycle with no compromises on performance? How can we truly experience what the virtual world feels like? What does it take to design the first commercially available flying car? And how do you build a lightsaber? These are some of the questions this podcast answers as we share the moments where digital transforms physical, and meet the brilliant minds behind some of the most innovative products around the world - each powered by PTC technology.
Tatum Robotics: Hands-On Innovation for the DeafBlind Community
“Her favourite show used to be Friends, and growing up she could hear it and she could read the captions, and it was so exciting for her that she could get that moment back.”
In the modern world, we have an abundance of technology that helps us with our communication, information gathering, and entertainment needs. But most of this is inaccessible for DeafBlind individuals whose primary language is tactile sign. It can be difficult for them to access news headlines, or to even find out what the weather will be like later on in the day.
Tatum Robotics is advancing accessibility by developing a robot hand that can communicate with DeafBlind people through tactile sign and allow them to access the internet. Designed to allow for as much movement as possible, the T1 Fingerspelling Hand features 18 degrees of freedom whilst still feeling as close as possible to holding a real human hand. DeafBlind people usually communicate through human interpreters, who can’t be with them all of the time, so this technology will open up a world of interaction for them.
Our producer Curt Nickish went to meet Samantha Johnson, the founder of Tatum Robotics, at their headquarters in Boston. She demonstrates how one of the robot hands works and emphasises the importance of testing with members of the DeafBlind community to capture all of the complexities of tactile sign language.
Find out more about Tatum Robotics here.
Find out more about Onshape here.
Your host is Paul Haimes from industrial software company PTC.
Episodes are released bi-weekly. Follow us on LinkedIn and X for updates.
This is an 18Sixty production for PTC. Executive producer is Jacqui Cook. Sound design and editing by Clarissa Maycock. Location recording by Curt Nickish. Music by Rowan Bishop.
Welcome to Third Angle. Today, we’re taking the hand of a robot that’s changing lives.
I’m your host, Paul Haimes from industrial software company PTC. In this podcast, we share the moments where digital transforms physical, and meet the brilliant minds behind some of the most innovative products around the world, each powered by PTC technology.
We use a combination of our senses to communicate, sight and hearing above all. But for someone who is deafblind, relying on those two senses isn’t an option. In this episode, we meet Tatum Robotics, a Boston-based startup who are changing the lives of the deafblind community. They’ve developed the first collaborative, cloud-based robotic hand that can be a lifeline to those who cannot hear or see. This community relies on the remarkable skill of tactile signing, which normally requires a human signer to be present for any communication to take place. But their tabletop robotic hand can translate any virtual speech or text into tactile sign language without a signer needing to be physically present.
My name is Samantha Johnson, I’m the founder of Tatum Robotics. Right now we are located at MassRobotics, which is a robotics space based in Seaport, Boston, and home to about 70 robotics startups in the area. We have a team of about five. So we have a couple of co-ops here, we have some full-time folks, and then about once or twice a week we have deafblind folks coming in and out to do testing and validation. We also travel; we go to New York, Florida, DC to really get user feedback. Usually, when we go to testing, we essentially set up sitting just like we are, sort of kitty-corner with the robot in the middle. And I try to describe it and let them feel it. That’s always the first step: they run their hands down the front, feel the buttons, understand what’s in front of them. We were actually just in DC recently, and I tend to start by doing sample sentences, like, “The cat walks through the door,” simple sentences so they can get an idea of the signing speed they need, the grammar patterns they use, things like that. And for whatever reason, with this woman I was working with, I randomly put in “smelly cat”, just trying to think of a short expression. And this woman started telling me about how she was born blind in one eye, and deaf. And she got a surgery and recently lost vision in her other eye. So now she’s completely deaf and completely blind, learning tactile signing. And I did “smelly cat”. And she basically started to cry. She was sitting there talking about how her favourite show used to be Friends, and growing up she could hear it and she could read the captions. And it was so exciting for her that she could get that moment back. So we started doing Friends quotes and doing all those things together. And it’s an experience we have frequently: these people are just so deprived of entertainment and of these technologies that you really see how much of an impact this could have on them going forward.
The hardest part, and also the most exciting part, is that there are all of these things that need to fit together in a profile that is predefined. So we know that the hand itself needs to be the size and shape of a human hand, which limits our hardware but also informs our software development in terms of how we’re building out our trajectories and what needs to be integrated. It’s a fun group, because we have a very interdisciplinary team, given the hardware and software interplay that we have here. So we have software folks who are remote, helping develop the linguistics algorithms and the client for the robot itself, and then here in person we have all of our hardware folks developing the anthropomorphic robotics as well.
What’s been fun for me is I started this as my master’s thesis. I started this because I was volunteering in the deafblind community during COVID, when social distancing pretty much prevented deafblind people from accessing communication. So I know how to sign, but as I started building on this project, the scope just kept increasing. I started building a hand, and then I realised there was this whole linguistics element, and brought on Nicole shortly after for that. And then I realised, “Oh, we need to allow them a way to interact back,” so we brought in this computer vision aspect. So it’s been a fun project as it’s continued to grow and we’ve started to see these needs come out of the testing that we’re doing with deafblind folks.
Right now we’re looking at a bit of a deconstructed hand. You can see it’s a tendon-driven system with 18 degrees of freedom, and it needs to have all of this dexterity in order to achieve the hand shapes of American Sign Language. So part of the design process was we started with five degrees of freedom, so you might just think of a traditional gripper, where each finger can move in one way. And then we started adding in additional degrees of freedom to make different hand shapes. In that way, we’re really making sure the design is as optimised as possible: we’re not adding any extra degrees of freedom, we’re not paying for additional motors, we’re really making sure that this is optimised specifically for signing. It’s also minimised for grip force. Most robotics these days has very limited degrees of freedom but really high grip force to pick things up; this is almost the opposite. We have a lot of degrees of freedom and minimal grip force, so that it can’t injure those deafblind folks. Each finger has tendons routing through it to allow it to bend in different patterns, and each tendon routes to a pulley so that the motor can spin and create those motions. It’s difficult technology, and there’s a small market. There are millions of deafblind folks, but it’s always difficult to really bring assistive technology forward. But I think now is a great time to, because I think there is a focus on accessibility where there wasn’t 10 years ago, and people are really understanding that the world is not set up for everybody right now and asking how we can bridge that gap. There have been previous projects developing fingerspelling hands or robotics for the deaf or deafblind community, mainly in academia, and nothing’s ever gone through commercialisation. So a lot of the work that we do here is really making sure that we bridge that gap. When I was doing this as my master’s thesis, I was about to graduate, and the deafblind folks we were working with through the Deafblind Contact Centre here in Boston were really encouraging me. They said, “You have to start a company with this, you need to commercialise it, because you haven’t helped us yet.” You know, having this be a couple-year thesis project doesn’t benefit anybody. So it’s really about making sure that this makes it into their homes and takes it to that next level, ensuring that it’s actually helping this community that they’ve worked so hard for, and we’ve worked so hard to make sure that it has that impact.
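As an aside for readers, here is a minimal sketch of the idea Samantha describes: a fixed set of motorised, tendon-driven degrees of freedom being commanded into fingerspelling hand shapes. It is purely illustrative; the joint names, angles and letter poses below are invented for the example and are not Tatum’s actual control code.

    # Illustrative sketch only: a fingerspelling hand modelled as a set of
    # tendon-driven joints, each wound onto a motor pulley. The joint names,
    # angles and letter poses are invented for this example.

    # Hypothetical letter poses: joint name -> flexion angle in degrees.
    LETTER_POSES = {
        "a": {"index_curl": 90, "middle_curl": 90, "ring_curl": 90, "pinky_curl": 90, "thumb_curl": 0},
        "b": {"index_curl": 0, "middle_curl": 0, "ring_curl": 0, "pinky_curl": 0, "thumb_curl": 80},
    }

    def pose_to_pulley_turns(pose, degrees_per_turn=270):
        """Convert joint angles into the pulley rotations the motors must make."""
        return {joint: round(angle / degrees_per_turn, 3) for joint, angle in pose.items()}

    def fingerspell(word):
        """Yield one set of motor commands per letter of the word."""
        for letter in word.lower():
            pose = LETTER_POSES.get(letter)
            if pose is None:
                continue  # no pose defined for this character in the sketch
            yield letter, pose_to_pulley_turns(pose)

    for letter, commands in fingerspell("ab"):
        print(letter, commands)

In the real T1 hand, each of the 18 degrees of freedom would map to a tendon and pulley in this spirit, with the trajectories tuned through the user testing described above.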
All right, so this is the hand, all in a nice box. So we have a button interface here that is designed specifically for deafblind folks, based on how they use their microwaves. If you can imagine microwaves nowadays, they’re pretty flat on the outside, so it’s hard to differentiate where the buttons are. So they use these reference buttons, which almost look like what you put on the inside of a cabinet to prevent it from slamming. Each of these buttons has a different tactile feel, a different texture, so people can differentiate what each button is doing. And each of them has a different application. So we have stories: these are stories that our linguist actually writes, and these are written in English, and then we translate them into a kind of tactile sign grammar to make sure that the language is accessible and familiar. So we have stories that Nicole writes, things that people will find interesting, so it might be about people with disabilities who are swimmers in the Olympics, or about animals, or teaching them about bees, or the water cycle. And they can request content; we write things that they might find interesting and really take our cue from them there. The next one is news. This one is the one that inspired the whole project, so really making sure that if there are important news updates or upcoming election information, they can access that. The next one is the weather. That was actually really interesting: deafblind people leave the house and they don’t know what the weather is like outside. So this is helpful in letting them know if it’s raining out, this is the temperature you can expect, the temperature will change throughout the day, things like that. And then the last one is websites. So deafblind people can go on the internet, and they can pull up any website that you and I can, but the way deafblind people interact with it is through signing. In this demo right here, we’re just sort of watching it, but deafblind folks actually hold the hand from behind. They hold right along the back with the tips of their fingers, almost hitting the thumb here. And then as it signs, they’re able to feel it bending, and they know what those different forms mean. So if we click one of the application buttons – it just signed the word “story” because I was going into the story app. If I enter the application, it will then ask me for a tag, so what filters you’re looking for, and it’s very intuitive. All the apps work the exact same way, so it’s the same walkthrough process for news. So it’s really just getting them comfortable with how to interact with the system.
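To picture the interaction Samantha describes (press a tactile button, the hand signs the name of the app, then asks for a tag or filter), here is a rough sketch of that menu flow. The app names come from the episode, but the structure, the tags, and the sign() stand-in are ours, not Tatum’s software.

    # Rough sketch of the button-driven menu flow described above.
    # sign() stands in for whatever routine actually drives the hand.

    def sign(text):
        print(f"[hand fingerspells] {text}")

    APPS = {
        "stories": ["animals", "olympics", "science"],
        "news": ["headlines", "elections"],
        "weather": ["today", "this week"],
        "websites": ["favourites", "search"],
    }

    def open_app(app_name, choose_tag):
        """Sign the app name, then ask the user to choose a tag or filter."""
        sign(app_name)                      # confirm which button was pressed
        tags = APPS[app_name]
        sign("choose: " + ", ".join(tags))  # every app follows the same walkthrough
        return choose_tag(tags)

    # Example: the user presses the "weather" button and picks the first tag.
    tag = open_app("weather", choose_tag=lambda tags: tags[0])
    sign(f"showing {tag} weather")

The point of the sketch is simply that every app follows the same walkthrough, which is what makes the system quick for new users to pick up.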
In this space right now, deafblind communication options basically come down to Braille tablets and human interpreters. Braille tablets are a way to access a written or spoken language; Braille is a medium of English, for example, here in the US, and there are a lot of technologies in that Braille space. However, only about 10% of deafblind people know Braille, either because they went blind later in life or, if they were born deafblind, because they also have cognitive disabilities. As a result, the large majority rely on human interpreters, which can be very expensive and have very long lead times. And although interpreters are extremely skilled and can really create that communication access, you can’t have an interpreter with you all day, every day. And there are deafblind folks who live alone, or in group homes, who need that access to what the weather is outside and what news is happening – again, especially in times of pandemic, the news is essential to move forward. So there isn’t a lot of technology in this space if you can’t access Braille, which is why this is really filling that gap right now: allowing people to use their primary conversational language, which is tactile sign, but in an independent way. We’re starting in the US, really making sure that we can validate the technology easily, because obviously we’re here in Boston and we can test it with local folks. But we do have collaborators, as I mentioned: the Canadian National Institute for the Blind, and we have been collaborating with Sense International India, to understand whether the hardware we’re developing works for other sign languages. There are about 300 sign languages. And that’s really the goal of what we’re making: a flexible platform where the hand has as many degrees of freedom as needed to make robust hand shapes, and not just for American Sign Language. That’s something we looked at very early, looking at, for example, French signs and making sure that this can still make those hand shapes, and understanding, for example, that Indian Sign Language is a two-handed sign language. So how do deafblind people receive that? And how would they use our technology? We’re asking these questions early on to create a platform that is easily integrated with other languages.
My name is Nicole Rich; I am the lead linguist at Tatum Robotics. I also wear a couple of other hats: I do documentation, I do some basic teaching of ASL for the engineers here, and a couple of other tasks. My favourite days are definitely the ones where we have our deafblind folks come in, getting to actually watch them interact with the technology and interacting with them personally. Getting to know them on a personal level has been so rewarding. I write the content for what the robot is going to be spelling. So that’s a tricky spot, where I have to decide what people want to learn about, what they don’t already know about, what they can learn, and what’s repetitive, so it’s hard for one person to do all that. But I did a lot of research with the deafblind community here in Boston specifically: what do they already know? What do they learn in school? What do they not learn in school? What do they know about celebrities that we take for granted? What do we see online that they never see, because they have to be really intentional about what information they gather? I think the biggest misunderstanding is that people think that tactile sign is simple. People often think that ASL is just manually coded English. That couldn’t be further from the truth. The fact that I’ve been working on ASL translation for two years, I think, proves that: if it were simple, it would be done by now. And even assumptions about visual sign don’t carry over to tactile sign in the same way. So you really have to have personal experience with the people who use the language as a primary language to understand it at all. A lot of people really think it’s as simple as: if you know English, then you can just move your hands around and you can get to the sign. Or that all signs look like the concept they represent in English, like the sign for tree looking like what an actual tree looks like – raising your hand to mimic the trunk and stuff like that. And that’s true for some signs, but others don’t work like that at all. So it’s definitely a complicated language, not just a code. And I think that one of the biggest misunderstandings we encounter with non-signers is that they assume it must be simple.
One of the biggest hurdles we constantly face is showing the importance of bringing communication to the deafblind. I think a lot of people see the deafblind community and might think there are three to five of them, but there are millions of these people who are completely isolated. So often we have to show that there’s this need, and also that we’re solving the need correctly. People often think, “Can’t they use Morse code or something?” And I don’t know Morse code, you don’t know Morse code, so deafblind people definitely don’t know Morse code either. So it’s about showing them that it’s important we support this community in a way that is accessible to them, and showing them that the way we build this technology will allow for easier pickup, so they can use it easily. They don’t need hundreds of hours of training to learn a language they don’t know. And that has really been the hurdle we’ve been facing: just, again, that whole awareness that this is an important issue and that we’re tackling it in a way that is not ableist. We’re really hoping to preserve the identity of the deafblind community and signing culture, but in a way that’s accessible for them.
That was Samantha Johnson from Tatum Robotics. This company is at the forefront of creating effective at-home communication tools to ensure that the deafblind community can have access to remote health care. Now it’s time to meet our expert, Jon Hirschtick from PTC. Jon, Tatum Robotics follows an agile development approach, and that has included being a participant in PTC’s Onshape startup programme. We’ve spoken about Onshape before, but can you give our listeners an idea of what the startup programme is and how Tatum Robotics has benefited from it?
The team at Tatum Robotics drive innovation by embracing agile product development, using Onshape. Tatum Robotics uses Onshape for secure design sharing and collaboration, accelerating the development process and ensuring control over access permissions. Tatum can now easily share a design with someone and then withdraw access just as quickly, which enables quicker and more secure design feedback from their whole community. Onshape’s cloud-native architecture is the perfect solution for hardware startups looking to be agile and innovate quickly. Founders and entrepreneurs can take advantage of our incredible Onshape startup programme, where we’re providing qualifying startups with free access to Onshape Professional licences and enhanced, focused technical support. To apply, go to onshape.pro/startup and see if your startup is eligible for free Onshape Professional licences for a year.