Host: Benjamin Thompson
Welcome back to the Nature Podcast. This week, a microphone made out of fabric.
Host: Shamini Bundell
And how machine learning can help in a crisis. I’m Shamini Bundell.
Host: Benjamin Thompson
And I’m Benjamin Thompson.
[Jingle]
[Heartbeat]
Interviewer: Benjamin Thompson
This is a person’s heartbeat, and it was recorded by their vest, like by their actual vest that they were wearing, using a microphone made out of fabric. Fabrics keep us warm or cool, they protect us from the elements, even from bullets, but the world of thread technology is advancing rapidly, and researchers are always looking for the next big thing.
Interviewee: Yoel Fink
In recent years, we’ve been trying to see if we could reimagine what fabrics could do and really try to open up some interesting opportunities that mostly stem from the fact that fabrics cover some of the most potentially valuable real estate on the planet, which is the surface of our bodies.
Interviewer: Benjamin Thompson
This is Yoel Fink from MIT in the US who, along with his colleagues, developed this new fabric microphone. Now, fabrics are no stranger to recording environments. Anyone who has recorded a podcast from home will know that draping heavy blankets around you can help your recordings sound a lot less echoey, and that’s because they’re so good at absorbing sound, which gave Yoel an idea.
Interviewee: Yoel Fink
So, traditionally, fabrics really have been used to absorb sound and convert the energy associated with acoustic waves into vibrations and into heat. So, the question we set out to answer is whether fabrics could do something very different, which is take those acoustic waves associated especially with audible speech, and convert them into meaningful and detectable electrical signals. Now, when we talk, the acoustic waves that we are exchanging create pressure variations that are very, very small – on the order of one tenth of a millionth of an atmosphere. The amazing thing is that our ears are able to take those pressure waves and convert them into electrical signals.
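As a quick sanity check on the pressure figure Yoel quotes, one tenth of a millionth of an atmosphere can be converted into pascals and into a sound pressure level in decibels. This sketch assumes only the standard definitions (1 atm = 101,325 Pa; 20 µPa as the reference pressure for SPL in air); the result lands in the range of ordinary speech.

```python
import math

# One tenth of a millionth of an atmosphere, in pascals.
atm_pa = 101_325            # 1 standard atmosphere in Pa
p = 1e-7 * atm_pa           # ≈ 0.0101 Pa

# Express that as a sound pressure level relative to the
# standard 20 µPa reference for airborne sound.
p_ref = 20e-6
spl_db = 20 * math.log10(p / p_ref)

print(f"{p:.4f} Pa, about {spl_db:.0f} dB SPL")
```

That works out to roughly 54 dB SPL, which is indeed the ballpark of audible conversational speech, consistent with the pressures the fabric ear has to detect.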
Interviewer: Benjamin Thompson
And Yoel created a two-dimensional fabric ‘ear’ – a piece of fabric that could essentially do the same job, turning pressure waves into mechanical vibrations that could then be converted into electrical signals. He built his ear from a piezoelectric material – something that can convert mechanical stress, pressure, say, into an electrical pulse. At its core was a piezoelectric fibre that the team developed – a composite with a flexible outer layer concentrating the stresses on an inner piezoelectric layer. This maximises the electrical output. It is a complex structure, and getting all the different layers in the right place in a fibre thin enough to be woven into a fabric wasn’t exactly straightforward, so Yoel built this complex structure on a large scale and gave it a stretch.
Interviewee: Yoel Fink
We start off with a scaled-up model of the fibre. This has in it all of the materials that the fibre will have, in the same arrangement that is necessary in the fibre. And when it gets pulled out, all of these materials end up flowing next to each other while keeping their relative positions and shrinking down in dimensions, and that is essentially how we produce the fibre.
Interviewer: Benjamin Thompson
The final fibre was close to a millimetre in width, and it was flexible enough to be wound around a finger. The team wove a single 7-centimetre strand of it into a shirt, placed that shirt onto a mannequin, and hooked it up to their equipment. Then one of Yoel’s colleagues said this to it.
[The acoustic fabric records audible sounds]
Interviewer: Benjamin Thompson
So that’s them talking at the shirt, and this is what the fabric microphone picked up.
[The acoustic fabric records audible sounds]
Interviewer: Benjamin Thompson
Once more, and bear in mind this is being recorded by a shirt containing a single strand of acoustic fibre.
[The acoustic fabric records audible sounds]
Interviewer: Benjamin Thompson
So, that’s one fibre. The team also stitched two fibres into another shirt, and showed that this double-microphone setup could be used to identify the direction that a sound came from. But they didn’t just listen for external sounds. In another experiment, they also wove the acoustic fibre into a vest that could pick up the wearer’s heartbeat.
[Heartbeat]
Interviewer: Benjamin Thompson
Yoel thinks that a wearable fabric mic that sits directly on the body opens up a wealth of potential applications, from helping people with hearing aids to focus their listening on a specific speaker in a noisy room to providing long-term, comfortable monitoring of heart or respiratory function, even monitoring a baby in the womb. Now, Yoel’s fibre isn’t the first flexible piezoelectric material capable of responding to sound, but others have struggled with sensitivity, and this is what Wenhui Song, a biomaterials engineer at University College London here in the UK, was intrigued by.
Interviewee: Wenhui Song
I’m impressed with the combined engineering design and the process to make use of all the different types of the functional materials into one piece of flexible sensor. I would say they have a very inventive design of the structure of the sensor to fabricate such an interesting high-performance, flexible fibre.
Interviewer: Benjamin Thompson
Wenhui also points out that the way the fibre was made – by stretching – is actually a fairly standard way of producing long strands of material, which should aid future industrial scale-up. But she notes that a controlled lab setup is very different from the real world, and you need more than just microphones to make a truly portable, wearable device.
Interviewee: Wenhui Song
You can see the settings they have, all the signal processors, all the computers. They have all the existing components to make the system work.
Interviewer: Benjamin Thompson
It’s true right now that the fabric microphones need to be hooked up to other equipment to process, store and transmit the sounds. And to make that portable, well, you need a lot more components.
Interviewee: Wenhui Song
So, flexible polymer transistors, capacitors, those things need to catch up, not only for sensing, but also for signal processing and data storage.
Interviewer: Benjamin Thompson
But Wenhui doesn’t think that this is an insurmountable barrier.
Interviewee: Wenhui Song
Of course, in the future, we can look at big data, cloud storage, Wi-Fi communication, and that might save those components and reduce the need for this kind of integration.
Interviewer: Benjamin Thompson
Indeed, Yoel is looking to find ways for his fibres to hand off the signals they pick up, using wireless components to negate the need to attach lots of extra electronics to his fabrics. But he also wants to build functions like this into the fibres themselves, and he has lofty ambitions.
Interviewee: Yoel Fink
So, we have in fact figured out some very, very small, miniaturised electronics. They don’t perform all the analysis, but they broadcast. In separate work, we actually have shown that fibres could store signals in the memory, and we could actually store computer programmes within the fibre, so a fibre with a memory. So, in the not too distant future, I think you’re going to hear about a fibre or a fabric computer, and all of these different elements will come together within a single fabric.
Interviewer: Benjamin Thompson
That was Yoel Fink. We’ll put a link to his paper in the show notes. You also heard from Wenhui Song, who has written a News and Views article about the research, and there’ll be a link to that as well.
Host: Shamini Bundell
Coming up, we’ll be hearing how machine learning was employed in Togo to help target financial aid. Right now, it’s the Research Highlights with Dan Fox.
[Jingle]
Dan Fox
If you’re trying to sneak up on a sleeping shark, you should keep an eye on its posture and not on whether its eyes are closed. Researchers filmed seven draughtsboard sharks over 24 hours while measuring the oxygen levels in the sharks’ tank. The faster the fish absorbed oxygen from the water, the higher their metabolic rate. The team found that when sharks stopped swimming for more than five minutes, their metabolic rates dropped, suggesting they were sleeping. Sleeping sharks typically had a ‘flat posture’, lying flat on the bottom of the tank, whereas sharks that were only resting propped themselves up on their front fins. Closed eyes were not a good indicator of whether a shark was asleep. The researchers say that shark sleep probably functions to conserve energy. And as sharks are the earliest group of jawed vertebrates, dating back hundreds of millions of years, they may provide insight for understanding the evolutionary origins of sleep. Don’t sleep on that research. Read it in full in Biology Letters.
[Jingle]
Dan Fox
Dust from windswept deserts and plains plays a dominant role in the formation of thin, wispy cirrus clouds around the world. Existing data suggests that winds carry 1-4 billion tonnes of mineral-dust particles from arid regions into the atmosphere each year. Some of these particles help to generate delicate, high-altitude clouds called cirrus clouds, but the importance of dust to cirrus formation has been unclear, contributing to uncertainty about clouds’ role in climate change. To trace the movement and impact of dust, researchers combined detailed dust measurements, collected during a global airborne campaign, with atmospheric modelling. They found that the Sahara Desert is the largest source of dust overall, but Saharan dust often fails to make it to the upper atmosphere. Instead, in the Northern Hemisphere, the upper atmosphere tends to be dominated by dust from central Asia, whereas in the Southern Hemisphere, dust from deserts in Australia, South Africa and South America dominates. Model simulations suggest that dust is the main contributor to cirrus-cloud formation in the north and initiates some cirrus formation in the south as well. Read that research in full in Nature Geoscience.
[Jingle]
Host: Shamini Bundell
Next up, reporter Nick Petrić Howe has been finding out how machine learning helped the government of Togo when the pandemic hit.
Interviewer: Nick Petrić Howe
Early 2020 is probably as cemented in your mind as it is in mine. The world was, and in many ways still is, facing a crisis. A novel virus was spreading at an alarming rate and countries around the world were implementing public health strategies to avoid the worst possible impacts. For me, this resulted in being sent to work from home, not seeing loved ones and, frankly, watching too much Netflix. But the majority of the world was not as lucky as me. The public health strategies had significant economic impacts, and these were especially acute in low- and middle-income countries. Togo, a West African nation of around 8.5 million people, was just one of many countries facing a crisis. Here, citizens were facing the prospect of hunger and malnutrition as the pandemic closed down the world. The Togolese government wanted to do as much as it could to help the most vulnerable. Like many nations, it settled on sending payments to people to help them pay for essentials like food. Unfortunately, resources were limited. The UN classifies Togo as one of the world’s least developed countries, so the government needed a very targeted approach. Wealthy nations like the US have in-depth statistics about their citizens and can roll such schemes out based on income-tax records. This wasn’t an option in Togo, where, normally, the government would organise a national door-to-door survey to find out about citizens’ financial situations. But in the midst of a coronavirus pandemic, that sort of approach wasn’t feasible either. To maximise their ability to provide financial aid to the people most in need, government officials from Togo reached out to Josh Blumenstock, a computer scientist based at the University of California, Berkeley, who had previous experience of using machine learning to identify people in poverty.
Interviewee: Josh Blumenstock
So, I do research that's really at the intersection of development economics and computer science, thinking about how methods from computer science and, in particular, machine learning can be applied to questions of global poverty and inequality.
Interviewer: Nick Petrić Howe
Before Josh’s involvement, the Togolese government had come up with a plan to target the distribution of financial aid to people based on their occupation. They had recent data on this due to an election, and so opted to send aid to people in informal occupations, under the assumption that those people would be the most in need of financial assistance. In the first instance, they wanted Josh to analyse their approach to see how well it was working.
Interviewee: Josh Blumenstock
And our analysis suggested that the programme was working pretty well in urban areas. But a major focus of the government's efforts in late 2020 was expanding this programme to rural areas. And this occupation-based approach to targeting really didn't make sense in rural areas.
Interviewer: Nick Petrić Howe
The issue in rural areas was that most people would be classified as being in informal occupations, so there would be no selectivity in these regions. The Togolese government really needed to know who needed the aid the most, as they could not afford assistance for everyone. So, again, at the request of the government of Togo, Josh and his colleagues set out to find a solution that would work in these rural areas, and they thought machine learning could help. Initially, they fed satellite data into a machine-learning algorithm to try and identify villages whose populations were most likely in need of help. But this approach was too broad. Really, the government needed to find the people most in need within the villages that were also most in need. So, Josh and his colleagues turned to mobile phone data. They thought they could use this to differentiate between wealthier and less wealthy people, as this has been shown to be the case in other parts of the world.
Interviewee: Josh Blumenstock
It's also very true in a place like Togo, where wealthier people have different international calling networks and different patterns of SMS activity, and so on and so forth. And so, again, the question then is, can you teach a machine-learning algorithm to recognise what those hallmark signals are of wealth versus poverty in the phone data? And it turns out that these algorithms are relatively accurate.
Interviewer: Nick Petrić Howe
Now I know what you’re thinking. Mobile phone data was being used to find out about their wealth – something that many people consider private. But Josh and the team worked to ensure privacy.
Interviewee: Josh Blumenstock
We were very careful about building both technical and institutional safeguards to help ensure that data privacy was really protected as much as possible. So, we had a lot of sort of technical data security mechanisms in place. Another thing we did is that we set things up to make sure that no party had access to more data than they absolutely needed. So, for instance, and very much by design, the government of Togo never accessed any of the data from the mobile phone companies. What happened is our research team worked with the mobile phone companies to produce a white list of beneficiaries, and then the government essentially took that list and paid all of those people.
Interviewer: Nick Petrić Howe
So, there were safeguards put in place, and the machine-learning algorithm could pretty accurately pick people out who were most in need of the payments. But a key question is, is this better than the alternatives? Was this phone approach better than the original plan of the government to send money based on people’s occupations? Or was it better than the idea to send it out to all of the villages that were identified as having the lowest incomes?
Interviewee: Josh Blumenstock
And so, across these three approaches, nothing is perfect. And the way you measure accuracy and performance in this context is by looking at errors of exclusion and errors of inclusion. And, in particular, we're concerned about how many true poor people are not getting paid because the targeting mechanism is not able to correctly identify them. And so, when we look at it we see that the machine-learning and phone-data-based approach substantially outperforms the other two, actually reducing errors of exclusion by up to 50%.
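The two error types Josh describes can be sketched with a few lines of code. This is an illustrative toy, not the study's actual evaluation pipeline: the function name, the IDs and the numbers are all made up, but the definitions follow what he outlines, with exclusion errors measured against the truly poor population and inclusion errors against the set of people paid.

```python
def targeting_errors(truly_poor, selected):
    """Return (exclusion_rate, inclusion_rate) for a targeting scheme.

    truly_poor -- set of IDs of people who genuinely need aid
    selected   -- set of IDs the targeting mechanism chose to pay
    """
    excluded = truly_poor - selected          # poor, but not paid
    included_wrongly = selected - truly_poor  # paid, but not poor
    exclusion_rate = len(excluded) / len(truly_poor)
    inclusion_rate = len(included_wrongly) / len(selected)
    return exclusion_rate, inclusion_rate

# Toy example: four truly poor people, four payments made,
# of which only two reach the right recipients.
poor = {"a", "b", "c", "d"}
paid = {"a", "b", "e", "f"}
print(targeting_errors(poor, paid))  # (0.5, 0.5)
```

Comparing these two rates across the occupation-based, village-based and phone-based approaches is how the relative performance of the three targeting mechanisms was judged.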
Interviewer: Nick Petrić Howe
So, the machine-learning approach using phone data was going to miss fewer people than the other approaches available to the Togolese government, and so they rolled out the scheme. Across six months in 2020, about US$10 million in aid was distributed to about 150,000 people in rural Togo. This particular targeted algorithm made sure aid was sent to those in need, and it has inspired interest in using machine-learning technology in other assistance programmes run by the government. Josh had also analysed this phone approach in comparison with other solutions that were not available to the government of Togo at the time. For example, compared with a conventional survey approach, the machine-learning method actually performed worse. But at the time, a survey wasn’t really an option.
Interviewee: Josh Blumenstock
So, I think the lesson here is that, if it's not a crisis or if the government has access to a national survey, then they should probably rely on that data. But in the context of a crisis, where the other options don't exist, this phone- and machine-learning based approach can work pretty well.
Host: Shamini Bundell
That was Josh Blumenstock, from the University of California, Berkeley. To find out more about this study, there'll be a link to the paper in the show notes.
Host: Benjamin Thompson
Finally on the show, it’s time for the Briefing chat, which is the part of the show where we discuss a couple of science stories that have been featured in the Nature Briefing. Shamini, what have you brought for us to talk about this time?
Host: Shamini Bundell
Well, I’ve actually got some more machine-learning news for you. It’s researchers who are using artificial intelligence to try and find meteorites that have fallen from space.
Host: Benjamin Thompson
My goodness. I mean, why is machine learning being used for this? Is it hard to find meteorites that fall from space?
Host: Shamini Bundell
Yeah, so the first thing you’ve got to do if you want to find a meteorite is to figure out when fireballs come burning through our atmosphere. And there are various networks. So, this particular story from Physics World is actually in Australia, and they have the Australian Desert Fireball Network. So, they’ve got cameras. It’s all about working out roughly where meteorites might have hit the ground. But then what you’ve got is tiny little pieces of space rock scattered over a still potentially huge area. So, yeah, it is really tricky, and researchers really want to get to these little fragments as soon as possible, before the rock gets contaminated, so that they can study it.
Host: Benjamin Thompson
And when you say ‘small’, how small are we talking? And how does machine learning kind of fit into the overall picture?
Host: Shamini Bundell
The sizes vary. We’re usually talking about things that could be seen by the human eye, right? For example, there were some pieces of meteorite found in the UK last year, actually, and that involved, once they’d kind of worked out where it was, people walking over a field, evenly spaced 2 metres apart, sort of staring down at the ground until they found a little chunk of it. But, especially if you’re talking about the vast Australian wilderness, it’s quite hard to cover that much ground. It’s obviously a lot of resources to get people out there. So, what these folks have done is try to use drones with cameras on and try to train a machine-learning algorithm on what meteorites in this desert environment might look like so that, if they have a rough area, they can send the drones out first.
Host: Benjamin Thompson
Right, and, well, did it work?
Host: Shamini Bundell
Yeah, well, so this is the first time they’ve actually recovered a bit of meteorite with this sort of drone-assisted, machine-learning-assisted technique. So, they had a couple of drones that were quite high up, looking over quite a big area, and they had shown the drones what a meteorite looked like on that particular background, and the algorithms were basically looking for things that were out of the ordinary. They then whittled that down to a few options of things that could be meteorites, and then they sent another drone, flying even lower down, to get a closer look. That still left four objects that could be meteorite fragments and then, finally, they had to go out there on foot, and they found one 70-gram meteorite, which they were absolutely ecstatic about because, if it hadn’t been for this sort of system, they probably wouldn’t have even bothered going out for this particular meteorite.
Host: Benjamin Thompson
Wow, okay, so it’s worked in this case then. What do you think this means for the field of searching for pieces of meteorite? Does this mean less folk going out, just walking through fields, looking for bits that have landed?
Host: Shamini Bundell
Well, hopefully it will mean being able to find more meteorites. A lot of these objects have come from the asteroid belt, and researchers are really keen to, like I said, get them really quickly once they’ve landed, and find out more about the history of the Solar System and the sort of makeup of that asteroid belt. Now, I should mention that this is in the desert in Western Australia. It’s not necessarily going to work everywhere. The UK example that I mentioned, they were looking in a field full of sheep poo for tiny, shiny black rocks. That might be a harder challenge for a machine-learning algorithm than the sort of orange desert with notable dark rocks in it. But, like I said, this is just the first time it’s worked, so I think a lot of people are going to be keen to put that to use and hopefully find lots more meteorites. So, what’s your story for this week, Ben?
Host: Benjamin Thompson
Well, this is a story that I read about in Nature, and it’s based on a paper published in Forensic Science International, and it looks at how radiocarbon dating was used to detect two forged paintings.
Host: Shamini Bundell
Oh, this is like art crime and drama. What paintings were these?
Host: Benjamin Thompson
Well, this story starts with a discovery by French investigators of a trove of like 600-odd paintings uncovered in a restorer’s workshop. Now, dozens of these were apparently potentially mid-level masterpieces, but experts were questioning their authenticity because the paints seemed relatively fresh. And so, to find out if they were genuine, the French government called in some researchers to find out, and they selected a few, including two in particular, as part of this work, by an impressionist and pointillist artist.
Host: Shamini Bundell
And I guess the traditional techniques for these kinds of things would be looking at the brushstrokes or the type of paint, so was this a very novel way to try and detect faked paintings?
Host: Benjamin Thompson
Well, you’re absolutely right. So, researchers, yeah, typically use imaging or chemical analysis to detect forgeries, and I guess they’re sort of peering behind the brushstrokes to look at how the painting materials have aged. But they can’t necessarily nail down a painting’s date. And radiocarbon dating is apparently gaining steam in terms of forensic analysis of artwork, and advances mean that smaller and smaller amounts of material can be tested. Now, of course, if you’re testing a piece of artwork, you don’t want to have to cut off a massive piece to see if it’s the real deal or not. And if you own an auction house or a museum or what have you, I guess the last thing you want to do is potentially damage an authentic painting to check if it is indeed authentic.
Host: Shamini Bundell
So, do they actually have to get a piece of the canvas itself to date that?
Host: Benjamin Thompson
So, in this case, yeah. They took some samples from these two paintings, including a piece of fibre that was used to make the canvas, and they did some radiocarbon dating, looking at levels of carbon-14, which is an isotope that decays over time. All living things pull it out of the atmosphere, so you can work backwards to estimate how old something is. But beginning in the 1940s and increasing a lot during the 1950s, levels of carbon-14 in the atmosphere rose markedly as a result of nuclear weapons testing.
Host: Shamini Bundell
Oh, no way.
Host: Benjamin Thompson
Yeah, and so you get this curve up, starting in the mid-40s, I guess, but increasing in the 50s, up to about 1964 before it tails off again, right, so you have this kind of peak. And researchers can identify very clearly if something is post-1950 because the levels of carbon-14 are so much higher than they would be before that. And this is kind of what’s happened here. In this work, they looked at these paintings and levels of carbon-14, and they can say that these are forgeries, and the canvases were made either in the mid-1950s or after the year 2000. So, you don’t know which side of the curve it’s on – is it pre the 1964 peak or is it after – but either way, these paintings couldn’t have been made in the early twentieth century, as had been claimed.
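The two-sided ambiguity Ben describes falls straight out of the shape of the bomb curve: a single carbon-14 measurement can match the curve once on the rising side and once on the falling side. Here is a toy sketch of that idea; the curve values below are rough, invented ratios for illustration only, not real calibration data, and the function name is made up.

```python
# Invented, simplified 'bomb curve': atmospheric carbon-14 ratio
# (relative to the pre-bomb level of 1.0) by year. Real calibration
# work uses finely resolved measured curves, not a table like this.
bomb_curve = {
    1950: 1.00, 1955: 1.15, 1960: 1.50, 1964: 1.90,
    1970: 1.55, 1980: 1.25, 1990: 1.15, 2000: 1.08, 2010: 1.04,
}

def candidate_years(measured, tolerance=0.06):
    """Return every year whose curve value is close to the measurement."""
    return [year for year, ratio in sorted(bomb_curve.items())
            if abs(ratio - measured) <= tolerance]

# A single measurement matches both the rising and the falling
# side of the peak, so it yields two candidate dates:
print(candidate_years(1.15))  # [1955, 1990]
```

That is exactly the situation with the two forged canvases: the measurement pins them to one side of the 1964 peak or the other, but either way rules out an early-twentieth-century origin.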
Host: Shamini Bundell
So, they already basically thought these paintings are suspicious, and now they’ve been able to prove it. So, now that this technique is being used, I guess that makes things all the more difficult for all the art forgers out there.
Host: Benjamin Thompson
Yeah, potentially. I mean, in this article, they say that this is maybe the first time that this has been used in a police investigation. And as I said before, this has been used in various places to test whether it works. So, yeah, we might imagine that it’s going to get used in conjunction with other techniques more and more.
Host: Shamini Bundell
Wow, fascinating. I shall look forward to future episodes of science solves crime here in the Briefing Chat. Well, thank you, Ben, and listeners, if any of these stories have piqued your interest then you can get more just like them delivered to your inbox by signing up to the Nature Briefing. There’ll be a link on how to do that, along with the stories that we’ve discussed, in the show notes.
Host: Benjamin Thompson
And that’s all for this week’s show. But as always, don’t forget you can keep in touch with us, either on Twitter – we’re @NaturePodcast – or on email – podcast@nature.com. I’m Benjamin Thompson.
Host: Shamini Bundell
And I’m Shamini Bundell. Thanks for listening.