Why sounds and smells are as vital to cities as the sights
The growing field of sensory urbanism is changing the way we assess neighborhoods and projects.
When David Howes thinks of his home city of Montreal, he thinks of the harmonious tones of carillon bells and the smell of bagels being cooked over wood fires. But when he stopped in at his local tourism office to ask where they recommend that visitors go to smell, taste, and listen to the city, he just received blank stares.
“They only know about things to see, not about the city’s other sensory attractions, its soundmarks and smellmarks,” says Howes, the author of the forthcoming book The Sensory Studies Manifesto and director of Concordia University’s Centre for Sensory Studies, a hub for the growing field often referred to as “sensory urbanism.”
Around the world, researchers like Howes are investigating how nonvisual information defines the character of a city and affects its livability. Using methods ranging from low-tech sound walks and smell maps to data scraping, wearables, and virtual reality, they’re fighting what they see as a limiting visual bias in urban planning.
“Just being able to close your eyes for 10 minutes gives you a totally different feeling about a place,” says Oğuz Öner, an academic and musician.
Öner has spent years organizing sound walks in Istanbul in which blindfolded participants describe what they hear at different spots. His research has identified locations where vegetation could be planted to dampen traffic noise, or where a wave organ could be constructed to amplify the soothing sounds of the sea, sounds he was surprised to find that people could hardly hear, even along the waterfront.
Local officials have expressed interest in his findings, Öner says, but have not yet incorporated them into urban plans. In Berlin, however, this kind of individual feedback about the sensory environment is already being put to use: quiet areas identified by citizens through a free mobile app have been included in the city’s latest noise action plan, and under EU law the city is now obligated to protect those spaces against an increase in noise.
“The way quiet areas are identified is usually very top-down, either based on land use or on high-level parameters like distance from highways,” explains Francesco Aletta, a research associate at University College London. “This is the first example I’m aware of where something perception-driven has become policy.”
As a member of the EU-funded Soundscape Indices project, Aletta is helping create models that predict how people will respond to various acoustic environments: the project is compiling recorded soundscapes, both vibrant and tranquil, into a database and then testing the neural and physiological reactions they elicit. Experts say tools like these are needed to create a practical framework for ensuring that multisensory elements make it into cities’ design criteria and planning processes.
The best way to determine how people react to different sensory environments is a subject of some debate within the field. Howes and his colleagues are taking a more ethnographic approach, using observation and interviews to develop a set of best practices for good sensory design in public spaces. Other researchers are going more high-tech, using wearables to track biometric data like heart-rate variability as a proxy for emotional responses to different sensory experiences. The EU-funded GoGreenRoutes project is looking to that approach as it studies how nature can be integrated into urban spaces in a way that improves both human and environmental health.
“We’re creating a lexicon of elements and how they work in combination to create a complete experience of a space,” says Daniele Quercia of Nokia Bell Labs Cambridge and the Centre for Urban Science and Progress at King's College London, one of the researchers working on the project. Quercia previously helped develop “Chatty Maps” and “Smelly Maps” of city sounds and odors by scraping data from social media. The latter project found strong correlations between people’s olfactory perceptions and more conventional air-quality indicators. With GoGreenRoutes, he’ll be using wearable technologies to assess whether design improvements to new and existing green spaces have the predicted (and desired) impact on people’s well-being.
At Deakin University in Australia, architecture professor Beau Beza is aiming for full immersion. His team is adding sounds—and, eventually, smells and textures—to virtual-reality environments that city officials can use to present planning projects to stakeholders. “It’s difficult for many people to visualize a streetscape, park, or square from static depictions on paper,” says Beza. “Being able to ‘walk’ through and hear how it sounds increases understanding.”
As data collection about people’s sensory experiences becomes more widespread, many of these experts caution that concerns about privacy and surveillance must be taken into account. Issues of equity and inclusion also come into play in determining whose sensory experiences are factored into planning. Underprivileged urban communities have typically borne the brunt of noise and odor pollution from highways and factories, yet they are also often the targets of noise complaints when their neighborhoods gentrify.
“Sensory perceptions are not neutral, or simply biological; whether we find something pleasant or not has been shaped culturally and socially,” says Monica Montserrat Degen, an urban cultural sociologist at Brunel University London. Civic planners in both London and Barcelona are drawing on her research into how public spaces are perceived and how what she calls “sensory hierarchies” include or exclude different groups of people.
Degen cites the example of a London neighborhood where inexpensive eateries that served as hangouts for local youth were displaced by trendy cafes. “It used to smell like fried chicken,” she says, but newer residents found that aroma off-putting rather than welcoming. “Now it smells like cappuccinos.”
Jennifer Hattam is a freelance journalist based in Istanbul, Turkey.
This story has been updated to include an additional affiliation for Daniele Quercia.