With all the attention given to urban applications of machine vision — from facial recognition systems to autonomous vehicles — it’s easy to forget about machines that listen to the city. Google scientist Dan Ellis has called machine listening a “poor second” to machine vision; there is far less research dedicated to machine listening, and it is frequently reduced to speech recognition.7 Yet we can learn a lot about urban processes and epistemologies by studying how machines listen to cities — or, rather, how humans use machines to listen to cities. Through a history of instrumented listening, we can access the city’s “algorhythms,” a term coined by Shintaro Miyazaki to describe the “lively, rhythmical, performative, tactile and physical” aspects of digital culture, where symbolic and physical structures are combined. The algorhythm, Miyazaki says, oscillates “between codes and real world processes of matter.”8 The mechanical operations of a transit system, the social life of a public library, the overload of hospital emergency rooms: all can be intoned through algorhythmic analysis.