The Battle For Better Air
This article includes a "Behind-the-Scenes" interview with the author. Listen on Spotify or Apple Music.
{{divider}}
Imagine: it is 900 AD. You are in a great hall somewhere in northern Europe. In the center of the room is a vast open wood or peat fire where meat is roasting and a stew is bubbling in a large pot. The hall is warm — okay, it’s tepid — and the atmosphere is cozy, even sleep-inducing.
Oops! That warm, sleepy feeling is probably mild chronic CO poisoning, and you might have an asthma flareup from the soot. This cozy atmosphere could be deadly.
In 2014, a group of Danish researchers lived in recreations of Viking houses for 13 weeks and measured the indoor air quality. They concluded that Viking longhouses in winter were less charming and more carcinogenic than fantasy authors might have you believe: particulate matter levels averaged around 0.41 milligrams/m3, or 410 micrograms/m3 (for context, the WHO advises that annual average fine particulate exposure stay below just 5 micrograms/m3).
Another series of archaeological experiments, conducted in 2017, suggests the problem was even worse in prehistory. In a reconstructed building at Çatalhöyük in Turkey, fine particulate matter (PM2.5) levels averaged around 30,000 micrograms/m3 — the carcinogenic byproduct of burning wood, dung, and other biomass fuels.
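Put on a common scale — a quick sketch in Python, using only the concentrations reported above and treating the measurements as roughly comparable — the gap is stark:

```python
# Comparing the reconstructed dwellings' particulate levels with the
# WHO's annual PM2.5 guideline. All values come from the studies above.

longhouse_pm = 0.41 * 1000   # Viking longhouse: 0.41 mg/m3 -> micrograms/m3
catalhoyuk_pm = 30_000       # Çatalhöyük reconstruction, micrograms/m3
who_guideline = 5            # WHO annual PM2.5 guideline, micrograms/m3

print(f"Longhouse: {longhouse_pm / who_guideline:.0f}x the guideline")    # 82x
print(f"Çatalhöyük: {catalhoyuk_pm / who_guideline:.0f}x the guideline")  # 6000x
```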
Burning solid fuels has been a serious contributor to chronic illness and shortened lifespans for centuries and continues to be so today. In 2021, indoor air pollution from solid-fuel fires — much like those burning in the longhouses — contributed to about 3 million deaths worldwide (including nearly 500,000 deaths of children under five). This figure is far from acceptable, but it’s actually a staggering improvement.
It’s tempting to think that people living in pre-urban eras enjoyed healthier built environments. But in reality, crowding, solid fuel use, and limited design options meant that indoor air has been unpleasant for most of human history. In fact, just making indoor air bearable to breathe was a challenge.
Technologies from rudimentary windows to contemporary HEPA filters have helped improve air quality over time. And as these technologies have become cheaper, they have shifted from luxuries to necessities — more people can afford and benefit from them, and their use has increasingly been mandated by law.
Windowpanes, for example, were once made of imported alabaster and accessible only to the aristocracy. But by the 1850s, a minimum number of windows per dwelling was codified into law in cities like London. While this architectural change was driven, in part, by lowered costs and the reduction of taxes on glass, it was largely the result of a growing consensus that windows improved human health. Our built environments reflect many such protracted battles to stay comfortable and well. Air quality is often at the center of these struggles, and while we have come a long way, we can push these efforts even further.
Soot
Today, it is well known that smoke emitted from coal, wood, and dung fires contributes to chronic health conditions and even death. However, chimneys and fireplaces — both attempts to manage smoke, particularly the especially awful smoke of coal, which is not only smelly but now a known carcinogen — didn’t become common home fixtures until the 16th century.
Before then, dwellings were plagued by high concentrations of air pollutants from smoke. While there’s some evidence that a few wealthy Romans had chimneys, the overwhelming majority of Europeans from antiquity until the 12th century relied on holes in their roofs to reduce smoke buildup indoors. These holes did not work particularly well. Rudimentary chimneys did allow smoke to escape by exploiting the “stack effect”: warm air exits from above, causing cooler, denser air to rush in from below. This allows for some air exchange, which is an important factor in keeping indoor air from feeling and smelling bad, as well as in reducing the concentration of airborne disease particles. However, it can also draw in outdoor pollutants, allergens, and dampness.
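The underlying physics is simple enough to sketch. Below is a minimal illustration of the stack effect using the ideal gas law; the building height and temperatures are illustrative assumptions, not measurements from any historical structure:

```python
# Minimal sketch of the stack effect: warm indoor air is less dense than
# cold outdoor air, so cold air pushes in at floor level while warm air
# (and smoke) escapes above.

G = 9.81          # gravitational acceleration, m/s^2
P = 101_325       # atmospheric pressure, Pa
M = 0.029         # molar mass of air, kg/mol
R = 8.314         # gas constant, J/(mol*K)

def air_density(temp_k: float) -> float:
    """Density of air (kg/m^3) from the ideal gas law."""
    return P * M / (R * temp_k)

def stack_pressure(height_m: float, t_in_k: float, t_out_k: float) -> float:
    """Pressure difference (Pa) driving airflow up a chimney or roof hole."""
    return G * height_m * (air_density(t_out_k) - air_density(t_in_k))

# A hypothetical longhouse on a winter day: 17 C inside, 2 C outside,
# with an opening 5 m above the floor.
print(f"{stack_pressure(5.0, 290.0, 275.0):.1f} Pa of draft")  # ~3.3 Pa
```

A few pascals is a tiny pressure, which is why a taller chimney and a larger indoor-outdoor temperature difference both produce a noticeably stronger draw.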
{{signup}}
Despite the indoor pollutants they caused, wood fires remained commonplace until wood was no longer readily available as a fuel. By the early Tudor period, Britain’s forests were in shambles. Deforestation may seem like a modern concern, but even in the 16th century, reliance on wood for heating and shipbuilding kept lumber in high demand. In 1543, Henry VIII passed the Preservation of Woods Act, which further reduced the availability of firewood, pushing more people towards coal.
Coal fires do have an advantage over wood fires: they burn significantly hotter, which means that they are a bit better at heating a room. This might seem like an improvement until you consider that many houses were still largely made from materials like thatch, wattle, and daub — literal tinder — a recipe for devastating house fires. Following the Great Fire of 1666, the 1667 Act for the Rebuilding of London stipulated that new structures had to be made from brick or stone and that chimneys could not involve timber. A 1708 act stipulated that window frames needed to be set back from the street “for the better preventing of Mischiefs that may happen by Fire.” While building codes for safety purposes are mundane today, a city taking such an interest in how health played into built environments was fairly unprecedented at the time.
As with all restrictions, however, many people skirted them to their own detriment. In 1662, England began levying a tax based on the number of hearths, which were stationary and easier to count than people. Taxpayers immediately began thinking of ways to circumvent it.
In 1684, a tax-evading baker in Oxfordshire knocked a hole through the back of her oven to exploit her neighbor’s chimney. While it may have helped her avoid taxation, it also caused a fire that leveled 20 houses and killed four people. Although fire risk may seem less directly related to indoor air quality, a clear and clean chimney can help cycle more air than either a clogged one or a hole in the roof.
Beginning in the 1600s, people were becoming increasingly aware of the importance of air quality. In 1667, the English polymath Robert Hooke — who had helped build one of the first air pumps — demonstrated that it was fresh air itself, and not just the mechanical motion of respiration as previously believed, that kept mammals alive. By 1783, Lavoisier and Laplace had established that human metabolism relied on a mechanism akin to fire, consuming oxygen.
By the 18th century, most educated people knew air quality mattered to health. While staying together at an inn, John Adams and Benjamin Franklin briefly quarreled over whether to keep a window open or closed at night, basing their arguments on burgeoning theories of health and disease transmission. Adams fell asleep during Franklin’s impassioned lecture on his “theory of colds.”
Fresh Air
Before the 20th century, most efforts aimed at improving air quality in the home were focused only on making the environment more “agreeable,” rather than measurably more beneficial to health. This included the most rudimentary of ventilation tools: windows. Opening windows is a simple and effective way to regulate indoor air quality by reducing particulate matter, the buildup of viruses and CO₂, and (aesthetically) bad smells. Prior to electrification, natural light meant less reliance on expensive candles or gas (and thus slightly fewer indoor air pollutants). While windows in dwellings are legally mandatory in many developed countries, this is a luxury afforded by relatively new technologies like cheap, high-quality plate glass.
The Romans had some glass windows, but the panes were small, bumpy, and fairly opaque. For Romans, it was simply easier to cobble together windows from sheets of translucent alabaster than to make glass panes. Those who couldn’t afford alabaster windowpanes relied solely on open window holes. Unfortunately, this left them vulnerable to pests like mosquitoes, and vector-borne diseases such as malaria abounded. Childhood mortality, due in part to such diseases, meant that Roman women needed to have around 6.3 children merely to replace themselves and their husbands.
For about a millennium, most Europeans made do with either shutters or open holes for windows — a wealthy few could shell out for stained glass. By the 17th century, glass windows were becoming more commonplace.1 However, because they were relatively expensive to make and maintain, windows often served as a proxy for wealth, one that inspectors could tally from the street. After England began to tax windows in 1696, many people — as with hearths — blocked them off to avoid taxation, though their indoor air quality paid the price.
Starting in the early 1800s, larger, cheaper panes of glass made it possible to have more windows. Window taxation caused growing friction as it began to impact the general population rather than just the wealthy. Unscrupulous landowners blocked off windows to avoid paying extra. This may have been fine in a 4,000-square-foot row home, but it could be disastrous for a working-class family crammed into a single rented room.2 Industrialization and then-unprecedented levels of urban density led to a rise in airborne diseases and made better ventilation much more important.
The window tax was increasingly seen as a cruel means of “taxing light and air.” And in the 1840s, model dwellings companies (MDCs) began to advocate for “hygienic” yet profitable housing for working-class people. One such company hired the architect Henry Roberts, who, in a manner prescient for his time, argued that “the most humble abodes, whether in a town or in the country, in order to be healthy, must be dry and well-ventilated.” In 1851, thanks in large part to pressure from these MDCs, the window tax was repealed.
While outdoor air pollution was still a problem and windows were not guaranteed, for the first time in history, fresh air and light were less likely to be treated as luxuries.
Early Air Control Systems
On their own, technologies like windows and chimneys improve indoor air quality by reducing particulate pollution and disease transmission. But when used strategically and in tandem, ceiling height, double-hung windows, multiple levels of windows, and even building placement prove more effective than the sum of their parts.
Florence Nightingale, an English statistician often acclaimed as the “founder of modern nursing,” was obsessed with air. As a believer in miasma theory, she claimed that bad air was making people sick. In her 1863 book on hospital design, she dedicates around 60 pages to the importance of ventilation in hospitals. She advocated for pavilions connected by outdoor walkways, wards full of windows that allowed for cross-breezes, and building sites near the sea. “With a proper supply of windows, and a proper supply of fuel in open fire-places, fresh air is comparatively easy to secure when the patient or patients are in bed,” wrote Nightingale in her 1859 book, Notes on Nursing. “Never be afraid of open windows then. People don't catch cold in bed. This is a popular fallacy.” However, while Nightingale’s assertion that windows are key to indoor air quality sounds sensible on the crisp and windswept English coast, it seems less so in stagnant, sweltering, mosquito-ridden parts of the world.3
Between the 1882 discovery of Mycobacterium tuberculosis and the introduction of antibiotics in the early 1940s, tuberculosis haunted the public imagination in the West. Rather than blaming the spread of the disease on things like heredity, medical professionals could now point to a specific communicable microbe. Even so, the general public continued to want to treat the disease the way they had for decades: through things that felt or seemed good. One of these was exposure to fresh, invigorating air.
“Licht und Luft” (light and air) was a popular public health motto in Germany at the turn of the century, and even outside of public health institutions such as sanatoria, experts urged people to sleep with their windows open. Although this advice had less to do with advanced microbial understanding than with observation and folk wisdom, some elements of these treatments did indeed help — especially those relating to good air quality.
Sanatoria in remote locations often had design features like huge porches for chaise longues and designated locations for people to sun themselves. In urban areas, designers tried to maximize sunlight and fresh air via open patios and stepped terraces. In the suburbs, revolving “summer houses” (picture a tiny house on a lazy Susan) provided families with “curative” and bolstering sunlight. Following the introduction of the antibiotic streptomycin in 1943, tuberculosis no longer conveyed a near-certain death sentence or a lifetime of sleeping on the porch. Le Corbusier, widely considered one of the founders of modernism, imagined a built environment covered in white paint (imitating antibacterial whitewash) where “[t]here are no more dirty, dark corners. Everything is shown as it is. Then comes inner cleanliness.”4 In wealthier countries, sanatoria closed almost overnight, but certain design elements — such as wide porches and window seats — remain popular to this day.
Other technological developments not directly related to air also helped improve indoor air quality. Odd as it may sound, the loudspeaker, microphone, and amplifier played a crucial role in improving indoor air. Before voice amplification systems, people relied on acoustics to make themselves heard. In huge spaces like theatres, this often involved enclosure and the nixing of windows. At best, this led to stuffy, still air shared by too many people.5 At worst, such as in the U.K. Houses of Parliament and the U.S. Capitol, it meant hundreds of men in full suits in the heat of summer, trying not to pass out and inadvertently infecting one another with airborne diseases.
The Houses of Parliament’s most iconic architectural feature is a memento of a much stuffier and smellier world before modern HVAC systems. The clock tower that houses Big Ben originally had a huge ventilation shaft meant to take advantage of the stack effect (hot, breathed air goes up; cool, fresh air goes down), like a giant chimney. Unfortunately, the Victorian ventilation system didn’t hold a candle to a modern forced-air system, and it may have made the problem worse.
Following an influenza outbreak, a 1903 study of the House of Commons debating chamber focused on whether the ventilation system was adequate. To test the 1860s system, Dr. M.H. Gordon read Shakespeare aloud for an hour in an empty House of Commons, to an assembly of Petri dishes. One experiment was done with the ventilation off, and another with it on. The tests confirmed that the ventilation system made things worse — there were more colonies on the plates when the system was on.
Washington, DC, hotter and muggier than across the pond, was beset by even worse challenges. Those who could leave for the summer did so. Those who had to stay — including the President — resorted to extreme measures. In the summer of 1914, years before central A/C made it to the White House, President Wilson sought to cool off by moving his office into a tent next to the Rose Garden. Assembly rooms and halls proved even more oppressive. Windows and unamplified acoustics simply didn’t mix, which made successive iterations of America’s most important federal buildings absolutely unbearable. People even dubbed the House Chamber “the oven.”
In the 1850s, the Capitol was renovated to include a ventilation system featuring 14-foot rotary fans driven by subterranean steam engines. Here again, disease control didn’t drive the change — just the desire for a more tolerable temperature. The designer of this iteration bragged that at least his ventilators “didn’t kill anyone” from heatstroke. However, the system didn’t do much to lower temperatures either, and it led to the formation of an angry Committee on Ventilation in 1864. The committee hired the architect Charles Frederick Anderson to survey the system. He concluded that “great error has been committed in attempting to furnish the proper air to the Halls.” An 1870s improvement featured individually controlled vents under every seat — sadly, most served as spittoons.
Overall, then, many of these early ventilation systems proved ineffectual. Their designers didn’t fully understand how pollution impacted health, nor did they have a great model of disease transmission. Also, they lacked the tools to make enclosed spaces tolerable in high temperatures.
Everything changed with the arrival of mechanical refrigeration for living spaces.
Heat and Filtration
In 1844 in Florida, John Gorrie, a young doctor, was treating an endless stream of yellow fever patients. One of the few treatments available in Gorrie’s day was to reduce the fever by cooling the body, but ice was of limited help: a luxury good, it often melted en route from its source.6 Gorrie, who had studied physics, knew that compressing a gas raised its temperature and that allowing the gas to expand lowered it. Using this knowledge, he built a system that made ice by means of compressed gases — reducing his patients’ fevers and creating the first working mechanical refrigeration system.
While a similar cooling system was briefly used to offer palliative care to an ailing President Garfield in 1881, mechanical refrigeration didn’t catch on for cooling homes and workplaces until the 1890s. A number of systems emerged based on principles similar to Gorrie’s, but to many historians, it was Willis Carrier’s 1902 installation at a Brooklyn printing plant that introduced air conditioning as we know it today. Carrier’s air conditioners not only cooled the air but also controlled humidity and (supposedly) improved air quality by forcing air through a chamber full of mist-spraying nozzles.7
That same year, the New York Stock Exchange installed a similar system designed by Alfred Wolff. While air conditioning has some immediate, obvious health benefits, like fewer cases of heatstroke, we now know it can help with mental health, cognition, and … stock market returns. Before 1903, hot days were significantly associated with lower Dow Jones Average returns; post-AC, the correlation weakened.
For the first time in history, the built environment could withstand the tyranny of climate. People could sit inside a fully enclosed space in the summer with no risk of heatstroke. Heat-related mortality decreased in cities like New York as air conditioning became more prevalent. The air in rooms without direct window access no longer stagnated. The usable square footage in simpler, cheaper-to-build floor plans and taller skyscrapers increased, and hotter regions like the American Southwest and Southeast opened to would-be residents.
Now that we had tackled obvious problems like heatstroke and solid fuel smoke inhalation, we could work on tools to effectively address unseen killers like airborne disease and dangerous particulates.
Unfortunately, people needed prompting to view indoor air as a priority. The postwar era had seen a boom in medicine. Polio, largely transmitted via aerosols and droplets, had a vaccine by 1955; in 1961, there were only 161 reported cases. Measles (1963), mumps (1967), and rubella (1969) were all vaccine-preventable by 1970 (and targeted by a single vaccine in 1971). The infectious airborne diseases that had killed millions a generation earlier waned; the United States saw a 50 percent drop in tuberculosis deaths between 1940 and 1950. In the meantime, the public was just beginning to become aware of the harms of a daily (if not hourly) indoor habit — smoking.
On January 11, 1964 (a Saturday, to minimize negative stock market externalities), the most famous report from the Surgeon General’s office was released: the 1964 Report on Smoking and Health. While anti-tobacco sentiment had been around since the 17th century — King James I decried it in his 1604 “Counterblaste to Tobacco” — evidence regarding smoking’s risks had been building throughout the first half of the 20th century. The 1964 report was not the first government report on smoking, but it was the first to authoritatively state that smoking causes lung cancer, bronchitis, and heart disease.
At the time the report was published, about 41 percent of Americans smoked, and, with a few limitations, smoking was allowed everywhere, including indoors. While there had been warning signs of the dangers of smoking before 1964, including several large-scale studies that isolated the causal link between smoking and cancer, the 1964 report was unsurpassed in scale and rigor, drawing on 7,000 articles reviewed by more than 150 consultants from 1962 to 1964. Fortunately, and perhaps as a consequence of the report, 1964 saw the peak of American smoking, with the percentage of Americans who smoke decreasing ever since.
This reduction was largely driven by legislation and campaigns to educate people about the risks of smoking. In 1965, Congress mandated labeling on all cigarette packaging that warned consumers about the risks of smoking. In 1960, 28 percent of Americans did not think that smoking was a cause of lung cancer; by 1969, that number dropped to 11 percent.
Conveying (let alone addressing) the negative effect of smoking on air quality was more challenging. Tobacco smoke contains particularly small particles that prove difficult to filter out (even with the best mechanical filters and ventilation systems available in 2025). Thus, in the comfort of climate-controlled homes, millions of Americans who didn’t smoke themselves died from secondhand smoke in the years following 1964.
While this public health crisis unfolded, many of the air filtration solutions we rely on today to reduce indoor air pollutants already existed. HEPA filters had been used in military and industrial applications since World War II. However, putting them to work effectively in homes, schools, and workplaces presented a more significant challenge than one might expect.
A 1978 case study from the American Journal of Epidemiology helps illustrate the risks of sub-optimal filtration methods. In April 1974, an elementary school with a 97 percent vaccination rate had a measles outbreak. A single sick second-grade student unleashed 60 cases in 14 other classrooms. The outbreak was attributed to the school’s having just two central HVAC units to circulate air among all 28 classrooms in its main wing. The filters, formed of crudely pleated paper, removed only 12 to 30 percent of respirable particles.
In 2025, we’re significantly better off in terms of indoor air quality and safety than we were 50 years ago. Today, many schools in the state of California8 use filters that capture 50 percent of respirable particles or HEPA filters (which capture 99.97 percent). Beginning in the 1970s, indoor smoking bans began rolling out in the U.S. on a state-by-state and city-by-city basis. Today, 82.4 percent of Americans live in areas that mandate 100 percent smoke-free workplaces, restaurants, and bars. Between 1964 and 2012, tobacco control measures prevented about 8 million deaths in the US. Since 2007, the UK has had a near-total ban on smoking in public indoor spaces and workplaces.9 Our understanding of airborne disease and pulmonary conditions (and the tools available to prevent them) has improved yet further — indicating the need for another shift.
Regulating Air
With COVID having reawakened us to the risks of airborne diseases, we’re increasingly returning to air quality as a focal point for health. One of the most popular, effective, and easy-to-implement solutions of the past few decades is the standalone, “plug and play” mechanical air purifier built around a HEPA filter. These devices are relatively cheap (often under $300), easy to maintain, and simple to set up. What’s more, they work very well, removing up to 99.97 percent of airborne particles. Their loudness, however, is often a major drawback — up to 60 decibels, about the same volume as a conversation. This can make it hard to deploy HEPA filters in the areas where they’re needed most, such as classrooms.
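To get a feel for what such a unit does, here is a minimal sketch of the standard air-changes-per-hour calculation. The purifier’s clean air delivery rate (CADR) and the room dimensions are illustrative assumptions, not figures from this article:

```python
# How much clean air a portable purifier delivers, expressed as
# equivalent air changes per hour (ACH) for a given room.

def air_changes_per_hour(cadr_cfm: float, room_volume_ft3: float) -> float:
    """Equivalent clean-air changes per hour delivered by a purifier."""
    return cadr_cfm * 60 / room_volume_ft3

# A hypothetical mid-range purifier (CADR of 300 cubic feet per minute)
# in a 30 x 25 foot classroom with 9-foot ceilings:
room_volume = 30 * 25 * 9   # 6,750 cubic feet
print(f"{air_changes_per_hour(300, room_volume):.1f} air changes per hour")  # ~2.7
```

Post-COVID public health guidance has often targeted around five air changes per hour, so a single unit may not suffice for a full classroom — one reason the noise of running several at once matters.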
Fortunately, HVAC experts might have an answer. ASHRAE (the American Society of Heating, Refrigerating and Air-Conditioning Engineers) has issued guidance to HVAC practitioners since 1894. ASHRAE 62 (first published in 1973, most recently updated in 2022) still sets the dominant standard for indoor air quality and ventilation. In 2023, ASHRAE introduced Standard 241, which lays out requirements for reducing the risk of airborne disease transmission. One can meet Standard 241 in different ways, including through many of the aforementioned interventions — windows that open, ventilation systems (which incorporate filters and outdoor air), portable HEPA filters, or, ideally, some combination of these.10 But Standard 241 also mentions a novel and potentially revolutionary tool — far-UVC.
In-duct UV lights as part of an HVAC system are old news and have plenty of limitations. In contrast, far-UVC lights are silent, small (often resembling recessed can lights), and potentially extremely effective in the fight against airborne disease. And unlike many other kinds of UV light, far-UVC kills airborne microbes without hurting skin or eyes. Far-UVC could provide an extra layer of intervention in situations where noise matters (or an extra layer of protection in places with high concentrations of people and airborne particles).
Though promising, “it's hard to run a study where you can demonstrate that far-UV actually competes in the real world with HEPA filters or outperforms them,” admits Gavriel Kleinwaks, director of indoor air quality at 1Day Sooner. Early research points in the right direction, however. For a 2024 study, researchers installed four far-UVC lights into a room ceiling. These fixtures emitted light at a 222-nanometer wavelength (not damaging to the skin) and reduced the number of viral particles in the air by 99.8 percent. This is just one study, however. Not only is more research needed on the efficacy of far-UVC, but the newness of the technology also means that it remains expensive to install: on average, a far-UVC fixture currently costs around $1,000 (and many intended installation environments would need several).
Another specter haunts air-quality-improving devices: unwanted ozone. In the 2000s, the silent, filterless, and sleek-looking Sharper Image Ionic Breeze ionizing air purifiers were best sellers despite their $450 price tag (around $690 today). But in 2005, Consumer Reports discovered that they emitted dangerous levels of ozone, which can aggravate conditions like asthma. Far-UVC also emits a (trivial amount of) ozone, perhaps adding to the anxiety around a new intervention in our arsenal. Fiscal decision-makers might worry that far-UVC will instigate a repeat of the Ionic Breeze debacle: an expensive investment that, at best, does nothing and, at worst, causes harm.
Experts and the public don’t necessarily share that worry. During the 2024 elections, voters in Berkeley, CA, voted on a proposed measure (HH) that would have codified many parts of ASHRAE 241 into regulatory standards for city-owned buildings. This appears to have been the first legislation of its kind in the United States. HH had one notable deviation from ASHRAE 241 — it stipulated that far-UVC or any “technologies that emit ozone” could not be used,11 whereas ASHRAE 241 makes no such exclusion. While Measure HH failed at the polls, it did not lose by a very large margin (56.55 percent no, 43.45 percent yes) — a nontrivial percentage of voters in a large, progressive American city thought indoor air quality should be taken seriously.
The adoption of air-quality-improving technologies continues to depend on cost, just as it did for windows, hearths, ventilation systems, and A/C. Even a small far-UVC lamp starts at about $500 — a figure that doesn’t account for installation costs or the devices’ relatively short lifespans. However, many technologies become cheaper and more efficient over time.
Take solar energy, for example. In 1979, President Jimmy Carter installed 32 primitive solar water-heating panels on the White House roof.12 During the dedication speech, Carter conceded that “a generation from now, this solar heater can either be a curiosity, a museum piece, [or] an example of a road not taken.” The panels cost $28,000 (more than $120,000 in 2025) and supplied 75 percent of the energy required to heat 1,000 gallons of water daily.
Importantly, Carter’s system could only heat water, not generate electricity or store the heated water. Given that the average cost of photovoltaic solar in 1979 was about $42/watt (adjusted for inflation), this roughly suggests that the output of a comparably-priced photovoltaic solar array would be no more than 2,857 watts. Obama’s 2013 White House solar installation generated about 19,700 kWh13 annually (just short of double what an average American household uses per year), at a time when residential solar systems were about $5/watt. In 2025, a new residential system would cost $4.00/watt.
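The arithmetic behind that comparison is worth making explicit. A quick sketch, using the cost figures from the paragraph above plus one assumption — an average US household using roughly 10,800 kWh of electricity per year:

```python
# Back-of-envelope solar arithmetic. Cost figures come from the text;
# the household usage figure is an assumed typical US value.

carter_cost_2025 = 120_000   # Carter's 1979 system in ~2025 dollars
pv_cost_1979 = 42            # photovoltaic $/watt in 1979, inflation-adjusted
print(f"{carter_cost_2025 / pv_cost_1979:.0f} watts")  # ~2857 watts of PV

obama_output_kwh = 19_700    # annual output of the 2013 installation
household_use_kwh = 10_800   # assumed average US household use per year
print(f"{obama_output_kwh / household_use_kwh:.1f}x household usage")  # ~1.8x
```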
While lots of people invoke Moore’s Law to explain the growth of industries inside and outside of computing (roughly, the idea that the cost of a product decreases exponentially over time), a few researchers suggest that Wright’s Law is even more apt for technologies like photovoltaic solar — cost decreases at a rate that depends on cumulative production. Rather than costs falling as a function of time, we might instead expect that the more we make, the more we learn, with technologies becoming cheaper as a result.
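To make the distinction concrete, here is a minimal sketch of Wright’s Law. The 20 percent learning rate is a commonly cited estimate for photovoltaics, used here as an assumption rather than a value from this article — though it is striking how close the result lands to today’s actual price:

```python
# Wright's Law: each doubling of cumulative production cuts unit cost
# by a fixed fraction (the "learning rate").

import math

def wrights_law(initial_cost: float, production_multiple: float,
                learning_rate: float = 0.20) -> float:
    """Unit cost once cumulative production has grown by `production_multiple`."""
    b = -math.log2(1 - learning_rate)   # experience exponent
    return initial_cost * production_multiple ** -b

# Starting from $42/watt, after cumulative production grows 1,000-fold:
print(f"${wrights_law(42, 1_000):.2f}/watt")  # ~$4.55/watt
```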
Solar went from a circus sideshow to a near-necessity14 in less than a lifetime, and we can likely expect a similar transformation for air quality technologies. Our expectations regarding healthy indoor air have changed over the centuries and will keep changing, an evolution evident in our built environment. We no longer accept heatstroke as a constant risk in the summer now that air conditioning can alleviate it. We no longer endure small window-holes open to the elements now that plate-glass windows are cheap and mandatory.15 We do not choose sooty coal stoves when electric or gas stoves are cheaper, cleaner, and easier.
Maybe our threshold for “acceptable number of sick days per year” or “money spent on sick leave or medical bills” will decrease, just as most Americans no longer find chickenpox parties acceptable in light of a cheap, accessible, effective chickenpox vaccine. One thing is certain — striving to raise our living standards in spite of occasional missteps is a welcome and wonderful pattern in the history of designing indoor spaces.
{{divider}}
Larissa Schiavo really, really, really likes old buildings (but doesn’t think we should be too precious about them).
Thank you to Gavriel Kleinwaks, Jesse Smith, Vivian Belenky, Xander Balwit, Lauren Gilbert and Miles Brundage for helpful feedback. Cool: How Air Conditioning Changed Everything by Salvatore Basile and The Gospel of Germs by Nancy Tomes were extraordinary jumping-off points for researching sections of this article.
Cite: Larissa Schiavo. “The Battle for Better Air.” Asimov Press (2025). DOI: 10.62211/82pt-11tr
Lead image by Ella Watkins-Dulaney.
This article was published on February 2nd, 2025.
{{divider}}
Footnotes
- Starting in the 18th century, double-hung windows solved a few different problems. Opening both the top and bottom sashes creates a stack effect within a single window — warm air exits above while cooler air enters below — resulting in more effective ventilation than the all-or-nothing casement window. Bigger panes of glass (about the size of a book) also made windows a cheaper and more appealing prospect in the 18th century; these were much more transparent than their predecessors and caused less distortion.
- Isabella Bird, a famous 19th-century diarist, described three adults and six children living in a windowless 6-by-11-foot room in 1869.
- Window screens have been an option for many years — Monticello had them — but were staggeringly expensive until relatively recently.
- The Decorative Art of Today, Le Corbusier, p. 188.
- Historically, exceptional orators could be heard by very large crowds without voice amplification. In 1739, for example, Benjamin Franklin attended an outdoor speech made by Reverend Whitefield. Franklin tabulated the rough number of people packed into each street to estimate that “he might well be heard by more than thirty thousand.” This is also why sopranos were so popular in opera — higher-pitched sounds are perceived as louder and can often be heard over an orchestra without amplification.
- Prior to mechanical refrigeration, ice was cut in large blocks from frozen lakes during the winter and stored in sawdust- and straw-insulated barns and ship hulls. A significant percentage of the product was lost to melting during transportation.
- This process, dubbed “air washing” at the time, works by spraying a water mist to catch particulate matter like pollen and dust. While the evidence is limited as to its effectiveness, people still sell air purifiers that use this method.
- California is still much more advanced in this respect compared to the rest of the U.S.
- Those born after 2009 will no longer be able to buy cigarettes at all in the UK, following an interesting “generational” ban.
- See the “Swiss cheese model” of safety.
- See section 12.12.040.c.
- Access. (1979). United States: U.S. Office of Minority Business Enterprise.
- 19,700,000 Wh × 3.412 BTU/Wh ≈ 67.2 million BTUs.
- In many parts of California, for example, residential electricity costs nearly double the national average, and the price per kWh increases with the total volume of power used; if you wanted full-house air conditioning, the roughly $20,000 in solar panels would “pay for itself” very quickly.
- By “we”, I am assuming the reader is in a fairly wealthy country (as the author is) or is fairly well-to-do in a less wealthy country.