Humans vs. Snow: A Love-Hate History

Paleolithic Era: Skiing for Survival

Today, skiing is a fun activity winter-lovers can’t wait to take advantage of at the first sight of freshly fallen snow, but it was originally invented thousands of years ago as a means of survival. The earliest evidence of skis appears in a cave painting dating back to the final Ice Age of the Paleolithic Era. These stick-like prototypes were helpful not only for traveling over frozen terrain, but also for hunting prey.

1565: Snowscapes in Paintings

Commonly seen as the first winter landscape painting, “Hunters in the Snow” was painted by Pieter Bruegel the Elder during the brutal winter of 1564-65. It was the longest and most severe winter Europe had seen in more than a century, kicking off what some called the “Little Ice Age.” If you can’t beat them, join them, right? After his first snowscape, Bruegel couldn’t stop painting ice and snow—he also painted the first scene with falling snow, as well as the first nativity scene with snow—and his work started a winter-themed trend among Dutch painters that lasted for some 150 years.

1717: “The Great Snow”

For generations of New Englanders, events were dated as happening either before or after “The Great Snow” of 1717. Starting in late February of that year, a series of storms dumped up to six feet of snow throughout the region, with drifts as high as 25 feet! New Hampshire, Massachusetts and Connecticut got the worst of it: Entire houses were completely covered with snow, livestock perished and even Boston’s Puritans canceled church services for two weeks. But one intrepid postman refused to lose the battle, reportedly leaving his horse behind and donning a pair of snowshoes to make the arduous trip from Boston to New York.

Early 19th century: A New Word is Born—Blizzard

The exact origins of “blizzard” are unclear, but it appears to have emerged as a non-snow-related noun. An 1829 article in the Virginia Literary Museum, a weekly journal published at the University of Virginia, defined the word as “a violent blow, perhaps from blitz (German: lightning).” In his 1834 memoir, Davy Crockett himself used the term to mean a burst of speech: “A gentleman at dinner asked me for a toast; and supposing he meant to have some fun at my expense, I concluded to go ahead, and give him and his likes a blizzard.” The first use of the word in reference to a severe snowstorm apparently came later. Etymologist and lexicographer Allen Walker Read believes the earliest such usage of “blizzard” was in an April 1870 issue of the Northern Vindicator, a newspaper in Estherville, Iowa.

1862: The Rise of the Snow Plow

Today snow can mean long delays and canceled flights, but it used to be a positive thing for travel. When the main mode of transportation was the horse-drawn carriage, packed snow on the roads actually made travel easier: drivers simply switched out their carriages’ wheels for ski-like runners when the snow piled up. Foot traffic was a different story, however, and by the mid-1800s several different inventors had patented their version of a horse-drawn snow plow to clear the alleys and walkways of America’s cities. In 1862, Milwaukee became the first major city to use such a plow, and its popularity spread quickly throughout the Snow Belt (the area stretching across the Great Lakes from Minnesota to Maine).

1878: Shakin’ It Up—The Snow Globe

Indicative of the winter wonderland that fills the hearts of many each holiday season, the snow globe was first seen in France at the 1878 Paris Universal Exposition. The trinket gained little attention, however, and only found its way into the hearts and minds of holiday-goers thanks to Edwin Perzy I. The mechanic accidentally created a snow globe in 1900, when he was asked to fix a dim light bulb. After noticing that water-filled glass globes would fill the entire room with light when placed in front of a candle, he tried the same technique with a light bulb but didn’t get the same results. Next he filled the globe with semolina flakes in the hope that they would help reflect the light, but instead they inspired him in a totally different way—the flakes reminded him of snow. Perzy patented the snow globe and the novelty caught on like wildfire.

1888: The Blizzard That Ate the Big Apple

Horse-drawn plows stood no chance against the Blizzard of 1888, which paralyzed the Northeast for over three days with snow, wind and freezing temperatures. New York City was inundated with 50 inches of snow, along with high winds causing drifts of up to 40 feet—it was a snow-pocalypse. The city’s elevated railways—usually the only transport option during storms—were blocked, leaving travelers stranded for days. The 1888 blizzard claimed 400 victims. It also did some good, however, by prompting cities to improve their snow removal procedures, including hiring more plows, assigning routes and starting the plowing process in the early phases of storms.

1920s: Snow Removal Goes Mobile

When automobiles replaced horse-drawn carriages, clearing the roads of snow became a big priority. Mechanized salt-spreaders helped, but weren’t sufficient. As early as 1913, some cities had started using motorized dump trucks and plows to remove snow. Chicago took it one step further in the 1920s, debuting a contraption called the “snowloader.” Equipped with a giant scoop and a conveyor belt, the device forced plowed snow up the scoop, onto the belt and into a chute that dropped it into a dump truck parked beneath. The snowloader revolutionized urban snow removal, making it a lot less labor- and time-intensive.

1952: Introducing Your Very Own Snow Blower

Snow blowing got personal in the early 1950s, when the American company Toro released the first snow blower designed for individual use. Other companies produced their own models during the 1960s, ushering in the age of modern snow removal. Around the same time, satellite weather technology was making it easier than ever to predict and prepare for storms, and widespread use of TV and radio helped keep the public aware of impending hazards caused by snow and wind.

Today: An Ode to the Humble Snow Shovel

Odds are the snow removal tool most people are familiar with is also the one that’s been around the longest—the shovel. Thought to date back some 6,000 years, the old-fashioned snow shovel remains one of the most effective tools for digging out of a blizzard, no matter where you live. Since the 1870s, more than 100 patents have been granted for snow shovel designs, as various people try their hand at improving on the time-honored classic.

The Real Meaning of 'Good' and 'Evil'

It's a dangerous oversimplification to believe that some people are innately ‘good’ while others are innately ‘evil’ or ‘bad.’ This misleading concept underpins the justice system of many countries — ‘bad’ people commit crimes, and since they are intrinsically ‘bad’, they should be locked away so that they can’t harm us with their ‘evil’ behavior. This concept has also fuelled wars and conflicts throughout history, and even in the present day. It makes groups believe that they are fighting a just cause against an ‘evil’ enemy and that once the ‘evil’ people have been killed, peace and goodness will reign supreme.

Human nature is infinitely more complex than this, of course. In human beings, ‘good’ and ‘evil’ are fluid. People can be a combination of ‘good’ and ‘bad’ qualities. Some people who behave cruelly and brutally can be rehabilitated and eventually display ‘good’ qualities such as empathy and kindness. And rather than being intrinsic, most cruel or brutal behavior is due to environmental factors, such as an abusive childhood, or social learning from a family or peers.

The Meaning of Good and Evil

What do we really mean when we use these simplistic terms, ‘good’ and ‘evil’?

‘Good’ means a lack of self-centredness. It means the ability to empathize with other people, to feel compassion for them, and to put their needs before your own. It means, if necessary, sacrificing your own well-being for the sake of others. It means benevolence, altruism and selflessness, and self-sacrifice towards a greater cause — all qualities which stem from a sense of empathy. It means being able to see beyond the superficial differences of race, gender, or nationality and relate to a common human essence beneath them.

All of the ‘saintly’ people in human history have these qualities in abundance. Think of Mahatma Gandhi and Martin Luther King, risking their own safety and well-being for the goal of gaining equal rights and freedom for Indians and African Americans. These were human beings with an exceptional degree of empathy and compassion, which overrode any concern for their own ambitions or well-being.

‘Evil’ people are those who are unable to empathize with others. As a result, their own needs and desires are of paramount importance. They are selfish, self-absorbed, and narcissistic. In fact, other people only have value for them to the extent that they can help them satisfy their own desires or be exploited. This applies to dictators like Stalin and Hitler, and to serial killers and rapists. I would argue that their primary characteristic is an inability to empathize with others. They can’t sense other people’s emotions or suffering, can’t see the world from other people’s perspectives, and so have no sense of their rights. Other human beings are just objects to them, which is what makes their brutality and cruelty possible.

Good and Evil as Flexible

Most of us lie somewhere between the extremes of Gandhi and Hitler on the spectrum of human behavior. Sometimes we may behave badly, when egocentric impulses cause us to put our needs before the welfare of others. Sometimes we behave in a saintly fashion, when empathy and compassion impel us to put the needs of others before our own, resulting in altruism and kindness.

The real difference between this idea of ‘good and evil’ and the traditional concept is that empathy or a lack of empathy aren’t fixed. Although people with a psychopathic personality appear to be unable to develop empathy, for most of us, empathy — or goodness — is a quality that can be cultivated. This is recognized by Buddhism and most other spiritual traditions. As we practice meditation or mindfulness, and as we become less attached to materialism and status-seeking, we become more open and more connected, and so more selfless and altruistic.

The ‘fluidity’ of goodness is also recognized by the process of ‘restorative justice’, which is becoming more and more widely used within European justice systems. Rather than locking ‘bad’ people away — which is unfortunately so widely practiced by the US penal system — restorative justice gives offenders the opportunity to meet their victims, to see how their crimes have affected them, which often leads to a sense of empathy for their victims — which in turn frequently leads to rehabilitation.

This is an optimistic view of human nature, but I would go even further. Because the goodness in human beings emerges when we are connected — when we spread out into empathy with one another — I believe that goodness expresses something fundamental about human nature, even if it might be sometimes difficult to see. ‘Evil’ is an aberration, a form of pathology, as the psychopathic personality shows, which only emerges when we are broken off into disconnected fragments.

Folklore and Folktale

Whereas myth has at its core the origins of a people and is often sacred, folklore is a collection of fictional tales about people or animals. Superstitions and unfounded beliefs are important elements in the folklore tradition. Both myths and folklore were originally circulated orally.

Folktales describe how the main character copes with the events of everyday life, and the tale may involve crisis or conflict. These stories may teach people how to cope with life (or dying) and also have themes common among cultures worldwide. The study of folklore is called folkloristics.

536 AD — the worst year in history

2020 has already been immortalised. It is a year that nobody will forget. However, when speaking of the worst year recorded in human history there are many to choose from:

The year 1349 saw the Black Death kill half the population of Europe.

In 1520 smallpox ravaged the Americas and killed between 60 and 90 per cent of the continents’ original inhabitants.

In 1918 the Spanish Flu led to the deaths of over 50 million people.

The rise of Hitler in 1933 is often claimed to be the turning point in modern history.

However, for many historians the choice is an easy one: the title of the worst year in history goes to 536 AD.

Medieval historian Michael McCormick has stated that “it was the beginning of one of the worst periods to be alive, if not the worst year” (Science Magazine, Ann Gibbons, 2018).

The year began with an inexplicable, dense fog that stretched across the world, plunging Europe, the Middle East and parts of Asia into darkness 24 hours a day for nearly two years.

Consequently, global temperatures plummeted, resulting in the coldest decade in over 2,000 years. Famine was rampant and crops failed across Europe, Africa and Asia. Unfortunately, 536 AD seemed to be only a prelude to further misery. This period of extreme cold and starvation caused economic disaster in Europe, and in 541 AD an outbreak of bubonic plague led to the deaths of nearly 100 million people and wiped out almost half of the Byzantine Empire.

This part of the sixth century has widely been referred to as the Dark Ages, but the true source of this darkness had previously been unknown to scholars. Recently, researchers led by McCormick and glaciologist Paul Mayewski, have discovered that a volcanic eruption in Iceland in early 536 led to incredibly large quantities of ash being spread across much of the globe, creating the fog that cast the world into darkness. This eruption was so immense that it altered the global climate and adversely affected weather patterns and crop cultivation for years to come (Antiquity).

There are diseases hidden in ice, and they are waking up

Throughout history, humans have existed side-by-side with bacteria and viruses. From the bubonic plague to smallpox, we have evolved to resist them, and in response they have developed new ways of infecting us.

We have had antibiotics for almost a century, ever since Alexander Fleming discovered penicillin. Bacteria have responded by evolving antibiotic resistance. The battle is endless: because we have spent so much time alongside pathogens, we sometimes settle into a kind of natural stalemate.

However, what would happen if we were suddenly exposed to deadly bacteria and viruses that have been absent for thousands of years, or that we have never met before?

We may be about to find out. Climate change is thawing permafrost soils that have been frozen for thousands of years, and as the soils thaw they are releasing ancient viruses and bacteria that, having lain dormant, are springing back to life.

In August 2016, in a remote corner of Siberian tundra called the Yamal Peninsula in the Arctic Circle, a 12-year-old boy died and at least twenty people were hospitalised after being infected by anthrax.

The theory is that, over 75 years ago, a reindeer infected with anthrax died and its frozen carcass became trapped under a layer of frozen soil, known as permafrost. There it stayed until a heatwave in the summer of 2016, when the permafrost thawed.

This exposed the reindeer corpse and released infectious anthrax into nearby water and soil, and then into the food supply. More than 2,000 reindeer grazing nearby became infected, which then led to the small number of human cases.

The fear is that this will not be an isolated case.

As the Earth warms, more permafrost will thaw. Under normal circumstances, superficial permafrost layers about 50cm deep thaw every summer. But now global warming is gradually exposing older permafrost layers.

Frozen permafrost soil is the perfect place for bacteria to remain alive for very long periods of time, perhaps as long as a million years. That means melting ice could potentially open a Pandora's box of diseases.

The temperature in the Arctic Circle is rising quickly, about three times faster than in the rest of the world. As the ice and permafrost melt, other infectious agents may be released.

"Permafrost is a very good preserver of microbes and viruses, because it is cold, there is no oxygen, and it is dark," says evolutionary biologist Jean-Michel Claverie at Aix-Marseille University in France. "Pathogenic viruses that can infect humans or animals might be preserved in old permafrost layers, including some that have caused global epidemics in the past."

In the early 20th Century alone, more than a million reindeer died from anthrax. It is not easy to dig deep graves, so most of these carcasses are buried close to the surface, scattered among 7,000 burial grounds in northern Russia.

However, the big fear is what else is lurking beneath the frozen soil.

People and animals have been buried in permafrost for centuries, so it is conceivable that other infectious agents could be unleashed. For instance, scientists have discovered fragments of RNA from the 1918 Spanish flu virus in corpses buried in mass graves in Alaska's tundra. Smallpox and the bubonic plague are also likely buried in Siberia.

In a 2011 study, Boris Revich and Marina Podolnaya wrote: "As a consequence of permafrost melting, the vectors of deadly infections of the 18th and 19th Centuries may come back, especially near the cemeteries where the victims of these infections were buried."

For instance, in the 1890s there was a major epidemic of smallpox in Siberia. One town lost up to 40% of its population. Their bodies were buried under the upper layer of permafrost on the banks of the Kolyma River. 120 years later, Kolyma's floodwaters have started eroding the banks, and the melting of the permafrost has speeded up this erosion process.

In a project that began in the 1990s, scientists from the State Research Center of Virology and Biotechnology in Novosibirsk have tested the remains of Stone Age people that had been found in southern Siberia, in the region of Gorny Altai. They have also tested samples from the corpses of men who had died during viral epidemics in the 19th Century and were buried in the Russian permafrost.

The researchers say they have found bodies with sores characteristic of the marks left by smallpox. While they did not find the smallpox virus itself, they have detected fragments of its DNA.

Certainly it is not the first time that bacteria frozen in ice have come back to life.

In a 2005 study, NASA scientists successfully revived bacteria that had been encased in a frozen pond in Alaska for 32,000 years. The microbes, called Carnobacterium pleistocenium, had been frozen since the Pleistocene period, when woolly mammoths still roamed the Earth. Once the ice melted, they began swimming around, seemingly unaffected.

Two years later, scientists managed to revive an 8-million-year-old bacterium that had been lying dormant in ice, beneath the surface of a glacier in the Beacon and Mullins valleys of Antarctica. In the same study, bacteria were also revived from ice that was over 100,000 years old.

However, not all bacteria can come back to life after being frozen in permafrost. Anthrax bacteria can do so because they form spores, which are extremely hardy and can survive frozen for longer than a century.

Other bacteria that can form spores, and so could survive in permafrost, include tetanus and Clostridium botulinum, the pathogen responsible for botulism: a rare illness that can cause paralysis and even prove fatal. Some fungi can also survive in permafrost for a long time.

Some viruses can also survive for lengthy periods.

In a 2014 study, a team led by Claverie revived two viruses that had been trapped in Siberian permafrost for 30,000 years. Known as Pithovirus sibericum and Mollivirus sibericum, they are both "giant viruses", because unlike most viruses they are so big they can be seen under a regular microscope. They were discovered 100ft underground in coastal tundra.

Once they were revived, the viruses quickly became infectious. Fortunately for us, these particular viruses only infect single-celled amoebas. Still, the study suggests that other viruses, which really could infect humans, might be revived in the same way.

What's more, global warming does not have to directly melt permafrost to pose a threat. Because the Arctic sea ice is melting, the north shore of Siberia has become more easily accessible by sea. As a result, industrial exploitation, including mining for gold and minerals, and drilling for oil and natural gas, is now becoming profitable.

"At the moment, these regions are deserted and the deep permafrost layers are left alone," says Claverie. "However, these ancient layers could be exposed by the digging involved in mining and drilling operations. If viable virions are still there, this could spell disaster."

Giant viruses may be the most likely culprits for any such viral outbreak.

"Most viruses are rapidly inactivated outside host cells, due to light, desiccation, or spontaneous biochemical degradation," says Claverie. "For instance, if their DNA is damaged beyond possible repair, the virions will no longer be infectious. However, among known viruses, the giant viruses tend to be very tough and almost impossible to break open."

Claverie says viruses from the very first humans to populate the Arctic could emerge. We could even see viruses from long-extinct hominin species like Neanderthals and Denisovans, both of which settled in Siberia and were riddled with various viral diseases. Remains of Neanderthals from 30-40,000 years ago have been found in Russia. Human populations have lived there, sickened and died for thousands of years.

"The possibility that we could catch a virus from a long-extinct Neanderthal suggests that the idea that a virus could be 'eradicated' from the planet is wrong, and gives us a false sense of security," says Claverie. "This is why stocks of vaccine should be kept, just in case."

Since 2014, Claverie has been analysing the DNA content of permafrost layers, searching for the genetic signature of viruses and bacteria that could infect humans. He has found evidence of many bacteria that are probably dangerous to humans. The bacteria have DNA that encodes virulence factors: molecules that pathogenic bacteria and viruses produce, which increase their ability to infect a host.

Claverie's team has also found a few DNA sequences that seem to come from viruses, including herpes. However, they have not as yet found any trace of smallpox. For obvious reasons, they have not attempted to revive any of the pathogens.

It now seems that pathogens cut off from humans will emerge from other places too, not just ice or permafrost.

In February 2017, NASA scientists announced that they had found 10-50,000-year-old microbes inside crystals in a Mexican mine.

The bacteria were located in the Cave of the Crystals, part of a mine in Naica in northern Mexico. The cave contains many milky-white crystals of the mineral selenite, which formed over hundreds of thousands of years.

The bacteria were trapped inside small, fluid pockets of the crystals, but once they were removed they revived and began multiplying. The microbes are genetically unique and may well be new species, but the researchers have yet to publish their work.

Even older bacteria have been found in the Lechuguilla Cave in New Mexico, 1,000ft underground. These microbes have not seen the surface for over 4 million years.

The cave never sees sunlight, and it is so isolated that it takes about 10,000 years for water from the surface to get into the cave.

Despite this, the bacteria have somehow become resistant to 18 types of antibiotics, including drugs considered to be a "last resort" for fighting infections. In a study published in December 2016, researchers found that the bacterium, known as Paenibacillus sp. LC231, was resistant to 70% of antibiotics and was able to totally inactivate many of them.

As the bacteria have remained completely isolated in the cave for four million years, they have not come into contact with people or the antibiotic drugs used to treat human infections. That means their antibiotic resistance must have arisen in some other way.

The scientists involved believe that the bacteria, which do not harm humans, are among many that have naturally evolved resistance to antibiotics. This suggests that antibiotic resistance has been around for millions or even billions of years.

Obviously, such ancient antibiotic resistance cannot have evolved in the clinic as a result of antibiotic use.

The reason for this is that many types of fungi, and even other bacteria, naturally produce antibiotics to gain a competitive advantage over other microbes. That is how Fleming first discovered penicillin: bacteria in a petri dish died after the culture became contaminated with an antibiotic-excreting mould.

In caves, where there is little food, organisms must be ruthless if they are to survive. Bacteria like Paenibacillus may have had to evolve antibiotic resistance in order to avoid being killed by rival organisms.

This would explain why the bacteria are only resistant to natural antibiotics, which come from bacteria and fungi, and make up about 99.9% of all the antibiotics we use. The bacteria have never come across man-made antibiotics, so do not have a resistance to them.

"Our work, and the work of others, suggests that antibiotic resistance is not a novel concept," says microbiologist Hazel Barton of the University of Akron, Ohio, who led the study. "Our organisms have been isolated from surface species from 4-7 million years, yet the resistance that they have is genetically identical to that found in surface species. This means that these genes are at least that old, and didn't emerge from the human use of antibiotics for treatment."

Although Paenibacillus itself is not harmful to humans, it could in theory pass on its antibiotic resistance to other pathogens. However, as it is isolated beneath 400m of rock, this seems unlikely.

Nevertheless, natural antibiotic resistance is probably so prevalent that many of the bacteria emerging from melting permafrost may already have it. In line with that, in a 2011 study scientists extracted DNA from bacteria found in 30,000-year-old permafrost in the Beringian region between Russia and Canada. They found genes encoding resistance to beta-lactam, tetracycline and glycopeptide antibiotics.

How much should we be concerned about all this?

One argument is that the risk from permafrost pathogens is inherently unknowable, so they should not overtly concern us. Instead, we should focus on more established threats from climate change. For instance, as Earth warms northern countries will become more susceptible to outbreaks of "southern" diseases like malaria, cholera and dengue fever, as these pathogens thrive at warmer temperatures.

The alternative perspective is that we should not ignore risks just because we cannot quantify them.

"Following our work and that of others, there is now a non-zero probability that pathogenic microbes could be revived, and infect us," says Claverie. "How likely that is is not known, but it's a possibility. It could be bacteria that are curable with antibiotics, or resistant bacteria, or a virus. If the pathogen hasn't been in contact with humans for a long time, then our immune system would not be prepared. So yes, that could be dangerous."

Always in Season: Barn swallows are a love-hate species

Human beings, and this individual human, tend to have a love-hate relationship with the swallow.

In European folklore, the swallow was regarded as a sign of good luck, and swallows were encouraged to nest on structures. This they were more than happy to do. Swallows adapted to the vertical surfaces of human habitations, an improvement over the natural cliff faces they had favored. This phenomenon occurred in the Western Hemisphere, too. When European settlement began, barn swallows rapidly switched to the sides of buildings, and even interiors, for their nesting sites.

On the farm in Mountrail County, my father welcomed the swallows, and we children were encouraged to appreciate them, too.

And swallows are easy to love.

They are quite beautifully colored, in deep purple and brick red. Their tails are impressively long. Their aerial maneuvers are stunning. Their twittering calls aren't quite musical, but they are pleasant nevertheless.

The trouble with swallows is they don't observe the ordinary rules of courtesy, at least as we humans understand them.

Put plainly, they are trespassers. Barn swallows don't respect our rules of property.

This is the most frequent complaint against these birds, and the most often asked question is, "How do I keep the swallows from nesting above my door?"

It's true that nesting swallows can be a nuisance.

For one thing, they don't clean up after themselves.

Actually, that's not quite true. They keep their nest spaces tidy, but by dropping waste over the side.

Their construction methods are a little sloppy, too. Their basic building material is mud, often picked out of a convenient puddle. This they plaster against a wall, preferably under some sort of overhang that provides protection from both sunlight and rain.

Left to themselves, barn swallows will happily inhabit the interior of a structure, as well. I've found them in abandoned houses, for example, and they nested in the rafters of the barn on the farm where I grew up. We always left the door open for them.

Of course, it's possible to discourage swallows. One method is simply to knock down the nest. This is best done early in the construction process. The birds will try and try again, but eventually, they'll move to a different, though nearby, spot.

Another trick is to hang something shiny and mobile in places where swallows aren't welcome. Strips of tinfoil might work. So might an aluminum pie plate.

Still, the swallows will find somewhere nearby to nest.

Barn swallows form loose colonies, with individual pairs nesting at various spots on a single building or on nearby structures. One season, I counted 15 nesting attempts at our place west of Gilby, N.D.

The swallows are back this year, and their nesting efforts are in full swing.

It's not clear yet how many swallows I will be hosting.

The swallows have made me acutely aware of them, however. This season's colony seems especially aggressive, flying at me whenever I enter what they consider their territory.

They are bold, too, getting plenty close enough that I can feel the air pumping through their wing feathers.

Honestly, an attacking barn swallow can part the hair on a bald man's head.

Still, the swallows are welcome, for their beauty and their sociability - but also for their utility. Swallows are insect eaters, and flying insects are almost their exclusive food. They don't eliminate the mosquito population. That would be too great a service to ask of them. They do reduce it, at least marginally, and that is welcome.

When we first moved onto our place west of Gilby, N.D., we had a small colony of cliff swallows.

These birds differ from barn swallows in three significant ways. They lack the long, forked tail that decorates the barn swallow - and is the male swallow's chief sexual adornment. Yes, size matters to swallows.

They build bottle-like nests, in contrast to the open cup-like nests that barn swallows construct.

And their colonies are often huge, sometimes numbering hundreds or even thousands of birds. Barn swallows are small town birds in comparison.

The cliff swallow may be the most numerous bird species in North Dakota. Colonies occur under bridges on rural roadways and over the Red River in downtown Grand Forks. They sometimes nest on buildings. In wild areas, including North Dakota's Badlands, they sometimes nest on cliff sides, as their ancestors did before America had barns and bridges.

Still, barn swallows are the more familiar species. They are "commensal" with humans, depending on us for nesting sites and rewarding us by eating flying insects and providing us with companionship and entertainment.

Raccoon hands have thumbs that, although not opposable, provide them with more dexterity than their relatives. Because of these thumbs, raccoons can grab things and open containers such as jars, bottles, and trash bins.

Raccoon senses are powerful, but raccoons are not known for a great sense of sight. They do, however, have good night vision. Like cats, they have a reflective layer behind the retina called the tapetum lucidum, which enhances their vision of nearby objects. That said, their range of vision is narrow, with limited depth perception.

All Timelines Overview

The story of vaccines did not begin with the first vaccine–Edward Jenner’s use of material from cowpox pustules to provide protection against smallpox. Rather, it begins with the long history of infectious disease in humans, and in particular, with early uses of smallpox material to provide immunity to that disease.

Evidence exists that the Chinese employed smallpox inoculation (or variolation, as such use of smallpox material was called) as early as 1000 CE. It was practiced in Africa and Turkey as well, before it spread to Europe and the Americas.

Edward Jenner’s innovations, begun with his successful 1796 use of cowpox material to create immunity to smallpox, quickly made the practice widespread. His method underwent medical and technological changes over the next 200 years, and eventually resulted in the eradication of smallpox.

Louis Pasteur’s 1885 rabies vaccine was the next to make an impact on human disease. And then, at the dawn of bacteriology, developments rapidly followed. Antitoxins and vaccines against diphtheria, tetanus, anthrax, cholera, plague, typhoid, tuberculosis, and more were developed through the 1930s.

The middle of the 20th century was an active time for vaccine research and development. Methods for growing viruses in the laboratory led to rapid discoveries and innovations, including the creation of vaccines for polio. Researchers targeted other common childhood diseases such as measles, mumps, and rubella, and vaccines for these diseases reduced the disease burden greatly.

Innovative techniques now drive vaccine research, with recombinant DNA technology and new delivery techniques leading scientists in new directions. Disease targets have expanded, and some vaccine research is beginning to focus on non-infectious conditions such as addiction and allergies.

More than the science behind vaccines, these timelines cover cultural aspects of vaccination as well, from the early harassment of smallpox variolators (see the intimidation of a prominent minister described in the 1721 Boston Smallpox Epidemic entry) to the establishment of vaccination mandates, to the effect of war and social unrest on vaccine-preventable diseases. Edward Jenner, Louis Pasteur, and Maurice Hilleman, pioneers in vaccine development, receive particular attention as well.

This timeline category holds nearly all of the entries for the subject-specific timelines. A few of the entries have been left out in order to provide a broad overview.


Thomas Peebles collected blood from sick students at a private school outside of Boston in an attempt to isolate the measles virus. Eventually he succeeded, and the isolated virus was used to create a series of vaccines.

In 1905, Swedish physician Ivar Wickman suggested that polio was a contagious disease that could be spread from person to person.

The first vaccine created in a laboratory was Louis Pasteur’s 1879 vaccine for chicken cholera.


Robert Carr Harris of Maple Green, New Brunswick, patented a "Railway Screw Snow Excavator" in 1870. [2] In 1923, Robert E. Cole patented a snowplow that operated by using cutters and a fan to blow snow from a surface. [3] Various other innovations followed. [4] However, it is Arthur Sicard (1876–1946) who is generally credited as the inventor of the first practical snow blower. In 1925 Sicard completed his first prototype, based on a concept he described in 1894. [5] He founded Sicard Industries in Sainte-Thérèse, Quebec, and by 1927 his vehicles were in use removing snow from the roadways of the town of Outremont, now a borough of Montreal. His company is now a division of SMI-Snowblast, Inc. of Watertown, New York. [6]

The U.S. Consumer Product Safety Commission estimates that each year there are approximately 5,740 snow blower-related injuries in the United States that require medical attention. [7] One problem with the design of the snow blower is that snow can build up in the auger, jamming it and stalling the motor. This is complicated by the fact that the auger can deform before applying enough resistance to the motor to shut it off. If the jam is cleared by hand, the auger may snap back to its natural shape suddenly and with great force, possibly injuring the operator. Snow blowers are a leading cause of traumatic hand and finger amputations. [8] The correct procedure is to turn off the engine, disengage the clutch and then clear the jam with a broom handle or other long object. [8] In an effort to improve safety, many manufacturers now include a plastic tool for clearing jams, often mounted directly on the snow blower.

Most modern machines mitigate this problem with a safety system known as a "dead man's switch," which prevents the mechanism from rotating when the operator is not at the controls. These switches are mandatory in some jurisdictions.

Jet engines and other gas turbines are used for large-scale propelling and melting of snow over rails and roads. These blowers were first used in Russia and Canada in the 1960s, and were later introduced into the U.S. by the Boston Transportation Authority.

The jet engine both melts and blows the snow, clearing the tracks faster than other methods. While offering considerably greater power in a relatively lightweight machine, this method is much more expensive than traditional snow removal. In Russia, the high cost is partially offset by using retired military jet engines, such as the Klimov VK-1. [9] [10] [11]

Why Do We Hiccup?

It’s safe to say you don’t remember your first hiccup, since it probably occurred before you were born. It is typical for developing human fetuses to have hiccups in the womb, and yet even though we experience them throughout our lifetimes, the cause of these involuntary actions has defied explanation.

To unravel the mystery of why we hiccup — a reflex that serves no obvious useful purpose — scientists are looking into our evolutionary past for clues among our distant relatives. One promising candidate: amphibians, in particular tadpoles.

The mechanics of what happens during a hiccup have fueled this theory. A hiccup, known in medical circles as a singultus, includes a sharp contraction of the muscles used for inhalation — the diaphragm, muscles in the chest wall and neck among others. This is counteracted, at the same time, by the inhibition of muscles used during exhalation.

Here, the back of the tongue and roof of the mouth move upward, followed by the clamping shut of the vocal cords at the glottis. This last bit, the closing of the glottis, is the source of the eponymous “hic” sound. And, as you no doubt know from first-hand experience, this process doesn’t just happen once but repeats in a rhythmic fashion.

Tadpoles seem to exhibit a similar physiological behavior.

“Halfway through its development a tadpole has both lungs that breathe air and gills for breathing water,” William A. Whitelaw, a professor at the University of Calgary, wrote in Scientific American. “To breathe water, it fills its mouth with water and then closes the glottis and forces the water out through the gills.” This hiccup-like action is seen in many primitive air-breathers, such as gar, lungfish and other amphibians that have gills.

Another clue linking hiccups in humans to these creatures is the electrical origin of the hiccup trigger in our brain, according to Neil Shubin, a professor of organismal biology and anatomy at the University of Chicago. As related by the Guardian: “Spasms in our diaphragms, hiccups are triggered by electric signals generated in the brain stem. Amphibian brain stems emit similar signals, which control the regular motion of their gills. Our brain stems, inherited from amphibian ancestors, still spurt out odd signals producing hiccups that are, according to Shubin, essentially the same phenomenon as gill breathing.”

If hiccups are a remnant of the genetic code passed down by our amphibian ancestors, can it be true that they perform no beneficial function in humans, despite persisting for the last 370 million years since our ancestors first stepped onto dry land?

Christian Straus, a scientist at Pitié-Salpêtrière Hospital in Paris, has put forth a theory that hiccupping might be a mechanism that helps mammals learn to suck, which involves a series of similar movements. While plausible, this theory will be difficult to prove, Allen Pack, an expert in neurobiology at the University of Pennsylvania, told the BBC.

Until Straus and his colleagues can demonstrate a correlation between the areas of the brain that control suckling and those that trigger hiccups, the purpose of the mysterious singultus will remain just that — a mystery.
