There’s a chance that, like many people around the country, you have been slightly more tired this week than usual. It may be due to midterms, papers and general stress before spring break, but daylight saving time may also have played a role. The tradition of setting clocks ahead one hour in spring and back one hour in fall has been a formal yearly part of our lives ever since Woodrow Wilson signed it into law in 1918 as a means of saving energy during the war. The idea itself supposedly originated with Benjamin Franklin over a century earlier, but has it really kept pace with industrialization and the transition to modern society?
Contrary to what many think, daylight saving was not implemented as a helping hand to farmers. In fact, many farmers would rather have the additional hour of sunlight in the morning than in the evening. Rather, the idea originated from a desire to conserve energy: if there were an additional hour of daylight later in the day, less artificial lighting would be needed. In Franklin’s day, that meant candles and lamp oil. Today, the justification is cutting back on electricity.
However, according to empirical evidence, daylight saving does not come close to accomplishing what it is theoretically supposed to. The National Bureau of Economic Research studied Indiana, which adopted daylight saving statewide only in 2006. The study found that overall electricity consumption increased by 1 percent, with a roughly 2 to 4 percent increase during the fall months. Franklin was right that the demand for artificial lighting would fall; household lighting use did decrease. But that saving was accompanied by increased electricity spent on heating and cooling, which not only canceled out the lighting gains but led to a net increase in energy consumption.
In addition to missing its intended goal, daylight saving time has adverse unintended consequences. A 2009 Journal of Applied Psychology study found that mine workers, who already face elevated injury risk because of the nature of their jobs, experienced 5.7 percent more workplace injuries in the week directly following the springtime daylight saving transition than during any other part of the year. A 2008 Swedish study showed that the rate of heart attacks during the first three weekdays following the “spring forward” increased by about 5 percent over the average rate. Both effects are attributed to the sudden loss of sleep, which, in the latter case, triggers the release of stress hormones that can cause heart complications. Neither effect is observed after the “fall back” portion of daylight saving, since an additional hour of sleep is usually not harmful.
While losing an hour of sleep may not seem like much, it affects people the next morning and potentially for several days afterward as their bodies adjust to the new sleep schedule. Among the immediate effects is decreased productivity in the workplace. A 2012 Journal of Applied Psychology study found a significant increase in cyberloafing, or Internet distractions during work, in more than 200 metropolitan regions after the loss of an hour of sleep. Chmura Economics & Analytics even went so far as to quantify the lost productivity, estimating it at almost $434 million in the U.S. alone.
So if daylight saving does not actually save us energy, and instead puts us in a groggy state of mind that can lead to increased injury risk and hundreds of millions of dollars in lost productivity, why do we continue to cling to it? Perhaps it is because we enjoy having an extra hour of sunlight on summer evenings, and changing clocks is easier than changing our schedules. Perhaps it is simply a tradition we have held onto for almost a century without question. But times have changed since Big Ben wrote his letters, and if we want to talk about practicality, daylight saving either needs to be restructured to minimize its harmful side effects or abolished entirely.