Health & Fitness

Not just in the summer...WATER is important in the winter too!

by Angela Anderson

Yes! During the colder months of the year we STILL need to stay hydrated with water! We may even need to drink a bit more of it during this time. Winter brings drier air and lower humidity, which contribute to dehydration and increase the risk of colds and flu, dry skin problems, and potential weight gain! Because the days get shorter during the colder months, people tend to DECREASE their physical activity and INCREASE their eating. The body can't function at its best if it's not properly hydrated during the fall and winter months. Some signs of dehydration:

  • Cotton mouth
  • Unusually dark urine
  • Sleepiness
  • Headache
  • Confusion

Remember, water is a natural flusher: it helps rid the body of toxins (which assists in weight loss), aids proper organ, joint, and muscle function, provides cushioning fluid between our bones and joints, improves the appearance of our skin, and eases digestion, just to name a few of the benefits of staying hydrated in the colder months.