We’re kicking off a mini blog series dedicated to Telehealth. For those of you unfamiliar with this concept, telemedicine (also known as telehealth) is the practice of medicine using technology to deliver care at a distance. Essentially, a physician in one location uses a telecommunications infrastructure to deliver care to a patient at a distant site.
A Brief History and Timeline
Telehealth has been making its mark on the healthcare community for decades. To date, nearly 90% of healthcare professionals say their organizations are developing or have already developed a telehealth application. This has been made possible by years of technological innovation and federal regulation. The story dates back to 1948, when the first radiological images were sent via telephone line. Doctors began using this method to share radiological images with specialists in other locations and speed up the transfer of diagnostic data.
We wouldn’t have telehealth today had it not been for the internet, which became widely available in the 1990s. With a globally interconnected computer network, healthcare professionals worldwide could now send and share critical data and information with just a few clicks. Ultimately, this laid the foundation for our modern healthcare system.
In 1993, the American Telemedicine Association (ATA) was created as a non-profit organization to push for better resources, standards, regulations, and legislation for telemedicine. It wasn’t until 1999 that the Centers for Medicare and Medicaid Services (CMS) began paying for telehealth consultations for patients living in underserved communities and rural areas.
In the early 2000s, video conferencing took off as programs and applications like Skype made video chat an everyday technology for people worldwide. The 2010s marked a decade of rapid expansion in telehealth as healthcare organizations in the U.S. looked for ways to reduce costs while still providing convenient, quality care for patients.
In 2020, telemedicine is estimated to be a $34 billion industry and a key part of our modern healthcare system. Take the current COVID-19 pandemic as an example. Health officials and physicians are urging people to stay home if they are unwell and to avoid visiting the ER unless they are severely ill. While this may seem a bit extreme to some, the goal is to minimize hospital overcrowding and ensure that critical healthcare resources aren’t stretched too thin. Recent stories from Italy and China provide alarming examples of what happens when hospitals and healthcare resources become overwhelmed with an influx of patients.
It’s important to note that telemedicine can’t test for or treat COVID-19 or other infectious diseases, nor can it predict whether a patient’s symptoms will worsen after the visit. However, telehealth is a powerful triage tool that can help keep patients unlikely to have COVID-19 or another infectious disease out of the emergency department, which keeps costs down, prevents overcrowding, and helps reduce the burden on medical staff.
Telehealth still has a long way to go before it becomes the backbone of our healthcare system. There are challenges to overcome, but its accomplishments and milestones illustrate how far this technology has come over the years and how valuable it will be in the future.
Stay tuned for Part 2 of our mini-series, where we discuss how telehealth is used and its benefits.