This is the first in a series of articles describing how different types of laboratory tests are developed, validated, and made available to patients and their healthcare providers. This first article looks at why new tests are developed and how they eventually reach the patients they are meant to help.
We often see and hear news about new laboratory tests developed to detect or manage conditions and diseases that affect our lives or the lives of people we know. As with other products and services, new laboratory tests are meant to satisfy a need: to help us and our healthcare providers screen for, diagnose, or monitor conditions more quickly, more easily, and with greater confidence.
But how does a test that shows promise in the research stages actually become available for use at your doctor's office, clinic, or hospital? What does it mean for you, the healthcare consumer, when a new test is announced or when headlines tout the latest research? How are your healthcare needs met by new tests, and how is your health protected from tests that might misinform or mislead your healthcare provider? Becoming familiar with how laboratory tests move through the development, validation, and approval stages and into everyday practice may help you answer these questions and put the latest headline news into appropriate context.
It may take years for a new test to pass through the many phases – research, testing, clinical evaluation, development of manufacturing processes, and regulatory review – before it is available for use. It is an intensive process with no assurance that the test, once developed and validated, will actually be adopted by healthcare providers. That is why the first step is usually determining whether the proposed test will be useful to patients and healthcare practitioners.