Kenny Walter is an editor with HCPLive. Prior to joining MJH Life Sciences in 2019, he worked as a digital reporter covering nanotechnology, life sciences, material science and more with R&D Magazine. He graduated with a degree in journalism from Temple University in 2008 and began his career as a local reporter for a chain of weekly newspapers based on the Jersey shore. When not working, he enjoys going to the beach and enjoying the shore in the summer and watching North Carolina Tar Heel basketball in the winter.
Past attempts to use neuroimaging to screen for ADHD have failed.
Screening for attention deficit/hyperactivity disorder (ADHD) has not changed much over the years.
Screening often includes a self-screening questionnaire to help the patient recognize the signs and symptoms of ADHD. For adolescent patients, the screening process can also include interviews with teachers, family members, and others.
While there are tools to help the psychiatrist screen patients for the disorder, a high degree of suspicion remains the most important aspect of screening for ADHD.
Doctors look not only at the individual symptoms each patient must have, but also at how those symptoms affect the patient's life. For example, ADHD is known to impair school or work function, so most patients diagnosed and treated for ADHD can readily give an example of how those functions have been affected.
In an interview with HCPLive®, Andrew J. Cutler, MD, Clinical Professor of Psychiatry at SUNY Upstate Medical University, explained why current screening practices are effective and why there is hope for new techniques and technologies to make an impact in the future.
Cutler said there are currently no biomarkers or clinically relevant sensitivity tests useful and accurate enough to be used on a wide scale for ADHD.