Biography
Avinash (“Avi”) Das Sahu, PhD, is an assistant professor at the UNM Comprehensive Cancer Center and the Department of Computer Science, where he leads the TumorAI lab (tumorai.org). His group designs interpretable, clinician-ready AI that links molecular signals to real-world decisions in cancer care. The group builds AI “co-pilots” on two fronts: (1) genomics, including Protein2Text and the forthcoming Mutation2Text, which explain variants, especially variants of unknown significance (VUS), with clear, rationale-based text; and (2) medical imaging, where EndoPilot and the multi-agent MED-X framework support transparent, human-in-the-loop analysis of endoscopic images for colorectal cancer. Guided by a simple aim, to “prevent the preventable and treat the treatable,” he collaborates closely with oncologists and geneticists to move these tools from algorithms to clinic-ready prototypes. His work has been recognized with awards such as the OCRA CDRG (PI), the NIH Pathway to Independence Award (K99/R00), and the Michelson Prize, following training at the University of Maryland/NHGRI and postdoctoral work at Dana-Farber/Harvard and the Broad Institute. Beyond publishing in venues such as Cancer Discovery and Nature Communications, his lab emphasizes open science, inclusive mentoring, and deployment on high-performance computing resources to broaden access to expert-level decision support.
SRP Project Title
Bridging Molecules and Medicine: Explainable AI for Genetics and Imaging
NAIRR Project
Advancing Colorectal Cancer Diagnosis: A Multimodal AI Copilot for Real-Time Endoscopy
Topical Areas
Artificial Intelligence and Intelligent Systems; Biochemistry and Molecular Biology; Clinical Medicine; Health Sciences
Abstract
Millions of people face deep uncertainty in their fight against cancer. Two major hurdles often stand in the way of clear answers. First, genetic testing frequently returns ambiguous results called "variants of unknown significance" (VUS), leaving nearly 40% of patients in an anxious "diagnostic limbo" without clear medical guidance. Second, a nationwide shortage of specialists means that interpreting crucial medical images, such as endoscopies, can be slow and inconsistent, delaying life-saving diagnoses. Our lab is pioneering a solution by building AI "co-pilots" for medicine. We have already developed the core technology: powerful AI models that translate complex genetic data (Protein2Text) and interpret endoscopic images (EndoPilot). This project focuses on the most promising applications of these tools. We will create Mutation2Text to demystify VUS with clear, understandable rationales. Concurrently, we will build MED-X, a framework in which a team of AI agents works together to help doctors spot signs of colorectal cancer with greater accuracy. The project will deliver a unified pipeline that empowers clinicians, provides clear answers to patients, and makes expert-level care accessible to all.
Desired Skills
We are building a diverse team and welcome collaborators from all backgrounds. Whether you are a coder, biologist, future doctor, or creative problem-solver, there is a place for you. We seek faculty and students with interests in:
Technology & Data Science: help build and train cutting-edge AI models using skills such as Python and machine learning.
Biology & Health Sciences: use your background in genetics or life sciences to guide our AI’s reasoning and ensure its insights are clinically useful.
Human-Centered Thinking: help us create trustworthy, explainable AI that doctors can confidently use to improve patient care.
Lightning Talk Title
Explainable AI Co-Pilots for Genes and Endoscopy
Keywords
Explainable AI; Genomics; Variant interpretation; Endoscopy; Multimodal LLMs; Human-AI collaboration; Cancer prevention; Precision oncology; Ovarian cancer; Colorectal cancer.