GRAM: GRAPH-BASED ATTENTION MODEL FOR HEALTHCARE REPRESENTATION LEARNING


Edward Choi1, Mohammad Taha Bahadori1, Le Song1, Walter F. Stewart2 & Jimeng Sun1
1Georgia Institute of Technology, 2Sutter Health

ABSTRACT

Deep learning methods for predictive modeling in healthcare have started showing promising performance, but two important challenges remain:

Data insufficiency: Often in healthcare predictive modeling, the sample size is insufficient for deep learning methods to achieve satisfactory results.

Interpretation: The representations learned by deep learning models should align with the ground truth medical knowledge.

To address these challenges, we propose GRAM, a GRaph-based Attention Model that combines electronic health records (EHR) and a medical ontology. By dynamically combining the ancestors of a medical concept from the ontology via an attention mechanism, GRAM learns interpretable representations for medical concepts, leveraging both the concepts recorded in the EHR and the hierarchical structure of the ontology.
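The following minimal sketch (Python/NumPy) illustrates the core idea, not the paper's exact formulation: a leaf concept's final representation is a convex combination of the base embeddings of the concept itself and its ontology ancestors, with weights produced by a softmax over compatibility scores. The names gram_embedding, base_emb, and W are illustrative placeholders, and the simple dot-product scorer stands in for GRAM's learned attention scoring function.

import numpy as np

def softmax(x):
    # Numerically stable softmax.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def gram_embedding(concept, ancestors, base_emb, W):
    """Embed a leaf concept as an attention-weighted sum of the base
    embeddings of the concept itself and its ontology ancestors.

    concept: index of the leaf concept
    ancestors: list of ancestor indices from the ontology
    base_emb: (num_concepts, d) matrix of learnable base embeddings
    W: (2d,) weight vector of a simple scoring function (a stand-in
       for GRAM's learned attention network)
    """
    nodes = [concept] + list(ancestors)
    # Compatibility score between the leaf and each candidate node.
    scores = np.array([
        np.dot(W, np.concatenate([base_emb[concept], base_emb[j]]))
        for j in nodes
    ])
    alpha = softmax(scores)          # attention weights, sum to 1
    return alpha @ base_emb[nodes]   # final concept representation

# Toy usage: 5 concepts, 4-dim embeddings; concept 0 has ancestors 3 and 4.
rng = np.random.default_rng(0)
base_emb = rng.normal(size=(5, 4))
W = rng.normal(size=(8,))
g0 = gram_embedding(0, [3, 4], base_emb, W)
print(g0.shape)  # (4,)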

We conduct predictive modeling experiments for disease progression and heart failure prediction using GRAM and other baselines. Compared to a basic recurrent neural network (RNN), GRAM achieves 10% higher accuracy for predicting less common diseases and a 3% higher area under the curve for predicting heart failure with small training data. Unlike the baseline methods, the concept representations learned by GRAM are clinically meaningful and well aligned with the structure of the ontology. Finally, GRAM exhibits intuitive attention behavior, adaptively generalizing to higher-level concepts when data for lower-level concepts is insufficient.