Point of interest or additional context that can be overlaid on a model’s output to further indicate its behavior

Work In Progress

Our Elements guide is still in progress, and therefore lacks full visual and technical assets. We hope to release them by summer of 2020. Thanks for reading Lingua Franca!


A signal is a visual indicator of an AI model’s focus, allowing a user to discern where the model has found relevant or anomalous information. Signals are most prominent in image recognition, where an image may contain a large amount of irrelevant information but only a few key regions of importance. These regions can then be annotated on the original data for users to interpret.
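As a rough illustration of how such key regions might be extracted from an otherwise noisy map, here is a minimal sketch. The saliency values, the threshold, and the `signal_regions` helper are all hypothetical, not taken from any particular model:

```python
# Hypothetical sketch: picking out signal regions from a model's saliency map.
# The map and threshold below are illustrative assumptions, not real model output.

def signal_regions(saliency, threshold=0.5):
    """Return (row, col) cells whose saliency exceeds the threshold.

    These cells are the 'signal': the few regions worth annotating
    on the original image for the user to interpret.
    """
    return [
        (r, c)
        for r, row in enumerate(saliency)
        for c, value in enumerate(row)
        if value > threshold
    ]

# A tiny 3x3 saliency map; most cells are irrelevant background.
saliency = [
    [0.1, 0.2, 0.1],
    [0.1, 0.9, 0.8],
    [0.0, 0.1, 0.2],
]

print(signal_regions(saliency))  # → [(1, 1), (1, 2)]
```

In a real interface these coordinates would drive the annotation layer (a highlight, outline, or heatmap) rather than being shown as raw numbers.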


Signals were initially devised as a form of ‘explainability’ that could inform a user as to why an image was categorized a certain way. However, the signal is best understood as a human-assistive feature that requires some training and experience on the human’s part. Because the signal is a low-fidelity feature, it may seem confusing or outright incorrect. For example, in an image of a hockey puck, the highlighted feature may actually be the hockey stick, since pucks commonly appear near hockey sticks in training images. Therefore, it is important to think of the signal as just one of several mechanisms for building human intuition about an AI, rather than as a reliable indicator.


This element is often called the AI’s attention[1], though it does not require the specific mechanism of attention used in modern neural networks.

Signals are typically displayed with an overlay such as a heatmap or mask. A good practice with signals is to reduce the overlay’s level of detail to match the signal’s implied level of importance. You can do this by blurring, fading, or rounding values.


  1. Attention Is All You Need by Vaswani, et al. ↩︎