Bias in AI is well established, and the latest generative systems such as DALL-E and Stable Diffusion are no exception. A new tool designed by an AI ethics researcher raises awareness of the problem by letting anyone query two popular text-to-image systems, showing how certain word combinations produce biased results. Users choose from 20 descriptive words (e.g., assertive, compassionate) and 150 jobs. Inherent bias means "CEO" almost always generates a male image, unless it is paired with supposedly feminine qualities such as "supportive."