Just found out that the simple `Disallow: /` robots.txt I had set up initially was probably undone by upgrading Discourse.
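For reference, the standard block-everything robots.txt is just:

```
User-agent: *
Disallow: /
```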
Discourse doesn’t have an easy way to modify robots.txt directly, but I found a setting, blacklisted_crawler_user_agents, which I’ve now set to *. That should do the trick and remain resistant to upgrades.
Good to keep an eye on it after future upgrades, though.
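A quick sanity check after each upgrade (the hostname below is a placeholder for your own forum):

```
curl -s https://forum.example.com/robots.txt
```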