Woo it's been a while since I've seen such an obvious Google Translate fail
(With the knowledge that épeautre is a grain it's pretty easy to figure out what went wrong here)
The thing is that "asylum" has a specific meaning that differs by language: in pt-BR, asilo is understood as a home for the elderly, while in the Netherlands it's for refugees. The general meaning is correct, but "being stuck in an asylum" is not the original idea; the Dutch literally means opvang: to provide asylum, opvangen.
This happened to me earlier today; not quite a Google Translate fail (Google Translate got it right), but a bad translation offered when I simply Googled the word.
A trip to translate.google.com confirmed my suspicion that "cotorrito" translates to "parrot", but when I first just Googled "cotorrito" this was the top result.
Slightly more amusing: the Japanese name of the product is written in katakana as /zerībinzu/, which Google translates as "jelly bottles". My best guess is that it's parsing /bin/ as the Japanese word 瓶 (bottle), but /zu/ as the English plural marker -s. Which is... certainly a choice.
I don't use Google Translate much, but I see horrible DeepL (and various other AI/ML translators) fails pretty much every time I use these tools. They kinda sorta manage English OK, but any other language gets completely butchered. It's not even funny how horrible these tools are.
When I use these tools, I basically have to edit every single sentence anyway, often significantly. And I'm not just talking about stylistic changes: these tools frequently change the meaning of a sentence, leave out important parts or, surprisingly, fabricate new statements that were not there.
Also, it frankly seems that instead of improving, these tools have been getting worse lately. Some suggest it's because the ML tools are more and more often learning from their own previous bad output that keeps flooding the internet, and it wouldn't surprise me in the slightest.