I'd be impressed if this were the reasoning GPT provided, e.g. "I don't think this vegan wolf likes cabbage". But when asked to explain itself (see the "debugging" comment above), it states nothing of the sort.
Also, a reasoning person would understand that in the context of a riddle like this, "vegan wolf" means "a wolf that eats cabbages" even if this isn't spelled out.
GPT could be a contrarian, trying to subvert the terms of the riddle and fight over every word ("it depends on what the definition of 'is' is"), but we know it's not set up to behave like that, so we can rule it out.