I feel like a lot of zombie fiction where characters already know what zombies are and the dangers of getting bitten ends up being semi-satirical comedy. Movies and shows where the idea of zombies didn't previously exist seem to be a bit more serious, from what I've experienced. I don't know if it's the aura of suspense and mystery or because it leads to more pandemonium.
I think half the fun of a zombie movie is the confusion and disarray of a society that has no idea what's going on: why is Mom trying to bite me, why is my sister already lying in a pool of blood!? So yes, I think it works best for the characters to not have zombie fiction in their world, so they learn what's happening and why getting bitten is such a bad thing.
Zombies are just a tool. The best monster media is always about something else that's more important than the zombies: trust in strangers (Dawn of the Dead), the breakdown of society (28 Days Later), moral decisions (The Last of Us), or the distribution of rights and the power of corporations (Resident Evil). The rest is an excuse for action and gore.