I've been feeling for months now that WWE is reaching the point where they're just so ubiquitous... the image around them has shifted dramatically from the niche sideshow attraction wrestling has always been treated as to a true mainstream sports league. It's wild, they're almost at UFC/NBA/NFL levels of being part of the pop cultural zeitgeist. If they continue on the path they're on, it's hard to imagine them ever receding back to the low points we've seen throughout different eras.