Rick and Morty used to be everything modern TV wasn’t - bold, offensive, and brutally smart. But once Hollywood’s “woke” culture took over, the show lost its edge and its honesty. The unpredictable ...