“America is back!” said Joe Biden this week, as he prepares to take over as president. It’s a sentiment that has, of course, been championed by liberals, but what exactly does it mean? And did America ever really go away? If you ask any critic of Donald Trump, the answer …