I mean, I guess it depends on how you define things. You could easily make the case that the US is an empire. We have territories everywhere and bases on every continent.
Well, American imperialism is alive and well, so yeah, the United States is definitely an empire, just not in the traditional sense.