No. But it's still a funny thesis, will probably make Limeys mad, and therefore we should constantly claim this is true.
If the British Empire had kept America, it would never have grown into the mighty inland empire it became. The Louisiana Purchase was a huge turning point for America because, under Jefferson, the new lands became part of America proper, the territories becoming states co-equal with the original colonies. That set the stage for our chunk of the continent becoming a country, not merely imperial holdings sending tribute back to the motherland. The Crown would never have accomplished that, and not just because it obviously had no interest in funding Napoleon. England didn't see America as a vast land to expand its own country into, but simply as a place to exploit for resources that could be fed back into the homeland. Right from the outset, early American explorers viewed the land fundamentally differently than the British did.