The vast majority of HD content available today is recorded at 720p or 1080i/p, and 1080p is clearly superior to 720p or 1080i. That is why 1920x1080 is considered the best. If more content were recorded at higher resolutions and mainstream media made one of them the norm, that resolution would be considered the best instead.
"Full HD" is a marketing label, and every new high-definition format gets similar hype.
1080p is considered the best for two reasons:
- 1080p content is widely available
- 1080p screens are common
As far as actual screens go, the higher the pixel density, the sharper the picture. 1080p screens are generally available in smaller sizes than 2560x1600 screens, which gives them a higher pixels-per-inch density despite having fewer total pixels. Comparing monitors of the same size (e.g. a 30" 1920x1080 and a 30" 2560x1600 monitor), the 2560x1600 would be sharper. 2560x1600 is not a common video format, however, so 1080p content would actually look better on the 1080p monitor because the pixels map 1:1 with no scaling.
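The density comparison above comes down to a short calculation: pixels per inch is the diagonal pixel count divided by the diagonal size in inches. A minimal sketch, using the hypothetical 30" panels from the example:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal length in pixels divided by diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# Same 30" panel size, two resolutions (the example from the text)
full_hd = ppi(1920, 1080, 30)   # ~73 PPI
wqxga   = ppi(2560, 1600, 30)   # ~101 PPI
print(round(full_hd, 1), round(wqxga, 1))
```

At the same physical size, the 2560x1600 panel packs noticeably more pixels into each inch, which is exactly why the same resolution in a smaller panel also looks sharper.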
There are formats that far surpass 1080p, such as 4K, but at this point in time they are not practical.
The FCC chose the resolutions that are designated high definition for broadcast TV, which tops out at 720p or 1080i (most channels are 1080i, not 1080p). That is why 1080p, which you can get from Blu-ray and some pay-per-view movies on satellite, is considered "full HD."
A few new TVs are coming out under the names UHD or 4K. These have twice the horizontal and vertical resolution of full HD (four times the pixels), but there is no native material to watch on them, so they have to upscale the source. This can lead to quality issues, and it is questionable whether anyone will sit close enough to actually see the added pixels.
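The size of that jump is easy to check with quick arithmetic; 3840x2160 is the standard UHD frame size:

```python
full_hd_pixels = 1920 * 1080   # 2,073,600 pixels
uhd_pixels = 3840 * 2160       # 8,294,400 pixels

# UHD doubles each dimension, so the total pixel count quadruples
print(uhd_pixels / full_hd_pixels)  # → 4.0
```

Four times the pixels also means roughly four times the data to store and transmit, which is part of why native 4K material has been slow to arrive.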
Computer monitors come in many resolutions so that people who need a lot of detail, such as for photographic or CAD work, can function.