1080p means a full 1920 x 1080 pixel frame is displayed on the screen every 1/60th of a second.
1080i means half the picture - 1920 x 540 pixels (the odd scanlines) - is displayed in 1/60th of a second, and then the other 1920 x 540 pixels (the even scanlines) are displayed in the next 1/60th of a second. That alternation is what "interlaced" means.
IOW, the total number of pixels displayed per second is the same for both, but 1080i has half the raw data rate. However, HD camcorders almost always compress the video with a codec such as H.264 (AVCHD) or MPEG-2 (HDV), and after compression the real difference is much smaller: inter-frame compression mostly stores what changed between frames, and there usually isn't much more change over 1/30th of a second than over 1/60th, unless you are recording fast-action scenes like sports or racing.
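The raw-data-rate claim above is easy to check with back-of-the-envelope arithmetic. A sketch (assuming 8-bit RGB, i.e. 3 bytes per pixel, and the 60-updates-per-second timing described above):

```python
BYTES_PER_PIXEL = 3  # assumption: 8-bit RGB

def raw_rate(width, height, updates_per_sec):
    """Uncompressed bytes per second for a given pixel grid and update rate."""
    return width * height * BYTES_PER_PIXEL * updates_per_sec

rate_1080p = raw_rate(1920, 1080, 60)  # full frames, 60 per second
rate_1080i = raw_rate(1920, 540, 60)   # half-height fields, 60 per second

print(f"1080p60 raw: {rate_1080p / 1e6:.0f} MB/s")   # 373 MB/s
print(f"1080i60 raw: {rate_1080i / 1e6:.0f} MB/s")   # 187 MB/s
print(f"ratio: {rate_1080p / rate_1080i:.0f}x")      # 2x
```

So before compression, 1080p60 really does carry exactly twice the data of 1080i60; it's the codec that closes most of that gap in practice.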
Finally, you can always convert 1080i to 1080p using software: a deinterlacer weaves or interpolates the two fields into full frames, faking the intermediate frames between the actual 1/30th-second pairs.
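To make the deinterlacing idea concrete, here is a toy pure-Python sketch (hypothetical code, not any real deinterlacer) of the two simplest strategies: "weave", which interleaves two fields into one full frame, and "bob", which line-doubles a single field when its partner can't be used. Each field is just a list of scanlines:

```python
def weave(top_field, bottom_field):
    """Interleave scanlines from two half-height fields into one full frame."""
    frame = []
    for top_row, bottom_row in zip(top_field, bottom_field):
        frame.append(top_row)     # even lines come from the top field
        frame.append(bottom_row)  # odd lines come from the bottom field
    return frame

def bob(field):
    """Line-double one field: repeat each scanline to fake the missing lines."""
    frame = []
    for row in field:
        frame.append(row)
        frame.append(row[:])  # duplicated line stands in for missing data
    return frame

# Toy 4-pixel-wide fields, two scanlines each, weaving into a 4x4 frame:
top = [[1, 1, 1, 1], [3, 3, 3, 3]]
bottom = [[2, 2, 2, 2], [4, 4, 4, 4]]
print(weave(top, bottom))
```

Real deinterlacers (e.g. the yadif filter in FFmpeg) are smarter - they switch between strategies per pixel depending on detected motion - but the fields-into-frames idea is the same.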
I have an old SD (640 x 480) camcorder that uses mini-DV tapes. DV is only lightly compressed (intra-frame, roughly 5:1), and one hour of tape is about 12 GB of data. So you can imagine how much storage truly raw HD video would require...