
 Nice View (Posted on 2004-09-29)
Assuming that the earth is a perfect sphere, in units of the earth's radius, how high must one be to see exactly one half of the earth's surface?

Okay... okay... how about exactly one third of the earth's surface?

 No Solution Yet Submitted by SilverKnight Rating: 3.1667 (6 votes)

 solution | Comment 1 of 7

The area of a spherical cap relative to the area of the whole sphere is (1-cos r)/2, where r is the angular radius of the cap: the radius of its boundary circle measured as a great-circle arc, i.e., the angle it subtends at the center of the sphere.  We want this to be 1/3, so

1-cos r = 2/3
or
cos r = 1/3
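The cap-area formula is easy to verify numerically. Here is a sketch in Python (the function names are mine, not from the post) that checks the closed form (1-cos r)/2 against a direct ring-by-ring integration of the cap's area on a unit sphere:

```python
import math

def cap_area_fraction(r):
    """Fraction of a sphere's surface covered by a spherical cap of
    angular radius r (the great-circle angle from the cap's center)."""
    return (1 - math.cos(r)) / 2

def cap_area_fraction_numeric(r, n=200000):
    """Check the closed form by summing thin rings: on a unit sphere a
    ring at polar angle theta has circumference 2*pi*sin(theta), and
    the whole sphere has area 4*pi."""
    dtheta = r / n
    area = sum(2 * math.pi * math.sin((i + 0.5) * dtheta) * dtheta
               for i in range(n))
    return area / (4 * math.pi)

r = math.acos(1 / 3)                 # the cap angle with cos r = 1/3
print(cap_area_fraction(r))          # approximately 1/3
print(cap_area_fraction_numeric(r))  # agrees to many decimal places
```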

Draw one line from the center of the earth to the elevated observer, another from the observer to any point on the earth's surface that lies on his horizon, and a third as the radius from that point back to the center.  The horizon line is tangent to the sphere, so it meets that radius at a right angle, and the angle at the center between the radius and the first line is r, with cos r = 1/3.  The observer is therefore 3 earth radii from the center of the earth, or 2 earth radii above the earth's surface immediately beneath him.
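The same geometry answers the problem for any visible fraction f: the cap formula gives cos r = 1 - 2f, and the right triangle gives a distance from the center of 1/cos r earth radii. A small sketch (the helper name is mine):

```python
import math

def height_for_fraction(f):
    """Height above the surface (in earth radii) needed to see a
    fraction f of a perfect sphere's surface.  From (1-cos r)/2 = f
    we get cos r = 1 - 2f, and the tangent-line right triangle gives
    a distance from the center of 1/cos r radii."""
    cos_r = 1 - 2 * f
    if cos_r <= 0:
        # Seeing a full half (or more) would require infinite height.
        return math.inf
    return 1 / cos_r - 1

print(height_for_fraction(1 / 3))  # approximately 2 earth radii
print(height_for_fraction(1 / 2))  # inf: exactly half is unattainable
```

This also makes the answer to the puzzle's first question explicit: as f approaches 1/2 the required height grows without bound, so no finite height shows exactly half the surface.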

The above formula for the area of a spherical cap is the basis for rectangular equal-area projections of the world, an old idea that someone named Peters more recently claimed to have "invented" as the Peters projection.

 Posted by Charlie on 2004-09-29 13:46:13
