A solar system with a star like our own sun and a planet like Jupiter has been discovered 90 light years from us. If the star is just as bright as our sun, how does it compare in brightness to the other stars in our night sky?
Background information:
Light from the sun takes 8 minutes to reach the earth. The magnitude system for stars and other astronomical objects in our sky is designed so that a difference of 5 magnitudes represents a factor of 100 in brightness, with larger magnitudes being dimmer. The dimmest stars visible to the unaided eye in a dark sky are about magnitude 6; the brightest stars (other than the sun) are about 0. The planet Venus at times reaches magnitude -4, the full moon -14, and the sun -27.
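That rule can be written as a formula: a brightness ratio R corresponds to a magnitude difference of 2.5 x log10(R). Here is a minimal Python sketch of it (the function name is my own, just for illustration):

    import math

    # The magnitude rule above: a factor of 100 in brightness is 5
    # magnitudes, so a ratio R corresponds to 2.5 * log10(R) magnitudes.
    def magnitude_difference(brightness_ratio):
        return 2.5 * math.log10(brightness_ratio)

    print(magnitude_difference(100))       # 5.0  (one step of the scale)
    print(magnitude_difference(100 ** 2))  # 10.0 (two steps)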
I'm not an astronomer, but this is what I've computed, given the data in the problem:
90 light years = 90 x 365.25 x 24 x 60 = 47336400 light minutes.
Since light from the sun takes 8 minutes to reach the earth, this is 47336400 / 8 = 5917050 times the distance from the earth to the sun.
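As a sanity check, here is that arithmetic in Python, using only figures given in the problem (variable names are mine):

    LIGHT_YEARS = 90
    MINUTES_PER_YEAR = 365.25 * 24 * 60  # minutes in a year
    SUN_LIGHT_MINUTES = 8                # light travel time, sun to earth

    light_minutes = LIGHT_YEARS * MINUTES_PER_YEAR  # 47336400.0
    au_ratio = light_minutes / SUN_LIGHT_MINUTES    # 5917050.0
    print(light_minutes, au_ratio)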
Brightness goes down as the square of distance, so this star is dimmer than the sun by a factor of 5917050^2 = 3.50x10^13. Taking logarithms, log10(3.50x10^13) = 13.544, so this factor is 10^13.544 = 100^6.772.
So since a difference of 5 magnitudes represents a factor of 100, a factor of 100^6.772 corresponds to 6.772 x 5 = 33.86 magnitudes. Since the sun is given as magnitude -27, this star would have a magnitude of about -27 + 33.86 = 6.86, just too faint to be seen with the unaided eye.
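And here is the rest of the computation in the same Python sketch, continuing from the distance ratio above:

    import math

    AU_RATIO = 5_917_050  # star's distance in earth-sun units (from above)
    SUN_MAGNITUDE = -27   # given in the problem

    dimming = AU_RATIO ** 2                   # inverse-square law: ~3.50e13
    exponent = math.log10(dimming) / 2        # dimming = 100**exponent: ~6.772
    magnitude = SUN_MAGNITUDE + 5 * exponent  # 5 magnitudes per factor of 100
    print(round(magnitude, 2))                # 6.86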