Solar observing, Newton's Rings – artifacts of narrow band filtering, use it for good not evil

Drexel Glasgow posted images on 14 Aug 2024 on Facebook with the title “With the Altair GPCAM 130M. Plenty newton rings”.

Kirtsunegari posted some animated GIFs in StarGazersLounge showing what you likely see when you look at live data, and what the software has to guess at when it tries to separate the “really on the sun” things from the “probably vibrations at the sensor” and “path from sun to camera pixels” variations. If the filter is very narrow band, the lenses in the telescope are effectively looking at a laser-like source with a finite range of intensity shaped by the filter function. Even though the pattern is projected over an image of the “true” sun, I think these interference effects come mostly from the lenses. I would not keep changing lenses or equipment or software until I had written down all the relevant mathematical and calibration relations and traced the rays from the sun for the “really on the sun” data.
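Here is a minimal sketch of that separation, assuming the frames are already registered to the solar disk and that the rings, dust and sensor nonuniformity are the parts that do not change from frame to frame. The function name and the synthetic data are mine, not anything from the camera or capture software.

```python
import numpy as np

def split_static_and_dynamic(frames):
    """frames: 3D array (n_frames, height, width) of registered solar images."""
    # Per-pixel temporal median estimates the fixed pattern: rings, dust, gain.
    static = np.median(frames, axis=0)
    static = np.where(static > 0, static, 1.0)   # avoid dividing by zero in dark corners
    # Ratio image keeps what actually changes: solar evolution plus residual jitter.
    dynamic = frames / static
    return static, dynamic

# Example with synthetic frames standing in for a real capture:
rng = np.random.default_rng(0)
stack = 1000.0 + 50.0 * rng.standard_normal((32, 256, 256))
rings_estimate, residual = split_static_and_dynamic(stack)
print(rings_estimate.shape, residual.mean())
```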

If multiple telescopes and cameras all share and compare image sequences (same time, same location on the sun) in order to do optical interferometry and use that to improve the images (3D, time, intensities and wavelengths), they would all need to calibrate their lenses and their camera sensor settings and intensities. A common target on the sun could be a start to calibrating a global network of telescopes, cameras, software and shared data. That network could include Solar Dynamics Observatory (SDO) and AIA data from space-based sensors too. Maybe Elon Musk can send up some more, or put one around the Moon or Mars.
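One cross-calibration step might look like the sketch below: given co-registered cutouts of the same region at the same time from two instruments, fit a gain and an offset so one camera can be expressed on the other's intensity scale. The linear model and the names patch_a and patch_b are my assumptions, not any observatory's published pipeline.

```python
import numpy as np

def fit_gain_offset(patch_a, patch_b):
    """Least-squares fit of patch_a ~ gain * patch_b + offset."""
    a = patch_a.ravel()
    b = patch_b.ravel()
    design = np.column_stack([b, np.ones_like(b)])
    (gain, offset), *_ = np.linalg.lstsq(design, a, rcond=None)
    return gain, offset

# Usage with synthetic patches standing in for real shared-target cutouts:
rng = np.random.default_rng(1)
truth = rng.uniform(100.0, 4000.0, size=(64, 64))
cam_a = truth
cam_b = (truth - 20.0) / 1.7 + rng.normal(0.0, 5.0, size=truth.shape)
print(fit_gain_offset(cam_a, cam_b))   # should come back close to (1.7, 20)
```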

I am so new to this group that I hesitate to say anything. I looked at Eric Weisstein's Newton's Rings article and the Wikipedia one. But I have to work it out by hand on paper, or try to use mathematical software. I am highly biased because I grew up in a large (poor) family and spent more time in trash dumps, street trash and dumpsters looking for parts than spending my meager earnings on rich-person toys I could never afford.

https://content.invisioncic.com/g327141/monthly_2020_06/whyyyyy.gif.3f728e183336409090787a4a0ebdc316.gif

https://en.wikipedia.org/wiki/Newton%27s_rings

https://scienceworld.wolfram.com/physics/NewtonsRings.html

Putting things on the web in a static page or PDF is the kiss of death to understanding most of the time. And encapsulating them in rigid software where the source is not clearly written, traceable and shared is painful for individuals and death to groups. I have spent every day for the last 26 years studying tens of thousands of cases and systemic issues on the Internet to see what would be better for global and heliospheric groups. There might be humans near and on the Moon, near and on Mars, and just bopping around the solar system one day, sharing what they see, what they measure, and the ways they visualize and model things.

In a sense the rings are an artifact of the narrow band filter output hitting lenses that move and shake. A 3-axis accelerometer and gyro might help take out part of that. It can be done with software, but that is so hard to distribute and maintain.
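A rough sketch of the gyro part, under several assumptions: the gyro log is synchronized to the frames, the plate scale is known, and only rotation about the two axes perpendicular to the optical axis matters (getting translation from the accelerometer needs a double integration that drifts quickly, so it is left out). The sample rate, axis order and sign conventions are placeholders for whatever the real mount and sensor provide.

```python
import numpy as np
from scipy.ndimage import shift as subpixel_shift

def dejitter(frames, gyro_rates, dt, arcsec_per_pixel):
    """frames: (n, h, w); gyro_rates: (n, 2) rad/s about the x and y image axes."""
    # Integrate angular rate to a pointing error per frame, then convert to pixels.
    angle = np.cumsum(gyro_rates, axis=0) * dt                 # radians
    pixels = np.degrees(angle) * 3600.0 / arcsec_per_pixel     # image shift in pixels
    corrected = np.empty_like(frames, dtype=float)
    for i, frame in enumerate(frames):
        # Shift each frame back by the measured error; axis order and sign are
        # placeholders that depend on the actual mount and camera orientation.
        corrected[i] = subpixel_shift(frame, (-pixels[i, 1], -pixels[i, 0]), order=1)
    return corrected
```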

Filed as (Solar observing, Newton's Rings – artifacts of narrow band filtering, use it for good not evil)

Richard Collins, The Internet Foundation


That Weisstein article makes more sense if you substitute its r → x and its d → y, to match the x and y used below.
https://scienceworld.wolfram.com/physics/NewtonsRings.html

Here x is the horizontal distance from the center of the image, and y is the vertical distance from the center of the image. And the image center is likely not going to be the Jet Propulsion Laboratory ephemeris direction to the center of mass of the sun at the time the light leaves the sun.

(1) x^2 + (y - R)^2 = R^2, where x is the horizontal distance from the center of the image, and y is the vertical distance as you look at the plane inside the sun. R is the radius of the sun for the precise date and time.

(2) y = R - sqrt(R^2 - x^2)
(R - y)^2 = R^2 - x^2
R^2 - 2*y*R + y^2 = R^2 - x^2

y^2 + x^2 = 2*y*R
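To connect this to the ring pattern (a standard textbook step, nothing specific to this setup): when x is much smaller than R, the gap is approximately

y = x^2 / (2*R)

and in the usual reflection geometry a dark ring sits wherever the round trip through the gap is a whole number of wavelengths, 2*y = m*lambda, which gives

x_m = sqrt(m*lambda*R)

So the rings crowd closer together as m grows, which is the bullseye look in the GIFs above.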

The interference happens when wavelengths from the sun go through your narrow filter and hit your mostly spherical lens.

The x and y are in the lens itself. Except there are probably a bunch of lenses, coatings and filters you never see, can barely find data on, and probably pay dearly for.
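To put rough numbers on the formula above: 656.28 nm is the H-alpha line most narrowband solar filters pass, and the 2 m radius of curvature below is a made-up stand-in, since (as just noted) the real surface and coating data is rarely published.

```python
import math

wavelength = 656.28e-9   # m, H-alpha line passed by typical narrowband solar filters
R_curv = 2.0             # m, assumed radius of curvature of one optical surface

# Dark-ring radii from x_m = sqrt(m * wavelength * R_curv)
for m in range(1, 6):
    x_m = math.sqrt(m * wavelength * R_curv)
    print(f"dark ring {m}: {x_m * 1000.0:.2f} mm from the pattern center")
```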

Richard K Collins


