Thanks for your work on RGB and chromaticity of stars.
I was looking at two live YouTube videos for the Maunakea, Hawaii all-sky camera and its associated, fairly nicely synchronized, star chart:
Live All Sky: https://www.youtube.com/watch?v=hPWz3mDvAuY
Star Chart: https://www.youtube.com/watch?v=S1xkCMa0uQE
I think these would make good teaching aids for astronomy groups worldwide. That is hundreds of thousands of groups; I am trying to refine the number. There are about 1.92 billion kids aged 5-20, and most of them get a smattering of astronomy, so a short session with a real sky and labeled stars and things would help. Me, I am 72, and never knew the name of a single star. Looking at the needs of the whole Internet, I am forcing myself to learn again.
Anyway, going from star spectra (there are lots of sources, and all the databases seem really clumsy and eclectic) to fairly representative colors on a screen or overlay is a critical step. So is going backwards: taking live camera views at whatever magnification, accumulating pixel statistics at each location over many frames, then checking whether those are consistent or inconsistent with a star of known spectrum and flux at that location.
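For a first pass at that spectra-to-screen-color step, a blackbody approximation plus the CIE 1931 color matching functions gets surprisingly close. A minimal sketch, using the Gaussian fits to the matching functions from Wyman, Sloan & Shirley (2013) and a simplified 1/2.2 gamma instead of the exact sRGB transfer curve:

```python
import math

def cmf_xyz(lam_nm):
    """Approximate CIE 1931 color matching functions (Wyman et al. 2013 fits)."""
    def g(x, mu, s1, s2):
        s = s1 if x < mu else s2          # asymmetric Gaussian lobe
        return math.exp(-0.5 * ((x - mu) / s) ** 2)
    x = (1.056 * g(lam_nm, 599.8, 37.9, 31.0) + 0.362 * g(lam_nm, 442.0, 16.0, 26.7)
         - 0.065 * g(lam_nm, 501.1, 20.4, 26.2))
    y = 0.821 * g(lam_nm, 568.8, 46.9, 40.5) + 0.286 * g(lam_nm, 530.9, 16.3, 31.1)
    z = 1.217 * g(lam_nm, 437.0, 11.8, 36.0) + 0.681 * g(lam_nm, 459.0, 26.0, 13.8)
    return x, y, z

def planck(lam_nm, T):
    # Planck law spectral shape; overall constants cancel after normalization.
    # hc/k = 1.43877688e-2 m K
    lam = lam_nm * 1e-9
    return 1.0 / (lam ** 5 * (math.exp(1.43877688e-2 / (lam * T)) - 1.0))

def star_rgb(T):
    """Blackbody temperature (K) -> gamma-corrected sRGB triple in 0..1."""
    X = Y = Z = 0.0
    for lam in range(380, 781, 5):        # integrate over the visible band
        b = planck(lam, T)
        cx, cy, cz = cmf_xyz(lam)
        X += b * cx; Y += b * cy; Z += b * cz
    # XYZ -> linear sRGB (D65 white point)
    r = 3.2406 * X - 1.5372 * Y - 0.4986 * Z
    g = -0.9689 * X + 1.8758 * Y + 0.0415 * Z
    b = 0.0557 * X - 0.2040 * Y + 1.0570 * Z
    m = max(r, g, b)
    rgb = [max(v, 0.0) / m for v in (r, g, b)]   # normalize brightness, clip out-of-gamut
    return [v ** (1 / 2.2) for v in rgb]         # simple gamma approximation
```

Real stars are not perfect blackbodies (absorption lines, limb darkening), but an effective temperature from a catalog, run through this, gives a defensible on-screen tint: around 20,000 K comes out bluish, around 3,000 K reddish.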
I said “flux”, which is the crux of my question. I know that the light reaching the eye or a sensor can be measured in watts per square meter, with whatever prefix or power of ten is appropriate; I also have “jansky” in the back of my mind. I checked https://en.wikipedia.org/wiki/Radiance, and the names for “watt per square metre” are “irradiance”, “flux density”, “radiosity”, and “radiant exitance” — depending on whether the power is “received”, “leaving”, or “emitted”.
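The unit bookkeeping is at least simple. A sketch, assuming the approximate V-band zero point of about 3640 Jy (Bessell, Vega system) and an effective V bandwidth of about 88 nm around 550 nm — both round numbers, not precision values:

```python
JY = 1e-26            # 1 jansky in W m^-2 Hz^-1 (definition)
F0_V_JY = 3640.0      # approximate V-band zero point (Vega system), in Jy

def vmag_to_flux_density(m_v):
    """Apparent V magnitude -> spectral flux density in W m^-2 Hz^-1 near 550 nm."""
    return F0_V_JY * JY * 10.0 ** (-0.4 * m_v)

def flux_density_to_band_irradiance(f_nu, bandwidth_hz=8.7e13):
    """Multiply by an effective bandwidth (~88 nm at 550 nm -> ~8.7e13 Hz)
    to get an in-band irradiance in plain W/m^2."""
    return f_nu * bandwidth_hz
```

So Vega at V = 0 is about 3.6e-23 W/m^2/Hz, or roughly 3e-9 W/m^2 in the V band above the atmosphere — and every 5 magnitudes is exactly a factor of 100.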
Whatever. If I had a camera, its ADC readings would depend on the photodetector calibration, amplification, exposure time, and lots of other things. But, in principle, any camera should be calibratable so that pixel readings convert to absolute watts/m^2 “received” at the sensor. Oh, and atmospheric absorption and scattering, losses and reflections in lenses, phase of the moon.
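The calibration itself can be a one-parameter fit if the sensor is linear. A minimal sketch, with entirely made-up numbers standing in for dark-subtracted star counts and catalog irradiances at similar airmass:

```python
# Hypothetical data: (known irradiance in W/m^2, integrated dark-subtracted ADU)
# for a few reference stars in one frame. The counts are illustrative only.
stars = [
    (3.1e-9, 412000.0),
    (1.2e-9, 158000.0),
    (4.0e-10, 54000.0),
]
exposure_s = 2.0

# Model: counts ~ gain * exposure * irradiance (linear sensor, no saturation).
# Least-squares slope through the origin: slope = sum(x*y) / sum(x*x).
sxy = sum(e * c for e, c in stars)
sxx = sum(e * e for e, c in stars)
gain = sxy / sxx / exposure_s      # ADU per (W/m^2 * second)

def counts_to_irradiance(counts, t_s):
    """Convert integrated counts from any exposure back to W/m^2 at the sensor."""
    return counts / (gain * t_s)
```

In practice one would fit per color channel, reject saturated or clouded measurements, and track the gain over time — but the core step is just this regression against stars of known flux.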
Now I would like kids to be able to use some collection of stars to calibrate their cameras. And for a few dollars more, you can get a lot of camera these days. Or a lot of kids can point their cameras and try to combine results.
The NIST STARS Program – Update 2017 at https://www.nist.gov/system/files/documents/2017/08/29/nist_stars_description_for_wfirst_cal_plan.pdf uses Vega and Sirius — the usual expensive and complicated approach — and it allows only a few reference stars. But with all stars potentially identifiable, and large datasets of above-the-atmosphere irradiances (still watts/m^2, just outside the atmosphere), it should be possible for any image sequence (frames) to be matched and quantified.
I ran into the variable star observing groups. Their data is hard to use, and the observing instructions, cameras, setups, and methods are obscure.
I am looking at the whole Internet: all users of the Internet, and all people (about 4.8 billion of the 7.8 billion total have some access to the Internet now). Most global education was forced online by COVID, and the high cost of moving students to brick-and-mortar classrooms and labs is pushing experiments, measurements, sensor networks, high-cost experiments, and data collection online — shared, with collaborative education and groups. It is hard to say in simple terms, and I am trying to follow and set standards for all of it.
It is not hard to get the atmospheric models. They are just big and clumsy and eclectic, like everything else. A Beer-Bouguer-Lambert “formula” approach breaks down really quickly when you have a lot of measurements and can potentially get real data, rather than estimates based on uniform densities along paths. Now GPS dual-frequency path lengths are being used more and more. I am straining to remember how they extract meteorological data. The total electron density along the path is fairly easy, as that matches the delay. But how electron density varies in the atmosphere, I cannot remember. I find these things and have no place to put them. It is hard enough translating all the units and dimensions and models. And finding all the people in the world who use something, who measure it, who work on new detectors, who process and share the data, who write websites about it, who teach and model and test and correlate. And I try to trace and track every group on the Internet.
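For what it is worth, the first-order extinction correction is easy to write down even where it breaks down in detail. A sketch using the Kasten & Young (1989) airmass fit; the extinction coefficient k = 0.2 mag/airmass is just a typical clear-sky V-band value and is really site- and night-dependent:

```python
import math

def airmass(zenith_deg):
    """Kasten & Young (1989) airmass approximation, usable down to the horizon
    (plain sec(z) blows up there)."""
    z = zenith_deg
    return 1.0 / (math.cos(math.radians(z)) + 0.50572 * (96.07995 - z) ** -1.6364)

def above_atmosphere_mag(m_obs, zenith_deg, k=0.2):
    """First-order Beer-Bouguer-Lambert correction: m0 = m_obs - k * X.
    k is the extinction coefficient in magnitudes per airmass."""
    return m_obs - k * airmass(zenith_deg)
```

A star at 60 degrees from the zenith sits behind about two airmasses, so it appears roughly 0.4 magnitudes fainter than it would from orbit — and, as you say, real measurements from many stations quickly beat any uniform-density formula like this one.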
I am not sure what, if anything, I am asking. You seem smart and diligent, and you know the importance of making sure that what human viewers see and report, what kids learn, and what is in the databases and measurements all fit together in a verifiable and traceable chain. That is much of what I see. I really got upset with NASA when they were (and still are) posting JPEGs of astronomical images (Hubble, etc.) online with no names, no documentation, in lossy formats. That is not science, I told them — just eye candy. They let their marketing and web design groups make the decisions.
Sorry. Sore point. The same casual attitude to images, video, and all measurements on the Internet is shown by every site owner. No exceptions, or so few they hardly register. ESA, NASA, NIST — you name it, and I can show you where they are all just putting up cute pictures with vague references, or no references or explanations at all.
So I spend my days just trying things to see what is required. The last week or so, it was all the “nuclear data” groups. Every one of them has about 20 basic datasets that everyone shares, but each website shows its half dozen favorites, using hand-written and mostly undocumented tools that the webmaster or someone wrote. Everyone is proud of what they did, but they are not working together on the implementation. All those sites use different navigation, identification, site procedures, policies, and linking standards. It is as if they took some very beautiful subjects and databases, examples, teaching and professional tools — threw them in a blender and spread the result on the Internet: “Here is nuclear data”, “Here are stellar surveys”.
I have only been at this for the last 23 years for the Internet Foundation, and 7 years before that on the early internet and networks. Remember BBS and dialup to mainframes? I am old enough that I started at 300 baud. 110 baud paper tape. Teletype.
Anyway. Can you recommend some stars scattered over the whole sky, and a practical, sustainable way to make sure that set is always available for the next 50 years? (I have to deal with people who are at the North Pole and South Pole, in every spot on Earth, and in orbit. I am already making plans for the Moon, Mars, and in transit. But Earth-based school kids and “citizen scientists” — that group far outstrips the full-time paid professionals in number, in impact on global technology and new industries, and by all other measures. If you want, I can tell you why: people are so multitasking globally that they won’t just change jobs several times in their lives; they will be working on many at any time, all their lives. I am trying to set up global communities online for every topic, issue, and opportunity, because some people like to work on one thing for a long time. And that will be possible because I am also trying to get the donors, supporters, crowdfunding, early investors, and nonprofit support network streamlined and standardized. Partly that is practical, because the Internet cannot function without decent pay for people who contribute online materials and services. But it is also to identify and prevent fraud, corruption, and abuse.)
Richard K Collins, Director, The Internet Foundation