Image handling on the Internet, Pixels around a star

Thanks for the links. I spent the last few days going over the NEXRAD Level 3 weather radar system. I downloaded many example datasets and tried to review and organize all of its parts, looking to see how to integrate radar, satellite, SDR (electromagnetic networks of many sorts), GPS, and other sensor networks and models.

I also looked at multispectral cameras of many sorts, particularly for soft X-rays. The ones that monitor the pulse shape for each pixel are the most interesting to me.

Did some tiny experiments with combining images from different sources for things like M64. If a person takes a photo and shares it, not much happens beyond a few brain cells being stimulated in each person who views it. But if the raw images are combined and studied over an extended time with different methods, then it might become “science”, “knowledge”, “sharing”.


Looking at several M64 photos on the Internet, they are all JPEGs, so NONE of the pixels match what the sensor recorded.

The problem is not only the gathering of the images and their processing, but the way they are presented on the Internet. I want to change the Internet so that EVERY image can be measured, processed for statistics and machine vision algorithm development, merged, and compared.

That galaxy is rather hazy. When it is processed and presented in the same way as the background, then neither comes out well. JPEG compression always mixes things that are close together in angular space but far apart in real space and in value.
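
A minimal sketch of that mixing, assuming Python with NumPy and Pillow; the synthetic star field and quality setting are stand-ins, not any particular M64 image. Round-trip an image through JPEG and count how many pixels no longer match the original:

```python
import numpy as np
from PIL import Image

# Synthetic 64x64 "star field": faint uniform background plus one bright Gaussian spot.
y, x = np.mgrid[0:64, 0:64]
star = 255.0 * np.exp(-((x - 32) ** 2 + (y - 32) ** 2) / (2 * 2.0 ** 2))
original = np.clip(star + 8.0, 0, 255).astype(np.uint8)

# Round trip through JPEG at a typical web quality setting.
Image.fromarray(original).save("star.jpg", quality=85)
decoded = np.asarray(Image.open("star.jpg"), dtype=np.int16)

# The residual shows how the 8x8 DCT blocks smear the sharp star into its neighbors.
residual = decoded - original.astype(np.int16)
print("max pixel error:", int(np.abs(residual).max()))
print("pixels changed:", int((residual != 0).sum()), "of", original.size)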

If the background were kept on a different layer from the foreground, or many separate layers were handled with a presentation and analysis tool, then I think that M64 would be more powerful and useful. To me it looked faded and washed out. I know the background is completely filled with light of many colors.
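
A rough sketch of that layering idea, assuming Python with NumPy and SciPy, with a synthetic frame standing in for real data: estimate the smooth background with a large median filter, subtract it to get a source layer, and keep both layers separately.

```python
import numpy as np
from scipy.ndimage import median_filter

rng = np.random.default_rng(0)
frame = rng.poisson(20.0, size=(128, 128)).astype(float)   # sky background counts
frame[60:68, 60:68] += 300.0                               # a compact bright source

background_layer = median_filter(frame, size=31)  # smooth, large-scale light
source_layer = frame - background_layer           # stars and galaxy structure

# Each layer can now be stored, stretched, and displayed independently,
# instead of forcing one JPEG tone curve onto both at once.
print("background median:", np.median(background_layer))
print("source layer peak:", source_layer.max())
```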


I was looking at one star near M64. It shows up with light out to 50 pixels from the center. I would like to study the statistical distribution of light at various angular distances from a star, or from any chosen center point. I think I can make a generic tool for that, to recover the best image of each star or light source or place. I don’t care if it takes the computer days to process. My programs end up being rather fast, and if I hit a hard processing problem, I build a computer to suit.
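
A minimal sketch of that radial-statistics tool, in Python with NumPy; the star image, the center, and the function name radial_profile are illustrative stand-ins. It bins pixels by distance from the chosen center and records the distribution in each ring:

```python
import numpy as np

def radial_profile(image, cx, cy, max_radius=50):
    """Mean and standard deviation of pixel values vs. distance from (cx, cy)."""
    y, x = np.indices(image.shape)
    r = np.hypot(x - cx, y - cy)
    # Pixels beyond max_radius all fall into the last bin.
    bins = np.clip(r.astype(int), 0, max_radius)
    means, stds = [], []
    for k in range(max_radius + 1):
        vals = image[bins == k]
        means.append(vals.mean() if vals.size else np.nan)
        stds.append(vals.std() if vals.size else np.nan)
    return np.array(means), np.array(stds)

# Synthetic star with wings that reach out to roughly 50 pixels.
y, x = np.mgrid[0:201, 0:201]
star = 1000.0 / (1.0 + (np.hypot(x - 100, y - 100) / 5.0) ** 3)
mean_r, std_r = radial_profile(star, 100, 100)
print(mean_r[:10])  # brightness falling away from the center
```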

Have you ever tried using maximum magnification on a star for a long time? I want to see whether features like sunspots and prominences can be detected, at least for the brighter stars, or how far planetary and lunar photos can be pushed. Just averaging does not work. But I keep ALL the data and work with it as a single entity, and it can often break up into manageable and understandable parts.
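
A small sketch of why plain averaging fails, assuming Python with NumPy and SciPy: averaging jittering frames smears the star, while shifting each frame back before averaging keeps the core. The jitter offsets here are synthetic and assumed known; in real data they would have to be measured from each frame, for example by centroiding (see the sketch further down).

```python
import numpy as np
from scipy.ndimage import shift as subpixel_shift

rng = np.random.default_rng(1)
y, x = np.mgrid[0:64, 0:64]
truth = 255.0 * np.exp(-((x - 32) ** 2 + (y - 32) ** 2) / (2 * 1.5 ** 2))

# Simulate 100 short exposures with random atmospheric jitter plus noise.
frames, offsets = [], []
for _ in range(100):
    dy, dx = rng.normal(0.0, 2.0, size=2)
    frames.append(subpixel_shift(truth, (dy, dx)) + rng.normal(0.0, 5.0, truth.shape))
    offsets.append((dy, dx))

plain = np.mean(frames, axis=0)  # simple averaging: jitter smears the star
registered = np.mean(
    [subpixel_shift(f, (-dy, -dx)) for f, (dy, dx) in zip(frames, offsets)],
    axis=0,
)  # undo each offset first, then average: the core survives
print("plain peak:", round(plain.max(), 1), " registered peak:", round(registered.max(), 1))
```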


It would depend, somewhat, on the main colors and composition of the star. I want to measure all the frequencies of light, not just slice out one color. I will look for a way; “single pixel cameras” are one method. I just wanted to see how far an ordinary camera, with no special filtering, could be pushed. If you look by eyeball, your eye is too slow to separate things out, or cannot distinguish the levels. The variation of the atmosphere is too fast to follow by eye, and it won’t wash out with simple averaging as far as needed. The reason I keep pushing for the 100,000 frames per second region-of-interest cameras is that the star is bright enough to register enough pixels – if all frames are registered to the subpixel level.

I have been working at the subpixel level for several years now and am comfortable with most of the methods that have been tried or proposed. Mostly they call it “super-resolution”, but there are dozens of aliases – all saying that centroids are easier to track than pixels.
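
A tiny sketch of that point, in Python with NumPy: the intensity-weighted mean position of a star image lands between pixel centers, which is what makes subpixel tracking possible. The star position here is a deliberate non-integer stand-in.

```python
import numpy as np

def centroid(image):
    """Intensity-weighted (x, y) center of an image patch."""
    y, x = np.indices(image.shape)
    total = image.sum()
    return (x * image).sum() / total, (y * image).sum() / total

# Star placed at a deliberately non-integer position (15.3, 16.7).
y, x = np.mgrid[0:32, 0:32]
star = np.exp(-((x - 15.3) ** 2 + (y - 16.7) ** 2) / (2 * 2.0 ** 2))
print(centroid(star))  # recovers the position to a small fraction of a pixel
```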


Sometimes I hate that the solar and stellar groups cannot get their act together. They are a bunch of yahoos, each doing their own thing.

https://umbra.nascom.nasa.gov/images/
https://soho.nascom.nasa.gov/data/realtime-images.html

I cannot find the solar data site that shows all the common frequencies. This one is all media-oriented, but it does have a picture that is useful. There is no way to point to a specific place in their page – https://www.thesuntoday.org/missions/sdo/

H-alpha is 6563 Angstroms or 656.3 nanometers. That red one (false colored) in the middle is 304 Angstroms or 30.4 nm. Any of the shorter wavelengths show the spots easily, but those won’t go through our atmosphere. Radio will work, but you are using optical cameras. Turning up the frame rate, you can get fast enough that the photons gathered per frame (exposureTime * powerDensity * areaSampled, divided by energyPerPhoton), spread over the NumberOfPixels the star covers, land in the middle of the range of levels for each color. In the 8-bit cameras that is about 128 counts per pixel, even if that is not viewable by a human. You have to separate the capture and storage from the viewing.
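
A back-of-envelope sketch of that photon budget, in Python. Every number below is an assumption chosen only to show the arithmetic (flux, aperture, pixel count, one photon per count), not a measured value; with these inputs the required frame rate happens to land near the 100,000 frames per second region mentioned above.

```python
# Pick the exposure so that photons per pixel per frame land mid-range
# (about 128 counts in an 8-bit camera, assuming roughly one photon per count).
h, c = 6.626e-34, 3.0e8                 # Planck constant (J*s), speed of light (m/s)
wavelength = 656.3e-9                   # H-alpha, meters
energy_per_photon = h * c / wavelength  # ~3.0e-19 J

flux_density = 2.5e-8     # W/m^2 at the telescope, roughly a bright star (assumed)
aperture_area = 0.018     # m^2, about a 150 mm aperture (assumed)
n_pixels = 100            # pixels the star image is spread over (assumed)
target_counts = 128       # middle of an 8-bit range

photons_per_second = flux_density * aperture_area / energy_per_photon
exposure = target_counts * n_pixels / photons_per_second
print(f"exposure per frame: {exposure:.2e} s -> {1/exposure:,.0f} frames/s")
```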

Richard K Collins
