NANOGrav raw observing data – merging all global sensor networks on the Internet

Hello,
I was visiting https://data.nanograv.org/ to see what kind of data you are sharing online.  I am tracking nearly all the sensor networks on the Internet, especially the emerging ones.
If I understand this talk by Steve Taylor at https://www.youtube.com/watch?v=_UptA7pkARo correctly, you are only observing each pulsar at widely separated times, for about 30 minutes at a time?
He talks about the correlations and the common motions of the earth.  It occurred to me that if there were simultaneous observations in all directions, one could solve for the movement of the earth.  Since a 1 nanohertz signal has a period of 1E9 seconds, or 1E9/(365.25*86400) = 31.69 years, all the solar system observations won’t help much, except for time of arrival.
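
Just to make that arithmetic explicit, here is the period of a 1 nanohertz signal in a few lines of plain Python:

    # Period of a 1 nanohertz signal, expressed in years.
    SECONDS_PER_YEAR = 365.25 * 86400      # Julian year in seconds

    f_hz = 1e-9                            # 1 nanohertz
    period_s = 1.0 / f_hz                  # 1E9 seconds
    period_yr = period_s / SECONDS_PER_YEAR

    print(f"{period_yr:.2f} years")        # -> 31.69 years
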
Anyway, I would like to know what your “raw observations” look like: how many records, the frame rate, the size, and the format.
I can’t deal with more than about 20 terabytes at a time; I just have a small machine.  I can download Gaia 3, or Wikipedia, or Google NGram, or hundreds of major GitHub projects.  I look at literally everything on the Internet, figure out how it fits into the whole, and what it takes to standardize it for global communities of tens or hundreds of thousands of specialists to work together, or hundreds of millions or billions for many economic and social and environmental processes.
Have you used the LIGO strain data too?  I am working on that right now for a simple scan of the interior of the earth and moon.  It is hard working with 16384 sps data from three really eclectic instruments, but I try to work with any dataset, no matter how odd or troublesome.
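
For anyone who wants to try the same data, here is a sketch of pulling a short stretch of LIGO open strain data with the gwpy package, which wraps the GWOSC open data service.  The GPS window below is the published segment around GW150914; any open segment works:

    from gwpy.timeseries import TimeSeries

    start, end = 1126259446, 1126259478    # 32 seconds, GPS time

    # One detector at the full 16384 samples per second rate.
    h1 = TimeSeries.fetch_open_data("H1", start, end, sample_rate=16384)

    print(h1.sample_rate)                  # 16384.0 Hz
    print(len(h1))                         # 524288 samples
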
If I can see a sample of one observing session, I can tell a lot.  Three stations all looking at pulsars in different directions would be better (a toy version of that solve is sketched below).  I can see what would happen if all the sites ran continuously.  And it can be variable stars as well.  If those are tracked for years, their models will improve, and precise times and properties will become available.
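
If three stations each measure the line-of-sight component of the earth’s motion toward a pulsar in a well-separated direction, the full velocity vector falls out of a 3x3 linear solve.  A minimal sketch, with made-up directions and speeds:

    import numpy as np

    # Unit vectors toward three pulsars (made-up directions).
    n = np.array([
        [1.0, 0.0, 0.0],
        [0.0, 1.0, 0.0],
        [0.6, 0.0, 0.8],
    ])

    # Hypothetical measured line-of-sight speeds d_i = n_i . v, in m/s.
    v_true = np.array([12.0, -7.0, 3.0])
    d = n @ v_true

    # Recover the full velocity vector of the earth.
    v = np.linalg.solve(n, d)
    print(v)                               # -> [12. -7.  3.]
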
There are many new electromagnetic sensor arrays on earth, and many potential gravitational networks.  I used the superconducting gravimeter array almost 20 years ago now to measure the speed of gravity.  That uses the JPL solar system ephemeris as reference.  It is sensitive enough to put bounds on atmospheric models.  Those are only single-axis instruments, so I went through all the IRIS.edu seismometers and found that the broadband seismometers, used as gravimeters, are sensitive enough to track the sun and moon vector tidal signal on all three axes with just a linear regression – an offset and a scale factor per axis.  There is no phase term, which is why those three-axis instruments can be used for gravitational GPS.  Some of them are transportable.  And you can solve for the orientation of the axes, since that much data and only six parameters allow for continuous calibration.  I like the method because it works underground or undersea and is insensitive to ionospheric delays or satellite problems.

The SGs cost about a quarter million dollars, and their offset changes with the helium level.  You can track the level of the helium at the station with a simple regression against the sun and moon tidal signal.  These are one-line vector calculations.  There are LOTS of gravimeter designs and groups now.  I have been encouraging all of them to increase the sensitivity and go to time-of-flight methods and frequencies.  You know it well: once they have a signal, then you can make faster improvements.  Hard to tell young people that, but it is always true.
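
To make that sun-and-moon regression concrete, here is a minimal sketch.  The tide array stands in for the ephemeris-predicted tidal acceleration at the station (real code would compute it from the JPL ephemeris and station coordinates; these sinusoids are synthetic placeholders).  Fitting each axis gives the six parameters:

    import numpy as np

    rng = np.random.default_rng(0)
    t = np.arange(0, 30 * 86400, 60.0)     # 30 days, one sample per minute

    # Stand-in for the ephemeris-predicted tidal acceleration, one
    # column per axis (synthetic; real code uses the JPL ephemeris).
    tide = np.column_stack([
        1.0e-6 * np.sin(2 * np.pi * t / 44714.0),
        0.5e-6 * np.sin(2 * np.pi * t / 43200.0 + 0.3),
        0.3e-6 * np.sin(2 * np.pi * t / 86164.0 + 1.1),
    ])

    # Simulated seismometer output: unknown offset and scale per axis,
    # plus instrument noise.
    true_offset = np.array([120.0, -80.0, 40.0])     # counts
    true_scale = np.array([1.8e7, 2.1e7, 1.9e7])     # counts per m/s^2
    obs = true_offset + true_scale * tide + rng.normal(0.0, 0.5, tide.shape)

    # Least-squares fit of offset and scale for each axis: six parameters.
    for axis in range(3):
        A = np.column_stack([np.ones_like(t), tide[:, axis]])
        (offset, scale), *_ = np.linalg.lstsq(A, obs[:, axis], rcond=None)
        print(f"axis {axis}: offset = {offset:8.2f}, scale = {scale:.3e}")
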
MEMS gravimeters, Bose-Einstein gravimeters, atom interferometer gravimeters, electron interferometer gravimeters, electrochemical gravimeters.  The best of them are just able to detect the seismic waves from magnitude 8 earthquakes anywhere in the world.  Of course they should work well close by for volcanoes and such.  The Japan earthquake registered on the SG network and also on exactly those broadband seismometers I had been calibrating as gravimeters.  I want to use them all for constraining the global climate models.  It is taking me a while to learn to download and use the data from the simulations.  I have worked on global climate change off and on since about 1990, when I was working for Phillips Petroleum in their Business Intelligence group.  I was helping with the models and negotiations.  I am a world expert on global economic and social models, and good enough to follow and improve the climate part.
I am never sure what people might be interested in.  I did go through the software-defined radio groups a couple of years ago.  There are quite a few amateur efforts to use SDR for regional and global monitoring of the spectrum.  I think they might be slightly ahead of the radio telescope groups on price and sensitivity.  If a method is so cumbersome that only a few people can get involved, it tends to die.  I only have 23 years of looking at global groups on the Internet, but I would say most groups that are closed die, and part of that is pricing things so no one can get involved.  That is why sharing data from all these different networks can be so important to the future of the human species.
If I were setting up a network of pulsar monitoring stations, I would not use the big telescopes, but rather widely dispersed arrays of different kinds of sensors – gravimeters, infrasound, magnetotelluric, electromagnetic interference, ionosphere, GPS dual-frequency time monitors, Mössbauer detectors (these directly sample the change in the gravitational potential), and any atomic clock detectors (also time dilation, direct gravitational potential change detectors).
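
The hard part of a mixed network like that is merging the streams.  A hypothetical minimal record envelope – none of these field names are an existing standard, just an illustration of the idea – could be one JSON line per sample block:

    import json
    from datetime import datetime, timezone

    # Every instrument, whatever its physics, reports the same envelope,
    # so streams from different networks can be merged and audited.
    record = {
        "station":     "EXAMPLE-01",       # hypothetical station ID
        "sensor_type": "gravimeter",
        "utc":         datetime.now(timezone.utc).isoformat(),
        "sample_rate": 1.0,                # samples per second
        "units":       "m/s^2",
        "axes":        ["x", "y", "z"],
        "values":      [[-1.2e-8, 3.4e-9, 9.80665]],
        "calibration": {"offset": [0.0, 0.0, 0.0], "scale": [1.0, 1.0, 1.0]},
    }

    print(json.dumps(record))              # one line per record
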
Continuous, open, auditable, shared models; shared algorithm development; all countries, all schools, any individuals or groups.  The Internet is for everyone – no exceptions.  And when you get hundreds of millions of people synchronized and working on common models, not hundreds of millions of separate copies of partial models, things can happen in days that otherwise would take decades or centuries.  The rules for optimizing human use of the Internet are simple: you just have to use a lot of unique data.
Richard Collins, Director, The Internet Foundation