Comment on Gravitational Fields and Gravitational Waves – history of vector tidal signal calibration and imaging arrays

Hello,

I had reason to write out some of what I found measuring the speed with which the gravitational potential comes to equilibrium. It is rather long, but perhaps might be useful. I am trying to encourage the development of arrays of three axis, high sampling rate gravimeters to use for imaging sites of interest on the sun and its interior, and the earth’s atmosphere, oceans and interior. Any instrument that can measure earth tides can also (usually) be calibrated against the sun and moon tidal signal, and then can be used as a gravitational detector in its own right. While a bit tedious to calculate, the vector Newtonian tidal signal at locations on the earth is simple in form, universal, and easy to implement – a natural way to tie together what are now often separate and unconnected efforts. It should be possible, with the methods already available or a small adaptation, to measure the speed of the gravitational potential and its fluctuations routinely, and to correlate those changes with real and practical measurements from gravitational sensors or any of the many geophysical sensors and routinely monitored, shared global datasets now coming onto the Internet.

I particularly want to image and calibrate models of the sun at milliarcsecond levels, the surface of the earth’s core at (10 m)^3 and (1 km)^3 resolution, and the atmosphere in near real time at (10 m)^3 and (100 m)^3. Global climate change data has real value when it is done for cities, regions and countries – routinely and with all sensor networks combined and inter-calibrated. The needs of “solar system colonization” are such that we cannot send our children and grandchildren out to explore other places unless we equip them with the best and most efficient methods for survival and growth. Combining gravitational sensor networks with other field methods is not some magical process. It is not hard – just tedious, serious engineering, statistics, organization and dedicated work by many.


It is possible to measure the speed of gravity by using the network of superconducting gravimeters. They record the tidal gravitational acceleration due to the sun, moon and earth at the station. I did this almost two decades ago now, and repeated the same with the three axis broadband seismometers. The MEMS gravimeters now show the same. To match the very precise calculated Newtonian vector tidal gravity to the station data simultaneously at all stations requires that the gravitational potential reaching the stations arrive there at the speed of light. A difference of a second is detectable. It takes light about 499 seconds to reach the earth from the sun; for a coronal mass ejection on the sun, it would take the same time for the gravitational potential to come to equilibrium at the earth.
 
To be clear, it is NOT the gravitational acceleration (1/r^2) that is flowing or diffusing, but the gravitational potential.
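
To make the timing comparison concrete, here is a minimal sketch (Python, with made-up amplitude, noise level and lag) of how a time offset between a calculated tidal series and a station record can be recovered by scanning trial lags through the same offset-and-multiplier regression described later in this post:

```python
# Minimal sketch: recover a known time lag between a synthetic semidiurnal
# "tidal" series and a noisy, shifted copy, by scanning trial lags and taking
# the lag with the smallest regression residual.  Amplitude, noise and the
# 1.5 s lag are illustration values only.
import numpy as np

fs = 1.0                                    # samples per second
t = np.arange(0, 3 * 86400, 1.0 / fs)       # three days of data
predicted = 5e-7 * np.sin(2 * np.pi * t / 44714.0)   # ~12.42 h semidiurnal term, m/s^2

true_lag = 1.5                              # seconds (what we want to detect)
rng = np.random.default_rng(0)
observed = np.interp(t - true_lag, t, predicted) + 5e-10 * rng.standard_normal(t.size)

def misfit(lag):
    """RMS residual after fitting offset + multiplier at a trial lag."""
    shifted = np.interp(t - lag, t, predicted)
    A = np.vstack([np.ones_like(shifted), shifted]).T
    coef, *_ = np.linalg.lstsq(A, observed, rcond=None)
    return np.sqrt(np.mean((observed - A @ coef) ** 2))

lags = np.arange(-10, 10.25, 0.25)
best = lags[np.argmin([misfit(lag) for lag in lags])]
print(f"recovered lag ~ {best:.2f} s (true {true_lag} s)")
```

With real data the predicted series would come from the Newtonian tidal calculation and the observed series from the calibrated station record.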
 
The Japan earthquake was strong enough to register signals, at the speed of light and gravity, on both the superconducting gravimeter and the broadband seismometer networks. These signals, due to the spreading seismic waves from the earthquake, travel as changes in the gravitational potential at the speed of light and gravity. There are groups trying to improve gravitational potential and gravitational potential gradient detectors (gravimeters) to allow for an effective earthquake early warning system.
 
In August 2017, the merger of two neutron stars was recorded as a gravitational wave in the LIGO network as event GW170817, and also on many simultaneous electromagnetic sensor networks at many frequencies of electromagnetic waves. Racing over a distance of about 130 million light years, the gravity signal and the electromagnetic signals arrived within about two seconds of each other. After many years of going over the implications of the speed of gravity measurements connecting sun, earth, moon and stations as a routine thing, this event – this demonstration that light and gravity (gravitational potential changes are what LIGO measures) travel together – made me realize that the only way that can happen is if the electromagnetic field and the gravitational field share the same underlying potential field.
 
When I was studying gravitational wave detection at the University of Maryland at College Park in the latter half of the 1970s, I was working part time with Steve Klosko at Wolfe Research and Development on the NASA GEM series of gravitational potential models – using satellite orbits, C-band radar tracking, laser ranging and other methods to carefully track things in orbit and extract the gravitational potential field necessary to fit the motions observed. So I had also gotten a good intuition for how the potential field works.
 
The gravitational potential (in my view) is a fluid or dense gas that is already at every location in space. It fills the universe. Changes in the density of voxels, because of moving masses, change the gravitational potential. And this happens by diffusion. So I say “the speed of diffusion of the gravitational potential is the same as the speed of light.” Now, are there signals that can be transmitted at much higher speeds? At many times the speed of light? I would say that is very possible. And it would be a different kind of wave than that due to the flow of potential fluid or gas or boson superfluid (depending on the model you choose).
 
I have followed most every group trying to make more sensitive gravimeters. Because the gravimeter is measuring the second time derivative of the strain, at higher frequencies each term of the spectrum is multiplied by the square of the angular frequency. So the absolute value of the signal gets larger as you go to higher and higher frequencies. If you, as I have done many times, walk through how the gravitational potential field comes to equilibrium when observing at high sampling rates (MegaSamplesPerSecond (Msps), Giga, Tera and higher), those squared frequencies make gravimeters and jerk detectors (jerk is the third time derivative) or higher time derivative detectors likely the better choice for following what is happening to the gravitational potential at high frequencies.
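
A small numeric illustration of that frequency scaling: for a single displacement component x(t) = A sin(ωt), the acceleration amplitude is Aω² and the jerk amplitude is Aω³, so higher-frequency components are strongly emphasized in acceleration and jerk records (the numbers below are purely illustrative):

```python
# Acceleration and jerk amplitudes for a fixed 1 picometer displacement
# component at several frequencies, showing the w^2 and w^3 scaling.
import numpy as np

A = 1e-12                    # 1 pm displacement amplitude (illustrative)
for f in (1.0, 1e3, 1e6):    # Hz
    w = 2 * np.pi * f
    print(f"f = {f:>9.0f} Hz   accel = {A*w**2:.3e} m/s^2   jerk = {A*w**3:.3e} m/s^3")
```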
 
The ONLY way I have found to separate gravitational and electromagnetic signal sources is to take samples at high enough sampling rates to use time of flight methods – using arrays of detectors to correlate and identify the places in space and time where the signals could originate and arrive at the detectors at the speed of light. So I have looked at atom interferometer gravimeters and MEMS gravimeters (there are two that I know have been tested, at Liang Cheng Tu’s group in China and Richard Middlemiss’ group in Glasgow, Scotland). There are electron interferometer gravimeters, seismometers optimized to work as gravimeters, and a whole range of high sampling rate, high sensitivity “vibration sensors” that could be adapted. There are electrochemical gravimeters. You can use interferometry to precisely track the motion of masses down to the nanometer and picometer range reliably at low cost. You can use atomic force methods to do the same.
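
As a sketch of what those time of flight correlations can look like, here is a minimal example (Python, with made-up detector positions, made-up timing noise, and scipy's least_squares as the solver) that recovers a source position from arrival times at an array, assuming propagation at the speed of light:

```python
# Minimal time-of-flight (TDOA) localization sketch: given arrival times at
# several detectors and an assumed propagation speed c, solve for the source
# position and emission time by nonlinear least squares.  Geometry and noise
# are illustration values, not real station data.
import numpy as np
from scipy.optimize import least_squares

c = 299_792_458.0                          # assumed propagation speed, m/s
detectors = np.array([                     # detector positions, meters (illustrative)
    [ 6.0e6,  0.0,    0.0  ],
    [ 0.0,    6.0e6,  0.0  ],
    [ 0.0,    0.0,    6.0e6],
    [-4.0e6,  4.0e6,  1.0e6],
    [ 3.0e6, -3.0e6,  3.0e6],
])

true_source = np.array([1.0e6, 2.0e6, -0.5e6])
arrivals = np.linalg.norm(detectors - true_source, axis=1) / c        # emission at t = 0
arrivals += 1e-9 * np.random.default_rng(1).standard_normal(arrivals.size)  # ~1 ns timing noise

def residuals(p):
    """p = [x, y, z, c*t0]; predicted minus observed arrival, expressed in meters."""
    pos, ct0 = p[:3], p[3]
    return ct0 + np.linalg.norm(detectors - pos, axis=1) - c * arrivals

fit = least_squares(residuals, x0=np.zeros(4))
print("recovered source (m):", np.round(fit.x[:3]))
print("true source      (m):", true_source)
```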
 
If you search “atomic force” “gravimeter” on Google you will find several groups trying to adapt atomic force methods to high sampling rate gravimeters. I worked with those enough to know that the higher order resonances in the cantilevers are significantly more sensitive.
 
Without going through all the groups working on different detectors, I will encourage you to stop wasting more time trying to work things out from old theories. Theory can be a guide, but measuring things is much more satisfying and practical. The MEMS gravimeter is just a MEMS accelerometer. I coined the rule “if an accelerometer is sensitive enough to measure the vector tidal gravity signal on all three axes, it can be called a gravimeter.” Do you understand? The gravimeter is just a sensitive accelerometer measuring the accelerations locally – measuring the gradient of the gravitational potential locally. The gravitational potential is changing.
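
For a sense of what “sensitive enough” means here, the standard along-axis tidal term 2GMR/d³ gives the approximate peak lunar and solar tidal accelerations at the earth’s surface:

```python
# Order-of-magnitude check of the tidal accelerations an accelerometer must
# resolve to qualify as a "gravimeter" by the rule above.
G = 6.674e-11          # m^3 kg^-1 s^-2
R_earth = 6.371e6      # m
bodies = {
    "moon": (7.346e22, 3.844e8),    # mass (kg), mean distance (m)
    "sun":  (1.989e30, 1.496e11),
}
for name, (M, d) in bodies.items():
    a = 2 * G * M * R_earth / d**3
    print(f"{name}: ~{a:.2e} m/s^2  (~{a/1e-8:.0f} microGal)")
# roughly 1.1e-6 m/s^2 for the moon and 5e-7 m/s^2 for the sun, so the
# instrument noise in the tidal band needs to be well below ~1e-7 m/s^2
# to see the signal cleanly on all three axes.
```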
 
I am rather tired just now. I tried for many years to encourage groups to use the Jet Propulsion Laboratory solar system ephemeris, which gives precise positions of the planets, sun, moons (and I think a lot of the asteroids and comets and things) and solves for the gravitational potential of the whole solar system. The calculation of the vector signal at a superconducting (or MEMS or electrochemical or atomic force or atom interferometer) gravimeter only requires the positions of the earth, sun, moon and station; the masses (GM values) of the sun, moon and earth; and the earth rotation rate (for the calculated vector centrifugal acceleration at the station). Then a LINEAR regression (offset and multiplier) fits each of the three axes, calibrating the station in absolute units (locked to the solar system ephemeris and the JPL network of sensors and calculations). You can “weigh” things locally against the sun and moon signal.
 
In words it is: “the sun’s gravitational acceleration at the station minus the sun’s gravitational acceleration on the center of the earth” plus “the moon’s gravitational acceleration at the station minus the moon’s gravitational acceleration on the center of the earth” plus “the centrifugal acceleration of the station due to earth rotation”. That vector, rotated into station North East Vertical coordinates, ONLY requires a linear regression to match the signal. The offset (the constant) is tied to local variations at the station. For a superconducting gravimeter you can take the daily regressions (hourly, or every minute if you want) and trace out every event in the station logs – changes in the helium level, changes in the power supply, outages, temperature changes. I did that a few times, but no one seemed interested in combining all the gravimeters on earth into a single gravitational detector array.
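
Here is a minimal sketch of that one-line equation and the calibration regression, in Python with numpy. The position inputs (sun, moon and station vectors in earth-fixed coordinates) are assumed to come from an ephemeris such as JPL Horizons; the function names are mine, not a published API:

```python
# Newtonian vector tidal acceleration at a station, in earth-centered
# earth-fixed (ECEF) coordinates, rotated to local North-East-Vertical,
# then calibrated per axis with an offset-and-multiplier regression.
import numpy as np

GM_SUN  = 1.32712440018e20   # m^3/s^2
GM_MOON = 4.9028e12          # m^3/s^2
OMEGA_E = 7.2921150e-5       # earth rotation rate, rad/s

def point_mass_accel(gm, body_pos, at_pos):
    """Newtonian acceleration (m/s^2) toward a point mass at body_pos, felt at at_pos."""
    d = body_pos - at_pos
    return gm * d / np.linalg.norm(d) ** 3

def tidal_vector_ecef(r_sun, r_moon, r_station):
    """Sun at the station minus sun at the geocenter, plus the same for the moon,
    plus the centrifugal acceleration of the station.  All inputs in ECEF meters."""
    geocenter = np.zeros(3)
    tide  = point_mass_accel(GM_SUN,  r_sun,  r_station) - point_mass_accel(GM_SUN,  r_sun,  geocenter)
    tide += point_mass_accel(GM_MOON, r_moon, r_station) - point_mass_accel(GM_MOON, r_moon, geocenter)
    # centrifugal: omega^2 times the vector from the spin axis out to the station
    tide += OMEGA_E ** 2 * np.array([r_station[0], r_station[1], 0.0])
    return tide

def ecef_to_nev(vec, lat_rad, lon_rad):
    """Rotate an ECEF vector into local North, East, Vertical (up) components."""
    sl, cl = np.sin(lat_rad), np.cos(lat_rad)
    so, co = np.sin(lon_rad), np.cos(lon_rad)
    north = np.array([-sl * co, -sl * so, cl])
    east  = np.array([-so,       co,      0.0])
    up    = np.array([ cl * co,  cl * so, sl])
    return np.array([vec @ north, vec @ east, vec @ up])

def calibrate_axis(predicted, observed):
    """Per-axis linear regression: observed ~ offset + multiplier * predicted."""
    A = np.vstack([np.ones_like(predicted), predicted]).T
    offset, multiplier = np.linalg.lstsq(A, observed, rcond=None)[0]
    return offset, multiplier
```

In practice you evaluate tidal_vector_ecef at each sample time from the ephemeris positions, rotate it with ecef_to_nev using the station latitude and longitude, and run calibrate_axis on each of the three axes against the recorded data.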
 
This simple “Newtonian vector tidal gravity signal” allows the sun and moon portion of the signal to be removed in a globally standard way, with not much quibble about the model or what it is tied to. I tried using the traditional earth tide models first, but then Wenzel died and his widow would not let me go through his notes to improve ETERNA. So I used my old orbital mechanics background and guessed the form of the signal, then spent a year getting it to fit precisely to the superconducting gravimeter network. I did all the data they had at that time. Because the SGs (superconducting gravimeters) are single axis instruments, you cannot take data from a single station and solve backwards for the sun and moon. But the broadband seismometers are three axis, can operate as gravimeters (albeit not very sensitive ones), and their three axes can be calibrated to the sun and moon signal – a Newtonian calculation.
 
I am writing this out here because I am tired of explaining this. I spent much of the last two decades looking for high sampling rate, low cost gravimeters to combine to image the interior of the earth, the atmosphere of the earth, the magma inside the earth, and the oceans of the earth in real time, using the residual after locking all the detectors to the sun and moon. The local residual is primarily atmospheric density and wind, atmospheric and soil moisture, earth tides and “earth local” effects. AND they all arrive at the station at the speed of diffusion or flow of the gravitational potential – “the speed of light and gravity.” So you can use an array around a city to constrain models of the local atmosphere, correlate with radar tracking of storms, and model flows of moisture in what are called “atmospheric rivers”.
 
I have worked out how to do the time of flight (speed of gravity) correlations to focus arrays on any voxel in the sun, earth, moon or space, or to look outwards. For all practical purposes it matches well with low frequency magnetic and electromagnetic methods in the nanohertz to gigahertz range of sampling frequencies. If you measure using software defined radio direct sampling methods you say “samples per second” or “measurements per second” or “frames per second” or “records per second”.
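
A minimal sketch of that voxel focusing, assuming simple delay-and-sum with made-up detector geometry, sampling rate and test pulse: shift each detector’s record by that voxel’s travel time to the detector and sum; a source at the voxel adds coherently, anything else does not.

```python
# Delay-and-sum focusing on a trial voxel, assuming propagation at c.
# Geometry, sampling rate, pulse and noise level are illustration values.
import numpy as np

c = 299_792_458.0
fs = 1e6                                   # 1 Msps sampling (illustrative)
t = np.arange(0, 0.02, 1 / fs)             # 20 ms of data

detectors = np.array([[0.0,   0.0,   0.0],
                      [2.0e5, 0.0,   0.0],
                      [0.0,   2.0e5, 0.0],
                      [0.0,   0.0,   2.0e5]])
source = np.array([5.0e4, 7.0e4, 1.0e4])

# synthetic records: a short pulse emitted at t = 5 ms, delayed by travel time, plus noise
rng = np.random.default_rng(2)
def pulse(tt):
    return np.exp(-0.5 * ((tt - 0.005) / 1e-5) ** 2)
records = np.array([pulse(t - np.linalg.norm(d - source) / c) for d in detectors])
records += 0.1 * rng.standard_normal(records.shape)

def focused_power(voxel):
    """Advance each record by the voxel->detector travel time and sum coherently."""
    stack = np.zeros_like(t)
    for d, rec in zip(detectors, records):
        delay = np.linalg.norm(d - voxel) / c
        stack += np.interp(t + delay, t, rec)   # align to the common emission time
    return np.max(stack ** 2)

print("power focused at the true voxel:", focused_power(source))
print("power focused 50 km away       :", focused_power(source + np.array([5.0e4, 0.0, 0.0])))
```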
 
Jan Harms at LIGO solved for the gravitational potential changes due to seismic waves. He did it for the strain data at the LIGO network. But gravimeters can run at much higher sampling rates, so they can form images more easily than those few giant, expensive and largely inaccessible sites.
 
Now I have an odd history of jobs. I worked in research at Phillips Petroleum on 3D models and visualization tools using microcomputers. This was because I was considered an expert in GIS on microcomputers from my earlier work on the Famine Early Warning System (it is still going after I set it up in the mid 1980s) – FEWS.net. My point is that I met people doing subsurface imaging and exploration using magnetic, gravitational and electromagnetic methods. And I tried to improve those devices and combine those kinds of data streams, as I had been combining data from many sources before.
 
Well, I could write for months and not cover everything. But I will mention that I did check whether the vector tidal signal needs to be corrected for the station’s rotation about the sun–moon barycenter. The answer is yes, but it is not clear-cut. Remember the gravitational potential has more in common with a fluid at these high sampling rates. So it is better to think of “flows”, “fluctuations”, “currents”, “compressibility”, “composition” and “fluctuation spectrum” than of some idealized rigid and unchanging field. I have been working on the theory of gravitational signals and the properties of the gravitational potential ever since Joe Weber told me that gravitational signals can be used for communication, and Robert Forward wrote that the gravitational and electromagnetic fields are based on the same foundation – you just need to have a common set of units. I worked that out for everyday things.
 
Readers probably won’t care what I did. To me these are my memories and things I felt were important. Now that I am getting older, I can track a few things, but I don’t have the resources to pay people to make the kinds of sensors that I know can be used for this kind of imaging and communication. Robert Forward wrote about how to do lab measurements of gravitational signals. I can calculate it, but I have not done that in a lab or regional setting. But it is well within the capabilities of most groups doing atom interferometer work now – if they would go to higher sampling rates.
 
The global lightning detection network evolved from Msps (mega samples per second) analog methods to global Gsps digital methods, and in the process allowed for global real time tracking of lightning. The same kind of approach will work to track earthquakes and volcanoes. Now, if you set up global arrays, the sensitivity goes up, so the requirements for correlations go up, so the cost of computers and processing goes up. But the correlation nodes and methods for radio astronomy and VLBI will work just fine with gravimeters or gravitational detectors.
 
Now LIGO, Mössbauer, atomic clock and any “gravitational time dilation” detectors are what I call “direct gravitational potential detectors”. They measure the change in the “pressure”, “density” and composition of the gravitational potential. The combined gravitational and velocity time dilation equations that are used routinely for GPS and orbital corrections can have terms added for time dependency. And they can be put in a form that uses the energy density, so that magnetic, electric and pressure fields can be used. The gravitational energy density at the earth’s surface is roughly equivalent to a magnetic field of 380 Tesla. If you use the blackbody temperatures it is a bit over a million degrees, and the radiation temperature is about 1200 electron volts. I am writing from memory and I don’t use those often. I do keep in mind that the energy density of the gravitational field at the surface of the earth is in the “soft x-ray” region, so I read EVERY x-ray paper that comes out, as I have time. Because at some point it will be possible to map and measure the local gravitational potential and its gradients at high 3D and temporal resolution.
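
As a numerical check of those equivalences (using u = g²/(8πG) as the energy density convention; the exact temperature and photon-energy figures depend on which convention you adopt):

```python
# Energy density of the field at the earth's surface under the g^2/(8*pi*G)
# convention, with the magnetic-field, blackbody-temperature and photon-energy
# equivalents.  The ~380 T figure quoted above follows directly.
import numpy as np

G     = 6.674e-11        # m^3 kg^-1 s^-2
g     = 9.81             # m/s^2
mu0   = 4e-7 * np.pi     # vacuum permeability
a_rad = 7.5657e-16       # radiation constant, J m^-3 K^-4
k_B   = 1.380649e-23     # J/K
eV    = 1.602177e-19     # J

u = g**2 / (8 * np.pi * G)        # energy density, J/m^3
B = np.sqrt(2 * mu0 * u)          # magnetic field with the same energy density
T = (u / a_rad) ** 0.25           # blackbody temperature with the same energy density
print(f"u ~ {u:.2e} J/m^3")
print(f"B ~ {B:.0f} T        (the ~380 T figure above)")
print(f"T ~ {T:.2e} K, kT ~ {k_B * T / eV:.0f} eV  (soft x-ray range)")
```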
 
I have worked hard for the last 24 years to study how groups use the Internet for global collaborations. Particularly for the cross-cutting issues that affect all countries and all people. There are tens of thousands of these things. EVERY country now is duplicating research, because every small group wants fame and fortune from anything they do. Rather than sharing what has already been found as living tools and workplaces with all the data that is available, each group tries to do the whole thing themselves. Now I have been writing for almost two hours and it is already past midnight. So I want to share what I have learned, but I just don’t have much energy left.
 
I would not have spent more than 40 years trying to sort out “what is the gravitational potential?” if Joe Weber had not been so adamant that it was for communication. And Robert Forward wanted very much to combine gravitational data and tools with electromagnetic data and tools. Now I would say “combine gravitational and electromagnetic sensors in global arrays with all the global sensor networks.” For the Internet Foundation I found them all, learned how these sensors work, how to use the data, and how to combine and correlate the data – so that any “purely gravitational” portions can be extracted.
 
But if you set the expression for the gravitational force between two masses equal to the electrostatic force between two charges and ask “how many electron charges per kilogram would be enough for the electrostatic force to replace the gravitational force?” – it is just a few hundred million, not Avogadro-sized. I wrote that for someone here the other day. Here is the link /?p=3556 and it was 537,851,028.228 electrons per kilogram – about half a billion electron charges per kilogram. And that is probably temperature dependent. I have other ways to model gravity, but it bothers me that people solve these same problems over and over again, and don’t simply measure, correlate, share data (not words about data), and share algorithms (not words about algorithms).
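
The arithmetic behind that number is one line – set G·m₁·m₂/r² equal to k·(n·e·m₁)·(n·e·m₂)/r² for one-kilogram masses and solve for n:

```python
# Check of the half-billion-electrons-per-kilogram figure above:
# n = sqrt(G / k) / e, the number of electron charges per kilogram that would
# make the electrostatic force equal the gravitational force.
import numpy as np

G = 6.67430e-11           # m^3 kg^-1 s^-2
k = 8.9875517873681764e9  # Coulomb constant, N m^2 / C^2
e = 1.602176634e-19       # elementary charge, C

n = np.sqrt(G / k) / e
print(f"n ~ {n:,.0f} electron charges per kilogram")   # about 5.4e8, matching the figure above
```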
 
If you go to https://hackaday.io/project/164550-low-cost-time-of-flight-gravimeter-arrays, on the left side you can see one month of a calibrated dataset at a superconducting gravimeter station. This is just the vertical signal. The ONLY PARAMETERS needed are two numbers – an offset and a multiplier. That complex signal that varies in direction and intensity is a one line equation, as I mentioned above: the sun at the station minus the sun on the center of the earth, plus the moon at the station minus the moon at the center of the earth, plus the centrifugal acceleration of the station on an ellipsoidal earth (WGS84 is what I used). Other than that, the only data needed are the station latitude, longitude and height, and the sun, moon and earth GM values. That’s it! Nice easy Newtonian gravitational acceleration. You can get the vector positions of the sun, moon and earth in station centered coordinates for any time period from the NASA Jet Propulsion Laboratory Horizons system at https://ssd.jpl.nasa.gov/horizons/app.html#/
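
If you prefer to pull those vectors programmatically rather than through the web app, here is a sketch assuming the astroquery.jplhorizons interface to the same Horizons system (id codes '10' and '301' are the sun and moon, '500@399' is the geocenter; the output frame is ecliptic by default and would still need to be rotated into an earth-fixed frame before combining with the station position):

```python
# Pull geocentric sun and moon state vectors from JPL Horizons via astroquery,
# convert from AU to meters for the tidal calculation sketched earlier.
from astroquery.jplhorizons import Horizons
import numpy as np

AU = 1.495978707e11   # meters

epochs = {'start': '2024-01-01', 'stop': '2024-01-31', 'step': '1h'}
sun  = Horizons(id='10',  location='500@399', epochs=epochs).vectors()
moon = Horizons(id='301', location='500@399', epochs=epochs).vectors()

r_sun  = np.array([sun['x'],  sun['y'],  sun['z']]).T  * AU   # geocentric, meters
r_moon = np.array([moon['x'], moon['y'], moon['z']]).T * AU
```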
 
The constants you need are under “extras” “astrodynamical parameters” at https://ssd.jpl.nasa.gov/astro_par.html
 
I wrote the original model in Turbo Pascal and calculated the positions from the JPL Chebyshev polynomials. It is not hard, just really tedious. If you want to use MatLab or Python, they have those calculations built in now. AstroPy has them. I think Mathematica and Maple have them. I wrote Javascript programs to take the data from Horizons and do the correlations automatically. But JPL recently updated their Horizons site and I am not sure those work now.
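
Since AstroPy is mentioned, here is a sketch of getting the sun, moon and station vectors in an earth-fixed frame with astropy’s get_body and the ITRS frame (the default built-in ephemeris is approximate; astropy can also load a JPL ephemeris). These feed directly into the tidal_vector_ecef sketch earlier in this post:

```python
# Sun, moon and station positions in the earth-fixed ITRS frame using astropy.
import astropy.units as u
from astropy.time import Time
from astropy.coordinates import get_body, EarthLocation, ITRS

t = Time('2024-01-01 00:00:00')
station = EarthLocation(lat=38.0 * u.deg, lon=-76.0 * u.deg, height=20.0 * u.m)

r_sun  = get_body('sun',  t).transform_to(ITRS(obstime=t)).cartesian.xyz.to(u.m).value
r_moon = get_body('moon', t).transform_to(ITRS(obstime=t)).cartesian.xyz.to(u.m).value
r_sta  = station.get_itrs(obstime=t).cartesian.xyz.to(u.m).value
# r_sun, r_moon, r_sta are now ECEF vectors in meters, ready for the tidal calculation
```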
 
Putting data online with all the required tools, so that the programs can run on your own computer, would be the best solution. I asked JPL about taking data from global networks of gravimeters to give data back to JPL about the positions of the sun and moon.
 
When I was going through literally EVERY seismometer dataset to see which ones might be able to detect the sun and moon vector tidal signal, it was brutally hard work. I had to use a minimum of a month of data at a time to see the signal at all, and I did not know how to use IRIS.edu that well. I worked blind for close to six months, already knowing how to calibrate a single axis SG (superconducting gravimeter), before I had a “first light” view of all three axes of a seismometer, with the two numbers for each axis (offset and multiplier) to lock that instrument to the sun and moon signal.
Later I found a set of instruments called the “Transportable Array”, and these were placed all over the United States. And because they were moved, sometimes there were instruments where the North East Vertical orientation of the instrument was not clear, not correct, or not written down. So, with only a linear regression needed, there is a LOT of data to work with for three axes. And if you run regressions allowing the orientation and position to vary, you can, over time, put bounds on the position and orientation of the instrument in earth centered and station centered coordinates. The JPL positions are good to 10 meters, 1 meter, or 100 meters – it depends on the body. But they are doing lunar and Mars and other projects all over the solar system – more and more every year. And I don’t think they miss their targets by much.
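
A sketch of that orientation check: regress the observed three-axis record against the predicted three-axis tidal vector with a full 3×3 matrix plus offsets, then take the nearest rotation (via SVD) as the estimate of the instrument’s true orientation. The array names here are placeholders for the model and station data:

```python
# Estimate instrument orientation from three-axis tidal data.
# `predicted` and `observed` are assumed (N, 3) arrays: the Newtonian tidal
# vector in local North-East-Vertical, and the instrument's three channels.
import numpy as np

def fit_orientation(predicted, observed):
    """Least-squares fit observed ~ predicted @ M.T + b, then polar-decompose M."""
    A = np.hstack([predicted, np.ones((len(predicted), 1))])   # (N, 4) design matrix
    coef, *_ = np.linalg.lstsq(A, observed, rcond=None)        # (4, 3) coefficients
    M = coef[:3, :].T        # 3x3 mixing matrix (rotation plus per-axis gains)
    b = coef[3, :]           # per-axis offsets
    U, s, Vt = np.linalg.svd(M)
    if np.linalg.det(U @ Vt) < 0:    # keep a proper rotation (det = +1)
        U[:, -1] *= -1
    R = U @ Vt               # nearest rotation: the implied instrument orientation
    return R, s, b           # rotation estimate, per-axis scale factors, offsets
```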
 
I call this method of using gravitational signals for precise position AND orientation “gravitational GPS”. It works in caves, under water, in bore holes, or in places where electromagnetic signals are not going to work. It is mainly a long term thing with the current sensors. I think both Liang Cheng Tu and Richard Middlemiss used cell phone MEMS accelerometer technology as a starting point for their “gravimeters” (something that can measure the sun and moon tidal signal). Could that, or some other technology, be pushed so that everyone can carry around a three axis gravimeter and use it to know where they are from the local gravitational gradient changes? Kind of a stretch, but probably possible. I do know that there was a contract announcement for methods to update the magnetic orientation network to the levels needed for cities and the military now. I am just too tired to see now, after three hours of writing from memory.
 
I cannot just tell people what to do. I cannot pay them. I cannot afford most of the detectors, and I cannot run an array of them myself because it requires spreading them to every part of the globe. But time of flight (high sampling rate) gravimeters and other gravitational detectors can focus inside the earth, on the surface of the sun, at the surface of the core of the earth, or in the atmosphere or the oceans – and image them, with the data used to improve climate and gravity models. It is not hard, just tedious. But I think the rewards and returns are worth it. Because in the process we could measure the speed of gravitational field adjustments as precisely as electromagnetic ones, use such signals for imaging, and use them for communication. I have some examples.
 
Sorry to write so long. I am a cautious person. I spent most of my life in public service in health, education, international development, energy, transportation – planning, modeling, gathering and organizing data for global problems.
Oh, I did also track most all of the levitation methods – field methods for moving masses – acoustic, electromagnetic, magnetic, beams, lasers, explosions. There are microgravity groups nullifying gravity that way. I would say, as I have for 40 years now, that if you create an acceleration field that matches what gravity does, then it is equivalent. Groups can move things with fields now in programmed and precise paths (orbits, even if they are arbitrary human chosen paths). That is good enough for me. Can we measure and exactly match gravitational acceleration inside whole volumes? Probably. Yes. It is not hard, just tedious. I call that “gravitational engineering”. Hard work, lots of measurements, lots of data, and eventually nearly absolute control over matter and energy. Probably not in my lifetime.
 
Richard Collins, The Internet Foundation