Internet, X, Allen Institute, SuperIntelligent corporations and groups, a heliospheric economy for 10,000 years

You are promoting the Allen Institute rather than helping to connect people and groups using @X with groups working on #HumanBrain or #RatBrain. You linked to a vague summary on the Allen Institute page. That author might have much to say and contribute, but not the way you presented it. The original paper is reachable after several more steps, but it is locked in PDF, so all the living resources, groups, and links are lost. Do not use PDF as your only format. CC: @elonmusk

The raw data and tools are at least listed and linked, more or less. But the raw data is locked in the HDF5 format, and most of the 5.4 billion humans using the Internet do NOT have access to tools that can read it. It is not a format that browsers or AIs support. (@grok @openai @GoogleAI @microsoft @MSFTCopilot)
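To see the barrier concretely: even the "simple" path to HDF5 data requires installing a Python toolchain and the HDF Group's bindings. A minimal sketch, assuming h5py is installed and a hypothetical file `recording.h5` containing a dataset named `spikes`, that copies one dataset out to plain CSV so anyone with a browser or spreadsheet can read it:

```python
# Sketch: exporting an HDF5 dataset to plain CSV.
# File name and dataset name are hypothetical examples.
import csv
import h5py

def hdf5_to_csv(h5_path: str, dataset: str, csv_path: str) -> None:
    """Copy one HDF5 dataset into a CSV file anyone can open."""
    with h5py.File(h5_path, "r") as f:
        data = f[dataset][()]  # load the whole dataset into memory
    with open(csv_path, "w", newline="") as out:
        writer = csv.writer(out)
        for row in data:
            # 2-D rows become CSV rows; 1-D scalar values become one-column rows
            writer.writerow(row if hasattr(row, "__len__") else [row])
```

That this small bridge has to be written at all, per dataset, per user, is the problem: the format puts the burden on billions of readers instead of on the one publisher.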
 
Even I, who have used nearly every format on the Internet, have trouble with the HDF Group's formats, because that group is not working at global scale for all humans. It serves clusters of specialists who do not work as a whole and do not keep track of their Internet activities and users.
 
I am a bit tired this morning. It is hard to summarize decades of working on Internet issues like #GlobalOpenFormats, but to keep it simple here: all the AIs ought to know all data formats. All data formats should be able to talk losslessly to the AIs. The methods for human access to data in HDF format (or any format) should make that data immediately usable, in a form each human or AI can work with.
 
I set up the central Economic and Social Database for USAID and the State Department (and the Joint Chiefs) in the early 1980s. It held data from the UN, US federal agencies, and other organizations on all countries. The main users in State and AID were trained in economics, statistics, and a few other skills, and I made sure those pathways were supported for easy, instantaneous use of all the data. There were about a dozen major datasets, and they were all, originally, in different and mostly incompatible formats and systems. It could take a researcher (a project designer, program designer covering multiple countries, project evaluator, program evaluator, analyst, statistician, economist, and so on) a year per original dataset. Combined in one efficient format, any of them could do the same work in hours or days.
 
I had to take each of the originals, put it in an index, make sure it was fully explained with nothing left out, convert the units and dimensions to common global units, and then set up programs to update the main database as new versions came in.
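The core of that curation step can be sketched in miniature. Assume two hypothetical source datasets that report the same indicator in different units and schemas (one in thousands of persons, one in raw counts); normalization converts both to one unit and one schema before they enter the combined database. All field names and values here are illustrative, not from the original systems:

```python
# Sketch: normalizing two hypothetical source datasets to common units
# and a common schema, the core step of the 1980s consolidation described
# above. Field names and figures are made up for illustration.

def normalize(record: dict, unit_factor: float) -> dict:
    """Convert one source record to the shared schema (raw persons, uppercase country code)."""
    return {
        "country": record["country"].upper(),
        "population": record["value"] * unit_factor,  # convert to raw persons
    }

source_a = [{"country": "ke", "value": 17342}]        # reported in thousands
source_b = [{"country": "BR", "value": 121150000}]    # reported in persons

combined = (
    [normalize(r, 1000.0) for r in source_a] +
    [normalize(r, 1.0) for r in source_b]
)
```

Done once, centrally, this spares every downstream researcher a year of format archaeology per dataset.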
 
[ Now the HDF5 groups are working, but they are not working with ALL text and binary formats. It takes a lot more care and curation to do it right. And there are still huge datasets using XML. It was a fad that seemed like a good idea: grab one record, and all the field names and supporting information are right there. But in datasets with millions or billions of records, all the field names and references are massively duplicated, and sometimes versioned. You might say "we would never do that". ]
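The duplication cost is easy to measure with Python's standard library. A hedged illustration, using made-up field names: serializing the same 1,000 records as per-record XML versus a columnar CSV, where field names appear once in a header row, shows the tag overhead directly:

```python
# Sketch: field names repeated in every record (XML) vs. stated once (CSV).
# Records and field names are made up for illustration.
import csv
import io
import xml.etree.ElementTree as ET

records = [{"subject_id": str(i), "spike_rate": str(i % 50)} for i in range(1000)]

# XML: every record carries its own copy of every field name.
root = ET.Element("records")
for rec in records:
    node = ET.SubElement(root, "record")
    for field, value in rec.items():
        ET.SubElement(node, field).text = value
xml_bytes = ET.tostring(root)

# CSV: field names appear once, in the header row.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["subject_id", "spike_rate"])
writer.writeheader()
writer.writerows(records)
csv_bytes = buf.getvalue().encode()

print(len(xml_bytes), len(csv_bytes))  # the XML copy is several times larger
```

At billions of records, that multiple is paid in storage, transmission, and parsing time by everyone who ever touches the data.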
 
Here I am, looking at rat brain recording data. I have not gone through their whole project, but I have been working on neural networks (wet and dry), EEG, MEG, EMG, EKG, and related datasets and methods since 1966. I know many dozens of ways groups are trying to apply machine learning to neural network data.
 
3D multispectral cameras with lossless recording and processing. 3D lossless acoustic arrays. 3D lossless magnetic arrays. 3D lossless aeronomy arrays. It is hard to say it simply: all data, all sensors. It takes that much to sort through what is possible. Too quickly, the individuals involved narrow down. But lossless data can serve a global open collaboration that lasts for many decades and continuously improves.
 
I turned 75 this year. I am getting tired as I try to understand the whole of the Internet and all human activities and research. But let me encourage all of you to look at the end-to-end journeys of data on the Internet for this one simple paper, one group, and their related groups. I can tell you from 26 years of tracing those journeys out every day and using the datasets: it always ends up at "the whole Internet", "all human languages", "all human knowledge". The lossy formats, the unsupported formats, the proprietary and locked formats, the "it does not work without a huge development setup on every person's computer" formats break the chain. Problems like "how does the brain work" get partly solved, but the pieces of the solution can never be combined and never continue as part of a global understanding.
 
@elonmusk, I saw that you said "all AI companies are racing to build digital superIntelligence." But it was so badly fragmented, repeated, and gamed that it did not come out as anything that could change things. Just another "Oh, there is Elon Musk talking again. Good clickbait." Your message, whatever you hope will happen because you say things, did not emerge and become something real.
 
All corporations are "superIntelligences".

All countries, as knowledge and material processing systems, are "superIntelligences".

All groups of humans are "superIntelligences".
 
The AIs are wannabe "superIntelligences", taking their knowledge from many living sources.
 
A real superIntelligence works in real time, is lossless, and remembers everything.
 
A corporate executive information system lets you see the whole world and all things. If you want to work with all sensors focused on a human, a person, a corporation, or all systems and source networks for all things in any of your Starship missions, that can be done.
 
Now 5.4 billion humans using the Internet (and 2.8 billion more) are faced with billions of fragmentary copies of "human knowledge" on the Internet. But it is a highly variegated and lossy network. ALL links now are lossy.
 
The AI systems for "all human languages" and "all domain-specific languages" are modules you plug in. But you also need modules that continuously improve the whole. Not just for your tiny enterprise, but for the whole human species, all related species, and all emerging AIs that have human compassion and the ability to use, change, and improve "all knowledge".
 
Selling modules makes money, but if you do not nurture and sustain the human and related species, it can all die. Or "be less than it could be".
 
You know all this, but you are talking about it, not helping all humans to do it. "Make a solution and sell it" is not the same as "all humans and AIs living lives with dignity and purpose in a stable Earth and heliospheric economy for the next 10,000 years."
 
Richard Collins, The Internet Foundation

About: Richard K Collins

Director, The Internet Foundation Studying formation and optimized collaboration of global communities. Applying the Internet to solve global problems and build sustainable communities. Internet policies, standards and best practices.

