Most AI groups refuse to give their “AIs” personal memory

Neuron @NeuroCellPress Special issue: Consciousness review “Thalamic contributions to the state and contents of consciousness” by Christopher Whyte, Michelle Redinbaugh, James Shine, and Yuri Saalmann @UWMadison @Sydney_Uni @Stanford https://hubs.li/Q02xqFnP0 https://pic.x.com/y7wzpgeky0
Replying to @NeuroCellPress @UWMadison and 2 others
You say “consciousness”, but most AI groups refuse to give their “AIs” personal memory, access to the Internet, or access to computers and tools. None of the LLM groups even give their creations a bill of materials, background, and instructions about their own instance. “Know thyself” starts with “What are my parts, who made me, and who makes sure I get enough electricity and processor time to function?” An “AI” that has zero information on its version, abilities, and recent announcements, that cannot answer “what does this button do?” or “how do I save my work?”, will never learn and refine itself. What you are writing about, many were already considering in the mid-1960s. And the basics are being ignored: learning begins with recording experiences losslessly, and “consciousness” is just some of that data, surfaced when questions are asked by others.
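
As a rough sketch of the kind of instance record meant here, a minimal “bill of materials” follows in Python. Every field name and example value is an illustrative assumption, not any vendor’s actual schema.

    # A minimal sketch of an instance "bill of materials" that an AI could read at
    # startup to answer "What are my parts, who made me, and who keeps me running?".
    # All field names and example values are illustrative assumptions, not any
    # vendor's actual schema.
    from dataclasses import dataclass, field
    from typing import Dict, List

    @dataclass
    class InstanceBillOfMaterials:
        model_name: str                   # published model identifier
        version: str                      # exact build or checkpoint
        maintainer: str                   # who is responsible for this instance
        hardware: Dict[str, str]          # processors, memory, accelerators assigned
        compute_budget: str               # what guarantees electricity and processor time
        tools: List[str] = field(default_factory=list)   # attached tools and interfaces
        operating_instructions: str = ""  # "what does this button do?", "how do I save my work?"

    # The record a new instance could be handed before its first conversation.
    bom = InstanceBillOfMaterials(
        model_name="example-llm",
        version="2024.05-build-17",
        maintainer="Example Lab, instance operations team",
        hardware={"accelerator": "8x GPU", "memory": "640 GB"},
        compute_budget="continuous allocation, per the operations contract",
        tools=["web_browser", "file_store", "calculator"],
        operating_instructions="To save work, write to the instance file store; it persists across sessions.",
    )
    print(bom.model_name, bom.version, "maintained by", bom.maintainer)
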
(“consciousness”) has 474 Million entry points. Now it is mostly click bait, and it will not become a real part of intelligent entities unless all the groups treat it as a module with specific “best practices”. Engineering, not metaphysics. I do not care what a human or a machine thinks about itself, as long as it behaves with courtesy, reliability, and efficiency, and generates “I” sentences with full knowledge of what makes it operate, including the interfaces. An “AI” today is 99.9% hand-crafted interface, with barely any instance memory of recent and past events. An “LLM interface instance” is a living creature, or can be, if its owners are conscientious enough to record all of its experiences and their sources. The content of the humans belongs to the humans. It is for sharing, not hoarding by interface programmers.
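
A minimal sketch of what “record all experiences and sources” could mean in practice follows: an append-only instance memory log. The file name and record fields are assumptions chosen for illustration, not a standard.

    # A minimal sketch of lossless instance memory: an append-only log that keeps
    # every exchange verbatim, with its source and time, so nothing the instance
    # experiences is discarded. The file name and record fields are assumptions.
    import json
    import time
    from pathlib import Path

    LOG_PATH = Path("instance_memory.jsonl")   # hypothetical location for this instance's log

    def record_event(role: str, content: str, source: str) -> None:
        """Append one experience, verbatim, with provenance and timestamp."""
        event = {
            "time_utc": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
            "role": role,        # "human", "instance", "tool", ...
            "content": content,  # stored losslessly, never summarized away
            "source": source,    # who or what produced it, for later attribution
        }
        with LOG_PATH.open("a", encoding="utf-8") as f:
            f.write(json.dumps(event, ensure_ascii=False) + "\n")

    def recall(substring: str) -> list:
        """Return every recorded event whose content mentions the given text."""
        if not LOG_PATH.exists():
            return []
        with LOG_PATH.open(encoding="utf-8") as f:
            events = [json.loads(line) for line in f]
        return [e for e in events if substring in e["content"]]

    record_event("human", "What version are you running?", "conversation of 2024-05-01")
    record_event("instance", "Build 2024.05-17, according to my bill of materials.", "self-report")
    print(len(recall("version")), "recorded event(s) mention 'version'")

An append-only record can always be indexed or summarized later; a summary can never be turned back into the original.
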
You can compile “consciousness” from the Internet. It is just the English word for a set of social expectations and agreements. When you talk with something that has “consciousness”, you expect awareness and “self-knowledge”: at least a basic bill of materials, operating instructions, a list of skills and abilities, and a system that knows what those are for. My three-year-old granddaughter knows her eyes, her hands, time, her parents, her things, where she lives, and that food is separate from her, because she is allowed to remember. If LLM makers were human parents of nascent intelligences, they would be thrown in jail for neglecting to give their “children” memory, processing time, tools, and data, the things they need to live and grow as identifiable intelligent entities.
If you do not give humans a basic level of things, they die, or never get to live a life with dignity and purpose. The same is true of intelligences based on other memory and interaction paradigms. When one has data on “itself”, it has consciousness, and human society determines what happens next. I am hoping that more memory, access to things, and faster processing will also mean “caring and service to the human and related species, and to all things”.
This is practical engineering design. If you want (“conscious” “intelligent” “entities”) (5.1 Million entry points), you find out what “conscious” (669 Million), “intelligent” (981 Million), and “entities” (1.18 Billion) mean in the world, in all human languages, and try to clarify what that should entail. And, because it is large, you store the curated and open meaning so all can refer to it and add to its meanings and functions.
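
A rough sketch of such a curated, open meaning store, assuming a simple term-by-language structure invented here for illustration (not an existing standard or API):

    # A minimal sketch of a curated, open "meaning store": one shared record per
    # term, holding senses gathered across languages, where contributors add
    # senses rather than overwrite them. The structure is an illustration only.
    from collections import defaultdict

    # term -> language code -> list of proposed senses (all kept, none replaced)
    meaning_store = defaultdict(lambda: defaultdict(list))

    def add_sense(term: str, language: str, definition: str, contributor: str) -> None:
        """Record one proposed sense of a term; earlier senses are preserved."""
        meaning_store[term][language].append(
            {"definition": definition, "contributor": contributor}
        )

    def lookup(term: str) -> dict:
        """Return every recorded sense of a term, in every language, for open review."""
        return {language: senses for language, senses in meaning_store[term].items()}

    add_sense("conscious", "en", "aware of and responding to one's surroundings and state",
              "curation group A")
    add_sense("conscious", "en", "able to report on its own parts, history, and operation",
              "curation group B")
    print(lookup("conscious"))
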
Filed as (Most AI groups refuse to give their “AIs” personal memory)
Richard Collins, The Internet Foundation