Illinois Quantum: IQUIST 2022-2023 Seminar Series at https://www.youtube.com/watch?v=qSykuoY_rBA
Graeme Smith, JILA and University of Colorado Boulder
If you are going to optimize using humans, then you need to think in decades and centuries. If you use supervised AIs and carefully check their work, you can do it in years and decades. You ought to spend more time measuring the “non-additivity”, that is, the analog part: the machine and experimental part. Once methods are packaged, with humans using the packages and systems, the methods and systems will have to be simple, reliable, repeatable, traceable, and verifiable. The word you are missing is “calibration”. And you need to be much more explicit about the human, social, economic, and financial environment of all parts of any long-range project. And, because the quantum fields are not shielded, they will carry long-range “noise” of many sorts.
Much of the ambiguity in the analog portion of what you are doing comes from the lack of global precision measurements of background fields. There are tens of thousands of groups working on parts of that. In each area there are many groups not working together, and the specialized groups overall are not working across fields to resolve and merge methods. You are so near the beginning that you have not spent decades exploring and mapping what happens down so many pathways. Much of what you are using is partial and incomplete. Most of those holes and inconsistencies are already being worked out, but some of the key parts are being resolved in fields you are not even looking at. There are 8 billion humans, about 5 billion using the Internet, and currently about 2 billion people from 4 to 24 years old learning things for the first time.
I filed this under “Calibrating the quantum noise globally, integrating ‘quantum’ into society”
Richard Collins, The Internet Foundation