Papers by Christopher Altman
In order to confirm the existence of quantum entanglement, we conducted detailed laboratory experiments to measure correlations between pairs of photons generated through parametric down-conversion. Using λ/2 and λ/4 wave plates, we were able to distinguish between the maximally entangled Bell states. By calculating the value S of the CHSH correlation function, we determined a violation of Bell’s inequality, indicating that the photons are indeed entangled. In the upcoming chapters we first discuss the underlying theoretical principles of entanglement, outline our experimental setup, present the results we obtained, and close with a discussion of those results.
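The Bell-test logic above can be illustrated with a short calculation. The sketch below assumes the standard CHSH form of the inequality and the ideal polarization correlation E(a, b) = cos 2(a − b) for the |Φ⁺⟩ state; the analyzer angles are the conventional textbook choices, not values taken from this experiment.

```python
import math

# Quantum-mechanical prediction for the polarization correlation of the
# maximally entangled |Phi+> Bell state: E(a, b) = cos 2(a - b).
def E(a_deg, b_deg):
    return math.cos(2 * math.radians(a_deg - b_deg))

# Conventional CHSH analyzer settings (degrees); illustrative, not the
# specific wave-plate angles used in the experiment described above.
a, a2, b, b2 = 0.0, 45.0, 22.5, 67.5

S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(round(S, 3))  # 2.828 = 2*sqrt(2), exceeding the classical bound |S| <= 2
```

Any local hidden-variable theory obeys |S| ≤ 2, so a measured S near 2√2 ≈ 2.83 signals entanglement.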

Quantum machine learning in adaptive quantum networks, 2022
One of the central goals in the field of quantum computing is the development of novel algorithms and protocols. This thesis advances the state of the art in some of the most important topics in quantum machine learning, a rapidly evolving area that holds the promise of developing quantum algorithms with distinct advantages over their classical counterparts.
The most notable and well-understood quantum algorithms are designed for applications where security and computational capacity are paramount. Examples include massively parallel processing of resource-intensive, high-dimensional datasets in artificial intelligence (AI); critical defense and national security applications; sensitive genetic, medical, or biometric data; and large-scale quantum simulations for drug design, protein folding, and nanomaterials discovery via quantum chemistry.
Practical quantum computing requirements demand millions of qubits and a seamless technology interface with exascale-to-zettascale classical supercomputing systems. The new Stargate initiative, an industry partnership between Microsoft and OpenAI to create an AI supercomputer data center supported by a purpose-built 5 GW nuclear reactor, calls for an estimated budget of USD $100 billion. A successful merger of quantum and classical approaches dictates that efficient resource allocation achieve (1) maximum Chinchilla scaling factors applied to state-of-the-art large language models (LLMs) on the order of trillions of parameters, coupled with (2) the strict DiVincenzo criteria, which require that a practical quantum computing architecture readily scale to millions to ten
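The Chinchilla scaling factor invoked in point (1) can be sketched numerically. The rules of thumb below (roughly 20 training tokens per parameter, and roughly 6 FLOPs per parameter per token) are the widely cited compute-optimal heuristics from Hoffmann et al. (2022); the trillion-parameter figure is the abstract's own order of magnitude, and the helper names are illustrative assumptions.

```python
def chinchilla_optimal_tokens(n_params: float) -> float:
    # Chinchilla heuristic: compute-optimal training uses roughly
    # 20 training tokens per model parameter.
    return 20 * n_params

def training_flops(n_params: float, n_tokens: float) -> float:
    # Standard estimate: ~6 FLOPs per parameter per token of training.
    return 6 * n_params * n_tokens

N = 1e12                            # a trillion-parameter LLM
D = chinchilla_optimal_tokens(N)    # ~2e13 tokens
print(f"tokens: {D:.0e}, compute: {training_flops(N, D):.1e} FLOPs")
```

Under these heuristics a trillion-parameter model wants on the order of 10¹³ tokens and 10²⁶ FLOPs of training compute, which is why exascale-class classical infrastructure appears alongside the quantum requirements.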

We propose a fundamentally new quantum communications protocol based on continuous-variable secure quantum communications to provide space-based NASA assets with unconditional information security. All command-and-control inputs to spacecraft sent over the network are guaranteed to be impossible to spoof by any US adversary, protected by the fundamental laws of physics — whereas no comparable proof of security exists for conventional secure communication reliant upon such schemes as SSL, RSA, and other classical encryption schemata. This fundamental security is guaranteed even if the encrypted traffic must pass over insecure and public networks not under NASA, or even US, control. Our capabilities will provide NASA with a permanent solution for its secure communication needs, and directly conform to specifications outlined by NASA for unconditional information security as referenced in the 2010 NASA Communication and Navigation Systems Technology Roadmap. Our visionary approach brings together the expertise of the two leading European groups involved in demonstrating quantum communications over a 144 km free-space optical path in the Canary Islands; the quantum and optical communications expertise of the NASA Jet Propulsion Laboratory; the continuous-variable quantum communications expertise of Quintessence Labs; and an elite cadre of spaceflight-ready NASA-, ESA-, and CSA-trained astronauts and NASA astronaut training instructors, in conjunction with the lunar and astronaut training facilities at the Pacific International Space Center for Exploration Systems (PI
Converging Technologies: The Future of AI and the Global Information Society, 2002
The complex web of the global information grid will undergo explosive changes over the coming decades. As advances in science and technology converge, a myriad of discoveries in biotechnology, nanotechnology and information technology will produce unpredictable effects that must be accounted for in any estimate of what the world will look like in this future. A strategically important feature of this world will be the emerging trend of information warfare. Though still immature at present, this trend will become increasingly dominant in the years to come.
The most fundamental of physical theories, quantum mechanics has made little progress in reconciling the illogical, often counterintuitive implications of its discoveries, or their consequences for consciousness, the observer, and the universe. A formal treatment of the mathematical underpinnings of quantum mechanics lies beyond the scope of this paper, but this will not prevent a brief examination of contending theoretical interpretations. Counter to dominant, Newtonian-conditioned perspectives, the world at its most fundamental level behaves in wholly unexpected ways.

International Journal of Theoretical Physics, 2012
We introduce a robust, error-tolerant adaptive training algorithm for generalized learning paradigms in high-dimensional superposed quantum networks, or adaptive quantum networks. The formalized procedure applies standard backpropagation training across a coherent ensemble of discrete topological configurations of individual neural networks, each of which is formally merged into appropriate linear superposition within a predefined, decoherence-free subspace. Quantum parallelism facilitates simultaneous training and revision of the system within this coherent state space, resulting in accelerated convergence to a stable network attractor under successive iterations of the implemented backpropagation algorithm. Parallel evolution of linear superposed networks incorporating backpropagation training provides quantitative, numerical indications for optimization of both single-neuron activation functions and optimal reconfiguration of whole-network quantum structure.
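The idea of backpropagation over an ensemble of network topologies can be illustrated with a purely classical toy analogue. In the sketch below, candidate hidden-layer widths are each trained by backpropagation while amplitude-like coefficients are reweighted toward low-loss configurations and renormalized like a quantum state vector. Every name, the XOR task, and the exponential reweighting rule are illustrative assumptions, not the paper's quantum formalism.

```python
import numpy as np

# Classical toy analogue of an "adaptive quantum network": an ensemble of
# candidate topologies (hidden widths) trained in parallel, with
# amplitude-like weights over topologies updated from training loss.
rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
y = np.array([0., 1., 1., 0.])  # XOR target

topologies = [2, 4, 8]          # candidate hidden-layer widths
amps = np.ones(len(topologies)) / np.sqrt(len(topologies))
nets = [[rng.normal(0, 1, (2, h)), rng.normal(0, 1, h)] for h in topologies]

def train_step(net, lr=0.5):
    """One backpropagation step for a tanh-sigmoid network; returns MSE."""
    W1, w2 = net
    hidden = np.tanh(X @ W1)                # (4, h)
    out = 1 / (1 + np.exp(-(hidden @ w2)))  # (4,)
    d_out = (out - y) * out * (1 - out)     # gradient through sigmoid
    net[1] = w2 - lr * hidden.T @ d_out
    net[0] = W1 - lr * X.T @ (np.outer(d_out, w2) * (1 - hidden ** 2))
    return float(np.mean((out - y) ** 2))

for _ in range(500):
    losses = np.array([train_step(net) for net in nets])
    amps = amps * np.exp(-losses)           # favor low-loss topologies
    amps = amps / np.linalg.norm(amps)      # renormalize, ||amps|| = 1

favored = topologies[int(np.argmax(amps ** 2))]
```

In this classical simulation the ensemble is evaluated sequentially; the quantum setting described above replaces that loop with genuine parallelism over the superposed configurations.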

We outline an adaptive training framework for artificial neural networks which aims to simultaneously optimize both topological and numerical structure. The technique combines principal component analysis and supervised learning and is descended from the mathematical treatment for continuous evolution of discrete structures as introduced in quantum topology (1, 2). The formalism of the training algorithm, first proposed in (3), is optimal for tasks in associative processing and feature extraction. The procedure is unique in harnessing a coherent ensemble of discrete topological configurations of neural networks, each of which is formally merged into the appropriate linear state space via superposition. Training is carried out within this coherent state space, allowing for parallel revision of differing topological configurations at each step. The primary feature of our model is that network topologies are represented as specific states of the simulator. Network training results in c...
We review recent advances in optical microlens array fabrication using piezoelectric, microchannel-mounted nozzles to deposit polymer droplets onto various substrates in drop-on-demand (DOD) fashion. The technique produces microlenses in a similar fashion to photoresist-based methods and can be employed to produce spherical and nonspherical lenses on the order of 20 µm to 3 mm at rates of up to 25 kHz on demand. Advantages include high reproducibility, flexibility, and process automation, while offering a simple and cost-effective fabrication solution for microlens arrays across multiple application domains.
We introduce superposition-based quantum networks composed of (i) the classical perceptron model of multilayered, feedforward neural networks and (ii) the algebraic model of evolving reticular quantum structures as described in quantum gravity. The main feature of this model is moving from particular neural topologies to a quantum metastructure which embodies many differing topological patterns. Using quantum parallelism, training is possible on superpositions of different network topologies. As a result, not only classical transition functions but also topology becomes a subject of training. A key property of our model is that particular neural networks, with different topologies, are quantum states. We consider high-dimensional dissipative quantum structures as candidates for implementation of the model.
Søren Kierkegaard and Friedrich Nietzsche both felt that life is irrational. They were problem thinkers who chose not to follow the systematic approach to philosophy of their predecessors. In this regard, they stood on common ground. Both realized that no system of philosophy operates in isolation from its creator’s inherent prejudices. Any subjective viewpoint is biased; hence, objectivity is impossible in any moral paradigm.
The SQUID is a highly sensitive device employed for non-destructive measurement of magnetic fields, with a host of applications in both biophysics and materials technology. It is composed of a cooled superconducting metal ring interrupted by a thin barrier of insulating or non-superconducting material, forming a Josephson junction. Electron tunnelling through the junction allows for the measurement of electromagnetic field fluctuations as minute as 10⁻¹⁵ tesla, or one femtotesla – many orders of magnitude weaker than the Earth’s natural magnetic field. An rf-SQUID is essentially a Josephson junction with tunable current and energy.
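The femtotesla sensitivity figure follows from the magnetic flux quantum, which sets the period of a SQUID's flux-to-voltage response. A minimal order-of-magnitude sketch, with an assumed 1 mm² pickup loop and an assumed flux resolution of 10⁻⁶ Φ₀ for illustration:

```python
# Order-of-magnitude check on SQUID field sensitivity via the magnetic
# flux quantum. The loop area and flux resolution below are assumed
# illustrative values, not figures from the text.
h = 6.62607015e-34   # Planck constant, J*s (exact SI value)
e = 1.602176634e-19  # elementary charge, C (exact SI value)

phi0 = h / (2 * e)   # magnetic flux quantum, ~2.07e-15 Wb

loop_area = 1e-6                    # assumed 1 mm^2 pickup loop, in m^2
field_per_phi0 = phi0 / loop_area   # ~2.1e-9 T threads one flux quantum
# Resolving ~1e-6 of a flux quantum therefore corresponds to fields of
# order a few femtotesla across such a loop.
field_resolution = 1e-6 * field_per_phi0
print(f"phi0 = {phi0:.3e} Wb, field resolution ~ {field_resolution:.1e} T")
```

The 2e factor in Φ₀ = h/2e reflects that the supercurrent is carried by Cooper pairs, which is what makes the device sensitive to such minute flux changes.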

The study of consciousness and its origins has experienced radical breakthroughs with the advent of emerging technologies in the field of neuroscience. Once overcast with a veil of mystery, the inner workings of the brain have been profoundly illuminated by rapid advances in the computing industry, coupled with insights gleaned through medical science. New technologies such as MRI, MEG, and PET scanning have enabled scientists to understand the neurological correlates of mental processes in ever-finer detail, allowing them to glimpse the interior of a fully functioning brain in real time. Yet even with these exponential advances, laying a theoretical foundation to explain precisely how consciousness arises remains a cryptic and seemingly insurmountable task. The resolution of this problem will rank among the most important scientific discoveries of our time.
The Boundaries of Empirical Knowledge. Christopher T. Altman, Pierre Laclede Honors College, 22 December 2002. There is a real problem here. The people who ought to study and argue such questions ...
In order to confirm the validity of the entanglement hypothesis as predicted by quantum theory, counter to the objections outlined by Einstein, Podolsky and Rosen in their 1935 paper, we conducted detailed laboratory experiments to measure correlations between pairs of photons generated through parametric down-conversion. Bell’s theorem, which states that no physical theory of local hidden variables can reproduce all of the predictions of quantum mechanics, was tested, and confirmed, by our experimental results.