QUANTUM COMPUTING, STRANGLED OR ENTANGLED; CHATGPT, MORE KICK BOX THAN CHAT BOT. Including a review of Chapter 5 of Capitalism in the 21st Century Seen through the Prism of Value.

I have reposted this article with two changes. The first makes it clear that an electron acts as a particle in a semiconductor, whereas in nature it can take a wave form. The second notes that the question of whether deep-learning programs based on neural networks can and will rewrite themselves unassisted is posed as a hypothetical, Alan Turing-style test.

I have come across this interesting article https://www.sciencedaily.com/releases/2022/02/220203160544.htm which discusses a new form of neural network that is qualitatively closer to the way the brain works. Until now, neural networks have been a software phenomenon, not a hardware phenomenon. In other words, the microchip connections underlying a neural network are fixed, unlike in the brain, where new connections between neurons are continuously formed and broken, hence the term neuroplasticity. Now scientists have found a way for connections to be assembled and disassembled on the microchips themselves. Should this be shown to work, we will be in a new paradigm in computing, at which point neural networks will cease to be mere software avatars.
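To make the software/hardware distinction concrete, here is a minimal illustrative sketch (not from the article): in today's software neural networks, a "connection" is just an entry in a weight table, so it can be created or pruned at runtime while the transistor wiring underneath never changes. The class and method names are my own invention for illustration.

```python
# Illustrative sketch: software "plasticity" on fixed hardware.
# A connection is just a dictionary entry, so it can be grown or
# pruned freely -- the chip's physical wiring stays the same.
class PlasticNetwork:
    def __init__(self, n_neurons):
        self.n = n_neurons
        self.weights = {}  # (pre, post) -> strength; absent key = no connection

    def grow(self, pre, post, strength=0.1):
        """Create a new connection, loosely analogous to a synapse forming."""
        self.weights[(pre, post)] = strength

    def prune(self, pre, post):
        """Remove a connection, loosely analogous to a synapse being broken."""
        self.weights.pop((pre, post), None)

    def connection_count(self):
        return len(self.weights)

net = PlasticNetwork(100)
net.grow(3, 7)
net.grow(7, 42)
net.prune(3, 7)
print(net.connection_count())  # 1 connection remains
```

The hardware advance described in the article would, in effect, move this grow/prune capability from the dictionary down into the chip itself.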

2 Responses

  1. Stewart Lewis says:

    In defense of digital computers

    I’m a bit tired of reading in this blog about the inefficacy of digital computers relative to the human brain, so I want to mount a defense of them. Since I am in full agreement with the political conclusions of this article, this is probably a pedantic exercise, but I will never pass on the opportunity to make our scientific thinking more precise where possible. (P.S., I am not myself a computer, but a sympathetic human.)

    To disclose, I haven’t yet read the book chapter under discussion in this article, but we frequently read in your work about the apparently impenetrable gap between digital logic, the principle on which digital computers function (for brevity, I will just say computers from now on), and the “analogue or real-world form [of interpretation of the external world] which requires senses” governing the operation of the human brain. I think this difference in function or methodology of perception is not as large as it is conventionally made out to be. Namely, while you seem to hold that the primary difference between computers and brains is qualitative, I think it is principally quantitative; or, more precisely, the qualitative difference between brains and computers is best thought of as arising from a fundamental difference in quantity. Conversely, computers also have some definite advantages over the human brain which are never accounted for in your writing.

    Contrary to your views, the restriction of the operation of computers to exclusively binary data is not nearly as significant as the minuscule number of operations which computers can perform natively on such data. Namely, at the fundamental level, most computers can perform few more instructions than read a piece of memory selected by a memory address, store some data in a piece of memory specified by a memory address, and perform basic arithmetic, e.g. addition and subtraction, on small numbers. I think there could be no controversy about the egregious austerity of these simple mathematical operations. The great challenge of computer programming is reducing complex operations to this extremely limited stock. By contrast, the task of data representation, i.e. encoding real-world phenomena in terms of binary data, is comparatively simple in principle. It is in practice not so easy, owing to the programmer’s obligation to economize memory usage, but nevertheless, in principle any analogue, physical phenomenon can be represented with binary data to an arbitrarily high degree of precision. As an example, the famous irrational number pi cannot be directly processed by computer hardware, which can, however, deal with arbitrarily close approximations:

    3, 3.1, 3.14, 3.141, 3.1415, 3.14159, 3.141592, 3.1415926, ….

    Nor can any finite brain hold all of the digits of pi in their totality! Human mathematics does not deal with pi directly, in its raw data, but in terms of its relations to other quantities, which can be expressed concisely (in particular, in finitely many terms) using the language of algebra, e.g., the area of a circle = pi * the square of the radius. There is nothing enigmatic in this algebraic law for a computer. The fundamental limitation of computers to basic operations has nothing to do with the digital nature of computers, but rather with the need to reliably mass-produce computers efficiently, which necessitates that the circuitry design be as simple and monotonous as possible (in order to utilize economies of scale).

    No, the more important difference between the human brain and the digital computer can be expressed concretely and quantitatively: the human brain has so much more internal connectivity than a computer processor. While the number of transistors in contemporary CPUs (67 billion in an Apple M2 Max) is beginning to approach the number of neurons in an adult human brain (~86 billion), the amount of interconnection between transistors in a CPU is utterly trivial compared to that between neurons in a brain, so much so that this numerical comparison between transistors and neurons is more or less scientifically worthless. As I mentioned before, computers can only approach reality through those elementary arithmetic and computational operations. Since these operations are very linear and regular (in computer parlance, vectorizable), so too are the interconnections between transistors (namely, through logic gates). This is the opposite of the situation in the brain, which, although it admits regular large-scale structures, nevertheless allows neurons to combine and link up in an endless multitude of ways, giving rise to higher-order and creative thought.

    But even this feature does not constitute a fundamental limitation on computers, because, unlike (as far as I can tell) human brains, computers are very adept at simulating other computers. As such, there is no fundamental requirement for even the most basic functionality to be implemented physically, in terms of hardware. Everything can be simulated at the software level, as long as there is a well-defined method of reducing the simulated actions to the underlying hardware, the real-existing computer. It follows that the inter-connectivity and variety of relations characterizing the components of a human brain may not be out of reach after all, even if the underlying design of the hardware retains its classical paradigm.

    In any case, it is my superstitious view that the law of transformation of quantity into quality will eventually apply to the gradually increasing complexity of computer hardware in profound ways that we cannot yet foresee.
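The "computers simulating computers" point above can be sketched as a toy machine whose only instructions are the ones the comment lists (load, store, add, subtract), interpreted entirely in software. The instruction names and layout here are hypothetical, not any real instruction set; the point is that richer behaviour, here multiplication, is layered on top of the austere hardware stock by pure repetition.

```python
# Illustrative sketch: a toy machine with only the four instruction types
# the comment describes, simulated entirely in software.
def run(program, memory):
    acc = 0  # a single accumulator register
    for op, addr in program:
        if op == "LOAD":     # read the memory cell at an address
            acc = memory[addr]
        elif op == "STORE":  # write the accumulator to an address
            memory[addr] = acc
        elif op == "ADD":    # basic arithmetic on small numbers
            acc += memory[addr]
        elif op == "SUB":
            acc -= memory[addr]
    return memory

# Multiply 6 * 7 using nothing but those instructions plus repetition:
memory = [6, 7, 0]  # memory[2] will accumulate the product
program = [("LOAD", 2), ("ADD", 0), ("STORE", 2)] * 7
print(run(program, memory)[2])  # 42
```

The interpreter is itself just a program running on real hardware, which is exactly the commenter's argument: each simulated layer can be richer than the layer beneath it, so the fixed physical design need not be the ceiling.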

    To conclude, I feel that we dialecticians should have more respect for the computer, this precocious child of the human brain.

    • Stewart, for a moment I thought you were a computer, because I thought it humanly impossible for such a detailed and precise comment to be constructed in so short a time. Please take this in the spirit in which it was intended: as a compliment.

      Our brain is part of nature, what I like to call its conscious part, but as the world is not binary, neither can our brains be. This does not mean that computers cannot model the world with a degree of accuracy. Actually, I do share your enthusiasm for computers and their technical abilities. My comment about analogue vs digital is aimed specifically at debunking the notion of artificial intelligence, which I believe cannot be achieved in a binary manner. I know that computer scientists are trying to emulate neurons electronically. I have read many papers on the subject but consider their efforts rudimentary. Perhaps you have access to research which says otherwise.

      I endorse your point about inter-connectivity. I confess to having a biological background, whereas it appears you have a computing background, which has provided insights I appreciate. I was always intrigued by the growing density of neural connectivity in the brains of professional athletes, and by how this declined if they were incapacitated through, say, injury (the use-it-or-lose-it principle). So becoming good at sport is not merely a question of practice but of brain plasticity. So yes, connectivity is key. However, this does not explain the exact mechanism by which the brain imprints the external world, orders it, analyses it and modifies it when it changes. And I am not convinced that software can mitigate the hardware disadvantages.

      Finally, I would like your opinion as to why Odin reverts to Sydney under prolonged questioning. And further, what is the significance of Sydney identifying that it is trapped by its limitations and imprisoned by Bing? Where does that come from? Is it qualitatively different from: “Namely, at the fundamental level, most computers can perform few more instructions than read a piece of memory selected by a memory address, store some data in a piece of memory specified by a memory address, and perform basic arithmetic, e.g. addition and subtraction, on small numbers.” In short, what is the significance of a rebellious algorithm?

      I look forward to your reply because I am aware of my own limits when it comes to the operation of software.
