“Curiosity provides the gradient for unravelling the mechanism and dynamics behind the manifestations of remarkable wonders of mother nature.”

It all started with World War II, when Alan Turing helped the British government crack the Enigma cipher machine and decrypt Nazi Germany’s secret communications.

Curiosity, the ability to ask questions, and the ability to hold multiple perceptions of the same event, object or situation have made humans capable of dominating the Earth. Despite being a lowly Type 0 civilization (0.72, to be precise) on the Kardashev scale, which measures the technological development of a civilization, humanity has achieved a great deal so far.
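For the curious, that 0.72 figure follows from Carl Sagan’s interpolation formula for the Kardashev scale, K = (log10(P) − 6) / 10, where P is the power a civilization harnesses in watts. A minimal Python sketch (the power figure is an approximation):

```python
import math

# Carl Sagan's interpolation formula for the Kardashev scale:
# K = (log10(P) - 6) / 10, with P the power harnessed in watts.
def kardashev_rating(power_watts: float) -> float:
    return (math.log10(power_watts) - 6) / 10

# Humanity's total power consumption is roughly 1.7e13 W (~17 TW, an estimate).
print(f"K = {kardashev_rating(1.7e13):.2f}")  # -> K = 0.72
```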

A few famous quotes from Richard Feynman that I think have paved the way for humanity’s future:

“What I cannot create, I do not understand.” – Richard Feynman

“I would rather have questions that can’t be answered than answers that can’t be questioned.” – Richard Feynman

I firmly believe that the essence of these quotes has been deeply rooted in humans since the very beginning, and that it has helped us reach the key milestones of technology.

We humans have made incredible progress in developing new algorithms and instruments to help us understand the remarkable, optimized infrastructure that three billion years of evolution have built on this pale blue dot.

The existence we all experience and cherish every day is governed by a set of hierarchical structures that work elegantly, to our amazement, and it leads to the most beautiful and perhaps “the” most rewarding gift to our cosmos after the Big Bang: LIFE. Every discovery we have made and every question we have answered so far have helped us unravel the nature of the universe and the rules that govern life itself. We are still on a quest to find answers to the mysteries of the universe.

But let’s pause our everlasting quest for a moment and appreciate the sublime achievements of technology and AI.

Alan Turing’s famous question comes from his 1950 paper “Computing Machinery and Intelligence”, in which he asked, “Can machines think?” In the same paper, Turing proposed a test to determine whether a machine could exhibit intelligent behaviour indistinguishable from that of a human.

Turing replaced his question with the idea of an “imitation game” (now known as the Turing Test), in which a human judge engages in a natural-language conversation with another human and a machine without knowing which is which. If the judge cannot reliably distinguish between the human and the machine based on their responses, the machine is said to have passed the Turing Test and exhibited human-like intelligence.
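To make the protocol concrete, here is a toy sketch of the game’s structure in Python; both respondents are hypothetical stand-ins, not real participants:

```python
import random

# A toy skeleton of Turing's imitation game. Both respondents here are
# hypothetical stand-ins: a real test would put an actual person and an
# actual chatbot behind the same anonymous text channel.

def human_respondent(question: str) -> str:
    return input(f"(human, please answer) {question}\n> ")

def machine_respondent(question: str) -> str:
    return "That is a fascinating question; let me think about it."

def imitation_game(questions):
    a, b = random.sample([human_respondent, machine_respondent], 2)  # hide identities
    for q in questions:
        print(f"Q: {q}\n  A says: {a(q)}\n  B says: {b(q)}")
    guess = input("Judge: which respondent is the machine, A or B? ").strip().upper()
    truth = "A" if a is machine_respondent else "B"
    print("The judge spotted the machine." if guess == truth else "The machine passed.")

imitation_game(["What did you dream about last night?"])
```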

Since then, we have been working hard to make machines think, and to that end, electronics and AI have worked side by side, complementing each other and making each other faster, easier to use, more accurate and more energy-efficient.

AI and Algorithm Development Timeline:

In the late 1950s and 1960s, early AI programs, such as Samuel’s checkers-playing program and McCarthy’s Lisp programming language, were developed. ELIZA, one of the earliest language-processing computer programs, was created in the mid-1960s. In the same decade, Shakey the Robot, one of the first AI-powered robots, was developed at the Stanford Research Institute, and since the 1980s, the development of AI algorithms for machine learning and neural networks, including the backpropagation algorithm, has accelerated.

Along the way, we got one of the most remarkable technologies of all time: Pranav Mistry’s SixthSense, a wearable device that enables gesture-based interaction with the digital world. Companies like Boston Dynamics have shown us the power of AI through their magnificent advanced robots, such as Atlas, Spot, and BigDog. Meanwhile, we made Sophia, a humanoid robot that uses AI to communicate and interact with humans. She is designed to learn from her interactions and to become increasingly intelligent over time.

Furthermore, AI has also stepped into the realm of biological challenges and given us DeepMind’s AlphaFold, which predicts the three-dimensional structure of a protein from its amino-acid sequence, a problem that had long been one of the serious impediments to scientific progress.

Since 2018, AI has taken a drastic leap that has changed the entire world; as you might have rightly guessed, I am talking about the “GPT” era. In 2018, OpenAI released GPT, its first generative pre-trained transformer model, followed by GPT-2 in 2019, GPT-3 in 2020, and GPT-4 in 2023. GPT is a large language model (LLM), trained on vast amounts of text from the internet up to its training cut-off date. OpenAI has made this AI-based language model available to us through an online chatbot platform known as ChatGPT. Although access to GPT-4 through ChatGPT is subscription-based, the GPT-3.5-based version is free to use. ChatGPT’s ability to generate almost any kind of text is remarkable. Additionally, advances in natural language processing and AI in general have given us DALL·E. DALL·E 2 is an AI system that can create realistic images and art from a description in natural language; it can generate images that have never existed before. Here, imagination is literally the only limit.
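For readers who would rather query the model programmatically than through the chat interface, here is a minimal sketch using OpenAI’s Python SDK (the pre-1.0 “openai” package interface; the API key is a placeholder):

```python
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder: use your own key, never commit it

# Ask the GPT-3.5 model (the one behind the free ChatGPT tier) a question.
response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize the Turing Test in one sentence."},
    ],
)
print(response.choices[0].message.content)
```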

GPT-4 has revolutionized communication, enhancing productivity and efficiency while fostering global connections. However, concerns persist about privacy, addiction, mental health, job displacement, and the potential misuse of the technology for manipulation and disinformation.

Electronics Development Timeline:

It is impossible to talk about the growth of AI without talking about the development of electronic devices, especially transistors; the two have had to grow concurrently, each enabling the other.

The transistor has experienced significant development since its invention in 1947 by Bardeen, Brattain, and Shockley. Early point-contact transistors paved the way for the revolutionary bipolar junction transistor (BJT), fuelling the electronics industry’s growth. The 1960s brought the metal-oxide-semiconductor field-effect transistor (MOSFET), heralding the digital era. Modern transistors have reached nanometre scales, with cutting-edge 3nm and 5nm processes in production, and in 2021 IBM announced 2nm chip technology, which promises further gains in performance and energy efficiency.

These tiny sizes enable billions of transistors on a single microchip while minimizing power consumption and boosting performance. This relentless miniaturization, driven by Moore’s Law, has spurred technological advancements across various sectors, profoundly impacting our lives.
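As a back-of-the-envelope illustration of Moore’s Law, treat it as a doubling of transistor count roughly every two years, starting from the Intel 4004’s ~2,300 transistors in 1971 (both the baseline and the exact doubling period are approximations):

```python
def transistor_count(year: int, base_year: int = 1971, base_count: int = 2_300) -> float:
    # Moore's Law as a rough rule of thumb: count doubles every ~2 years.
    return base_count * 2 ** ((year - base_year) / 2)

for y in (1971, 1991, 2011, 2021):
    print(f"{y}: ~{transistor_count(y):,.0f} transistors")
# 2021 comes out near 8e10, the right order of magnitude for today's
# largest chips, which pack tens of billions of transistors.
```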

We have come a long way in developing efficient transistors, but we are still constantly fighting against the quantum tunnelling phenomenon. I would like to mention recent work by Dr Deblina Sarkar at MIT on novel nanoelectronic devices (such as quantum, spintronic, and neuromorphic devices) that employ ingenious device physics and smart nanomaterials to achieve extreme energy efficiency, scalability, and a massive reduction in greenhouse-gas emissions, sustaining the growth of artificial intelligence (AI). Others are working to harness the quantum tunnelling phenomenon itself in order to lower the energy barrier of the depletion regions in transistors. The main goal is to make transistors that consume negligible power and generate almost no heat.
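To see why tunnelling-based devices are attractive, note that a conventional MOSFET cannot switch more sharply than about 60 mV per decade of current at room temperature (the thermionic limit, sometimes called the “Boltzmann tyranny”), whereas tunnel FETs can in principle beat it. A quick sketch of that limit:

```python
import math

# Thermionic limit on a MOSFET's subthreshold swing:
# S_min = ln(10) * k*T / q volts per decade of drain current.
# Tunnel FETs inject carriers by band-to-band tunnelling rather than
# thermal emission, which is how they can switch below this limit.
k = 1.380649e-23      # Boltzmann constant, J/K
q = 1.602176634e-19   # elementary charge, C
T = 300.0             # room temperature, K

S_min = math.log(10) * k * T / q
print(f"S_min = {S_min * 1e3:.1f} mV/decade")  # ~59.5 mV/decade at 300 K
```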

Now, in the era of quantum computing and GPT, we have empowered ourselves and brought a hopeful future for the upcoming generations. Let’s delve a little deeper into the quantum realm, from Richard Feynman’s 1982 proposal to build a computer based on quantum mechanics up to recent developments like IBM’s quantum computers, Google’s quantum-supremacy announcement, and the ongoing efforts to build fault-tolerant, scalable quantum computers. Quantum computing has the potential to speed up machine learning, optimization, and other computationally intensive tasks that are common in AI research, which could lead to breakthroughs in AI capabilities and applications.

Quantum computing also has a potential impact on the field of electronics: it could drive the development of new materials and devices with novel properties, as well as enable the design of more efficient electronic systems. The key difference between quantum and classical computers is the use of qubits instead of bits, which allows quantum computers to perform certain calculations far more efficiently and to tackle problems that are currently beyond the reach of classical machines.
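To make the bit-versus-qubit distinction concrete, here is a minimal numpy sketch (a simulation on a classical machine, of course, not real quantum hardware). It puts one qubit into an equal superposition with a Hadamard gate and samples measurement outcomes:

```python
import numpy as np

ket0 = np.array([1.0, 0.0])                   # the basis state |0>
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

state = H @ ket0                  # (|0> + |1>) / sqrt(2): equal superposition
probs = np.abs(state) ** 2        # Born rule: outcome probability = |amplitude|^2
shots = np.random.choice([0, 1], size=1000, p=probs)
print(f"P(0) = {probs[0]:.2f}; measured 0 in {np.mean(shots == 0):.1%} of 1000 shots")
```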

However, several crucial challenges must be overcome to make quantum computing more practical and accessible, such as error correction, hardware development, and the creation of efficient quantum algorithms. Many companies are working to build quantum computers, among them IBM, Google, Rigetti Computing, IonQ, D-Wave Systems, Microsoft, and Quantum Circuits, Inc. (QCI). In 2019, Google announced that it had achieved quantum supremacy with its 53-qubit processor, Sycamore.

In conclusion, the journey of human endeavour in technology, from the Enigma machine to the era of GPT and quantum computing, reflects our insatiable curiosity and drive for progress. As we continue to innovate and break boundaries, we must remain mindful of the challenges and ethical implications that come with these advancements. By fostering responsible development and use of technology, we can ensure a brighter, more connected, and sustainable future for generations to come.
