Wednesday, July 9, 2025

Is Contemporary AI a Mature Engineering Discipline?

 

At a recent industry meetup, a thought-provoking question emerged: Has Artificial Intelligence (AI) evolved into a mature engineering discipline? The discussion was lively but largely focused on AI in isolation. That conversation stuck with me, prompting a deeper dive—not just into AI, but into how engineering disciplines as a whole mature over time.

As an electronics engineer by education and a software professional by trade, I’ve had the privilege of witnessing the evolution of two of the most impactful fields of the last century. Exploring their journeys offers valuable context for assessing where AI stands today.

Electronics Engineering: From Breadboards to Modular Mastery

The early days of Electronics Engineering involved manually designing circuits on paper, followed by prototyping on breadboards using discrete components—resistors, capacitors, diodes, and transistors. Once a design was validated, it was implemented on a fabricated Printed Circuit Board (PCB) for practical use.

The advent of Integrated Circuits (ICs) shifted the focus away from discrete components to chip-based solutions, enabling the development of more complex and compact devices. With the introduction of electronic design automation (EDA) software, breadboards became obsolete. Engineers could now design and simulate circuits virtually, while innovations in PCB manufacturing allowed for multilayer, high-density layouts, driving massive growth in the electronics industry.

As circuit complexity increased, the repairability of boards diminished. Faulty boards were more cost-effective to replace than repair, giving rise to a module-based approach where entire subassemblies—power supplies, heatsinks, sensors—were treated as replaceable units. This replaceability, coupled with the establishment of standardized design processes and professional certifications, reflects the discipline's maturity.

This evolution from component-level design to modular systems is one axis of engineering maturity. Another is the educational background required for practitioners. Initially, expertise in electronics demanded a PhD-level understanding. Today, a broad spectrum of professionals—from electronics engineers and science graduates to high school hobbyists—can contribute meaningfully to the field.

Software Engineering: From Binary to Low-Code

Software Engineering has experienced a similarly dynamic journey. Early programming involved writing binary machine code, followed by assembly language. The arrival of higher-level languages such as FORTRAN and COBOL, and later C, marked a turning point, ushering in increasing levels of abstraction and domain-specific specialization.

Today, we have dedicated languages for specific tasks: HTML, CSS, and JavaScript for front-end development; SQL for data manipulation; and many others. Libraries and frameworks—like Java’s Struts for web applications—further simplified the development of complex systems. The latest trend is toward low-code and no-code platforms, allowing users to configure applications through visual interfaces rather than writing traditional code.

Metrics in Software Engineering have also matured. Early measures focused on hardware utilization and code-level attributes such as size and design complexity (e.g., the Chidamber & Kemerer object-oriented metrics), whereas contemporary frameworks such as DORA emphasize delivery performance, business impact, and user satisfaction.
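To make the contrast concrete, here is a minimal Python sketch of two DORA-style measures, deployment frequency and change failure rate, computed over a purely hypothetical deployment log (the data and field layout are illustrative assumptions, not drawn from any real system):

```python
from datetime import date

# Hypothetical deployment log: (date of deployment, did it cause an incident?)
deployments = [
    (date(2025, 6, 2), False),
    (date(2025, 6, 9), True),
    (date(2025, 6, 16), False),
    (date(2025, 6, 30), False),
]

# Deployment frequency: deployments per week over the observed window.
days_observed = (deployments[-1][0] - deployments[0][0]).days or 1
deploys_per_week = len(deployments) / (days_observed / 7)

# Change failure rate: share of deployments that led to an incident.
change_failure_rate = sum(failed for _, failed in deployments) / len(deployments)

print(f"Deployment frequency: {deploys_per_week:.2f} per week")
print(f"Change failure rate:  {change_failure_rate:.0%}")
```

Unlike a code-complexity score, both numbers describe how software is delivered and operated, which is what allows them to be tied back to business outcomes.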

 

Innovations in storage media, from punched cards and magnetic tapes to floppy disks, hard disk drives (HDDs), and solid-state drives (SSDs), have further supported the field's ability to handle ever-larger programs and datasets, reflecting its growing maturity and capacity to address societal needs.

On the education front, software engineering has democratized significantly. What once required a PhD is now accessible to computer science graduates, electrical engineers, general science graduates, and even self-taught individuals and high school students.

Contemporary AI: Powerful but Early in Its Journey

In contrast, contemporary AI remains in its early stages of development. The field demands a deep understanding of complex mathematical concepts—such as linear algebra, statistics, and optimization—alongside specialized programming skills in languages like Python and frameworks like TensorFlow or PyTorch. This high entry barrier mirrors the educational requirements of early Electronics and Software Engineering.
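To illustrate that barrier, even a toy example in PyTorch presumes comfort with tensors, loss functions, gradients, and iterative optimization, ideas drawn straight from linear algebra and calculus. The sketch below is a minimal illustration only, fitting a line to synthetic data, and is not intended as production code:

```python
import torch
from torch import nn

# Synthetic data: noisy samples of y = 2x + 1.
x = torch.linspace(-1, 1, 100).unsqueeze(1)
y = 2 * x + 1 + 0.1 * torch.randn_like(x)

model = nn.Linear(1, 1)                        # one weight, one bias
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.MSELoss()

for _ in range(200):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)                # forward pass + mean squared error
    loss.backward()                            # gradients via automatic differentiation
    optimizer.step()                           # gradient-descent update

print(model.weight.item(), model.bias.item())  # should land close to 2 and 1
```

Each line leans on a concept (tensor shapes, autograd, an optimizer's learning rate) with no counterpart in, say, everyday web development, which is precisely the entry barrier described above.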

 

AI is characterized by rapid innovation, evidenced by the proliferation of research papers and the emergence of new standards, such as the Agent Name Service (ANS) and the Model Context Protocol (MCP). Experimentation with novel hardware architectures and internal communication protocols (e.g., DeepEP by DeepSeek) underscores the field’s dynamic nature. However, current metrics in AI remain narrowly focused on technical aspects, such as hardware utilization, code efficiency, and the accuracy or "truthfulness" of outputs, rather than broader business or societal impacts.
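A representative example of such a narrowly technical metric is plain accuracy: a comparison of model outputs against reference labels that says nothing about downstream business or societal value. A minimal sketch, using hypothetical predictions and labels:

```python
# Hypothetical model outputs and ground-truth labels.
predictions = ["cat", "dog", "dog", "cat", "bird"]
labels      = ["cat", "dog", "cat", "cat", "bird"]

# Accuracy: fraction of outputs that match the labels.
accuracy = sum(p == t for p, t in zip(predictions, labels)) / len(labels)
print(f"Accuracy: {accuracy:.0%}")  # 80% -- a purely technical measure of the system
```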

 

Significant challenges persist, including the need for explainable AI to ensure transparency in decision-making and the ethical implications of deploying AI systems. These factors, combined with the field’s reliance on cutting-edge expertise and its evolving foundations, suggest that contemporary AI has not yet reached the maturity of its engineering counterparts.

Conclusion

While contemporary AI has made remarkable strides, it has not yet achieved the maturity seen in Electronics Engineering or Software Engineering. Its dependence on advanced technical knowledge, the rapid pace of research, and the predominance of technical metrics all indicate a discipline still in its formative stages. By contrast, Electronics and Software Engineering have developed standardized practices, broadened educational accessibility, and shifted their focus to encompass broader impacts—hallmarks of mature engineering fields.

 

To advance AI toward maturity, the community must prioritize several key areas: investing in education and training programs to widen access, establishing industry-wide standards to stabilize practices, and reorienting metrics to measure societal and business value alongside technical performance. Only through such efforts can AI fully evolve into a robust and mature engineering discipline.

Tuesday, July 8, 2025

Should we fear AI?

 

An analysis encompassing social media, writable internet platforms (blogs, podcasts, vlogs, and the like), and traditional media outlets (newspapers, magazines, TV, and so on) reveals a substantial volume of content that instills fear regarding Artificial Intelligence (AI). This article explores the origins of this apprehension and assesses the validity of these concerns.

Before proceeding further, it is essential to establish a clear definition of AI.

Artificial Intelligence refers to a technology designed to autonomously identify patterns within underlying data, patterns that its creator does not explicitly predetermine.
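To make this definition concrete, consider a minimal, hypothetical sketch in Python (scikit-learn is used purely for illustration): the classification rule is never written into the program; the model infers it from labelled examples.

```python
from sklearn.tree import DecisionTreeClassifier

# Hypothetical features for a handful of messages: [mentions_prize, asks_for_money]
features = [[1, 1], [1, 0], [0, 1], [0, 0], [1, 1], [0, 0]]
labels   = [1,      0,      0,      0,      1,      0]      # 1 = spam, 0 = not spam

# The creator never encodes the rule "prize AND money request => spam";
# the model identifies that pattern from the data on its own.
model = DecisionTreeClassifier().fit(features, labels)
print(model.predict([[1, 1], [0, 1]]))  # -> [1 0], the learned pattern applied to new data
```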

To comprehend the underlying reasons for apprehension, consider the following points:

1) Rate of Change: As technologies are developed and absorbed into society ever faster, the rate at which new technologies are adopted keeps accelerating.

[Illustration: the varying timelines it took for different technologies to achieve widespread adoption in society, showing the increasing rate of adoption.]

The most popular AI tool, ChatGPT, took only about five days to reach a user base of one million.

The evident acceleration in the adoption rate of AI resembles the velocity of a rocket, and this rapid pace of change is causing considerable discomfort among many individuals.

2) Globalization: Historically, technologies were often adopted in specific geographic regions before gradually spreading to others. However, the globalization of businesses, media, and the widespread accessibility of the internet have dismantled these geographic silos concerning technology adoption. Consequently, technological adoption in global society has witnessed an unprecedented acceleration. This phenomenon has not only eliminated staggered adoption across the globe but has also led to a rapid adoption rate without sufficient implementation of effective regulatory and security safeguards.

3) Social Media: In recent times, a few significant changes have occurred in society and in the technological space.

a. The advent of social media and the availability of smartphones to the masses have enabled everyone to express ideas and make them available to the whole world.

b. The algorithms of social media platforms amplify content that sustains engagement (Ref: https://theconversation.com/social-media-algorithms-warp-how-people-learn-from-each-other-research-shows-211172).

c. Earlier, media was essentially controlled: only a few had the privilege to disseminate their views.

This paradigm shift has enabled everyone to express their views irrespective of their depth of subject knowledge. Coupled with the human desire for 15 minutes of fame (https://en.wikipedia.org/wiki/15_minutes_of_fame), it has encouraged many people to jump on bandwagons.

 

4) Herd Mentality: The emergence of a directional trend in the media often triggers a herd mentality, where individuals tend to follow prevailing narratives. Moreover, numerous religious and spiritual leaders perceive an opportunity to promote their agendas, whether consciously or unconsciously. The dominance of social media further amplifies this effect, as traditional media outlets increasingly draw cues for opinion-making from the dynamics set in motion by social media platforms.

5) Fear-mongering: Fear serves as a potent motivator and forms the foundation of fearmongering surrounding AI. It is noteworthy that a considerable number of individuals who express opposition to AI are, paradoxically, among the foremost users of AI in their own products and services. Conducting surveys among this demographic could reveal intriguing insights into the apparent contradiction between aversion and practical utilization.

6) Quantum leap in compute and Big Data: The technological progression of AI is underpinned by two key phenomena: the accessibility of cost-effective computational power and the emergence of Big Data technologies. Even before the advent of contemporary AI, and of the Deep Learning paradigm in particular, these two phenomena had laid the groundwork for technologies aimed at uncovering patterns within underlying data.

Indeed, AI stands as a powerful tool, akin to a new entrant in the technological landscape. It necessitates substantial refinement to ensure security, user-friendliness, and robustness. From my perspective, contemporary AI represents an initial iteration in a series of tools that are yet to be invented and perfected.