So what is the point of engineering research in academia?
I don’t imagine anyone believes that formalising and standardising the study of the natural world was an unnecessary complication to society. The movement towards empiricism, statistical analysis and inductive reasoning sowed the seeds for the scientific revolution, which has since thrived and blossomed. Alongside these rules of best practice, emerging scientific fields needed an environment, an institution, which would allow practitioners to make the most of their intelligence and insight, and to pass this knowledge on to future generations. And so we have universities: the bastion of truth and inquiry, developing and disseminating ideas that may shape the world. Here we see a congregation of the world’s greatest minds within a boundary which can foster discussion and collaboration, and expedite the passage of science. The freedom found within these walls to investigate novel ideas has immeasurably benefited humanity and our understanding of the world.

However, much of the discovery made in this arena can be thought of as scientific endeavour. I am of the opinion that the role engineering plays here is becoming increasingly confused, misguided or even misplaced. A fundamental idea leading to this conclusion is the conflicting guiding principles of these two disciplines. Science exists as a means to discover the truth of the universe. As theories are conceived and rigorously tested, then developed and refined, we may imagine an iterative convergence towards the underlying truth being sought. Ideas which do not pass muster are discarded, allowing us to move forward (on average). Engineering, on the other hand, can be thought of as an application of science and mathematics to yield a solution to an otherwise intractable problem. However, for every problem there are infinitely many solutions.
As a consequence, the more money and academics you throw at a question, the more “solutions” you are likely to see, as opposed to one singularly refined idea.
It is possible to argue that, just as incorrect hypotheses are eventually discarded in science, ungraceful designs will fall by the wayside in engineering research. Though this might be true in principle, the metrics for academic success are at odds with such a force (more on this below). Another counterpoint is that it becomes harder to judge what is a “good” or “bad” design when the endgame is much more vague, which can be illustrated by comparing academic research to engineering in the wild. Research and development within a company has a much clearer goal in mind, with employees either working directly to improve existing products or, less commonly, doing quasi-blue-sky research to uncover the technologies leading to the next generation of devices. For the former task, the work becomes more constrained, as there are clear specifications to meet, markets to maintain, and manufacturing and budgetary realities to adhere to. Though there are still multiple possible solutions to implement, this additional information helps illuminate the best path forward. And for any given choice there are many moving parts; the consequences of each decision intertwine and feed back upon one another. From this perspective, isolated research relating to only a specific issue within a multi-faceted problem, as can occur in academia, seems disconnected and less useful. Even when considering more fundamental R&D performed in industry, there is still an end goal or application, though perhaps more nebulous, which lies ahead.
I described engineering as an application of science, but it is perhaps better to think of these fields as lying on a spectrum. As we proceed from one end to the other we transition from pure inquiry, to inquiry with intent, to the refinement and optimisation of a concept to reach a goal. Everything on the far left of that spectrum, relating to the science, falls comfortably within the ideals of a university. As we drift across, though, that position becomes less tenable. To give a clearer example I will draw from my own experiences, limited though they may be. The field of neuroprosthetics is a highly interdisciplinary arena, requiring the know-how of electrical and mechanical engineers, material scientists, electrochemists, neurophysiologists, and the list goes on. The work being performed is similarly varied, with common research topics including designing low-power microelectronics to untangle the mess of neural recordings, increasing the density and complexity of electrodes for interfacing with the cortex, and electrochemical and histological examination of these interfaces with biology to make truly stable and functional chronic brain-machine interfaces. Naturally, there are many fundamental scientific questions which still need answering, such as understanding the exact circuits and signals which relay information to control our perceptions and motions, or how best to interface with these circuits to achieve therapeutic targets. Tied to this, we do need ever-improving technologies which help us probe the brain with increased resolution and accuracy. But not all work is tied to such fundamental exploration. There also exists work simply optimising technical design specifications, such as for microfabrication processes or low-power circuitry. I do not mean to belittle this work. What I’m trying to settle in my head is whether it is best pursued through the academic freedom granted in a university environment.
For example, we see comparatively little to no research into graphic, product or UX design, because it is obvious that this work is best left for where the action is happening, where the application is. There are many well-funded long-term projects which do aim to generate a real, tangible, functioning product at the end. Though I would be interested to see what the data says about how many of these move from a successful proof of concept to something commercially viable, as well as how many of the PIs are actually desirous of this.
The reader might state that, regardless of outcome, research into these sorts of engineering disciplines is a good thing, as it adds to the fount of knowledge which may then be drawn upon by those in industry and academia alike. There is truth to this statement, though I believe that the fount is overflowing. As the publish-or-perish, metric-driven culture of academia is ratcheted up again and again, over-reporting of inconsequential engineering improvements feels like an inevitable and undesirable consequence. At the end of this process we are left with an unsortable pile of average ideas, with any truly revolutionary concepts buried and lost. Again, this is not to say zero knowledge is gained and it’s all a waste of effort, but what sort of return on investment is actually achieved in many engineering departments… Is the culture of academic research suitable for this sort of pursuit? I would also happily wager that the majority of young academics would have moved directly to R&D in industry, had there been corresponding positions readily available for them. So why has there been such a boom in engineering research? A contributing factor might be that it appears harder and harder to justify funding blue-sky research (not that it should need any justification) with all the low-hanging fruit long since picked. There is a real drive to be solving the world’s problems, and the return on investment for fundamental science is harder to conceptualise. As such, research becomes more and more practical, requiring the work of engineers getting bogged down in technical details. Has the shift in focus of universities towards practical endeavours resulted in bloated inefficiencies and sucked resources away from what they do best? The reader will be able to point to numerous success stories being spun out of campuses, but we must step back and keep to generalities.
It has certainly been proven before that monumental technical progress can be achieved outside the walls of these ivory towers.
The area of technical development which has had the most profound impact over the last 100 years is, in my opinion, the computing, semiconductor and microelectronics industry. At the heart of all this is the transistor, an electrical switch from which logic gates are constructed, thereby enabling arithmetic, memory, control and, ultimately, computers. The nascent electronics industry revolved around the application of vacuum tubes. These began their journey in Menlo Park, Thomas Edison’s applied research laboratory (not a university), and were further refined into diodes and triodes by Edison Telephone, the Marconi Company, General Electric and AT&T (not universities). The game changer was the development of solid-state transistors in the 1940s at AT&T’s incredibly prolific Bell Labs, a maturation which led to the founding of Shockley Semiconductor, then Fairchild Semiconductor, and then Intel. In the years since, transistor dimensions have shrunk down to 10 nanometres, allowing density increases on the order of a million-fold. Progress is perpetually pushed by the billions of dollars the top companies, such as Intel, Qualcomm and NXP, spend on R&D every year. Naturally, collaborations are formed with universities, but I view this as incidental, as this is simply where a large number of the necessary engineering professionals happen to be currently. In this example we see the industry’s enormous pool of resources, and capitalistic drive, as the primary mover of electronics to the form it is in today.
To be my own devil’s advocate for a moment, I can think of some reasons why academic engineering research could be necessary, though the importance of each will vary from field to field. Firstly, research is expensive. It is a real concern that the extraordinarily cheap yet incredibly lucrative venture of software development has caused a shift in the perspective and priorities of investors. The poignant maxim of start-ups – “hardware is hard” – epitomises this reality and the reason a company or investor is unlikely to throw piles of cash at any sort of moonshot (unless you’re Elon Musk). This is particularly true for more conservative, expensive and risky industries, such as biotech and medtech. Having academics with less of a financial responsibility may indeed help catapult the next generation of ideas into the world. Furthermore, many nascent fields require a certain amount of foundational knowledge or infrastructure before a viable business can be built. If all this research were performed in-house by companies, access to it would be limited by trade secrets and IP (unless, again, you’re Elon Musk). But could the same benefits be achieved if we simply redistributed all the money trickling down to academia to genuinely applied research, untethered by the onus of publication metrics and unsustainable short-term post-doc positions?
Much more could be looked at on this nuanced topic, and I offer no real solutions, only thoughts (this is a blog, so you should have expected inapplicable babble from the get-go). To finish with some anecdotal evidence: when discussing their research projects with fellow students, a consistent theme is that they describe their work as essentially pointless and the overarching goal as unlikely to ever succeed. It is sad to hear such a dispirited and jaded attitude emanating from the future leaders of the academic world, so I truly hope to find answers to the question posed in the title.