A Roll of Dice by Callum Lamont

Luck is common to all. While we curse our misfortune when it works against us, it mostly hides in a cognitive blind spot for the remainder of our experiences. It shouldn't, for we have essentially no control over the world around us.


Biases and Beliefs by Callum Lamont

Prior to the development of the scientific method, humans were largely guided in their actions by customs, traditions and intuitions. The initial inception of ritualistic practices within hunter-gatherer societies provided a means through which knowledge and understanding of the land could be passed forward through generations. As societies grew in size and complexity, so too did these forms of teaching, culminating in some of the better-known theologies of today: Judaism, Islam and Christianity. Less focused on the land itself, the lessons contained within these holy documents gave more consideration to interactions with neighbours and a general framework for the community. It is unfair to uniformly condemn religion as lacking intellectual rigour; however, the interpretive nature of the material, in conjunction with the zealous following it commands, can clearly be an incendiary combination. As a result, the prescription of religion has provided mixed results for humanity, offering both the enlightenment of the Islamic Golden Age and the brutal ignorance of the Dark Ages.


Questioning our choices by Callum Lamont

A unifying concept between neuroscience and physics is how the great questions in each field boldly attempt to unravel the mystery and fundamental nature of reality. Of course the routes follow divergent paths: physics focuses on the externalities, asking what time, space and matter are and how it all began, while neuroscience has internalised the problem. What is free will? What is consciousness? Do they really exist and, if so, how could either arise from a complex, yet conceptually simple, interconnection of nerve cells?


Technobabylon by Callum Lamont

Over its turbulent years, the 20th century yielded a number of technological breakthroughs which laid bare a new paradigm for the next millennium. Here the brute power of mechanical automatons found competition in the more nuanced comprehension of information theory and computation. The gestation of these concepts eventually produced a new model for sharing information and birthed the internet, a final gift to the new world ahead.


Switching Seasons by Callum Lamont

One of my favourite periods of each year is the shift in weather at the beginning of spring and autumn. There is something invigorating about change, though perhaps this is an opinion largely my own (and maybe explains why I've lived abroad a couple of times). I know others who would prefer it to be summer 12 months of the year. Yes, summer is more practical, and we all like some gorgeous weather; however, as I've written before, we quickly acclimatise to our surroundings. After three months of heat, a sunny day doesn't pack quite the same punch as it did previously, leaving me wanting something a little more… complex.


So we are afforded two major adjustments each year, each hitting me in a different fashion. Naturally, the shift from winter into spring has an air of rejuvenation, with the sun finally peeking through the clouds and the greenery turning vibrant. There's an unquestioned optimism for the future. Autumn, on the other hand, takes a different, more melancholic approach. Days grow darker and gloomier, the colour palette expressed by the trees is subdued yet at the same time intensified, and an increasing amount of time is spent indoors. This all sounds like a downer, but the feelings it evokes could be described as analogous to nostalgia: a paradoxical union of contentment and longing, of what was and what is now. Not necessarily a negative feeling, just a bit more nuanced. Of course, our associations with the seasons are also inevitably coloured by the memories corresponding to these periods. To take an empirical approach, the thoughts and emotions accompanying these changes stem from the brain, but it is intriguing to delve slightly beyond the veil of mechanistic neuronal coding to another cause for this effect.


The circadian rhythm is a series of hormonal responses which set our biological clock, keeping our sleep-wake cycles in near sync with the rise and fall of the sun. Even when thrown halfway across the globe, this ebb and flow of hormones is resistant to change, leading to a frustrating disconnect between our internal and external environments. Only with time does our body learn and adapt to its new surroundings. I am curious whether the emotional response described in the preceding paragraphs is also somewhat rooted in a rhythmic tide of hormonal responses. These thoughts are on my mind because, having recently moved to London, I'm getting mixed signals with the most recent change of season. While the odd sunny day does pick me up as one would expect going into spring, I feel an autumnal slant to my disposition as of late. Am I in a more abstract form of jet lag, with my body primed to expect a shift to gloomier weather in the near future? Is there a biological foundation for such an effect, one kept in check over the course of an entire year? Such effects are present in nature, as evidenced by the seasonal hibernation or migration of species. To be sure, seasonal affective disorder is a real and documented condition, whereby decreases in sunlight can alter melatonin and serotonin levels, eliciting depressive symptoms. However, such an acute effect is at odds with the enduring rhythmic response I describe. Though… perhaps it's the general dreariness of English weather merely throwing my senses a curveball, the trajectory of which my stagnant mind cannot yet correct for.

Our resilient minds by Callum Lamont

You've come to a fork in the road. Both routes lead home, though you're in no hurry, it being a sunny day with the whole afternoon clear. In one universe, you swing left, along the bustling high street, taking in the view of the shop fronts in your stride. Your good mood and optimism catch you off guard, and the advertised figure for this week's Lotto winnings encourages you to splurge on a ticket. (If the reader could kindly suspend their disbelief for the remainder of the paragraph.) Against all odds, you win the $6 million jackpot. In another universe, oblivious to the potential future gains tied to the decision before you, you swing right. Your good mood, optimism and blaring earphones have left you unmindful of the surrounding terrain, causing you to misstep and careen down a small, yet significant, stairwell. A fracture to the femur and hip, a permanent limp and prolonged rehabilitation follow.

An evident, yet obligatory, question is how happy you are in the immediate fallout of these scenarios. Very and not at all, respectively, would be a natural answer. To probe this introspection further, consider a year later. The consequences of either event still reverberate through your daily life, in similarly positive and negative directions. However, in spite of what one could rationally argue, it seems that these two versions of yourself are, more or less, as content as you are right now. Known as the hedonic treadmill theory, initial support was grounded in a 1978 study which analysed two groups of people, having either recently won the lottery or become disabled.1 While the concept is still somewhat contentious, the findings of this study (and many since) suggested that the overall quality of life reported by either group tended to recalibrate to near former levels, despite the initial polarising forces affecting their conditions. The results, while surprising in the current context, actually touch on a not-so-surprising core principle of life: it adapts. Almost by its very definition, evolution enforces and optimises for an ability to adapt. Over time, physical constraints and changing environments place an urgency on organisms to develop appendages to overcome such burdens. It should be no less evident to consider a psychological flexibility for change, equally necessary in countering such scenarios. We may not often fit psychology within an evolutionary model, arrogantly considering this at odds with our assumptions about animal cognisance. However, we cannot deny that animals possess varying temperaments, from skittish or aggressive to cooperative and empathetic, and that such behavioural traits have a direct effect on their genetic fitness. Therefore, a capability to regain equilibrium and composure in the face of adverse events, consciousness or not, is in the end unsurprising.

A fascinating aspect is how the hard-coded biology and the more nebulous psychology of an individual both clearly play a role, which provides a tool to sharpen our blurred distinction between these two disciplines. To date, much understanding has been gained of the psychosocial, physiological and genetic processes underlying this cognitive malleability. A more descriptive term for what we are discussing is stress inoculation and resilience. Stress isn't just a 20th century byproduct of the office space. It is a cascade of primal responses and reflexes, coursing through your body and mind, triggered by what your body perceives as danger. The hypothalamic-pituitary-adrenal (HPA) axis is a significant promoter of this effect, driving up noradrenaline and cortisol levels, resulting in increased heart rate, energy release and narrowed focus. While understandable when faced with a predator in the wild, such a response is antiquated for the emotional struggles more relevant to modern-day life. Additionally, while the encounter with the predator is fleeting (regardless of outcome), psychological trauma can be enduring. The stress response is unsustainable, and indeed detrimental to our health.2 As such, mechanisms are in place for its moderation, allowing one to accept, adapt and move forward. For example, distinct molecular actions, such as the role of neuropeptide Y, which opposes the HPA response, have clearly been associated with resilience. Following this, we can now understand how particular genetic alleles can predispose individuals to lack such an ability, making them more susceptible to anxiety disorders, such as PTSD. Despite sounding a bit "new agey", similar beneficial effects are mediated simply through a general ability to maintain positive emotions, which extends to gratitude and humour.3 Recent research has demonstrated the significant neurobiological changes which can be induced by our behavioural and cognitive processes. Positive thoughts can rewire neural circuitry, strengthening reward pathways important in stress resilience.4 Notably, studies have found that primates exposed to moderately stressful events early in life were able to overcome them, and showed a significantly reduced stress reaction to events encountered later in life.3

While important for clinical applications, this knowledge can be appreciated in less drastic circumstances too. Remember changing schools? Your first day at a new job? We are skilled at settling into new surroundings, and I believe it stems from a similar root. It serves one well to make the best of a situation, to view any and every hardship as a learning opportunity. If you willingly wade into the deep end now, it will likely serve you when you're thrown in at another time. Somewhat paradoxically, when faced with hardship, our conscious awareness is likely to hinder our healing, as we ruminate over the sequence of decisions leading to the unfortunate precipice on which we stand. So if you feel the stream of negative reflexes penetrating your thoughts, the best approach may be to divorce yourself from such compulsions, let your mind do what it does best, and look forward to being better for it in the near future.


1. P. Brickman, D. Coates and R. Janoff-Bulman, J. Pers. Soc. Psychol., 1978, 36, 917–927.
2. I. N. Karatsoreos and B. S. McEwen, J. Child Psychol. Psychiatry, 2013, 54, 337–347.
3. A. Feder, M. Haglund, G. Wu, S. M. Southwick and D. S. Charney, in Neurobiology of Mental Illness, eds. D. S. Charney, J. D. Buxbaum, P. Sklar and E. J. Nestler, Oxford University Press, 4th edn., 2013.
4. E. Garland and M. O. Howard, Health Soc. Work, 2009, 34, 191–199.


In Defence of Dogs by Callum Lamont

This is going to be about dogs. Dogs, and in a big bad way. The aim is to, more or less, empirically prove that dogs are the best pet/nonhuman companion you can have. No subjectivity required here. If you prefer cats, great. But you're wrong. Now, if you're curious why, please read on. If you plan on just having a rant at me for being completely biased, partial, skewed and cherry-picking arguments, your comments will fall on stubbornly deaf ears.

So, what makes dogs so amazing? Well, they are adorable, but I'm going to build my campaign on something less shallow. Most present in our minds are the useful services they provide. There is the commonplace use of guide dogs, explosive detection dogs and, much to the inconvenience of select travellers, drug sniffer dogs. These tasks require a dedicated companion, intelligence and a hell of a nose. What else is amazing, and what makes them, as mentioned previously, the best possible pet/nonhuman companion, is the intertwined history and social evolution occurring between their species and ours. It is believed that dogs came into being somewhere between 15,000 and 100,000 years ago. The presumed origin was the result of our domestication of wolves (why someone decided that would be practical, appropriate and appreciated by others I could not estimate). Re-examining the spread of dates, this could mean domestication occurred during Homo sapiens' angsty hunter-gatherer phase, way before the development of (relatively) sophisticated agriculture-based societies. Additionally, genetic studies indicate a number of separate domestication events, whereby a convergent evolution was achieved across a number of cultures.1 So we can see that the origin of dogs was remarkable and, apparently, destined to be. One last point worth mentioning is that, through the rise of agriculture, while other species have since been exploited as natural, fleshy vending machines (bacon, spare ribs, backstraps and milk, please), dogs largely appear to have been raised and bred for a singular purpose: our happiness. Dogs never toiled in the fields and, for the most part, never sacrificed themselves for human sustenance. They stood beside us, and still do, as members of our tribes. Our embrace of this species, over such an extended period, with selective breeding for particular traits, has led to some staggering connections being forged between us.

One would naturally assume chimps, being just one rung below us on the genetic ladder, would share many traits with respect to intelligence, communication and social order. There are similarities to be found here, yet our attempts to begin any cordial inter-species dialogue have been rebuffed.2 In contrast, dogs, despite their differences, can impressively interpret a number of social cues and gestures from humans. Perhaps the most fundamental is the ability to understand the concept of pointing to reference an object out of reach. It sounds trivial, but the development of such actions highlights a conceptual leap to more abstract notions (i.e. this represents that), not entirely dissimilar to the development of language and mathematics, which has benefitted our species so much. One explanation is that this understanding is the result of our ever-increasing proximity in society, where an adaptive behavioural response to our motions and body language was of great benefit to our four-legged friends. Dogs can also reciprocate these interactions and cue human attention when needed. For example, if food is hidden or placed out of reach, canines are able to alert us to the presence of said snack and their desire for it. Following on from this, there has been debate over whether dogs may possess "theory of mind", in which they are able to adopt the perspective of humans and acknowledge their attentional state.3 Evidence supporting this is the increased likelihood of these pets to snack on forbidden food if they notice their owner's focus is directed elsewhere (dog owners are currently nodding their heads in agreement).

Our love and appreciation of this family pet has also led to unexpected benefits. It has become increasingly common for drastic medical interventions, including chemotherapy and the fixation of prosthetic limbs, to aid in the health and well-being of dogs. Though such a concept is bewildering to many, there is great potential here for mutual benefit. Indeed, due to a more comparable physiology, immunology and genetic diversity in populations, studying such treatments at this level can yield insights benefiting our own medical technology. Cancer therapeutics may stand to gain a lot in this context. Between the two species, tumour initiation and progression are influenced by similar factors, resulting in homologous cancer histology, gene expression and response to therapy.4 The current gold standard, in which we induce synthetic tumours in lab rats, falls short in all of these respects.5

Unsurprisingly, the effects of our increasingly processed and westernised diet have also trickled down to our canine companions. A recent genetic study demonstrated that dogs have acquired a number of genes, not possessed by wolves, relevant to the digestion of starches.6 This mirrors similar changes occurring across human populations with the cultivation of crops such as wheat and barley. Such findings demonstrate the parallels to be found between our species, as well as remind us how deeply these animals are woven into our homes and families. By and large, we treat them as one of us. We share our snacks, and would likely share a beer if it was appropriate (it isn't, please don't). The haphazard comings and goings of cats speak to their separatist, dare I say anti-social, nature, preventing us from ever developing a sense of trust and camaraderie. So dogs are pretty remarkable, and while I noted at the start that I would remain empirical, their best trait is that they have so much love to give, and love receiving it in turn. They are deeply integrated into our family units, often preceding the first-born child as a form of practice. They care and consider. They are ever accepting of our hugs and upwardly inflected doggy talk. They may nip but rarely scratch, and never with malice or agency. Emotion shows in their eyes, and they are one of few animals who share a common desire for play, not just survival, ever content to enjoy the simple pleasures of a belly rub. They are our best friend.


1. R. K. Wayne and E. A. Ostrander, Bioessays, 1999, 21, 247–257.
2. H. S. Terrace, L.A. Petitto, R. J. Sanders and T. G. Bever, Science, 1979, 206, 891–902.
3. M. A. R. Udell and C. D. L. Wynne, J. Exp. Anal. Behav., 2008, 89, 247–261.
4. M. Paoloni and C. Khanna, Nat. Rev. Cancer, 2008, 8, 147–156.
5. T. F. Vandamme, J. Pharm. Bioallied Sci., 2014, 6, 2–9.
6. E. Axelsson, A. Ratnakumar, M. L. Arendt, K. Maqbool, M. T. Webster, M. Perloski, O. Liberg, J. M. Arnemo, A. Hedhammar and K. Lindblad-Toh, Nature, 2013, 495, 360–364.

Everything is complex (awesome) by Callum Lamont

Some things are just complex. Everywhere you look you can see more examples. This most recently came to the fore of my mind during a bit of background reading for my PhD. Here I was looking into the encapsulation and protection of integrated circuits. However, before I could investigate that, it would be useful to read about their fabrication techniques: namely, how the components constructed on silicon wafers are ultimately made useful via the deposition of precise metal tracks (which are unfortunately quite susceptible to corrosion). But wait, how are those nanoscale structures (transistors, capacitors, diodes etc.) patterned into the silicon wafer in the first place? OK, so using photolithography, a mask and a process to implant dopant impurities in the surface, the functional interfaces within the material are created, which then control electron flow. What's the photolithography etching off, though? Ah, there's a thin silicon dioxide layer formed in the previous step, to insulate and isolate the material, as well as to precisely allow the selective diffusion of the dopants just mentioned. This is pretty easy to create: you just need a furnace and exacting control over the flow of the necessary constituent gases. But before we do any of this, we need an ultra-flat, perfect wafer of single-crystal silicon. First, this requires electronic-grade silicon of ultra-high purity (too many steps to bother explaining). Then we need to turn this into a single crystal, requiring all crystallographic defects within the microstructure to be removed, including even minute atomic gaps within the lattice. The process essentially requires us to melt the silicon and then, carefully and precisely, allow it to solidify in such a way that the thermodynamic and kinetic gods grant a single crystal to emerge. Production of a 3 metre ingot will typically take several hours, after which it can be sliced into wafers and polished to produce an immaculately flat surface, using abrasive powders with diameters of around 1 micron followed by a chemical etch.
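
For a rough sense of scale, here is a back-of-envelope sketch of how many wafers such an ingot could yield. The thickness and kerf figures are illustrative assumptions of mine (they vary with wafer size and sawing process), not specifications:

```python
# Back-of-envelope: wafers sliced from a single-crystal ingot.
# All figures below are assumed, illustrative values.

INGOT_LENGTH_M = 3.0        # the 3 metre ingot mentioned above
WAFER_THICKNESS_UM = 775    # a common thickness for 300 mm wafers
KERF_LOSS_UM = 150          # material ground away by the saw per cut (assumed)

slice_pitch_um = WAFER_THICKNESS_UM + KERF_LOSS_UM
wafers = int(INGOT_LENGTH_M * 1_000_000 / slice_pitch_um)
print(f"~{wafers} wafers per ingot")  # ~3243
```

In other words, hours of carefully choreographed solidification buy you a few thousand blank canvases, each with every subsequent fabrication step still ahead of it.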


Then of course there is the degree of control and precision required of all the equipment used in patterning the 14 nm structures on the wafer. Not to mention the complexity and conceptual barriers in interfacing the simple operations of a billion transistors (either on or off, 1 or 0) with higher-level software we can intuitively grasp. I point all this out so that when your phone takes slightly longer than expected to load your emails, you may be a little more appreciative of the countless man-hours and ingenuity that allowed us to get to this point. As an aside, why is it so easy for us to quickly transform such complex advancements into assumed conveniences, then into unquestioned rights? How come when one is given an iPad, they generally take the time and effort to understand how to work it, but not how it works? Perhaps it's the incremental advances, whereby an iPad more or less operates in the same fashion as an iPhone, which is like a smaller computer/larger cell phone, which is the offspring of a sequential line of ever-improving technologies. I am definitely guilty of this practice. Despite having driven a car almost every day for the last 5 years, I have only the broadest understanding of how they actually work. I suspect there's some evolutionary pressure to minimise our cognitive load where possible (aka being lazy).
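
To make that transistor-to-software leap slightly more concrete, here is a minimal sketch of the compositional trick underlying real hardware: everything below is built from a single on/off primitive (NAND), with function names of my own choosing:

```python
# NAND is functionally complete: every other logic gate, and hence
# arithmetic itself, can be composed from it alone.

def nand(a: int, b: int) -> int:
    return 0 if (a and b) else 1

def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))
def xor(a, b):  return and_(or_(a, b), nand(a, b))

def half_adder(a: int, b: int):
    """Add two bits: returns (sum_bit, carry_bit)."""
    return xor(a, b), and_(a, b)

print(half_adder(1, 1))  # (0, 1), i.e. 1 + 1 = binary 10
```

Scale this composition up through adders, registers and instruction decoders, and somewhere around a billion transistors you arrive at something that can load your emails.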


In spite of this rant, at least we (I mean someone else) know how these things work. Matters get more grim once you start venturing outside this bubble of technological order. Take the economy. Now think about how it really functions. It is the sum influence of millions of people buying, selling, saving and fretting (though not in an entirely egalitarian fashion) through an opaque and entangled web of interactions. Another layer of obfuscation is imposed when considering that not all players are acting in the most rational way. Some people have a better grasp (or at least think they do) of these interactions and can use it to their advantage; the end result being that we now have a few somewhat parasitic industries as a byproduct of this convoluted system of globalised trade. Other people take this assumed knowledge and try to apply it for good, helping guide policy decisions to improve employment and our quality of life. However, in the end, I view economists akin to weathermen, trying to forecast the dynamism of an intricate system where the whims of chaos theory have long since overthrown their initial presumptions.


To summarise so far: on the scale of complexity, the haphazard meanderings of many seem to outweigh the directed and focused efforts of a few. However, the most intractable system doesn't involve even more people, just the one. You, or more specifically, your body. In my opinion it is the most complex, intertwined arrangement of information which, in its tortuous pathway of feedback loops and self-direction, miraculously results in you being here today. However, despite such a forbidding task, human ingenuity has once again allowed us to wrestle with this concept with a certain degree of success. A great deal can already be said for the advancements in genetics and genomics. Considering Crick and Watson only proposed the DNA double helix a touch over 50 years ago, it's surprising how far we have come. The ever-improving literacy of our genetic signatures may allow us to eradicate many human pathologies. Headway has also been made in some other interesting avenues, such as creating crops which generate greater yields and are less vulnerable to our increasingly volatile climate. Another particularly creative use of this biotech is as a tool for manipulating mosquito populations in the fight against malaria. As usual, these technologies are double-edged, with much caution needing to be paid to the serious ethical implications involved. This is a level of complexity I'm not even bothering to touch, but if you want to use your imagination I'll quickly point out that prenatal testing is getting to the stage where mothers could receive a blood test, from which we can reconstruct the genome of the foetus. What individuals do with this information...


Now, for all our mastery of the genome, we're not significantly closer to fully apprehending its secrets than we are the economy's. Sure, we have some broad, big-picture ideas of how this machine operates; however, many of the notable tools we use for its manipulation (i.e. restriction enzymes, CRISPR) have come about more by chance than genius. Don't be fooled, we are not creating these tools from scratch. They are derived from bacteria, in which they originally served as a line of defence against viral invaders, and have since been repurposed for our ends. It's like an alien civilisation leaving us cars and all the tools necessary for their construction, and from that we could develop… say… trucks (or maybe only slightly different cars). That aside, the great challenge is not manipulating DNA, but understanding it. Each cell contains 3 billion base pairs, of four different varieties, ordered (to varying degrees) into genes. Genes which can play multiple complex, yet seemingly unrelated, roles. Genes to express other genes. Sequences which promote or suppress the expression of genes leagues down the genome from where they reside. And at this stage we're largely talking about a system contained within itself. We can delve down the rabbit hole further and start considering the effects of environmental factors (e.g. stress and obesity) that can affect our epigenetics, which can then be passed on through generations.
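
As a concrete taste of what one of these borrowed bacterial tools does: the restriction enzyme EcoRI cuts DNA wherever it encounters the recognition sequence GAATTC, so locating its cut sites is, computationally, nothing more than a string search. A minimal sketch (the toy sequence below is invented for illustration):

```python
# Restriction enzymes cleave DNA at short, fixed recognition sequences.
# EcoRI's site, GAATTC, is its own reverse complement, so it appears
# at the same position on both strands of the double helix.

def find_cut_sites(dna: str, site: str = "GAATTC") -> list:
    """Return the start index of every occurrence of the recognition site."""
    return [i for i in range(len(dna) - len(site) + 1)
            if dna[i:i + len(site)] == site]

sequence = "ATGAATTCGGCTAGAATTCCTA"  # toy sequence, not a real gene
print(find_cut_sites(sequence))       # [2, 13]
```

The enzyme performs this search with no notion of indices or strings, of course; the point is that the tool itself is simple. It is the genome being searched, with its layers of overlapping meaning, that holds the complexity.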


A large degree of the aforementioned complexity arises from the feedback nature of the system, in which the end stage of this stream of information processing (enzymes and other proteins) can then act back on the initial template. And despite my derision, at least we're making some semblance of headway in this field. There is another aspect of the human body that remains largely encrypted: cognition and consciousness. Similar to DNA, one can think of the brain as structured into different layers of information processing. In the genome we have the base pairs, which cluster into threes (called codons), which then code for the amino acids in proteins. The structuring of information in the brain is somewhat more complex. First we have single neurons and their interconnections, then we have... something, then we have ideas, thoughts, feelings and desires. And surely there must be something in between, as neurons act in a very mechanistic sense, operating by simple physical rules and cues with regard to their inputs from other neurons, and with an output of either fire or not fire. Obviously these neurons are not self-aware, but how can they coordinate to create a system which is? We can make one step in the right direction by comparing this structure of information processing not to a biological counterpart, but to a digital one. As mentioned earlier, the base material foundation for electronics and computation is only one party in this tango of complexity. The innovation really lay in developing a method of translating its very simple output (0 or 1) into higher-order software. This is aided by compilers and higher-level languages, which begin abstracting raw sequences into ideas or concepts more relatable to a human developer. Naturally, no one looking at the seemingly random stream of 1s and 0s output by a computer would expect to extract meaning from it directly. Following this line of thought, it is supposed that if we step back a degree there may be an analogous higher-order pattern of signalling between clusters of neurons, which carries information in a more relatable form. It has also been proposed that such higher-order signalling in the brain may feed back on itself to influence subsequent neuronal firing and thought patterns, in the same way that enzymes act back on the DNA which created them.
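
A small sketch of that layering in practice: the same 32 bits read as noise at one level and as structure at another, depending entirely on which interpretive layer is applied:

```python
import struct

bits = 0x42280000                     # one arbitrary 32-bit pattern
raw = struct.pack(">I", bits)         # the same pattern as 4 raw bytes

print(f"{bits:032b}")                 # 01000010001010000000000000000000
print(list(raw))                      # as bytes: [66, 40, 0, 0]
print(struct.unpack(">I", raw)[0])    # as an unsigned integer: 1109917696
print(struct.unpack(">f", raw)[0])    # as an IEEE-754 float: 42.0
```

Nothing about the bit pattern itself says "42.0"; that meaning exists only at the layer that agrees to read it as a float. The conjecture above is that thoughts may stand to neuronal firing in a loosely similar way.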


So if thought, ingenuity and consciousness all arise from a complex representation of information, which at its core follows a very simple, reproducible, mechanistic procedure, then surely such processes are transferable to a nonbiological medium, say a computer chip. If possible, then I believe true artificial intelligence will eventually be achieved. Now, you may have heard a lot about AI in recent times, particularly with the advent of deep learning and Bayesian networks. However, the more I read about these, the further I think we are from reaching true AI. These techniques are, at their core, pretty dumb. They are becoming more human-like than in recent years, but I would define intelligence as a global flexibility: being able to go from absolutely nothing to eventually extending a topic past its original application. Think about how you use previous rules in language to create new words, maybe for comedic effect or maybe to help articulate your point. Additionally, we can also consider those who hear this use of language and, by drawing upon a vast network of discrete and distributed knowledge, piece together the implied meaning. This is so quintessentially human that not even all humans can do it all the time (there's always one mate who sheepishly asks for clarification). Such flexibility, in adding, combining and reinterpreting previous concepts, in reasoning outside the existing framework of thought, is what I feel underpins true intelligence. Now, none of this is an argument that we cannot create AI; it's just why we don't really appear that close right now. Ultimately, it boils down to the fact that it's just too bloody complex (but we're still awesome).