2024-09-08 · Updated on 2024-10-13

A place for easy, regular and imperfect logging, usually from my phone.


2024

Richly Apart, Mundanely Aware

October 13th

I still have a deep affection for universities, teaching and science.

There are prominent examples today of individuals who seek to undermine institutions that they have left behind.

Quitting one thing or transitioning to another is a common life moment. Rarely is conspiracy or persecution the cause. Life intervenes in a variety of ways. New interests develop. Values shift. Tiredness sets in.

To tie oneself to something, then stand apart from it, is a rich experience. One can preserve a deep appreciation without the devotional blindness.

There are too many voices trying to destroy or remake what they have abandoned, instead of giving their unique perspective and helping people understand.


Any given institution is full of mundane problems. It would be better if they did not exist, but they are too numerous to contain and too boring to hold attention.

I taught in a university program that prized positive employment statistics for its graduates. Understandably, our students wanted jobs and there were corporations ready to provide them. The problem was never that a company said "stop teaching that" or "suppress criticism of this practice". I'm sure this kind of pure corruption happens but I never saw it.

Well-meaning professors would simply ask companies "what should our graduates know?". The response would be that a student should know such-and-such a machine, or this-and-that technique. Naturally, they did not request an emphasis on challenging orthodoxy, questioning norms or pioneering new ideas. They shouldn't have had to either.

The professors wanted to prepare students well for graduate positions. The students wanted to know what was needed to succeed in their jobs. Pushing against this rigid structure would cause anxiety. Professors: why wasn't the time spent on this novel concept used for skills development? Students: why are you making me aware of criticisms that relate to my chosen career path?

For me, learning to push buttons and fill containers was not an ideal use of a four-year university degree. These skills should be taught on-the-job or in shorter, cheaper programs. Our role as academics, I thought, was to expand the notion of what is possible, show how to develop a new idea and reveal how discovery happens. With this philosophy, I had a lot of fun classes, and hopefully I blew a few minds, but there was always the pressure to show how the buttons get pushed and the containers get filled.

Occasionally I would hear murmurs from people that some tool or concept that was being taught "was not used in industry". I would respond that — well — it should be, or maybe it is being applied in secret, or perhaps a student in the future will be the first to apply it successfully.

Turning, Standing, Bagging, Drawing

October 10th

We would harvest peat ("turf") when I was young. It was monotonous and backbreaking work. There were multiple stages across different days: cutting, turning, standing, bagging, drawing.

The cutting would be handled by a farmer. We then turned over the dozen or so long lines of turf so that the wet underside could dry. Another day we would stand the turf in little castles to improve the drying rate. After that we would fill hundreds of bags with the dried turf and leave the filled bags lining the ground. Finally, we would each put three or four bags in a wheelbarrow at a time and draw them one-by-one to a bank at the edge of the bog. These would later be dumped into a large trailer to be taken home.

I always found the peatland landscape ("the bog") incredibly bleak. Each stage had its own unique pains and frustrations. There was balancing a full wheelbarrow on a slippery plank sunk into a patch of wet marsh. There was arching over the ground trying to make mushy pieces of turf form an orderly castle.

Covered in dirt and exhausted, we would break for tea and sandwiches, before starting in again.

Unfalsifiability in appropriately non-scientific contexts can still be a problem

October 9th

Popper used falsifiability as a criterion for demarcating science from non-science. One of the characteristic properties of science, he argued, is that its statements can be falsified. A strong theory may survive attempts at falsification while remaining falsifiable in principle. This runs counter to the intuitions of many non-scientists, who consider the strongest ideas to be those that are certain and irrefutable. Falsifiability is commonly discussed in the context of pseudo-sciences that have some aspiration to be recognised as scientific, or at least as good as or better than whatever science they seek to displace.

Practitioners in some domains may have no intention of being scientific and it would be absurd for us to expect them to be scientific in all matters. While we should not expect someone writing about software, food or business to always do so as a scientist would, we should still be wary of unfalsifiable statements. For example, if a person claims that societal progress halted in the 1960s, or that video games have ceased to be interesting, or that manufacturing has been perfected as far as it can go, or that they have a framework that is the optimal conceptual foundation from which to solve a problem, it can be helpful to ask what this person takes to be 'progress', 'interestingness', 'perfection' or 'optimal', and whether it is even possible, within the framing of their statements, to convince them or their followers otherwise.

At the root of the resultant impasse is typically a studied ambiguity of phrasing, which maximises the scope of the statements and renders them capable of absorbing all counter-arguments; this is usually coupled with a rejection of precising definitions.

Acceptable pseudoscience

October 8th

It seems to be broadly acceptable to express pseudoscientific views about food and nutrition. I've met several people with skeptical dispositions, who worry about the circulation of conspiracy theories, and may even have some scientific training, but will nonetheless advocate for strange and baseless food regimens. When pushed, they may speculate on the real motivations of the experts who undermine their beliefs. Misinformation and false promises about nutrition appear every day in legitimate media outlets. When a celebrity makes millions from bullshit food supplements, their tenacity and opportunism are admired. It is understandable that misinformation about diet worries us less than misinformation about vaccines, for example, as the dangers are usually less immediate. If we as a society, however, consider there to be long-term, population-wide issues relating to diet and health, it would be preferable if we did not invest so much in nonsense and instead tried to solve the problem. Unfortunately, once "consumers" start believing in nonsense then (1) it is potentially marketable, (2) it becomes the subject of techno-scientific investigation and (3) it receives media attention for (2), which can lend premature legitimacy to (1). People get outraged when a section of the population believes misinformation about politics or medicine, but what should we expect if false opinions and anti-expert sentiment are mainstream in so many other domains, including but not limited to food and cosmetics? It's a structural issue: believing in one kind of nonsense primes us to believe in other kinds; investigating nonsense can lend credence to nonsense; reporting on nonsense can draw attention to nonsense; applying technical knowledge to nonsense can optimize nonsense.

October 5th

The liminal space as a cultural phenomenon has always interested me as a kind of popular metaphysics. Mundane things can be looked at anew for different reasons. Science helps us find underlying mechanisms. Spirituality... transcendent meanings. Art... subjective interpretations. The liminal space is a metaphysical model of a type of space. It only says: there are quotidian spaces that you encounter which can be viewed as having a particular quality of betweenness (note: the scholarly interpretation is much richer). It says nothing of causal explanations or religious epiphany. While the liminal space came to inform a certain A E S T H E T I C, it could be appreciated without any appeal to aesthetics. There are other metaphysical concepts that can be similarly applied. The heterotopia, for example, a space which is characteristically other. More commonly known, the imagined future place: a utopia or dystopia. Rather than a new faddish obsession with one or the other, the liminal or heterotopic, it would be exciting to develop our general capacity to produce and apply new conceptual spaces.

Science made me smoke

October 1st

In my first week of secondary school I was offered joints and cigarettes at multiple points. I had no interest at the time. Unusually, I didn't smoke until about ten years later when I did a PhD. For whatever reason, I did my best work at the weekend. Security let me inside the laboratory and it was nice and empty: I could focus on my work. There was no food or coffee, however; sometimes I would bring lunch but it took planning and eating a sandwich in a lab was unpleasant anyway. Often I had to babysit experiments and couldn't leave for long. So I started smoking. It was the most practical and convenient vice. Small, portable and only moderately intoxicating. After whatever experiment I was doing, I would jam the back door open and smoke in the grim, empty carpark. I began to smoke in other places too; reading a book with coffee and a cigarette was a particularly special combination. They were my four smoking years. Soon after graduating I stopped smoking regularly. As I was stopping I would crumble mostly-full packs that I had bought and throw them away. If single-serve cigarettes were a thing I might still buy one occasionally. I think it could help me read.

Teenage peer-pressure didn't make me smoke, it was science.

A night-to-day phenomenon

September 24th

One summer, when my friend and I were young, we were lying on the grass at around midnight. It was pitch black but for a few moments the sky became suddenly illuminated as if it was the middle of the day. This might have been the only time in my life I would describe myself as having been "awe-struck". A few seconds later everything dimmed back to night again. A fella walked by who had been fishing on the beach and was like "did you see that?". The next day nobody else seemed to have noticed and there was nothing in the news. Now I mention it to someone occasionally; they say things like "I dunno...meteors?".

Procedurally-generated sauce

September 22nd

I've often thought that randomness could be used as a design element in the food industry. As a whole, the industry is not concerned with Design and is obsessed with control. There are definitely things that we would not want to be random, like the level of a vitamin in a nutritional formula for hospital patients. However, sensory aspects like flavour and texture could be made more interesting with the introduction of some randomness. Taking the nutritional formula example again: a key problem with these products is that people get bored and stop consuming them. This is called "taste fatigue" and can cause a lot of harm when people are relying on a nutritional formula to survive.

There is a randomness in home-cooking that can be enjoyable: what I cook today might be different to how it was last week; I want that subtle variety as long as I can still recognise it as the same meal. When food scientists think about 3D printing they often think of how its programmability can improve the degree to which food is predictable. This seems... boring. In other areas, like game design, programming is also used to generate novelty and surprise. I think there is a place for a procedurally-generated sauce.
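Purely as an illustration (the ingredient list, proportions and tolerances below are invented, not a real formulation), here is a minimal Python sketch of the idea: draw each batch randomly but within tight bounds, so that every batch differs slightly while remaining recognisably the same sauce.

```python
import random

# Hypothetical base recipe: target proportion (by weight) and the +/- drift
# within which a batch should still read as "the same sauce".
BASE_SAUCE = {
    "tomato": (0.70, 0.05),
    "onion":  (0.12, 0.03),
    "garlic": (0.03, 0.02),
    "basil":  (0.05, 0.03),
    "oil":    (0.10, 0.02),
}

def generate_batch(base=BASE_SAUCE, seed=None):
    """Draw one procedurally-varied batch, then renormalise so the
    proportions sum to 1: the balance of flavours drifts, the total
    composition does not."""
    rng = random.Random(seed)
    drawn = {
        name: max(0.0, target + rng.uniform(-drift, drift))
        for name, (target, drift) in base.items()
    }
    total = sum(drawn.values())
    return {name: value / total for name, value in drawn.items()}

if __name__ == "__main__":
    for week in range(3):
        batch = generate_batch(seed=week)
        print(f"week {week}:", ", ".join(f"{k} {v:.2f}" for k, v in batch.items()))
```

The same bounded-randomness pattern could, in principle, drive texture or deposition parameters in a food-printing context rather than ingredient proportions.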

Familiar unknowns

September 22nd

When I started helping my brother paint houses it struck me how unfamiliar I was with the names for things I saw every day, like plinth, architrave, and coving.

Uncommonly spoken commonalities

September 20th

When you've had an interesting experience there's a question as to whether it needs to be communicated.

It could be so common as to be mundane and people might question why you bothered saying anything at all.

Sometimes it will be so uncommon that people will find it strange to hear.

The special case is when you refer to a common experience that is otherwise referred to uncommonly. This can be confused with the mundane or the strange but when expressed the right way its impact can be significant.

Authoritatively poetic

September 16th

Reading Mrs Dalloway...

Rigid, the skeleton of habit alone upholds the human frame.

[...] like something alive which wants to confide itself [...]

Authoritatively poetic.

Good interruptions

September 13th

A good conversation has interruptions. It's one of the things that makes in-real-life conversations distinct from those by email or chat app. We interrupt for different reasons: an awareness that we have deviated from the main topic; an identification of something critical that has been expressed in the moment; an attempt to halt progress towards something unproductive or harmful. It is a common occurrence that your interlocutor will intuit what you are trying to say while you are struggling to say it. Friends interrupt all the time. I love being interrupted. It nearly always clarifies my thinking. For some there is an apparent desire to have conversations without interruption. Everyone speaks in order. Hands are raised. People wait their turn. These types of conversations can be joyless and slow. That kind of ordering is only necessary when a group exceeds a certain size. One of the advantages of meetings with smaller numbers of people is that they facilitate interruption, as long as people interrupt in good spirit. When a powerful individual attends, the manner and force with which they interrupt can have a different quality. This is a separate matter entirely.


The interruption dynamic in chat apps is unique and interesting. There is an indicator that someone is typing, which is potentially interruptive or can prompt interruption. Maybe you are both typing at the same time. Anticipating a misinterpretation of what you previously typed, you may now try to clarify; noticing a particularly long typing-indicator duration, you might ask if the message is going to be serious; expecting them to offer a joke or insight, you might try to get there first. You type something and interrupt their typing, their typing indicator stops and you hope it starts again, then relief when it quickly does! In a group chat a known malevolent actor begins typing... Some plan to defuse the emergent situation, others observe silently and some remove themselves completely.

Academic titles fight

September 12th

During my time in academia there was once a heated university-wide debate about academic titles. At issue was whether our university should change our titles to be more like other Irish universities that use North American-style titles. One of our Lecturers might be called an Assistant Professor somewhere else; some argued that the latter was more "important sounding" and could therefore be more competitive in a funding proposal; perhaps they might command more respect at a formal event. Lecturer, however, was a tenured position, unlike an Assistant Professor in the North American sense, who is typically pre-tenure. Also, neither Lecturers nor Assistant Professors are assistants to anyone in particular; an academic position is characteristically independent of such hierarchical impositions. These concerns were dismissed as unimportant by Associate Professors who themselves wanted to be known by the more impressive Full Professor. Some people expressed vague, mystical reverence for the title of Professor, wanting it only for a tiny minority. Few were interested in using titles that fit the actual role. Just at the level of language, what differentiates one who professes from one who lectures? How from these titles might a member of the public discern that these academics spend an inordinate amount of time doing research and not teaching?? It was mostly about people wanting nicer-sounding titles, right? At least that was my judgement in the absence of any evidence being presented concerning the relation between academic title and success in funding acquisition. I engaged in the debate to a degree but stopped when I thought to myself "what would any reasonable person who does not work in a university think about this?".


Currently this website has blogs, notes, projects and whatever these are (sub-notes? logs??). I think the blog posts should be relatively complete, to the point of my being comfortable with calling them "articles". For example, they shouldn't include things like "hey I just made some beats, preview at the embed". I'm never satisfied with those announcements and should just use the projects page to record the titles and links.

Image phone commit test

September 11th

Testing making and committing images from phone. A little bit cumbersome, not impossible, and of unclear value, but it works.

Abstract image

Doing

September 10th

When making anything there is the doing and the reward. To stay motivated you nearly always need to love doing the thing; then you can do without external reward, by which I mean status or compensation. If you find the absence of external reward troubling then consider reward in a broader sense: the possibility of reward; the reward in doing; the reward in learning; the reward in showing; the reward of the next thing...

Philosophy without arguments

September 9th

There is no philosophy without arguments.

If an incoherent utterance is deemed to have merit or warrant scrutiny then somewhere a philosopher will endeavor to put it in standard form.

Then it can be determined if this and other occurrences, natural and artificial, are valid, or at the very least forceful.

It seems there will always be philosophy because philosophers find arguments even where none are apparent to the rest of us.

The exception is when they stipulate otherwise, as Popper did in his analysis of science, where he decided that the process of discovery was a matter for the psychologists.

To do something intelligible that resists philosophical interpretation is simply to provoke a philosopher's stipulatory demarcation.

Even then, there is still philosophy happening.


Holism in loops

September 9th

Ideas viewed as transformational often amount to a renewed focus on the concept of a loop. Then rationalists start conjuring the yin-yang and the ouroboros. Take the circular economy, for example.

My childhood in a Myst-like desktop

September 8th

When my family first got a computer in the mid-90s it was a Packard Bell that had this alternative environment for navigating the system. It was essentially a Myst-like world set in a family home, with bookshelves, fax machines and things of that nature. I remember my mother bringing me to her friend's house, which was bigger than ours and had a newer Packard Bell with an even fancier pseudo-3D navigator.

Misleading ratios

no date

When I supervised younger scientists, I would often find myself saying things like:

A "tripling" in the measurable redness of milk is unlikely to be meaningful or interesting.

I'm not a numbers person. I use them when necessary. You typically won't hear me complain about levels of mathematical literacy. I once read a book about innumeracy that had lots of clever examples, many of which I didn't encounter in the world. The above example, however, is something that I think causes mass confusion and delusion on a regular basis. Let's call it:

Misleading ratios. Maybe there is a fancy name for this but I could not find it.

It goes something like...

Last year there was one sighting of a person with blue face paint in our supermarkets. This year the number has grown two-fold. That's a 100% increase!

Of course, it's still a really small number of blue-faced people: two. Supermarkets are not overwhelmed. Cashiers do not need special training for the tricky task of age-verifying beer-buyers with blue face paint. Politicians do not need to start reminiscing publicly about how the country used to be before its citizens began painting their faces blue.

There could be one person who consistently paints their face blue every day. That's a small number that could still be meaningful: why is that person doing that? It could be an interesting story. If one year a second person tries it out on a whim then that two-fold increase is likely meaningless. It's neither a big nor stable change in the context of a town or city. Wait until you see a trend across multiple years; even then, check whether there are worthwhile causes and/or consequences relating to the trend you are reporting.
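To make the arithmetic concrete, here is a toy Python sketch using the invented blue-face-paint numbers from above; the point is simply that a ratio quoted on its own hides the size of the baseline.

```python
def describe_change(before, after):
    """Report both absolute and relative change, because a ratio
    quoted on its own hides how small the baseline is."""
    absolute = after - before
    relative = 100 * (after - before) / before  # percent change
    return f"{before} -> {after}: {absolute:+d} absolute, {relative:+.0f}% relative"

# One blue-faced shopper last year, two this year: a "100% increase".
print(describe_change(1, 2))          # 1 -> 2: +1 absolute, +100% relative

# The same +100% headline from a large baseline is a very different story.
print(describe_change(10000, 20000))  # 10000 -> 20000: +10000 absolute, +100% relative
```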

People are frequently over-excited by scientific results, crime statistics and business metrics where some kind of Hollywood ratio has been deployed.

"This area of the brain normally has vanishingly small levels of activity but our treatment saw a three-fold increase."

"Crime has been declining steadily for fifteen years but last year saw a doubling from the previous year's historic low, causing major concern."

"Sales for this product have been effectively zero for six years but they quadrupled in the last quarter."

This misleading use of ratios is everywhere. It is an infinitely renewable resource for toxic politics. In science it is sometimes given a false legitimacy by the assignment of a p-value. It is a most routine feature of data rhetoric, or persuasion by numbers, manipulating our emotions while teaching us nothing.