Machine Learning and GenAI
I first came across Machine Learning as a PhD student at a graduate social. Someone was teaching a computer what drugs inhibited kinases, and then asked the computer to predict other drugs which would inhibit kinases. This seemed cool to me! A human was looking for a way to ask a productive question which would materially contribute to the search for drugs to inhibit kinases. They had a clear technical grasp of the whole process, and they were exploiting the affordance of computation to develop insights orthogonal to traditional chemical intuition. This sounded like exciting science centred on a narrow research question.
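For readers who haven't met this kind of workflow, here is a minimal sketch of the train-then-predict pattern that project used. To be clear, this is not the actual PhD code: the compounds, features and labels below are random placeholders, and a real study would featurise molecules properly (fingerprints or descriptors computed from their structures) rather than use random bit-vectors.

```python
# Toy sketch of "train on known inhibitors, predict new candidates".
# Random bit-vectors stand in for real molecular features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(seed=0)

# Hypothetical training set: 200 compounds, each described by a 128-bit
# feature vector, labelled 1 if it inhibited the kinase in an assay, else 0.
X_known = rng.integers(0, 2, size=(200, 128))
y_known = rng.integers(0, 2, size=200)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_known, y_known)

# Hypothetical untested compounds: rank them by predicted probability of
# inhibition, so lab time goes to the most promising leads.
X_untested = rng.integers(0, 2, size=(50, 128))
scores = model.predict_proba(X_untested)[:, 1]
shortlist = np.argsort(scores)[::-1][:5]
print("Most promising candidates (by index):", shortlist)
```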
Ten years later, “generative” artificial intelligence based on large language models is commercially available and capable of ok-to-fair performance on many common language-based tasks. Many educators – myself included – are concerned about what this is doing to students’ learning: ChatGPT can write a coursework lit review.
In this blog I don’t want to ask “is GenAI good?” but rather to try to pin down exactly what the difference is between using Machine Learning to discover new kinase inhibitors and using ChatGPT to write a coursework lit review. I think the conversation about AI in academic settings requires a stronger distinction between these cases, but I also think that articulating the difference helps us understand what the bigger issues are.
To do this I am going to use Marx. Not as a political theorist, but as an economic one.
Marx and the Machine
Many historians draw on Marx’s analysis of labour to situate technological advances. On this analysis, technologies are taken up by the capitalist class when they permit capital investment to reduce labour costs.
A specific example: the hand loom for weaving was superseded by the steam loom. There were technological respects in which the steam loom was superior (the quality of the cloth, the speed of weaving), but also economic reasons why it became the dominant process. By investing capital (buying a steam loom), an industrialist could replace labour (the artisan weavers in cottages across a county).
Marx had a high level of respect for the productive output of capitalised industry, and the booming textiles market in Victorian Britain drew substantially on the affordances of the steam loom. Of course, this economic growth had social consequences. Mill owners prospered, often at the expense of artisan weavers but also by creating a new market for new volumes and types of cloth. On the other hand, factory conditions were typically poor for workers, with low wages for genuinely dangerous work. Capital accrued to the mill owners by paying workers less than labour’s share of the cloth’s market value; the business’s profits were the “surplus value” extracted from the workers.
But mills produced a demonstrably useful product. People use cloth, people need cloth. Whatever the working conditions of a mill, it is socially useful to have a supply of high-quality cloth.
Marx and AI?
The Machine Learning search for new kinase inhibitors seems like it follows the formal structure of replacing labour with capital. In principle at least, you could search for new kinase inhibitors by employing people to do inhibition studies.
But inhibition studies are not currently happening at this scale because they are not a good use of the scarce resources available to scientists. The Machine Learning PhD topic wasn’t replacing labour – certainly not at the scale of textiles factories putting weavers out of business – but rather speeding up the rate at which a limited number of people can search for new kinase inhibitors. Yes, Machine Learning is a capital investment (the PhD stipend, the research grant, the opportunity cost of not doing other work), but the labour it replaces is the work of following leads which prove unfruitful. No-one really loses out when we find a new drug candidate which we wouldn’t have discovered otherwise.
What about ChatGPT writing an essay for a student? Certainly, the capital investment behind ChatGPT is substantial: the company behind it is attracting tens of billions of dollars of investment, and it costs poor students about £20/month. It replaces the labour of writing an essay, but this academic labour sits in an interesting economic context. No-one loses money when a student uses ChatGPT to write a lit review on N-heterocyclic carbenes, and in this sense it bears some resemblance to the Machine Learning work on kinases.
But the type of work which ChatGPT replaces here is importantly different to the kinase inhibitor case. In my view, writing an essay is a valuable process but finding a novel drug candidate is a valuable outcome. We can develop an analysis of value around this distinction, if we look to Marxist conceptions of commodities.
The Three Values of Commodities
Classical commodities are things like gold and wheat: goods traded on the basis of what they are, rather than who made them. You can’t tell who mined your nugget of gold or who harvested your ear of wheat. In technical language, commodities are “fungible”: you can readily substitute French wheat for English wheat, or Russian gold for Indonesian gold.
Marxist conceptions of commodities go beyond completely fungible products and try to articulate how objects like cotton cloth might be treated as something which occupies a space in the market alongside gold and wheat. For Marx, cotton cloth has three types of value: the value of the human labour which went into making it (“Labour Value”); the social value of using it (“Use Value”); and the value at which the cloth can be sold or swapped (“Exchange Value”).
The arduous human labour of making cloth is an important dimension because it makes it entirely reasonable to want to economise that labour. The weavers in their cottages had incentives to be quick and to avoid mistakes. Society at large had incentives – some of them very respectable humanist incentives – to explore ways of speeding up the act of making cloth. I think this Labour Value is the central merit of the Machine Learning project for kinases: finding inhibitors is arduous, and this tool can help. There is social merit to speeding up the discovery of kinase inhibitors because it economises the human labour involved.
The Use Value of cloth is fairly clear: you can make a shirt out of it. Similarly, a shortlist of kinase inhibitor targets has Use Value: a chemist can economise the scope of their inhibition studies by using it.
Exchange Value is a little harder to place for the kinase project. It might be seen in the way a computational PhD project was funded instead of an experimental suite of inhibition studies, but this is somewhat speculative. Much academic research is deliberately conducted in an environment of almost-free exchange, so it is hard to determine what price a kinase inhibitor shortlist might fetch.
Perhaps this is a way of coming at the distinction with the ChatGPT essay. The conception of Exchange Value applies much more readily in this case: an essay is exchanged for a grade. On the other hand, the Use Value of an essay written by a machine is much harder to determine. You give me cloth and I can make a shirt. What social use does a GenAI essay serve?
The Labour Value aspect of the ChatGPT lit review is very clear, though. Students save time by using these tools, and their time undeniably has value. As funding for education becomes increasingly squeezed, it is important to acknowledge that time spent studying can be difficult to reconcile with the need to earn money. There is a reason students might see the attraction of a tool which saves them time.
Which Values do we value?
So my argument is that the Machine Learning project has a clear Use Value (despite having a poor Exchange Value), while the ChatGPT lit review has a clear Exchange Value (despite having a poor Use Value).
This raises immediate questions about what Values we centre when we think about Chemistry as a discipline. Research outputs have Use Value to us. Our discipline grows when we make a new molecule or analyse new materials or develop new models. I think lit reviews are typically valuable, too: arranging existing knowledge into a coherent structure is often a good use of a researcher’s time. So why doesn’t it feel valuable to read a ChatGPT essay submitted by a student? Why does this output feel less valuable to me than a lit review constructed by a PostDoc?
There are probably lots of answers to this question, but I think the most compelling one for me is about what a degree is. I see a Chemistry Education as being about a student changing because they study my subject. The Use Value of an essay is that writing an essay changes the student, and this is why it’s worth doing. It is worth becoming someone who has written essays.
But the logic of Exchange Value is helpful to lay out explicitly because it makes the conflicting view so coherent. It is self-consistent to claim that a Chemistry Education is about getting a bit of paper saying that you have a Chemistry degree. This credential has value: it lets graduates access better opportunities. Using ChatGPT to help you get that commodity can be seen as playing the game.
What Marx’s values let us do is to set this up as a commodification problem. Is the Chemistry Degree a commodity whose value lies in the way it serves as a passport to better employment prospects? Or is the Chemistry Degree about the social value of providing the world with people who can think? I wonder whether a lot of the AI anxiety among educators stems from the way we don’t have a completely clean answer to this question.