There’s a curious terminological schism that nobody talks about at what passes for dinner parties for the AI crowd.
When I read Andrej Karpathy’s endorsement of “context engineering” in a Twitter exchange with Shopify’s Tobi Lutke, I felt he had tapped into something we had all sensed to some degree (https://twitter.com/karpathy/status/1937902205765607626).
There’s a pervasive problem with semantics in artificial intelligence. It’s present at the creation: the term itself characterises the subject as a man-made simulacrum of something ‘natural’, much as we speak of artificial flavourings and artificial rubber.
If you spend any time on LinkedIn, it’s almost a certainty that you have come across a bevy of alleged ‘agentic AI architectures’. They all look something like this: All very neat, but the audience might be forgiven for asking what exactly is agentic about it, beyond relabelling the subprocesses of a run-of-the-mill RPA workflow as ‘agents’. And the audience is, this once, perfectly right.
DeepSeek has been grabbing headlines in AI circles lately, showing up everywhere from Discord servers full of ML enthusiasts to LinkedIn posts where “thought leaders” tag each other in endless threads.
The Department for Science, Innovation and Technology has just dropped its long-awaited AI Opportunities Action Plan, a 50-page vision of how the UK government plans to guide us into an AI-powered future.
When he first began his excavations at what is today Hisarlik in modern-day Türkiye, Heinrich Schliemann set out to find a single city – the city of Homer’s Iliad, a city many actually felt lay in the realms of fiction rather than on any map he could lay his hands on. By the time excavations were over, Schliemann would find not one but nine cities, all built on top of each other. In that, he found something relatively common – cities
As I sit here at year’s end, I’m reminded of the ancient Swedish tradition of årsgång, the ritual winter walk taken on New Year’s Eve to divine the fortunes of the coming year.
The awesome thing about language is that, well, we all mostly speak it, to some extent or another. This gives us an immensely powerful tool for handling transformational tasks. For the purposes of this post, I consider a transformational task to be essentially anything that takes an input and is largely intended to return some version of the same thing. This is not a very precise definition, but it will have to do for now.
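To make that concrete, here is a minimal sketch of my own (an illustration, not code from the post): a transformational task is anything with roughly this shape, where the output is a transformed version of the input rather than something new.

```python
# A minimal, illustrative sketch (my own example): a transformational task
# takes an input and returns some version of the same thing.
def normalise(text: str) -> str:
    """A trivial transformational task: tidy up whitespace and casing."""
    return " ".join(text.split()).lower()

print(normalise("  THE Quick   Brown Fox "))  # -> "the quick brown fox"
```

The signature is the point: the more interesting cases presumably swap that trivial body for a language model call, but the shape – some version of the thing in, some version of the thing out – stays the same.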
The year is 1959. Eisenhower is in his second term, Castro has just kicked Batista out of the country, and Ray Charles’s “Let the Good Times Roll” is topping the charts.
Say you’re busing tables and you’re trying to pass someone in a wheelchair. What do you do? Do you say “excuse me” and wait for them to move? Do you say “excuse me” and then try to pass them? Do you say nothing and just try to pass them? All of these are, actually, pretty legitimate answers. Now, say you’re a robot.