Srinivasan Keshav posted a link to this excellent deep dive by Prasad Raje of Udemy into the advances that DeepSeek R1 has made from a core-technology perspective. References Vaswani et al. (2017), Attention Is All You Need, arXiv.
Now that I've switched to a new website, I'm working on open-sourcing its components. I've got a lot of small OCaml scripts that are all works-in-progress, and so not quite suitable for publication to the central opam-repository, but I still need to be able to run them conveniently on my own self-hosted infrastructure.
My colleagues Thomas Swinfield and Eleanor Toye Scott led the publication of a comprehensive report on the steps the voluntary carbon market needs to take to restore its scientific credibility, with input from many of us in 4C and beyond. This paper represents three years of hard work from the team on blending remote sensing with carbon quantification.
This website has been through quite a few iterations over the years. The first version in 1998 was written in Perl and hosted on OpenBSD; the second was rewritten in 2000 when I got commit access to PHP; the third rewrite became a hybrid OCaml/PHP/Perl special in 2004 using Blosxom; then the fourth rewrite around 2013 got turned into a unikernel in MirageOS.
We have just updated our preprint on using LLMs for evidence decision support, with more evaluation results and corrections from peer review. See also the fantastic EEG seminar talk given towards the end of last year by the student group who worked on this over the summer.
After some years of hard work, our Mapping LIFE on Earth biodiversity metric was published today in a special issue of the Royal Society Philosophical Transactions B! The idea behind LIFE is that although human-driven habitat loss is known to be the greatest cause of the biodiversity crisis, we do not yet have robust spatially explicit metrics that quantify the relative impacts of human actions on species extinctions.
Back in July 2024, a large group of conservation and computer scientists got together in the CCI to prioritise the storm of AI-related projects that have been kicking off around the world. Our key goal was to harness AI to accelerate the positive impact of conservation efforts, while minimising harm caused through either the direct or indirect use of AI technologies.
Josh Millar and I have been having great fun designing embedded systems for cooperative biodiversity monitoring. Josh presented our work over at LOCO 2024 with an abstract on the Terracorder project. Read more if you enjoy a combination of machine learning and ESP32 hacking.
All the work we've been doing on biodiversity (such as LIFE) comes at a fairly large computation and storage cost due to the amount of data that we churn through. This gets worse when you consider the exploratory nature of science: we sometimes just need to mess around with the large datasets to test hypotheses, which are often shown to be wrong.
Customers of online services may want to take carbon emissions into account when deciding which service to use, but it's currently difficult to do so due to the lack of reliable emissions data that is comparable across online services.
Ryan Gibb and I have been thinking about how the current Internet architecture fails to treat the carbon emissions associated with networked services as a first-class metric. So when the LOCO conference came up, we tried extending the DNS with load balancing techniques to consider the carbon cost of scheduling decisions.
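The core scheduling idea can be sketched in a few lines. This is a minimal illustration, not our actual implementation: the endpoint names, IPs, and carbon-intensity figures below are all hypothetical, and a real carbon-aware resolver would weigh latency and load alongside carbon.

```python
# Sketch of carbon-aware endpoint selection for a DNS-style load balancer.
# All candidate data here is illustrative, not real measurements.
CANDIDATES = {
    "example.org": [
        {"ip": "192.0.2.1", "region": "eu-north", "carbon_gco2_kwh": 30},
        {"ip": "192.0.2.2", "region": "us-east", "carbon_gco2_kwh": 400},
        {"ip": "192.0.2.3", "region": "eu-west", "carbon_gco2_kwh": 120},
    ]
}

def pick_endpoint(name: str, candidates=CANDIDATES) -> dict:
    """Return the candidate record with the lowest grid carbon intensity.

    A production resolver would combine this with round-trip time and
    server load; sorting purely by carbon keeps the idea visible.
    """
    records = candidates[name]
    return min(records, key=lambda r: r["carbon_gco2_kwh"])

if __name__ == "__main__":
    chosen = pick_endpoint("example.org")
    print(chosen["ip"], chosen["region"])
```

The resolver would then return the chosen record in its DNS answer, so clients are steered to the lowest-carbon region without any client-side changes.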