Dave Black

Intelligent Apiculture Part II – AI’s Arising Issues

Artificial Intelligence (AI) is playing an increasingly prominent role in our lives, so last month science writer Dave Black answered some big questions, detailing what AI is and how it works. This month we have more questions, leading our science expert deeper into an exploration of how AI might assist apiculture, and the issue of the big money and big environmental impacts swirling around our AI future.

Q. Do you think AI can make beekeeping smarter?

Dave Black: Hmm. How long have you got? Maybe. One of the possibilities you’re reading about already is what I’ll call geospatial intelligence. Imagine it were possible (it kinda already is…) to collect and use data from lots of sources, including satellites, fixed and mobile (GPS) sensors, aerial images and the bits (like RFID tags, fridges, weather stations) we can connect to the ‘Internet of Things’ (IoT). All the data is used to produce real-time maps and simulations, and we can imagine never registering hives and apiaries, because intelligent software will pick them off maps and report them to the Management Agency (and send us the bill).

By utilising GPS technology and the ‘internet of things’ many compliance activities associated with beekeeping may be able to be automated in an AI-powered future.

There will be no Harvest Declarations. The AI will note visiting utes (it’s already managing their batteries, range, and your honey house energy consumption), associate the surrounding vegetation types, crops and terrain (all already mapped in most developed nations), monitor and forecast the weather (accurately, because AI and big data), and report the boxes and their potential bounty. We could include ecological data on all possible pollinators; for example, there’s already one current ‘prototype’ that claims to associate observations from iNaturalist, a biodiversity social network. Theoretically it could determine the price to charge for the most effective number of pollination units for the crops in range. Equally, it’s possible to monitor and manage the carrying capacity of any given location; we could imagine seasonal ‘surge pricing’ (like road tolling) for resource use.


Increasingly, ‘smart’ apiaries will provide data about hive health and performance which intelligent software will integrate into local and regional assessments of productivity across several years. The ‘intelligence’ knows what data is important to all the different participants. Apiary visits are scheduled by the AI. The software takes weights from axles and lifting gear, communicates estimated yields to the contracted extraction facility, and live data informs packer inventories, market trading and prices – globally. People trade ‘futures’ in honey, and hedge currency to do so. Monitoring hives and their environment in real time is used to predict outcomes, streamline export compliance, and achieve logistic efficiency. The effect of the next cyclone will be factored into economic forecasts before it’s off-shore. Think of it as smart, hyper-sophisticated surveillance.

Q. I’ve seen adverts for ‘precision beekeeping’ and ‘smart’ apiary software already. Sign me up?

There is a cost to knowing everything, and to me it’s difficult to see beekeeping having any sort of priority for these sorts of developments. As they become commonplace elsewhere, once corporations can recover a return on investment in more lucrative sectors, beekeepers will adapt too, eventually. The value of experience and expertise to a business will probably diminish and some businesses will consolidate into larger, more strategic entities, but very little in apiculture is going to be solved with better data unless you are a pretty big business. If you think about it, though, the current manuka turmoil is not going to be resolved by an intelligent machine.

Q. That sounds like more of the same to me…

Producing intelligent systems takes time, money, and an awful lot of data. It’s not surprising that the effort is largely centred in the wealthy global north, scraping free data from a largely English-speaking internet while relying on outsourced cheap labour in the developing world. The technology media has widely reported news from The Information, an American technology-focused business website, that in the last reporting year OpenAI’s losses reached US$540 million, double the previous year. With a projected revenue of US$200 million, the CEO was apparently trying to raise venture capital amounting to US$100 billion to continue its AI development. When undertakings of this size are underway, even large companies like Microsoft find the cost of entry sufficiently high that it’s more cost-effective to partner with others than to develop their own system. The real story here is about the power of money, not the power of AI. The temptation is greater than ever to wield Kaplan’s hammer (the ‘law of the instrument’, which states “give a boy a hammer and everything he meets has to be pounded”).

One of Microsoft’s hundreds of data centres spread around the world, where huge banks of servers power the company’s operations – including its AI technologies – and it all takes vast quantities of water to cool.

Q. Shall I start saving up then?

Money is not the only thing that concentrates power in the hands of these companies. Once trained, the first models continue to learn from their users: the more people use the products, the better they get. As more people have conversations with ChatGPT, for example, the model improves. Better models attract more users, and so on, widening the gulf between themselves and any competitors over time. Building the technology into other products extends their reach, so Microsoft’s Bing Chat and Microsoft Office now include OpenAI’s ChatGPT. Each time a user writes a document, edits a spreadsheet, or searches the internet, the information it contains goes straight back to OpenAI’s servers. It’s not just your car watching you. If OpenAI thought it worthwhile, they could know more about your operation than you do.

Q. I can’t afford one of those cars anyway…

There are all sorts of big issues that will have to be resolved, and that won’t be easy. The companies making these systems are not unaware of their resource consumption, though they are hardly transparent about it, and it takes an increasingly spectacular quantity of electricity. It’s been credibly suggested a ChatGPT query takes five to ten times as much electrical power as a regular search query. Stored data isn’t measured in terabytes any more: it’s been estimated that by 2035 humans will have created 2,000 zettabytes of data[i] (a zettabyte is 2⁷⁰, or roughly 10²¹, bytes), kept in a thousand data centres (often built in cooler countries) consuming hundreds of megawatts each. The numbers are open to debate, but by 2030 computing could account for 51% of global electricity consumption, a third of it from data centres, and 23% of greenhouse gas emissions[ii].
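To give those storage numbers some scale, here’s a quick back-of-envelope calculation. The 2,000-zettabyte projection is the one quoted above; the eight-billion world population is my own round-number assumption for illustration:

```python
# Rough scale of the projected 2035 data mountain quoted above.
ZETTABYTE = 10**21                    # bytes, decimal definition (binary: 2**70)
projected_2035_bytes = 2000 * ZETTABYTE

WORLD_POPULATION = 8_000_000_000      # assumed round figure, for illustration only
per_person_tb = projected_2035_bytes / WORLD_POPULATION / 10**12

print(per_person_tb)                  # 250.0 terabytes stored per person on Earth
```

In other words, the projection amounts to roughly 250 terabytes of stored data for every person alive – hundreds of typical laptop drives each.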

OpenAI and its large language model technology ChatGPT are playing a big money game, with the CEO reportedly seeking US$100 billion in venture capital to continue AI development.

All this computing also needs cooling, and a lot of water. OpenAI’s ChatGPT was trained by Microsoft in Iowa. The weather there is cool enough for the supercomputers during much of the year, but even so the cooling water used in the summer months took 6% (11.5 million gallons) of the district’s supply in July, which is seen as an issue for further expansion of the facilities. A report from Microsoft in 2022 documented a 34% increase in its water consumption, mainly due to AI research. University of California, Riverside scientists suggest a single ‘conversation’ with ChatGPT (20–50 interactions) consumes about half a litre of water[iii].
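For readers who like to check the arithmetic, the water figures above work out like this. The numbers are the ones quoted in the article; the calculation itself is just a sketch:

```python
# Water per ChatGPT interaction, from the half-litre-per-conversation estimate.
LITRES_PER_CONVERSATION = 0.5                 # ~half a litre per 'conversation'
INTERACTIONS_LOW, INTERACTIONS_HIGH = 20, 50  # interactions per conversation

ml_per_prompt_high = LITRES_PER_CONVERSATION / INTERACTIONS_LOW * 1000   # 25.0 mL
ml_per_prompt_low = LITRES_PER_CONVERSATION / INTERACTIONS_HIGH * 1000   # 10.0 mL

# The Iowa figure also implies the district's total July supply:
JULY_COOLING_GALLONS = 11.5e6                 # stated as 6% of the supply
district_supply_gal = JULY_COOLING_GALLONS / 0.06   # ~192 million gallons
```

So each prompt costs somewhere between a teaspoon and a couple of tablespoons of water – trivial alone, enormous at the scale of millions of users.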

Q. A bit bigger than beekeeping then?

You remember how the global Covid vaccine rollout went? For one reason or another, first poor people, then poor countries, got left out. The rest of the world is looking on as unaffordable technology widens the inequality already at the root of some of the problems we are trying to solve, or, alternatively, becomes a new kind of commercial colonial imperialism limiting their independence. As William Gibson famously observed[iv], “The future is already here - it’s just not very evenly distributed”. Fundamentally, the big problems are not technical; they are socio-political, or matters of faith, morality, greed, emotion, ethics, or gender: all the things Artificial Intelligence can know nothing about, but which are essentially human.


AI’s current apostles read too much science fiction and have a very limited skill-set. They see ‘intelligence’ as a single component on a one-dimensional scale that runs from dumb to smart. That’s not the reality of human intelligence, and it certainly doesn’t begin and end with language. For people, intelligence is a collective, emergent property of living a multi-faceted, conscious, social existence. Maybe we can create another, non-human, kind of intelligence, but can it be accountable, responsible even? Will it know the value of failure? Would it imagine and dream? Could it forget, or forgive?

Dave Black is a commercial-beekeeper-turned-hobbyist, now working in the kiwifruit industry. He is a regular science writer providing commentary on “what the books don't tell you”, via his Substack Beyond Bee Books, to which you can subscribe here.

References

[i] Gerry McGovern (2020) World Wide Waste, ISBN 978 1916444621

[ii] Anders S.G. Andrae and Tomas Edler (2015) On Global Electricity Usage of Communication Technology: Trends to 2030. Challenges 6(1), 117-157; https://doi.org/10.3390/challe6010117

[iii] Shaolei Ren et al. (2023) Making AI Less “Thirsty”: Uncovering and Addressing the Secret Water Footprint of AI Models. arXiv:2304.03271

[iv] William Gibson (author of Neuromancer), as reported by the San Francisco Examiner in 1992

