Video Transcript

Charles Sturt: Embracing AI in agriculture

It's my pleasure now to introduce Sarah Carney from Microsoft. Sarah is the Chief Technology Officer for Microsoft Australia and New Zealand.

She's based in Canberra. And prior to Microsoft, Sarah served in the Australian army.

Hello. Thank you. I don't normally bring my phone to the stage, but I was so inspired at the Expo yesterday. I thought I'd try a live demo. So, partway through, we're going to go a bit crazy and try and talk to my phone and get some insights from it. 

The team asked me to talk a bit about artificial intelligence, and I find that such a hard thing to do for an audience that already knows so much about this.

You're already embracing this in everything that you do. So I'm going to try and set the scene. 

What is AI? Why are we talking about it? Why is everyone so excited in this moment? 

And then talk a bit about where we think the future goes and some of the crazier things that we see happening. 

So a tale of two stories.

I want to talk about the AI revolution, or really the evolution, that we're facing. We talk about artificial intelligence all the time as if it's a singular thing.

And people use these words interchangeably. We talk about machine learning, we talk about data science. So I want to talk a little bit about what artificial intelligence actually is.

And it's not just one thing, it's actually a body of knowledge. 

It starts 70-odd years ago with this concept of: could we make machines act like humans?

So artificial intelligence and nested within that are all these evolutions that people then use interchangeably, deep learning, machine learning, and this latest evolution, generative AI.

And the reason we love talking about it in this moment is because generative AI makes it so personal. 

Hands up, who hasn't had a go with ChatGPT, Copilot or some other form of generative AI?

You write poetry for your loved ones. You look really smart. It's awesome. I'm going to try it later, live on stage, on some farming topics. But why is it so revolutionary?

Don't panic. This is classic AI. In order to make AI work, traditionally, you needed a huge bucket of data that you then trained to do one thing.

And if it's visual, you wanted a whole lot of photos. If it's auditory, you needed a whole lot of audio files. So in order to play the game of chihuahua or muffin, you needed a whole load of images of chihuahuas and muffins. This was a great game about a year ago. Is it a chihuahua? Is it a muffin?

But traditional AI needed a huge amount of data to enable you to do that one thing.

And of course that means it's really rigid. It can only do that one thing. It does it really well. 

The reason why we love generative AI is it's a giant bucket of data. It's all the things, and that's why you can ask it to write poetry, and you can ask the same system to draw your picture, and you can ask the same system to give you a scientific analysis of a set of papers.

It's got a huge bucket of data that can do many things, and that's why it's so interesting. 

But often we get caught up in, it's smart. It feels really bright, it feels human-like in what it's doing, but what it's actually doing is just making the next best guess. 

So if I say "the cat sat on the...", most of you in your heads are now saying "mat". That's what the system does.

It draws a line between the phrases we use, the words we use, and the most common next word. That's how it feels so smart. It's making the next best guess.
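That "next best guess" can be sketched with a toy bigram model. This is a minimal illustration of the idea only; real generative AI uses large neural networks over tokens, not simple word counts, and the tiny corpus here is invented:

```python
from collections import Counter, defaultdict

# Count which word most often follows each word in a tiny corpus,
# then "generate" by picking the most common successor.
corpus = "the cat sat on the mat . the dog sat on the rug .".split()

successors = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    successors[current][nxt] += 1

def next_best_guess(word):
    # The single most frequent word seen after `word` in the corpus.
    return successors[word].most_common(1)[0][0]

print(next_best_guess("sat"))  # prints "on": it always follows "sat" here
```

Scale the corpus up to a large slice of the internet and the counts up to billions of learned parameters, and you get a feel for why the output seems so fluent.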

But what it has enabled is a whole range of really interesting ways that people are using this. And I gathered four use cases that talk to some of the breadth of the ways people are using this.

So there's a company in New Zealand called Vista. They are cinema operators, and their customers want data: how many tickets did they sell, how much popcorn was sold overnight? And normally they would have to log into some kind of spreadsheet or database to get that information.

And so Vista thought, what if we could create individual personalised podcasts for every single one of our cinema operators? So they can listen to that information on their way into the office, and they never have to look at that dashboard again.

So a really cool way of giving people that information they need individualised for hundreds of different people across their ecosystem every morning.

Really interesting way of using generative AI across data that already existed. 

We all know the Northern Territory has a great sense of humour when it comes to tourism. 

We see you in the Northern Territory. But they embraced it. They thought, what could we do in this moment with generative AI that is different and fun, but still part of our personality?

So they built a chatbot using the personality of Abbie Chatfield. Now, if you don't know Abbie Chatfield, she's quite a bright personality.

So the chatbot says things like, "Hey gorge, OMG". Maybe that's not how I want to be talked to by a chatbot, but it's great. It has personality. It reflects the Northern Territory.

And so you can go and ask for tourism tips. Where do I go fishing? How do I plan a holiday? 

And it has this wonderful personality wrapped around it. So again, using generative AI to do something that they would do anyway, but just a little bit differently. 

A little bit of fun. More seriously. Education is seeing such an uptake in this, and I think we can all relate to the fact that teachers are overstretched, that we can't get enough of them into all of the areas where we need them.

And so the opportunity to use things like generative AI to create lesson plans, to interact with their students in different ways, to actually create individualised, curated personal learning journeys is pretty powerful. And so my son has learning challenges, and I've seen him come alive through the use of things like generative AI.

He can participate in class in ways he couldn't before. He can't write. He can use generative AI to help create stories and become an editor and curator of that content, which he couldn't do otherwise. 

He'd spend an hour long lesson writing 10 words, and now he can actually participate. So thinking about how it unlocks opportunity for our children and the next generations in fundamentally different ways, and then protecting us all.

Most of the banks have been using this in different ways, and you'll see it emerging in the way in which you interact with your banks, through their interfaces, through chatbots, through risk and fraud protection. 

Lots of different ways that AI is being adopted, but what I thought I'd do is take a quick look at agriculture and I spent yesterday at the expo. 

I saw so many different products out there, which is what inspired the live demo I'm going to attempt. We'll see how we go. Because you have this huge opportunity in front of you, and there is no industry of which so much is being asked of so few.

We need more food. How do we feed the people we have in Australia, let alone the world? 

How do we think about agriculture more holistically, keep sustainability front of mind, and optimise our operations?

There are lots of different places where this is occurring and anyone who went to the expo yesterday would've seen almost every single one of these demonstrated in some way, shape or form, whether it's genomics and looking at how we produce different seed types or how we actually look at the genomics of our cattle and our sheep, how we use agricultural equipment.

I had so much fun with the robo dog yesterday and the giant drones, agricultural retail and agronomists. 

I've been working a lot with New Zealand on some of the things they're doing in that space. How can they use IoT to determine whether vaccines are at the right temperature or not, and get alerts based on that?

Maybe their refrigerators are going down. I was reading the news this morning and there was this article about Fido, which is a new AI tool that can sniff out leaks in your plastic pipes.

That's pretty cool. The challenge I think you will face though, and I know I certainly do, is there is so much information in so many different places. How do I actually get access to all of this? How do I know which thing I need to look at, at which point in time? 

So I'm going to fetch my phone and we're going to give it a go.

So if you think you have 10 different systems using 10 different types of AI and you've got to go to 10 different places for it, that means you're less likely to use it and you're probably not getting the value out of it you want. 

So what I'm going to show you is a system that's looking across all of my applications and data.

Normally I would have trained it on agriculture, but I didn't have time overnight to do it.

Hey Sarah, how's today treating you? I'm having a problem with the nectarines I'm growing. 

The leaves seem to be curling on the plants. What could that be? Leaf curl on nectarines often points to fungal infection, like peach leaf curl.

Any odd colours on the leaves, like reddish or purple. Or maybe you've had a particularly wet or humid season. 

I definitely have spots on the leaves. What does that mean? 

Dark spots, especially if they're raised, could mean you've got a pest issue like aphids or mites. Any sign of tiny bugs lurking around the spots... Okay, so you get the idea, right?

So she's tapping into a body of knowledge that's sitting behind the scenes. And normally what I would do is I'd add a photo image to that as well. She'd look at the picture that I'm showing her and provide insight off the back of that. So that's what I think the future looks like for how do you talk to that data?

How do you think about all of those systems in one interface where you can ask questions? 

And it's pulling that insight from every single place. I had a lot of fun with it yesterday. Come and see me, I can show you some of the things it does. This one is really great. And I had fun yesterday, as I said, with the great big drones.

This is me standing in a paddock in New Zealand. I spent a lot of time in New Zealand standing in paddocks. Turns out I'm really good at identifying cows and not rocks. That's what we were trying to do with the drones. The question they had was, we have to do an audit of our cattle. How could we do that with drones?

Because right now they walk through paddocks, three people with clickers clicking to count and they match them all at the end. 

Nope, I got different count back. They go again and they do it all again. So they thought, what if we could fly a drone over the field and get a count instantly of what's in that field?

And then they were like, what if we could work out how many calves there are in that field? It's calving season, and we don't know how many calves there are.

Then: what if we want to know where everybody is? Because with large farms, large holdings, I don't have just one field to look at, I have 50 fields. And so this is a great project to take that to the next level and get instant information that you can action on the ground.

And the challenge New Zealand is facing is limited migration. We can't get enough people to staff the farms. So how can we staff them smarter? How can we get information for that one person? You know, it blew my mind that they're doing the milking on these farms with one person running the entire operation and using technology to augment that.

So I thought this was fun, but I thought I'd take you a little bit further. 

And so there are two projects that Microsoft's been working on that I thought I'd share. 

This one's a little bit crazy. This one's called Project Florence. So we love to collaborate with unusual people. This was a collaboration with an artist in residence.

And the idea was, the question they asked is, what if we could talk to trees? 

And what they actually built was an interface where the conversation you have is turned into light signals that are then given to the tree. The tree can then give you light signals in return as a response. I know you're all laughing going, that's crazy.

That's not a thing. It was a really fun experiment. And actually what they found is they could have a really good conversation with the tree about lack of water, problems with pests. 

And so this is the concept they're working on into the future. How could you use this kind of technology, which feels so futuristic and crazy, to help you manage and understand the real needs of your plants?

If you were talking to the plants and not just a sensor in your plants. So that's the first crazy project. 

The second one, is called Premonition.

And this came off the back of Bill Gates's real passion around malaria. And so his question was, what if we could predict where infections might occur? What if we could predict it before it even happens, through sensors like this?

And so this premonition system actually looks at insects in the ecosystem.

It analyses those insects and provides insights into infections that are occurring in the ecosystem around you, and it can predict with incredible accuracy the chances of infection for humans, for animals, for livestock, for your plants. And so this feels quite future-looking, but it's actually quite a close concept of what AI could do for you in the future.

Predict what could happen to your farm, what's about to happen, and the ways in which you could actually mitigate against that. 

And so with that, I'm going to hand to our next speaker who's going to inspire you, and then the debate team who's going to bring the passion. And I can't wait for all the questions that you will have on what AI should or should not do.

Thank you.

Thank you very much, Sarah. Fascinating to see what the possibilities might be there.

Let's go to the practical side now, and it's my pleasure to introduce Charles Simons.

Charles has worked in viticulture in South Africa, New Zealand, California, Germany, Austria and Australia. He's recognised for his innovation in integrating cutting-edge technology with traditional techniques, earning him awards such as Viticulturist of the Year in New Zealand and Australia.

He's now the Chief Sales Officer at BioScout, where he's helping others to implement smart tech, and in this case study he'll share his experience of implementing AI in the vineyard and the journey from sceptic to supporter. Please, a big welcome for Charles.

Thank you so much, Tim, for that introduction, and thank you to the organisers for inviting me to talk about my very short journey through agriculture across the world, and also my involvement now with embracing AI.

Probably one of the most amazing wineries and vineyards globally, I think, is Craggy Range in Hawke's Bay, New Zealand.

I was very lucky to work for them in the beginning, when we started expanding across New Zealand, and really got the opportunity to work with all sorts of tech that was quite new then. Now it's common in most vineyards. We don't have the fancy, big machines that broadacre farmers have got. Everything is tiny and small.

But I got to work with things that were new to me, and that brought me across to Australia in 2012 to work for arguably the most amazing winemaker in modern-day Australia, Philip Shaw. For those who don't know, he was the head winemaker for Rosemount and Lindeman's back in the day.

He did for Australian wines what Cloudy Bay did for Marlborough Sauvignon Blanc; they were enormous back in the day. And I got to work with a lot of technologies while working for Philip.

And like many in the room, I was like, God, really? I don't really need this. I know what I'm doing. I've studied viticulture. I don't need a box to tell me what's going on. So I'm like many other farmers that decided pretty quickly: I don't believe in this stuff, I can do it myself. But I met these guys nonetheless.

We started to design the dashboard and the AI behind the scenes.

And BioScout was born to what it is today. And we'll go through it now to give you a background of how we implement AI and what we do. I shifted across with these guys. I'm a part owner of the company as well.

So I shifted across from my viticulture career to BioScout and now consult back to the viticulture industry. 

Measuring spores is a pretty slow process, because you basically have to set a station up in the field. It's got a roll of sticky tape inside. Spores and pathogens get trapped on that tape. Someone's got to go and collect that tape, take it to a lab, analyse it, give the information back, and then you go to the farmer: here are your results, make your decision.

And that could take up to two to three weeks to get that information back.

So Lewis worked out pretty quickly that this was not going to work. We needed to automate this process. And so we decided to put our engineering and mechatronics skills together, and we've run BioScout successfully now for the last couple of years.

For those that don't know, BioScout is a unit that sits in the field. That's a very old picture of the unit, but it's completely self-sufficient. It's got its own battery pack, a backup battery pack, solar power, and it's got telecommunications that we are currently extending to satellites.

So at the moment we rely on GSM or mobile phone networks, but we are going to launch that onto satellite, so we can go anywhere we want. We basically rely on airflow coming into a confined nozzle space, take in air, and then deposit the pathogens and spores that are in the atmosphere onto a roll of sticky tape.

Think of the tape like a VHS tape. It's quite flexible and small, and the glue captures everything. As it moves along to the camera, it gets sandwiched between another roll of tape, protecting that data so that we can go and collect the tape if we want to DNA-analyse the information on it.

We analyse the spores, we push it to a dashboard in real time, and the farmer can make a decision right there and then what needs to be sprayed, when he needs to spray, and what he needs to do to protect his crop. 

For most of us in the room, we all know what that is, and it's a disease triangle of how we manage our farms.

And for many years, for me as a viticulturist, and for most farmers today, we only focused below the line. You only know the environment and you know your host, and then you make an assumption: oh, but it's raining, so therefore there's disease, so therefore I have to spray.

But you don't know what pathogens are in the atmosphere. 

You don't know what pathogens are in your field. So you are assuming that there's an accumulation of pathogens, and therefore you have to spray. And all that equates to is the over-application of chemicals, or the application of the wrong chemicals at the wrong time, and therefore resistance issues.

And we are now seeing in Moree that there's a chemical, and I won't mention the name, that got brought out two years ago, and there's already resistance against that chemical. This is in chickpea farms. So that disease is already showing.

Now we know that approach isn't sound, because how can you decide without knowing the pathogen? But that's the standard.

When I came out of university, nowhere in my studies did anyone tell me: this is how you spray, this is how you protect your crop. I started working for a very big South African farmer, and he's probably at least a head taller than I am, and he told me from the get-go: do not tell me what you learned at university.

I'll teach you what to do. And that's what most of us young farmers do when we start. You work for these influential big guys, or women as well, and they tell you what to do, and there's no reason given for why you're doing it. So it was never challenged. Now, from the get-go, BioScout worked on that basis.

The image on the left-hand side is the image that we took. 

As you saw on the previous slide, AI was there, but it was never trained to do what we wanted it to do. We couldn't go to a shop front, buy that AI and implement it in our system. 

We had to train it to do what we wanted it to do. So we've got four plant pathologists and biologists in the office who count spores for us, and we had to train the AI at the same time to find what we wanted.

That's an image that I took yesterday from a unit just outside Wagga as part of the GRDC network.

There are 65 of our units across the eastern and western grain belts. And on the left-hand side you can see the quality of the imagery that we can take with our microscopic camera.

And I told my head of AI, Nick Lillywhite, to tell the system to find me Blumeria and rust.

And it gave me that slide back within about five minutes. And within five minutes I could tell what was going on at that farm in one snapshot.

So Blumeria, it's a bit hard to see for me, but hopefully you can: it's that little jelly blob here. That's Blumeria. Alternaria is the little worm-like, banana-shaped one.

Rust is obviously very easy. It's the real rust-coloured blob on the right-hand side.

But it counts that every time. Even if I give the AI that picture again to count, it'll count exactly the same, consistently, 24 hours a day. The purists in the room, or the scientists, will come to us and say: how? But you counted 600 spores, I counted 602.

So is that a mistake? We counted one number, they counted another. But there is a percentage of wrong and right in human counting as well, because most spore counters will start at eight in the morning and count nonstop till five o'clock.

They'll count spores, and at some point they'll get a phone call, they'll get tired, they'll make a mistake. This doesn't. It keeps on counting the same way day in, day out.

There's a saying that we use in the office that the worst we'll ever be at giving you data is yesterday. Because the more we take pictures, the better we get tomorrow, and the more we take pictures in other regions, the better we get at what we do. 

That's a snapshot of the units. Currently in Australia and New Zealand, we are in Canada as well, and we are also now launching into Europe.

I think in February we sent the first units across to Europe. Now, AI and data, in a broad way, is what is at the core of BioScout.

AI provides us with enormous scale. Now we can quadruple the volume we can count each day, and count that information consistently.

AI was the salvation for BioScout, because if we hadn't adopted AI, we would be the same as every other system that relies on human effort, and human error, to count the information that we want to give to you.

For the farmer, we're able to capture a niche skill and change the capacity of what we do.

Globally, there's a very small number of people that can count spores very well, and we can use our data to make their job easier: give them the information and they can verify it for us, which we are currently doing to make sure that we're identifying and analysing the right spores. Currently we've got 300,000 spores identified successfully.

We had roughly about 7 million images when I did the slide about a week ago; we're now well over 8 million. We do about 20,000 images of slides per day, and we analyse them within 24 hours. So each time that 20,000 comes in, we'll analyse them.

That tape scan is split into 220 images, and each of those images is the size of a USB plug. So if you laid them end to end, the images we've pushed out over the last two years would stretch the length of a football field.

Now, we are industry wide, so you can see here the eastern and western grain belts. 

There's a few around Victoria and South Australia; they are mainly in vineyards. We've covered a large portion of this. And as I said earlier, even though we are across the globe, in New Zealand, Canada, and Europe next year, we get better at finding grain diseases the more we scan, and when we scan more viticulture diseases, we can also find new diseases.

If you look at the New Zealand model, we had units in Marlborough. And, we started noticing spores on our slides. We didn't know what it was, but the funny thing is they kept moving between stations. So one day we picked it up here and the next day it'll be a couple of hundred meters away, but nothing here on the old station.

And we asked the farmer or the winery, what were they mulching, were they spraying? What are they doing? 

Because we only focus on pathogens for crops, in this instance vineyards. And we actually successfully identified facial eczema in sheep, which is obviously a biosecurity risk for Australia. And we didn't intend to find it.

We just knew it was there because we could see it. And that kind of made us realise that there's so much more that we can do with our technology beyond where we are relying on spores to give farmers the information that they need. 

AI is at the core of what we do. There's no nice story where you can turn AI off and use the device without it.

It is what we do. We could not offer you this kind of service if we didn't have AI running in the background.

I thought I'd put that slide up because I'm doing a talk in January at the Cleantech Forum in North America, and I did this slide for them.

Because we always talk about climate change and what climate does to the environment. 

But just from microscopic fungal diseases alone, we lose 20% of global crops, which is about a trillion US dollars.

But the problem is that we spend 20 billion US dollars producing and applying fungicides, which is 75 million metric tons of CO2 emissions, just in fixing that problem.

For the US alone, from 1992 to 2016, their yield has stayed pretty consistent at 125%. But from 2006 to 2016, they've increased their fungicide use by 600%, which means there's an over-application of fungicides. There's resistance. They don't know what they're spraying for, but they spray because they have to, and they don't know what the problem is, because they're missing that pathogen link.

And that's where we are trying to make a difference here. 

If you break this down to a New South Wales level: New South Wales alone produces 12 billion Australian dollars in crops, and we lose 20% of that, around 2.4 billion dollars, every year to fungal diseases that we are trying to give you the information to fix.

For me, it's about understanding the scale of the problem that we are tackling, and food security is at its core.

The challenge that I gave our team in the office when I started, so many years ago, was that I wanted to spend more time with my family.

And we all know that in agriculture it takes a lot because you have to be either on a tractor, you're in the field, you're making decisions, you're stressed about the weather, and you're not spending time at home.

And I wanted to make a difference. And there's no way that you can take a problem of the magnitude that we are now seeing and tackle it manually.

It had to be streamlined and mechanised so that we can get the information out quickly. There's also this massive ravine between scientific data and what happens in the real world.

And I think that's why we as farmers are sometimes so sceptical to take that information and take on technology because it's never been proven in the real world. 

And so our challenge here is to give a farmer as much information as we can, to help him or her make a decision that's going to make a difference.

And one of those was this leaf here. I'm sick and tired of using water-sensitive paper. It's the most useless piece of information there is, because unless you get it straight away, the water keeps spreading on the paper. So unless you pick it up within a minute, the water droplets keep going and it looks better than it actually was.

I want that in a digital form. This is my design. Can you make it for me? 

And Nick, our head of AI, said: but you forgot to mention AI, it'll be in there as well. And I thought, God, not again. What are you talking about? And then he started breaking down in technical terms how it would work.

And now it's something that just runs on that leaf autonomously. So what you see on that leaf is seven zones, which you can't see with the naked eye. There are seven zones on the front and seven zones on the rear of the leaf, and each of them runs independently. So that's 14 zones across the leaf. It's AI-powered.

It is 3D-printed. It's got its own solar panel. The battery lasts for about 10 years. It's GPS-tracked. But it gives you real information in the field. It gives you water coverage. It gives you leaf wetness, not as a proxy, but as real leaf wetness. Because as farmers we rely on leaf wetness to decide the risk of disease, but that is usually based on humidity, not on actual leaf wetness.

And for me, when someone says to me, a leaf is wet at 85%, but at 83%, it's not wet. It makes no sense because how can that be possible?

Now, along with that, AI will give you a forward prediction. Later this year we will have the ability to switch on a function within our units that, along with this leaf wetness, will give you a forward prediction of your disease outbreak risk. So we won't just give you spores anymore.

We'll take the weather analysis that you put in, whatever weather app you want to use, whether it's Elders or the Bureau of Meteorology, you can input that data, and it will take it from the cloud, analyse it, and tell you your disease outbreak risk.

So it'll go over a seven-day period and say: if the spore load stays where it is now, and these are your weather conditions, this is how your disease risk will look seven days ahead. This is if conditions are perfect, this is if conditions are worse, and you can base your spraying on that information in real time.
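As a sketch of the idea only: the thresholds, weights and function below are invented for this illustration and are not BioScout's actual model. A seven-day risk forecast could combine today's spore load with forecast weather roughly like this:

```python
# Illustrative only: a made-up risk score combining today's spore count
# with a 7-day weather forecast. All thresholds and weights are assumptions.
def daily_risk(spore_count, humidity_pct, leaf_wet_hours):
    spore_factor = min(spore_count / 500, 1.0)       # saturate at 500 spores
    weather_factor = (humidity_pct / 100) * min(leaf_wet_hours / 12, 1.0)
    return round(spore_factor * weather_factor, 2)   # 0 (low) .. 1 (high)

forecast = [  # (humidity %, leaf-wetness hours) for each of the next 7 days
    (60, 2), (70, 4), (85, 9), (90, 12), (88, 11), (75, 5), (65, 3),
]
spores_today = 320
risks = [daily_risk(spores_today, h, w) for h, w in forecast]
spray_days = [day for day, r in enumerate(risks, 1) if r >= 0.4]
print(risks)       # risk score per day
print(spray_days)  # days worth spraying ahead of, under this toy model
```

The real value described in the talk is that the spore count is measured rather than assumed, so a score like this reacts to what is actually in the air, not just to the weather.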

We'll also turn on functionality where all your spray inputs will automatically be uploaded into the system. So you can see the efficacy of your chemical applications on spore loads. You can see whether there's resistance against your chemicals, because there should be a drop in spore load when you spray.

I did this in New Zealand. It's being trialled now for the second year in New Zealand, and it's also being trialled at Queensland University in their strawberry fields.

What I didn't show is that the leaf can be cut into any shape we want. So whether it goes into strawberries, canola or cotton, the leaf will be cut to fit that crop to give you the information.

It's not always going to stay a vine leaf. It stays in the canopy, and it gives the operator of that unit real-time information on what goes on in the canopy. And with AI, with a spore count, it gives you the information that you need to make a decision on your outcomes. I used water-sensitive paper to trial the initial leaf, to see what's wet.

And I used an AI analysis out of the United States to work out what percentage of that leaf was wet. Now, if you pay really close attention: if that was my farm, I would say, ooh, that's pretty good, I'm happy with that spray coverage, it's even, it's all over the leaf. But only 42.52% of that block is actually wet.

There's a huge amount of that little block that's not wet. And so how can that be good spray coverage?
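A percent-wet figure like that 42.52% boils down to classifying each pixel as wet or dry and dividing. Here's a minimal sketch, with a small grid of made-up brightness values standing in for the scanned water-sensitive paper; the threshold is an assumption for illustration, not the actual tool's method:

```python
# Minimal sketch: percent-wet from a thresholded image. A real pipeline
# would work on a photo of the water-sensitive paper; this toy grid of
# brightness values (0-255) stands in, with "wet" = dark stain.
grid = [
    [30, 40, 200, 210],
    [35, 220, 230, 45],
    [50, 60, 240, 250],
    [25, 245, 235, 55],
]

WET_THRESHOLD = 100  # pixels darker than this count as wet (assumed cutoff)
total = sum(len(row) for row in grid)
wet = sum(1 for row in grid for px in row if px < WET_THRESHOLD)
percent_wet = 100 * wet / total
print(f"{percent_wet:.2f}% wet")  # prints "50.00% wet" for this toy grid
```

The point made in the talk still holds: coverage can look visually even while the computed wet fraction stays well under half, which is why a measured number beats eyeballing the paper.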

Because then someone will come out and say: ah, you've got to up your water rate. But no, it's not always about upping the water rate. It's speed as well: how fast are you going? And in real time you can now use this data. You can see it on the bottom left-hand side; I'll go to the next slide.

It's a bit bigger here. I've now used AI to work out what percentage of each of those pieces of paper is wet. Each of them is represented on the front, same as the leaf, and you can see the percentage breakdown on each of them. So what I now do is bring this, along with the information our units give you through AI, into the cab while you sit on the tractor or on the farm.

Or let's say you're a grower at Moree, I use Moree as an example, and you use a contract sprayer because your farm is that big.

You can't be there. I know what it's like when you pay a contractor: did he actually spray? Because I'm paying him all this money and there's an empty chemical drum.

But I don't know how good the application was. With this, I can see in real time what's happening.

But what I've also done now is load the actual disease outbreak risk onto each of those discs.

And the colour relates to the disease outbreak risk. So if it's green, that amount of spray coverage, with that amount of spore load in the atmosphere and those weather conditions, is going to give you good protection.

If you only achieve the red, you are not protecting your crop. You are going to get a disease outbreak, because there's not enough moisture or chemical particles on that leaf to actually protect your crop going forward.

And so for me it was all about understanding how can I make farmers' lives easier? 

I want to spend more time with my family. I've got a young family and I want to make sure that I can be less hours in there and more at home. 

And I'm going to leave you with the last picture. I've always wanted to have kids, and to be able to finally have kids, this is the slide that I think sums it up for me.

And I love big tractors, right? So I put that on there. But you can either choose option A, do things the normal way you got told and spray because you have to, or option B, use AI to detect spores along with weather data, to predict outbreaks and only spray when necessary. I'll choose option B every day.

Thank you.