Text Transcript with Description of Visuals
| Audio | Video |
| Dr. Alex Kirkpatrick: Hello. I’m Dr. Alex Kirkpatrick, and I’m a science communication researcher and educator at the Center for Sustaining Agriculture and Natural Resources at Washington State University. There’s a lot to talk about when it comes to artificial intelligence, ranging from its technical aspects to its impacts on sustainability and agriculture as a whole. But first, it’s useful to establish exactly what is and what is not artificial intelligence. In this video, we’ll define precisely what we’re referring to when we talk about AI. | Footage fades in of a person with dark hair pulled back, sitting before a pink and green gradient background, speaking into a black microphone and wearing a white vest over a black t-shirt. Text at the bottom left identifies the person as Alex Kirkpatrick, PhD. |
| [ Music ] | The screen fades to black, then more footage plays, showcasing an aerial view of a combine harvester moving through a golden wheat field, leaving rows of cut straw behind it. A female farmer wearing a cap and plaid shirt uses a tablet in a sunny field, with farm machinery in the background. Another woman holds out a green leaf in a field surrounded by the same crop. Inside a greenhouse, an automated irrigation system sprays a fine mist over rows of lush, green plants. A presentation title slide features logos for Sustainable Agriculture Research and Education (SARE), Washington State University, US Department of Agriculture (USDA), and the AI Institute for Transforming Workforce and Decision Support (Ag AID), along with the main text at the center of the page, “Defining artificial intelligence,” and the subtitle, “What are we talking about when we talk about Ag AI?”, all set on a light background with a decorative vine on the right. |
| While the science of AI dates back to the mid-20th century, digitization, exponential growth in computing power, robotic hardware and sensing, and many other advances in human understanding have propelled AI technologies into mainstream industry. | The voice of Dr. Alex Kirkpatrick, PhD, plays out over footage of two people standing next to a sprawling control panel filled with switches, dials and buttons in an industrial control room. Then a row of several tall, black server racks with perforated doors appears, revealing the stacked internal components and blinking lights. |
| AI has even crept into the daily life of the individual, leaving few of us untouched. | A large radio telescope dish against the warm colors of a sunrise. Two office workers are collaboratively looking at a computer screen, with a man leaning over a woman’s shoulder and both pointing towards the monitor in an office setting. Clips of people looking at their devices as they scroll in bed, while gathered in a living room with others, in a kitchen and other locations. This aerial image shows a sprawling city with numerous red and blue map pin icons overlaid across the illuminated streets and buildings. |
| It’s important to note that many definitions of AI exist in academia, in America, and around the world. But here’s the US government’s definition: “A machine-based system that can, for a given set of human-defined objectives, make predictions, recommendations, or decisions influencing real or virtual environments.” | Another presentation page is displayed. The right side displays a detailed photograph of an electronic circuit board, and the left features text that appears as the narrator speaks. The black text reads: U.S. Government defines A.I. as: A machine-based system that can, for a given set of human-defined objectives, make predictions, recommendations, or decisions influencing real or virtual environments. In brackets: 15 U.S.C. § 9401. |
| The code goes on to specify that an AI system uses machine and human inputs to perceive both real and virtual environments, abstract such perceptions into models through analysis in an automated manner, and use model inference to formulate options for information or action. We’ll use this definition as our definition of AI throughout these videos because it’s likely the most influential if you’re based in the US. If that feels a little clunky for everyday conversations and you’re looking for a shorthand, here’s an informal definition that I often use when talking with non-technical audiences. | Below the main definition, bullet points and sub-points appear outlining how artificial intelligence systems use machine and human inputs. The title of the section reads, Artificial Intelligence Systems use machine and human-based inputs to: A. Perceive real and virtual environments. B. Abstract such perceptions into models through analysis in an automated manner. C. Use model inference to formulate options for information or action. |
| Artificial intelligence is simply intelligence displayed by algorithms and machines. | Another presentation slide featuring a text definition of AI on the left and a close-up, greenish-tinted image of an electronic circuit board on the right. The black text on the left reads, A shorthand definition of A.I. might be: Intelligence displayed by algorithms and machines. |
| Most people recognize intelligent action when they see it, but it might raise the question: What constitutes intelligence? A shorthand I employ is to say intelligence means it can navigate its environment and make decisions and predictions without human help. | |
| Let’s consider some everyday AI examples. Netflix recommendations might not seem all that intelligent. You accidentally watch something by one actor, and the next thing you know, you’re inundated with recommendations for the rest of their back catalog. But is it artificial intelligence? Well, it’s a machine-based system. It’s a computer algorithm written by humans to mimic human decision-making about what you might want to enjoy. It gathers data on what you’ve watched and predicts what else you might like based on the presence of shared parameters like genre, actors, and other users’ ratings. And then it makes the decision on what to recommend to you without any human interference. Your feed changes as a result. It becomes tailored to you without any human oversight necessary. Humans set the objectives, but the AI goes about its task independently to affect an outcome. It’s also intelligence displayed by code, and it navigates its virtual environment appropriately to produce change in that environment. So it fits my shorthand definition, too. And the Netflix, Spotify, or Apple Podcasts recommendation algorithm is artificial intelligence. | Next, the speaker is positioned on the left side of the presentation slide. The right side of the slide contains text that appears as they speak, outlining the definition and application of a machine-based system, using Netflix recommendations as an example: A machine-based system that can, for a given set of human-defined objectives, make predictions, recommendations, or decisions influencing real or virtual environments. More text appears below as they speak, with the title ‘Netflix recommendations’ and bullet points explaining how the system works: Machine-based, human-defined objectives. Predicts preferences, makes content recommendations. Influences your Netflix feed, in brackets, virtual. A large green tick appears beside the text. |
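The recommendation logic the narration walks through can be sketched in a few lines of code. This is an illustrative toy, not Netflix’s actual system: the titles, the two “shared parameters” (genre and actor), and the scoring are all invented for demonstration.

```python
# Toy content-based recommender, echoing the narration: shared parameters
# like genre and actors drive the prediction. All titles and data are made up.

watched = {"title": "Heist Night", "genre": "thriller", "actor": "J. Doe"}

catalog = [
    {"title": "Heist Day",  "genre": "thriller",    "actor": "J. Doe"},
    {"title": "Farm Tales", "genre": "documentary", "actor": "A. Smith"},
    {"title": "Night Run",  "genre": "thriller",    "actor": "B. Lee"},
]

def similarity(a, b):
    """Count shared parameters (genre, actor) between two titles."""
    return sum(a[k] == b[k] for k in ("genre", "actor"))

# Rank the catalog by similarity and surface the top match -- the
# "decision" that changes your feed without human oversight.
recommendations = sorted(catalog, key=lambda item: similarity(watched, item),
                         reverse=True)
print(recommendations[0]["title"])  # -> Heist Day (shares genre and actor)
```

A production recommender would learn these weights from millions of viewing histories, but the shape of the task is the same: compare, score, rank, and act on the result automatically.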
| What about a common email spam filter that you might use, wittingly or not, every day? | A clip plays of a person typing on a keyboard, then a cursor hovering over a spam button on a computer. |
| Well, it’s certainly machine-based, and the programmers defined its task and purpose, which is to identify and separate spam emails from legitimate ones. A spam filter predicts whether an incoming email is spam or not based on parameters like keywords, metadata, and sender information, and it takes action without human interference. It decides on a balance of probabilities how to categorize your emails and into which folder they should be filtered. So yes, a common everyday email spam filter is artificial intelligence. | The next slide again features the speaker on the left side of the slide, with text on the right defining a machine-based system using an email spam filter as an example. Text reads: A machine-based system that can, for a given set of human-defined objectives, make predictions, recommendations, or decisions influencing real or virtual environments. More text appears below as he speaks. The main title is Email spam filter. Below the title, there are three bullet points explaining how the system works: it’s machine-based with human-defined objectives, predicts spam using training parameters, and filters emails into appropriate folders. A large green tick appears beside the text. |
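The “balance of probabilities” decision described above can be sketched as a tiny keyword scorer. This is a deliberately simplified illustration: real filters learn their weights from training data rather than using a hand-written list, and the keywords, weights, and threshold below are all invented.

```python
# Minimal sketch of how a spam filter scores an email. The keyword weights
# and threshold are hypothetical; real filters learn these from labeled data.

SPAM_WEIGHTS = {"winner": 2.0, "free": 1.5, "urgent": 1.2, "invoice": 0.4}
THRESHOLD = 2.5  # assumed cutoff: higher total evidence -> spam folder

def classify(email_text: str) -> str:
    """Predict 'spam' or 'inbox' from keyword evidence, no human in the loop."""
    words = email_text.lower().split()
    score = sum(SPAM_WEIGHTS.get(w, 0.0) for w in words)
    return "spam" if score >= THRESHOLD else "inbox"

print(classify("URGENT you are a winner claim your free prize"))  # -> spam
print(classify("Please find the invoice attached"))               # -> inbox
```

The filter predicts a category and then files the message on its own, which is exactly the “decision influencing a virtual environment” the definition calls for.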
| You see, AI is already ubiquitous. Most of us are AI users. But polling suggests that most Americans don’t think they use AI at all, even when they use it every day. This phenomenon actually has a name. It’s called the AI effect. As an AI technology becomes more familiar, more everyday, users tend not to perceive AI’s presence in it. It can be useful to ground any conversations about AI by pointing to everyday applications like content recommendation algorithms or spam filters. People have a tendency to spring into the abstract world of science fiction or the far future when talking about AI, which can distract from the concrete here and now that often demands our attention. It can be helpful to refocus audiences on more mundane AI applications and make people aware of the likely truth that they are already AI adopters. Just like I did right then. But what about agricultural AI specifically? Well, allow me to prime your thinking by first showing you this shamelessly promotional video from John Deere about an autonomous tractor system. | Dr. Alex Kirkpatrick, PhD, fully reappears on screen against the soft gradient backdrop. |
| Doug Nimz: When you started out in the spring, you work the soil, and it just smells so fresh. When you till it up and it’s just the greatest smell. | Footage shows a man wearing a cap, a light-colored jacket, and blue jeans, walking in a field of dry, harvested corn under an overcast sky. A text box in the bottom-left corner identifies the footage as ‘John Deere 2022, Autonomous 8R Tractor’ and includes a YouTube link for a video. |
| When I started farming, there basically was no technology. Every tractor was driven manually. Everything was done manually. You’d be planting. You had to follow a line. If the sun was wrong, you would lose the line. Darkness, you couldn’t see your marks. Moisture, you couldn’t see your marks. Then you’d get squiggly rows. | He unravels a corn husk. Then, archive footage shows a tractor being driven in a vast field. |
| My name is Doug Nimz. I’m a farmer from Blue Earth, Minnesota. I’m a fourth-generation farmer, and I raise approximately 2,000 acres of corn and soybeans. | The man now faces forward. He is wearing a cap and a light-colored, button-up shirt stands in front of a large green John Deere tractor. |
| I really never thought I would see an autonomous tractor in my farming career. For me, it was really exciting the first time I got to take the autonomous tractor to the field, swipe my phone, watch the tractor start with no one in the cab. Start doing tillage, come to the end of the field, turn on the end, do the tillage just as well as I can do myself, with no one in the cab. I can pull up the app. I can monitor the tractor, see how much of the field it’s gotten tilled. I can check the fuel level. I can check the app to see how much of the field is left. If there was something in the field that it wasn’t sure about, the tractor will stop and alert me. Is this something I can go around? Do I need to go out and remove an object from the field? The app gives me all this information so I can monitor everything very closely. On farms, labor is always a challenge. | Doug is walking through a dry, tilled field while holding a mobile device in his hand. Doug taps his device and looks up as the driverless tractor begins to move across the field, pulling a tilling implement. The large, modern John Deere tractor slows, and Doug retrieves his phone from his shirt pocket and looks at it. |
| You need labor or lots and lots of hours for short periods of time. | Aerial footage captures a green John Deere tractor driving down a dirt access road between two large agricultural fields, kicking up dust behind it. It turns into one of the fields on the left. |
| The auto steer and technology has helped reduce our labor load, which makes my life a lot easier. Autonomy will help because we will be able to put a tractor out in the field and let it run for 24 hours a day because it’s not manned. But it also helps us with the weather because we can run so hard when soil conditions are fit. The thing that excites me the most about autonomy is not being locked in the tractor cab all day. It will just allow me to run my business better because I can just pay closer attention to other tasks. Now, we’ll be doing the jobs that we always wanted to get done but never had time to because we were in the cab all the time. | |
| Farmers are fairly traditional, but I have a feeling that once they try it, they will become very accepting of it. I think the tractor can do a better job than I can do. Autonomy, it’s going to be a life changer for me. | A woman smiles as she gets out of a tractor cab. Then Doug shows a smartphone screen to two women as they sit smiling together on the front steps of a house. Later, he stands with an older man as they converse with a tractor behind them. |
| Dr. Alex Kirkpatrick: A tractor like that in the John Deere advert, or the autonomous concept from CNH that you see there in the picture on your screen, isn’t AI in and of itself. We all know that. But the computer system operating all autonomous vehicles is AI-based. The system is coded to mimic human control by processing data from multiple sensors, such as GPS for location tracking, and cameras and LiDAR for detecting objects and terrain, measuring distances, etc. The farmer themselves can also define some of the goals to meet their own individual needs. | Dr. Alex Kirkpatrick speaks, as a new presentation slide appears featuring a red autonomous tractor operating in a field on the left, with text on the right that reads, “A machine-based system that can, for a given set of human-defined objectives, make predictions, recommendations, or decisions influencing real or virtual environments.” The bullet points appear below the title, “Autonomous tractors,” as Dr. Alex lists them out: Computer system interprets sensors. Programmers slash users define goals. Uses sensing and training data to predict and make decisions. And lastly, Operates the tractor as a human intelligence might. |
| A smart algorithm can then make decisions about navigation, obstacle avoidance, plowing, seeding, harvesting, all based on a dynamic awareness of its surroundings. It senses something in the way, predicts it’s a rock based on training data, and makes the decision to slam on the brakes or steer aside. It doesn’t need human interference if it’s working properly. In this way, it’s acting appropriately and with foresight in its environment, simulating that portion of human intelligence it takes to operate the tractor effectively. So it fits the definition of AI, of course. | A short animation illustrates an autonomous vehicle utilizing LiDAR (Light Detection and Ranging) technology, with blue light beams emanating from the white car as it drives down a road. The presentation slide with the red tractor, along with the text, reappears. |
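The sense-predict-decide loop described above can be sketched as a simple decision rule. Everything here is hypothetical: the function name, the 0.5 confidence threshold, and the 5-meter braking distance are invented for illustration, and a real tractor controller fuses many sensors and runs far more sophisticated models.

```python
# Hypothetical decision logic for the tractor example: a perception model
# outputs a confidence that an obstacle is real, and the system then acts
# on its own -- brake, steer around, or carry on.

def react(obstacle_probability: float, distance_m: float) -> str:
    """Choose an action from a prediction; names and thresholds are invented."""
    if obstacle_probability < 0.5:
        return "continue"       # likely a false sensor reading
    if distance_m < 5.0:
        return "brake"          # too close to steer around safely
    return "steer_around"

print(react(0.9, 3.0))   # -> brake
print(react(0.9, 20.0))  # -> steer_around
print(react(0.2, 3.0))   # -> continue
```

The key point survives the simplification: the prediction comes from training data, and the resulting decision changes the real environment without a human in the loop.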
| But AI doesn’t need a robot to act upon the real world. It doesn’t even need to navigate the real world at all to simulate human intelligence. Watch this quick video from agtech company Plantix. | A large green tick appears beside the text. Dr. Alex Kirkpatrick fully reappears on screen against the soft gradient backdrop. |
| [ Music ] | An aerial view of a rural agricultural landscape with rolling green fields, tilled soil, and rows of dense crops. A mountain range is visible in the background under a cloudy sky. An overlay of text at the center reads, Plantix, a mobile application, revolutionizes farming with a little help from A.I. |
| Dr. Srikanth Rupavatharam: Agriculture is the oldest occupation, and more than half of the globe’s population is directly or indirectly dependent on that livelihood of agriculture. | A text box in the top-left corner identifies the footage as Plantix 2019, A mobile app revolutionizes farming with a little help from A.I., and includes a YouTube link for the video below. A man with dark hair wearing a purple collared shirt stands outdoors in an agricultural field. The background consists of numerous tall, thin wooden poles, likely used for supporting plants, with some green foliage and trees visible further back. Text in a box below identifies him as Dr. Srikanth Rupavatharam, Director of Agriculture Science for ICRISAT, and provides information about an app called ‘Plantix’. |
| It is mainly comprised of farmers coming from the developing world, having less than one or two acres of land. | A clip captures several people working among rows of tall, wooden poles supporting green plants in an agricultural field. A woman leans down as she works on the crops. |
| The major problem that the farmers around the world face is losses because of pest or a disease or a nutrient deficiency. They depend on the local information that they get from other farmers or experts like extension workers from the agriculture departments. Can they reach everyone? No. And here comes a digital tool called Plantix. | Hundreds of birds fly from a vast plowed field. A group of people, mostly men, are gathered outdoors in an agricultural field, appearing to be attending a presentation led by one man on the left. Dr. Srikanth Rupavatharam reappears on screen. |
| Robert Strey: The whole idea of Plantix is that we can use modern technologies to help farmers everywhere in the world. We use machine learning, artificial intelligence, and big data to derive insights. But this might sound a bit abstract to you, so let me give you an example. | A man with a Mohawk-style haircut wearing a black long-sleeved shirt, in a setting with green plants and hanging objects. Text in a box below identifies him as Robert Strey, CEO and Co-Founder. |
| Let’s say you see a sick crop in your field. You can just, like, easily take an image with Plantix. This image is sent to our servers, analyzed by deep neural networks, and we will tell you, on the spot and instantly, what is wrong with your plant. | A clip shows a person in a blue and white striped shirt walking in between rows of green crops, then kneeling and holding a smartphone close to a row of green plants. |
| Alexander Kennepohl: So, almost every plant disease and pest is leaving a specific pattern, for example, on the leaves. And this can be just the yellowing of the margins, or this can be some kind of lesion or some kind of spots in any kind of form and color. | A man wearing a white t-shirt and glasses speaks in an office setting, with text to the left identifying him as Alexander Kennepohl. |
| Robert Strey: More than five million images have been sent by our users. We can automatically recognize more than 350 different plant pests and diseases. And we can play back this information to the farmer when he needs it direct on the spot. Alexander Kennepohl: With the detection of the problem, the problem is, in most cases, not solved. That’s why we are providing biological treatments as well as chemical treatments and also cultural practices, so that the users can learn how to prevent the damages in the next season. | Robert Strey reappears on the screen. Screen flashes through images of farmers in fields using the Plantix application. Alexander Kennepohl reappears. Screen flashes through images of farmers spraying their plants and spreading fertilizer. |
| A farmer, Sandeep Shinde, speaks in another language. The translation reads, “It is very easy to use this app. You need to have a 3G-enabled Android phone with an active internet connection. Once you download the Plantix app from Play Store and save it, you can easily use it.” He continues, “I would like to tell all the farmers that using this app can help you to reap more benefits and income from farming. You all should definitely use this user-friendly app.” | The image shows a rural scene with thatched huts and simple structures. Chickens wander across the dry, rocky ground. A large leafy tree stands at the center. A bearded man in a bucket hat and a blue-and-white striped shirt stands on the right. To the left is a green agricultural field and distant hills. Text to the left side identifies him as, Sandeep Shinde, Farmer, Talewad Village, Junnar Taluka, Pune district. Sandeep walks in a field of cabbage plants. He crouches among rows of large, green cabbage plants and uses the Plantix app on his phone to inspect the crop he’s touching. |
| Simone Strey: Everything we do is dedicated to the farmers’ need. So we want to support them to increase the quality and quantity of their yield. Our vision for Plantix is to become the worldwide biggest farmer community and to provide farmer knowledge anywhere at any time. | A woman with short blonde hair pulled back, wearing a dark collared shirt, in an outdoor setting with green foliage and buildings behind her. Text identifies her as Simone Strey, CEO and Founder of the Plantix app. |
| Dr. Alex Kirkpatrick: Apps like Plantix use a form of AI called machine learning, which is covered more in other videos. Although disembodied, existing in some static data center or cloud, the algorithm is definitely a machine. It’s code. It can navigate the virtual realm. Humans write the code, define the parameters of the data set and the allowable outputs to achieve human desires like spotting diseases. The algorithm is then expected to work pretty much independently without much human interference. The algorithm can predict whether or not a disease is present based on the image. It can decide to tell the user, yes, this is a disease, or no, it’s probably not. | Another presentation slide is split vertically: the left shows a woman using a teal smartphone in a green agricultural setting. The right side has a white background with black text and bullet points. The paragraph above the points reads, A machine-based system that can, for a given set of human-defined objectives, make predictions, recommendations, or decisions influencing real or virtual environments. |
| The algorithm has to act appropriately in the virtual realm in order to refer to its limited memory and signal its decision to the user through the app. In other words, it influences the virtual environment. So yes, it’s AI. | The points below the title, Plant Village, Plantix, and other plant slash disease recognition apps, read, Disembodied algorithm. Recognizes specific diseases in specific crops. Predicts the likelihood of disease. And lastly, Signals its decision to the user. Each point appears as Dr. Alex speaks. |
| So what about CROPWAT, or any number of other fixed models or predictive computer software that can help calculate things like irrigation requirements? Well, it’s computer software. So check. It’s a machine. Humans designed it and defined what it should do. So check again. But it doesn’t learn anything from the data, really. It recognizes no patterns. It’s essentially a big calculator unresponsive to context. It certainly doesn’t make any flexible decisions at all. Static data goes in, a pre-defined calculation takes place, numbers come out based on a complex yet rigid model, and the machine itself doesn’t influence its virtual environment based on any decision it comes to independently. Unlike Plantix, it follows the same process every time. Human input, crunch, useful displayed output for a human to interpret and make decisions. Nothing is learned. It can only display what its limited coding lets it display and cannot respond to live weather or market conditions. Useful, but not artificial intelligence. | A large green tick appears beside the text. A split presentation slide: on the left, Dr. Alex appears speaking into a black microphone against a soft gradient background. On the right, black text on a white background reads: “A machine-based system that can, for a given set of human-defined objectives, make predictions, recommendations, or decisions influencing real or virtual environments. CROPWAT decision support tool.” Four bullet points appear, one: Computer software. Two: Calculates irrigation requirements based on a fixed model of inputs. Three: Uses deterministic formulas, and four: Requires human interference. A red X appears beside the text. |
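The contrast drawn above can be made concrete with a sketch of a fixed-model calculation. This is not CROPWAT’s actual code, just an illustration of the general pattern: the net irrigation requirement is a standard deterministic formula (crop evapotranspiration minus effective rainfall), and the inputs below are made-up numbers.

```python
# A CROPWAT-style fixed calculation: the same inputs always produce the
# same output, and the program learns nothing between runs. Illustrative only.

def irrigation_requirement(et_crop_mm: float, effective_rain_mm: float) -> float:
    """Net irrigation need (mm): IR = ETc - Pe, floored at zero."""
    return max(et_crop_mm - effective_rain_mm, 0.0)

# Deterministic: no prediction, no independent decision, no learning.
print(irrigation_requirement(120.0, 45.0))  # -> 75.0
print(irrigation_requirement(120.0, 45.0))  # same inputs, same answer: 75.0
```

Notice what is missing compared to the Plantix example: no model inferred from data, no pattern recognition, and no action taken on the result. A human reads the number and decides, which is why the green tick becomes a red X.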
| [ Music ] | ‘The takeaways’ written in large black text, centered on the turquoise colored background screen. |
| So, here are some key points to take from Video 1. AI is everywhere already. Most of you are users, and most other people are, too, even if they don’t know it. AI’s use in agriculture is growing. It’s more than just hype at this stage. It’s already here. Most importantly, we set our formal definition of AI in keeping with the US government’s definition in 15 U.S.C. § 9401. Very helpful to keep in mind when recognizing what is and what isn’t AI. I also offered a shorthand, informal version that might be useful when referring to AI with your audiences. It’s simply intelligence displayed by algorithms and machines. And intelligence means it can navigate its environment and make decisions and predictions without human help. Important to always know that virtual environments are still environments. To an AI, every environment is simply a simulation. | A white slide shows black bullet-point text on the left defining AI and its growing role in agriculture. The right side displays a glowing light bulb graphic with green leaves and a vine wrapping around it, with faint vine patterns in the background. The text reads: A.I. is already a commonplace technology. The market for ag A.I. is growing rapidly. A.I. is a machine-based system that can, for a given set of human-defined objectives, make predictions, recommendations, or decisions influencing real or virtual environments. Or: Intelligence displayed by algorithms and machines. Intelligence means it can navigate its environment and make decisions and predictions without human help. |
| [Music in background] There’s a lot more to still talk about. You can advance to Video 2 at your leisure, or indeed watch any of the videos in this toolbox in whatever order best suits you. Thanks for your energy and your engagement. | “The closer” written in large white text, centered on the black background. Dr. Alex Kirkpatrick fully reappears on screen against the soft gradient backdrop. |
| [ Music ] | “See you in the next video” written in large white text, centered on the black background. A white informational slide displays funding acknowledgments and disclaimers. Text at the top reads, National Institute of Food and Agriculture, U.S. Department of Agriculture. On the left are the USDA logo, the Western SARE logo, and the Ag AID Institute logo. To the right, black text explains the grants and required statements, including: “This material is based upon work that is supported by the National Institute of Food and Agriculture, U.S. Department of Agriculture, under award number 2023-38640-39571 through the Western Sustainable Agriculture Research and Education program under project number WPDP-24-013. USDA is an equal opportunity employer and service provider. Any opinions, findings, conclusions, or recommendations expressed in this publication are those of the author and do not necessarily reflect the view of the U.S. Department of Agriculture.” |