Text Transcript with Description of Visuals
| Audio | Video |
| Dr. Alex Kirkpatrick: Hi, Dr. Alex Kirkpatrick here. And in this video, we’ll delve a little bit more into the broad concept of artificial intelligence, which we defined as a machine-based system that can, for a given set of human-defined objectives, make predictions, recommendations, or decisions influencing real or virtual environments. This video aims to clear up some misconceptions, perhaps even some misinformation, that can derail everyday conversations about artificial intelligence and quickly send otherwise productive conversations careening into the infinite landscape of speculative fiction or overhyped threats and abilities. Along the way, we’ll keep exploring the impacts of increasingly smart algorithms on efforts to achieve agricultural sustainability. Let’s explore. | Footage fades in of a person with dark hair pulled back, sitting before a colorful gradient background, speaking into a black microphone and wearing a white vest over a black t-shirt. Text at the bottom left identifies the person as Alex Kirkpatrick, PhD. |
| [ Music ] | The screen fades to black. Footage plays, showcasing an aerial view of a combine harvester moving through a golden wheat field, leaving rows of cut straw behind it. A female farmer wearing a cap and plaid shirt uses a tablet in a sunny field, with farm machinery in the background. Another woman holds out a green leaf in a field surrounded by the same crop. Inside a greenhouse, an automated irrigation system sprays a fine mist over rows of lush, green plants. A presentation title slide features logos for Western SARE, Washington State University, USDA, and for the Ag AID Institute, along with the main text at the center of the page “Categorizing artificial intelligence.” All are set on a light background with a decorative vine on the right. |
| While the science of AI emerged in the middle of the last century, the release of ChatGPT to consumers in 2022 ignited public awareness of AI and propelled it into the mainstream. | A dark-themed interface shows the question input box of an AI large language model. A user types in the question, “Will AI help?” Below it are suggested prompts. The user clicks the arrow icon and the LLM answers the prompt, with text rolling across the screen quickly. A short clip shows a person’s hands at a white desk, writing in a notebook beside an open silver laptop. A black headset sits to the left. The laptop screen displays an AI interface with prompts. |
| But there’s far more to artificial intelligence than ChatGPT or other large language models. Agriculture is likely to employ an array of different types of AI to achieve specific goals. So, it’s helpful to understand how AIs are categorized by researchers and developers. | Dr. Alex Kirkpatrick reappears on the screen. |
| [ Music in background ] There are three major branches of AI based on capability. First, we have artificial narrow intelligence, or weak AI. This is the most important branch of research and technologies to us here and now because, quite simply, it’s the only one that exists outside of theory. The AIs we have today are each designed for a narrow, discrete function like “avoid obstacles” or “generate an image” or “recommend a show to watch.” They can’t do much outside of that narrow function, hence narrow artificial intelligence. People’s fears about the present influence of AI on workforces and society more generally are sometimes blended with the concept of artificial general intelligence, or strong artificial intelligence. Such an intelligence would be comparable to human intelligence. Without supervision, it could think across many domains like a human can. It could drive a tractor. It could reflect and write about its experiences driving a tractor. It could recognize when it was time to deploy frost mitigation in a vineyard or learn how to manage irrigation depending on multiple environmental factors. It could learn basically anything a human potentially could. It could process emotional cues and simulate emotional intelligence. It could make sense of abstract concepts like love and friendship, like we can. It’s been the focus of a lot of speculative fiction, but artificial general intelligence exists only in the public imagination for now. Then, of course, there’s a step beyond simulating human intelligence: creating a superintelligence, an entity that could surpass human intellect. | The text “Categories based on capability” is written in large black text, centered on a turquoise background. A presentation slide lists three AI capability categories on the left, with a light bulb wrapped in green leaves on the right. The text reads: “3 Categories based on capabilities. 
One, Artificial narrow intelligence or ‘weak AI.’ Two, Artificial general intelligence (AGI) or ‘strong artificial intelligence,’ with two sub points: 1. Human-like general intelligence, 2. Emotional intelligence and understanding abstract concepts. Three, Artificial super intelligence (ASI), with two sub points: 1. Surpasses human intellect, 2. Self-aware with superhuman insight.” |
| We are likely centuries away from creating machines that are more intelligent than us across all domains at once, if that’s even possible. Such a machine might, hypothetically, become self-aware and override its own control mechanisms and programming. We’ve all seen the movies and pondered the philosophical questions. But we’ll leave the concepts of general or super intelligence to the far future. We’ll be keeping our conversation concrete and fixed firmly on the realistic here and now. We’re only going to discuss weak artificial intelligence and its applications, risks, and benefits to sustainable agriculture. | All points are removed from the presentation slide except the first category, which slides to the center of the screen. It reads: 1. Artificial narrow intelligence or ‘weak AI.’ |
| Weak AI is all the AI we have today. We’re talking about machines that humans design to perform a specific task or a very narrow range of tasks, like differentiating between an apple-shaped object and a leaf-like object on a branch using sensors. Perhaps, it might also perform the additional task of extending a robot towards said apple-shaped object and retrieving it like this. | A new slide shows Dr. Alex speaking into a black microphone on the left. On the right, black text reads: Weak, narrow A.I. is the A.I. of today. Task-specific. |
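The task-specific pattern matching described above can be illustrated with a minimal sketch. This toy classifier is purely hypothetical and not the apple picker’s actual code; the feature names and thresholds are invented for illustration. The point is that a narrow AI’s entire “intelligence” can amount to a rule over sensor-derived features, with no concept of what an apple is.

```python
# Hypothetical sketch of a narrow, task-specific classifier. The features
# (roundness, redness) and thresholds are assumptions for illustration only.

def classify(roundness: float, redness: float) -> str:
    """Separate apple-shaped objects from leaf-like ones.

    Both features are assumed to be sensor-derived values in [0, 1].
    The rule is fixed: it cannot switch tasks or learn without retraining.
    """
    if roundness > 0.8 and redness > 0.5:
        return "apple"
    return "leaf"

# Same inputs always give the same answer -- pattern matching, not understanding.
print(classify(roundness=0.95, redness=0.9))  # -> apple
print(classify(roundness=0.30, redness=0.2))  # -> leaf
```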
| [ Music ] | Footage shows a row of apple trees with red apples on both sides and an autonomous harvesting machine moving down the row. The machine has multiple blue robotic arms mounted on a platform. A text overlay at the bottom left reads: TEVEL 2023 “Fruitful & Tevel partner to bring autonomous harvesting to Chile,” with a YouTube link to the footage. A smaller caption at the bottom right states: “The system consists of Tevel’s robots mounted on a ground platform created by Darwin Harvesting Group.” The Tevel logo appears in the bottom right corner. A multi-tiered robotic harvesting machine moves between narrow rows of green apple trees, their branches heavy with red apples. Several blue drone robots use suction arms to pluck red apples from the dense foliage. In the upper left, the drone’s blue body is visible as its arm reaches out to gently pick an apple. On-screen text reads: “Each apple is delicately picked using suction and carefully placed on the platform.” The large harvesting machine moves steadily through the orchard, with an array of robotic arms and drones plucking apples from the trees. On-screen text states: “Multiple robots picking efficiently side by side while collecting real-time data on every fruit picked.” A split-screen image shows a robot picking apples. The left side displays a camera view of apples on a branch with labels such as SUPER, SUPER 3.5, and SUPER 4.1. The right side shows the same apples on a black background with the same labels, showcasing the AI analysis. The text at the top says: “Selective picking customized to Unifrutti’s color grading scale.” A close-up shows a pile of freshly picked red apples moving along a white and black robotic arm. At the end of the robotic tube, a brush gently cleans the apples before they settle into the storage bin. On-screen text reads: “The onboard bins are filled with the freshly picked apples.” The drone picks an apple. 
On-screen text reads: “Each apple is gently plucked to ensure optimal quality and to avoid bruising.” A blue robotic arm with a suction cup reaches into a dense apple tree filled with green leaves and red apples. Green bounding boxes, generated by a computer vision system, highlight several apples. On-screen text in the bottom right reads: “The robot detects and picks hard-to-reach apples, hidden deep within the canopy.” The machine moves between rows of apple trees. On-screen text in a green box at the bottom-right reads: “Unifrutti is revolutionizing fruit harvesting by partnering with Tevel, paving the way for the future of agriculture in South America.” |
| An apple picker like that, or an app like Plantix, can recognize patterns such as what a disease might look like or what an apple generally looks like, but there’s no abstract thinking involved. It doesn’t really conceptualize what an apple or a disease is, or what it means for us humans or an agricultural system. It cannot jump the fences of its programming. To switch tasks, it needs to be reprogrammed or retrained, if that’s even possible. | The presentation slide reappears and shows Dr. Alex on the left. And on the right, black text reads: Weak, narrow A.I. is the A.I. of today. Task-specific. |
| So now we know that, here and now, all we have is weak AI with narrow functions and narrow parameters. But those three broad classifications don’t really tell the whole story. So, let’s look at the categories of AI that are based on functionality. | Two more bullet points are added below Task-specific. They are: “Recognizes patterns” and “Reliant on pre-programming.” The text box vanishes, and Dr. Alex’s face fills the screen. |
| There are two categories of existing AI, or weak AI, based on functionality that we’ll explore in just a moment and throughout this toolbox. Reactive AI algorithms respond solely to the immediate inputs they receive and follow preset rules to complete tasks. These systems don’t necessarily need to remember past actions, and they won’t change their approach or improve based on experience. They react the same way every time. An intelligent braking system in a vehicle might use reactive AI, responding to objects it senses, relative speeds, and so on. Early chess-playing computers like IBM’s Deep Blue are another good example. Deep Blue could analyze the chessboard state and make moves, but it didn’t learn from past games or adapt its strategy beyond what it was programmed to do. Reactive AI is limited to very specific tasks and relies on real-time data. Limited memory AI can retain and use recent data to make more informed decisions. This type of AI has reactive elements too, but adapts based on limited recent memory. Remember the self-driving tractor we looked at in the first video? It likely remembers recent obstacles in the short term, like rocks or machinery, so as to avoid them on its path or navigate around them. Chatbots can remember earlier exchanges in a conversation to provide context-aware responses, but until such models are retrained, they don’t generally retain this memory over the long term. More on limited memory and machine learning later. The final two categories are again hypothetical and not subjects for exploration here. For informational purposes only: you might hear of theory of mind AI, or even that some researchers and theorists are engaged in theory of mind research. This relates to artificial general intelligence and the notion that a machine might simulate the entirety of human intelligence. Such a machine could recognize implicit meaning and provide socially intelligent responses. Then there’s the concept of self-aware AI. This relates to superintelligence. 
We’re talking almost about artificial life, and very definitely the domain of science fiction, not reality. We are planting our conversation firmly in reality. We deal with reactive machines and limited memory AI, both categories of the only type of AI that currently exists: weak AI with narrow function. Let’s look at another example of extant AI and the sort of journalism it attracts. | The text “Categories based on functionality” is written in large black text, centered on a blue background. The categories are listed as he speaks: One, Reactive machines, with three sub points: 1. Respond to immediate inputs, 2. Follow pre-programmed rules, 3. Do not memorize or learn. Two, Limited memory, with two sub points: 1. Retains recent data, 2. No self-generated long-term memory. Three, Theory of mind, with two sub points: 1. Relates to general intelligence, 2. Recognizes and displays emotional and social intelligence. Four, Self-aware, with two sub points: 1. Relates to super intelligence, 2. A conscious super-entity. |
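The contrast between the two existing functional categories can be sketched in a few lines of code. These toy classes are hypothetical, invented here for illustration (the class names, thresholds, and the tractor scenario are assumptions, not anything from the video): a reactive system applies the same preset rule to the same inputs every time, while a limited memory system lets a short window of recent observations influence its next decision.

```python
# Illustrative sketch only: contrasting reactive AI and limited memory AI.
# All names and rules below are hypothetical.

class ReactiveBraking:
    """Reactive AI: follows a preset rule on immediate inputs; keeps no memory."""

    def decide(self, distance_m: float, speed_mps: float) -> str:
        # Preset rule: brake if time-to-collision drops below 2 seconds.
        if speed_mps > 0 and distance_m / speed_mps < 2.0:
            return "brake"
        return "cruise"


class LimitedMemoryNavigator:
    """Limited memory AI: retains recent observations to inform decisions."""

    def __init__(self, memory_size: int = 5):
        self.memory_size = memory_size
        self.recent_obstacles = []  # short-term only; nothing persists long term

    def observe(self, position: tuple) -> None:
        self.recent_obstacles.append(position)
        # Only the most recent observations are kept -- the memory is limited.
        self.recent_obstacles = self.recent_obstacles[-self.memory_size:]

    def decide(self, next_position: tuple) -> str:
        # Unlike the reactive system, this decision depends on remembered state.
        return "detour" if next_position in self.recent_obstacles else "proceed"


brakes = ReactiveBraking()
# Identical inputs produce identical reactions, every single time.
print(brakes.decide(10.0, 8.0))    # -> brake (time-to-collision is 1.25 s)

tractor = LimitedMemoryNavigator()
tractor.observe((3, 4))            # a rock was sensed at grid position (3, 4)
print(tractor.decide((3, 4)))      # -> detour (remembered obstacle)
print(tractor.decide((7, 7)))      # -> proceed (nothing remembered there)
```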
| Jake Ward: Okay. Yes, it’s a killer robot with an AI brain. But hear me out. It kills weeds. | Footage showcases the underside of a large, reddish-brown agricultural machine. Multiple rows zapping white lights briefly illuminate the ground below. The perspective looks upward at the complex mechanical structure and zapping system. On-screen text identifies the segment as an NBC News report from 2023, titled: “A.I. meets agriculture with new farm machines to kill weeds and harvest crops.” A YouTube link to the video is displayed below. |
| On a 2,000-acre organic farm in Central California, this $1.2 million machine does the work of 30 people, 24 hours a day. | The machine moves over rows of green plants in brown soil, steam rising from beneath as it kills weeds. An aerial view shows rectangular fields in green and brown, with one machine working below, white vehicles on a dirt road and a mountain range under a blue sky. |
| Paul Mikesell: The machine is thinking. It’s learning. It’s understanding what it’s seeing. | The reporter, Jake Ward, walks around a large tractor with Paul Mikesell. |
| Jake Ward: What AI is great at is telling the difference between things. In this case, the difference between chard and a weed, and then killing the weeds with lasers. CEO Paul Mikesell has invented a system that fries weeds too small for a human hand to grab in bursts that last only milliseconds. | The reporter, wearing glasses and a dark jacket, speaks in a field of green plants, with a red-and-white farm machine in the background under a gray sky. |
| The field smells like burnt popcorn. | Smoke rises from where the weeds have been burned as the machine passes by. |
| Paul Mikesell: The whole trick here is the lasers disrupt the cellular cycle within the plant with heat energy. Jake Ward: It takes a rack of servers to recognize 40 crops and 80 types of weeds. Paul Mikesell: This machine has got more computing power than 24 Teslas in it. It’s essentially a mobile data center. Jake Ward: The farm’s owner says the laser weeder will pay for itself in a single year and that it solves his single biggest problem, which is finding workers. | The reporter and Paul Mikesell are crouched next to a row of crops. |
| Rod Braga: We’re just not getting the influx of new folks that want to come in to this deal. Now, in a job like behind us on this tractor, where someone could be making, you know, a really good wage, $30 an hour, he’s got a laptop, that’s a little easier to find that person. | Rod Braga, Braga Farms President and CEO, stands in a field wearing dark-rimmed glasses and a light blue button-up shirt, speaking and gesturing toward the young green crops. |
| Jake Ward: Labor unions say they want tech to make the work easier as long as we don’t simply toss people aside after decades of brutal labor. | A tractor moves through a field, zapping weeds as it goes. |
| Antonio De Loera-Brust: To the extent that automation can make life better for farm workers by making those jobs less physically demanding, safer, more dignified, we welcome it. | Several people are crouched in a vast field of green crops manually plucking the weeds and tending the crops. Then, in a video call screen, a man, Antonio De Loera-Brust, speaks, wearing an olive-green jacket over an orange shirt, seated indoors. |
| Our concern is that automation will allow employers to basically discard them. | Workers walk through a field. |
| Jake Ward: AI is spreading through farming. There’s the broccoli bot, built by Oregon college students to harvest the vegetable. And John Deere has been plowing ahead with several kinds of AI technology. | A robot harvests broccoli, cutting it at the root. An aerial view shows a green John Deere tractor in a large, light brown tilled field, pulling a wide implement with long booms extending left and right in a zigzag pattern. |
| Jake Ward: For Paul Mikesell, it’s about more than just zapping weeds. Paul Mikesell: The data that comes out of these images will be incredibly valuable for farmers to be able to predict what’s going to happen in the future based on past action. Jake Ward: So you’re not just killing weeds, but actually harvesting data that we can use to make crops better. Paul Mikesell: Yeah, that’s right. If you talk to any farmer, they’ll tell you it’s not just about the data, but about the insight that you can get from the data. Jake Ward: Mikesell expects AI will allow robots to work in fields and factories in entirely new ways. Paul Mikesell: The capabilities are too great, and the wave is only starting right now. Jake Ward: Jake Ward, NBC News, Soledad, California. | The reporter and Paul Mikesell are crouched next to a row of crops. |
| Dr. Alex Kirkpatrick: Weak artificial intelligence with narrow functions. Machines that can react to inputs or might have limited memory. Realistic AI in the here and now. Science fact, not science fiction. | Dr. Alex speaks before he reappears on screen against the colorful gradient background. |
| [ Music ] | ‘The takeaways’ written in large black text, centered on a baby blue colored background. |
| Dr. Alex Kirkpatrick: So we’ve introduced the seven broad categories of AI: weak AI or artificial narrow intelligence, artificial general intelligence or strong AI, artificial superintelligence, reactive machines, limited memory AI, theory of mind AI, and self-aware AI. But importantly, we’ve reasoned to keep the conversation focused on what is, and not what might or might not be in the future. So, this is where our conversation about ag AI will remain focused throughout this toolbox. And this is where I strongly suggest that you keep your own professional conversations focused when talking about AI in agriculture. Weak artificial intelligence with narrow functions and parameters that might be purely reactive to real-time inputs or utilize only short-term memory limited to the task at hand. | A presentation slide is divided vertically, with Dr. Alex on the left and a list of artificial intelligence categories on the right. The right side lists six types of AI: Artificial narrow intelligence (‘weak A.I.’), Artificial general intelligence (AGI), Artificial super intelligence (ASI), Reactive A.I., Limited memory A.I., and Theory of mind A.I. |
| [ Music ] | ‘The closer’ written in large white text, centered on the black background. |
| Dr. Alex Kirkpatrick: These definitions are useful to know, and they may have some utility to you when communicating ag AI to your audiences. However, they are very broad canopies and subsume many different real-world applications. Across the rest of this course, we’ll introduce more specific models of AI as we go. Thanks for engaging. | Dr. Alex Kirkpatrick fully reappears on screen against the soft gradient backdrop. |
| [ Music ] | ‘Catch you later’ written in large white text, centered on the black background. A white informational slide displays funding acknowledgments and disclosures. Text at the top reads, National Institute of Food and Agriculture. U.S. Department of Agriculture. On the left are the USDA logo, the Western SARE logo, and the Ag AID Institute logo. To the right, black text explains the grants and required statements. Western SARE, Sustainable Agriculture Research and Education. This material is based upon work that is supported by the National Institute of Food and Agriculture, U.S. Department of Agriculture, under award number 2023-8640-39571 through the Western Sustainable Agriculture Research and Education program under project number WPDP 24013. USDA is an equal opportunity employer and service provider. Any opinions, findings, conclusions, or recommendations expressed in this publication are those of the author and do not necessarily reflect the view of the U.S. Department of Agriculture. |