GPT, RAG, Agentic, Generative, AGI, huh?
This Hot Take was written by Joel Goodman
I spent the holiday break doing some experimentation with AI. No, I didn’t restart my OpenAI subscription and start building Custom GPTs. That’s all well and good for beginners, but I am not a beginner. Instead, I pulled some open source models down to my computer and started working on building a RAFT to pair with my AI agents.
Did I lose you? This newish frontier can be overwhelming. Tech moves so fast that I’m already having backchannel conversations with web pros who have gotten saddled with a half-baked AI product their marketing leadership was convinced to buy at AMA. This community needs a bit of easy-to-digest, foundational information about AI terms so that you can call out the BS with knowledge.
An AI Terminology Glossary #
Think of this as your glossary as of January 2025 — no doubt more terms will be coined in the coming weeks. But the next time you see someone calling their chatbot “Agentic AI,” you’ll know they’ve mixed up their terms. Who knows? Maybe this will help you make better AI purchasing decisions this year (ask me about what we’re working on 👀).
LLM — Have you heard of Large Language Models? This is what powers all of this AI stuff. At a basic level, an LLM is built by feeding billions of data points, sourced from documents, websites, books, audio transcripts, and just about any other written media, to an algorithm. That algorithm is designed to detect patterns in language and then predict which words go together based on the data it was trained on. The larger the model, the more general its “knowledge.”
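To make “predicting which words go together” a little more concrete, here’s a minimal sketch using the small, open-source GPT-2 model through the Hugging Face transformers library. The prompt is my own example; any small causal language model would show the same idea.

```python
# Minimal sketch: ask a small open-source model for its top guesses at the
# next token. The prompt is illustrative; any causal LM behaves similarly.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "Prospective students visiting campus should"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # a score for every token in the vocabulary

# The last position holds the model's prediction for the *next* token.
top = torch.topk(logits[0, -1], k=5)
print([tokenizer.decode(token_id) for token_id in top.indices])
```

That’s all an LLM does at its core: score every possible next token, pick from the most likely ones, and repeat.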
AI — Yeah, you know this one. It stands for Artificial Intelligence. But did you know this is a branding term more than anything? Today’s AI is repackaged Machine Learning, which has been around for years, just on a massive scale. In particular, AI can work multi-modally, employing LLMs for text generation and computer vision for “seeing” images and video. It can also leverage symbolic reasoning for problem-solving. AI systems are generally capable of more varied inputs and are typically designed to specialize in specific tasks.
Example: Isn’t it funny how we all had ChatGPT, but we couldn’t make it super-useful until OpenAI released Custom GPTs? A Custom GPT is a separate layer of AI built on top of the models OpenAI offers.
AGI — You might have seen Artificial General Intelligence being thrown around as the next big thing. Sam Altman from OpenAI has consistently moved the goalpost on his definition of AGI. It’s a hypothetical form of AI that possesses human-like cognitive abilities across a wide range of tasks. Current AI models can’t reason — they fake it because they’re good at predicting text. Still, most machine learning engineers and researchers who are not CEOs of multi-billion dollar AI companies agree that AGI can’t happen until we crack quantum computing… and that’s a ways off.
Generative AI — This is most of what you interact with. Generative AI is a flavor of AI built (or trained) on top of large foundation models, which tend to be trained on broad datasets with little to no labeling. The GenAI model recognizes patterns and relationships within this training data and generates new content based on user input. Every AI-powered chat interface, image, video, or audio generator is Generative AI. If it’s creating something for you, it’s Generative AI.
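And here is the prompt-in, content-out loop that every Generative AI product wraps in a nicer interface. This is a sketch using the same small open-source model as above; a commercial product swaps in a much larger model behind an API, and the prompt is just an illustration.

```python
# Minimal sketch of generation: hand the model a prompt, get new text back.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")
result = generator(
    "Write a short welcome message for newly admitted students:",
    max_new_tokens=60,
    do_sample=True,  # sample instead of always taking the single likeliest token
)
print(result[0]["generated_text"])
```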
RAG — Have you seen this one? My guess is maybe not. It’s definitely more technical, but it’s how Custom GPTs and a lot of the products that learn from your data work. If you’ve ever chatted with your PDFs, that was RAG. It stands for Retrieval Augmented Generation. Essentially, your information lives in a database that gets searched, and the relevant results get sent along to the LLM, so the AI can draw on your data in addition to everything the LLM already contains. It’s one way to add some focus to a general model’s knowledge base. (And secretly, this is what you should be looking at for your institution’s operational efficiencies.)
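Here’s a rough sketch of that retrieve-then-generate pattern, using the sentence-transformers library for the search step. The documents, the question, and the final “send this to your LLM” step are all placeholders; a production setup would use a vector database and a real model call.

```python
# Rough sketch of RAG: embed your documents, retrieve the most relevant one
# for a question, then hand both to the LLM. The documents here are made up.
from sentence_transformers import SentenceTransformer, util

documents = [
    "Fall application deadlines close on January 15.",
    "Campus tours run weekdays at 10am and 2pm.",
    "Transfer students must submit transcripts from all prior schools.",
]

encoder = SentenceTransformer("all-MiniLM-L6-v2")
doc_embeddings = encoder.encode(documents, convert_to_tensor=True)

question = "When can I tour campus?"
query_embedding = encoder.encode(question, convert_to_tensor=True)

# Retrieval: find the document most similar to the question.
scores = util.cos_sim(query_embedding, doc_embeddings)[0]
best_doc = documents[int(scores.argmax())]

# Augmentation: fold the retrieved text into the prompt before generation.
prompt = f"Answer using only this context:\n{best_doc}\n\nQuestion: {question}"
print(prompt)  # placeholder: send this to whichever LLM you're using
```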
Agentic AI — AI Agents, or Agentic AI, are the next big wave of features. But I see this term tossed around a lot and applied to the wrong things. Agentic AI is not a chatbot. In fact, in most cases, you shouldn’t even notice what an AI Agent is doing until the work is finished. Agentic AI systems are designed to operate autonomously, make complex decisions, and achieve specific goals with minimal human intervention. They’re like background workers.
Example 1: Many companies are deploying AI Agents to manage data flows for them. Say someone interacts with a form on your website. An AI Agent could be set up to take the information submitted in that form, determine who it needs to be sent to, update your CRM data, and then customize the email drip campaign you would normally have triggered. Do we have that already? Sure. But this takes a lot less programming.
Example 2: Let’s say an admissions counselor at your institution takes a phone call. They could record the call and capture the person’s email; from there, an AI Agent could pull the transcript from the transcription service’s API, update the CRM data, schedule a follow-up reminder, and start the perfect drip email campaign. All without anyone prompting the Agent to do a thing.
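For the curious, here’s roughly what Example 2 could look like under the hood. Every function below is a stand-in for a real integration (transcription API, CRM, email platform), and in a real agent an LLM would read the transcript and decide which of these tools to call; the point is that the whole chain runs without anyone prompting it.

```python
# Rough sketch of the admissions-call agent. Every function is a placeholder
# for a real integration; the routing is hard-coded to keep the sketch short.

def fetch_transcript(recording_id: str) -> str:
    """Placeholder: pull the transcript from the transcription service's API."""
    return "Caller asked about the nursing program and fall deadlines."

def update_crm(email: str, notes: str) -> None:
    """Placeholder: write the contact and call notes to the CRM."""
    print(f"CRM updated for {email}: {notes}")

def schedule_follow_up(email: str, days: int) -> None:
    """Placeholder: create a follow-up reminder for the counselor."""
    print(f"Follow-up with {email} scheduled in {days} days")

def start_drip_campaign(email: str, interest: str) -> None:
    """Placeholder: enroll the contact in the matching email sequence."""
    print(f"Started '{interest}' drip campaign for {email}")

def handle_admissions_call(recording_id: str, email: str) -> None:
    transcript = fetch_transcript(recording_id)
    # A real agent would have an LLM read the transcript and pick the tools.
    interest = "nursing" if "nursing" in transcript else "general"
    update_crm(email, transcript)
    schedule_follow_up(email, days=3)
    start_drip_campaign(email, interest)

handle_admissions_call("rec_123", "prospect@example.com")
```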
I use AI Agents for all kinds of things: keeping code up to date when I make changes, and sourcing and analyzing client competitor data before generating reports. I’m also working on building out a crew of Agents to help with internal data reporting here at Bravery.
That was a lot—here’s a #protip #
Lots of information, I know. This is the thing — higher ed spends A LOT of money on new tech, usually without good foundational knowledge. I remain analytical about all of this stuff. There is great potential in deploying AI at our institutions, but I’ll be bold and say it should never replace the human touch.
If you’re putting AI in place of admissions counselors or using it to write your content, you’re thinking too small. Your writing, your content creation, and your attention to a person’s humanity are what make your institution different from others. When you replace those, you hurt your brand. And this is not the year to do that.
What’s next #
New technology demands a strategy. Goals before tools, like we always say. If you need help figuring out what that strategy is, let’s talk. Put me in touch with your CFO and CIO and we’ll map out what’s possible. Maybe your team needs some training on how to put these tools to use (or even what tools are out there). I’ve used a lot of them, and we can get you some time back to do really great work.