Welcome to DevScript Saga’s AI learning path. In this very first post I’ll explain the key terms and concepts. No jargon—just plain ideas.
What is Gen AI (Generative AI)?
Gen AI is a type of artificial intelligence that generates new content, such as text, images, audio, video, and code, based on patterns it learned from its training data.
What is an LLM (Large Language Model)?
An LLM is a powerful artificial intelligence model designed to understand and generate human-like text.
It learns patterns of language—grammar, facts, reasoning, style.
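At its core, an LLM is trained to predict the next word (token) given the words so far. As a very rough intuition, here is a toy next-word predictor built from bigram counts over a tiny made-up corpus; real LLMs use neural networks over billions of examples, not simple counting:

```python
from collections import Counter, defaultdict

# Tiny made-up "training data" for our toy model.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count how often each word follows each other word (bigram counts).
next_counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    next_counts[prev][nxt] += 1

def predict_next(word):
    """Return the word most often seen after `word` in the corpus."""
    return next_counts[word].most_common(1)[0][0]

print(predict_next("the"))  # "cat" follows "the" most often here
```

An LLM does something conceptually similar, but it learns rich patterns (grammar, facts, style) instead of raw counts, so it can continue text it has never seen before.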
What is a Prompt?
A prompt is the text you send to the model. It can be a question, an instruction, or a few examples. The model reads the prompt and generates a response based on it. Good prompts are clear and specific; vague prompts often lead to vague or off-target answers.
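To make "clear and specific" concrete, here is a small sketch of assembling a prompt from parts before sending it to a model. The `build_prompt` helper and its fields are just an illustration, not any library's API:

```python
def build_prompt(role, task, constraints):
    """Assemble a clear, specific prompt from its parts."""
    return f"You are a {role}. {task} {constraints}"

# Vague: the model has to guess what you want.
vague = "Tell me about Python."

# Specific: role, task, and constraints are all spelled out.
specific = build_prompt(
    role="Python tutor",
    task="Explain list comprehensions to a beginner in 3 sentences,",
    constraints="with one short code example.",
)
print(specific)
```

The vague version could get anything from history to installation tips; the specific version pins down the audience, the topic, and the length.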
What are Embeddings?
Embeddings are a way to turn text (or other data) into a list of numbers (a vector). Similar meanings get similar vectors: “dog” and “puppy” might have nearby vectors, while “dog” and “laptop” would be farther apart. We use embeddings for search by meaning (semantic search): finding sentences or chunks that are similar in meaning to a query, not just ones that match its keywords.
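"Nearby vectors" is usually measured with cosine similarity. Here is a minimal sketch with made-up 3-dimensional vectors (real embeddings have hundreds or thousands of dimensions, produced by a model):

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: closer to 1.0 = more similar."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Made-up toy "embeddings" for illustration only.
dog = [0.9, 0.8, 0.1]
puppy = [0.85, 0.75, 0.15]
laptop = [0.1, 0.2, 0.9]

print(cosine_similarity(dog, puppy))   # close to 1.0
print(cosine_similarity(dog, laptop))  # much lower
```

Semantic search simply embeds the query, computes this similarity against every stored chunk, and returns the highest-scoring ones.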
What is RAG (Retrieval-Augmented Generation)?
RAG is a way to make an LLM answer using your own data (e.g., a PDF, a knowledge base) instead of only its training data.
- Retrieval: finding the right facts in your own knowledge source.
- Augmented: adding those facts to your question as context.
- Generation: letting the LLM write a response based on that fresh data.
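The three steps above can be sketched in a few lines. This toy version scores documents by word overlap as a stand-in for embedding similarity, and the documents and question are invented for the example:

```python
# A tiny knowledge base: in a real system these would be chunks of your PDFs.
documents = [
    "Our office is open Monday to Friday, 9am to 5pm.",
    "Refunds are processed within 7 business days.",
    "The support email is support@example.com.",
]

def retrieve(question, docs):
    """Retrieval: pick the doc sharing the most words with the question
    (a crude stand-in for embedding similarity)."""
    q_words = set(question.lower().split())
    return max(docs, key=lambda d: len(q_words & set(d.lower().split())))

def build_rag_prompt(question):
    """Augmentation: stuff the retrieved fact into the prompt."""
    context = retrieve(question, documents)
    return f"Answer using this context:\n{context}\n\nQuestion: {question}"

# Generation: this augmented prompt would then be sent to an LLM.
print(build_rag_prompt("How long do refunds take?"))
```

The LLM now answers from your data (the retrieved context) rather than relying only on what it memorized during training.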
What is Agentic AI?
Agentic AI refers to AI systems that can autonomously plan, decide, and execute actions to achieve specific goals without constant human intervention.
Example: “Book a flight and a hotel for next week” → the agent might search, compare, and then use booking tools. That’s agentic behavior.
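The shape of that behavior is a loop of "pick a tool, call it, use the result." Here is a heavily simplified sketch with hypothetical tools and a hard-coded plan; in a real agent, the LLM itself decides which tool to call next based on the goal and earlier results:

```python
# Hypothetical tools the agent can call; real agents wrap actual APIs this way.
def search_flights(dest):
    return f"Cheapest flight to {dest}: $320"

def search_hotels(dest):
    return f"Top-rated hotel in {dest}: $90/night"

TOOLS = {"search_flights": search_flights, "search_hotels": search_hotels}

def run_agent(goal, dest):
    """A fixed two-step 'plan'. A real agent would ask the LLM to choose
    each next step, then feed the tool results back in."""
    plan = ["search_flights", "search_hotels"]
    return [TOOLS[step](dest) for step in plan]

for line in run_agent("Book a trip for next week", "Paris"):
    print(line)
```

The key difference from a plain chatbot is that the model's output drives actions (tool calls), not just text.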
What are Transformers?
Transformers are the type of neural network architecture that most modern LLMs (including GPT and many others) are based on. The main idea is “attention”: the model looks at all the words in the input together and learns which ones to focus on when producing each next word. That helps it handle long text and capture context and relationships between words. So when people say “transformer models” or “transformers,” they often mean these large, attention-based language models that power ChatGPT and similar systems.
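The attention idea can be shown numerically: each word's value vector is weighted by how well its key matches the current query. This is a minimal scaled dot-product attention for a single query, with made-up 2-dimensional vectors:

```python
import math

def softmax(scores):
    """Turn raw scores into weights that sum to 1."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attention(query, keys, values):
    """Scaled dot-product attention for one query: weight each value
    by how well its key matches the query."""
    d = len(query)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    weights = softmax(scores)
    # Weighted sum of the value vectors.
    return [sum(w * v[i] for w, v in zip(weights, values))
            for i in range(len(values[0]))]

# Three "words", each with a made-up 2-d key and value.
keys = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
values = [[10.0, 0.0], [0.0, 10.0], [5.0, 5.0]]
query = [1.0, 0.0]  # this query "matches" the first word's key best

print(attention(query, keys, values))
```

Words whose keys align with the query get larger weights, so the output is pulled toward their values; real transformers run this for every position, with many attention heads and learned projections, but the core computation is this weighted mixing.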
Quick Recap
AI — Software that does tasks that need human-like intelligence.
Gen AI — AI that generates new content (text, images, etc.).
ChatGPT — A chatbot product from OpenAI, built on an LLM.
LLM — A large model trained on text that can continue or generate text from a prompt.
Prompt — The text you send to the model (question or instruction).
Embeddings — Numerical representations of text used for semantic search and similarity.
RAG — Retrieve relevant docs, add them as context, then let the LLM generate an answer.
Agentic AI — AI that plans and uses tools over multiple steps to reach a goal.
Transformers — The architecture behind most modern LLMs (attention over sequences).
In the next blog, we will start building a RAG chatbot and learn these concepts along the way.

