THE
Generative AI
Practitioner
Build Real AI Applications · Master Prompt Engineering · Deploy with Python
Muhammad Rustam · Founder & Lead Instructor
AI By Tech Academy · Karachi, Pakistan · aibytec.com
Table of Contents
Welcome from AI By Tech
aibytec.com · Karachi, Pakistan · Batch 2026
AI By Tech (aibytec.com) is Pakistan's applied AI education platform, dedicated to training the next generation of AI practitioners, developers, and architects. We don't just teach theory — every class, every chapter, and every project in this book is built around real tools, real workflows, and real deployments.
This book — The Generative AI Practitioner — is the complete companion text for Certificate 1 of the AI Career Ladder program. It is designed for one purpose: to take any motivated person from zero knowledge of AI to building and deploying their own AI-powered applications in eight weeks.
Knowledge increases by sharing, not by saving.
— Muhammad Rustam, Founder · AI By Tech Academy

The AI revolution is happening now. It is not coming. It has arrived. The students who understand Generative AI at the practitioner level today will be the architects, developers, and entrepreneurs who shape the next decade. This book is your on-ramp.
- Each chapter maps to one or more class sessions from the 8-week curriculum.
- Every concept is followed immediately by a Try It Now hands-on exercise.
- All code examples run in Google Colab โ no installation required.
- Chapter summaries, key terms, and review questions end every chapter.
- Basic computer literacy โ you can use a browser, create files, send emails.
- A Google account โ for Google Colab (free, runs in browser).
- No prior coding experience required. Python basics are taught from scratch.
- Curiosity and willingness to experiment. AI rewards learners who try things.
Foundations
What is Generative AI?
The Revolution That Is Reshaping Everything
We are living through a genuine inflection point. Not another hype cycle — a structural shift in how humans create, communicate, and build. Generative AI is the engine of this shift. This chapter gives you the mental model and vocabulary you need to understand it clearly.
1.1 The Three Waves of AI
| Era | Wave | What Changed | Example |
|---|---|---|---|
| 1950s–1990s | Rule-Based AI | Programmers wrote explicit rules. Computers followed them. No learning. | Chess programs, expert systems |
| 1990s–2015 | Machine Learning | Computers learned patterns from data. No explicit rules needed. | Spam filters, image recognition |
| 2017–present | Generative AI | Computers don't just recognize patterns — they create. | ChatGPT, Claude, Midjourney |
Generative AI is a class of artificial intelligence that creates new content — text, code, images, audio, video, or data — by learning patterns from enormous amounts of existing human-created content.
It does not just analyze or classify. It generates. It creates. It produces. These systems have learned so much about human language, logic, and creativity that they can produce outputs indistinguishable from — and sometimes superior to — human work.
1.2 Why 2025 Is Different — The Convergence
Three independent trends converged simultaneously for the first time in history:
1.3 The Generative AI Family
1.4 The Models You Will Use
| Model | Strength | When You'll Use It |
|---|---|---|
| ChatGPT (OpenAI) | General purpose, huge ecosystem | Writing, coding, analysis — Chapters 2–4 |
| Claude (Anthropic) | Long context, structured output | Document analysis, complex reasoning — throughout |
| DeepSeek | Efficient, strong at coding | Code generation, cost-conscious apps |
| Llama 3 (Meta) | Open source, free | Via Groq API for high-speed apps — Chapter 5 |
| Mixtral (Mistral) | Fast, multilingual | Production apps needing speed & low cost |
| Gemini (Google) | Multimodal — text + images | Image understanding, Google Workspace integration |
- Freelancing: AI developers earn $25–$100/hour on Upwork/Fiverr for chatbots and automation.
- Enterprise AI Consulting: Pakistani companies are investing in AI automation — consultants are in demand.
- AI-Augmented Services: Marketing, content, translation, legal — every service becomes 10x faster.
- Remote Jobs: AI skills qualify you for remote roles paying international salaries.
Chapter 1 Summary
How Large Language Models Work
Transformers, Tokens, Temperature & Training
You don't need a PhD to use Generative AI effectively. But understanding how LLMs work at a conceptual level — what they see, how they think, what their limitations are — makes you dramatically better at using them. No mathematics required.
2.1 Tokens — What AI Actually Reads
When you type a message to ChatGPT or Claude, the AI doesn't see your words. It sees tokens. A token is roughly 3–4 characters, or about 75% of a word on average.
"Hello" → 1 token | "Generative" → 2–3 tokens | "Muhammad Rustam" → 4 tokens
Rule of thumb: 1,000 tokens ≈ 750 words ≈ 1.5 pages of text.
GPT-4 has a context window of 128,000 tokens — about 200 pages of text at once.
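The rule of thumb above can be turned into a quick back-of-the-envelope estimator. This is only a heuristic — a real tokenizer (such as OpenAI's tiktoken library) gives exact counts; the 0.75-words-per-token figure and the function names below are this book's approximation, not a library API.

```python
def estimate_tokens(text: str) -> int:
    """Rough token estimate: ~1 token per 0.75 words (heuristic, not exact)."""
    words = len(text.split())
    return round(words / 0.75)

def estimate_pages(tokens: int) -> float:
    """1,000 tokens ≈ 750 words ≈ 1.5 pages (rule of thumb from this chapter)."""
    return tokens / 1000 * 1.5

prompt = "Generative AI creates new content by learning patterns from data."
print(estimate_tokens(prompt))   # 10 words → roughly 13 tokens
print(estimate_pages(128_000))   # a 128k-token context ≈ 192.0 pages
```

Use an estimator like this to sanity-check whether a document will fit inside a model's context window before sending it.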
2.2 The Transformer Architecture
Every major LLM today — GPT-4, Claude, Gemini, Llama — is built on the Transformer architecture, invented by Google researchers in 2017 in a paper titled "Attention Is All You Need."
Before Transformers, AI models read text sequentially — word by word, left to right. By the time they reached word 100, they had nearly forgotten word 1. Transformers changed this by introducing attention — the model looks at all words simultaneously and decides which are most relevant to each other.
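The attention idea can be sketched in a few lines of plain Python: every position's vector is compared with every other position's vector, and the resulting scores (after a softmax) decide how much each position "attends" to the others. This is a toy illustration of scaled dot-product attention with made-up 2-D vectors — not the full multi-head Transformer.

```python
import math

def softmax(xs):
    """Turn raw scores into probabilities that sum to 1."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(query, keys, values):
    """Scaled dot-product attention for one query over a short sequence."""
    scale = math.sqrt(len(query))
    # Compare the query with every key (dot product), scaled for stability
    scores = [sum(q * k for q, k in zip(query, key)) / scale for key in keys]
    weights = softmax(scores)  # how much the query attends to each position
    # Output is the attention-weighted average of the value vectors
    return [sum(w * v[i] for w, v in zip(weights, values)) for i in range(len(values[0]))]

# Three toy 2-D "word" vectors; the query matches the first key most strongly
keys = values = [[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]]
out = attention([1.0, 0.0], keys, values)
print(out)  # the output leans toward the first value vector
```

The key point: nothing here is sequential — all positions are scored against each other at once, which is what lets Transformers keep track of word 1 while reading word 100.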
2.3 Temperature — Creativity vs Consistency
| Temperature | Effect | Best Use |
|---|---|---|
| 0.0 | Fully deterministic — same input = same output every time | Fact extraction, classification, data parsing |
| 0.3–0.5 | Mostly consistent with slight variation | Summarization, translation, professional writing |
| 0.7–0.8 | Balanced — the default for most apps | General chat, Q&A, content generation |
| 1.0–1.2 | Creative and varied | Brainstorming, story generation, ideation |
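The values in the table are easier to grasp with a small simulation. Before sampling the next token, an LLM divides its raw scores (logits) by the temperature and applies a softmax: low temperature sharpens the distribution toward the top token, high temperature flattens it. The logits below are invented for illustration.

```python
import math

def token_probs(logits, temperature):
    """Softmax over logits scaled by temperature (temperature must be > 0)."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.5]  # hypothetical scores for three candidate tokens

for t in (0.1, 0.7, 1.2):
    print(t, [round(p, 3) for p in token_probs(logits, t)])
# At 0.1 nearly all probability sits on the top token (near-deterministic);
# at 1.2 the distribution is much flatter, so sampling gives more varied output.
```

This is why temperature 0 is recommended for fact extraction (the model almost always picks its single best guess) while 1.0+ suits brainstorming.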
2.4 Hallucination — The Most Critical Limitation
A language model doesn't know what it doesn't know. It predicts plausible text — not verified truth. Confidence in the output does not indicate accuracy.
— AI Practitioners' Maxim

Hallucination is when an AI model generates information that sounds confident but is factually incorrect or completely fabricated. It is the most important limitation to understand.
- Ask the model to cite sources — then verify those sources independently.
- Use RAG (Chapter 9) to ground the model in your verified documents.
- Set temperature to 0 for fact-critical tasks.
- Include in your prompt: "If you are not certain, say I don't know."
- Use AI for drafting and ideation; verify final facts with authoritative sources.
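The checklist above can be baked into a small helper that wraps any fact-critical question with the uncertainty instruction and forces temperature 0. The function name and exact wording here are illustrative — this is a sketch, not a library API.

```python
def fact_critical_prompt(question: str) -> dict:
    """Build request parameters with hallucination guards (illustrative helper)."""
    instruction = (
        "Answer the question below. "
        "Cite your sources where possible. "
        'If you are not certain, say "I don\'t know" instead of guessing.'
    )
    return {
        "messages": [{"role": "user", "content": f"{instruction}\n\nQuestion: {question}"}],
        "temperature": 0,  # deterministic — recommended for fact-critical tasks
    }

request = fact_critical_prompt("What year was AiBytec founded?")
print(request["temperature"])  # 0
print(request["messages"][0]["content"].startswith("Answer the question"))  # True
```

The returned dictionary can be splatted straight into a chat-completions call (e.g. `client.chat.completions.create(model=..., **request)`).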
Chapter 2 Summary
Python for AI Practitioners
The Language That Runs the World's AI
Python is the language of AI. Not because it is the fastest — it isn't. Python dominates AI because it is readable, expressive, and has the most mature ecosystem of AI libraries ever assembled. If you learn one programming language for AI, it's Python.
- Go to colab.research.google.com
- Sign in with your Google account
- Click "New Notebook"
- Type code in a cell and press Shift+Enter to run
- Results appear immediately below — no installation required!
3.1 Variables & Data Types
```python
# Variables store values — Python figures out the type automatically
name = "Muhammad Rustam"   # string (text)
age = 28                   # integer (whole number)
rating = 4.8               # float (decimal)
enrolled = True            # boolean

# f-strings — the cleanest way to format text (essential for AI prompts!)
print(f"Hello, {name}! Your rating is {rating}/5.0")
```
3.2 Lists, Loops & Conditions
```python
# Lists — store multiple items
ai_tools = ["ChatGPT", "Claude", "Gemini", "Llama"]
ai_tools.append("Mistral")
print(ai_tools[0])  # ChatGPT (index starts at 0)

# For loops
for tool in ai_tools:
    print(f"Tool: {tool}")

# If / elif / else
score = 85
if score >= 90:
    grade = "A"
elif score >= 80:
    grade = "B"
else:
    grade = "Fail"
print(f"Grade: {grade}")  # Grade: B
```
3.3 Functions & Error Handling
```python
# Define reusable functions
# (client is an AI API client object — how to create one is covered in Chapter 5)
def ask_ai(question, model="llama3"):
    """Send a question to an AI model and return the answer."""
    try:
        response = client.chat.completions.create(
            model=model,
            messages=[{"role": "user", "content": question}]
        )
        return response.choices[0].message.content
    except Exception as e:
        print(f"Error: {e}")
        return "Could not get a response. Please try again."

answer = ask_ai("What is the capital of Pakistan?")
print(answer)  # Islamabad
```
3.4 Key AI Libraries
| Library | Install | What It Does |
|---|---|---|
| openai | pip install openai | Official OpenAI SDK — call GPT-4, DALL-E, Whisper |
| anthropic | pip install anthropic | Official Anthropic SDK — call Claude models |
| groq | pip install groq | Groq API — Llama & Mixtral at ultra-fast speed |
| streamlit | pip install streamlit | Build interactive AI web apps in pure Python |
| gradio | pip install gradio | AI demo interfaces with drag-and-drop UI |
| transformers | pip install transformers | Hugging Face models — run 500k+ models locally |
| pandas | pip install pandas | Data manipulation — read CSV, Excel, JSON |
Chapter 3 Summary
Tools & Techniques
Prompt Engineering
Talk to AI Like a Pro — The Developer Superpower
The quality of AI output is determined almost entirely by the quality of the input. Invest in the prompt, reap in the result.
— Muhammad Rustam, AI By Tech

Prompt Engineering is the art and science of crafting inputs to AI models that consistently produce high-quality, accurate, and useful outputs. Companies pay $50–$150 per hour for skilled prompt engineers. This chapter makes you one.
4.1 Why Prompting Matters
"Write about AI"
Output: Vague, generic 3-paragraph summary. Not actionable.
"Write a 500-word blog post for Pakistani IT students about how learning Generative AI in 2025 leads to freelancing on Upwork. Use a motivational tone, include 3 specific job titles with estimated hourly rates, and end with a clear call to action."
4.2 The CRAFT Framework
Every excellent prompt contains five elements. Use CRAFT as your checklist:
4.3 Prompting Techniques
Zero-Shot — Simple, direct, no examples

Classify the sentiment of this text as Positive, Negative, or Neutral:
"The new AI course from AiBytec is absolutely amazing!"

Few-Shot — Give examples, get better results

Classify reviews as Positive, Negative, or Neutral.
Examples:
"Best laptop I've owned." → Positive
"Terrible battery life." → Negative
"Arrived on time." → Neutral
Now classify:
"Content is deep but platform loads slowly." →
Chain-of-Thought — Make AI reason step-by-step
A company has 3 AI developers. Each completes 4 projects/month.
They earn PKR 25,000 per project.
To triple monthly revenue, how many developers do they need?
Think through this step by step before giving your final answer.

4.4 System Prompts — Program Your AI App
```python
system_prompt = """
You are EduAI, an intelligent tutoring assistant for AI By Tech Academy.
Your role:
- Help students understand Generative AI concepts clearly
- Explain technical topics using simple, relatable examples from Pakistan
- When students share code, debug it and explain the fix
- Encourage students and celebrate their progress
Constraints:
- Always respond in English with occasional Urdu phrases for warmth
- Keep explanations under 200 words unless the student asks for more
- Never write complete assignments — guide students to the answer
- If you don't know something, say so clearly
Tone: Friendly, encouraging, teacher-like, culturally aware
"""
```
Chapter 4 Summary
Working with AI APIs
Connecting Your Apps to the World's Most Powerful AI Models
Imagine you're in a restaurant. You (the client) tell the waiter (the API) your order. The waiter goes to the kitchen (the AI model on remote servers) and brings back your food (the generated text). You never see the kitchen. You never touch the stove.
An AI API works identically. Your Python code sends a request, the AI model processes it on a massive GPU cluster, and the response comes back — in milliseconds.
5.1 API Keys — Security First
```python
# ❌ WRONG — never hardcode API keys in your code
client = OpenAI(api_key="sk-abc123...")  # Anyone who sees your code can use your key!

# ✅ CORRECT — use environment variables
import os
from dotenv import load_dotenv

load_dotenv()  # Loads variables from a .env file
client = OpenAI(api_key=os.getenv("OPENAI_API_KEY"))

# .env file (add to .gitignore, NEVER commit to GitHub):
# OPENAI_API_KEY=sk-abc123...
# GROQ_API_KEY=gsk_abc123...
```
5.2 The Groq API — 10x Faster Inference
| Model | Speed on Groq | vs Standard |
|---|---|---|
| Llama 3 8B | ~700 tokens/second | 10–15x faster |
| Mixtral 8x7B | ~500 tokens/second | 8–12x faster |
| Llama 3 70B | ~300 tokens/second | 5–8x faster |
```python
from groq import Groq
import os

client = Groq(api_key=os.getenv("GROQ_API_KEY"))

def fast_ai(prompt, model="llama3-8b-8192"):
    response = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
        temperature=0.7,
        max_tokens=1024
    )
    return response.choices[0].message.content

answer = fast_ai("What are the top 5 AI skills for Pakistani developers in 2025?")
print(answer)
```
5.3 Multi-Turn Conversations
```python
# Reuses the Groq client created in section 5.2
conversation_history = []

def chat(user_input, system="You are a helpful AI assistant."):
    if not conversation_history:
        conversation_history.append({"role": "system", "content": system})
    conversation_history.append({"role": "user", "content": user_input})
    response = client.chat.completions.create(
        model="llama3-8b-8192",
        messages=conversation_history
    )
    ai_reply = response.choices[0].message.content
    conversation_history.append({"role": "assistant", "content": ai_reply})
    return ai_reply

print(chat("My name is Rustam and I am from Karachi."))
print(chat("What is my name and where am I from?"))  # AI remembers!
```
Chapter 5 Summary
Hugging Face & Open-Source Models
The GitHub of AI — 500,000+ Free Models for Every Task
Hugging Face is to AI what GitHub is to code — the world's central repository for machine learning models, datasets, and demos. With over 500,000 models, 150,000 datasets, and 100,000 interactive demos, it is the most important resource in the open-source AI ecosystem. And most of it is completely free.
6.1 The Transformers Library — Your AI Swiss Army Knife
```python
from transformers import pipeline

# TEXT GENERATION
generator = pipeline("text-generation", model="gpt2")
result = generator("The future of AI in Pakistan will", max_length=100)
print(result[0]["generated_text"])

# SENTIMENT ANALYSIS
classifier = pipeline("sentiment-analysis")
result = classifier("This AiBytec course is the best investment I've made!")
print(result)  # [{'label': 'POSITIVE', 'score': 0.9998}]

# TRANSLATION — English to Urdu
translator = pipeline("translation", model="Helsinki-NLP/opus-mt-en-ur")
result = translator("Artificial Intelligence is transforming education in Pakistan.")
print(result[0]["translation_text"])

# QUESTION ANSWERING
qa = pipeline("question-answering")
result = qa(
    question="Where is AiBytec based?",
    context="AiBytec is Pakistan's premier AI education platform based in Karachi."
)
print(result["answer"])  # Karachi
```
6.2 Embeddings — Teaching AI to Understand Meaning
Embeddings are numerical representations of text that capture meaning. Two sentences that mean similar things will have embeddings that are mathematically close. Embeddings are the foundation of semantic search, RAG systems, and recommendations.
```python
from sentence_transformers import SentenceTransformer
from sklearn.metrics.pairwise import cosine_similarity

model = SentenceTransformer("all-MiniLM-L6-v2")
sentences = [
    "How do I learn Python for AI?",
    "What is the best way to start coding in Python?",
    "What is the capital city of Australia?"
]
embeddings = model.encode(sentences)

sim_01 = cosine_similarity([embeddings[0]], [embeddings[1]])[0][0]
sim_02 = cosine_similarity([embeddings[0]], [embeddings[2]])[0][0]
print(f"Similarity (Python questions): {sim_01:.3f}")  # ~0.85
print(f"Similarity (unrelated): {sim_02:.3f}")         # ~0.12
```
For privacy-sensitive projects, Ollama lets you run powerful AI models entirely on your local machine — no internet, no API costs, no data leaving your computer.
- Download from ollama.ai (Windows, Mac, Linux)
- Run: `ollama pull llama3`
- Chat: `ollama run llama3`
- Or use the Python API: `import ollama`
Minimum specs: 8GB RAM. GPU helps but CPU works fine for small models.
Chapter 6 Summary
Building Applications
Build AI Web Apps with Streamlit
From Python Script to Deployed Web App in Hours
Streamlit converts Python scripts into web applications. No HTML. No CSS. No JavaScript. Pure Python — and the results look professional. Over 1 million apps have been deployed on Streamlit Community Cloud.
7.1 Your First Streamlit App
```python
import streamlit as st
from groq import Groq
import os

st.set_page_config(page_title="AiBytec AI App", page_icon="🤖")
st.title("🤖 AI Assistant — AiBytec")

client = Groq(api_key=os.getenv("GROQ_API_KEY"))

question = st.text_area("Ask me anything", height=120)
model = st.selectbox("Model", ["llama3-70b-8192", "mixtral-8x7b-32768"])
temp = st.slider("Temperature", 0.0, 1.5, 0.7)

if st.button("✨ Get Answer", type="primary"):
    with st.spinner("AI is thinking..."):
        response = client.chat.completions.create(
            model=model,
            messages=[{"role": "user", "content": question}],
            temperature=temp
        )
        answer = response.choices[0].message.content
    st.success("Done!")
    st.markdown(answer)
    st.download_button("💾 Download Answer", answer, "answer.txt")

# Run with: streamlit run app.py
```
7.2 Core Streamlit Components
| Component | Code | Use For |
|---|---|---|
| Text Input | st.text_input("Label") | Short text — names, search queries |
| Text Area | st.text_area("Label", height=150) | Long text — documents, prompts |
| Select Box | st.selectbox("Label", ["A","B"]) | Dropdown — model choice, options |
| Slider | st.slider("Label", 0.0, 2.0, 0.7) | Numeric range — temperature |
| File Upload | st.file_uploader("Upload", type=["pdf"]) | Documents, images for AI analysis |
| Columns | col1, col2 = st.columns(2) | Side-by-side layout |
| Sidebar | with st.sidebar: | Settings, API key input |
| Spinner | with st.spinner("Thinking..."): | Show loading while AI responds |
- Push your code to GitHub (include app.py + requirements.txt)
- Go to share.streamlit.io — sign in with GitHub
- Select repo, add API keys as Secrets, click Deploy — your app is live!
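For the app in this chapter, a minimal requirements.txt might look like the sketch below — the package names are the ones used in this chapter; pin exact versions only once you have tested them:

```text
streamlit
groq
python-dotenv
```

Streamlit Cloud installs everything in this file automatically at deploy time, so any import in app.py must have a matching line here.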
Chapter 7 Summary
Build AI Interfaces with Gradio
Drag-and-Drop AI Demos — Images, Audio, Chat, and More
Gradio is Hugging Face's app-building framework, designed specifically for AI demos. Where Streamlit excels at dashboards, Gradio is optimized for input-output AI demos. Build a voice-enabled chatbot in under 20 lines.
8.1 Gradio vs Streamlit — When to Use Each
Choose Gradio for:
- Building AI input-output demos
- Voice, image, or video interfaces
- Sharing ML models on Hugging Face Spaces
- Quick prototypes to share with stakeholders
Choose Streamlit for:
- Building data dashboards and analytics
- Multi-page web applications with state
- Complex workflows with database connections
- Production apps with rich layouts
8.2 Chatbot with Memory
```python
import gradio as gr
from groq import Groq
import os

client = Groq(api_key=os.getenv("GROQ_API_KEY"))

SYSTEM = """You are AiBytec Assistant — a helpful AI tutor.
You help Pakistani students learn Generative AI and Python.
Be encouraging, give clear explanations, use local examples."""

def chat(user_message, history):
    messages = [{"role": "system", "content": SYSTEM}]
    for user, assistant in history:
        messages.append({"role": "user", "content": user})
        messages.append({"role": "assistant", "content": assistant})
    messages.append({"role": "user", "content": user_message})
    response = client.chat.completions.create(
        model="mixtral-8x7b-32768",
        messages=messages
    )
    return response.choices[0].message.content

demo = gr.ChatInterface(
    fn=chat,
    title="AiBytec AI Tutor",
    examples=[
        "What is the difference between supervised and unsupervised learning?",
        "How do I build my first AI app in Python?",
        "What AI skills should I learn for a job in 2025?"
    ],
    theme=gr.themes.Soft()
)
demo.launch(share=True)  # share=True → public URL!
```
- Create a free account at huggingface.co
- Click "New Space" → select "Gradio" as the SDK
- Upload app.py and requirements.txt, or connect GitHub
- Add API keys as Repository Secrets (Settings → Secrets)
- Your app is live at: huggingface.co/spaces/your-username/your-app
Chapter 8 Summary
RAG — Retrieval-Augmented Generation
Teaching AI to Know Your Documents
Without RAG, your AI knows everything before its training cutoff but nothing about your company, your documents, or yesterday's news. RAG fixes all three.
— AI Practitioners' Maxim

9.1 The RAG Pipeline
INDEXING (Done Once, Offline):
- Load your documents (PDF, DOCX, TXT, websites)
- Split documents into chunks (500–1,000 characters each)
- Embed each chunk using an embedding model (text → vectors)
- Store vectors in a vector database (FAISS, ChromaDB, Pinecone)
RETRIEVAL + GENERATION (Every Query):
- User asks a question → embed the question
- Search the vector database → retrieve the top-k most similar chunks
- Build the prompt: [System] + [Retrieved context] + [Question]
- Send to the LLM → get a grounded, cited answer
9.2 RAG from Scratch — Complete Implementation
```python
import os
import faiss
import numpy as np
from sentence_transformers import SentenceTransformer
from groq import Groq

embedder = SentenceTransformer("all-MiniLM-L6-v2")
llm_client = Groq(api_key=os.getenv("GROQ_API_KEY"))

class SimpleRAG:
    def __init__(self):
        self.chunks = []
        self.index = None

    def add_text(self, text, source="manual", chunk_size=500):
        words = text.split()
        for i in range(0, len(words), chunk_size):
            chunk = " ".join(words[i:i + chunk_size])
            if len(chunk.strip()) > 100:
                self.chunks.append({"text": chunk, "source": source})

    def build_index(self):
        texts = [c["text"] for c in self.chunks]
        embs = embedder.encode(texts)
        self.index = faiss.IndexFlatL2(embs.shape[1])
        self.index.add(embs.astype(np.float32))

    def answer(self, question, top_k=3):
        q_emb = embedder.encode([question]).astype(np.float32)
        _, ids = self.index.search(q_emb, top_k)
        context = "\n\n".join(self.chunks[i]["text"] for i in ids[0])
        response = llm_client.chat.completions.create(
            model="llama3-70b-8192",
            messages=[{"role": "user", "content":
                f"Answer using ONLY the context below.\n\nCONTEXT:\n{context}\n\nQUESTION: {question}"}],
            temperature=0.1
        )
        return response.choices[0].message.content

rag = SimpleRAG()
rag.add_text("AiBytec offers Certificate 1 (Gen AI, PKR 15,000), Certificate 2 (Agentic AI, PKR 45,000). Based in Karachi.")
rag.build_index()
print(rag.answer("How much does Certificate 2 cost?"))
```
Chapter 9 Summary
Advanced & Professional
AI Safety, Ethics & Responsible Use
Building AI That Helps, Not Harms
With great capability comes great responsibility. Every AI application you build either adds to the world's trust in AI or subtracts from it. Build accordingly.
— Muhammad Rustam, AI By Tech

10.1 The Core Risks of Generative AI
| Risk | Description | Example Harms |
|---|---|---|
| Hallucination | AI generates confidently stated false information | Medical misdiagnosis, fabricated citations |
| Bias & Discrimination | AI reflects and amplifies biases in training data | Biased hiring tools, unfair credit scoring |
| Privacy Violation | AI trained on private data leaks it, or enables surveillance | Personal data exposure, deepfake abuse |
| Misinformation | AI generates convincing fake news and propaganda | Election interference, reputation destruction |
| Copyright Infringement | AI trained on copyrighted work reproduces it | Plagiarized code, copied artwork |
| Over-Reliance | Users stop critical thinking and blindly trust AI | Errors in critical systems, skill atrophy |
10.2 The 7 Principles of Responsible AI
- Transparency — Users should know they are interacting with AI. Don't disguise AI as human.
- Fairness — AI systems should produce equitable outcomes across genders, races, religions, and social groups.
- Accountability — Every AI system should have a human responsible for its actions and outcomes.
- Privacy — Collect only necessary data. Never train on sensitive personal data without explicit consent.
- Safety — AI in high-stakes domains (healthcare, legal, financial) requires human oversight and validation.
- Beneficence — Design AI to benefit users and society, not just maximize profit or engagement.
- Non-maleficence — If a feature could cause harm, don't build it — regardless of how technically possible it is.
Healthcare AI: AI assists clinicians but must NEVER replace human medical judgment.
→ Always add: "This is not medical advice. Consult a qualified doctor."
Legal AI: AI can draft documents but cannot provide legal advice.
→ Always add: "This is not legal advice. Consult a qualified lawyer."
Financial AI: AI can analyze data but cannot guarantee returns.
→ Always add: "This is not financial advice. Past performance does not guarantee future results."
Chapter 10 Summary
Your Gen AI Portfolio — Capstone Projects
Build. Deploy. Showcase. Get Hired.
Your GitHub portfolio is your new resume. Every deployed app is a job interview that runs 24/7. Build things that show what you can do — don't just talk about it.
— Muhammad Rustam, AI By Tech

11.1 Eight Capstone Project Ideas
11.2 Career Paths After Certificate 1
Ready for Certificate 2?
You've completed Certificate 1: Gen AI Practitioner. Certificate 2 — Agentic AI Developer — teaches you to build autonomous AI agents, integrate MCP servers, and apply Spec-Driven Development with Claude Code.
Duration: 3 Months | Fee: PKR 45,000 | Batch: After Eid 2026 (InshAllah)
Enroll at aibytec.com
Chapter 11 Summary — Congratulations!
Appendices
Python Quick Reference
The Most Important Syntax at a Glance
```python
# ─── DATA TYPES ──────────────────────────────────────────────
s = "Hello"              # string
s = f"Name: {name}"      # f-string (always use this)
i = 42                   # integer
f = 3.14                 # float
b = True                 # boolean

# ─── LISTS ───────────────────────────────────────────────────
lst = [1, 2, 3]
lst.append(4)            # [1, 2, 3, 4]
lst[0]                   # 1 — first item
lst[-1]                  # 4 — last item
len(lst)                 # 4

# ─── DICTIONARIES ────────────────────────────────────────────
d = {"key": "value", "num": 42}
d["key"]                 # "value"
d["new"] = "added"       # add new key
d.keys(); d.values(); d.items()   # views of keys / values / pairs

# ─── CONTROL FLOW ────────────────────────────────────────────
if x > 0:
    print("positive")
elif x < 0:
    print("negative")
else:
    print("zero")

for item in my_list:     # iterate list
    print(item)
for i in range(10):      # 0 to 9
    print(i)

# ─── FUNCTIONS ───────────────────────────────────────────────
def add(a, b=10):        # default param
    return a + b

# ─── ERROR HANDLING ──────────────────────────────────────────
try:
    result = risky_api_call()
except Exception as e:
    print(f"Error: {e}")
finally:
    cleanup()            # always runs

# ─── LIST COMPREHENSION ──────────────────────────────────────
squares = [x**2 for x in range(10)]
evens = [x for x in range(20) if x % 2 == 0]
```
Top 50 Prompt Templates
Copy, Adapt, Deploy — Instantly Useful
Writing & Content
- Blog Post
- "Write a [length]-word blog post about [topic] for [audience]. Use [tone] tone. Include intro, [N] main points with subheadings, conclusion, and CTA."
- LinkedIn Post
- "Write a LinkedIn post about [topic/achievement] for a [role]. Max 300 words. Include 3–5 hashtags. Make the first line stop-scrolling."
- Professional Email
- "Write a [tone] email from [sender] to [recipient] about [purpose]. Context: [background]. Desired outcome: [what you want]. Max [N] words."
- Product Description
- "Write a compelling product description for [product]. Target: [audience]. Features: [list]. Tone: [persuasive]. Length: [N] words."
Analysis & Research
- SWOT Analysis
- "Perform a SWOT analysis of [company/idea]. Present as a structured table with 3–4 points per quadrant. Be specific, not generic."
- Competitor Comparison
- "Compare [A] vs [B] vs [C] across these dimensions: [list]. Present as a table. Include a recommendation at the end."
- Summarization
- "Summarize the following [doc type] in [N] words. Focus on: key decisions made, action items, and open questions."
- Research Brief
- "Create a research brief on [topic]. Include: current state, key players, recent developments, major challenges, 5 questions for further research."
Coding & Technical
- Code Generation
- "Write a Python function that [description]. Include: type hints, docstring, error handling, and 3 usage examples."
- Code Review
- "Review this [language] code for: bugs, security issues, performance problems, and style. For each issue, show the fix."
- Debug
- "This [language] code produces [error]. Expected: [x]. Actual: [y]. Here's the code: [code]. Identify and fix the bug."
- API Design
- "Design a RESTful API for [application]. Specify: endpoints, HTTP methods, JSON formats, error codes, and authentication."
Education & Learning
- ELI5 Explainer
- "Explain [complex topic] to someone who [background]. Use an analogy from [familiar domain]. Under [N] words. Include a 1-sentence summary."
- Quiz Generator
- "Create a [N]-question quiz on [topic] for [level] learners. Mix: multiple choice, true/false, short answer. Include answers."
- Study Plan
- "Create a [N]-week study plan for [subject]. I have [hours/day] available. Current level: [beginner]. Include weekly goals and daily tasks."
- Socratic Guide
- "Don't give me the answer. Ask me guiding questions to help me figure out [topic] myself. I am a [level] learner."
AI Career Paths & Resources
Where to Go After This Book
Essential Resources
| Resource | URL | What You Get |
|---|---|---|
| AI By Tech Academy | aibytec.com | Certificate 1, 2, 3 — live instructor-led courses with Muhammad Rustam |
| Hugging Face | huggingface.co | 500k+ models, datasets, free GPU via Inference API, Spaces hosting |
| OpenAI Platform | platform.openai.com | GPT-4, DALL-E, Whisper API — $5 free credit to start |
| Anthropic Console | console.anthropic.com | Claude API — long context, structured outputs, free tier available |
| Groq Console | console.groq.com | Free ultra-fast Llama & Mixtral inference — generous free tier |
| Google Colab | colab.research.google.com | Free Python + GPU in browser — use for all exercises in this book |
| Streamlit Cloud | share.streamlit.io | Free Streamlit app hosting — deploy in 60 seconds from GitHub |
| Author's GitHub | github.com/MuhammadRustamShomi | Code examples and projects from this book and AiBytec courses |
Certificate 1 — Gen AI Practitioner (This book)
Duration: 2 months | Fee: PKR 15,000–20,000 | Audience: Complete beginners
Certificate 2 — Agentic AI Developer
Duration: 3 months | Fee: PKR 45,000 | Audience: Certificate 1 graduates
Certificate 3 — AI Systems Architect
Duration: 2 months | Fee: PKR 60,000–75,000 | Audience: Certificate 2 graduates
Full Ladder: All 3 certificates · 7 months · PKR 1,20,000–1,40,000
Graduates receive the most comprehensive applied AI credential in Pakistan.