
Prompt Engineering Techniques

· 4 min read
Dinesh Gopal
Technology Leader, AI Enthusiast and Practitioner

📘 Introduction

Prompt engineering is the art and science of crafting inputs that guide Large Language Models (LLMs) to produce desired outputs. While anyone can write a prompt, effective prompt engineering requires an understanding of LLM behavior, configuration tuning, and iterative testing.

Based on Google's 2024 whitepaper, this guide breaks down strategies, parameters, and real-world examples to help you get the most from any LLM.


βš™οΈ LLM Configuration Essentials​

| Setting | Description | Tips |
| --- | --- | --- |
| Temperature | Controls randomness in output | Use 0 for deterministic, 0.9+ for creativity |
| Top-K | Sample from top K tokens | Lower K for focus, higher for diversity |
| Top-P | Sample from top tokens within cumulative probability P | 0.9–0.95 is a good balance |
| Token Limit | Controls max length of output | Impacts cost and clarity |

✅ Recommended defaults: temperature=0.2, top-P=0.95, top-K=30.
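
To make the table concrete, here is a minimal Python sketch of how these settings map onto a generation call. It assumes the google-generativeai client and a gemini-1.5-flash model purely for illustration; parameter names vary slightly across providers.

import google.generativeai as genai  # assumed client library; adapt to your provider

genai.configure(api_key="YOUR_API_KEY")  # placeholder key
model = genai.GenerativeModel("gemini-1.5-flash")  # illustrative model choice

response = model.generate_content(
    'Translate this to French: "Where is the nearest restaurant?"',
    generation_config=genai.GenerationConfig(
        temperature=0.2,        # low randomness, mostly deterministic output
        top_p=0.95,             # sample within the top 95% of cumulative probability
        top_k=30,               # restrict sampling to the 30 most likely tokens
        max_output_tokens=128,  # token limit: caps output length and cost
    ),
)
print(response.text)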


🧪 Prompting Techniques (with Examples)

🔹 Zero-Shot Prompting

Use: Simple tasks where the model can generalize well.

Prompt:

Translate this to French: "Where is the nearest restaurant?"
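
A typical completion, for reference: "Où est le restaurant le plus proche ?"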

🔹 One-Shot Prompting

Use: When the model needs guidance on format or tone.

Prompt:

Example:
Input: "What is the capital of France?"
Output: "The capital of France is Paris."

Now, answer this:
Input: "What is the capital of Japan?"

🔹 Few-Shot Prompting

Use: Tasks with variability; adds consistency by showing patterns.

Prompt:

Q: What is 5 + 3?
A: 8

Q: What is 12 - 4?
A: 8

Q: What is 9 + 6?
A:
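
Expected completion: 15 (the model continues the Q/A pattern shown above).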

🔹 System Prompting

Use: Guide output format, tone, or persona via system-level instruction.

Prompt:

You are a JSON API assistant. Always respond in valid JSON format.
User: "Tell me the current weather in London"

🔹 Role Prompting

Use: Assign a personality or function to steer response style.

Prompt:

You are a helpful personal finance advisor.
What's a good way to save for retirement in your 30s?

🔹 Contextual Prompting

Use: Provide real data or background to ground the answer.

Prompt:

Here's the company's 2023 HR policy: [insert excerpt]

Based on this policy, can an employee carry over unused vacation days to next year?

πŸ” Advanced Prompting Strategies​

🔸 Step-Back Prompting

Prompt:

Let's first reflect on the broader question:
"What factors should we consider before choosing a new CRM platform?"

Now, given those, which platform is best for a mid-sized SaaS startup?

🔸 Chain-of-Thought (CoT)

Prompt:

Q: Jane has 5 apples. She buys 7 more, then gives 3 to her friend. How many apples does she have now?
A: Let's think step by step...
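
An illustrative step-by-step completion: Jane starts with 5 apples, buys 7 more (5 + 7 = 12), then gives 3 away (12 - 3 = 9), so she has 9 apples.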

🔸 Self-Consistency

Approach:

  • Run the same prompt multiple times.
  • Use majority voting to find the most consistent answer.

Prompt (run 3x):

What's the next number in this sequence: 2, 4, 6, 8, ?
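
Here is a minimal Python sketch of the voting step, assuming a hypothetical ask_llm(prompt) helper that returns one sampled completion (for example, the generate_content call from the configuration section, run with a higher temperature):

from collections import Counter

def self_consistent_answer(ask_llm, prompt, n=3):
    """Run the same prompt n times and keep the most common answer."""
    answers = [ask_llm(prompt).strip() for _ in range(n)]
    answer, votes = Counter(answers).most_common(1)[0]
    return answer, votes

# Example usage (ask_llm is a placeholder for your model call):
# answer, votes = self_consistent_answer(
#     ask_llm, "What's the next number in this sequence: 2, 4, 6, 8, ?")
# print(f"{answer} ({votes}/3 runs agreed)")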

🔸 Tree-of-Thought (ToT)

Use: Let the model explore multiple branches of reasoning.

Prompt:

You are solving a puzzle. First, generate 3 different strategies to solve it. Then evaluate which one is most effective and explain why.

🔸 ReAct (Reason + Act)

Use: Combine reasoning with tool use.

Prompt:

User: What's the weather in Tokyo right now?

Assistant Thought: I need to look up the weather using the weather API.

[Call: GET https://api.weather.com/tokyo]

Action: Retrieve weather info
Observation: It's 22°C and sunny
Answer: The weather in Tokyo is currently 22°C with clear skies.
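
A simplified Python sketch of that loop is shown below. The ask_llm helper, the get_weather tool, and the Thought/Action/Observation parsing are assumptions for illustration; production agents use more robust tool-calling interfaces.

def react_agent(ask_llm, get_weather, question, max_steps=3):
    """Toy ReAct loop: the model reasons, requests a tool, and sees the result."""
    transcript = (
        f"User: {question}\n"
        "Respond with a Thought, then either an Action such as "
        "Action: get_weather(Tokyo), or a final line starting with Answer:.\n"
    )
    for _ in range(max_steps):
        reply = ask_llm(transcript)          # reasoning step
        transcript += reply + "\n"
        if "Answer:" in reply:               # model produced a final answer
            return reply.split("Answer:", 1)[1].strip()
        if "Action: get_weather(" in reply:  # model requested the tool
            city = reply.split("get_weather(", 1)[1].split(")", 1)[0]
            transcript += f"Observation: {get_weather(city)}\n"
    return None  # no final answer within the step budget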

💻 Prompting for Code Tasks

Prompt engineering also works well for code tasks with LLMs such as Gemini or Claude, including:

  • Writing Bash scripts
  • Explaining code
  • Refactoring
  • Translation (e.g., Python to JavaScript)

Prompt Example:

Convert the following Python list comprehension to a standard for loop:
[ x**2 for x in range(10) if x % 2 == 0 ]
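
The expected answer, for reference:

squares = []
for x in range(10):
    if x % 2 == 0:
        squares.append(x ** 2)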

✅ Best Practices Summary

  • 🎯 Be clear, concise, and direct
  • 🧱 Use examples where helpful
  • 💬 Keep format structured
  • 🧪 Test and iterate
  • 🧰 Abstract common prompts with variables or templates (see the sketch below)
  • 🔒 Stay aligned with model safety and bias guidelines
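
A minimal sketch of the template idea, using plain Python string formatting (the template text and names are only examples):

TRANSLATE_TEMPLATE = 'Translate this to {language}: "{text}"'

def build_prompt(language, text):
    """Keep the prompt wording in one reusable place."""
    return TRANSLATE_TEMPLATE.format(language=language, text=text)

print(build_prompt("French", "Where is the nearest restaurant?"))
# Translate this to French: "Where is the nearest restaurant?"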

📌 Final Thoughts

Prompt engineering is your superpower when working with LLMs. It's part design, part trial-and-error, and part understanding the model's training behavior. With the right strategy, even complex workflows become simple, reusable, and reliable.


📚 References