Combining CoT prompting with other techniques
CoT can be combined with other prompting techniques to further improve LLM performance. Let's implement a function that combines CoT with few-shot learning (FSL), where worked step-by-step solutions are included in the prompt as examples:
def few_shot_cot_prompt(question, examples):
    prompt = "Solve the following problems step by step:\n\n"
    for example in examples:
        prompt += f"Problem: {example['question']}\n\n"
        prompt += f"Solution: {example['solution']}\n\n"
    prompt += f"Problem: {question}\n\nSolution:"
    return prompt

def solve_with_few_shot_cot(model, tokenizer, problem, examples):
    prompt = few_shot_cot_prompt(problem, examples)
    inputs = tokenizer(prompt, return_tensors="pt")
    # The remainder of this function was truncated in the source;
    # a plausible completion generates a continuation and decodes it:
    outputs = model.generate(**inputs, max_new_tokens=256)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
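To see what the model actually receives, here is a minimal sketch of calling few_shot_cot_prompt with one hand-written CoT example (the example problem and solution text are illustrative, not from the source):

```python
def few_shot_cot_prompt(question, examples):
    prompt = "Solve the following problems step by step:\n\n"
    for example in examples:
        prompt += f"Problem: {example['question']}\n\n"
        prompt += f"Solution: {example['solution']}\n\n"
    prompt += f"Problem: {question}\n\nSolution:"
    return prompt

# One few-shot example with an explicit step-by-step solution.
examples = [
    {
        "question": "A shop sells pens at $2 each. How much do 5 pens cost?",
        "solution": "Step 1: Each pen costs $2.\n"
                    "Step 2: 5 pens cost 5 x $2 = $10.\n"
                    "The answer is $10.",
    }
]

prompt = few_shot_cot_prompt(
    "If a train travels 120 miles in 2 hours, what is its average speed?",
    examples,
)
print(prompt)
```

The resulting prompt interleaves each example problem with its reasoning chain, then ends with the new problem followed by "Solution:", nudging the model to continue in the same step-by-step style.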