This is a beta feature according to Algolia's Terms of Service ("Beta Services").
- Summarize this user review
- Describe who this product is for
- Translate this product's description into Spanish
Keep prompts simple
Simplicity ensures that your prompts are logical and easy to maintain, letting the model focus on the task without confusion or misinterpretation. Simple prompts are also more generic, which makes them easier for your team to maintain as your data and goals evolve. If your prompt becomes too complex:
- Use clear language: avoid jargon unless it's necessary for the task at hand. Use straightforward, everyday language that the model can understand.
- Break complicated tasks into sub-tasks. For example, instead of "Summarize the political strategy in this presidential discourse", you could create the following subtasks:
- Create a first prompt: "Extract a bullet point list of the key policies in this discourse."
- Follow up with: "Summarize the political strategy based on these bullet points."
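The two-step pattern above can be sketched as a small chain, where the output of the first prompt becomes the input of the second. This is an illustrative sketch: `complete` is a placeholder for whatever LLM client you use, not an Algolia API.

```python
def complete(prompt: str) -> str:
    """Placeholder for your LLM client call: substitute your provider's API.
    Here it returns a canned string so the chain can be demonstrated."""
    return f"[model response to: {prompt.splitlines()[0]}]"

def summarize_strategy(discourse: str) -> str:
    # Step 1: a simple, focused prompt that only extracts key policies.
    bullets = complete(
        "Extract a bullet point list of the key policies in this discourse.\n\n"
        + discourse
    )
    # Step 2: summarize the strategy from those bullet points alone,
    # instead of asking one complex question about the full discourse.
    return complete(
        "Summarize the political strategy based on these bullet points.\n\n"
        + bullets
    )
```

Each step stays simple and testable on its own, and you can inspect the intermediate bullet list when debugging.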
Keep prompts short
Short prompts create better AI experiences by being clear and focused. They direct the LLM's processing efficiently, avoiding unnecessary details that could complicate understanding or reduce accuracy. This results in faster, more accurate responses tailored to the task. If your prompt is getting too long:
- Focus on essential elements: remove unnecessary details, as they might reduce the quality of answers.
- Use examples or templates: if similar prompts are available, use them as guides to structure your prompt effectively. This ensures consistency and accuracy across different scenarios.
- Test with shorter versions: experiment with condensing parts of the prompt during testing before committing to a more detailed version. This helps identify unnecessary parts of the prompt.
- Consider AI rephrasing: using an LLM to rephrase a prompt can sometimes make it more generic and compact.
Make prompts specific
Specificity makes your prompts more efficient at doing one thing and doing it well. It ensures clarity and reduces ambiguity. Specific prompts help LLMs perform more accurately and efficiently for tasks like those in RAG APIs. If your prompt is getting too ambiguous:
- Narrow the scope: define what you want the model to focus on. Instead of asking a broad question like "Tell me about the company", specify which aspect of the company you're interested in. For example, "Provide a summary of the company's financial performance."
- Provide context or constraints: offer context to guide the model's response. For example, instead of asking "What are the benefits of this exercise plan?", you could say, "Explain the benefits of this exercise plan for stress management in people over 50."
- Use explicit instructions: directly tell the model what you need. For example, "Summarize the following article in three bullet points." or "Give me a list of five specific benefits of meditation for stress reduction."
Be explicit about your expectations
- Do: make your expectations explicit. Write: "Describe what kind of audience this product is for. Is it appropriate for new visitors, for power users, or for our longstanding members?"
- Don't: keep the expectations implicit in your prompt. Don't write: "Describe who this product is for."
Be specific about the expected output
- Do:
  - Explain what output you accept. Write: "Analyze the sentiment in these user reviews. Return only a single label: either 'positive', 'negative', or 'neutral'."
  - Describe what structure you need. Write: "Generate five questions about this product. Return a list of questions in XML. For example, 'What sizes are available? Is this jacket suitable for cold weather?'"
- Don't:
  - Keep the options implicit. Don't write: "Analyze the sentiment in these user reviews."
  - Keep the structure you expect implicit. Don't write: "Generate five questions about this product."
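Constraining the output also makes the response easy to validate in code. A minimal sketch, assuming the sentiment prompt above: the prompt restricts the model to three labels, and a small parser rejects anything outside that set. The names here are illustrative, not part of any Algolia API.

```python
# Allowed labels, mirroring the constraint stated in the prompt itself.
ALLOWED_LABELS = {"positive", "negative", "neutral"}

SENTIMENT_PROMPT = (
    "Analyze the sentiment in these user reviews. "
    "Return only a single label: either 'positive', 'negative', or 'neutral'.\n\n"
    "{reviews}"
)

def parse_sentiment(raw_response: str) -> str:
    """Normalize the model's answer and reject anything outside the allowed set."""
    label = raw_response.strip().strip(".'\"").lower()
    if label not in ALLOWED_LABELS:
        raise ValueError(f"Unexpected label: {raw_response!r}")
    return label
```

Because the expected output is a closed set, a malformed or free-form answer fails fast instead of silently flowing into your application.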
Provide fallback options
If you include a fallback option in your prompt, the LLM tends to follow the fallback instruction rather than generate an inaccurate response.
- Do:
  - Offer an alternative (which could be to contact a human). Write: "Answer the user's question as best as you can from the product data. If the context doesn't let you answer with certainty, answer 'I'm not sure I have the answer to this: contact support@acme.com for help'."
- Don't:
  - Request a reply at any cost (unless this is what your UX needs). Don't write: "Answer the user's question as best as you can from the product data or your internal knowledge."
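A fixed fallback sentence has a practical side benefit: your application can detect it and route the user differently, for example by showing a support link. A sketch under the assumptions of the example above (the fallback wording and support address are taken from it; adapt both to your setup):

```python
# The exact fallback sentence the prompt instructs the model to return.
FALLBACK = "I'm not sure I have the answer to this: contact support@acme.com for help"

QA_PROMPT = (
    "Answer the user's question as best as you can from the product data. "
    "If the context doesn't let you answer with certainty, answer "
    f"'{FALLBACK}'."
)

def is_fallback(response: str) -> bool:
    """True when the model declined to answer, so the UI can offer support instead."""
    return FALLBACK.lower() in response.lower()
```

This only works reliably because the prompt pins down the fallback verbatim; a vaguer instruction like "say you don't know" would be much harder to detect.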
See also
- PromptingGuide.AI by DAIR.AI
- Prompt engineering overview by Anthropic
- Prompt engineering by OpenAI