
Testing Your Knowledge

Use the inline search to verify your content is retrievable

The Training page includes an inline search panel that lets you test your knowledge base. Query your content to see what the AI would find and use to answer questions.

  1. Navigate to Training in the sidebar
  2. Locate the search panel below the content tabs
  3. Enter a question or topic in the search box
  4. Select your search mode
  5. Click Search or press Enter
  6. Review the results

Search Modes

The search panel offers three modes (a conceptual sketch of how they differ follows the descriptions below):

Hybrid (Default)

Combines semantic and keyword search for best results:

  • Finds content by meaning AND keywords
  • Recommended for most queries
  • Matches how the AI actually searches

Semantic

Finds content by meaning, not exact words:

  • Understands synonyms and concepts
  • Good for natural language questions
  • May miss keyword-specific content

Keyword

Traditional text matching:

  • Exact word matching
  • Good for specific terms or codes
  • Won't find conceptual matches
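The exact retrieval pipeline isn't exposed, so treat the following as a conceptual sketch rather than Goldilocks' implementation: the helper functions, the embeddings, and the 0.6/0.4 weights are all assumptions, used only to show how a hybrid score could blend a meaning-based signal with an exact-word signal.

```typescript
// Illustrative only: Goldilocks' real retrieval internals are not documented here.
// cosineSimilarity, termOverlap, and the 0.6/0.4 weights are assumptions.

type SearchMode = "hybrid" | "semantic" | "keyword";

// Meaning-based signal: similarity between query and content embeddings.
function cosineSimilarity(a: number[], b: number[]): number {
  const dot = a.reduce((sum, x, i) => sum + x * b[i], 0);
  const norm = (v: number[]) => Math.sqrt(v.reduce((s, x) => s + x * x, 0));
  return dot / (norm(a) * norm(b));
}

// Keyword signal: fraction of query terms that appear verbatim in the content.
function termOverlap(queryTerms: string[], contentText: string): number {
  const text = contentText.toLowerCase();
  const hits = queryTerms.filter((t) => text.includes(t.toLowerCase())).length;
  return queryTerms.length === 0 ? 0 : hits / queryTerms.length;
}

function scoreContent(
  mode: SearchMode,
  queryEmbedding: number[],
  queryTerms: string[],
  contentEmbedding: number[],
  contentText: string
): number {
  const semantic = cosineSimilarity(queryEmbedding, contentEmbedding);
  const keyword = termOverlap(queryTerms, contentText);
  if (mode === "semantic") return semantic; // meaning only: catches synonyms, may miss exact codes
  if (mode === "keyword") return keyword;   // exact words only: precise terms, no conceptual matches
  return 0.6 * semantic + 0.4 * keyword;    // hybrid: blends both signals
}
```

Hybrid is the default because a blended score tends to surface the same content the AI would use, however the question is phrased.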

Understanding Results

Retrieved Content

Search results show the content snippets that would be used to answer the question. Each result includes:

  • Source - The document, website, or FAQ it came from
  • Content Type - Document, Website, or FAQ badge
  • Snippet - The relevant text portion
  • Score - How relevant the system considers this content

Relevance Scoring

Content is scored based on:

  • Semantic similarity - How closely the meaning matches
  • Keyword matching - Presence of specific terms
  • Source type - FAQs get a slight priority boost

Higher scores appear first and are more likely to be used in responses.
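As a rough mental model only (the result shape mirrors the fields shown in the panel, but the boost factor is an assumption, not a documented constant), ranking could look like this:

```typescript
// Hypothetical ranking sketch; the 1.1 FAQ boost is an assumed value.

interface SearchResult {
  source: string;                              // document, website, or FAQ name
  contentType: "Document" | "Website" | "FAQ"; // badge shown on each result
  snippet: string;                             // the relevant text portion
  score: number;                               // combined semantic + keyword relevance
}

function rankResults(results: SearchResult[]): SearchResult[] {
  return results
    .map((r) => ({
      ...r,
      // FAQs get a slight priority boost over documents and websites.
      score: r.contentType === "FAQ" ? r.score * 1.1 : r.score,
    }))
    .sort((a, b) => b.score - a.score); // higher scores appear first
}
```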

What to Test

Common Questions

Test the questions customers ask most frequently:

  • Do relevant results appear?
  • Is the most helpful content ranked first?
  • Are there irrelevant results mixed in?

Question Variations

Customers phrase questions differently. Test variations:

  • "How do I cancel?" vs "Cancel my account" vs "I want to stop my subscription"
  • "What's your return policy?" vs "Can I return this?" vs "How do returns work?"

Edge Cases

Test tricky scenarios:

  • Questions about topics you haven't documented
  • Very specific technical questions
  • Questions with typos

Negative Testing

Test what should not return results:

  • Questions about competitors
  • Inappropriate requests
  • Topics you don't support

Improving Results

Content Not Found

If relevant content isn't retrieved:

  1. Check the document - Is it set to Active status?
  2. Check wording - Does your content use similar language to the query?
  3. Add variations - Include common phrasings in your content
  4. Create an FAQ - For specific questions, FAQs are more precise

Wrong Content Retrieved

If irrelevant content is retrieved:

  1. Review the content - Is it actually off-topic for this query?
  2. Refine your documents - Make content more focused
  3. Remove or set to Draft - If the content isn't useful for answering questions, delete it or move it out of Active status

Good Content Ranked Low

If good content appears but not at the top:

  1. Create an FAQ - FAQs have higher priority
  2. Improve content clarity - Make the topic clearer
  3. Use customer language - Match how customers phrase things

Testing Workflow

Before Launch

Before deploying your widget:

  1. Test your top 20 most common questions
  2. Verify each returns appropriate content
  3. Fix any gaps in your knowledge base
  4. Test again

Regular Testing

Schedule regular testing sessions:

  • Weekly: Test new content you've added
  • Monthly: Re-test common questions
  • After updates: Test affected topics

Filtering by Content Type

When testing, you can use the content tabs to focus your search:

  • Search in All to see all matching content
  • Search in Documents to only find documents
  • Search in Websites to only find crawled pages
  • Search in FAQs to only find FAQs

This helps you identify whether a specific content type has gaps in coverage.

Best Practices

Document Test Results

Keep a log of:

  • Questions tested
  • Expected vs actual results
  • Changes made

Create a Test Suite

Build a list of standard test questions that cover:

  • All major topics
  • Common phrasings
  • Edge cases

Run through this list after major content changes.
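If you want the suite to be repeatable, it can help to write the cases down in a structured form. The sketch below is only a pattern: searchKnowledge is a hypothetical stand-in for however you run each query (the inline panel itself is manual), and the questions and expected sources are illustrative.

```typescript
// Minimal repeatable test suite. `searchKnowledge` is a hypothetical stand-in
// for running a query; the cases below are examples only.

interface KnowledgeTestCase {
  question: string;       // phrased the way a customer would ask it
  expectedSource: string; // the document, website, or FAQ that should surface
}

const testSuite: KnowledgeTestCase[] = [
  { question: "How do I cancel?", expectedSource: "Cancellation FAQ" },
  { question: "I want to stop my subscription", expectedSource: "Cancellation FAQ" },
  { question: "What's your return policy?", expectedSource: "Returns Policy" },
  { question: "How do returns work?", expectedSource: "Returns Policy" },
];

async function runSuite(
  searchKnowledge: (question: string) => Promise<{ source: string }[]>
): Promise<void> {
  for (const testCase of testSuite) {
    const results = await searchKnowledge(testCase.question);
    const found = results.some((r) => r.source === testCase.expectedSource);
    console.log(`${found ? "PASS" : "FAIL"} - ${testCase.question}`);
  }
}
```

Even if you never automate the queries, keeping the cases in one place makes it easy to re-run the same list by hand after major content changes.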

Get Team Input

Have different team members test:

  • Support team knows real customer questions
  • Product team knows technical details
  • New employees represent uninformed users

Use Real Customer Questions

Pull actual questions from:

  • Support tickets
  • Chat logs
  • Customer emails
  • FAQ page analytics

These are more valuable than made-up test cases.