
hallucination-detection

Here are 318 public repositories matching this topic...

UpTrain is an open-source unified platform to evaluate and improve Generative AI applications. We provide grades for 20+ preconfigured checks (covering language, code, embedding use-cases), perform root cause analysis on failure cases and give insights on how to resolve them.

  • Updated Aug 18, 2024
  • Python

Korean National Law Information MCP | 41 Ministry of Government Legislation APIs → 16 MCP tools. Search, retrieve, and analyze statutes, court precedents, ordinances, and treaties with AI, with citation verification to guard against LLM hallucinations.

  • Updated Apr 25, 2026
  • TypeScript

Security scanner MCP server for AI coding agents. Prompt injection firewall, package hallucination detection (4.3M+ packages), 1000+ vulnerability rules with AST & taint analysis, auto-fix.

  • Updated Apr 16, 2026
  • JavaScript
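The core idea behind package hallucination detection, as in the scanner above, is to check every package an AI coding agent proposes against an index of known-real package names before it is installed. A minimal sketch of that check (not the repository's actual implementation; the tiny allowlist stands in for a 4.3M+ package index):

```python
# Stand-in for a registry index of known-real package names (e.g. PyPI/npm).
# A real scanner would query or cache the full registry, not a hardcoded set.
KNOWN_PACKAGES = {"requests", "numpy", "flask", "pandas"}

def find_hallucinated(requested: list[str]) -> list[str]:
    """Return the requested package names that do not exist in the index.

    Names are lowercased before comparison, since most registries treat
    package names case-insensitively.
    """
    return [name for name in requested if name.lower() not in KNOWN_PACKAGES]

# A typo-squatted or invented name is flagged; real packages pass through.
suspicious = find_hallucinated(["requests", "reqeusts-auth-helper"])
```

Flagged names can then be blocked or surfaced to the user, preventing an agent from installing a package that an LLM simply invented.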

iFixAi. The open-source diagnostic for AI misalignment. 32 tests across fabrication, manipulation, deception, unpredictability, and opacity. Provider-agnostic. Runs against OpenAI, Anthropic, Bedrock, Azure, Gemini, and more. Letter grade in under 5 minutes, content-addressed manifest for bit-identical replay. Powered by iMe.

  • Updated May 1, 2026
  • Python
verifAI

A VerifAI initiative to build an open-source, easy-to-deploy generative question-answering engine that can reference and verify its answers for correctness (using an a posteriori model).

  • Updated Oct 5, 2025
  • Jupyter Notebook
qwed-verification

AISecOps (AI Security Operations) framework for deterministic verification of AI systems. QWED verifies LLM outputs using math, logic, and symbolic execution — creating an auditable trust boundary for agentic AI systems. Not generation. Verification.

  • Updated May 2, 2026
  • Python
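Deterministic verification of the kind QWED describes means recomputing a model's checkable claims rather than trusting them. A hedged illustration of the idea (the function names are ours, not QWED's API): extract `expression = result` arithmetic claims from LLM output and re-evaluate each one via the Python AST, so no model is involved in the check.

```python
import ast
import operator
import re

# Allowed arithmetic operators; anything else in the AST is rejected.
OPS = {ast.Add: operator.add, ast.Sub: operator.sub,
       ast.Mult: operator.mul, ast.Div: operator.truediv}

def safe_eval(expr: str) -> float:
    """Evaluate a pure-arithmetic expression by walking its AST (no eval)."""
    def walk(node):
        if isinstance(node, ast.Expression):
            return walk(node.body)
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.BinOp) and type(node.op) in OPS:
            return OPS[type(node.op)](walk(node.left), walk(node.right))
        raise ValueError("non-arithmetic node")
    return walk(ast.parse(expr, mode="eval"))

# Matches claims like "12 * 7 = 84" inside free-form model output.
CLAIM = re.compile(r"([\d\s+\-*/().]+?)\s*=\s*(-?\d+(?:\.\d+)?)")

def verify_claims(text: str) -> list[tuple[str, bool]]:
    """Deterministically check each arithmetic claim found in the text."""
    results = []
    for expr, stated in CLAIM.findall(text):
        try:
            ok = abs(safe_eval(expr.strip()) - float(stated)) < 1e-9
        except (ValueError, SyntaxError, ZeroDivisionError):
            continue  # not a well-formed arithmetic claim; skip it
        results.append((f"{expr.strip()} = {stated}", ok))
    return results
```

The same pattern generalizes: swap the arithmetic checker for a logic solver or symbolic executor, and the verifier becomes an auditable trust boundary around the model's output.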

Official repo for the paper PHUDGE: Phi-3 as Scalable Judge. Evaluate your LLMs with or without a custom rubric or reference answer, in absolute or relative mode, and more. It also contains a list of available tools, methods, repos, and code for hallucination detection, LLM evaluation, grading, and more.

  • Updated Jul 10, 2024
  • Jupyter Notebook
