Codequiry Documentation

How checks work, what to upload, and how to read results. No fluff—just what you need.

How Codequiry works

Codequiry analyzes source code for similarity using two complementary approaches: peer-to-peer comparison within your assignment group and optional web scanning against public sources. The system parses code structure, identifiers, and patterns to find meaningful matches—not just exact text.
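
To see what "structure and patterns, not just exact text" means in practice, here is a minimal Python sketch that collapses identifier names before comparing two files. It illustrates the idea only; it is not Codequiry's actual algorithm, and the tokenizer and scoring are simplified assumptions.

  import re
  from difflib import SequenceMatcher

  # Illustrative only: a toy structural comparison, not Codequiry's algorithm.
  TOKEN = re.compile(r"[A-Za-z_]\w*|\d+|\S")

  def normalize(source: str) -> list[str]:
      """Tokenize source and collapse names so that renaming variables
      does not hide copied structure."""
      tokens = []
      for tok in TOKEN.findall(source):
          if re.fullmatch(r"[A-Za-z_]\w*", tok):
              tokens.append("ID")    # identifiers and keywords become one symbol
          elif tok.isdigit():
              tokens.append("NUM")   # numeric literals become one symbol
          else:
              tokens.append(tok)     # operators and punctuation kept as-is
      return tokens

  def similarity(a: str, b: str) -> float:
      """Return a 0-100 structural similarity score for two source strings."""
      return 100 * SequenceMatcher(None, normalize(a), normalize(b)).ratio()

  x = "total = 0\nfor n in values:\n    total += n"
  y = "s = 0\nfor item in data:\n    s += item"
  print(f"{similarity(x, y):.1f}%")  # high score despite different variable names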

Peer comparison
  • Compares all submissions in the check against each other (see the sketch after this list)
  • Best for detecting collaboration or code sharing in a class
  • Fast turnaround (typically a few minutes)
Web scanning
  • Matches against public repositories and code snippets online
  • Use when external copying is suspected
  • Takes longer due to breadth of search
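
Conceptually, peer comparison is an all-against-all pass over the submissions in a check. The loop below sketches that idea; the stand-in matcher and the 50% flagging threshold are assumptions for the example, not Codequiry's internals.

  from itertools import combinations
  from difflib import SequenceMatcher

  # Stand-in matcher for the example: plain text similarity on a 0-100 scale.
  def similarity(a: str, b: str) -> float:
      return 100 * SequenceMatcher(None, a, b).ratio()

  def peer_compare(submissions: dict[str, str], threshold: float = 50.0):
      """All-against-all pass: score every pair of submissions and flag
      the pairs at or above the threshold, highest first."""
      flagged = []
      for (name_a, code_a), (name_b, code_b) in combinations(submissions.items(), 2):
          score = similarity(code_a, code_b)
          if score >= threshold:
              flagged.append((name_a, name_b, round(score, 1)))
      return sorted(flagged, key=lambda row: -row[2])

  subs = {
      "alice": "def add(a, b):\n    return a + b",
      "bob":   "def add(x, y):\n    return x + y",
      "carol": "print('hello, world')",
  }
  print(peer_compare(subs))  # the alice/bob pair is flagged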

Workflow: from folder to results

  1. Create a folder for your course or assignment group.
  2. Create a check inside the folder and choose detection options (peer, web, or both).
  3. Upload submissions (ZIP per student). See file requirements below.
  4. Start analysis and monitor progress. Results appear when processing completes (a scripted version of these steps is sketched after the tip below).

Tip: For large classes, run peer comparison first, then selectively enable web scanning for high‑risk cases.
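
If you prefer to script these steps rather than use the web interface, the overall shape looks roughly like the sketch below. This is a hypothetical outline: the endpoint paths, parameter names, auth header, and response fields are assumptions for illustration, so check the Codequiry API reference for the real interface before relying on it.

  import requests

  API = "https://codequiry.com/api/v1"   # assumed base URL
  HEADERS = {"apikey": "YOUR_API_KEY"}   # assumed auth header

  def run_peer_check(name: str, zip_paths: list[str]) -> dict:
      """Hypothetical sketch of steps 2-4: create a check, upload one ZIP per
      student, start the analysis, and fetch the results payload.
      Endpoint paths and field names are assumptions, not the documented API."""
      check = requests.post(f"{API}/check/create", headers=HEADERS,
                            data={"name": name}).json()
      check_id = check["id"]                        # assumed response field

      for path in zip_paths:                        # step 3: one ZIP per student
          with open(path, "rb") as fh:
              requests.post(f"{API}/check/upload", headers=HEADERS,
                            data={"check_id": check_id}, files={"file": fh})

      requests.post(f"{API}/check/start", headers=HEADERS,
                    data={"check_id": check_id})    # step 4: start analysis
      return requests.get(f"{API}/check/get", headers=HEADERS,
                          params={"check_id": check_id}).json()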

File preparation

ZIP requirements
  • Format: .zip only
  • Size: keep each ZIP reasonably small (avoid binaries)
  • One ZIP per student submission
  • Include only source files (no executables)
Recommended structure
  • Top‑level source files and folders only
  • Remove compiled artifacts (e.g., bin/, target/); the packaging sketch below automates this
  • Use clear names (student name or ID)
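
Assuming a local layout with one folder per student, a helper along these lines produces one clean ZIP per submission and skips binaries and build output. The skip lists and folder layout are illustrative defaults, not an official Codequiry requirement.

  import zipfile
  from pathlib import Path

  # Illustrative packaging helper: one clean ZIP per student folder.
  # The skip lists are assumptions, not an official requirement.
  SKIP_DIRS = {"bin", "target", "build", "node_modules", "__pycache__", ".git"}
  SKIP_EXTS = {".exe", ".dll", ".so", ".class", ".o", ".jar", ".pyc", ".zip"}

  def package(students_root: str, out_dir: str) -> None:
      """Zip every student folder under students_root into out_dir/<name>.zip,
      keeping only source files."""
      out = Path(out_dir)
      out.mkdir(parents=True, exist_ok=True)
      for student in sorted(Path(students_root).iterdir()):
          if not student.is_dir():
              continue
          with zipfile.ZipFile(out / f"{student.name}.zip", "w",
                               zipfile.ZIP_DEFLATED) as zf:
              for path in student.rglob("*"):
                  if path.is_dir():
                      continue
                  if any(part in SKIP_DIRS for part in path.parts):
                      continue                      # compiled artifacts, VCS, deps
                  if path.suffix.lower() in SKIP_EXTS:
                      continue                      # binaries and build output
                  zf.write(path, path.relative_to(student))

  package("submissions/", "zips/")  # e.g. submissions/alice/ -> zips/alice.zip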

Common languages are supported, including Python, Java, C/C++, JavaScript/TypeScript, C#, Go, Ruby, PHP, Kotlin, and Swift.

Detection methods

Group Similarity (peer)
  • All‑against‑all comparison within the check
  • Highlights likely collaboration chains
  • Low cost and fast
Web Check (internet)
  • Searches public code sources and forums
  • Best used in addition to peer checks (see the escalation sketch below)
  • Higher coverage, longer runtime
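
The "peer first, then web for high-risk cases" pattern reduces to a simple filter over peer scores. The result shape and the 50% cutoff below are assumptions for the example, not fixed policy.

  # Illustrative: pick the submissions worth escalating to a web check.
  peer_results = {"alice": 84.2, "bob": 78.9, "carol": 12.5, "dave": 55.0}

  WEB_CHECK_THRESHOLD = 50.0   # assumed cutoff: escalate anything above "review required"

  def select_for_web_check(results: dict[str, float],
                           threshold: float = WEB_CHECK_THRESHOLD) -> list[str]:
      """Return submission names whose peer similarity meets the threshold,
      highest first, so web scanning is spent only on high-risk cases."""
      return sorted((name for name, score in results.items() if score >= threshold),
                    key=lambda name: -results[name])

  print(select_for_web_check(peer_results))  # ['alice', 'bob', 'dave']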

Interpreting results

Similarity percentages are indicators, not verdicts. Always review matched regions in context before taking action; a small triage helper that encodes the ranges below appears at the end of this section.

Score ranges
  • 0–20%: typically original work
  • 21–50%: review required; check common patterns
  • 51–80%: high similarity; investigate and document
  • 81–100%: critical; likely extensive copying
What to review
  • Matched files and line ranges
  • Whether structure/logic is copied vs. trivial boilerplate
  • Source composition (peer vs. external)
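
The score ranges can be encoded as a small triage helper for reporting. The band descriptions come straight from the list above; the function itself is just a convenience for your own workflow, not part of Codequiry.

  # Triage helper using the score ranges documented above.
  def triage(similarity: float) -> str:
      """Map a 0-100 similarity percentage to the documented review band."""
      if not 0 <= similarity <= 100:
          raise ValueError("similarity must be between 0 and 100")
      if similarity <= 20:
          return "typically original work"
      if similarity <= 50:
          return "review required; check common patterns"
      if similarity <= 80:
          return "high similarity; investigate and document"
      return "critical; likely extensive copying"

  for score in (12.5, 43.0, 67.8, 91.0):
      print(f"{score:5.1f}% -> {triage(score)}")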

Best practices

Do
  • Run peer checks first for quick signal
  • Use web checks to confirm external copying
  • Document decisions and keep context
  • Consider legitimate reuse and starter code
Don't
  • Judge by percentage alone
  • Ignore project‑provided templates
  • Skip manual review of highlighted matches

FAQ

How long do checks take?

Peer comparison usually finishes in a few minutes. Web checks take longer depending on scope.

Can I mix methods?

Yes. Run peer first, then enable web scanning for selected submissions or for the entire check.

What if students used the same starter code?

Review matches in context. Reused boilerplate often appears across many submissions and should be discounted.

What file types are supported?

Upload source files inside a ZIP. Avoid binaries and build output. Most common programming languages are supported.