Codequiry Documentation
How checks work, what to upload, and how to read results. No fluff—just what you need.
How Codequiry works
Codequiry analyzes source code for similarity using two complementary approaches: peer-to-peer comparison within your assignment group and optional web scanning against public sources. The system parses code structure, identifiers, and patterns to find meaningful matches—not just exact text.
Peer comparison
- Compares all submissions in the check against each other
- Best for detecting collaboration or code sharing in a class
- Fast turnaround (typically a few minutes)
Web scanning
- Matches against public repositories and code snippets online
- Use when external copying is suspected
- Takes longer due to breadth of search
Workflow: from folder to results
1. Create a folder for your course or assignment group.
2. Create a check inside the folder and choose detection options (peer, web, or both).
3. Upload submissions (ZIP per student). See file requirements below.
4. Start analysis and monitor progress. Results appear when processing completes.
Tip: For large classes, run peer comparison first, then selectively enable web scanning for high‑risk cases.
File preparation
- Format: .zip only
- Size: keep each ZIP reasonably small (avoid binaries)
- One ZIP per student submission
- Include only source files (no executables)
- Top‑level source files and folders only
- Remove compiled artifacts (e.g., bin/, target/)
- Use clear names (student name or ID)
Common languages are supported, including Python, Java, C/C++, JavaScript/TypeScript, C#, Go, Ruby, PHP, Kotlin, and Swift.
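The packaging rules above can be automated. Here is a minimal sketch that zips each student's folder separately, keeping only source files; the `submissions/<student>/` layout, extension list, and skip list are assumptions to adjust for your course:

```python
import os
import zipfile

# Assumed extension and build-folder lists -- edit to match your course.
SOURCE_EXTENSIONS = {".py", ".java", ".c", ".cpp", ".h", ".js", ".ts",
                     ".cs", ".go", ".rb", ".php", ".kt", ".swift"}
SKIP_DIRS = {"bin", "target", "build", "node_modules", "__pycache__"}

def zip_submission(student_dir: str, out_dir: str) -> str:
    """Create <out_dir>/<student>.zip containing only source files."""
    student = os.path.basename(os.path.normpath(student_dir))
    zip_path = os.path.join(out_dir, f"{student}.zip")
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for root, dirs, files in os.walk(student_dir):
            # Prune build output before descending into it.
            dirs[:] = [d for d in dirs if d not in SKIP_DIRS]
            for name in files:
                if os.path.splitext(name)[1].lower() in SOURCE_EXTENSIONS:
                    full = os.path.join(root, name)
                    zf.write(full, os.path.relpath(full, student_dir))
    return zip_path
```

Run it once per student folder to get one clean ZIP per submission, ready to upload.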
Detection methods
Peer comparison
- All‑against‑all comparison within the check
- Highlights likely collaboration chains
- Low cost and fast
Web scanning
- Searches public code sources and forums
- Best used in addition to peer checks
- Higher coverage, longer runtime
Interpreting results
Similarity percentages are indicators. Always review matched regions in context before taking action.
- 0–20%: typically original work
- 21–50%: review required; check common patterns
- 51–80%: high similarity; investigate and document
- 81–100%: critical; likely extensive copying
When reviewing a match, examine:
- Matched files and line ranges
- Whether structure/logic is copied vs. trivial boilerplate
- Source composition (peer vs. external)
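The similarity bands above can be encoded as a small helper for triage scripts. The tier labels are illustrative names, not Codequiry's own terms:

```python
def review_tier(similarity: float) -> str:
    """Map a similarity percentage to an illustrative review band."""
    if not 0 <= similarity <= 100:
        raise ValueError("similarity must be between 0 and 100")
    if similarity <= 20:
        return "typically original"   # 0-20%
    if similarity <= 50:
        return "review required"      # 21-50%
    if similarity <= 80:
        return "high similarity"      # 51-80%
    return "critical"                 # 81-100%
```

A tier is a prompt to look at the matched regions, not a verdict on its own.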
Best practices
Do:
- Run peer checks first for quick signal
- Use web checks to confirm external copying
- Document decisions and keep context
- Consider legitimate reuse and starter code
Don't:
- Judge by percentage alone
- Ignore project‑provided templates
- Skip manual review of highlighted matches
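One rough way to spot starter code before judging individual students: a snippet matched in nearly every submission is probably template material. This sketch assumes you have exported match data into a mapping of student ID to matched-snippet identifiers (a hypothetical shape; adapt to however you record matches):

```python
from collections import Counter

def boilerplate_snippets(matches_by_student, threshold=0.8):
    """Return matched snippets appearing in at least `threshold` of
    submissions -- a rough signal of starter code or templates."""
    counts = Counter()
    for snippets in matches_by_student.values():
        counts.update(set(snippets))  # count each snippet once per student
    cutoff = threshold * len(matches_by_student)
    return {s for s, n in counts.items() if n >= cutoff}
```

Discounting these snippets first keeps the remaining percentages focused on genuinely shared work.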
FAQ
How long does a check take?
Peer comparison usually finishes in a few minutes. Web checks take longer depending on scope.
Can I combine peer and web checks?
Yes. Run peer first, then enable web scanning for selected submissions or for the entire check.
How should I handle boilerplate or starter code?
Review matches in context. Reused boilerplate often appears across many submissions and should be discounted.
What should I upload?
Upload source files inside a ZIP. Avoid binaries and build output. Most common programming languages are supported.