Codequiry Usage Guide

Learn more about how to use and take advantage of Codequiry's code plagiarism detection service.


Codequiry provides full plagiarism detection services for source code. Our solution is built from the ground up for source code similarity and code plagiarism detection. We serve thousands of users across industries, from education to enterprise.

Welcome to the platform usage guide. The goal of this guide is to give you all the information you need to use Codequiry properly and to make full use of the features provided to you. You may click below to skip to relevant sections or use the sidebar to your left.

All about the account

Before you start checking source code for plagiarism, you will need to create an account. You may visit the link below to do so.

Creating an account

When creating an account you will need to enter payment details and basic information such as your name, email, role, and password. This information is never shared with anyone; its purposes are to assign ownership of your data and to provide you individualized support.

You may change your account information at any time under "Account Settings" in the top-right menu of your dashboard.

Start exploring

Once your account is created and your trial has started, it's time to start exploring and getting familiar with the dashboard layout. Before you can start checking source code for potential plagiarism, you will need to follow the instructions below.

Getting started with Codequiry

These tabs contain the most important information on getting started. If you wish to go ahead, the rest is optional!

Folders and creating your first folder

The first action to take after creating your account is to visit the dashboard home page and create a folder. A folder is merely a way for you to organize the checks you wish to perform. This step is self-explanatory: just click New Folder at the top right of your page. You will be prompted to name the folder (minimum of 3 characters).

Checks and creating your first check

On the home page of your dashboard you will be able to create a check inside a designated folder. This allows you to specify information such as the parser to be used and the type of check you wish to run.

Uploading submissions

This step is the most important: if it is done incorrectly, your checks may not run correctly.

Before uploading, some important information about your files

Please note that for each submission, you must provide a .zip file containing all the source code for that submission. This means that if you have 10 submissions you wish to check, you will need to upload 10 individual .zip files, each containing the source files for one submission. Providing a single .zip file containing more than one submission will NOT work; it will be counted as one submission. Also ensure that each submission contains at least one file for the selected language: for instance, if Java is selected, each submission must have at least one .java file inside its .zip file. We recommend using WinZip or WinRAR for compressing your files.
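If you have many submissions, packaging them by hand is error-prone. The sketch below, written under the assumption that each submission lives in its own folder inside one parent directory (the folder layout and function name are illustrative, not part of Codequiry), creates one .zip per submission and skips any submission missing a file of the selected language:

```python
import zipfile
from pathlib import Path

def zip_submissions(submissions_dir, out_dir, required_ext=".java"):
    """Package each submission folder into its own .zip archive.

    Skips any submission that lacks at least one file with the
    required extension (e.g. ".java" when Java is the selected parser),
    since such submissions would not parse.
    """
    submissions_dir = Path(submissions_dir)
    out_dir = Path(out_dir)
    out_dir.mkdir(parents=True, exist_ok=True)
    packaged = []
    for folder in sorted(p for p in submissions_dir.iterdir() if p.is_dir()):
        files = [f for f in folder.rglob("*") if f.is_file()]
        if not any(f.suffix == required_ext for f in files):
            print(f"Skipping {folder.name}: no {required_ext} file found")
            continue
        archive = out_dir / f"{folder.name}.zip"
        with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as zf:
            for f in files:
                # Store paths relative to the submission folder so the
                # source files sit at the top of the archive.
                zf.write(f, f.relative_to(folder))
        packaged.append(archive)
    return packaged
```

Each resulting archive contains exactly one submission, which matches the one-zip-per-submission rule above.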

It's time to start checking submissions for plagiarism

Once you have ensured you have followed the instructions and are ready to run your check, you may click the "Start checking files" button located in the Upload/Submissions tab of your check.

Checking types

Congratulations, you have completed everything required to start checking, your results are almost here. This section will explain the various check types you may run with your submissions.

Group Similarity

The peer check is the default and required test; it checks all provided submissions against each other for similarity. This test yields local similarity results within the group of submissions.

Web Check

The web check will additionally check all provided submissions against 20+ billion sources live on the web. This includes popular code-copying sites such as Stack Overflow and Chegg.

Once you have selected the tests you wish to run, you can start the checking process. Checking speed will vary with the number of submissions, the amount of code provided, and server load; however, you should expect results instantly or within minutes.

Sharing your account

You are able to share your account plan if you have multiple people in your organization who need to check files. This option is located on the "Account Sharing" page under your account.

Managing your data

Codequiry believes that you should be in control of your data. This is why we give you full access and transparency to your data that is accumulated on our platform. You may export and download data related to your account, as well as view and clear data that you do not wish to keep in our databases.

Codequiry does not use your files for purposes other than checking, nor are they shared with third parties.

Interpreting the results

Now that your assignment has been successfully checked, it's time to break down those results so you can understand how to interpret the data and visualizations. You will find two new pages available for viewing, "Overview" and "Results". Below you will find more information on what these pages show and how to interpret what you are seeing.

  • Insights Page

    The insights page ranks peer and web matches by degree of estimated similarity. These samples are organized automatically, with the highest similarity matches toward the top.

  • Match Composition

    The Match Composition diagram indicates the sources of code samples. This diagram will indicate what percentage of matches came from a particular source like Github.

  • Overall Variance

    The Variance detector graph monitors the level of variance detected between web and group similarity. Variance is a normalization metric that is useful when analyzing a check with multiple submissions: it helps you understand the level of disparity between submissions. This metric is most useful for group similarity checks.

  • Overview Page

    The overview page will show you a bird's-eye view of similarity and scores for local checking (Peer Check).

  • Cluster Graph

    A large part of this page is taken up by the cluster graph, which gives you a visual representation of how similar submissions are to one another. The key thing to look at is the distance between submissions: the closer two submissions sit, the more similar they are overall. An instant red flag is a tight cluster of submissions, which can indicate collaboration or sharing of code.

  • Score Bar Chart

    The score bar chart displays all submissions with their highest comparison score shown.

  • Similarity Table

    The similarity table shows all comparisons made with the similarity score for each submission comparison. You can sort through by the highest similarity to see potential peer code plagiarism.

  • Results Page

    The results page is the place to see a submission's results in detail. You will be able to see highlighted matches as well as matches from Peer, Database, and Web Check.

  • File Viewer

    The file viewer allows you to cycle through parsed source code files along with highlighted matches.

  • Match Explorer

    The match explorer will display all matches found for the currently selected file. Peer matches are displayed first along with a similarity score, followed by web matches. Clicking a match will open the file side by side with the match found; if it is a web match, you will also be provided with the URL of the web page.

  • Plagiarism Score

    The plagiarism score is an estimated probability of plagiarism based upon the submission's level of unoriginality. The score is broken into three main parts: peer score, web score, and score average. This is only an estimate and may not be accurate, so you will need to examine the results in more detail if the percentages seem alarming.

  • Source Makeup

    The source makeup pie chart gives an overall estimate of the composition of the selected submission. It includes the percentage of unique content, similar peer content, and similar external content. Like the plagiarism score, it is only an estimate, so examine the results in more detail if the percentages seem alarming.

All data from checking results should be taken with a grain of salt. Codequiry never guarantees that a submission has been plagiarised; it is up to the person performing the tests to make that determination. Percentages and scores are estimates according to our similarity threshold and do not represent actual values.

Start checking code for originality

Our Mission

Codequiry aims to foster a fair environment in computer science and related fields by preventing the use of unoriginal code. The first step to preserving academic integrity and original source code starts here.


© 2018-2024 Codequiry, LLC.