Screen preparation guide
https://coderpad.io/resources/docs/for-candidates/screen-preparation-guide/ (Thu, 19 Feb 2026)

Below you’ll find a list of resources to help you prepare for your Screen assessment:

✅Want some coding practice to prepare for your Screen assessment? You can access free gamified coding challenges on our CodinGame sibling site.

Onboarding tutorial

When you click on the invitation link, you’ll access a tutorial showing what a Screen test looks like.

Screen tutorial page with the options to start tutorial or start the test.

You can take it as many times as needed. A tutorial’s questions change from one test to another, depending on the technologies you’re evaluated on and the type of questions chosen by your recruiter. Two tutorials based on two different tests will not show the same questions. 

✅ As we’re an online testing platform, the solutions are not available anywhere online. We are happy to help if you are facing a technical difficulty, but we will not provide any actual answers to the tutorial or test questions. If you encounter difficulties, do not hesitate to contact support or chat with us.

The steps to complete the question are shown: 1. Read and analyze the question, 2. Write your answer, 3. Run tests, and 4. Check the solution.
The coding question tutorial

On the invitation screen you will also see how many questions you will have to work on and the maximum time the test is expected to take you.

Scoring

Once you finish your tutorial, you will see your score for the questions you answered to help you understand how the scoring works. 

The tutorial end screen with a review of the answers.

✅ Some multiple choice questions (MCQs) have several correct answers.

The validators we use to grade your code are not the same as the test cases you see in the test environment. This means that your attempt might “pass” the visible tests even though your solution does not work properly against the grading validators.

If a question times out, you do not automatically fail it; your code is submitted as-is at the timeout. If your attempt works and earns points, they will be added to your global score, so a timed-out question does not always score zero.

❗Your recruiter is the only one who can decide whether to send your score to you or not. CoderPad cannot access your test or provide your score. If you wish to know your score, please contact your recruiter. 

Preparing for a Screen test

Since no two Screen tests are the same, the best way to prepare is to keep your skills sharp by practicing software development as much as possible. To this end, you can create an account with our CodinGame Community, where you can improve your dev skills by solving puzzles and playing games that let you practice specific languages and technologies.

Question types

  • Multiple choice questions: With only several seconds to answer, these questions are intended to test your language knowledge.
  • Text questions: Like multiple choice questions, these test your language knowledge; you will only have several seconds to type your answer.
  • Programming questions: With this type of question, you will have more time (several minutes or longer). You will be given a problem with starter code and will have to solve it in the most efficient way to earn all the points. For these questions, you can make use of all available resources, like Stack Overflow or Google — however, you may not have someone else actively solve the question for you.
  • Gaming questions: These are also programming questions, but with a different outcome: the visual rendering is presented as a video game.
  • Project questions: Projects provide you with a complete VS Code development environment to showcase your skills on real-world coding projects. Check out this specific guide to learn more about Project exercises.
  • Audio/Video questions: You’ll be presented with a text prompt and you’ll either answer as recorded audio or video via your computer’s webcam and microphone.

Keyboard shortcuts

At Screen, we use the Monaco Editor shortcut system in our editor. You can check out their website for more information.

Request a test retake

Some companies will allow you to retake a test.

If a company participates in this feature, you will see this screen at the bottom of your test results email:

A results email with the request retake option shown at the bottom. the text reads: "Not happy with your score? You can ask the recruiter for a second chance".

Simply click on the Request retake button and you’ll be taken to a screen to explain the reason(s) you’d like to retake the test:

Request retest screen. There is a field for the candidate to explain why they'd like to retake the test. At the bottom they submit the form by clicking on a "request retake" button.

The request will be sent to the test administrator. They will then confirm or deny your retake request.

If your request is approved, you will be sent another test.

Creating custom questions
https://coderpad.io/resources/docs/screen/customizing-questions/creating-custom-questions/ (Thu, 19 Feb 2026)

Select an item from the list for more information on custom question creation:


View current custom questions

To view your organization’s existing custom questions, select the Custom tab at the top right of the screen. This switches you over to the Custom questions tab.

Create a custom question in the question editor

To add a new custom question, click Create question in the top right corner of the Questions list.

Custom Question list screen with an arrow pointing to the "create question" button at the top right of the screen.

In the Select question type pop up, select a Question type. There are four primary options:

The "select question type" menu with 4 options: multiple choice, free text, coding exercise, and project.

If you select Coding exercises, you’ll be able to choose from Function-based or Language-specific questions.

Once you select the question type and any type-specific configurations, you’ll then need to select the question language(s) and click Create question.

New question modal with "language" selection drop down and "create question" and "back" buttons.

For coding exercise questions, you’ll also have the option of selecting a template that will pre-fill the question editor with a CoderPad-validated programming exercise.

New question modal with "language" selection and "Template" drop down and "create question" and "back" buttons.
The template drop down is expanded with options for "Default template", "accounting test: calling a service", and "custom pipe knowledge" options shown.

Create multiple choice questions

  • Input the Title (the candidate does not see the title).
  • Write out the question in the Statement box on the left.
  • Text options include common styling options, code inputs, LaTeX formulas, and adding images and attachments.
  • On the right, input the answer options.
  • Indicate the correct answer(s) with the switch.
  • Select Multiple answers possible if there is more than one answer.
  • Select Randomize order during the test to randomize the answer list order if required.
  • The candidate must select ALL valid answers, and no other answers, to receive the maximum points available, otherwise marks are proportional.
Multiple choice creation screen consisting of title field, statement field, and answers field.
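The docs above say full marks require selecting all valid answers and nothing else, with proportional marks otherwise, but do not spell out the exact formula. The sketch below is one hypothetical interpretation, purely for illustration — the class name, the per-answer bookkeeping, and the penalty for wrong picks are all assumptions, not CoderPad’s actual algorithm:

```java
public class McqScoring {
    // Hypothetical proportional-scoring sketch (NOT CoderPad's documented formula):
    // full marks only when every correct answer is selected and no wrong one is;
    // otherwise points scale with correct picks, with wrong picks counting against.
    public static double score(boolean[] correct, boolean[] selected, double maxPoints) {
        int totalCorrect = 0, hits = 0, wrongPicks = 0;
        for (int i = 0; i < correct.length; i++) {
            if (correct[i]) { totalCorrect++; if (selected[i]) hits++; }
            else if (selected[i]) wrongPicks++;
        }
        if (hits == totalCorrect && wrongPicks == 0) return maxPoints; // perfect selection
        return maxPoints * Math.max(0, hits - wrongPicks) / totalCorrect; // proportional
    }

    public static void main(String[] args) {
        boolean[] answers = {true, true, false};
        System.out.println(score(answers, new boolean[]{true, true, false}, 100)); // prints 100.0
        System.out.println(score(answers, new boolean[]{true, false, false}, 100)); // prints 50.0
    }
}
```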

Settings

Question settings section to select domain, skill, difficulty, duration, and points.

Add the following details:

  • The domain (i.e. technology or programming language) of your question; you can select an existing domain or create a new one.
  • The skill to test.
  • The question difficulty.
  • Question time limit.
  • Total points available.
  • Whether to add the question to automatically generated tests.
  • Whether to authorize Screen to add this question to the Screen question library.

Click SAVE to save the question.

AI-generated multiple choice questions

If you’re looking to quickly create multiple choice questions for a test, you can use the AI-generated question creation feature.

Simply click AI-generated questions for the question type, and then follow the instructions to generate your questions.

Interface titled ‘Select question type’ showing five options: Multiple-choice, Coding exercise, File upload, Multiplayer question (admin), and AI-generated questions, each displayed as selectable buttons with icons.
Modal titled ‘AI-generated multiple-choice questions’ with fields for ‘Skill or topic to assess’ (set to Prompt engineering) and ‘Additional instructions’ (text about assessing ability to use AI for coding). Options are shown for target experience level (Junior, Senior selected, Expert) and number of questions (3, 5 selected, 10, 25). A dropdown labeled ‘Team that can edit these questions’ shows CoderPad Inc., and a ‘Generate questions’ button appears at the bottom

Create free text questions

Input the Title and write out the question in the Statement box below.

Empty text question creation page. Consists of title and statement fields.

Add the Settings as before.

Settings for test question including fields for domain, difficulty, duration, and points.

Select (1) Automatic or (2) Manual validation in the Validation section.

1. Automatic validation

Select the skill and input the answer to the question. Candidate answers must match the expected answer exactly, though matching is case-insensitive.

Validation section with "automatic" option selected and the skill and answers field empty.

✅ Check the Regular expression box and input a regex that allows a more flexible range of answers. See the Oracle Java regex documentation for more information.
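Since Screen uses Java regex syntax for these validators, a flexible-answer pattern can be tried out locally with `java.util.regex` before pasting it into the Answers field. The pattern below is a made-up example (accepting “HashMap” with optional article and package prefix), not one from CoderPad:

```java
import java.util.regex.Pattern;

public class RegexValidation {
    // Case-insensitive full match, mirroring the case-insensitive comparison
    // described above for automatic validation.
    public static boolean matches(String pattern, String answer) {
        return Pattern.compile(pattern, Pattern.CASE_INSENSITIVE)
                      .matcher(answer.trim())
                      .matches();
    }

    public static void main(String[] args) {
        // Hypothetical pattern: accepts "HashMap", "a HashMap", or "java.util.HashMap"
        String pattern = "(a\\s+)?(java\\.util\\.)?hashmap";
        System.out.println(matches(pattern, "A HashMap"));         // prints true
        System.out.println(matches(pattern, "java.util.HashMap")); // prints true
        System.out.println(matches(pattern, "TreeMap"));           // prints false
    }
}
```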

2. Manual validation

Input a Label for the report. Select the Skill and Weight. The Points value auto-populates.

Validation section with "manual" option selected. The fields for label, skill, weight, and points are displayed.

After a candidate completes a test with a question that needs manual validation, you will receive an email inviting you to manually validate the answer to the question. The system then calculates a final score.

Create language-specific exercise

Once you’ve selected Language-specific from the Coding exercise question type menu, you’ll be taken to the language selection window.

A list of programming languages arranged alphabetically.

Once you select a language, you’ll be taken to additional settings. Here you can select the language for the question. Additionally, the Template dropdown contains existing Screen questions that you can use as a basis for your own.

New question modal with a question type of programming exercise and a programming language field with "English" selected and a "template" field displayed.

You can edit the details of the existing question, including adding a zip file, before choosing the settings.

Template shown with title and statement fields.

In Settings, input the points for the question.

Settings section with domain selected as "language independent", difficulty as "easy", duration as 12 minutes, and points as 100.

✅ The domain field may have more options depending on the language selected.

If you want, you can add your question to the Screen library by checking the authorization box. Hover over the information circle to see the details:

By checking this box, you authorize CoderPad to add this question to their library of available questions. Any CoderPad customer will be able to add your question to their tests. If a large pool of candidates attempts your question as part of their coding tests, the comparative score (used to benchmark candidates based on their skills) for your question will be more precise.

Next, you can set the execution time limit to focus the question on code optimization or allow extra time for long-running processes. You’ll also have the option to add custom files in the External dependency section. By clicking the Add external dependency button, you can add CSV files, text files, JSON files, or any other kind of files you’d like the candidate to work with.

The image shows an execution environment configuration screen, indicating that an Angular 2+ environment is running. There's an execution timeout setting, which is currently set to 20 seconds. Additionally, there's an option to add an external dependency, but it appears inactive or disabled at the moment. There are also advanced settings that can be expanded for further configuration.

⚠ Your dependency files must be uploaded as `.zip` format.

Scroll down to see the Initial candidate answer code and the Initial candidate test code input boxes. This is the editable code your candidate sees at the start of the test.

Initial candidate answer code box on the left, and initial candidate test code box on the right.

The Code validator section stores validation code which runs against the candidate’s solution in order to assess it.

Code validator screen with some code displayed.

Underneath the Code validator is a Validator Mapping section where you can set the criteria for validation.

In all language-specific coding exercises, there is a parent validator and possibly one or more child validators. The child validators are only evaluated if the parent validator passes — if a candidate fails the parent validator, they automatically get 0 points for that question.

ℹ The reason for utilizing this parent/child logic is that you may want to test specific edge cases (e.g. input is null), but you don’t want to award points if the candidate has only implemented the edge cases (if input is null then x).

If you really want to disable this logic, you can put a test that always evaluates to true in the parent validator.
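The parent/child gating described above can be sketched in a few lines. Everything here is illustrative — the `score` function, the point weights, and the sample validators are assumptions for demonstration, not Screen’s internal implementation:

```java
import java.util.function.IntBinaryOperator;

public class ValidatorDemo {
    // Hypothetical sketch of parent/child validator logic: child validators
    // (edge cases) only award points if the parent (main case) passes first.
    public static int score(IntBinaryOperator solution) {
        boolean parentPasses = solution.applyAsInt(2, 3) == 5;  // parent: main case
        if (!parentPasses) return 0;                            // fail parent -> 0 points
        int points = 50;                                        // parent validator's points
        if (solution.applyAsInt(0, 0) == 0) points += 25;       // child: zeros edge case
        if (solution.applyAsInt(-1, 1) == 0) points += 25;      // child: negatives edge case
        return points;
    }

    public static void main(String[] args) {
        System.out.println(score((a, b) -> a + b)); // prints 100 (passes parent and children)
        System.out.println(score((a, b) -> 0));     // prints 0 (fails parent; children never count)
    }
}
```

Note how a solution that only handled the edge cases (always returning 0) scores nothing, which is exactly the behavior the parent/child design is meant to guarantee.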

Validator mapping section with label, method, skill, weight, points, and status columns displayed.

✅ The method field in the Validator Mapping must match the method name in the Code validator.

You can also try out a Possible solution and test your code. Click Preview to test the question:

Possible solution section with code box on the left and a "validate the solution" button on the right.

You will see the question as the candidate sees it:

A candidate question preview with question on the left, candidate's answer on the right, and test code on the bottom right.

After testing and submitting, click Save to save the question.

Create project exercise

Project exercises are in-depth coding questions that involve a full IDE. You can find more information on creating project questions here.

Create a function-based question

Function-based questions allow you to create language-agnostic exercises where the candidate can answer in the programming language they are most comfortable with.

Once you’ve selected Function-based from the Coding exercise question type menu and selected the question’s language, you’ll be taken to the configuration page.

First, in the Instructions section, you’ll need to describe the goal of this question using the input box. You can use text, images, links, formulas — whatever you need — to describe this question to your candidates.

The instructions section with the goal input field displayed.

✅ You can change the language at the top right of the screen to see what the question would look like in other languages.

Also in the Instructions section you’ll see the Implementation of the functions. This section will be filled out automatically as you complete the rest of the steps in the question creation process.

The implementation section shows details of the function including the description, parameters, return value, constraints, and an example.

Next, you’ll fill out the inputs in the Exercise section.

You’ll add the name of the function you want the candidate to write (1), as well as the name (2), type (3), and description (4) of both the input and output parameters.

The exercise screen. On the left is the parameters section with a 1 next to the function name, a 2 next to the parameter name, a 3 next to the parameter data type, and a 4 next to the description. The input parameters are on the top left of the screen, and the output parameters are on the bottom left of the screen.

You can add more parameters by clicking Add parameter (8). Additionally, certain input data types will allow you to add constraints (9), including min values, max values, and pattern matching.

The parameters screen is shown with a 9 next to the "add constraints (optional)" link and the "Add parameter" button.

On the right side of the screen, the initial code will automatically fill out as you add the different parts of the function description.

The initial code screen is shown with some generated code displayed.

Next, in the Validation section, you can add test cases (ungraded, for candidates’ use only) and validators (graded tests) that will run against the code to let you know if the candidate was able to solve the question.

For test cases, you simply need to select Test from the Type drop-down, add a label (test name), the input, and the expected output. If you already have a possible solution, you can click the Generate from solution button to have the correct output automatically generated from your code. To add another test, simply click Add test case.

Screenshot of the Validation tab. The left panel, labeled "Test cases", shows a warning in red reading "At least one validator (visible or hidden) should be defined", a test case labeled "Simple test" with input "numbers" set to the list `[1, 2, 3]` and expected output `6`, and an "Add test case" button. The right panel contains a "Possible solution" code area with a green "Validate the solution" button. The top bar shows the Instructions, Function, Validation, Settings, and Preview tabs (Validation active), a language dropdown set to Java, and a yellow "Save" button at the top right.
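For the “Simple test” case shown in the screenshot (input `[1, 2, 3]`, expected output `6`), a possible solution entered on the right could look like the following. The class and method names are placeholders — Screen generates the actual function signature from the parameters you defined earlier:

```java
import java.util.List;

public class SumSolution {
    // Possible solution for the sample exercise: sum the "numbers" input list.
    public static int sum(List<Integer> numbers) {
        int total = 0;
        for (int n : numbers) total += n;
        return total;
    }

    public static void main(String[] args) {
        System.out.println(sum(List.of(1, 2, 3))); // prints 6, matching "Simple test"
    }
}
```

Clicking Generate from solution would then use a solution like this to fill in expected outputs for additional test cases automatically.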

For validators, you’ll need to select the Type (hidden or displayed to the candidate), a label (validator name), a skill that the validator is testing, and the weight of the question — the higher the weight, the more points will be awarded for passing that particular validator. If you already have a possible solution, you can click the Generate from solution button to have the correct output automatically generated from your code. To add another validator, simply click Add validator.

The validators section is shown with the type, label, skill, weight, points, input box, output box, "Generate from solution" button, "validate the solution" button, and "Add validator" buttons displayed.

Once your possible solution is written, simply click the Validate the solution button to see the results.

The validation section is shown with the possible solution box and validate the solution button highlighted.

Next you’ll fill out the question settings in the Settings section. Here you can edit:

The settings section is shown with the following fields: Question title, Question domain, Difficulty level, Duration, Point value, Language, Allow use of the question for test generation, Allow CoderPad to use the question for our question library.

Lastly, you’ll be able to view your question from the candidate’s perspective in the Preview section at the bottom.

The question preview screen is shown with the instructions and console output on the left, and the coding section on the right.

Create an audio or video question

Audio and video questions can be added at different stages of your hiring workflow, depending on the signals you want to capture.

Use video questions after a coding exercise

Add a short (3–5 minute) video question after a take-home or coding task when you want to:

  • Confirm the candidate understands the solution they submitted
  • Evaluate how they think about trade-offs (e.g., performance vs. readability)
  • Assess reasoning and communication skills before investing live interview time
  • Ask how they would adapt their solution under different constraints
  • Reduce time spent in early technical interviews by filtering for clarity and depth

✅ Keep the prompt focused (e.g., “Walk us through your approach and the trade-offs you considered”) and limit responses to a few minutes.

Use video questions as a standalone screening step

Add a video question early in your funnel when you want to:

  • Evaluate communication skills at scale
  • Replace or reduce live HR pre-screen calls
  • Ask structured questions about past experience
  • Assess motivation and clarity of thinking before scheduling interviews
  • Reduce scheduling overhead for your team
  • Shorten time-to-hire in high-volume roles

✅ Use 1–2 targeted prompts (e.g., “Describe a recent technical challenge and how you approached it”) to keep the experience focused and candidate-friendly.

When video questions add the most value

Video questions are especially effective when:

  • You receive a high volume of applicants
  • Interview bandwidth is limited
  • Communication is critical for success in the role
  • You want stronger signal before committing engineering time
Screenshot of a the custom video question's “Recording” setup page in a multi-step workflow (Instructions, Recording, Evaluation, Settings). The “Recording” tab is active, showing format options with “Video (records camera and microphone)” selected and “Audio only (records microphone only)” unselected. Below, an “Evaluation criteria (100 pts in total)” section displays a single criterion labeled “Criterion” with Skill set to “Problem solving,” Weight set to 1, and Points set to 100, along with an option to add another criterion. A “Preview” button and an “Actions” dropdown appear in the top right.
In addition to choosing the type of recording candidates answer in, you’ll also be able to set up manual grading criteria and basic question settings, like the maximum length of the recording and the total points the question is worth.

Still have questions about how to create video questions? Check out this detailed guide:

Create a file upload question

File upload questions allow you to assign a project to candidates to do on their own time using their own IDE or other technologies/applications they like.

You provide any starting instructions or files they need, and then once they’ve completed the assignment they’ll upload their results for you to review.

Screenshot of a CoderPad interface showing a custom question titled “Question 1 / 1 - Vibe coding.” The timer shows 00:10 elapsed out of 30:00 minutes. The instructions say “Use your favorite LLM and vibe code an enterprise application.” On the right side, there’s an upload area with a download icon and a message saying “You can drag & drop your project here. You must upload before the timer ends.” Below it are buttons for “Browse files,” an optional comment box, and “Submit” and “Preview” buttons at the bottom right.

Add translations

Whenever you create a new question, if you select more than one language, you will find tabbed areas for inputting the text in other languages.

Multiple choice question translation screen with "Chinese" tab opened and title, statement, and answers fields displayed.

Custom question video tutorial

For more information on creating custom questions, check out this Creating custom question with CoderPad Screen tutorial video:

AI Tools for Screen
https://coderpad.io/resources/docs/screen/tests/ai-tools-for-screen/ (Mon, 09 Feb 2026)

✅ Your account administrator can make AI Assist available in tests by default in the Test Template settings. You can also turn this feature on or off for an individual test in the test’s Settings under the General tab.

ℹ AI Assist is only available for Project questions.

Your candidates now have access to current large language models to utilize as a tool in Project questions, including:

  • GPT
  • Claude
  • Gemini
  • Llama

To use it, simply click on the AI Assist window, select your language model and context, and enter your prompt.

Screenshot of a CoderPad coding environment showing a React project. The screen is split into three main panels: a file explorer and instructions on the left, a live preview of a webpage titled “Global Alliance for Literacy” with a speaker icon in the center, and an AI assistant chat on the right discussing code efficiency improvements. A terminal at the bottom shows a development server running.

This feature is particularly useful for candidates to showcase their prompt engineering skills so you can see how they utilize this new technology in a realistic environment.

✅ All of the candidate’s prompts and AI output will be saved for review in playback for the question within the detailed test report.

Include Code in Context

To further understand a candidate’s ability to use generative AI, you can also add context in the AI Assist window.

This allows CoderPad’s AI assistant to read the code in the IDE, similar to tools like GitHub Copilot or Cursor.

Cheating prevention and detection in Interview
https://coderpad.io/resources/docs/cheating-prevention-in-interview/ (Thu, 04 Dec 2025)

To reduce instances of cheating during live interviews—and to keep the process fair, transparent, and reflective of real-world work—we recommend combining preventive controls, thoughtful interview design, and active interviewer engagement using the following CoderPad features and best practices.


1. Use multi-file projects

Always use a multi-file project template instead of a single-file pad.

Large language models perform significantly worse when they must reason across multiple interdependent files, navigate structure, and maintain context—closely mirroring real production work. Multi-file setups also make it easier to assess a candidate’s ability to understand unfamiliar code and reason holistically.


2. Design questions and sessions for reasoning, not recall

Structure interviews to surface thinking, trade-offs, and adaptability rather than polished final answers.

Best practices include:

  • Multi-part or progressive problems that evolve over time
  • Asking candidates to explain their approach verbally as they code
  • Introducing follow-up changes mid-solution to test flexibility

Helpful probing questions include:

  • “Why did you choose this data structure?”
  • “What are the trade-offs of this approach?”
  • “If performance became an issue, how would you optimize this?”
  • “What would happen if we changed X to Y?”

Skilled developers can reason about complexity, edge cases, and constraints. AI-generated answers often struggle under these follow-ups or rely on vague, textbook-style explanations.


3. Enable video and audio

Enable video and audio for live interviews to confirm the candidate is the one coding and to monitor engagement.

  • Video helps establish presence and accountability
  • CoderPad’s video feature does not allow virtual backgrounds or background filters
  • Video and audio can be toggled in the pad settings when launching the session

These signals help interviewers correlate on-screen activity with verbal reasoning in real time.


4. Leverage environment awareness and respond in the moment

CoderPad alerts interviewers when a candidate:

  • Pastes code from an external source
  • Leaves the IDE (e.g., tab-switching)

These alerts help surface potentially unmonitored activity, but they are most effective when paired with active interviewer response.

Best practices:

  • Ask the candidate why they pasted code or left the IDE when alerts appear
  • Treat alerts as prompts for clarification, not immediate disqualification
  • Watch for patterns such as:
    • Perfect code pasted after long silence
    • Long pauses before answering follow-up questions
    • A mismatch between verbal fluency and technical depth

For higher-security interviews, you may also ask the candidate to share their screen if off-pad activity is suspected.

If you have Interview Summary and Outline enabled, notes and transcripts are automatically generated so you can stay focused on the candidate rather than documentation.


5. Use playback and post-interview verification

Every pad records a complete timeline of the session, including:

  • Code edits
  • Runs
  • Cursor movements
  • Copy/paste events
  • IDE exit notifications

After the interview:

  • Review playback to verify behavior and pacing
  • Look for bursts of activity inconsistent with normal typing
  • Use AI-generated summaries and transcripts to support consistent, fair post-interview review

Playback provides objective context and helps reduce bias in hiring decisions.


6. Enable and frame in-app AI usage transparently

If AI is part of your developers’ real-world workflow, interviews should reflect that reality.

  • Enable AI Assist for candidates so all AI usage is visible within the platform
  • Clearly communicate expectations:
    • Any AI usage should remain within the AI Assist tab
    • Candidates should be prepared to explain, critique, and adapt AI-generated output

This allows interviewers to evaluate how candidates use AI, not just whether they use it.


7. Use collaborative and pair-programming techniques

Consider interview formats that emphasize collaboration and real-time problem solving:

  • Pair programming or guided live coding
  • Debugging exercises
  • Verbal reasoning prompts during implementation

Features like name-tagged cursors and “Follow Candidate” mode make it easier to observe how candidates think, communicate, and respond to feedback—skills that are difficult to fake with AI assistance.


8. Control access with the candidate waiting room

Use the candidate waiting room to prevent candidates from entering the pad before you are ready.

This ensures candidates cannot:

  • Pre-read instructions
  • Explore the file structure early
  • Prepare offline solutions before the interview officially begins

When you admit the candidate, you control exactly when the interview environment becomes visible.


9. End interviews to lock the session

Use the “End Interview” action to immediately revoke editing access once the session is complete.

Ending the interview:

  • Prevents post-session edits or overwrites
  • Finalizes the session timeline
  • Ensures playback remains fully reliable for review

Final note

Effective cheating prevention is not just about restrictions—it’s about designing interviews that reward reasoning, transparency, and real-world skills, while giving interviewers the tools and confidence to respond thoughtfully in the moment.

The CoderPad Screen Score Guide
https://coderpad.io/resources/docs/the-coderpad-screen-score-guide/ (Fri, 14 Nov 2025)

The CoderPad Score Guide (download)

Projects exercises
https://coderpad.io/resources/docs/for-candidates/screen-preparation-guide/projects-exercises/ (Thu, 11 Sep 2025)

When you work on a project exercise, you’ll have access to a complete VS Code development environment. This guide will help you prepare for success and make the most of the available tools and features.

Tutorial

Before starting your test, we strongly encourage you to complete the interactive tutorial. The tutorial will help you familiarize yourself with:

  • The VS Code interface and layout
  • Available tools and features
  • How to customize the environment for your preferences

You can take the tutorial as many times as you want. It is not part of your assessment and won’t affect your results.

Getting started with VS Code

Screenshot of the CoderPad Screen IDE during a React project exercise. The left sidebar shows the project structure under 'PROJECT [CODERPAD]' with folders (.vscode, node_modules, src) and files including index.html, instructions.md, config files, and package.json. The main editor displays instructions.md, which describes the project: a React app with a text and button for testing speech synthesis. The right side shows the Simple Browser preview with a page titled 'Global Alliance for Literacy' and a speaker button. At the bottom, the terminal shows npm run dev output with Vite server running on port 5173. A notification suggests installing the Jest extension. The bottom right corner has a yellow 'Submit' button.

Activity bar

The Activity Bar on the far left gives you access to the core elements of the IDE. You’ll land in the Explorer view, which lists all folders and files in your project, with instructions.md open by default. Read this file first; it describes your goals, constraints, and any setup steps. You can then navigate to other files from the Explorer.

Recommended extensions

Recruiters may include recommended extensions for your project. If so, a notification will appear in the bottom-right corner of the IDE. Click Install to add them.

Terminal

VS Code has an integrated terminal that can be accessed from the Panel area. The integrated terminal is crucial for many project tasks (installing packages and dependencies, running development servers and build tools, executing test suites and scripts, etc.).

Command Palette

The Command Palette provides quick access to all VS Code functions. To open it:

  • Windows/Linux: Ctrl + Shift + P
  • Mac: Cmd + Shift + P

VS Code settings

You can customize almost every part of VS Code by configuring settings. You can use the Settings Editor to modify the settings in VS Code or directly modify the settings.json file. We recommend adjusting the settings during your tutorial session so that the IDE matches your preferred configuration in terms of shortcuts, themes, accessibility, etc.
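For example, a few commonly adjusted options in settings.json (the values here are just illustrations of personal preferences):

```json
{
  "workbench.colorTheme": "Default Dark Modern",
  "editor.fontSize": 14,
  "editor.minimap.enabled": false,
  "editor.formatOnSave": true
}
```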

For a comprehensive introduction to VS Code, you can check out the official documentation.

CoderPad-specific settings

Quick Access

Within the VS Code Activity Bar, you will see a Quick Access entry with a star icon. This menu will help you access key resources for your projects, including:

  • Instructions
  • Terminal
  • Web Preview, if relevant for your project
  • AI Assistant, if enabled for your project

Some of these will open automatically when you start your project. If you close them, reopen them from Quick Access at any time.

AI Assistant

If your test enables the AI Assistant, a chat panel will open by default. Use it the way you would at work—to ask questions, make updates to your code, generate snippets, or explain errors—and then review and adapt its suggestions. Recruiters enable this because they want to see how you collaborate with AI, so please use the built-in assistant rather than external tools.

Submitting your work

The Submit button sends your completed project for evaluation.

⚠ Please note that only saved files are included in your submission.

Before submitting:

  1. Save all files: Use Ctrl/Cmd + S on each file or save all with Ctrl/Cmd + K, S
  2. Check for unsaved changes: Look for white circles next to filenames
  3. Test your solution: Run your code to ensure it works as expected
  4. Review requirements: Double-check that you’ve addressed all project requirements

You can save files with one of the following methods:

  • Individual file: Ctrl/Cmd + S
  • All files: Ctrl/Cmd + K, then S
  • Auto-save: Enable in settings for automatic saving
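If you prefer auto-save, you can enable it by adding the following to settings.json (the one-second delay is just an illustration):

```json
{
  "files.autoSave": "afterDelay",
  "files.autoSaveDelay": 1000
}
```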
]]>
Custom Project questions https://coderpad.io/resources/docs/screen/customizing-questions/creating-custom-questions/custom-project-questions/ Thu, 11 Sep 2025 12:50:21 +0000 https://coderpad.io/?post_type=doc&p=43224 To create a custom project, navigate to the Questions page and click Create question, then select Project. This opens the project creation workflow, where you’ll design a complete coding environment tailored to your specific assessment needs.

Screenshot of the CoderPad Screen interface showing a pop-up window titled 'Select question type.' The options listed are Multiple choice, Free text, Coding exercise, File upload, and Project. The 'Project' option is labeled with a red 'New' tag. A large red arrow points directly to the 'Project' option

Selecting a template

Templates are preconfigured minimal projects built around specific technology combinations, designed to serve as starting points for your custom projects.

Templates are intentionally minimal to provide a clean foundation while giving you complete flexibility to customize the environment for your specific requirements.


Screenshot of the 'Select template' screen in CoderPad. Eight template options are displayed in a grid:

Default template – Minimalist Node.js template.

React – Basic React 19 counter app with Vite and Node.js.

Angular – Basic Angular 19 app with Vite and Node.js.

Python – Environment with Poetry, Volta, and Node.js.

Java – Environment with Maven, Gradle, Volta, and Node.js.

Go – Environment with Volta and Node.js.

.NET – Environment (C#, F#, VB.NET) with xUnit, Volta, and Node.js.

Next.js – Full stack Next.js 15.5 with Node.js, Prisma, and PostgreSQL.


Editing your project in the IDE

Once you select a template, scroll down to the Project section and click Edit your project in IDE. You’ll enter a full-featured VS Code environment where you can design your project.

Screenshot of the CoderPad Screen interface in the 'Questions' section. A highlighted box labeled 'Project' includes a folder icon and a button labeled 'Edit your project in IDE.' Below, the 'Settings' panel displays fields: Domain set to Node.js, Difficulty set to Easy, Duration 40 minutes, Points 200, and Team set to CoderPad Inc.


What’s available in the IDE

Screenshot of the CoderPad Screen 'Project edition' interface. The left sidebar shows a project folder named 'PROJECT [CODERPAD]' containing files: .coderpad, .vscode, .gitignore, and instructions.md (selected). The main editor displays the instructions.md file with markdown text explaining how to use the project template, save work, and define instructions. A right-hand panel labeled 'AI Assist' shows the ChatGPT logo with a prompt field. At the bottom right, a yellow button labeled 'Update project' is visible

Your custom project environment includes all standard VS Code features:

  • Integrated terminal with full command-line access for package installation, script execution, and system operations
  • IntelliSense providing intelligent code completion, parameter hints, and error detection
  • AI assistant to get help refining your project
  • Extension marketplace access to install language-specific tools and productivity enhancers
  • Built-in debugger supporting multiple programming languages with breakpoints, variable inspection, and call stack analysis
  • Git integration for version control workflows and change tracking

ℹ Environment configurations

  • Projects run inside isolated Linux x64 containers.
  • Project size (maximum size of the repository): 50MB
  • Disk space (storage available to candidates during their project): 5GB
  • Memory (RAM): 2GB

Candidate instructions

Every project must include an instructions.md file containing your problem statement and setup guidance. It uses markdown syntax. This file is automatically opened and rendered when the candidate starts the question.

Use it for:

  • Problem statement and acceptance criteria
  • Setup/run steps and environment notes
  • Any constraints, expectations, or deliverables
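For illustration, a minimal instructions.md might look like this (the task, files, and commands are hypothetical):

```markdown
# Build a user endpoint

## Goal
Implement `GET /users/:id` in `src/server.js` so it returns the matching user as JSON.

## Setup
1. Run `npm ci` to install dependencies.
2. Run `npm run dev` to start the server.

## Deliverables
- All tests in `tests/` pass (`npm test`)
- The endpoint handles the case where no user matches the given id
```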

.coderpad/settings.json file

The .coderpad/settings.json file controls critical project behavior and must be configured properly for optimal candidate experience.

Example

{
    "files.exclude": ["solution.md"],
    "workbench.defaultOpenFiles": ["src/App.tsx"],
    "autograding": {
        "runCommand": "rm -f result.tap && npm ci && npx jest --ci --reporters=default --reporters=jest-tap-reporter > result.tap",
        "reports": {
            "TAP": ["result.tap"]
        }
    }
}

Configuration options

  • Open files: Specify which files should be open by default when candidates start the project (in addition to instructions.md). This helps direct their attention to starting points or key files.
  • Excluded files: List files and directories that won’t be included in the candidate’s project environment.
  • Autograding logic: Define how automated tests should run and report results (detailed in the auto-grading section below).

Recommend extensions

Use the standard VS Code file .vscode/extensions.json to recommend extensions. Example:

{
  "recommendations": [
    "Orta.vscode-jest"
  ]
}

When candidates start your project, they’ll receive notifications suggesting these extensions, helping them set up an optimal development environment quickly.

Pre-install extensions

You can define a list of extensions that will install automatically at project startup, for both recruiters and candidates.

To set this up, add your extensions to the vscodeExtensions.installedByDefault field in your project’s .coderpad/settings.json file. Here’s an example:

{
    "vscodeExtensions.installedByDefault": [
        "ms-python.python",
        "esbenp.prettier-vscode"
    ]
}

Allowed & restricted extensions

You can also block candidates from installing certain extensions.

⚠ If the setting is not configured, all extensions are allowed. If the setting is configured, all extensions that are not listed are blocked from installing.

For example, the following configuration blocks all extensions except those published by GitHub and Microsoft.

{
     "extensions.allowed": {
          "github": true,
          "microsoft": true
     }
}

For more information on how to allow or disallow certain extensions (including different versions), check out the VS Code documentation here.

AI Assistant

The AI Assistant can be enabled or disabled at the test level through test settings. If enabled:

  • Candidates see an AI assistant panel and can chat with an available model.
  • AI conversations appear in playback for reviewers.

Front-end render

For web development projects, the VS Code Simple Browser module automatically renders your application, providing candidates with immediate visual feedback.

  • When your development server starts on any available port, the Simple Browser opens automatically
  • The Ports view in the VS Code panel shows all forwarded ports and their status

Web preview

For web development projects, a web preview component can be used to render your application, providing candidates with immediate visual feedback. To enable a web preview in your projects, configure the exposed service in the .coderpad/settings.json file using the following fields:

  • mode: Set to browser to enable the web preview
  • openByDefault: Determines whether the preview opens automatically when the project starts
  • port: Specifies which port the preview will map to
  • name: Set a custom display name for your app

For example:

"exposed": [
    {
        "mode": "browser", 
        "openByDefault": true, 
        "port": 5173, 
        "name": "MyApp" 
    }
]

Connecting to a database

You can attach PostgreSQL or MySQL databases to any project through the Execution environment section of the question editor.

Screenshot of the “Execution environment” settings in CoderPad showing a running environment with Node 22.9 and PostgreSQL 16.4. The “Resources” dropdown is highlighted, indicating PostgreSQL 16.4 is selected.

Connection credentials are provided through environment variables. For example:

  • username: process.env.POSTGRES_LOGIN
  • password: process.env.POSTGRES_PASSWORD
  • host: 'screen-execute-environment-node-postgres'
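A minimal sketch of how a Node.js project might build its connection configuration from these variables. The database name and port below are assumptions (PostgreSQL defaults), and the "pg" package is assumed to be installed in the project:

```javascript
// Build a PostgreSQL connection config from the injected environment
// variables. Port and database name are assumed defaults, not provided
// by the documentation above.
const config = {
  user: process.env.POSTGRES_LOGIN,
  password: process.env.POSTGRES_PASSWORD,
  host: 'screen-execute-environment-node-postgres',
  port: 5432,            // default PostgreSQL port (assumed)
  database: 'postgres',  // default database name (assumed)
};

// With the "pg" package installed, the config can then be used as:
// const { Client } = require('pg');
// const client = new Client(config);
// await client.connect();
```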

ℹ Database RAM limitations

  • PostgreSQL: 200MB
  • MySQL: 500MB

Git integration

Projects include a default .gitignore to keep ephemeral or build artifacts out of version control. As you edit your project, you will be able to keep track of all changes through the Source Control module.

When you start editing your project, a branch is created. When you click Update project, your changes are automatically committed and pushed to the remote project repository.

Once you save the question, the pushed commits are merged on main, and a release is created. Advanced users can manage commits manually through VS Code’s Source Control panel.

How to save your work

  1. Save all files in VS Code (Ctrl/Cmd + S)
  2. Click Update project at the bottom of the screen
  3. Close the IDE
  4. Click Save at the bottom of the question page

⚠ Your temporary branch containing your updated project will only be merged into the main branch once you have saved the question from the question editor.

Auto-grading

A properly configured .coderpad/settings.json file is required for auto-grading functionality. The configuration defines how tests run and where results are stored.

Configuration example

{
  "autograding": {
    "reports": {
      "TAP": ["result.tap"]
    },
    "runCommand": "rm -f result.tap && npm ci && npx jest --ci --reporters=default --reporters=jest-tap-reporter > result.tap"
  }
}

Run command requirements

The runCommand must be designed to work in a fresh container environment and should:

  • Install dependencies: Use npm ci, pip install -r requirements.txt, or equivalent
  • Execute tests: Run your testing framework with appropriate reporters
  • Generate reports: Output results in TAP or JUNIT format
  • Handle cleanup: Remove old result files to prevent conflicts
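For example, a Python project using pytest might satisfy all four requirements with a configuration like this (the report filename is illustrative):

```json
{
  "autograding": {
    "runCommand": "rm -f result.xml && pip install -r requirements.txt && python -m pytest --junitxml=result.xml",
    "reports": {
      "JUNIT": ["result.xml"]
    }
  }
}
```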

Running Multiple Test Suites (Backend + Frontend)

If your Project runs multiple test suites (for example, backend tests and frontend tests), we strongly recommend using one wrapper script (e.g., run-all-tests.sh) instead of running commands sequentially using operators like &&.

  • command1 && command2 prevents the second suite from running if the first suite fails.
  • Some templates clean the junit-reports directory at the start of each script, which can erase reports produced earlier.

Recommended pattern

Create a wrapper script:

#!/bin/bash
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
PROJECT_ROOT="$SCRIPT_DIR"

# Clean up reports once
rm -rf "$PROJECT_ROOT/junit-reports"
mkdir -p "$PROJECT_ROOT/junit-reports"

"$PROJECT_ROOT/run-backend-tests.sh"
"$PROJECT_ROOT/run-frontend-tests.sh"

Update .coderpad/settings.json:

"runCommand": "./run-all-tests.sh"

This ensures all test suites run and produce the expected combined test results.

Correct Cleanup Behavior

Clean the reports directory once at the beginning, not inside each test script.

Avoid this inside test scripts:

rm -rf junit-reports

If cleanup occurs inside multiple scripts, one suite may delete the reports produced by another.

PATH Differences Between IDE and Auto-grader

The PATH inside the auto-grader container may differ from what you see in the IDE terminal.

If a command works in the terminal but fails during auto-grading:

  • Use the absolute path of the command. Example for Node.js Projects:
/home/coderpad/.volta/bin/npm install
/home/coderpad/.volta/bin/npm test

This ensures consistent behavior between Preview, Sync, and candidate grading.

Skipped Tests Are Not Imported

Auto-grading can only import test cases that appear in JUnit/XUnit/TAP reports.

If a test is skipped (e.g., pytest.skip()):

  • It may not appear when clicking Sync from project
  • It will not be included as an evaluation criterion

Instead of skipping, force a failing assertion:

assert False, "Table X does not exist"

This ensures visibility during setup and accurate imports.

Supported report formats

Your test command should generate test reports in either TAP or JUnit format. The reports field defines where test reports are written and in which format (TAP or JUNIT).

Evaluation criteria configuration

Once you’ve configured your settings.json file, you can customize how automated tests affect your reports through the Evaluation criteria section in the question editor.

In the Automatic section, click Sync from project to import test cases from your project. The system will boot a fresh environment, run your runCommand, parse the generated reports, and display individual test cases for configuration.

For each imported test, you can customize:

  • Label:
    • Provides a human-readable description displayed in reports
    • Helps reviewers understand what each test validates
  • Skill:
    • Groups points under specific skill categories (e.g., Problem Solving, Reliability)
    • Each evaluation criterion contributes points to its assigned skill
  • Points:
    • Adjust the weight from 0 to 5 to allocate more or fewer points to each test
    • Higher weights = more points allocated
    • Total question points are distributed proportionally across all criteria based on their weights
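As a hypothetical illustration of the proportional distribution (the exact rounding behavior is not documented here): a 200-point question with three criteria weighted 1, 2, and 5 would split as follows.

```javascript
// Split a question's total points across criteria in proportion to
// their weights (hypothetical example, assumed rounding-free values).
const totalPoints = 200;
const weights = [1, 2, 5];
const weightSum = weights.reduce((a, b) => a + b, 0); // 8
const pointsPerCriterion = weights.map(w => (w / weightSum) * totalPoints);
// pointsPerCriterion is [25, 50, 125]
```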

Manual grading

You can complement automated testing with manual criteria for qualitative signals (e.g., code structure, readability, test quality, security).

If manual criteria are defined, when candidates complete their test, you’ll receive an email to manually grade their work against these criteria.

Each project must include at least one evaluation criterion (automatic or manual).

File visibility

You can hide files/folders using the hiddenValidationFiles setting in .coderpad/settings.json.

Example:

{
  "autograding": {
    "hiddenValidationFiles": ["docs/**"]
  }
}

ℹ hiddenValidationFiles uses glob patterns, not literal folder names.

This:

"hiddenValidationFiles": ["docs"]

hides only the folder itself, not its contents.

To hide the folder and all children, use:

"hiddenValidationFiles": ["docs/**"]

This pattern hides:

  • docs/
  • docs/file.md
  • docs/subfolder/...

More glob pattern documentation: https://code.visualstudio.com/docs/editor/glob-patterns

Does file hiding work in Preview mode?

Yes — Preview reflects candidate visibility.

If hidden files still appear:

  • Confirm you used /**
  • Save the Project and click Update project
  • Remove + re-add the question to the test (to refresh cached structure)
  • Ensure the path matches the workspace root

Additional examples

{
  "files.exclude": [
    ".env",            // hide a single file
    "docs/**",         // hide entire folder
    "scripts/*.sh",    // hide matching files
    "**/*.spec.js"     // hide all test files anywhere
  ]
}

Sync from project

When you click Sync from project, CoderPad:

  • Bootstraps a fresh environment
  • Runs your runCommand
  • Parses any generated test reports
  • Shows test cases that appear in those reports

However, Sync from project does not:

  • Verify that your command works for all candidate submissions
  • Guarantee correct PATH or environment behavior
  • Ensure test suites run in the correct order
  • Validate the correctness of your scoring
  • Detect cleanup issues or glob mismatches

In other words, Sync verifies report availability, not complete correctness.

Testing your question without using quota (Preview mode)

Before assigning a Project to candidates:

  1. Open the Project question.
  2. Click Preview.
  3. Apply your intended solution or partial solution.
  4. Submit the preview.
  5. Examine:
    • Points awarded
    • Parsing of JUnit/TAP files
    • Report rendering
    • Expected pass/fail behavior

Preview submissions do not consume quota and are the recommended method for validating Projects end-to-end.

Re-grading

Currently, it is not possible to automatically re-run auto-grading for submissions that were completed before a Project was updated.

If configuration issues were fixed after candidates submitted, you can:

]]>
Projects https://coderpad.io/resources/docs/screen/tests/projects/ Thu, 11 Sep 2025 12:46:43 +0000 https://coderpad.io/?post_type=doc&p=43145 Projects are multi-file, job-relevant coding exercises that provide candidates with a complete Visual Studio (VS) Code-based development environment with terminal access, extensions, debugging, and all the default VS Code features. They let candidates build, run, test, and debug code just as they would at work—using packages, tools, and workflows they already know.

Key features of projects include:

  • Full VS Code IDE experience with terminal access, debugging tools, and extension support
  • Multi-file support for complex, realistic coding scenarios
  • Custom packages and libraries installation capabilities
  • AI-enabled environment with optional AI chat assistance
  • Auto-grading capabilities with comprehensive test reporting
  • Live front-end rendering for web development projects
  • Database support for backend and full-stack assessments
  • Git integration for version control workflows when creating projects

Projects transform coding assessments from isolated puzzles into authentic development experiences where candidates can demonstrate the full spectrum of their technical abilities.

Why Projects matter in the age of AI

  • Richer, job-relevant signals. Projects mirror real engineering work (multi-file repos, tooling, debugging, tests) so you can assess the skills that matter on the job—not just algorithm skills.
  • Skills are shifting. As AI becomes more powerful and embedded in daily workflows, developers spend less time writing basic snippets and more time on higher-level work: reviewing code, debugging, optimizing performance, stitching together multiple AI-generated suggestions, and making sound architectural decisions.
  • Reduce cheating with real-world tasks. Project questions make copy-paste answers harder to pass. AI may help with pieces of the work, but realistic constraints (configs, tests, data, build steps) require genuine understanding to reach a complete solution.
  • (Optional) Measure AI effectiveness. When enabled, Projects let you see how candidates prompt, evaluate AI output, and integrate it responsibly—providing signal on an increasingly critical skill: coding with AI.

How to add Projects to a test

There are two ways to incorporate projects into your tests:

Method 1: From the Questions page

  1. Navigate to the Questions page
  2. Use the filter dropdown to select Project exercise as the question type
  3. Browse our curated library of ready-to-use projects
  4. Click Create test for your selected projects
Screenshot of the CoderPad questions dashboard showing filters for searching questions. The 'Type' filter dropdown is expanded, displaying options: Multiple-choice question, Free text question, Coding exercise, and Project exercise. A large red arrow points to the 'Project exercise' option

Method 2: From within a Test

  1. Open the test where you want to add a project
  2. Click the Add a question button
  3. Filter the question type to Project exercise
  4. Select from available projects
Screenshot of the CoderPad Screen test creation interface for a Full-stack (JavaScript, Node.js, React, SQL) senior assessment. The 'Questions' tab is selected. On the right, the advanced search filter is open with 'Type' set to 'Project exercise.' A large red arrow points to this filter option. Below, the question list shows project exercises such as 'Verify and host avatar,' 'Static site generator from…,' 'Text-to-speech button,' and 'wqp gpes testing,' with associated points and time durations.


How to create a custom Project

For maximum flexibility and alignment with your specific requirements, you can create completely custom Projects tailored to your organization’s tech stack and coding challenges. You can find more documentation on custom Project questions here.

How to review submissions with Projects

IDE-based review

  1. From a candidate’s detailed report, click Open in IDE for any submitted project
  2. Navigate to the source control icon to see all changes made during the assessment
  3. Review file modifications, additions, and deletions
Screenshot of Visual Studio Code showing Git source control and code changes. On the left, the Source Control panel lists two modified files under ‘Changes’: App.tsx and SpeechButton.tsx. An arrow points to these files. The center shows the SpeechButton.tsx diff view: the old code on the left has a function with a simple console.log message, now deleted. The new code on the right imports useState and useCallback from React, adds state for isReading, and expands the handleTextToSpeech function to get selected text, check if text exists, and verify browser support for speechSynthesis before proceeding.


Enhanced playback system

Projects feature an enhanced playback system that captures not just code changes, but every interaction within the IDE:

  • Complete session recording showing mouse movements, clicks, and navigation patterns
  • AI interaction logs displaying exactly how candidates prompted and used AI assistance
  • Debugging session insights showing problem-solving approaches and troubleshooting methods
Screenshot of a coding assessment environment in Visual Studio Code. The left panel shows the project explorer with files such as App.tsx, SpeechButton.tsx, and instructions.md. The center panel displays instructions.md with context and goal text explaining that a React button should use the browser’s speechSynthesis feature to read selected text aloud. The right panel shows the App.tsx file with React JSX code importing SpeechButton and rendering a header and content. A playback control bar at the bottom shows candidate’s answer recording at 1 minute 8 seconds of 20 minutes, with the play/pause controls visible

✅ View playback in full-screen mode for the most comprehensive review experience.

More resources

]]>
Accessibility https://coderpad.io/resources/docs/accessibility/ Fri, 21 Mar 2025 10:58:57 +0000 https://coderpad.io/?post_type=doc&p=42234 At CoderPad, we believe that technical assessment tools should be accessible to all candidates, regardless of ability. We are committed to providing an inclusive platform that enables companies to evaluate technical talent without creating barriers for people with disabilities.

Frequently Asked Questions

What accessibility standards does CoderPad follow?

On the Screen IDE, we have made a special effort to ensure WCAG 2.2 AA compliance. The Web Content Accessibility Guidelines (WCAG) 2.2 represent the current industry standard for digital accessibility, and AA compliance is the level recommended for most organizations and products.

We’re actively working to ensure CoderPad Interview IDE meets the same compliance standards.

ℹ Interested in learning more about accessibility? Check out the Web Content Accessibility Guidelines v2.2 here.

How does CoderPad ensure accessibility?

Our approach to accessibility includes:

  • Regular Testing: We conduct systematic accessibility testing as part of our development process
  • Customer Partnerships: We have collaborated with several enterprise customers on thorough accessibility audits of our IDE
  • Continuous Improvement: We maintain an ongoing accessibility roadmap to address issues and enhance features
  • Expert Consultation: We work with accessibility specialists to review and improve our platform

What specific accessibility features does CoderPad offer?

We have built our tools following accessibility best practices, so most accessibility features (keyboard navigation, etc.) are already built-in, including:

  • Keyboard Navigation: All core functionality can be accessed without a mouse
  • Screen Reader Compatibility: Our IDE works with popular screen readers including JAWS, NVDA, and VoiceOver
  • High-Contrast Mode Support: CoderPad is compatible with system high-contrast settings
  • Adjustable Font Sizes: Code editors include text scaling options
  • Reflow Standard Compliance: Content properly reflows when zoomed up to 400%
  • Multiple Timer Options: Flexible timing accommodations for candidates who need them
  • Semantic HTML Structure: Proper HTML5 structure for assistive technology navigation
  • Sufficient Color Contrast: Text and interactive elements meet contrast requirements
| Feature | Screen | Interview |
| --- | --- | --- |
| Extending session time | ✅ Extend time per question | Not applicable in a live interview setting |
| Keyboard interactivity | ✅ Hotkeys to run code, autocomplete, Option + tab and Shift + tab to navigate to all elements | ✅ Hotkeys to run code, autocomplete, Option + tab and Shift + tab to navigate to key elements |
| Zoom in / out to increase size | ✅ Adaptive design, scalable UI | ✅ Adaptive design, scalable UI |
| Contrast ratio | ✅ | ✅ |
| Screen reader compatible | ✅ We support all major screen readers | ✅ NVDA |

Can candidates request accommodations?

Yes, our platform supports various accommodations. Recruiters can configure assessment settings to provide extended time, remove time limits altogether, or make other adjustments as needed. We recommend companies include information about requesting accommodations in their communications with candidates.

Why is accessibility important for technical assessment tools?

Accessibility in technical assessment tools is crucial for several reasons:

Legal Compliance:

  • Many jurisdictions require digital accessibility under laws like the ADA (Americans with Disabilities Act)
  • Educational institutions and government agencies have specific accessibility requirements
  • Non-compliance can lead to legal action and financial penalties

Business Benefits:

  • Access to a wider talent pool by removing barriers for candidates with disabilities
  • Demonstration of commitment to diversity and inclusion
  • Improved candidate experience for all users, not just those with disabilities
  • Enhanced brand reputation as an inclusive employer

Ethical Considerations:

  • Equal opportunity in hiring practices
  • Elimination of unnecessary barriers in the recruitment process
  • Focus on evaluating technical skills rather than unrelated abilities

What if we identify an accessibility issue?

We value feedback on accessibility. If you or your candidates encounter an accessibility barrier, please report it to our support team. We prioritize accessibility issues in our development process and work to address them promptly.

Our Ongoing Commitment

Accessibility is not a one-time effort but an ongoing commitment. At CoderPad, we continuously work to improve the accessibility of our platform through regular testing, customer feedback, and staying current with evolving standards and technologies.

For specific questions about our accessibility features or to discuss your organization’s accessibility requirements, please contact our support team.

]]>
Test Insights https://coderpad.io/resources/docs/test-insights/ Fri, 28 Feb 2025 12:31:55 +0000 https://coderpad.io/?post_type=doc&p=42172 The test Insights dashboard is a powerful tool that helps you monitor the quality and effectiveness of your tests over time. It provides actionable metrics on candidate activity, engagement, and performance.

The dashboard displays aggregated metrics based on the candidates visible in your test’s Candidates tab. You can view the following metrics in the dashboard:

  1. Candidate breakdown
    • Breakdown of candidates across different stages (Invited, Started, Completed, Reviewed).
    • You can switch between breakdowns by review status or custom tags.
  2. Open rate
    • Percentage of invited candidates who started the test.
    • Calculation: Started candidates ÷ Invited candidates (excluding bounced/canceled invitations).
  3. Completion rate
    • Percentage of candidates who finished the test after starting.
    • Calculation: Completed candidates ÷ Started candidates.
The Insights tab of a test, showing the candidate breakdown by status (Completed, Expired, Abandoned, Pending, In progress) alongside circular gauges for the open rate and completion rate.

  4. Candidate rating
    • Average rating (out of 5 stars) provided by candidates after completing the test.
    • The rating is optional.
  5. Score distribution
    • Distribution of candidate scores across different ranges.
    • The score taken into account is the point score, not the comparative score.
  6. Average duration
    • Average time candidates spend completing the test.
    • Excludes candidates who have not finished their tests.
  7. Test usage over time
    • Number of invitations sent over the past 12 months.
Further down the Insights tab: the average candidate rating, average score, and average time spent on the test, followed by a score distribution histogram and a bar chart of invitations sent over the past 12 months.
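To make the calculations behind these metrics concrete, here is a minimal sketch of how the open rate, completion rate, and score-distribution buckets could be derived from raw candidate records. The record shape, status names, and the assumption that "abandoned" candidates count as started are all illustrative; this is not CoderPad's actual data model or API.

```python
from collections import Counter

# Hypothetical candidate records: (status, score_percent or None).
candidates = [
    ("completed", 55), ("completed", 72), ("completed", 18),
    ("expired", None), ("abandoned", None), ("bounced", None),
]

# Bounced/canceled invitations are excluded from the invited pool.
invited = [c for c in candidates if c[0] != "bounced"]
# Assumed: abandoned and in-progress candidates have started the test.
started = [c for c in candidates if c[0] in ("completed", "abandoned", "in_progress")]
completed = [c for c in candidates if c[0] == "completed"]

open_rate = len(started) / len(invited)          # Started ÷ Invited
completion_rate = len(completed) / len(started)  # Completed ÷ Started

# Score distribution in 10-point buckets (point score, not comparative score).
buckets = Counter(min(score // 10, 9) for _, score in completed)

print(f"Open rate: {open_rate:.0%}")          # 4 of 5 → 80%
print(f"Completion rate: {completion_rate:.0%}")  # 3 of 4 → 75%
print({f"{10 * b}-{10 * (b + 1)}%": n for b, n in sorted(buckets.items())})
```

The `min(score // 10, 9)` clamp simply folds a perfect 100% score into the top 90–100% bucket rather than creating an eleventh bucket.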

Frequently asked questions

  1. How often is the dashboard updated?
    • Data refreshes automatically whenever you navigate to the page.
    • For immediate updates, click the Refresh button at the top right of the dashboard.
  2. What happens if I delete or anonymize candidates?
    • Deleted candidates: They will be removed from all dashboard metrics.
    • Anonymized candidates: They will still be included in aggregated metrics.
  3. Who can access this page?
    • Any user with permissions to view candidate data for the specific test.