
Creating and Managing Marking Schemes

Overview

Marking schemes define the criteria used to evaluate bids. This guide explains how to create, edit, and manage marking scheme rules, and how to use them when analysing your bids.

What Are Marking Schemes?

Marking schemes are sets of criteria for scoring bids (a sketch of the structure follows this list):

  • Categories: Groups of related criteria (e.g., “Technical Approach”, “Project Management”)
  • Criteria: Specific items evaluated (e.g., “Methodology”, “Risk Management”)
  • Scoring Scale: How scores are assigned (e.g., 0-100, pass/fail)
  • Weightings: Importance of different criteria
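
The exact structure varies from scheme to scheme, but a minimal, purely illustrative sketch (in Python) of how these parts fit together is shown below. All field names are assumptions made for explanation only, not the product’s internal format.

  # Illustrative only: how categories, criteria, weightings, and a
  # scoring scale could be represented. Field names are assumptions.
  marking_scheme = {
      "name": "Project X Marking Scheme",
      "scoring_scale": [                     # optional descriptive bands
          {"label": "Excellent", "range": (90, 100)},
          {"label": "Good", "range": (70, 89)},
      ],
      "categories": [
          {
              "name": "Project Management",
              "criteria": [  # each rule is a question with a weighting
                  {"question": "Does the answer demonstrate a clear "
                               "project management methodology?",
                   "weighting": 60},
                  {"question": "Does the answer address risk management?",
                   "weighting": 40},
              ],
          },
      ],
  }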

Understanding Marking Schemes

What They Do

Marking schemes help you understand how your bid will be evaluated, focus your efforts, improve scores, and track performance

When to Use Them

Use when you have client evaluation criteria, want to self-score, need to understand what evaluators look for, or want to improve against specific criteria

When to Create Your Own Rules

Create your own when:

  • The client provides particular evaluation criteria
  • Working with standard evaluation methods (e.g., public sector tenders)
  • You know the successful criteria from past bids
  • Client specifies exact scoring scales
  • Your organisation has standard evaluation criteria

Use automatic generation when:

  • No specific criteria provided
  • Need a starting point
  • Exploratory analysis
  • Time constraints

Best Practice: Even with automatic generation, review and customise to match your needs and client requirements

Accessing Marking Schemes

From Report Page:

  1. Go to Bid Library
  2. Open a bid and click on a question
  3. Go to the Report page
  4. Find the Marking Scheme section and click on the eye icon to view/edit the report

From Bid Creation: Enable marking scheme analysis and select a rule; reports are generated during analysis

Creating a New Marking Scheme

Deciding How to Create

Before creating, consider:

  1. Client evaluation criteria → Create manually to match exactly
  2. Specific scoring requirements → Create manually with custom scales
  3. Need starting point → Use automatic generation, then customise
  4. Want consistency → Create manually to standardise

Step 1: Open Create Dialogue

  1. On the Report page, find the Marking Scheme section
  2. Click Create
  3. The popup opens

Step 2: Choose Creation Method

Automatic Generation

  1. Click Generate tab
  2. Bid Bot generates a marking scheme based on the question, answer, and files (if uploaded)
  3. Review the generated marking scheme
  4. Edit and customise to match your needs
  5. Refine categories and criteria

When to Use: Need a starting point or don’t have specific criteria

Important: Always review and customise automatically generated schemes

Manual Entry

  1. Click the Manual tab
  2. Enter: Name, categories, criteria (rules), scoring scale

When to Use: Specific client criteria, exact match needed, organisation standards, historical success data

Step 3: Define Your Marking Scheme

For Manual Entry

Name: Enter descriptive name (e.g., “Project X Marking Scheme”)

Categories:

  1. Click Add Category
  2. Enter category name and description (optional)
  3. Add multiple as needed
  4. Remove using the Remove button

Criteria (Rules):

  1. Expand category
  2. Click Add Rule
  3. Enter:
  • Question: What this rule evaluates (e.g., “Does the answer demonstrate a clear project management methodology?”)
  • Weighting: Importance (0-100 or 0-1)
  • Category: Auto-filled
  4. Add multiple per category
  5. Remove using the Remove button

Important: Ensure rule weightings sum appropriately (may need to sum to 100 or 1.0)
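
If you want to sanity-check weightings before saving, a small helper like the one below can do so outside the tool. This is a hypothetical snippet, not a product feature; it assumes weightings are entered as plain numbers.

  # Hypothetical helper: check that weightings sum to the expected
  # total (100 or 1.0, depending on the scale you are using).
  def weightings_sum_correctly(weightings, expected_total=100, tolerance=0.01):
      return abs(sum(weightings) - expected_total) <= tolerance

  print(weightings_sum_correctly([60, 25, 15]))          # True
  print(weightings_sum_correctly([0.5, 0.3, 0.1], 1.0))  # False (sums to 0.9)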

For Automatic Generation

Reviewing Generated Scheme:

  1. Review generated categories, criteria/rules, scoring scales
  2. Customise: Edit names/descriptions, add/remove categories/rules, edit questions/weightings, adjust scoring scale
  3. Switch to Manual for extensive changes

Rules

What Are Rules?: Specific evaluation items. Each asks a question about your bid content and is scored individually.

When to Add:

  • Client-specific criteria
  • Important aspects
  • Missing coverage
  • Custom requirements

How to Add:

  1. Click Add Rule within the category
  2. Enter question, weighting, category
  3. Add multiple per category
  4. Remove using Remove

Best Practices: Be specific, match client criteria, set appropriate weightings, cover key areas, avoid redundancy, use clear language

Scoring Scale (Optional):

  1. Click Add Scale
  2. Enter description (e.g., “Excellent”, “Good”, “Satisfactory”, “Poor”) and score/range
  3. Examples:
  • Excellent: 90-100
  • Good: 70-89
  • Satisfactory: 50-69
  • Poor: Below 50
  4. Remove using Remove if needed

Note: The scoring scale is optional – you can use numerical scores only
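
To show how descriptive bands relate to numerical scores, here is an illustrative mapping based on the example ranges above. The function is hypothetical and not part of the tool; adjust the thresholds to whatever scale you define.

  # Illustrative mapping from a 0-100 score to the example bands above.
  def score_band(score):
      if score >= 90:
          return "Excellent"
      if score >= 70:
          return "Good"
      if score >= 50:
          return "Satisfactory"
      return "Poor"

  print(score_band(84))  # Good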

Step 4: Save

  1. Review all entries
  2. Verify the weightings sum correctly
  3. Check completeness
  4. Click Save
  5. Select it from the dropdown when generating reports

Editing Existing Marking Schemes

  1. On the Report page, find the Marking Scheme dropdown
  2. Select scheme
  3. Click Edit
  4. Make changes: Name, categories, criteria, scoring scale
  5. Click Save

Using Marking Schemes

Selecting

  1. On the Report page, find the Marking Scheme dropdown
  2. Select scheme
  3. The scheme is linked to your bid

Generating Reports

  1. Ensure the marking scheme is selected
  2. Enable Marking Scheme in report options
  3. Click Generate Report
  4. View scores against criteria

Understanding Reports

Shows: Overall score, category scores, criteria scores, improvement suggestions

Best Practices

Creating Marking Schemes

  1. Use real criteria when possible
  2. Be specific and measurable
  3. Set appropriate weightings
  4. Keep updated

When to Create Custom Rules

Create your own when: Client provides criteria, standard evaluation, organisation standards, specific requirements, known success patterns

Start with automatic when: No specific criteria, exploratory, quick start, learning

Hybrid Approach (Recommended):

  1. Start with automatic generation
  2. Review against client requirements
  3. Customise and refine
  4. Save for reuse

Creating Effective Rules

Guidelines:

  • Match client language
  • Be specific
  • Make actionable
  • Make measurable
  • Set appropriate weightings based on client importance, historical patterns, and strategic importance

Good example: “Does the answer demonstrate a clear project management methodology?” – specific, measurable, matches common criteria

Poor example: “Is it good?” – vague, not measurable, doesn’t match criteria

Using Effectively

  1. Select early in development
  2. Review regularly
  3. Focus on low scores
  4. Track progress

Managing Multiple Schemes

  1. Name clearly
  2. Organise by type
  3. Reuse when possible
  4. Keep current

Understanding Scores

How Calculated

  • Criteria scores: Individual per criterion
  • Category scores: Average or weighted average of the criteria within each category
  • Overall score: Weighted average of all criteria
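
As a simplified illustration of the weighted-average calculation (the product’s exact formula may differ), assume each criterion has a score out of 100 and a weighting:

  # Simplified sketch: overall score as a weighted average of criteria.
  def overall_score(criteria):
      total_weight = sum(weight for _, weight in criteria)
      return sum(score * weight for score, weight in criteria) / total_weight

  # e.g. methodology scored 80 (weighting 60), risk management scored 55 (weighting 40)
  criteria = [(80, 60), (55, 40)]
  print(round(overall_score(criteria)))  # 70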

Interpretation

  • High (80+): Strong performance
  • Medium (50-79): Adequate but could improve
  • Low (<50): Needs significant improvement

Using to Improve

  1. Identify weak areas
  2. Understand weightings
  3. Address gaps
  4. Track improvement

Troubleshooting

  • Can’t create: Check required fields, verify categories/criteria, ensure permissions, try refresh
  • Not saving: Check required fields, verify validity, check scoring scale, retry
  • Can’t select: Ensure created, refresh dropdown, check access, contact admin if missing
  • Reports not generating: Ensure selected, verify “Marking Scheme” checked, confirm enough content, wait for completion
  • Generated scheme not suitable: Edit automatically generated, switch to manual, adjust as needed, save custom version