Bug report: Duplicate SEO Link — Patch

Another ChatGPT-assisted bugfix: I've been getting daily errors from the cron job related to sys_seo_links duplicates. I hope this is useful to anyone else hitting the same issue. The error from cron looks like this:

Database error in sitename

MySQL error: Duplicate entry 'bx_persons-view-persons-profile-59007' for key 'module_page_param'

This happens when UNA attempts to generate SEO links in the sys_seo_links table.

The issue is not isolated to one module: the system keeps trying to insert SEO links that already exist, causing site errors and breaking pages that render profile cards, search results, or timeline entries.
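
To see which row the cron job is colliding with, you can look it up directly in the database. This is just a diagnostic sketch: the module and id values are taken from the example error above, and it assumes the trailing number in the duplicate key is the param_value (the profile id); adjust both to match your own log line.

SELECT *
FROM `sys_seo_links`
WHERE `module` = 'bx_persons'
  AND `param_value` = '59007';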

Cause of the Problem

The core function insertSeoLink() uses a plain SQL insert:

INSERT INTO sys_seo_links SET ...

The table sys_seo_links has a UNIQUE index on:

(module, page_uri, param_name, param_value)

When UNA tries to create an SEO link for an item that already has one, MySQL throws a duplicate-key error, which results in a PHP exception and interrupts the page load.
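
For illustration, this is roughly what happens at the SQL level. The unique key name comes from the error message and the column list from the index above; the param_name 'id', the slug values, and the use of UNIX_TIMESTAMP() for the added column are made-up examples, not UNA's exact values.

-- First attempt: the row does not exist yet, so the insert succeeds
INSERT INTO `sys_seo_links` SET
    `module` = 'bx_persons', `page_uri` = 'view-persons-profile',
    `param_name` = 'id', `param_value` = '59007',
    `uri` = 'john-doe', `added` = UNIX_TIMESTAMP();

-- Second attempt with the same (module, page_uri, param_name, param_value):
-- MySQL rejects it with "Duplicate entry ... for key 'module_page_param'"
INSERT INTO `sys_seo_links` SET
    `module` = 'bx_persons', `page_uri` = 'view-persons-profile',
    `param_name` = 'id', `param_value` = '59007',
    `uri` = 'john-doe-new', `added` = UNIX_TIMESTAMP();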

This can happen when:

  • Permalinks are regenerated
  • Slugs change (for example, profile name changes)
  • Cache returns outdated values
  • Modules request permalink creation multiple times
  • Search/timeline/pages re-trigger permalink logic

The result is frequent, noisy site errors.

The Fix — Update When Duplicate Exists

Instead of using a plain INSERT and letting MySQL halt the page, we can change the logic so that:

  • New SEO links are inserted normally
  • Existing SEO links are updated instead of causing errors
  • No exceptions break the page
  • SEO slugs stay current
  • No silent failures occur

This is done by using:

INSERT ... ON DUPLICATE KEY UPDATE

This tells MySQL:

  • Insert the row if it’s new
  • Update the existing row when a duplicate is detected
  • Avoid all duplicate-key errors

Existing (UNA core code)

static public function insertSeoLink($sModule, $sPageUri, $sSeoParamName, $sSeoParamValue, $sUri)
{
    return BxDolDb::getInstance()->query("INSERT INTO `sys_seo_links` SET `module` = :module, `page_uri` = :page_uri, `param_name` = :param_name, `param_value` = :param_value, `uri` = :uri, `added` = :ts", [
        'module' => $sModule,
        'page_uri' => $sPageUri,
        'param_name' => $sSeoParamName,
        'param_value' => $sSeoParamValue,
        'uri' => $sUri,
        'ts' => time(),
    ]);
}

This throws an error if the SEO link already exists.

Fixed (Improved version with update logic)

static public function insertSeoLink($sModule, $sPageUri, $sSeoParamName, $sSeoParamValue, $sUri)
{
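    // Upsert: insert a new SEO link, or refresh `uri` and `added` in place
    // when a row with the same (module, page_uri, param_name, param_value) exists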
    return BxDolDb::getInstance()->query("
        INSERT INTO `sys_seo_links`
            (`module`, `page_uri`, `param_name`, `param_value`, `uri`, `added`)
        VALUES
            (:module, :page_uri, :param_name, :param_value, :uri, :ts)
        ON DUPLICATE KEY UPDATE
            `uri` = VALUES(`uri`),
            `added` = VALUES(`added`)
    ", [
        'module' => $sModule,
        'page_uri' => $sPageUri,
        'param_name' => $sSeoParamName,
        'param_value' => $sSeoParamValue,
        'uri' => $sUri,
        'ts' => time(),
    ]);
}
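
To sanity-check the patched query, you can run the same upsert twice by hand in MySQL and watch the second run update the row instead of throwing. The literal values below are made up, UNIX_TIMESTAMP() stands in for the PHP time() call, and the statement otherwise mirrors the patch above.

-- First run: inserts the row (MySQL reports 1 affected row)
-- Second run with a different `uri`: updates it in place (2 affected rows, no error)
INSERT INTO `sys_seo_links`
    (`module`, `page_uri`, `param_name`, `param_value`, `uri`, `added`)
VALUES
    ('bx_persons', 'view-persons-profile', 'id', '59007', 'john-doe', UNIX_TIMESTAMP())
ON DUPLICATE KEY UPDATE
    `uri` = VALUES(`uri`),
    `added` = VALUES(`added`);

The 2 affected rows on the second run are normal: MySQL counts an ON DUPLICATE KEY UPDATE hit as two, so a return value above 1 from the patched function is expected and harmless.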

Why not just ignore it?

Another solution might be to use

INSERT IGNORE

But that approach has major drawbacks:

  • It silently drops the insert
  • The slug never updates even if the content changes
  • Old or incorrect SEO URLs remain forever
  • Debugging becomes harder because errors are hidden
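
A short example makes the first two drawbacks concrete. Using the same made-up values as above, suppose the profile already has a row with uri = 'john-doe' and has since been renamed, so the slug should become 'john-smith':

-- INSERT IGNORE: the duplicate is silently discarded and `uri` stays 'john-doe'
INSERT IGNORE INTO `sys_seo_links`
    (`module`, `page_uri`, `param_name`, `param_value`, `uri`, `added`)
VALUES
    ('bx_persons', 'view-persons-profile', 'id', '59007', 'john-smith', UNIX_TIMESTAMP());

-- The patched ON DUPLICATE KEY UPDATE query above would instead rewrite
-- `uri` to 'john-smith' and refresh `added`.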

The improved solution updates only the values that should change:

  • uri (the SEO-friendly slug)
  • added (timestamp)

Nothing else in the unique key is modified.

ON DUPLICATE KEY UPDATE is the correct method because it:

  • Keeps SEO links current
  • Maintains consistency across all modules
  • Prevents site-breaking errors
  • Does not hide failures
  • Properly handles repeated insert attempts
  • Avoids creating a mess of stale or outdated slugs

This matches the recommended “upsert” pattern for MySQL.
