@unateam @CluBDeveloper @Direct Line Development @Exnihilo Developments @CluBDeveloper @Developer Experiance Hub @HQ-Development @JCo Web Development @Online Audience Hosting & Web Developments @OuroborosDevelopment @TCCDevelopment @tsdevelopment @Web development Institute @jaytech developers @Web application development services @Royal Developer @Developers App India @Andrey Yasko @Alex T⚜️ @Anton L @LeonidS @Rene @Clubhouse
Very soon, the UNA community will become much more active again, and we'll start to see a rise in new members excited by what I'm working on. I'm putting the final touches on a modular app builder and a UNA-focused AI assistant tailored specifically for developers and agencies. This toolkit is designed to make creating custom modules faster, more visual, and far less reliant on deep framework knowledge, letting you scaffold, configure, and iterate UNA modules in minutes instead of hours. As it rolls out, I expect it will help attract new UNA users, both in the open-source community and among professional developers launching custom communities. I'll be sharing the first demos right here, along with GitHub links and early-access options, and I'd love your feedback so we can shape it together and grow the UNA ecosystem around it.
Background
I've just finished a consultation call with Amazon regarding S3 storage for our UNA CMS video timeline system. They've directed me to the AWS Pricing Calculator, but I need the community's expertise to ensure I'm calculating this correctly.
Calculator Link:
What We Know So Far
- Service Required: S3 Standard storage
- Region: Europe (London) / EU-West-2
- Use Case: Storing and delivering video timelines to users
- Free Tier Available: 5GB storage, 20K GET requests, 2K PUT requests monthly
Key Questions I Need Help With
1. Data Transfer Per Timeline Pull
When a user pulls/loads a timeline in UNA CMS:
- How much data is typically transferred?
- Does this include video files, thumbnails, metadata, or all combined?
- Are videos streamed or downloaded in full?
- What's the average timeline size in your experience?
2. Monthly Data Transfer Estimates
For calculating the AWS costs accurately:
- How should I estimate total monthly data transfer?
- What's the typical ratio of uploads (PUT) vs downloads (GET)?
- How many timeline pulls should I expect per active user per month?
3. Calculator Configuration
When using the AWS calculator for S3 Standard (London region), what values should I input for:
- Storage amount (GB): How much video storage per user/timeline?
- PUT requests: Upload frequency?
- GET requests: Download/view frequency?
- Data Transfer OUT: This seems to be the biggest cost factor - how do you calculate this?
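To make the calculator inputs above concrete, here is a rough monthly-cost sketch in Python. Every number in it is an illustrative assumption, not a quote: the per-GB and per-request rates are approximate S3 Standard figures for eu-west-2, and the usage profile (users, pulls, MB per pull) is a made-up small-community scenario. Check the current AWS price list before relying on any of it.

```python
# Rough S3 Standard (eu-west-2) monthly cost sketch.
# ALL rates and usage figures are assumptions for illustration only;
# verify them against the current AWS pricing page.

STORAGE_PER_GB = 0.024   # USD per GB-month (assumed London rate)
EGRESS_PER_GB = 0.09     # USD per GB out to the internet (first tier)
PUT_PER_1000 = 0.005     # USD per 1,000 PUT/POST requests
GET_PER_1000 = 0.0004    # USD per 1,000 GET requests

def monthly_cost(storage_gb, egress_gb, put_requests, get_requests):
    """Sum the four cost components the AWS calculator asks about."""
    return (storage_gb * STORAGE_PER_GB
            + egress_gb * EGRESS_PER_GB
            + put_requests / 1000 * PUT_PER_1000
            + get_requests / 1000 * GET_PER_1000)

# Hypothetical small community: 500 active users, each pulling a
# timeline ~30 times per month at ~5 MB per pull (thumbnails plus
# partial video streaming, not full downloads).
users = 500
pulls_per_user = 30
mb_per_pull = 5
egress_gb = users * pulls_per_user * mb_per_pull / 1024

cost = monthly_cost(storage_gb=200,
                    egress_gb=egress_gb,
                    put_requests=10_000,
                    get_requests=users * pulls_per_user * 20)
print(f"egress ~ {egress_gb:.1f} GB, estimated monthly cost ~ ${cost:.2f}")
```

Even in this small scenario the Data Transfer OUT term grows fastest as pull size or frequency rises, which is exactly the cost a CDN in front of S3, or a zero-egress store like Cloudflare R2, is meant to attack.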
What I'm Looking For
From the community:
- Real-world usage patterns from existing UNA video timeline implementations
- Typical storage and bandwidth consumption metrics
- Best practices for estimating AWS costs for video delivery
- Any tips on reducing data transfer costs (CDN, optimization, etc.)
Specific Help:
- Walk me through the calculator with realistic numbers for a small-to-medium UNA community
- Should I be considering alternatives like Cloudflare R2 (zero egress fees) or DigitalOcean Spaces?
Additional Context
I'm particularly concerned about data transfer OUT costs, as the AWS rep mentioned this is typically the largest expense when users frequently pull/view timelines.
Any insights, formulas, or real-world examples would be incredibly helpful!
Thanks in advance! 🙏
UNA CMS Version: 14
Module: Comments/Reactions (Studio > Navigation > Systems > Comments)
Environment: VPS (Linux/PHP 8.x/MySQL), production community site
Bug Description:
When reactions are disabled in Studio > Navigation > Systems > Comments (or All Comments), then re-enabled, the system creates duplicate entries in the reactions database table (likely sys_comments_reactions or bx_vote_reactions).
Additionally, the frontend only displays/reads the first 3 reactions from the table, ignoring any beyond that—regardless of actual votes/reactions present.
Steps to Reproduce:
- Go to Studio > Navigation > Systems > Comments
- Disable reactions for Comments (or All Comments)
- Save & clear cache
- Re-enable reactions for the same
- Save & check DB: Duplicates appear for the same reaction types (e.g., multiple thumbs-up entries per comment_id)
- Frontend: Only first 3 reactions show (hardcoded LIMIT 3 suspected)
Database Impact:
```sql
-- Example duplicates found (pseudocode):
SELECT * FROM sys_comments_reactions WHERE comment_id = X;
-- Returns 6+ identical rows instead of unique (content_id, reaction_id, author_id)
```
Not Serialization-Related: This is a separate issue from the known BxDolVoteReactions.php unserialize() errors; it is purely toggle-induced duplicates plus the display limit.
Request:
- Confirm whether this is a known issue and, if so, provide a patch (code/DB fix)
- What is the exact table name for reactions on comments?
- Why is the LIMIT 3 hardcoded, and can we override it?
- What SQL will clean the duplicates and prevent future ones?
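Pending confirmation of the real schema, the cleanup asked for above would look roughly like this. The table name `sys_comments_reactions` and its columns are assumptions taken from the bug description, not verified UNA schema; back up the table and test on a copy before running anything on production.

```sql
-- ASSUMED table/column names; verify against your schema and back up first.
-- Keep the lowest id per (comment_id, reaction_id, author_id) triple:
DELETE r1 FROM sys_comments_reactions r1
JOIN sys_comments_reactions r2
  ON r1.comment_id  = r2.comment_id
 AND r1.reaction_id = r2.reaction_id
 AND r1.author_id   = r2.author_id
 AND r1.id > r2.id;

-- Prevent future duplicates with a unique key (fails if duplicates remain):
ALTER TABLE sys_comments_reactions
  ADD UNIQUE KEY uq_comment_reaction (comment_id, reaction_id, author_id);
```

The unique key is the important half: it turns the toggle bug from silent data corruption into a visible duplicate-key error.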
Site Context: Multi-residential community platform (live production). Need quick resolution.
Priority: Medium-High (affects user engagement on comments/posts).
Regards, Chris
Storage is filling up fast with old timeline posts (images/videos), and I need a safe auto-cleanup solution.
I created this script but haven't tested it yet; I'd like UNA experts to review it before I run it on production.
COMPLETE SCRIPT (SQL-safe with $db->escape()):
```php
<?php
/**
* UNA 14 TIMELINE CLEANUP - NEEDS EXPERT REVIEW
* SQL-Safe | cfmediahome.com | Untested
*/
$daysOld = 14; // Plan: 14 → 60 days
$dryRun = true; // false = live delete
$lockFile = '/tmp/timeline_cleanup.lock';
$logFile = '/var/log/timeline_cleanup.log';
// SAFE LOCK
if (file_exists($lockFile) && (time() - @filemtime($lockFile)) < 300) {
exit('LOCKED');
}
touch($lockFile);
require_once 'inc/header.inc.php';
$db = BxDolDb::getInstance();
logMsg("START " . date('Y-m-d H:i:s') . " (dryRun: " . ($dryRun ? 'YES' : 'NO') . ")");
$cutoff = time() - (86400 * $daysOld);
$events = $db->getAll("
SELECT `id`, `owner_id` FROM `bx_timeline_events`
WHERE `type`='post' AND `date` < {$cutoff} AND `status` != 'hidden'
");
$count = count($events);
logMsg("Found {$count} old posts");
foreach ($events as $event) {
$eventId = (int)$event['id'];
$ownerId = (int)$event['owner_id'];
// COMMENTS
if ($dryRun) {
$comments = (int)$db->getOne("SELECT COUNT(*) FROM `bx_timeline_cmts` WHERE `event_id`=" . $db->escape($eventId));
} else {
$db->query("DELETE FROM `bx_timeline_cmts` WHERE `event_id`=" . $db->escape($eventId));
$comments = $db->getAffectedRows();
}
// NOTIFICATIONS
if (!$dryRun) {
$db->query("DELETE n FROM `bx_ntfs_notifications` n JOIN `bx_ntfs_handlers` h ON h.id=n.handler_id WHERE h.event_id=" . $db->escape($eventId));
}
// FILES
$files = $db->getAll("SELECT f.id FROM `bx_files_2_timeline` ft JOIN `bx_files` f ON f.id=ft.file_id WHERE ft.timeline_id=" . $db->escape($eventId));
$fileCount = count($files);
if (!$dryRun && $fileCount) {
$fileIds = array_map('intval', array_column($files, 'id'));
$fileIdsList = implode(',', $fileIds);
$db->query("DELETE FROM `bx_files_2_timeline` WHERE `timeline_id`=" . $db->escape($eventId));
$db->query("DELETE FROM `bx_files` WHERE `id` IN ({$fileIdsList}) AND `owner_id`=" . $db->escape($ownerId));
}
// EVENT
if (!$dryRun) {
$db->query("DELETE FROM `bx_timeline_events` WHERE `id`=" . $db->escape($eventId));
}
logMsg("Event #{$eventId}: files={$fileCount}, comments={$comments}");
}
logMsg("COMPLETE: {$count} processed");
@unlink($lockFile);
function logMsg($msg) {
global $logFile;
file_put_contents($logFile, date('Y-m-d H:i:s') . ' ' . $msg . "\n", FILE_APPEND | LOCK_EX);
}
?>
```
**🎉 Happy New Year 2026 UNA Community! 🎉**
Warmest New Year wishes to @unateam and everyone in this amazing community!
Wishing you all a prosperous 2026 filled with successful projects and great achievements! 🥳✨
Grateful for your support and excited for the year ahead!
Want posts to auto-expire after X days (e.g. 30), then **prune/delete** automatically.
- Stories module uses `expires_at` field, auto-set on creation
- Add DateTime "expires_at" field via Studio > Forms for any content
- Pruning via UNA cron: delete where `expires_at < NOW()`
A pruning function like this could perhaps be extended to other content types as well.
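As a minimal sketch of the cron-side prune described above: assuming an `expires_at` DATETIME column has been added via Studio > Forms, the job reduces to one query per content table. The table name `bx_posts_posts` here is an assumption for illustration; check your module's actual table, and consider hiding rather than deleting if you want a grace period.

```sql
-- ASSUMED table/column names; run from a UNA cron job.
-- Delete content whose expiry timestamp has passed:
DELETE FROM bx_posts_posts
 WHERE expires_at IS NOT NULL
   AND expires_at < NOW();
```

Keeping the `IS NOT NULL` guard means content without an expiry date set is never touched.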
Are we seeing a decline in member numbers or engagement lately? I’m getting the feeling that things have slowed down.
Good day everyone,
I’m currently developing something quite ambitious and wanted to reach out to our UNACMS community — especially those who aren’t PHP developers but are entrepreneurs, creators, and problem-solvers exploring self-hosted and open-source solutions.
Over the years, my own experience has shown that while there’s an endless supply of software and platforms out there, most of them end up taking us down the same road — reinventing the wheel or starting over from scratch.
Now, with AI tools rapidly changing how we build and create, it might seem like open-source systems like UNA are becoming less relevant. But in truth, AI comes with its own challenges and costs. The real magic happens when we combine the stability of open-source foundations (like UNA) with the adaptability and automation of AI-driven development.
That’s what I’m working toward — an AI Full Stack Development App for UNACMS, starting with:
- Automated import/export modules for system and data migration.
- Refactoring automation to help developers clean, upgrade, and maintain code more easily.
- Tools designed to help non-programmers and community builders customize and extend UNA without deep technical knowledge.
Before I go too far, I want to hear from you —
- Would this be something useful for your UNA projects?
- What features or pain points would you want this AI-assisted toolkit to solve first?
- Should we open a collaboration or early testing group?
I believe UNA still holds enormous potential — especially when we start bridging it with modern AI workflows.
Looking forward to your thoughts, feedback, and ideas!
— Chris Buys
Developer, AI Full Stack for UNA CMS