Release Update 11/10/2025

Platform

Introducing a Sleek and Modern Login Interface for Opsera

The Opsera login screen has been upgraded with a refreshed design to deliver an engaging sign-in experience. The updated interface features a streamlined, intuitive layout with clean visuals, offering users a welcoming entry point to Opsera.

Effortless Pipeline Cloning using Hummingbird AI (Beta)

With Hummingbird AI embedded in the pipeline module, users can replicate entire CI/CD workflows in seconds using plain-language commands. This capability drastically reduces setup time and ensures consistent configurations across projects.

Key Highlights

  • Instant Pipeline Duplication: Clone any existing pipeline by simply instructing Hummingbird AI. This works with Hummingbird AI in both the Opsera platform and VS Code.

  • Full and Accurate Replication: Every aspect of the original pipeline, such as tools, policies, settings, branch mappings and environments, is preserved.

  • Accelerated Project Onboarding: For new microservices, feature branches or regional deployments, start with a cloned pipeline and skip manual setup. Teams can begin validating and deploying immediately, improving time-to-value.

Introducing Folder View with Bulk RBAC for Pipelines, Tools & Tasks

A folder-based organization system has been introduced to enable teams to intuitively group, navigate and manage pipelines, tools and tasks.

With bulk role-based access control (RBAC) applied across multiple items and levels, governance and security are seamlessly built into the workspace, providing structure, visibility and permissions that are enterprise-ready.

Key Highlights:

  • Advanced Folder Management & Navigation: Switch effortlessly between list-based registry views and hierarchical folder views. Create, rename or delete folders, display nested projects in the navigation panel, assign multiple items to a folder in bulk, and leverage search across projects or the entire workspace.

  • Flexible Bulk Role-Based Access Control (RBAC): Roles can be assigned at the folder level, automatically applying permissions to all contained assets, or applied manually in bulk to selected pipelines, tools or tasks. This gives administrators both scale and precision in managing access.

  • Permission Overrides: Permissions can be added in bulk on top of existing ones and can be overridden at any level to satisfy specific access requirements, as illustrated in the sketch below.
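For illustration only, the sketch below shows one way folder-level role assignments could cascade to contained assets while item-level overrides take precedence. The names (Folder, Asset, resolve_effective_roles) are hypothetical and do not reflect Opsera's internal APIs.

```python
from dataclasses import dataclass, field


@dataclass
class Folder:
    name: str
    roles: dict = field(default_factory=dict)            # user -> role, assigned in bulk


@dataclass
class Asset:
    name: str
    folder: Folder
    role_overrides: dict = field(default_factory=dict)   # per-item overrides


def resolve_effective_roles(asset: Asset) -> dict:
    """Folder roles apply to every contained asset; item-level overrides win."""
    effective = dict(asset.folder.roles)     # inherited in bulk from the folder
    effective.update(asset.role_overrides)   # overrides take precedence
    return effective


payments = Folder("payments", roles={"alice": "admin", "bob": "viewer"})
pipeline = Asset("payments-deploy", folder=payments, role_overrides={"bob": "editor"})

print(resolve_effective_roles(pipeline))
# {'alice': 'admin', 'bob': 'editor'} -> bob's override supersedes the folder role
```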

Salesforce

Introducing Hummingbird AI Analysis for Pull Request Validation Dashboard

The AI-driven PR Validation Dashboard delivers automated, real-time insights and actionable recommendations with every validation run.

Key Highlights:

  • Automated AI Analysis for PR Validation: Each Salesforce pull request validation run is supported by detailed AI-driven analysis that surfaces code-quality issues, security findings and compliance gaps in real time.

  • Actionable Remediation Guidance: The AI-generated analysis provides prioritized remediation steps, offering feedback to developers and release managers.

  • Context-Rich Validation Reporting: Stakeholders can quickly assess the state of each pull request via AI-enhanced views alongside standard validation results.

Standardized Test Mapping Checks for Salesforce Validations

A new validation rule can be enabled during unit-test mapping setup to ensure proper test coverage.

The rule triggers a failure if any Apex class in the package lacks a corresponding test-class mapping. If a pull request consists only of configuration changes (and no Apex classes), the validation proceeds without this mapping check. This capability helps teams ensure reliable deployments and maintain test-coverage standards for Salesforce PRs in Opsera.
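The following is a simplified illustration of the rule described above; the function and data shapes are assumptions for readability, not Opsera's implementation.

```python
def validate_test_mappings(changed_components: list, test_mappings: dict):
    """Fail validation if any Apex class in the change lacks a mapped test class.

    Configuration-only pull requests (no Apex classes) pass without this check.
    """
    apex_classes = [c["name"] for c in changed_components if c["type"] == "ApexClass"]
    if not apex_classes:
        return True, []          # configuration-only PR: mapping check is skipped

    unmapped = [name for name in apex_classes if name not in test_mappings]
    return len(unmapped) == 0, unmapped


ok, missing = validate_test_mappings(
    [{"name": "InvoiceService", "type": "ApexClass"},
     {"name": "Account.Layout", "type": "Layout"}],
    {"InvoiceService": "InvoiceServiceTest"},
)
print(ok, missing)   # True []
```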

Note:

  • To activate this rule for a Pipeline, set the Unit Test Type to ‘Auto Include Tests’ in the pipeline configuration.

  • To activate this rule for a PR Validation, set the Unit Test Type to ‘Auto Include Tests’ in the PR Validation Repository Rules policy for your Git account and repository.

Key Highlights:

  • Automated Enforcement: PR validation fails when a required test-class mapping is missing for any Apex class included under ‘Auto Include Tests’.

  • Coverage for New Classes: Newly created Apex classes in the branch or org must have an associated test-class mapping.

  • Configuration-Only Exclusion: Pull requests containing only configuration updates (and no Apex classes) are exempt from this rule.

Unified Insights

Introducing the AI Code Comparison Dashboard

Opsera has introduced the AI Code Comparison Dashboard, a comprehensive analytics suite that empowers engineering leaders, data teams and developers to measure the impact of AI coding tools across productivity, adoption, quality and cost efficiency.

The dashboard metrics give end-to-end visibility into how AI is driving faster development cycles, all backed by live project and developer usage data.

Key Highlights:

  • Unified Productivity & ROI View: The dashboard shows how much developer time is being saved and the resulting cost savings, enabling leadership to directly link AI tool adoption with productivity gains and financial impact.

  • Adoption & Usage Transparency: Through metrics such as Adoption Rate, License Adoption Gap and Chat vs IDE engagement split, the dashboard gives a clear picture of how developers are using AI coding tools, where licenses are underutilized and which workflows drive adoption.

  • Tool & Language Level Performance Comparison: Using the Scoreboard, Leaderboard and Language Support views, you can compare AI tools side by side, helping pinpoint strengths, surface weaknesses and guide investment priorities.

  • Holistic Multi-Dimensional Impact Analysis: The dashboard uses a radar chart and cost-analysis modules to present AI tool performance, aligned with organization-specific baselines so you can evaluate true business impact.

Enhancement to Executive Summary KPI in Copilot Reports

Leadership teams get a clearer view of how Copilot is being used and how it helps improve overall engineering productivity. The enhanced summary highlights key adoption metrics and productivity outcomes to help organizations make informed decisions about AI enablement.

Key Highlights:

  • Contextual Industry Benchmarks and Status: Adoption metrics for AI tooling are evaluated alongside industry benchmarks. For example, AI Adoption Rate and AI-Assisted PR percentages are compared with typical industry ranges, while performance status labels clarify exactly where your teams stand relative to peers.

  • Key Adoption Metrics: Instantly see the percentage of engineers actively using Copilot and the proportion of pull requests created with AI assistance.

  • Productivity Insight: Transparent comparison of PR cycle times between Copilot users and non-users, presented alongside high-performer industry benchmarks so stakeholders can quickly understand the real productivity gains.

  • Language Performance and Team Excellence: Copilot integration has been measured across diverse programming languages, with acceptance rates and ROI tracked for organization-wide analysis.

  • Reliable Data Sources: Aggregate metrics are sourced from Copilot Contribution and Usage Reports as well as Codebase Activity analytics, ensuring the organization tracks adoption rates, PR volumes, and productivity benchmarks with confidence.

New Navigation Experience with Unified Actions on Insights Landing Page

An improved navigation experience has been introduced, making it easier to create, access and manage dashboards.

Key Highlights:

  • Unified Creation Experience: All dashboard and automation actions are now consolidated under a single “Create New” button, enabling users to build any dashboard, set up notifications, and configure group mapping from one convenient location.

  • Improved Dashboard Visibility: Dashboard names are now easily accessible both in the main overview and the left navigation panel, providing quicker access and better context for users.

Persistent Time-Range Selection in GitCustodian Dashboard

The GitCustodian Dashboard automatically remembers the preferred time range, making it easier to pick up where users left off and view the most relevant data without manual re-selection.

Key Highlights:

  • Preserved Relative Time Ranges: When a user selects a relative time range (for example, “Last 7 Days”), the system now saves that preference. On subsequent visits or page refreshes, the dashboard automatically re-applies the stored relative range, ensuring the displayed data reflects the most up-to-date interval.

  • Support for Fixed Date Ranges: If a user selects a custom absolute date range, that exact range will continue to be preserved and re-loaded as before.

  • Default Setting for New Users: For users without a previously saved preference, the default view is set to “Last 30 Days” (see the sketch below).
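As a minimal sketch of the behavior above, assuming a simple per-user preference store, the snippet below shows how a saved relative range could be re-evaluated on every visit while a fixed range is re-loaded exactly as saved. The key names and helper are hypothetical.

```python
from datetime import date, timedelta

DEFAULT_RANGE = {"kind": "relative", "days": 30}   # "Last 30 Days" for new users


def resolve_time_range(saved_preference, today):
    pref = saved_preference or DEFAULT_RANGE
    if pref["kind"] == "relative":
        # Relative ranges are recomputed on each visit, so "Last 7 Days"
        # always reflects the most recent interval.
        return today - timedelta(days=pref["days"]), today
    # Fixed (absolute) ranges are restored exactly as the user saved them.
    return date.fromisoformat(pref["start"]), date.fromisoformat(pref["end"])


print(resolve_time_range({"kind": "relative", "days": 7}, date(2025, 11, 10)))
print(resolve_time_range({"kind": "absolute", "start": "2025-10-01", "end": "2025-10-15"},
                         date(2025, 11, 10)))
```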

Improved Audit Visibility for False Positives in GitCustodian Dashboard

Teams can easily see within the GitCustodian Dashboard who marked an issue as a false positive, when the action was taken, and any accompanying reason or comment, enhancing transparency and traceability during security reviews.

Key Highlights:

  • User and timestamp details are displayed for each false positive, providing clear accountability.

  • Comments are automatically captured and stored with each false positive action, supporting accurate audit trails.

  • Sorting and filtering options have been improved, allowing quick review of recent changes or actions by specific users.

Upcoming Capabilities for Unified Insights

Unique Release Filter for Deployment Frequency KPI

A filter configuration has been introduced to enhance the accuracy of deployment frequency analytics on the Deployment Frequency dashboard. Users can now analyze release activity by counting each unique release name only once per selected period, providing clearer insights into true release velocity and eliminating duplicate counts for repeated deployments.

Key Highlights:

  • Group by Unique Releases Checkbox: Users can group deployment frequency results by distinct release name. When enabled, the dashboard and charts will count multiple tags linked to the same release name as a single deployment, removing duplicates (see the sketch after this list).

  • Trend Analysis: The trend chart helps users compare deployment activity with and without unique release grouping, illustrating whether release patterns are driven by new launches or frequent repeat deployments.
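To make the deduplication concrete, here is a minimal sketch of how grouping by unique release name changes the count; the field names are illustrative only.

```python
deployments = [
    {"release": "2025.11.0", "tag": "v2025.11.0-rc1"},
    {"release": "2025.11.0", "tag": "v2025.11.0"},   # same release, deployed again
    {"release": "2025.11.1", "tag": "v2025.11.1"},
]

raw_count = len(deployments)                              # 3: every tag counted
unique_count = len({d["release"] for d in deployments})   # 2: each release name counted once

print(raw_count, unique_count)
```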

Additional Metrics in Pull Request Statistics Dashboard

The Pull Request (PR) Statistics KPI includes enhanced analytics to improve visibility into review activity, approval trends, and collaboration quality. These metrics bring greater precision to PR analysis, enabling teams to monitor collaboration dynamics and optimize code review workflows.

Key Highlights

  • Two new KPI metrics introduced:

    • PR Rejection Rate: (Number of PRs closed but unmerged × 100) / Total PRs

    • PR Approval Rate: (Number of PRs approved × 100) / Total PRs

  • A new table column, PR Comment Density, measures review engagement through the number of comments per reviewed PR. Formula: Total review comments / Total reviewed PRs (see the worked example below).
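As a quick worked example of the formulas above, using illustrative numbers:

```python
total_prs = 200
closed_unmerged = 18
approved = 150
review_comments = 640
reviewed_prs = 160

rejection_rate = closed_unmerged * 100 / total_prs    # (PRs closed but unmerged x 100) / total PRs
approval_rate = approved * 100 / total_prs            # (PRs approved x 100) / total PRs
comment_density = review_comments / reviewed_prs      # review comments per reviewed PR

print(f"Rejection rate: {rejection_rate:.1f}%")    # 9.0%
print(f"Approval rate: {approval_rate:.1f}%")      # 75.0%
print(f"Comment density: {comment_density:.1f}")   # 4.0
```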
