Release Update 12/15/2025
Platform
Interactive, AI-Driven Experiences Across the Platform with Hummingbird AI

Hummingbird AI provides dynamic, AI-assisted interactions with tables and charts across the platform. These capabilities enable users to explore, refine, and share data more efficiently through intuitive, in-context actions.
Key Highlights:
Intelligent In-Context Interactions: Users can apply AI-driven actions such as filtering, sorting, expanding, drilling down, and value selection directly on tables and charts.
Enhanced Viewing and Export Options: Expanded table views improve readability and are optimized for screenshots, reviews, and presentations. Users can export charts, producing ready-to-share reports for stakeholders.
Optimized Data Presentation: Smart sorting surfaces the most relevant information first.
Unified Insights
Enhancements for Copilot Reports Dashboard
Copilot Dashboard: Adoption Rate Alignment in Carousel

The Adoption Rate metric in the Copilot dashboard carousel has been refined to ensure accuracy and consistency across organization and team views.
Key Highlights:
Adoption rate values update dynamically at the team level when team filters are applied.
At the organization level, adoption rates have been standardized to maintain parity with the underlying contribution data. This ensures consistent adoption metrics across dashboards, supporting both high-level and granular visibility for performance tracking.
Copilot Dashboard: Executive Summary Enhancements

Enhancements have been made to the Executive Summary to provide a more contextual analysis of Copilot adoption, productivity impact, and performance trends across teams.
Key Highlights:
The AI-assisted PR logic has been refined to more accurately represent Copilot commit contributions and cycle time improvements.
License and activity data in the Copilot Usage tab have been aligned to ensure consistency across dashboards.
Copilot User Export Enhancements: Expanded User Data and Hierarchy Integration

User export reports now contain richer, actionable data, improving visibility into developer engagement and productivity trends.
Key Highlights:
Enriched Copilot User Exports: User email IDs have been added to all export types: Active, Inactive, and All Users.
User Data: Additional activity metrics, including last_active_date, number of commits, total_lines, total_pull_requests, and total_pull_request_size, are now included across all export options. Previously, these data points were limited to the Active Users export only.
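As an illustration of how the enriched export can be put to use, the sketch below reads a hypothetical CSV export containing the fields listed above and flags users who have been inactive for more than a set number of days. The column names, sample data, and file shape here are assumptions for demonstration; consult the actual downloaded export for the exact schema.

```python
import csv
import io
from datetime import date, datetime

# Hypothetical sample of an enriched Copilot user export; the real file is
# downloaded from the dashboard and may differ in column naming.
SAMPLE_EXPORT = """\
email,last_active_date,commits,total_lines,total_pull_requests,total_pull_request_size
dev1@example.com,2025-12-10,42,1800,12,350
dev2@example.com,2025-10-01,3,120,1,40
"""

def stale_users(export_text: str, as_of: date, days: int = 30) -> list[str]:
    """Return emails of users whose last activity is older than `days` days."""
    reader = csv.DictReader(io.StringIO(export_text))
    stale = []
    for row in reader:
        last_active = datetime.strptime(row["last_active_date"], "%Y-%m-%d").date()
        if (as_of - last_active).days > days:
            stale.append(row["email"])
    return stale

print(stale_users(SAMPLE_EXPORT, as_of=date(2025, 12, 15)))  # ['dev2@example.com']
```

The same export can be loaded into Google Sheets, Excel, or pandas for broader filtering and visualization.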
Expanded Filter and Configuration Support for Custom Dashboards
Enhancements have been introduced to enable users to apply more granular filters and configurations within custom views, ensuring consistency and improved reporting control.
Key Highlights:
Copilot Analytics Options: Copilot PR Survey and Enterprise Analytics options are now supported in Custom Dashboards, extending Copilot adoption and usage insights into tailored views.
Sprint Threshold Controls: Sprint Dashboard threshold settings can be configured directly within Custom Dashboards, enabling more precise monitoring of sprint performance against team-defined targets.
Workflow-Based Insights for DORA Metrics

DORA Dashboard has been updated to use GitHub Actions (GHA) workflows as a primary data source, enabling more precise measurement of Deployment Frequency and Change Failure Rate for teams deploying through GHA.
Key Highlights:
Improved Accuracy with GHA Workflows: DORA Dashboard now uses GitHub Actions (GHA) workflows as a primary data source for Deployment Frequency, Change Failure Rate, and Lead Time for Changes.
Flexible Metric Configuration: Teams can configure metrics using either deployment stages or GHA workflow names, with the deployment stage filter now optional.
Reliable Deployment and Failure Insights: Deployment Frequency and Change Failure Rate are calculated directly from relevant GHA workflow runs and Jira failure issues, improving accuracy for real production activity.
Consistent Lead Time Reporting: Lead Time for Changes has been aligned to work consistently with both stage-based and workflow-based configurations, ensuring trustworthy reporting across deployment models.
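To make the workflow-based calculation concrete, here is a minimal sketch of how Deployment Frequency and Change Failure Rate can be derived from GHA workflow runs and Jira failure issues. This is not Opsera's internal implementation; the workflow name, run fields, and issue list are illustrative assumptions.

```python
from datetime import date

# Assumed shape of GHA workflow run records; field names are illustrative.
runs = [
    {"workflow": "deploy-prod", "conclusion": "success", "date": date(2025, 12, 1)},
    {"workflow": "deploy-prod", "conclusion": "success", "date": date(2025, 12, 4)},
    {"workflow": "deploy-prod", "conclusion": "failure", "date": date(2025, 12, 6)},
    {"workflow": "ci-tests",    "conclusion": "success", "date": date(2025, 12, 6)},
]
failure_issues = ["OPS-101"]  # Jira issues tagged as production failures (assumed)

def dora_metrics(runs, failure_issues, workflow="deploy-prod", window_days=7):
    """Compute Deployment Frequency and Change Failure Rate over a window."""
    # Only successful runs of the configured deployment workflow count as deploys.
    deployments = [r for r in runs
                   if r["workflow"] == workflow and r["conclusion"] == "success"]
    deployment_frequency = len(deployments) / window_days  # deploys per day
    change_failure_rate = (len(failure_issues) / len(deployments)
                           if deployments else 0.0)
    return deployment_frequency, change_failure_rate

df, cfr = dora_metrics(runs, failure_issues)
print(df, cfr)  # 2 deploys over 7 days; 1 failure issue against 2 deploys -> 0.5
```

Filtering by workflow name rather than deployment stage mirrors the optional-stage configuration described above.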
Additional Capabilities
Export Capability for DevEx Dashboard
An export option has been added to the DevEx dashboard, allowing users to download developer activity data directly from the DevEx Metrics view into a spreadsheet for deeper analysis and sharing. The exported report includes key contributor details and activity metrics in a tabular format, making it easy to slice, filter, and visualize the data in tools like Google Sheets or Excel.


Updated Board Filter Logic for Cycle Time KPI
Cycle Time filtering has been refined so the Boards filter now shows only boards that are tied to the selected hierarchy mappings, making analysis more focused and efficient. By limiting the options to relevant, mapped boards, the feature improves data relevance, reduces noise in board selection, and streamlines Cycle Time KPI configuration for reporting.
Salesforce
Improved Status Visibility for Salesforce PR Validation
Salesforce PR validation has been enhanced to provide clearer, more reliable tracking of validation and merge status, including for pull requests created outside Opsera.
Key Highlights:
Transparent Validation Statuses:
A “NA” (Not Applicable) status is displayed for pull requests created outside Opsera when validation is intentionally disabled for external PRs. This indicates that validation was not triggered, rather than skipped due to an error.
A “Bypassed” status is displayed when a pull request is merged before validation completes. This indicates that validation was skipped because the PR was merged early, and Opsera does not update the validation result after the merge to keep reporting accurate.

Accurate Internal vs External Handling: Validation logic now cleanly differentiates between internal and external PRs, ensuring that status is reflected correctly in both Opsera and GitHub views.

Real-Time Validation Progress: The Summary section includes real-time counts of components and test methods executed, helping users monitor validation progress as it runs.
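The status rules above can be summarized as a small decision function. This is a hedged sketch of the described behavior, not Opsera's actual code; the field names and the "Passed"/"Failed" terminal statuses are assumptions for illustration.

```python
def validation_status(pr: dict) -> str:
    """Derive the displayed validation status for a Salesforce pull request.

    `pr` is an assumed record with boolean fields: external,
    validate_external_prs, merged, validation_complete, validation_passed.
    """
    # External PRs with validation intentionally disabled: not triggered at all.
    if pr["external"] and not pr["validate_external_prs"]:
        return "NA"
    # Merged before validation finished: skipped, and the result is not
    # updated after the merge, to keep reporting accurate.
    if pr["merged"] and not pr["validation_complete"]:
        return "Bypassed"
    return "Passed" if pr["validation_passed"] else "Failed"
```

For example, an external PR with external validation disabled resolves to "NA", while an internal PR merged mid-validation resolves to "Bypassed".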