Cosmos Week
Reconstructing the cosmic expansion with a generalized q(z) parameterization: A decelerating Universe from late-time constraints

We present a generalized phenomenological parameterization of the deceleration parameter $q(z)$ that incorporates an effective radiative component in addition to a localized late-time contribution.

Original source cited and editorially framed by Cosmos Week. Source: arXiv · Cosmology
Editorial signature: Cosmos Week Editorial Desk
Published: 27 Apr 2026 03:16 UTC
Updated: 27 Apr 2026
Coverage type: Preprint
Evidence level: Preliminary result
Read time: 4 min

Key points

  • Focus: a generalized phenomenological parameterization of the deceleration parameter $q(z)$ that incorporates an effective radiative component in addition to a localized late-time contribution.
  • Editorial reading: provisional result, not yet formally peer reviewed.
Full story

We present a generalized phenomenological parameterization of the deceleration parameter $q$ that incorporates an effective radiative component in addition to a localized late-time contribution. The new analysis still awaits peer review, but it already lays out the central claim clearly.

This matters because cosmology operates at the edge of what current instruments can measure, where systematic errors and model assumptions are never trivial. Small discrepancies between independent measurements have historically pointed toward missing physics rather than simple calibration errors, and the ongoing tension in the Hubble constant is a live example of how a persistent disagreement between methods can reshape the theoretical landscape. Each new dataset that approaches this territory with independent systematics adds real information to a problem that has resisted easy resolution for more than a decade. We present a generalized phenomenological parameterization of the deceleration parameter $q(z)$ that incorporates an effective radiative component (ERC) in addition to a localized late-time contribution. The proposed framework extends previous two-parameter $q(z)$ reconstructions by explicitly regulating the high-redshift behavior while preserving the late-time transition dynamics.
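
The preprint's explicit functional form for $q(z)$ is not reproduced in this coverage, so the block below records only the standard FLRW cosmographic relations that any such reconstruction rests on: the definition of the deceleration parameter, and the integral that recovers the expansion rate $H(z)$ from it.

```latex
% Standard FLRW cosmographic relations behind any q(z) reconstruction
% (generic definitions, not the paper's specific ERC parameterization).
\begin{align}
  q(z) &\equiv -\frac{\ddot{a}\,a}{\dot{a}^{2}}
        = -1 + (1+z)\,\frac{H'(z)}{H(z)}, \\
  H(z) &= H_{0}\,\exp\!\left[\int_{0}^{z}\frac{1+q(z')}{1+z'}\,\mathrm{d}z'\right].
\end{align}
```

Because $H(z)$ is an exponential of the integrated $q(z)$, regulating the high-redshift behavior of $q(z)$, as the ERC term is designed to do, directly controls how fast the reconstructed expansion rate grows toward early epochs.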

We constrain the free parameters $(h, q_0, z_c, z_e)$ using late-time observational data from cosmic chronometers (CC), Pantheon+ Type Ia supernovae (SNIa), and H\,\textsc{ii} galaxies. Within the redshift range probed by the data, the reconstructed $q(z)$ deviates from the $\Lambda$CDM trend, suggesting a possible reduction of the late-time acceleration.
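
As a rough illustration of how such late-time constraints are obtained, the sketch below fits a $q(z)$ model to $H(z)$-type measurements by $\chi^2$ minimization. Everything model-specific here is a hypothetical placeholder: `q_model` is a generic smooth transition, not the paper's ERC form, and the data points merely stand in for a cosmic-chronometer compilation.

```python
# Minimal sketch of a cosmic-chronometer chi-squared fit for a q(z)
# parameterization. q_model is a HYPOTHETICAL placeholder (a smooth
# late-time transition), NOT the paper's ERC form, and the data points
# are illustrative stand-ins, not the CC compilation used in the paper.
import numpy as np
from scipy.integrate import quad
from scipy.optimize import minimize

def q_model(z, q0, zc):
    # Hypothetical smooth transition: q -> q0 at z = 0, q -> +0.5 at high z.
    return 0.5 + (q0 - 0.5) / (1.0 + (z / zc) ** 2)

def H_of_z(z, h, q0, zc):
    # H(z) = H0 * exp( \int_0^z (1+q)/(1+z') dz' ), with H0 = 100 h km/s/Mpc.
    integral, _ = quad(lambda zp: (1.0 + q_model(zp, q0, zc)) / (1.0 + zp),
                       0.0, z)
    return 100.0 * h * np.exp(integral)

# Illustrative (z, H, sigma_H) points standing in for CC measurements.
data = np.array([[0.17,  83.0,  8.0],
                 [0.40,  95.0, 17.0],
                 [0.90, 117.0, 23.0],
                 [1.30, 168.0, 17.0]])

def chi2(theta):
    h, q0, zc = theta
    if h <= 0 or zc == 0:            # guard against unphysical trial points
        return np.inf
    model = np.array([H_of_z(z, h, q0, zc) for z in data[:, 0]])
    return np.sum(((model - data[:, 1]) / data[:, 2]) ** 2)

best = minimize(chi2, x0=[0.70, -0.55, 0.5], method="Nelder-Mead")
print("best-fit (h, q0, zc):", best.x, " chi2:", best.fun)
```

In practice an analysis like the paper's would sample the posterior with an MCMC rather than point-minimize, and would combine the CC likelihood with the SNIa and H\,\textsc{ii}-galaxy likelihoods; the structure of the fit, though, is the same.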

Furthermore, the reconstruction favors a relatively high value of the Hubble parameter, $h = 0.729 \pm 0.006$. The ERC remains weakly constrained by late-time data but ensures a smooth and monotonic evolution of $q(z)$, $j(z)$, and $w_{\rm eff}(z)$ across a wide redshift range.
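
The smoothness claim carries over to the derived diagnostics because, in a flat FLRW background, both the jerk $j(z)$ and the effective equation of state $w_{\rm eff}(z)$ follow pointwise from $q(z)$ and its first derivative. These are standard relations, independent of the specific parameterization:

```latex
% Derived diagnostics from q(z) (standard flat-FLRW relations):
\begin{align}
  j(z) &= q(z)\bigl(1 + 2q(z)\bigr) + (1+z)\,\frac{\mathrm{d}q}{\mathrm{d}z}, \\
  w_{\rm eff}(z) &= \frac{2\,q(z) - 1}{3}.
\end{align}
```

A monotonic, differentiable $q(z)$ therefore yields smooth $j(z)$ and $w_{\rm eff}(z)$ automatically, which is the role the ERC plays at high redshift.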

Within the observed interval, the model effectively reproduces the late-time behavior of the previous parameterization, while providing a controlled extension toward earlier epochs. Our results show that current low- and intermediate-redshift data are compatible with a reduced late-time acceleration.

The relevance goes beyond one dataset because even small shifts in measured parameters can matter when the field is testing the limits of the standard cosmological model. The $\Lambda$CDM framework describes the observable universe with remarkable economy, but its success rests on two components, dark matter and dark energy, whose physical nature remains entirely unknown. Any credible measurement that tightens or loosens the constraints on those components moves the entire theoretical enterprise forward, regardless of whether the immediate result looks dramatic on its own terms.

Because this is still a preprint, the result should be read with genuine interest and proportionate caution. Peer review is not a guarantee of correctness, but it is a process that forces authors to respond to technical criticism from specialists who have no stake in a particular outcome. Preprints that survive that process, often with substantive revisions, emerge with a stronger evidential base than the version that first appeared. Until that stage is complete, the responsible reading keeps uncertainty explicitly visible rather than treating the claims as established findings.

The next step is to see whether the effect survives when independent surveys, different calibration strategies, and tighter control of systematic uncertainties enter the picture. Programs such as Euclid, DESI, and the Rubin Observatory will deliver datasets over the next several years that cover the same parameter space with largely independent methods. If the current signal persists through those tests, its theoretical implications will become impossible to set aside. Until peer review and independent follow-up address those open questions, skepticism is not a failure of appreciation for the work; it is part of how science decides what to keep.
