Physical Sciences
September 14, 2023

Obtaining Tsallis entropy at the onset of chaos

Tsallis entropy aims to extend traditional statistical mechanics, but some physicists believe the theory is incompatible with the fundamental principles of thermodynamics. Dr Alberto Robledo of Instituto de Física, Universidad Nacional Autónoma de México (UNAM) shows for the first time how Tsallis entropy can explain natural phenomena that turn out to be surprisingly linked to the transitions from regular to chaotic behaviours, a result that has eluded researchers so far. His discovery could lead to a deeper understanding of how thermodynamic systems behave.

Statistical mechanics is a branch of physics that accomplishes the feat of understanding and predicting how thermodynamic systems with large numbers of particles (or other microscopic objects) will evolve over time. Instead of tracking the motion of every atom and molecule in a system, it treats it as a collection of possible microstates or configurations, each with a certain probability of occurring. By analysing these probabilities, physicists can make predictions about different properties of the system, including its temperature, pressure, and entropy – a measure of its randomness and disorder.

Brazilian physicist Constantino Tsallis.

Statistical mechanics has been used extensively and with success within physics for well over a century. But in more recent decades, scientists have started to employ the tools and techniques developed within statistical mechanics in fields outside physics, such as biology, ecology, economics, and the social sciences. This is nowadays called the science of complex systems, where the role of particles is played by other basic entities or individuals. In parallel with these developments, researchers began to critically examine the foundations of this branch of physics, first laid out in the 19th century and essentially unchanged since. Uncovering a limit of validity for traditional statistical mechanics, together with a modified theory that holds beyond it, could open up new fields of application.

Introducing: Tsallis entropy

In the late 1980s, Brazilian physicist Constantino Tsallis presented a modified form of the mathematical expression used to calculate a system’s entropy from the probabilities of its possible states. The expression contains an additional parameter that can be varied; when it is set to unity, the traditional formula is recovered. Many features of the ordinary theory carry over to the extended version, but its range of validity could only be guessed at: likely candidates include highly correlated systems, systems with long-lived memory, and those with long-range interactions.
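For reference, the Tsallis entropy (often called the q-entropy) is usually written as below; q is the additional parameter mentioned above, and in the limit q → 1 the expression reduces to the familiar Boltzmann–Gibbs–Shannon entropy of traditional statistical mechanics.

```latex
S_q = k\,\frac{1 - \sum_i p_i^{\,q}}{q - 1},
\qquad
\lim_{q \to 1} S_q = -\,k \sum_i p_i \ln p_i
```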

Exploration of the properties derived from Tsallis entropy nevertheless led to difficulties and controversies, with some physicists questioning its validity altogether. This disagreement has sparked decades of debate, and some researchers even argue that the theory is incompatible with the fundamental principles of thermodynamics.

Transitions into chaos

One particularly intriguing aspect of this debate is the question of how systems transition into a state of chaos. When systems become chaotic, even minuscule changes to their starting conditions can lead to drastically different outcomes over time, making it incredibly difficult to predict their behaviours in the long term. Nonetheless, and precisely because of this, chaotic regimes are compatible with traditional statistical mechanics, but the exact boundary at which a system transitions from regular to chaotic behaviour is not.

Chaotic regimes are compatible with traditional statistical mechanics, but the exact boundary at which a system transitions from regular to chaotic behaviour is not.

Transitions into chaos underlie many important phenomena in nature, resulting from nonlinearity, where the response of a system is not directly proportional to its inputs. As the system’s components interact with each other, complex patterns, including spirals, fractals, and even social structures, can emerge spontaneously. Yet despite their relevance, these transitions had never before been faithfully described by the Tsallis entropy expression.

Statistical mechanics aims to understand how thermodynamic systems with large numbers of particles evolve over time.

Through their research, Robledo and his colleagues explore these transitions to chaos in systems named ‘low-dimensional nonlinear iterated maps’. These maps describe the evolution, over discrete steps in time, of systems governed by small numbers of variables, typically just one in the iconic logistic and circle maps. In doing so, the researchers ultimately aim to prove that Tsallis entropy is indeed the correct expression that replaces that of traditional statistical mechanics.
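As a concrete illustration (a minimal numerical sketch, not the researchers’ own code), the logistic map x_{n+1} = r x_n (1 − x_n) can be iterated at a periodic, a fully chaotic, and the onset-of-chaos value of the control parameter r. The Lyapunov exponent, which quantifies sensitivity to initial conditions, is negative for regular behaviour, positive for chaos, and vanishes at the period-doubling onset of chaos; the helper function below is illustrative only.

```python
# Minimal sketch: the logistic map x_{n+1} = r * x_n * (1 - x_n) iterated at a
# periodic, an onset-of-chaos, and a fully chaotic parameter value. The Lyapunov
# exponent lambda = lim (1/n) * sum ln|f'(x_k)| is negative on a periodic orbit,
# positive in the chaotic regime, and ~0 at the onset of chaos.
import math

def lyapunov(r, x0=0.3, n_transient=1000, n=100_000):
    x = x0
    for _ in range(n_transient):                 # discard the transient
        x = r * x * (1 - x)
    acc = 0.0
    for _ in range(n):
        acc += math.log(abs(r * (1 - 2 * x)))    # ln|f'(x)| for f(x) = r x (1 - x)
        x = r * x * (1 - x)
    return acc / n

for label, r in [("periodic (4-cycle)", 3.5),
                 ("onset of chaos", 3.5699456718),   # Feigenbaum accumulation point
                 ("fully chaotic", 4.0)]:
    print(f"r = {r:<12} {label:<20} lambda ≈ {lyapunov(r):+.4f}")
```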

Ergodicity, period doubling, and quasi-periodicity

Transitions into (or, in reverse, out of) chaos in nonlinear maps are particularly interesting because they mark the breakdown of two key properties of statistical mechanics that hold when the behaviour is chaotic. The first of these is called ‘ergodicity’, referring to the property that a system will eventually visit all its accessible states over long periods of time, with the amount of time spent in each state proportional to its probability. In nonlinear maps, ergodicity can break down in two possible ways.
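To make ergodicity concrete, here is a minimal sketch (not taken from the paper): for the fully chaotic logistic map at r = 4, whose invariant density is known exactly, the time average of the variable along a single long orbit agrees with the average over that density, which equals 1/2, exactly as ergodicity demands.

```python
# Minimal sketch: ergodicity of the fully chaotic logistic map at r = 4, whose
# invariant density is rho(x) = 1/(pi*sqrt(x(1-x))). The time average of x along
# one long orbit should match the average of x over that density, which equals 1/2
# by the symmetry rho(x) = rho(1 - x).
import random

r = 4.0
x = random.uniform(0.01, 0.99)   # arbitrary initial condition away from the edges
for _ in range(1000):            # discard the transient
    x = r * x * (1 - x)

total, n = 0.0, 500_000
for _ in range(n):
    x = r * x * (1 - x)
    total += x

print(f"time average of x over one orbit ≈ {total / n:.4f}   (ensemble average = 0.5)")
```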

The first is through ‘periodic orbits’, where the system revisits a specific set of states in a repeating pattern, preventing it from visiting all its possible states. As the system approaches the transition into chaos, a stable state can branch into a stable ‘2-cycle’ of two distinct states visited in alternation. Each of these can in turn branch into its own 2-cycle, creating a stable ‘4-cycle’, and the branching continues in a cascade that accumulates at the onset of chaos. This route is known as ‘period doubling’. A second, even more complex way in which ergodicity breaks down is called ‘quasi-periodicity’, where the states of an evolving system never repeat, but follow intricate, non-repeating patterns.
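The cascade can be seen directly in the logistic map. The sketch below (a hypothetical helper, not the authors’ code) estimates the period of the attractor for a few standard values of the control parameter r, making the 1 → 2 → 4 → 8 → 16 sequence visible before the behaviour becomes chaotic.

```python
# Minimal sketch: estimate the period of the logistic-map attractor for a few
# control-parameter values, exposing the period-doubling cascade.

def attractor_period(r, x0=0.3, n_transient=10_000, max_period=64, tol=1e-6):
    x = x0
    for _ in range(n_transient):            # let the orbit settle onto the attractor
        x = r * x * (1 - x)
    ref = x
    for p in range(1, max_period + 1):      # smallest p with the orbit back at ref
        x = r * x * (1 - x)
        if abs(x - ref) < tol:
            return p
    return None                             # no short cycle found (chaotic regime)

for r in (2.9, 3.2, 3.5, 3.55, 3.566, 3.7):
    print(f"r = {r:<6} period = {attractor_period(r)}")
```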

Mixing and intermittency

The second aspect of statistical mechanics to break down in nonlinear maps undergoing chaotic transitions is named ‘mixing’. In traditional statistical mechanics, it describes how nearby points in a system will spread out and become increasingly uncorrelated as time progresses. In nonlinear maps, a breakdown in this property can lead to a behaviour named ‘intermittency’. Here, systems alternate between chaotic bursts and periods of regular behaviour – during which nearby points can remain close together.
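A minimal sketch of intermittency (again illustrative rather than taken from the paper): just below the tangent bifurcation at r = 1 + √8 ≈ 3.8284, where the period-3 window of the logistic map opens, the orbit spends long ‘laminar’ stretches almost repeating with period three, interrupted by chaotic bursts. The code below counts the lengths of those stretches using a simple, arbitrarily chosen tolerance.

```python
# Minimal sketch: intermittency in the logistic map just below the tangent
# bifurcation at r_c = 1 + sqrt(8). The orbit alternates between long 'laminar'
# stretches (nearly period-3 behaviour) and chaotic bursts; we count how long
# each laminar stretch lasts.
import math

r = 1 + math.sqrt(8) - 1e-4        # slightly below the tangent bifurcation
x = 0.3
orbit = []
for _ in range(200_000):
    x = r * x * (1 - x)
    orbit.append(x)

laminar, lengths = 0, []
for n in range(3, len(orbit)):
    if abs(orbit[n] - orbit[n - 3]) < 1e-3:   # nearly period-3: laminar phase
        laminar += 1
    elif laminar:
        lengths.append(laminar)
        laminar = 0

print(f"{len(lengths)} laminar episodes, mean length ≈ {sum(lengths)/len(lengths):.0f} steps")
```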

The logistic map is a statistical-mechanical laboratory.

Unearthing novel behaviours

For Robledo and colleagues, these interesting behaviours make nonlinear maps an ideal platform for examining the properties of the transitions to chaos – and more importantly, for assessing whether they can be accurately described using the Tsallis entropy formula. As they began to explore these systems, the researchers were surprised to find that the transitions themselves had not yet been studied in this level of detail, as previous studies had shifted their interest to other types of nonlinear systems.

Through a series of recent studies, Robledo and colleagues aimed to unearth the novel behaviours of systems placed under these circumstances and discover the best conceivable way to describe their evolution in mathematical terms.

Modifying the Landau equation

In his latest research, Robledo presents a key result of this work: that transitions to chaos in nonlinear maps are best described by a modified form of a formula named the ‘Landau equation’. Derived from the principles of statistical mechanics, this equation is often used by physicists to describe how a phase change unfolds in time, such as the condensation of a gas into a liquid.

Robledo presents a compelling case that Tsallis entropy is consistent with the fundamental principles of thermodynamics.

The researchers examined the ‘Lyapunov function’ of the Landau equation – a concept often used to study the stability of dynamical systems. By assigning a real value to each possible state of the system, the function measures its ‘distance’ from a point of equilibrium, at which the system is stable and no longer evolves over time – the larger this value, the further the system is from equilibrium. Through these calculations, the researchers discovered that the Lyapunov function of the Landau equation is simply an expression of the Tsallis entropy formula. Crucially, the equation can be used to express period doubling, quasi-periodicity, and intermittency – all three known types of transitions into chaos in nonlinear maps.
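For orientation, the ordinary Landau equation describes the relaxational dynamics of an order parameter Q in a potential V, and V itself serves as the Lyapunov function because it can only decrease along trajectories; here a, b, and Γ are the usual phenomenological coefficients. Only the standard equation is shown below; the modified form obtained by Robledo, whose Lyapunov function takes the Tsallis-entropy form, is given in the original papers.

```latex
\frac{dQ}{dt} = -\,\Gamma\,\frac{\partial V}{\partial Q},
\qquad
V(Q) = \frac{a}{2}\,Q^{2} + \frac{b}{4}\,Q^{4},
\qquad
\frac{dV}{dt} = -\,\Gamma\left(\frac{\partial V}{\partial Q}\right)^{2} \le 0
```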

Extending the power of statistical mechanics

Robledo’s discovery could have important implications for the ongoing debate over the validity of Tsallis entropy. By proving that the disputed entropy formula governs the transitions from regular to chaotic behaviours, his finding sets a direction for a firm understanding of this issue. In turn, Robledo presents a compelling case that Tsallis entropy is consistent with the fundamental principles of thermodynamics.

Robledo and colleagues’ research has already established links between the onset of chaos and important questions and examples in condensed-matter physics and in complex-systems phenomena. They ultimately hope that Tsallis’ theories may finally become more broadly accepted within the wider scientific community. In turn, these discoveries may pave the way for new breakthroughs in a wide range of fields, not only within physics but also in biology, ecology, computer science, and economics, where statistical mechanics is increasingly being applied.

Personal Response

What is the reason for the occurrence of a limit of validity of traditional statistical mechanics?
This limit can be reached when systems undergo processes that severely obstruct access to their previous configurations, leaving only small numbers of them available, as in glass formation, where molecules become caged by their neighbours and can mostly only rattle. In the nonlinear-map representation of the system, this happens at the transitions out of chaos. When the behaviour remains chaotic, there is insufficient impediment and traditional statistical mechanics remains valid.

How can a map with only one variable describe macroscopic systems with many degrees of freedom?
For the natural phenomena we study, the variable in the nonlinear maps at transitions to (or out of) chaos already represents a thermodynamic variable of a macroscopic system (like density or energy). The properties obtained correspond to the evolution of a thermodynamic quantity in processes undergone by systems composed of large numbers of degrees of freedom; just as the usual Landau equation does for macroscopic processes observed in ordinary condensed matter systems.

Are there observable phenomena where the Tsallis entropy provides new understanding?
Within condensed matter physics: the formation of glasses, the transformation of a conductor into an insulator, and critical point fluctuations. Regarding complex systems problems: the phenomenon of self-organisation and the development of diversity (biological or social, like languages). Also the comprehension of empirical laws, such as those relating to the universality of ranked data or to the metabolism of plants and animals.

