
## Exploration of the Theoretical and Empirical Relationships Between Entropy and Diffusion

**Abstract:**

Knowledge and control of chemical engineering systems require obtaining values for process variables and functions that range in difficulty of computation and measurement. The present report aimed to demonstrate the connections between entropy and diffusion and to highlight avenues for converting data from one into the other. The correlation between the two concepts was explored at the microscopic, single-particle level. The scope was restricted to the particle level in order to identify commonalities that underlie higher-level phenomena. A probabilistic model of molecular diffusion was developed and presented to illustrate the close coupling between entropic information and diffusion. The relationship between diffusivity and configurational/excess entropy was expounded by analyzing the Adam-Gibbs and Rosenfeld relations. A modified analog of the Adam-Gibbs relation was then found to accurately predict experimental data on the diffusion and translational entropy of single water molecules. The quantitative relations presented in this report enable the chemical engineer to obtain information on the abstract entropy potential by mapping from more concrete dynamical properties such as the diffusion coefficient. This correspondence fosters greater insight into the workings of chemical engineering systems, granting the engineer increased opportunity for control of the process.

**Introduction:**

Systems, whether observed or simulated, consist of the complex interplay between several degrees of freedom, of both time and space. The analysis of chemical engineering systems, in particular, frequently requires knowledge of both thermodynamic potentials and dynamic state variables. The set of thermodynamic potentials that appear in the analysis of these systems includes enthalpy, entropy and free energy. Each of these potentials is a function of system variables such as pressure, temperature and composition. This dependence on the system’s parameters allows the thermodynamic potentials, along with their first and second derivatives, to constrain the stability and equilibrium of chemical systems. The constraining ability of these potentials derives from the first and second laws of thermodynamics, entropy maximization principles and arguments from mathematical analysis.

Occupation of states of equilibrium and stability is only one aspect of a system; it is also critical to understand how systems evolve towards or away from these states. Dynamic processes, such as transport phenomena, mediate this time evolution. Transport phenomena encompass the movement of conserved quantities: heat, mass and momentum. The movement of mass, heat and momentum traces out the pathways systems follow in state space. Therefore, the full description, understanding and control of chemical engineering systems necessitates knowledge of the active dynamic and thermodynamic processes of the system, and of their correlations.

This report will concentrate on the relationship between entropy and diffusion. Diffusion signifies a process that systems undergo in response to some non-uniformity or asymmetry in the system. Entropy generation can be understood as a consequence of diffusional phenomena. It is the apparent interconnection between the two concepts that this report intends to highlight and characterize. This report aims to define relations between entropy and diffusion so that it is possible to translate qualitative and quantitative information between the two.

**Theory and Procedure:**

Entropy (*S*) is recognized as a measure of the size of configuration space, where configuration space is the space of all possible microscopic configurations a system can occupy with a certain probability. This is stated with the Gibbs entropy formula,

*S = -k_b ∑ p_i ln(p_i)*, where *k_b* ≡ Boltzmann constant and *p_i* ≡ probability of microstate *i*.

If the probability of each microstate is equal then,

*S=k_b lnΩ*, where *Ω* ≡ number of microscopic configurations consistent with equilibrium state. These expressions for thermodynamic entropy closely resemble the expression for information theoretic entropy and indicate that entropy can be viewed as a measure of the degree of uncertainty about a system caused by information not being communicated by macrostate variables, like pressure and temperature, alone. Microscopic configurations are determined by the vibrational, rotational and translational degrees of freedom of the molecular constituents of a system. As such, any process that increases the number of microscopic configurations available to a system will also increase the extent of the system’s configuration space, consequently, elevating its entropy.
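These two entropy expressions are easy to check numerically. The sketch below (in units where *k_b* = 1, with an arbitrary illustrative microstate count) confirms that the Gibbs formula collapses to *k_b lnΩ* for equal probabilities, and that any non-uniform distribution over the same states carries less entropy:

```python
import math

def gibbs_entropy(probs, k_b=1.0):
    """Gibbs entropy S = -k_b * sum(p_i * ln p_i); zero-probability states contribute nothing."""
    return -k_b * sum(p * math.log(p) for p in probs if p > 0)

omega = 16                         # illustrative number of microstates
uniform = [1.0 / omega] * omega
# Equal probabilities: the sum reduces to k_b * ln(Omega).
assert abs(gibbs_entropy(uniform) - math.log(omega)) < 1e-12

# A non-uniform distribution over the same states has strictly lower entropy.
skewed = [0.5] + [0.5 / (omega - 1)] * (omega - 1)
assert gibbs_entropy(skewed) < gibbs_entropy(uniform)
```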

Diffusion is defined as a process whereby a species moves from a region of high chemical potential to a region of low chemical potential; without loss of generality, the driving force for particle movement is frequently a concentration difference. This is captured by Fick’s First Law of Diffusion, *J = -D∇c*, with *∇ = (∂/∂x, ∂/∂y, ∂/∂z)*, where *J* ≡ diffusive flux, *c* ≡ concentration and *D* ≡ diffusion coefficient. Fick’s Second Law describes the time dependence of the concentration profile,

*∂c/∂t = ∇∙(D∇c)*. From the above equations, diffusion can be conceptualized as a response function, whose value is determined by a forcing function (the gradient in concentration), and which acts to drive the forcing function to zero. The translational motion of the particles continues until a state of uniform particle distribution is achieved. Equivalently, diffusion is the process by which a system transitions from a non-equilibrium configuration toward one that more closely resembles an equilibrium state, that is, a state in which the chemical potentials of all species are equivalent.
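Fick’s Second Law can be illustrated with a short numerical sketch (the grid, diffusivity and step counts below are arbitrary choices, and the explicit scheme is one of several possible discretizations): an initial concentration spike spreads toward a uniform profile while total mass is conserved.

```python
import numpy as np

def diffuse_1d(c, D, dx, dt, steps):
    """Explicit finite-difference update for dc/dt = D * d2c/dx2 with no-flux ends."""
    assert D * dt / dx**2 <= 0.5, "stability limit of the explicit scheme"
    c = c.astype(float).copy()
    for _ in range(steps):
        lap = np.empty_like(c)
        lap[1:-1] = (c[2:] - 2 * c[1:-1] + c[:-2]) / dx**2
        lap[0] = (c[1] - c[0]) / dx**2       # reflecting boundaries
        lap[-1] = (c[-2] - c[-1]) / dx**2
        c += D * dt * lap
    return c

c0 = np.zeros(101)
c0[50] = 1.0                                 # initial spike of unit mass
c1 = diffuse_1d(c0, D=1.0, dx=1.0, dt=0.25, steps=2000)
assert abs(c1.sum() - 1.0) < 1e-9            # mass conserved
assert c1.max() < 0.05                       # peak flattened toward uniformity
```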

Although elementary, the theoretical information presented above identifies a unifying link between the two concepts: the expansion of configuration space. Entropy is the control variable for this expansion, whereas diffusion is the process. This connection will be exhibited by first presenting and relating probability-based descriptions of particle diffusion and entropy. Evaluating the relationship between the diffusion coefficient and entropy terms will then extend this linkage further. Lastly, a focus on single water molecules will further illustrate and support the connectivity between diffusion and entropy.

**Results and Discussion:**

In his 1905 *Investigations on the Theory of the Brownian Movement* (14-18), Albert Einstein showed that the molecular motions executed by particles reduce to a probabilistic model built on statistical mechanical arguments. He advanced the assumption that each particle moves along a single *x* co-ordinate independently of neighboring particles; this was justified by selecting time intervals of motion (*τ*) and spatial increments (*∆*) that are not too small. A particle density function *f(x,t)*, which expresses the number of particles per unit volume, was posited, with the spatial increments traveled over a time interval described by a probability density *ϕ(∆)*. Expanding *f* in a Taylor series yields,

*f(x+∆,t) = f(x,t) + ∆ (∂f(x,t))/∂x + (∆^2/2!) (∂^2 f(x,t))/(∂x^2) + ∙∙∙ ad inf.*

*f(x,t+τ)dx = dx ∫_(∆=-∞)^(∆=+∞) f(x+∆,t) ϕ(∆)d∆*

Since only small values of *∆* contribute to the integral, the expansion can be substituted in and integrated term by term,

*f + (∂f/∂t)τ = f ∫_(-∞)^∞ ϕ(∆)d∆ + (∂f/∂x) ∫_(-∞)^∞ ∆ ϕ(∆)d∆ + ((∂^2 f)/(∂x^2)) ∫_(-∞)^∞ (∆^2/2) ϕ(∆)d∆ + ∙∙∙*

The first integral on the right-hand side is unity by the normalization of a probability density, whereas the second and the other odd-order terms vanish by the spatial symmetry *ϕ(∆) = ϕ(-∆)*. What remains after this simplification is

*∂f/∂t = ((∂^2 f)/(∂x^2)) ∫_(-∞)^∞ (∆^2/2τ) ϕ(∆)d∆*

whereby setting the term multiplying the second derivative to *D* results in

*∂f/∂t = D (∂^2 f)/(∂x^2)*,

which is Fick’s Second Law. Solving this partial differential equation generates the particle density function,

*f(x,t) = (n/√(4πDt)) e^(-x^2/4Dt)*

This is a normal distribution, which has the unique property of possessing the maximum entropy of any continuous distribution for a specified mean and variance; for the particle distribution above, the mean and variance equal 0 and *2Dt*, respectively. Einstein further found the mean displacement of the particles, *λ_x*, which depends on the temperature *T*, the gas constant *R*, Avogadro’s number *N*, the viscosity *k* and the particle radius *P*, to be

*λ_x = √t ∙ √(RT/(3πkPN))*

It is fascinating that measurable physical properties such as the diffusion coefficient appear in a mathematical model that ensures maximization of entropy.
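The probabilistic model can also be checked by direct simulation. The sketch below (particle and step counts are arbitrary choices) runs independent one-dimensional random walkers in the spirit of Einstein’s model and confirms that the sample variance of the displacements grows as *2Dt*, the variance of the normal distribution derived above:

```python
import random

random.seed(1)
n_particles, n_steps, step = 5000, 400, 1.0
D = step**2 / 2.0                  # diffusion coefficient for unit time steps

positions = []
for _ in range(n_particles):
    x = 0.0
    for _ in range(n_steps):
        x += step if random.random() < 0.5 else -step
    positions.append(x)

mean = sum(positions) / n_particles
var = sum((x - mean) ** 2 for x in positions) / n_particles
# Sample variance should track 2*D*t to within sampling error.
assert 0.9 < var / (2 * D * n_steps) < 1.1
```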

Equation-based relationships between diffusion and entropy have been investigated for many years. One such relation is,

*D(T) = D(T=T_0) e^(-C/(T S_c))*,

where *S_c* is the configurational entropy of the system, defined as,

*S_c (T) = S(T)-S_vib(T)*

and *S_vib* is the vibrational entropy of the system and *D(T_0)* is the diffusion coefficient at some higher temperature *T_0*. This is known as the Adam-Gibbs relation and explicates the strong dependence diffusion has on entropy. The Rosenfeld relation between the diffusion coefficient and entropy provides another interesting connection,
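A minimal numerical sketch of the Adam-Gibbs form, using the common convention *D = D_0 e^(-C/(T S_c))* with *C* > 0 and entirely hypothetical parameter values, shows the sharp suppression of diffusivity as the product *T·S_c* shrinks on cooling:

```python
import math

def adam_gibbs_D(T, S_c, D0, C):
    """Adam-Gibbs form: diffusivity falls steeply as the product T*S_c shrinks (C > 0)."""
    return D0 * math.exp(-C / (T * S_c))

# Hypothetical, unfitted parameters purely for illustration.
D0, C = 1.0e-9, 2.0e3
cooling = [(300.0, 8.0), (260.0, 5.0), (230.0, 2.5)]   # (T, S_c) pairs on cooling
Ds = [adam_gibbs_D(T, S_c, D0, C) for T, S_c in cooling]
assert Ds[0] > Ds[1] > Ds[2]       # diffusivity drops monotonically on cooling
```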

*D = a∙e^(b S_ex/k_b)*,

where *S_ex* is the excess entropy, found by subtracting the entropy of an ideal gas at the same conditions from the system’s total entropy, *a* and *b* act as fitting parameters and *k_b* is the Boltzmann constant. These expressions broadcast a pronounced and well-founded connection between diffusion and entropy, to the extent that knowing one enables the determination of the other.
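Because the Rosenfeld relation is exponential in *S_ex*, determining *a* and *b* reduces to linear regression of ln *D* on *S_ex/k_b*. A sketch on synthetic data (units where *k_b* = 1; the parameter values are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
a_true, b_true = 0.05, 0.8
s_ex = np.linspace(-4.0, -1.0, 20)            # excess entropy is negative
# Synthetic diffusivities from D = a * exp(b * S_ex / k_b), with mild noise.
D = a_true * np.exp(b_true * s_ex) * np.exp(rng.normal(0.0, 0.02, s_ex.size))

# ln D = ln a + b * (S_ex / k_b): least squares on the log recovers both parameters.
b_fit, ln_a_fit = np.polyfit(s_ex, np.log(D), 1)
assert abs(b_fit - b_true) < 0.05
assert abs(np.exp(ln_a_fit) - a_true) < 0.01
```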

In their article “Connecting diffusion and entropy of bulk water at the single particle level,” Saha and Mukherjee implemented molecular dynamics simulations to establish a linkage between the thermodynamic and dynamic properties of individual water molecules (825-832). Translational (*S_trans*) and rotational (*S_rot*) entropies were calculated at varying temperatures, along with the self-diffusion coefficient (*D*), permitting the construction of a generalization of the Adam-Gibbs relation above that relates configurational entropy to translational relaxation (self-diffusion). *S_trans* was evaluated from the entropy of a solid-state quantum harmonic oscillator as shown below,

*S_trans^QH = k_b ∑_(i=1)^3 [((ℏω_i)/(k_b T))/(e^((ℏω_i)/(k_b T)) − 1) − ln(1 − e^(−(ℏω_i)/(k_b T)))]*

where *T* indicates temperature, *k_b* is the Boltzmann constant and *ℏ = h/2π*, *h* being the Planck constant. A method known as permutation reduction, which considers water molecules to be indistinguishable and to reside in an effective localized configuration space, was utilized to obtain a covariance matrix of the translational fluctuations of each permuted molecule along the *x*, *y* and *z* co-ordinates. This produced a 3×3 matrix, whose diagonalization yielded three eigenvalues and three frequencies (*ω_i*), which were input to the expression above. Diffusion was evaluated with the Vogel-Fulcher-Tammann (VFT) equation,

*D^(-1) (T) = D_0^(-1) e^[1/(K_VFT (T/T_VFT -1))]*

with *K_VFT* denoting the kinetic fragility marker and *T_VFT* signifying the temperature at which the diffusion coefficient diverges. The idea of thermodynamic fragility, which appears in the above analysis, quantifies the rate at which dynamical properties such as inverse diffusivity grow as temperature decreases. Also, according to the IUPAC Compendium of Chemical Terminology, the self-diffusion coefficient (D_i*) of species *i* is the diffusion coefficient when the chemical potential gradient is zero (*a_i* is the activity and *c_i* the concentration),

*D_i* = D_i (∂ln c_i)/(∂ln a_i)*
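The quantum harmonic entropy expression above can be sketched directly: each mode contributes *(ℏω_i/k_bT)/(e^(ℏω_i/k_bT) − 1) − ln(1 − e^(−ℏω_i/k_bT))* in units of *k_b*. The frequencies below are hypothetical placeholders, not values from the paper:

```python
import math

K_B = 1.380649e-23        # Boltzmann constant, J/K
HBAR = 1.054571817e-34    # reduced Planck constant, J*s

def s_trans_qh(omegas, T):
    """Quantum harmonic oscillator entropy summed over the given modes (J/K per molecule)."""
    s = 0.0
    for w in omegas:
        x = HBAR * w / (K_B * T)
        s += x / math.expm1(x) - math.log1p(-math.exp(-x))
    return K_B * s

# Hypothetical eigenfrequencies (rad/s) standing in for the diagonalized covariance matrix.
omegas = [1.0e13, 1.5e13, 2.0e13]
assert s_trans_qh(omegas, 300.0) > 0.0
# Entropy grows with temperature, as expected for a harmonic oscillator.
assert s_trans_qh(omegas, 350.0) > s_trans_qh(omegas, 300.0)
```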

Saha and Mukherjee fitted the variant of the Adam-Gibbs equation *D = a∙e^(b S_trans/k_b)* to their data.

The Pearson correlation coefficient (R), which is the covariance of two variables divided by the product of their standard deviations, attained a value of 0.98. This value indicates a strong, direct statistical association between translational entropy and translational diffusivity. Such a good fit implies that an underlying physical relation between entropy and diffusion does exist, and that one can convert knowledge of dynamics, information that demands fewer computational resources, into an understanding of thermodynamics, information that is computationally more costly. As communicated by the authors, this connection was verified for a specific system, and generalization of the findings to other systems should occur only after applying the same methods to those systems. Nonetheless, if additional analysis satisfies the relevant empirical and theoretical constraints, the methods detailed above can provide insight into more complicated environments.
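Pearson’s R as defined above is straightforward to compute. The sketch below uses invented, nearly linear (*S_trans*, ln *D*) pairs, not the paper’s data, to show how an approximately linear relationship produces R close to 1:

```python
import math

def pearson_r(xs, ys):
    """Covariance of the two variables divided by the product of their standard deviations."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs) / n)
    sy = math.sqrt(sum((y - my) ** 2 for y in ys) / n)
    return cov / (sx * sy)

# Invented, nearly linear stand-ins for (S_trans, ln D) pairs.
s_trans = [50, 55, 60, 65, 70, 75]
ln_D = [-21.0, -20.4, -19.9, -19.3, -18.8, -18.1]
assert pearson_r(s_trans, ln_D) > 0.99
```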

**Conclusion:**

Controllability, a notion open to several definitions, can be thought of as the capacity to move a system between different regions of its configuration space through the application of a certain number of admissible manipulations. The ultimate objective of chemical engineering analysis is the ability to determine the output of a system through rational and systematic control of its input variables. This controllability allows the optimization of processes such as separations. However, without the ability to monitor a system's response to perturbations, it becomes challenging to know in what direction or to what degree a change should be made. Thus, controllability implies observability of process variables; or, stated differently, all relevant process variables can be measured to some extent.

This report concentrated specifically on the interconnection between diffusion and entropy. Both of these entities are important in the design, characterization and control of engineering systems. A barrier to achieving full control arises from the difficulty of attaining and measuring abstract quantities such as entropy. A method to overcome this challenge is to identify a one-to-one correspondence between the intractable variable and one that is more accessible and more easily measured. Diffusion and the related diffusion coefficient represent the property that complies with computational and empirical methods and enables completion of the mapping. The equations and relations presented above are structurally diverse and apply under different conditions, but they show that from knowledge of a system's dynamics (diffusivity) one obtains knowledge of the system's thermodynamics.

**References:**

- Engel, Thomas, and Philip Reid. *Physical Chemistry*. San Francisco: Pearson Benjamin Cummings, 2006.
- Seader, J. D., Ernest J. Henley, and D. Keith Roper. *Separation Process Principles: Chemical and Biochemical Operations*, 3rd ed. New Jersey: John Wiley & Sons, Inc., 2011.
- Einstein, Albert. *Investigations on the Theory of the Brownian Movement*. Ed. R. Fürth. Trans. A. D. Cowper. Dover Publications, 1926 and 1956.
- Seki, Kazuhiko, and Biman Bagchi. “Relationship between Entropy and Diffusion: A statistical mechanical derivation of Rosenfeld expression for a rugged energy landscape.” *J. Chem. Phys.* 143(19), 2015. doi: 10.1063/1.4935969
- Rosenfeld, Yaakov. “Relation between the transport coefficients and the internal entropy of simple systems.” *Phys. Rev. A* 15, 2545, 1977.
- Rosenfeld, Yaakov. “A quasi-universal scaling law for atomic transport in simple fluids.” *J. Phys.: Condens. Matter* 11, 5415, 1999.
- Sharma, Ruchi, S. N. Chakraborty, and C. Chakravarty. “Entropy, diffusivity, and structural order in liquids with waterlike anomalies.” *J. Chem. Phys.* 125, 2006. doi: 10.1063/1.2390710
- Saha, Debasis, and Arnab Mukherjee. “Connecting diffusion and entropy of bulk water at the single particle level.” *J. Chem. Sci.* 129(7), 2017, pp. 825-832. doi: 10.1007/s23039-017-1317-z
- Hogg, Robert V., and Elliot A. Tanis. *Probability and Statistical Inference*, 6th ed. Prentice-Hall Inc., 2001.
