Wikis

Info-nuggets to help anyone understand key MLOps concepts, their significance, and how they apply throughout the ML lifecycle.

Jensen-Shannon (JS) Divergence

The Jensen-Shannon distance measures the similarity between two probability distributions.

It is based on the Kullback-Leibler (KL) divergence but, unlike the KL divergence, it is symmetric. The Jensen-Shannon distance between two identical distributions is 0.

It is defined as the square root of the Jensen-Shannon divergence, which is the average KL divergence of the two distributions from their mixture (their pointwise average).

The formula for the Jensen-Shannon distance between P and Q is:

JS(P, Q) = sqrt( [KL(P, M) + KL(Q, M)] / 2 )

where M is the average (mixture) of P and Q, i.e. M = (P + Q) / 2.

In other words, the Jensen-Shannon divergence is the average of KL(P, M) and KL(Q, M), and the Jensen-Shannon distance is its square root.
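
To make the definition concrete, here is a minimal sketch in Python (assuming NumPy and SciPy are available). It computes the distance directly from the formula above, using scipy.special.rel_entr for the element-wise KL terms, and cross-checks the result against scipy.spatial.distance.jensenshannon; the distributions p and q are arbitrary examples.

import numpy as np
from scipy.spatial.distance import jensenshannon
from scipy.special import rel_entr  # element-wise p * log(p / q)

def js_distance(p, q):
    """Jensen-Shannon distance between two discrete distributions p and q."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    p = p / p.sum()                  # normalize to valid probability vectors
    q = q / q.sum()
    m = 0.5 * (p + q)                # mixture distribution M = (P + Q) / 2
    js_div = 0.5 * (rel_entr(p, m).sum() + rel_entr(q, m).sum())  # JS divergence
    return np.sqrt(js_div)           # JS distance = sqrt(JS divergence)

p = [0.10, 0.40, 0.50]
q = [0.80, 0.15, 0.05]

print(js_distance(p, q))       # computed from the formula above
print(jensenshannon(p, q))     # SciPy's built-in distance (natural-log base)

Both calls should print the same value, since SciPy's jensenshannon also uses natural logarithms by default.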

The Jensen-Shannon distance is widely used to compare probability distributions in fields such as information theory, bioinformatics (for example, protein surface comparison), machine learning, and natural language processing. It is a popular choice because it is easy to compute and has several useful properties: it is symmetric, it is bounded, and it satisfies the triangle inequality.
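
These properties are easy to check numerically. The short sketch below, again using SciPy's jensenshannon with two disjoint toy distributions, illustrates symmetry and the upper bound: with base-2 logarithms the distance is at most 1, and it is sqrt(ln 2) ≈ 0.83 with natural logarithms.

from scipy.spatial.distance import jensenshannon

p = [1.0, 0.0]
q = [0.0, 1.0]

# Symmetry: JS(P, Q) == JS(Q, P)
print(jensenshannon(p, q), jensenshannon(q, p))

# Boundedness: with base-2 logarithms the distance is at most 1,
# and two completely disjoint distributions reach that bound.
print(jensenshannon(p, q, base=2))   # 1.0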

JS divergence can be used for both categorical and numerical features. In AryaXAI, the default threshold is 0.05.
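
As an illustration only (this is not AryaXAI's implementation), a drift check of this kind might bin a numerical feature's reference and current values into histograms over a shared range and flag drift when the Jensen-Shannon distance between them exceeds a chosen threshold such as 0.05. The function name, bin count, and synthetic data below are assumptions made for the sketch.

import numpy as np
from scipy.spatial.distance import jensenshannon

def js_drift_check(reference, current, bins=20, threshold=0.05):
    """Bin two numerical samples into histograms over a shared range and
    compare them with the Jensen-Shannon distance (illustrative only)."""
    lo = min(reference.min(), current.min())
    hi = max(reference.max(), current.max())
    ref_hist, _ = np.histogram(reference, bins=bins, range=(lo, hi))
    cur_hist, _ = np.histogram(current, bins=bins, range=(lo, hi))
    js = jensenshannon(ref_hist, cur_hist)  # histograms are normalized internally
    return js, bool(js > threshold)

rng = np.random.default_rng(0)
reference = rng.normal(0.0, 1.0, size=10_000)  # e.g. training-time feature values
current = rng.normal(0.5, 1.0, size=10_000)    # shifted production values

js, drifted = js_drift_check(reference, current)
print(f"JS distance = {js:.3f}, drift flagged: {drifted}")

For a categorical feature, the histograms would simply be replaced by normalized value counts over the union of observed categories.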

Is Explainability critical for your AI solutions?

Schedule a demo with our team to understand how AryaXAI can make your mission-critical AI acceptable and aligned with all your stakeholders.
