EntropyExplorer: an R package for computing and comparing differential Shannon entropy, differential coefficient of variation and differential expression



Differential Shannon entropy (DSE) and differential coefficient of variation (DCV) are effective metrics for the study of gene expression data. They can serve to augment differential expression (DE), and can be applied in numerous settings whenever one seeks to measure differences in variability rather than mere differences in magnitude. A general purpose, easily accessible tool for DSE and DCV would help make these two metrics available to data scientists. Automated p value computation would add further utility, since p values are often easier to interpret than raw test statistic values alone.


EntropyExplorer is an R package for calculating DSE, DCV and DE. It also computes corresponding p values for each metric. All features are available through a single R function call. Based on extensive investigations in the literature, the Fligner-Killeen test was chosen to compute DCV p values. No standard method was found to be appropriate for DSE, and so permutation testing is used to calculate DSE p values.


EntropyExplorer provides a convenient resource for calculating DSE, DCV, DE and associated p values. The package, along with its source code and reference manual, is freely available from the CRAN public repository at


Shannon entropy (SE) and coefficient of variation (CV) are used to measure the variability or dispersion of numerical data. Such variability has potential utility in numerous application domains, perhaps most notably in the analysis of high throughput biological data. Variability has been applied, for example, to study gene expression data in the context of human disease [1]. Increased entropy in particular, in both gene expression and protein interaction data, has been observed to be a characteristic of cancer [2]. Numerous other examples typify the utility of entropy [3–8] and coefficient of variation [9–12].

Shannon entropy is famously rooted in information theory [13]. To avoid confusion, we emphasize that we use the term “differential entropy” to denote a difference between two Shannon entropy values. This is distinct from information-theoretic terminology, in which “differential entropy” often means the entropy of a continuous, rather than a discrete, random variable [14].

We are particularly interested in differential analysis. In [15], we studied differential Shannon entropy (DSE) and differential coefficient of variation (DCV), and found them highly effective in identifying genes of potential interest not found by differential expression (DE) alone. DSE and DCV are applicable to other types of biological data as well, such as that produced by RNA-Seq technologies, although the usual caveats about careful interpretation apply. The usefulness of DSE and DCV is of course not limited to biological data. They may be applied to any numerical data for which normalized measures of differential variability are relevant.


EntropyExplorer is implemented in R [16]. All features are wrapped into a single function call, which takes as input up to eight arguments. Two of these arguments are numerical matrices, with identical labels for each row. The output is a matrix with two, three or five columns that contains in each row two SE, CV or mean values; a DSE, DCV or DE value; and/or two p values, one raw and one adjusted. Output rows can be sorted by value, raw p value or adjusted p value, and can be filtered to show only the top-ranked rows.

Permutation testing for DSE is accomplished with a standard R resampling routine. The default number of permutations is 1000, which the user can override. The p value for DCV is calculated by applying the Fligner-Killeen test for homogeneity of variances, implemented via the R function fligner.test, to the log-transform of the input data. The R function t.test is used to find a p value for DE. Adjusted p values are calculated using the p.adjust function in R, which provides false discovery rate and other multiple testing corrections. A more thorough explanation of p value calculations is provided in the discussion section.
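For illustration, analogous tests are available in SciPy. The sketch below mirrors the package's use of R's fligner.test (on log-transformed data) and t.test on hypothetical expression values; it is a Python stand-in, not the package's actual R code.

```python
# Python sketch of the DCV and DE p value computations described above.
# The data here are synthetic stand-ins for case/control expression matrices.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
case = rng.lognormal(mean=1.0, sigma=0.5, size=30)     # hypothetical case values
control = rng.lognormal(mean=1.0, sigma=1.0, size=30)  # hypothetical control values

# DCV p value: Fligner-Killeen test applied to the log-transform of the data
_, p_dcv = stats.fligner(np.log(case), np.log(control))

# DE p value: two-sample t-test, paralleling the package's use of t.test
_, p_de = stats.ttest_ind(case, control)

print(p_dcv, p_de)
```

R's p.adjust would then be applied across all rows; SciPy users can correct for multiple testing analogously (e.g. with a Benjamini–Hochberg step).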

EntropyExplorer checks that all matrix entries are positive. This is because calculations of a DSE value/p value and a DCV p value involve taking logarithms, which are undefined on data containing zeros or negative values. Also, CV becomes less meaningful when means approach zero or are negative. Experimental data may be noisy, however, and so EntropyExplorer provides mechanisms to handle non-positive values. An optional two-value argument permits the user to add a positive bias to all elements of one or both matrices before performing any other calculations. The argument can also be set to make this adjustment automatically, based on the smallest (most negative or zero) entry in each matrix.
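A minimal sketch of such a shift follows, assuming a simple "raise every entry above zero" rule; the function name and the exact offset are our own illustration, not necessarily the offset EntropyExplorer applies (see the reference manual for its actual behavior).

```python
# Hypothetical automatic shift: add a bias just large enough to make all
# entries positive, determined by the smallest entry in the matrix.
import numpy as np

def auto_shift(m, eps=1e-3):
    """Return m unchanged if all entries are positive; otherwise add
    (eps - min) so the smallest entry becomes eps > 0."""
    low = m.min()
    return m if low > 0 else m + (eps - low)

m = np.array([[0.0, -1.5],
              [2.0,  3.0]])
shifted = auto_shift(m)
print(shifted.min() > 0)  # True
```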


Let \(x_{1} ,x_{2} , \ldots ,x_{n}\) represent a list of n positive numbers, and let \(x = \sum\nolimits_{i = 1}^{n} {x_{i} }\) denote their sum. The Shannon entropy of this list is

$$SE = \frac{{ - \sum\nolimits_{i = 1}^{n} {\frac{{x_{i} }}{x}\log_{2} \frac{{x_{i} }}{x}} }}{{\log_{2} n}}.$$

The coefficient of variation is

$$CV = \frac{s}{{|\bar{x}|}}$$

where \(\bar{x} = x/n\) is the sample mean and \(s = \sqrt {\frac{{\sum\nolimits_{i = 1}^{n} {(x_{i} - \bar{x})^{2} } }}{n - 1}}\) is the sample standard deviation. Given two such lists of positive numbers with Shannon entropies \(SE_{1}\) and \(SE_{2}\), coefficients of variation \(CV_{1}\) and \(CV_{2}\), and means \(\bar{x}_{1}\) and \(\bar{x}_{2}\), \(DSE = |SE_{1} - SE_{2} |\), \(DCV = \left| {CV_{1} - CV_{2} } \right|\), and \(DE = \left| {\bar{x}_{1} - \bar{x}_{2} } \right|\).

Shannon entropy, as normalized above, falls in the range [0, 1]; DSE therefore also falls in the range [0, 1]. Lower (higher) SE corresponds to more (less) variability, since a perfectly uniform list attains the maximum entropy of 1. CV falls in the range [0, ∞); DCV therefore also has a range of [0, ∞).
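The definitions above translate directly into code. The following is an illustrative Python sketch only; the package itself is written in R, and these helper names are ours, not part of its API.

```python
# Normalized Shannon entropy, coefficient of variation, and the three
# differential metrics, computed exactly as defined in the text.
import math

def shannon_entropy(xs):
    """Normalized Shannon entropy of a list of positive numbers, in [0, 1]."""
    total = sum(xs)
    h = -sum((x / total) * math.log2(x / total) for x in xs)
    return h / math.log2(len(xs))  # divide by log2(n) to normalize

def coeff_of_variation(xs):
    """Sample coefficient of variation: sample standard deviation over |mean|."""
    n = len(xs)
    mean = sum(xs) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in xs) / (n - 1))
    return sd / abs(mean)

def dse(a, b):
    return abs(shannon_entropy(a) - shannon_entropy(b))

def dcv(a, b):
    return abs(coeff_of_variation(a) - coeff_of_variation(b))

def de(a, b):
    return abs(sum(a) / len(a) - sum(b) / len(b))

# A perfectly uniform list attains the maximum normalized entropy.
print(shannon_entropy([5.0, 5.0, 5.0, 5.0]))  # 1.0
```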


EntropyExplorer is invoked as follows:

EntropyExplorer(expm1, expm2, dmetric, otype, ntop, nperm, shift, padjustmethod)

We refer the reader to the reference manual, included as Additional file 1 and available on the project webpage, for a detailed description of all arguments and options. Included with the package is a sample mRNA microarray dataset, consisting of a few rows from a dataset obtained from the Gene Expression Omnibus (GEO) [17]. This dataset, GSE10810, contains case/control data on breast cancer [18]. Figures 1 and 2 provide example uses of EntropyExplorer on the full data.

Fig. 1

The output of EntropyExplorer on breast cancer data. The numerical matrices m1 and m2 have been read into R. The function call has specified “dse” for differential Shannon entropy, “v” for value, and 10 to return the top 10 values

Fig. 2

Another use of EntropyExplorer on breast cancer data. The function call has specified “dcv” for differential coefficient of variation, “bv” to specify both value and p value, and to sort by value, and 12 to return the top 12 rows


In addition to calculating DSE, DCV and DE, EntropyExplorer can calculate both raw and adjusted p values for each. ANOVA-based tests are the standard way to obtain differential expression p values; we use a t-test, their two-sample special case, for this purpose. Certainly more sophisticated methods exist; see, for example, [19, 20]. We therefore emphasize that EntropyExplorer includes DE only as a simple, convenient and straightforward point of comparison with the other two metrics. For DCV p values, we observe that 11 tests of equal relative variation were compared in [21], with the conclusion that the Fligner-Killeen test [22] is usually the most appropriate: it strikes a balance between type I and type II errors, and is robust to non-normal distributions.

Obtaining reliable p values for DSE proved much more challenging. We found no method in the literature specific to DSE p values. We therefore investigated the extent to which SE is correlated with variance. A high correlation would suggest that the two may be proxies for each other, in which case the p value of an F-test, or some derivation thereof, might serve as a suitable estimate of the DSE p value. Unfortunately, correlations between SE and variance, or between SE and a function of variance, were not high enough to justify using one as a surrogate for the other. Table 1 shows the correlation between SE and variance V, and between SE and the function \(\frac{1}{2}\ln \left( {2\pi eV} \right)\), an attempt to linearize the relationship, using the 16 datasets from [15]. The only notably high correlation is found in the obesity dataset. The obesity data, however, contain a large number of missing values, rendering the high correlation less reliable. We conclude that standard statistical tests related to variance do not appear suitable for testing DSE.

Table 1 Correlations between SE and variance, and between SE and \( \frac{1}{2}\ln \left( {2\pi eV} \right) \), on 16 microarray gene expression datasets

We also examined the distribution of DSE on the 16 datasets, with the goal of empirically determining a suitable reference distribution for DSE, from which p values could then be estimated analytically. We applied the Kolmogorov–Smirnov (KS) test to compare the DSE distribution of each dataset to several common reference distributions, including the normal, F, t and chi-square distributions. Because KS p values can be overly sensitive to deviations from the reference distribution [23], a D-statistic value below 0.1 was used instead to identify matching distributions. In our experiments, only the Parkinson’s dataset produced a D-statistic below 0.1 when tested against a normal or standardized t distribution (Table 2). Figure 3 shows a sample distribution of DSE, in this case using prostate cancer data.
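A distribution check of this kind can be sketched as follows with SciPy's KS test. The DSE values below are synthetic stand-ins for illustration only; the actual analysis used the 16 microarray datasets.

```python
# Sketch of a KS-based goodness-of-fit check: standardize the observed
# values, then compare them against a standard normal and read off the
# D statistic (ks.statistic) rather than the p value.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
dse_values = np.abs(rng.normal(0.0, 0.05, size=2000))  # synthetic per-gene DSEs

z = (dse_values - dse_values.mean()) / dse_values.std(ddof=1)
ks = stats.kstest(z, "norm")
print(round(ks.statistic, 3))
```

A D statistic below 0.1, the threshold used above, would indicate a plausible match to the reference distribution.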

Table 2 KS test D-statistic results comparing the DSE distribution against several common distributions
Fig. 3

The distribution of differential Shannon entropy. The observed distribution of differential Shannon entropy in sample prostate cancer data is shown. Similar patterns were seen in all 16 data sets. None of the standard distributions tested matched the observed distributions closely enough to be considered as a reference distribution for obtaining p values

We conclude from this that none of the distributions tested are close enough approximations to the observed DSE distribution to be used as a proxy for obtaining p values. Thus, without a known distribution function or suitable surrogate, we resort to resampling in order to obtain reliable DSE p values. While computationally demanding, the following permutation test makes no assumptions about the underlying distribution of the data. Given two lists of numbers, containing \(n_{1}\) and \(n_{2}\) numerical elements respectively, we first calculate their DSE and then create a new list A containing all \(n_{1} + n_{2}\) numbers from the two lists. Next we randomly permute the elements of A, then recalculate DSE, treating the first \(n_{1}\) elements of A as one list and the last \(n_{2}\) elements of A as a second list. This shuffle-and-recalculate step is repeated for the specified number of permutations. The resultant p value is simply the proportion of all recalculated DSEs that are at least as extreme as the original DSE.
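The permutation test just described can be sketched as follows. This is a Python illustration, not the package's R implementation, and the helper names are ours.

```python
# Permutation test for a DSE p value: pool both lists, shuffle, split at
# n1, recompute DSE, and count permutations at least as extreme.
import math
import random

def shannon_entropy(xs):
    """Normalized Shannon entropy of a list of positive numbers."""
    total = sum(xs)
    h = -sum((x / total) * math.log2(x / total) for x in xs)
    return h / math.log2(len(xs))

def dse_pvalue(a, b, nperm=1000, seed=0):
    rng = random.Random(seed)
    observed = abs(shannon_entropy(a) - shannon_entropy(b))
    pooled = list(a) + list(b)
    n1 = len(a)
    hits = 0
    for _ in range(nperm):
        rng.shuffle(pooled)
        perm = abs(shannon_entropy(pooled[:n1]) - shannon_entropy(pooled[n1:]))
        if perm >= observed:  # at least as extreme as the original DSE
            hits += 1
    return hits / nperm

# Identical lists have an observed DSE of 0, so every permutation is at
# least as extreme and the p value is 1.
print(dse_pvalue([1.0, 2.0, 3.0, 4.0], [1.0, 2.0, 3.0, 4.0], nperm=100))  # 1.0
```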

In addition to raw p values, EntropyExplorer also calculates p values adjusted for multiple testing. A user can choose to adjust based on FDR, Holm or another multiple-testing adjustment.


We have produced EntropyExplorer, an R package for calculating differential Shannon entropy, differential coefficient of variation and differential expression. This package also calculates raw and adjusted p values for each metric. These measures have been shown to complement one another [15], making this package an effective tool for users in search of more expansive suites of differential analysis methods.

Availability and requirements

Project name: EntropyExplorer.

Project home page:

Operating system(s): Platform independent.

Programming language: R.

Other requirements: R version 3.0 or later is recommended.

License: GNU General Public License version 3.0 (GPLv3).

Any restrictions to use by non-academics: None.

Additional availability: EntropyExplorer is integrated into the GrAPPA toolkit at


  1. Ho JW, Stefani M, dos Remedios CG, Charleston MA. Differential variability analysis of gene expression and its application to human diseases. Bioinformatics. 2008;24(13):i390–8.

  2. West J, Bianconi G, Severini S, Teschendorff AE. Differential network entropy reveals cancer system hallmarks. Scientific reports. 2012;2:802.

  3. Berretta R, Moscato P. Cancer biomarker discovery: the entropic hallmark. PLoS One. 2010;5(8):e12262.

  4. Sherwin WB. Entropy and information approaches to genetic diversity and its expression: genomic geography. Entropy. 2010;12(7):1765–98.

  5. Masisi L, Nelwamondo F, Marwala T. The use of entropy to measure structural diversity. In: IEEE 6th International Conference on Computational Cybernetics. Stará Lesná, Slovakia.

  6. Chen B-S, Li C-W. On the interplay between entropy and robustness of gene regulatory networks. Entropy. 2010;12(5):1071–101.

  7. Furlanello C, Serafini M, Merler S, Jurman G. Entropy-based gene ranking without selection bias for the predictive classification of microarray data. BMC Bioinform. 2003;4:54.

  8. Kohane IS, Kho AT, Butte AJ. Microarrays for an integrative genomics. Cambridge, Mass.: MIT Press; 2003.

  9. Reed GF, Lynn F, Meade BD. Use of coefficient of variation in assessing variability of quantitative assays. Clin Diagn Lab Immunol. 2002;9(6):1235–9.

  10. Weber EU, Shafir S, Blais AR. Predicting risk sensitivity in humans and lower animals: risk as variance or coefficient of variation. Psychol Rev. 2004;111(2):430–45.

  11. Faber DS, Korn H. Applicability of the coefficient of variation method for analyzing synaptic plasticity. Biophys J. 1991;60(5):1288–94.

  12. Bedeian AG, Mossholder KW. On the use of the coefficient of variation as a measure of diversity. Organ Res Methods. 2000;3(3):285–97.

  13. Shannon CE. A mathematical theory of communication. Bell Syst Tech J. 1948;27(3):379–423.

  14. Cover TM, Thomas JA. Elements of information theory. 2nd ed. Hoboken: Wiley-Interscience; 2006.

  15. Wang K, Phillips CA, Rogers GL, Barrenas F, Benson M, Langston MA. Differential Shannon entropy and differential coefficient of variation: alternatives and augmentations to differential expression in the search for disease-related genes. Int J Comput Biol Drug Des. 2014;7(2–3):183–94.

  16. R Core Team. R: a language and environment for statistical computing. Vienna: R Foundation for Statistical Computing; 2014.

  17. Edgar R, Domrachev M, Lash AE. Gene expression omnibus: NCBI gene expression and hybridization array data repository. Nucleic Acids Res. 2002;30(1):207–10.

  18. Pedraza V, Gomez-Capilla JA, Escaramis G, Gomez C, Torne P, Rivera JM, Gil A, Araque P, Olea N, Estivill X, et al. Gene expression signatures in breast cancer distinguish phenotype characteristics, histologic subtypes, and tumor invasiveness. Cancer. 2010;116(2):486–96.

  19. Love MI, Huber W, Anders S. Moderated estimation of fold change and dispersion for RNA-seq data with DESeq2. Genome Biol. 2014;15(12):550.

  20. Tusher VG, Tibshirani R, Chu G. Significance analysis of microarrays applied to the ionizing radiation response. Proc Natl Acad Sci USA. 2001;98(9):5116–21.

  21. Donnelly SM, Kramer A. Testing for multiple species in fossil samples: an evaluation and comparison of tests for equal relative variation. Am J Phys Anthropol. 1999;108(4):507–29.

  22. Fligner MA, Killeen TJ. Distribution-free two-sample tests for scale. J Am Stat Assoc. 1976;71(353):210–3.

  23. Ghasemi A, Zahediasl S. Normality tests for statistical analysis: a guide for non-statisticians. Int J Endocrinol Metab. 2012;10(2):486–9.

Authors’ contributions

KW implemented the package and performed numerous analytical tests. CAP led exhaustive software evaluations and isolated relevant data. AMS provided statistical expertise and assisted with human factors engineering. MAL directed the project and provided for its support. All authors participated in writing the paper. All authors read and approved the final manuscript.


This work has been supported in part by the National Institutes of Health under awards R01-AA-018776 and 3P20MD000516-07S1.

Competing interests

The authors declare that they have no competing interests.

Author information


Corresponding author

Correspondence to Charles A. Phillips.

Additional file

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver applies to the data made available in this article, unless otherwise stated.

Cite this article

Wang, K., Phillips, C.A., Saxton, A.M. et al. EntropyExplorer: an R package for computing and comparing differential Shannon entropy, differential coefficient of variation and differential expression. BMC Res Notes 8, 832 (2015).
