Prism

Analyze, graph and present scientific data faster than ever with Prism!

Statistical Comparisons

  • Paired or unpaired t tests. Reports P values and confidence intervals (an unpaired t test of this kind is sketched in the example after this list).
  • Nonparametric Mann-Whitney test, including confidence interval of difference of medians.
  • Kolmogorov-Smirnov test to compare two groups.
  • Wilcoxon test with confidence interval of median.
  • Perform many t tests at once, using the False Discovery Rate (or Bonferroni multiple comparisons) to choose which comparisons are discoveries worth studying further.
  • Ordinary or repeated measures one-way ANOVA followed by the Tukey, Newman-Keuls, Dunnett, Bonferroni or Holm-Sidak multiple comparison tests, the post-test for trend, or Fisher’s Least Significant Difference (LSD) test.
  • Many multiple comparison tests are accompanied by confidence intervals and multiplicity-adjusted P values.
  • Greenhouse-Geisser correction so repeated measures one-way ANOVA does not have to assume sphericity. When this is chosen, multiple comparison tests also do not assume sphericity.
  • Kruskal-Wallis or Friedman nonparametric one-way ANOVA with Dunn’s post test.
  • Fisher’s exact test or the chi-square test. Calculate the relative risk and odds ratio with confidence intervals.
  • Two-way ANOVA, even with missing values, with some post tests.
  • Two-way ANOVA, with repeated measures in one or both factors. Tukey, Newman-Keuls, Dunnett, Bonferroni, Holm-Sidak, or Fisher’s LSD multiple comparison tests of main and simple effects.
  • Three-way ANOVA (limited to two levels in two of the factors, and any number of levels in the third).
  • Kaplan-Meier survival analysis. Compare curves with the log-rank test (including test for trend).
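
For orientation, here is a minimal Python sketch (using numpy and scipy, not Prism itself) of the kind of unpaired t test listed above, reporting a two-tailed P value and a 95% confidence interval of the difference between means. The control and treated arrays are invented example data.

    import numpy as np
    from scipy import stats

    # Invented example data for two independent groups.
    control = np.array([4.2, 5.1, 4.8, 5.6, 4.9, 5.3])
    treated = np.array([6.0, 5.8, 6.4, 6.9, 6.1, 5.7])

    # Unpaired (two-sample) t test: t statistic and two-tailed P value.
    t_stat, p_value = stats.ttest_ind(control, treated)

    # 95% confidence interval of the difference between means,
    # assuming equal variances (pooled standard error), to match the test above.
    n1, n2 = len(control), len(treated)
    diff = treated.mean() - control.mean()
    sp2 = ((n1 - 1) * control.var(ddof=1) + (n2 - 1) * treated.var(ddof=1)) / (n1 + n2 - 2)
    se = np.sqrt(sp2 * (1 / n1 + 1 / n2))
    t_crit = stats.t.ppf(0.975, df=n1 + n2 - 2)
    ci = (diff - t_crit * se, diff + t_crit * se)

    print(f"t = {t_stat:.3f}, P = {p_value:.4f}")
    print(f"Difference between means = {diff:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")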

Column Statistics

  • Calculate min, max, quartiles, mean, SD, SEM, CI and CV (sketched in the example after this list).
  • Mean or geometric mean with confidence intervals.
  • Frequency distributions (bin to histogram), including cumulative histograms.
  • Normality testing by three methods.
  • One sample t test or Wilcoxon test to compare the column mean (or median) with a theoretical value.
  • Skewness and kurtosis.
  • Identify outliers using the Grubbs’ or ROUT method.
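
Below is a rough numpy/scipy sketch of these column statistics on invented example data. The D'Agostino-Pearson test (scipy.stats.normaltest) stands in for one of the normality-testing methods; the printout is only illustrative of what Prism reports, not its actual output.

    import numpy as np
    from scipy import stats

    # Invented example data for a single column.
    rng = np.random.default_rng(0)
    values = rng.normal(loc=12.5, scale=1.0, size=30)

    n = values.size
    mean = values.mean()
    sd = values.std(ddof=1)                 # sample standard deviation
    sem = sd / np.sqrt(n)                   # standard error of the mean
    cv = sd / mean                          # coefficient of variation
    q1, median, q3 = np.percentile(values, [25, 50, 75])

    # 95% confidence interval of the mean.
    t_crit = stats.t.ppf(0.975, df=n - 1)
    ci = (mean - t_crit * sem, mean + t_crit * sem)

    # Skewness, kurtosis, and a D'Agostino-Pearson normality test.
    skew = stats.skew(values, bias=False)
    kurt = stats.kurtosis(values, bias=False)
    k2, p_normal = stats.normaltest(values)

    print(f"n={n}  mean={mean:.2f}  SD={sd:.2f}  SEM={sem:.2f}  CV={cv:.1%}")
    print(f"min={values.min():.2f}  Q1={q1:.2f}  median={median:.2f}  Q3={q3:.2f}  max={values.max():.2f}")
    print(f"95% CI of mean = ({ci[0]:.2f}, {ci[1]:.2f})")
    print(f"skewness={skew:.2f}  kurtosis={kurt:.2f}  normality P={p_normal:.3f}")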

Linear Regression and Correlation

  • Calculate slope and intercept with confidence intervals (see the example after this list).
  • Force the regression line through a specified point.
  • Fit to replicate Y values or mean Y.
  • Test for departure from linearity with a runs test.
  • Calculate and graph residuals.
  • Compare slopes and intercepts of two or more regression lines.
  • Interpolate new points along the standard curve.
  • Pearson or Spearman (nonparametric) correlation.
  • Analyze a stack of P values, using Bonferroni multiple comparisons or the FDR approach to identify “significant” findings or discoveries.
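
As a rough illustration, here is a minimal scipy sketch of simple linear regression and correlation on invented XY data. The slope and intercept confidence intervals are computed from the standard errors returned by scipy.stats.linregress (the intercept_stderr attribute assumes a reasonably recent scipy); this is not Prism's own code.

    import numpy as np
    from scipy import stats

    # Invented example XY data.
    x = np.array([1, 2, 3, 4, 5, 6, 7, 8], dtype=float)
    y = np.array([2.1, 3.9, 6.2, 7.8, 10.1, 12.2, 13.8, 16.1])

    fit = stats.linregress(x, y)

    # 95% confidence intervals for slope and intercept from their standard errors.
    t_crit = stats.t.ppf(0.975, df=len(x) - 2)
    slope_ci = (fit.slope - t_crit * fit.stderr, fit.slope + t_crit * fit.stderr)
    intercept_ci = (fit.intercept - t_crit * fit.intercept_stderr,
                    fit.intercept + t_crit * fit.intercept_stderr)

    # Pearson (parametric) and Spearman (nonparametric) correlation.
    r, p_r = stats.pearsonr(x, y)
    rho, p_rho = stats.spearmanr(x, y)

    print(f"slope = {fit.slope:.3f}, 95% CI = ({slope_ci[0]:.3f}, {slope_ci[1]:.3f})")
    print(f"intercept = {fit.intercept:.3f}, 95% CI = ({intercept_ci[0]:.3f}, {intercept_ci[1]:.3f})")
    print(f"Pearson r = {r:.3f} (P = {p_r:.4f}); Spearman rho = {rho:.3f} (P = {p_rho:.4f})")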

Nonlinear Regression

  • Fit one of our 105 built-in equations, or enter your own (a minimal curve-fitting sketch follows this list).
  • Enter differential or implicit equations.
  • Enter different equations for different data sets.
  • Global nonlinear regression – share parameters between data sets.
  • Robust nonlinear regression.
  • Automatic outlier identification or elimination.
  • Compare models using extra sum-of-squares F test or AICc.
  • Compare parameters between data sets.
  • Apply constraints.
  • Differentially weight points by several methods and assess how well your weighting method worked.
  • Accept automatic initial estimated values or enter your own.
  • Automatically graph curve over specified range of X values.
  • Quantify precision of fits with SE or CI of parameters. Confidence intervals can be symmetrical (as is traditional) or asymmetrical (which is more accurate).
  • Quantify symmetry of imprecision with Hougaard’s skewness.
  • Plot confidence or prediction bands.
  • Test normality of residuals.
  • Runs or replicates test of adequacy of model.
  • Report the covariance matrix or set of dependencies.
  • Easily interpolate points from the best fit curve.
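
For a sense of the basic mechanics, the sketch below fits an assumed one-site binding model with scipy.optimize.curve_fit on invented data, applies a simple non-negativity constraint, and reports parameter standard errors and traditional symmetric confidence intervals from the covariance matrix. It does not reproduce Prism's built-in equations, asymmetrical intervals, outlier handling, or weighting options.

    import numpy as np
    from scipy import stats
    from scipy.optimize import curve_fit

    # Hypothetical one-site specific-binding model: Y = Bmax * X / (Kd + X).
    def one_site(x, bmax, kd):
        return bmax * x / (kd + x)

    # Invented example data (concentration vs. response).
    x = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0, 100.0])
    y = np.array([1.9, 4.6, 10.2, 16.8, 22.5, 25.1, 26.4])

    # Fit with rough initial estimates and a lower-bound constraint (parameters >= 0).
    popt, pcov = curve_fit(one_site, x, y, p0=[25.0, 2.0], bounds=(0, np.inf))

    # Standard errors from the covariance matrix, and symmetric 95% confidence intervals.
    se = np.sqrt(np.diag(pcov))
    t_crit = stats.t.ppf(0.975, df=len(x) - len(popt))
    for name, value, err in zip(["Bmax", "Kd"], popt, se):
        lo, hi = value - t_crit * err, value + t_crit * err
        print(f"{name} = {value:.2f} (SE {err:.2f}, 95% CI {lo:.2f} to {hi:.2f})")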

Clinical (Diagnostic) Lab Statistics

  • Bland-Altman plots (the underlying calculation is sketched after this list).
  • Receiver operating characteristic (ROC) curves.
  • Deming regression (type II linear regression).
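
A minimal numpy sketch of the Bland-Altman calculation on invented paired measurements follows. It reports the bias and 95% limits of agreement that the plot would display, and leaves the plotting itself out.

    import numpy as np

    # Invented paired measurements of the same samples by two methods.
    method_a = np.array([10.2, 11.5, 9.8, 12.1, 10.9, 11.2, 10.5, 12.4])
    method_b = np.array([10.5, 11.2, 10.1, 12.6, 11.3, 11.0, 10.9, 12.1])

    # Bland-Altman: difference vs. average; report bias and 95% limits of agreement.
    avg = (method_a + method_b) / 2
    diff = method_a - method_b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    loa = (bias - 1.96 * sd, bias + 1.96 * sd)

    print(f"bias = {bias:.3f}, 95% limits of agreement = ({loa[0]:.3f}, {loa[1]:.3f})")
    # The plot itself (diff vs. avg with horizontal lines at the bias and limits
    # of agreement) is omitted to keep this sketch dependency-free.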

Simulations

  • Simulate XY, Column or Contingency tables.
  • Repeat analyses of simulated data as a Monte Carlo analysis (see the sketch after this list).
  • Plot functions from equations you select or enter and parameter values you choose.
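
The sketch below illustrates the Monte Carlo idea with invented settings: simulate many data sets from a known linear model with Gaussian scatter, reanalyze each one, and examine the spread of the fitted slope. It is a generic example, not Prism's simulation engine.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)

    # "True" model parameters and sampling points for the simulated experiments.
    true_slope, true_intercept, noise_sd = 2.0, 1.0, 1.5
    x = np.linspace(0, 10, 12)

    slopes = []
    for _ in range(1000):
        # Simulate one data set: the true line plus Gaussian scatter, then refit it.
        y = true_intercept + true_slope * x + rng.normal(0, noise_sd, size=x.size)
        slopes.append(stats.linregress(x, y).slope)

    slopes = np.array(slopes)
    print(f"mean fitted slope = {slopes.mean():.3f}, SD across simulations = {slopes.std(ddof=1):.3f}")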

Other Calculations

  • Area under the curve, with confidence interval (the basic AUC calculation is sketched after this list).
  • Transform data.
  • Normalize.
  • Identify outliers.
  • Normality tests.
  • Transpose tables.
  • Subtract baseline (and combine columns).
  • Compute each value as a fraction of its row, column or grand total.
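
Finally, a short numpy sketch of a few of these calculations on invented data: trapezoidal area under the curve (without the confidence interval Prism adds), normalizing a column to a 0-100% scale, and expressing each value as a fraction of the grand total.

    import numpy as np

    # Invented example curve (response sampled at increasing X).
    x = np.array([0, 1, 2, 3, 4, 5], dtype=float)
    y = np.array([0.0, 1.2, 2.8, 3.5, 3.1, 1.9])

    # Area under the curve by the trapezoidal rule.
    auc = np.trapz(y, x)

    # Normalize so the smallest value is 0% and the largest is 100%.
    normalized = 100 * (y - y.min()) / (y.max() - y.min())

    # Express each value as a fraction of the column's grand total.
    fraction_of_total = y / y.sum()

    print(f"AUC = {auc:.2f}")
    print("normalized (%):", np.round(normalized, 1))
    print("fraction of total:", np.round(fraction_of_total, 3))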