Abstraction for Statistical Models

This package defines an abstract type StatisticalModel, and an abstract subtype RegressionModel.

In particular, instances of StatisticalModel implement the following methods.

StatsBase.adjr2Function
adjr2(obj::StatisticalModel)
adjr²(obj::StatisticalModel)

Adjusted coefficient of determination (adjusted R-squared).

For linear models, the adjusted R² is defined as $1 - (1 - R^2)(n-1)/(n-p)$, with $R^2$ the coefficient of determination, $n$ the number of observations, and $p$ the number of coefficients (including the intercept). This definition is generally known as the Wherry Formula I.
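
As a quick check, this definition can be reproduced from r2, nobs, and coef. The sketch below assumes a linear model fitted with GLM.jl (a separate package providing concrete StatisticalModel types); the data and variable names are purely illustrative.

using GLM, DataFrames, StatsBase

df = DataFrame(x = 1:10, y = 2 .* (1:10) .+ randn(10))
m = lm(@formula(y ~ x), df)        # hypothetical linear model from GLM.jl

n = nobs(m)
p = length(coef(m))                # number of coefficients, including the intercept
1 - (1 - r2(m)) * (n - 1) / (n - p) ≈ adjr2(m)   # expected to be true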

source
adjr2(obj::StatisticalModel, variant::Symbol)
adjr²(obj::StatisticalModel, variant::Symbol)

Adjusted pseudo-coefficient of determination (adjusted pseudo R-squared).

For nonlinear models, one of several pseudo R² definitions must be chosen via variant. The only currently supported variant is :MacFadden, defined as $1 - (\log L - k)/\log L_0$. In this formula, $L$ is the likelihood of the model, $L_0$ that of the null model (the model including only the intercept), and $k$ is the number of consumed degrees of freedom of the model (as returned by dof).
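
This relation can be verified directly from loglikelihood, nullloglikelihood, and dof when the fitted model type implements them. A minimal sketch, assuming a logistic regression fitted with GLM.jl (a separate package) and purely illustrative data:

using GLM, DataFrames, StatsBase

df = DataFrame(x = randn(100))
df.y = Int.(df.x .+ randn(100) .> 0)
m = glm(@formula(y ~ x), df, Binomial(), LogitLink())   # hypothetical model from GLM.jl

1 - (loglikelihood(m) - dof(m)) / nullloglikelihood(m) ≈ adjr2(m, :MacFadden)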

source
StatsBase.aicFunction
aic(obj::StatisticalModel)

Akaike's Information Criterion, defined as $-2 \log L + 2k$, with $L$ the likelihood of the model, and $k$ its number of consumed degrees of freedom (as returned by dof).

source
StatsBase.aiccFunction
aicc(obj::StatisticalModel)

Corrected Akaike's Information Criterion for small sample sizes (Hurvich and Tsai 1989), defined as $-2 \log L + 2k + 2k(k+1)/(n-k-1)$, with $L$ the likelihood of the model, $k$ its number of consumed degrees of freedom (as returned by dof), and $n$ the number of observations (as returned by nobs).

source
StatsBase.bicFunction
bic(obj::StatisticalModel)

Bayesian Information Criterion, defined as $-2 \log L + k \log n$, with $L$ the likelihood of the model, $k$ its number of consumed degrees of freedom (as returned by dof), and $n$ the number of observations (as returned by nobs).
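
The three information criteria above (aic, aicc, and bic) can all be reproduced from loglikelihood, dof, and nobs. A minimal sketch, assuming a linear model fitted with GLM.jl (a separate package) and purely illustrative data:

using GLM, DataFrames, StatsBase

df = DataFrame(x = randn(30), y = randn(30))
m = lm(@formula(y ~ x), df)     # hypothetical linear model from GLM.jl

ll, k, n = loglikelihood(m), dof(m), nobs(m)
aic(m)  ≈ -2ll + 2k                            # Akaike's criterion
aicc(m) ≈ -2ll + 2k + 2k*(k + 1)/(n - k - 1)   # small-sample correction
bic(m)  ≈ -2ll + k*log(n)                      # Bayesian criterion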

source
StatsBase.coeftableFunction
coeftable(obj::StatisticalModel; level::Real=0.95)

Return a table of class CoefTable with coefficients and related statistics. level determines the level for confidence intervals (by default, 95%).

source
StatsBase.confintFunction
confint(obj::StatisticalModel; level::Real=0.95)

Compute confidence intervals for coefficients, with confidence level level (by default 95%).
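
Both coeftable and confint accept the level keyword. A minimal sketch, assuming a linear model fitted with GLM.jl (a separate package) whose methods support the keyword; the data are purely illustrative:

using GLM, DataFrames, StatsBase

df = DataFrame(x = randn(50), y = randn(50))
m = lm(@formula(y ~ x), df)    # hypothetical linear model from GLM.jl

confint(m)                 # 95% intervals by default, one row per coefficient
confint(m, level=0.90)     # narrower 90% intervals
coeftable(m, level=0.90)   # the same level reflected in the printed table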

source
StatsBase.devianceFunction
deviance(obj::StatisticalModel)

Return the deviance of the model relative to a reference, which is usually the saturated model when applicable. It is equal, up to a constant, to $-2 \log L$, with $L$ the likelihood of the model.

source
StatsBase.dofFunction
dof(obj::StatisticalModel)

Return the number of degrees of freedom consumed in the model, including, when applicable, the intercept and the distribution's dispersion parameter.

source
StatsBase.fitFunction
fit(Histogram, data[, weight][, edges]; closed=:left, nbins)

Fit a histogram to data.

Arguments

  • data: either a vector (for a 1-dimensional histogram), or a tuple of vectors of equal length (for an n-dimensional histogram).

  • weight: an optional AbstractWeights (of the same length as the data vectors), denoting the weight each observation contributes to the bin. If no weight vector is supplied, each observation has weight 1.

  • edges: a vector (typically an AbstractRange object), or tuple of vectors, that gives the edges of the bins along each dimension. If no edges are provided, these are determined from the data.

Keyword arguments

  • closed: if :left (the default), the bin intervals are left-closed [a,b); if :right, intervals are right-closed (a,b].

  • nbins: if no edges argument is supplied, the approximate number of bins to use along each dimension (can be either a single integer, or a tuple of integers).

Examples

# Univariate
h = fit(Histogram, rand(100))
h = fit(Histogram, rand(100), 0:0.1:1.0)
h = fit(Histogram, rand(100), nbins=10)
h = fit(Histogram, rand(100), weights(rand(100)), 0:0.1:1.0)
h = fit(Histogram, [20], 0:20:100)
h = fit(Histogram, [20], 0:20:100, closed=:right)

# Multivariate
h = fit(Histogram, (rand(100), rand(100)))
h = fit(Histogram, (rand(100), rand(100)), nbins=10)
source

Fit a statistical model.

source
fit(ZScoreTransform, X; dims=nothing, center=true, scale=true)

Fit standardization parameters to vector or matrix X and return a ZScoreTransform transformation object.

Keyword arguments

  • dims: if 1 fit standardization parameters in column-wise fashion; if 2 fit in row-wise fashion. The default is nothing, which is equivalent to dims=2 with a deprecation warning.

  • center: if true (the default) center data so that its mean is zero.

  • scale: if true (the default) scale the data so that its variance is equal to one.

Examples

julia> using StatsBase

julia> X = [0.0 -0.5 0.5; 0.0 1.0 2.0]
2×3 Array{Float64,2}:
 0.0  -0.5  0.5
 0.0   1.0  2.0

julia> dt = fit(ZScoreTransform, X, dims=2)
ZScoreTransform{Float64}(2, 2, [0.0, 1.0], [0.5, 1.0])

julia> StatsBase.transform(dt, X)
2×3 Array{Float64,2}:
  0.0  -1.0  1.0
 -1.0   0.0  1.0
source
fit(UnitRangeTransform, X; dims=nothing, unit=true)

Fit scaling parameters to vector or matrix X and return a UnitRangeTransform transformation object.

Keyword arguments

  • dims: if 1 fit scaling parameters in column-wise fashion; if 2 fit in row-wise fashion. The default is nothing.

  • unit: if true (the default) shift the minimum data to zero.

Examples

julia> using StatsBase

julia> X = [0.0 -0.5 0.5; 0.0 1.0 2.0]
2×3 Array{Float64,2}:
 0.0  -0.5  0.5
 0.0   1.0  2.0

julia> dt = fit(UnitRangeTransform, X, dims=2)
UnitRangeTransform{Float64}(2, 2, true, [-0.5, 0.0], [1.0, 0.5])

julia> StatsBase.transform(dt, X)
2×3 Array{Float64,2}:
 0.5  0.0  1.0
 0.0  0.5  1.0
source
StatsBase.informationmatrixFunction
informationmatrix(model::StatisticalModel; expected::Bool = true)

Return the information matrix. By default the Fisher information matrix is returned, while the observed information matrix can be requested with expected = false.

source
StatsBase.nobsFunction
nobs(obj::StatisticalModel)

Return the number of independent observations on which the model was fitted. Be careful when using this information, as the definition of an independent observation may vary depending on the model, on the format used to pass the data, on the sampling plan (if specified), etc.

source
StatsBase.nulldevianceFunction
nulldeviance(obj::StatisticalModel)

Return the deviance of the null model, that is, the model including only the intercept.

source
StatsBase.r2Function
r2(obj::StatisticalModel)
r²(obj::StatisticalModel)

Coefficient of determination (R-squared).

For a linear model, the R² is defined as $ESS/TSS$, with $ESS$ the explained sum of squares and $TSS$ the total sum of squares.
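
The decomposition can be checked by hand from response and the fitted values. A minimal sketch, assuming a linear model fitted with GLM.jl (a separate package) and purely illustrative data:

using GLM, DataFrames, Statistics, StatsBase

df = DataFrame(x = 1:20, y = 3 .* (1:20) .+ randn(20))
m = lm(@formula(y ~ x), df)        # hypothetical linear model from GLM.jl

y, yhat = response(m), predict(m)
ess = sum(abs2, yhat .- mean(y))   # explained sum of squares
tss = sum(abs2, y .- mean(y))      # total sum of squares
ess / tss ≈ r2(m)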

source
r2(obj::StatisticalModel, variant::Symbol)
r²(obj::StatisticalModel, variant::Symbol)

Pseudo-coefficient of determination (pseudo R-squared).

For nonlinear models, one of several pseudo R² definitions must be chosen via variant. Supported variants are:

  • :MacFadden (a.k.a. likelihood ratio index), defined as $1 - \log (L)/\log (L_0)$;
  • :CoxSnell, defined as $1 - (L_0/L)^{2/n}$;
  • :Nagelkerke, defined as $(1 - (L_0/L)^{2/n})/(1 - L_0^{2/n})$.

In the above formulas, $L$ is the likelihood of the model, $L_0$ is the likelihood of the null model (the model with only an intercept), and $n$ is the number of observations.

Cox and Snell's R² should match the classical R² for linear models.
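
A minimal sketch of choosing a variant, assuming a logistic regression fitted with GLM.jl (a separate package) whose type implements loglikelihood and nullloglikelihood; the data are purely illustrative, and the MacFadden value is checked against its definition:

using GLM, DataFrames, StatsBase

df = DataFrame(x = randn(200))
df.y = Int.(df.x .+ randn(200) .> 0)
m = glm(@formula(y ~ x), df, Binomial(), LogitLink())   # hypothetical model from GLM.jl

r2(m, :MacFadden)    # likelihood ratio index
r2(m, :Nagelkerke)   # Cox-Snell rescaled so the maximum attainable value is 1
1 - loglikelihood(m) / nullloglikelihood(m) ≈ r2(m, :MacFadden)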

source
StatsBase.scoreFunction
score(obj::StatisticalModel)

Return the score of the statistical model. The score is the gradient of the log-likelihood with respect to the coefficients.

source
StatsBase.stderrorFunction
stderror(obj::StatisticalModel)

Return the standard errors for the coefficients of the model.

source
StatsBase.vcovFunction
vcov(obj::StatisticalModel)

Return the variance-covariance matrix for the coefficients of the model.
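
The two methods above are consistent: standard errors correspond to the square roots of the diagonal of the variance-covariance matrix. A minimal sketch, assuming a linear model fitted with GLM.jl (a separate package) and purely illustrative data:

using GLM, DataFrames, LinearAlgebra, StatsBase

df = DataFrame(x = randn(40), y = randn(40))
m = lm(@formula(y ~ x), df)    # hypothetical linear model from GLM.jl

stderror(m) ≈ sqrt.(diag(vcov(m)))   # expected to be true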

source

RegressionModel extends StatisticalModel by implementing the following additional methods.

StatsBase.crossmodelmatrixFunction
crossmodelmatrix(obj::RegressionModel)

Return X'X where X is the model matrix of obj. This function will return a pre-computed matrix stored in obj if possible.
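
A minimal sketch of this identity, assuming a linear model fitted with GLM.jl (a separate package) and purely illustrative data:

using GLM, DataFrames, StatsBase

df = DataFrame(x = randn(25), y = randn(25))
m = lm(@formula(y ~ x), df)    # hypothetical linear model from GLM.jl

X = modelmatrix(m)             # design matrix, including the intercept column
crossmodelmatrix(m) ≈ X' * X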

source
StatsBase.responseFunction
response(obj::RegressionModel)

Return the model response (a.k.a. the dependent variable).

source
StatsBase.responsenameFunction
responsename(obj::RegressionModel)

Return the name of the model response (a.k.a. the dependent variable).

source
StatsBase.predictFunction
predict(obj::RegressionModel, [newX])

Form the predicted response of model obj. An object with new covariate values newX can be supplied, which should have the same type and structure as that used to fit obj; e.g. for a GLM it would generally be a DataFrame with the same variable names as the original predictors.
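
A minimal sketch, assuming a model fitted from a DataFrame with GLM.jl (a separate package) and purely illustrative data:

using GLM, DataFrames, StatsBase

df = DataFrame(x = 1:10, y = 2 .* (1:10) .+ randn(10))
m = lm(@formula(y ~ x), df)            # hypothetical linear model from GLM.jl

predict(m)                             # fitted values for the original data
predict(m, DataFrame(x = [11, 12]))    # predictions for new covariate values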

source