Title: | Utility Functions for 'MixtComp' Outputs |
---|---|
Description: | Mixture Composer <https://github.com/modal-inria/MixtComp> is a project to build mixture models with heterogeneous data sets and partially missing data management. This package contains graphical, getter and some utility functions to facilitate the analysis of 'MixtComp' output. |
Authors: | Vincent Kubicki [aut], Christophe Biernacki [aut], Quentin Grimonprez [aut, cre], Matthieu Marbac-Lourdelle [aut], Étienne Goffinet [ctb], Serge Iovleff [ctb], Julien Vandaele [ctb] |
Maintainer: | Quentin Grimonprez <[email protected]> |
License: | AGPL-3 |
Version: | 4.1.6 |
Built: | 2024-11-09 04:53:52 UTC |
Source: | https://github.com/modal-inria/mixtcomp |
MixtComp (Mixture Composer, https://github.com/modal-inria/MixtComp) is a model-based clustering package for mixed data originating from the Modal team (Inria Lille).
It is engineered for quick and easy integration of new univariate models under the conditional independence assumption. Five basic models (Gaussian, Multinomial, Poisson, Weibull, NegativeBinomial) are implemented, as well as two advanced models (Func_CS and Rank_ISR). MixtComp natively manages missing data (completely missing or given by an interval). MixtComp is used as an R package, but its internals are coded in C++ using state-of-the-art libraries for faster computation.
This package provides plot, getter and formatting functions that simplify the use of the RMixtComp and RMixtCompIO packages. RMixtComp is recommended over RMixtCompIO, as it is more user-friendly.
createAlgo gives you default values for the required algorithm parameters. The convertFunctionalToVector, createFunctional and refactorCategorical functions help transform data into the required format.
Getters are available to easily access some results: getBIC, getICL, getCompletedData, getParam, getTik, getEmpiricTik, getPartition, getType, getModel, getVarNames.
You can compute discriminative powers and similarities with functions: computeDiscrimPowerClass, computeDiscrimPowerVar, computeSimilarityClass, computeSimilarityVar.
Graphics functions are plot.MixtComp, heatmapClass, heatmapTikSorted, heatmapVar, histMisclassif, plotConvergence, plotDataBoxplot, plotDataCI, plotDiscrimClass, plotDiscrimVar, plotProportion.
See also: the RMixtComp, RMixtCompIO and Rmixmod packages.
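As a quick orientation, the sketch below shows how these pieces typically fit together after a model has been fitted. It is an illustration only: it assumes `res` is an object returned by mixtCompLearn from RMixtComp (or rmcMultiRun from RMixtCompIO) and that "var1" is the name of one of your variables.
# illustrative post-fit analysis, assuming `res` is a MixtComp(Learn) result
getBIC(res)              # model selection criterion
getPartition(res)        # estimated class of each individual
getParam(res, "var1")    # estimated parameters of a given variable
plotProportion(res)      # mixture proportions
plotDiscrimVar(res)      # discriminative power of the variables
heatmapTikSorted(res)    # posterior class probabilities, sorted by class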
Get information about models implemented in MixtComp
availableModels()
a data.frame containing the models implemented in MixtComp, with one row per model and columns giving: the model name, the data type, the special format required for individuals, the accepted formats (separated by a ";") for missing values, the required hyperparameters in the paramStr element of the model object, comments about the model, and a link to the reference article
Quentin Grimonprez
See also: mixtCompLearn
availableModels()
Add the missing elements to the algo parameter with default values
completeAlgo(algo)
algo |
a list with the different algo parameters for the rmc function |
the algo parameter with all required elements (see the createAlgo function)
Quentin Grimonprez
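A minimal sketch of the intended use, assuming a partially specified list of algorithm parameters (the values shown are illustrative):
# fill a partial algo list; missing elements get their default values
algo <- completeAlgo(list(nbBurnInIter = 100, nbIter = 100))
str(algo)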
Compute the discriminative power of each variable or class
computeDiscrimPowerVar(outMixtComp, class = NULL)
computeDiscrimPowerClass(outMixtComp)
outMixtComp |
object of class MixtCompLearn or MixtComp obtained using mixtCompLearn function from RMixtComp or rmcMultiRun function from RMixtCompIO |
class |
NULL or a class number. If NULL, return the discriminative power of the variables globally, otherwise return the discriminative power of the variables in the given class |
The discriminative power of variable j is defined by 1 - C(j)
A high value (close to one) means that the variable is highly discriminating. A low value (close to zero) means that the variable is poorly discriminating.
The discriminative power of variable j in class k is defined by 1 - C(j)
The discriminative power of class k is defined by 1 - D(k)
the discriminative power
Matthieu Marbac
See also: plotDiscrimClass, plotDiscrimVar
if (requireNamespace("RMixtCompIO", quietly = TRUE)) {
  dataLearn <- list(
    var1 = as.character(c(rnorm(50, -2, 0.8), rnorm(50, 2, 0.8))),
    var2 = as.character(c(rnorm(50, 2), rpois(50, 8)))
  )
  model <- list(
    var1 = list(type = "Gaussian", paramStr = ""),
    var2 = list(type = "Poisson", paramStr = "")
  )
  algo <- list(
    nClass = 2, nInd = 100, nbBurnInIter = 100, nbIter = 100,
    nbGibbsBurnInIter = 100, nbGibbsIter = 100, nInitPerClass = 3,
    nSemTry = 20, confidenceLevel = 0.95, ratioStableCriterion = 0.95,
    nStableCriterion = 10, mode = "learn"
  )
  resLearn <- RMixtCompIO::rmcMultiRun(algo, dataLearn, model, nRun = 3)

  discVar <- computeDiscrimPowerVar(resLearn)
  discVarInClass1 <- computeDiscrimPowerVar(resLearn, class = 1)
  discClass <- computeDiscrimPowerClass(resLearn)

  # graphic representation of discriminant variables
  plotDiscrimVar(resLearn)
  # graphic representation of discriminant classes
  plotDiscrimClass(resLearn)
}
Compute the similarity between variables (or classes)
computeSimilarityVar(outMixtComp)
computeSimilarityClass(outMixtComp)
outMixtComp |
object of class MixtCompLearn or MixtComp obtained using mixtCompLearn function from RMixtComp or rmcMultiRun function from RMixtCompIO |
The similarity between variables j and h is defined by Delta(j,h)
The similarity between classes k and g is defined by 1 - Sigma(k,g)
a similarity matrix
Quentin Grimonprez
if (requireNamespace("RMixtCompIO", quietly = TRUE)) {
  dataLearn <- list(
    var1 = as.character(c(rnorm(50, -2, 0.8), rnorm(50, 2, 0.8))),
    var2 = as.character(c(rnorm(50, 2), rpois(50, 8)))
  )
  model <- list(
    var1 = list(type = "Gaussian", paramStr = ""),
    var2 = list(type = "Poisson", paramStr = "")
  )
  algo <- list(
    nClass = 2, nInd = 100, nbBurnInIter = 100, nbIter = 100,
    nbGibbsBurnInIter = 100, nbGibbsIter = 100, nInitPerClass = 3,
    nSemTry = 20, confidenceLevel = 0.95, ratioStableCriterion = 0.95,
    nStableCriterion = 10, mode = "learn"
  )
  resLearn <- RMixtCompIO::rmcMultiRun(algo, dataLearn, model, nRun = 3)

  simVar <- computeSimilarityVar(resLearn)
  simClass <- computeSimilarityClass(resLearn)
}
Convert a MixtComp functional string into a list of 2 vectors
convertFunctionalToVector(x)
x |
a string containing a functional observation (cf example) |
a list of 2 vectors: time and value
Quentin Grimonprez
convertFunctionalToVector("1:5,1.5:12,1.999:2.9")
convertFunctionalToVector("1:5,1.5:12,1.999:2.9")
Create an algo object required by mixtCompLearn and mixtCompPredict from RMixtComp.
createAlgo(
  nbBurnInIter = 50,
  nbIter = 50,
  nbGibbsBurnInIter = 50,
  nbGibbsIter = 50,
  nInitPerClass = 50,
  nSemTry = 20,
  confidenceLevel = 0.95,
  ratioStableCriterion = 0.99,
  nStableCriterion = 20
)
nbBurnInIter |
Number of iterations of the burn-in part of the SEM algorithm. |
nbIter |
Number of iterations of the SEM algorithm. |
nbGibbsBurnInIter |
Number of iterations of the burn-in part of the Gibbs algorithm. |
nbGibbsIter |
Number of iterations of the Gibbs algorithm. |
nInitPerClass |
Number of individuals used to initialize each cluster. |
nSemTry |
Number of attempts of the algorithm to avoid an error. |
confidenceLevel |
Confidence level for the confidence bounds of the parameter estimation |
ratioStableCriterion |
Partition stability ratio required to stop the SEM algorithm earlier |
nStableCriterion |
Number of iterations with a stable partition required to stop the SEM algorithm earlier |
a list with the parameter values
Quentin Grimonprez
# default values
algo <- createAlgo()

# change some values
algo <- createAlgo(nbIter = 200)
Create a functional in MixtComp format
createFunctional(time, value)
time |
vector containing the time of the functional |
value |
vector containing the value of the functional |
The functional data formatted to the MixtComp standard
Quentin Grimonprez
mat <- matrix(c(1, 2, 3, 9, 1, 1.5, 15, 1000), ncol = 2)
createFunctional(mat[, 1], mat[, 2])
Format a data.frame or matrix into a list of character vectors
formatData(data)
data |
data parameter as data.frame, matrix or list |
data as a list of characters
Quentin Grimonprez
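A minimal sketch of the intended use, assuming a small data.frame with mixed column types (the data are illustrative):
# convert a data.frame into the list-of-character format expected by rmc/rmcMultiRun
dat <- data.frame(
  var1 = c(1.2, 2.5, 3.1),
  var2 = c("a", "b", "a")
)
formatData(dat)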
Format the model list for the rmc/rmcMultiRun functions: add paramStr when missing and ensure that each element is in list format
formatModel(model)
model |
description of model used per variable |
model as a list where each element is the model applied to a variable (list with elements type and paramStr)
Quentin Grimonprez
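A minimal sketch, assuming a model list given in shorthand form (variable types only); under that assumption formatModel should add the missing paramStr elements and wrap each entry into the expected list format:
# complete a shorthand model description (illustrative input)
model <- list(var1 = "Gaussian", var2 = list(type = "Poisson"))
formatModel(model)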
Get criterion value
getBIC(outMixtComp)
getICL(outMixtComp)
outMixtComp |
object of class MixtCompLearn or MixtComp obtained using mixtCompLearn function from RMixtComp or rmcMultiRun function from RMixtCompIO |
value of the criterion
Quentin Grimonprez
Other getter: getCompletedData(), getEmpiricTik(), getMixtureDensity(), getParam(), getPartition(), getType()
if (requireNamespace("RMixtCompIO", quietly = TRUE)) {
  dataLearn <- list(
    var1 = as.character(c(rnorm(50, -2, 0.8), rnorm(50, 2, 0.8))),
    var2 = as.character(c(rnorm(50, 2), rpois(50, 8)))
  )
  model <- list(
    var1 = list(type = "Gaussian", paramStr = ""),
    var2 = list(type = "Poisson", paramStr = "")
  )
  algo <- list(
    nClass = 2, nInd = 100, nbBurnInIter = 100, nbIter = 100,
    nbGibbsBurnInIter = 100, nbGibbsIter = 100, nInitPerClass = 3,
    nSemTry = 20, confidenceLevel = 0.95, ratioStableCriterion = 0.95,
    nStableCriterion = 10, mode = "learn"
  )
  resLearn <- RMixtCompIO::rmcMultiRun(algo, dataLearn, model, nRun = 3)

  # get criterion
  bic <- getBIC(resLearn)
  icl <- getICL(resLearn)
}
Get the completed data from MixtComp object (does not manage functional models)
getCompletedData(outMixtComp, var = NULL, with.z_class = FALSE)
outMixtComp |
object of class MixtCompLearn or MixtComp obtained using mixtCompLearn function from RMixtComp or rmcMultiRun function from RMixtCompIO |
var |
Name of the variables for which to extract the completed data. Default is NULL (all variables are extracted) |
with.z_class |
if TRUE, z_class is returned with the data. |
a matrix with the data completed by MixtComp (z_class is in the first column and the variables are then sorted in alphabetical order, which may differ from the original order of the data).
Quentin Grimonprez
Other getter: getBIC(), getEmpiricTik(), getMixtureDensity(), getParam(), getPartition(), getType()
if (requireNamespace("RMixtCompIO", quietly = TRUE)) {
  dataLearn <- list(
    var1 = as.character(c(rnorm(50, -2, 0.8), rnorm(50, 2, 0.8))),
    var2 = as.character(c(rnorm(50, 2), rpois(50, 8)))
  )
  # add missing values
  dataLearn$var1[12] <- "?"
  dataLearn$var2[72] <- "?"

  model <- list(
    var1 = list(type = "Gaussian", paramStr = ""),
    var2 = list(type = "Poisson", paramStr = "")
  )
  algo <- list(
    nClass = 2, nInd = 100, nbBurnInIter = 100, nbIter = 100,
    nbGibbsBurnInIter = 100, nbGibbsIter = 100, nInitPerClass = 3,
    nSemTry = 20, confidenceLevel = 0.95, ratioStableCriterion = 0.95,
    nStableCriterion = 10, mode = "learn"
  )
  resLearn <- RMixtCompIO::rmcMultiRun(algo, dataLearn, model, nRun = 3)

  # get completedData
  completedData <- getCompletedData(resLearn)
  completedData2 <- getCompletedData(resLearn, var = "var1")
}
Get the a posteriori probability of belonging to each class for each individual
getEmpiricTik(outMixtComp)
getTik(outMixtComp, log = TRUE)
outMixtComp |
object of class MixtCompLearn or MixtComp obtained using mixtCompLearn function from RMixtComp or rmcMultiRun function from RMixtCompIO |
log |
if TRUE, log(tik) are returned |
getTik returns a posteriori probabilities computed with the returned parameters. getEmpiricTik returns an estimation based on the sampled z_i during the algorithm.
a matrix containing the tik for each individual (in row) and each class (in column).
Quentin Grimonprez
Other getter: getBIC(), getCompletedData(), getMixtureDensity(), getParam(), getPartition(), getType()
if (requireNamespace("RMixtCompIO", quietly = TRUE)) {
  dataLearn <- list(
    var1 = as.character(c(rnorm(50, -2, 0.8), rnorm(50, 2, 0.8))),
    var2 = as.character(c(rnorm(50, 2), rpois(50, 8)))
  )
  model <- list(
    var1 = list(type = "Gaussian", paramStr = ""),
    var2 = list(type = "Poisson", paramStr = "")
  )
  algo <- list(
    nClass = 2, nInd = 100, nbBurnInIter = 100, nbIter = 100,
    nbGibbsBurnInIter = 100, nbGibbsIter = 100, nInitPerClass = 3,
    nSemTry = 20, confidenceLevel = 0.95, ratioStableCriterion = 0.95,
    nStableCriterion = 10, mode = "learn"
  )
  resLearn <- RMixtCompIO::rmcMultiRun(algo, dataLearn, model, nRun = 3)

  # get tik
  tikEmp <- getEmpiricTik(resLearn)
  tik <- getTik(resLearn, log = FALSE)
}
Get the mixture density for each individual
getMixtureDensity(outMixtComp)
outMixtComp |
object of class MixtCompLearn or MixtComp obtained using mixtCompLearn function from RMixtComp or rmcMultiRun function from RMixtCompIO |
a vector containing the mixture density for each individual.
Quentin Grimonprez
Other getter: getBIC(), getCompletedData(), getEmpiricTik(), getParam(), getPartition(), getType()
if (requireNamespace("RMixtCompIO", quietly = TRUE)) {
  dataLearn <- list(
    var1 = as.character(c(rnorm(50, -2, 0.8), rnorm(50, 2, 0.8))),
    var2 = as.character(c(rnorm(50, 2), rpois(50, 8)))
  )
  model <- list(
    var1 = list(type = "Gaussian", paramStr = ""),
    var2 = list(type = "Poisson", paramStr = "")
  )
  algo <- list(
    nClass = 2, nInd = 100, nbBurnInIter = 100, nbIter = 100,
    nbGibbsBurnInIter = 100, nbGibbsIter = 100, nInitPerClass = 3,
    nSemTry = 20, confidenceLevel = 0.95, ratioStableCriterion = 0.95,
    nStableCriterion = 10, mode = "learn"
  )
  resLearn <- RMixtCompIO::rmcMultiRun(algo, dataLearn, model, nRun = 3)

  d <- getMixtureDensity(resLearn)
}
Get the estimated parameter
getParam(outMixtComp, var)
getProportion(outMixtComp)
outMixtComp |
object of class MixtCompLearn or MixtComp obtained using mixtCompLearn function from RMixtComp or rmcMultiRun function from RMixtCompIO |
var |
name of the variable to get parameter |
the parameter of the variable
Quentin Grimonprez
Other getter: getBIC(), getCompletedData(), getEmpiricTik(), getMixtureDensity(), getPartition(), getType()
if (requireNamespace("RMixtCompIO", quietly = TRUE)) {
  dataLearn <- list(
    var1 = as.character(c(rnorm(50, -2, 0.8), rnorm(50, 2, 0.8))),
    var2 = as.character(c(rnorm(50, 2), rpois(50, 8)))
  )
  model <- list(
    var1 = list(type = "Gaussian", paramStr = ""),
    var2 = list(type = "Poisson", paramStr = "")
  )
  algo <- list(
    nClass = 2, nInd = 100, nbBurnInIter = 100, nbIter = 100,
    nbGibbsBurnInIter = 100, nbGibbsIter = 100, nInitPerClass = 3,
    nSemTry = 20, confidenceLevel = 0.95, ratioStableCriterion = 0.95,
    nStableCriterion = 10, mode = "learn"
  )
  resLearn <- RMixtCompIO::rmcMultiRun(algo, dataLearn, model, nRun = 3)

  # get estimated parameters for variable var1
  param <- getParam(resLearn, "var1")
  prop <- getProportion(resLearn)
}
Get the estimated class from MixtComp object
getPartition(outMixtComp, empiric = FALSE)
outMixtComp |
object of class MixtCompLearn or MixtComp obtained using mixtCompLearn function from RMixtComp or rmcMultiRun function from RMixtCompIO |
empiric |
if TRUE, use the partition obtained at the end of the Gibbs algorithm. If FALSE, use the partition obtained with the observed probabilities. |
a vector containing the estimated class for each individual.
Quentin Grimonprez
Other getter: getBIC(), getCompletedData(), getEmpiricTik(), getMixtureDensity(), getParam(), getType()
if (requireNamespace("RMixtCompIO", quietly = TRUE)) {
  dataLearn <- list(
    var1 = as.character(c(rnorm(50, -2, 0.8), rnorm(50, 2, 0.8))),
    var2 = as.character(c(rnorm(50, 2), rpois(50, 8)))
  )
  model <- list(
    var1 = list(type = "Gaussian", paramStr = ""),
    var2 = list(type = "Poisson", paramStr = "")
  )
  algo <- list(
    nClass = 2, nInd = 100, nbBurnInIter = 100, nbIter = 100,
    nbGibbsBurnInIter = 100, nbGibbsIter = 100, nInitPerClass = 3,
    nSemTry = 20, confidenceLevel = 0.95, ratioStableCriterion = 0.95,
    nStableCriterion = 10, mode = "learn"
  )
  resLearn <- RMixtCompIO::rmcMultiRun(algo, dataLearn, model, nRun = 3)

  # get class
  estimatedClass <- getPartition(resLearn)
}
getType returns the type of each variable of a MixtComp object, getModel returns the model object, and getVarNames returns the name of each variable
getType(outMixtComp, with.z_class = FALSE)
getModel(outMixtComp, with.z_class = FALSE)
getVarNames(outMixtComp, with.z_class = FALSE)
outMixtComp |
object of class MixtCompLearn or MixtComp obtained using mixtCompLearn function from RMixtComp or rmcMultiRun function from RMixtCompIO |
with.z_class |
if TRUE, the type of z_class is returned. |
a vector containing the model types, the models or the names associated with each variable.
Quentin Grimonprez
Other getter: getBIC(), getCompletedData(), getEmpiricTik(), getMixtureDensity(), getParam(), getPartition()
if (requireNamespace("RMixtCompIO", quietly = TRUE)) {
  dataLearn <- list(
    var1 = as.character(c(rnorm(50, -2, 0.8), rnorm(50, 2, 0.8))),
    var2 = as.character(c(rnorm(50, 2), rpois(50, 8)))
  )
  model <- list(
    var1 = list(type = "Gaussian", paramStr = ""),
    var2 = list(type = "Poisson", paramStr = "")
  )
  algo <- list(
    nClass = 2, nInd = 100, nbBurnInIter = 100, nbIter = 100,
    nbGibbsBurnInIter = 100, nbGibbsIter = 100, nInitPerClass = 3,
    nSemTry = 20, confidenceLevel = 0.95, ratioStableCriterion = 0.95,
    nStableCriterion = 10, mode = "learn"
  )
  resLearn <- RMixtCompIO::rmcMultiRun(algo, dataLearn, model, nRun = 3)

  # get type
  type <- getType(resLearn)
  # get model object
  model <- getModel(resLearn)
  # get variable names
  varNames <- getVarNames(resLearn)
}
Heatmap of the similarities between classes with respect to the clustering
heatmapClass(output, pkg = c("ggplot2", "plotly"), ...)
output |
object returned by mixtCompLearn function from RMixtComp or rmcMultiRun function from RMixtCompIO |
pkg |
"ggplot2" or "plotly". Package used to plot |
... |
arguments to be passed to plot_ly. For pkg = "ggplot2", addValues = TRUE prints similarity values on the heatmap |
The similarity between classes k and g is defined by 1 - Sigma(k,g)
Matthieu MARBAC
Other plot: heatmapTikSorted(), heatmapVar(), histMisclassif(), plot.MixtComp(), plotConvergence(), plotDataBoxplot(), plotDataCI(), plotDiscrimClass(), plotDiscrimVar(), plotParamConvergence(), plotProportion()
if (requireNamespace("RMixtCompIO", quietly = TRUE)) {
  dataLearn <- list(
    var1 = as.character(c(rnorm(50, -2, 0.8), rnorm(50, 2, 0.8))),
    var2 = as.character(c(rnorm(50, 2), rpois(50, 8)))
  )
  model <- list(
    var1 = list(type = "Gaussian", paramStr = ""),
    var2 = list(type = "Poisson", paramStr = "")
  )
  algo <- list(
    nClass = 2, nInd = 100, nbBurnInIter = 100, nbIter = 100,
    nbGibbsBurnInIter = 100, nbGibbsIter = 100, nInitPerClass = 3,
    nSemTry = 20, confidenceLevel = 0.95, ratioStableCriterion = 0.95,
    nStableCriterion = 10, mode = "learn"
  )
  resLearn <- RMixtCompIO::rmcMultiRun(algo, dataLearn, model, nRun = 3)

  # plot
  heatmapClass(resLearn)
}
Heatmap of the tik = P(Z_i=k|x_i)
heatmapTikSorted(output, pkg = c("ggplot2", "plotly"), ...)
output |
object returned by mixtCompLearn function from RMixtComp or rmcMultiRun function from RMixtCompIO |
pkg |
"ggplot2" or "plotly". Package used to plot |
... |
arguments to be passed to plot_ly |
Observations are sorted according to the hard partition, then, within each component, they are sorted in decreasing order of their tik
Matthieu MARBAC
Other plot: heatmapClass(), heatmapVar(), histMisclassif(), plot.MixtComp(), plotConvergence(), plotDataBoxplot(), plotDataCI(), plotDiscrimClass(), plotDiscrimVar(), plotParamConvergence(), plotProportion()
if (requireNamespace("RMixtCompIO", quietly = TRUE)) {
  dataLearn <- list(
    var1 = as.character(c(rnorm(50, -2, 0.8), rnorm(50, 2, 0.8))),
    var2 = as.character(c(rnorm(50, 2), rpois(50, 8)))
  )
  model <- list(
    var1 = list(type = "Gaussian", paramStr = ""),
    var2 = list(type = "Poisson", paramStr = "")
  )
  algo <- list(
    nClass = 2, nInd = 100, nbBurnInIter = 100, nbIter = 100,
    nbGibbsBurnInIter = 100, nbGibbsIter = 100, nInitPerClass = 3,
    nSemTry = 20, confidenceLevel = 0.95, ratioStableCriterion = 0.95,
    nStableCriterion = 10, mode = "learn"
  )
  resLearn <- RMixtCompIO::rmcMultiRun(algo, dataLearn, model, nRun = 3)

  # plot
  heatmapTikSorted(resLearn)
}
Heatmap of the similarities between variables with respect to the clustering
heatmapVar(output, pkg = c("ggplot2", "plotly"), ...)
output |
object returned by mixtCompLearn function from RMixtComp or rmcMultiRun function from RMixtCompIO |
pkg |
"ggplot2" or "plotly". Package used to plot |
... |
arguments to be passed to plot_ly. For pkg = "ggplot2", addValues = TRUE prints similarity values on the heatmap |
The similarity between variables j and h is defined by Delta(j,h)
Matthieu MARBAC
Other plot: heatmapClass(), heatmapTikSorted(), histMisclassif(), plot.MixtComp(), plotConvergence(), plotDataBoxplot(), plotDataCI(), plotDiscrimClass(), plotDiscrimVar(), plotParamConvergence(), plotProportion()
if (requireNamespace("RMixtCompIO", quietly = TRUE)) {
  dataLearn <- list(
    var1 = as.character(c(rnorm(50, -2, 0.8), rnorm(50, 2, 0.8))),
    var2 = as.character(c(rnorm(50, 2), rpois(50, 8)))
  )
  model <- list(
    var1 = list(type = "Gaussian", paramStr = ""),
    var2 = list(type = "Poisson", paramStr = "")
  )
  algo <- list(
    nClass = 2, nInd = 100, nbBurnInIter = 100, nbIter = 100,
    nbGibbsBurnInIter = 100, nbGibbsIter = 100, nInitPerClass = 3,
    nSemTry = 20, confidenceLevel = 0.95, ratioStableCriterion = 0.95,
    nStableCriterion = 10, mode = "learn"
  )
  resLearn <- RMixtCompIO::rmcMultiRun(algo, dataLearn, model, nRun = 3)

  # plot
  heatmapVar(resLearn)
}
Histogram of the misclassification probabilities
histMisclassif(output, pkg = c("ggplot2", "plotly"), ...)
output |
object returned by mixtCompLearn function from RMixtComp or rmcMultiRun function from RMixtCompIO |
pkg |
"ggplot2" or "plotly". Package used to plot |
... |
arguments to be passed to plot_ly |
The misclassification probability of observation i is denoted err_i and is defined by err_i = 1 - max_{k=1,...,K} P(Z_i=k|x_i). Histograms of the err_i can be plotted for a specific class, for all classes pooled together, or for every class separately.
Matthieu MARBAC
Other plot: heatmapClass(), heatmapTikSorted(), heatmapVar(), plot.MixtComp(), plotConvergence(), plotDataBoxplot(), plotDataCI(), plotDiscrimClass(), plotDiscrimVar(), plotParamConvergence(), plotProportion()
if (requireNamespace("RMixtCompIO", quietly = TRUE)) {
  dataLearn <- list(
    var1 = as.character(c(rnorm(50, -2, 0.8), rnorm(50, 2, 0.8))),
    var2 = as.character(c(rnorm(50, 2), rpois(50, 8)))
  )
  model <- list(
    var1 = list(type = "Gaussian", paramStr = ""),
    var2 = list(type = "Poisson", paramStr = "")
  )
  algo <- list(
    nClass = 2, nInd = 100, nbBurnInIter = 100, nbIter = 100,
    nbGibbsBurnInIter = 100, nbGibbsIter = 100, nInitPerClass = 3,
    nSemTry = 20, confidenceLevel = 0.95, ratioStableCriterion = 0.95,
    nStableCriterion = 10, mode = "learn"
  )
  resLearn <- RMixtCompIO::rmcMultiRun(algo, dataLearn, model, nRun = 3)

  # plot
  histMisclassif(resLearn)
}
Plot of a MixtComp object
## S3 method for class 'MixtComp'
plot(
  x,
  nVarMaxToPlot = 3,
  pkg = c("ggplot2", "plotly"),
  plotData = c("CI", "Boxplot"),
  ...
)
x |
MixtComp object |
nVarMaxToPlot |
number of variables to display |
pkg |
"ggplot2" or "plotly". Package used to plot |
plotData |
"CI" or "Boxplot". If "CI", uses plotDataCI function. If "Boxplot", uses plotDataBoxplot |
... |
extra parameter for plotDataCI |
Quentin Grimonprez
See also: mixtCompLearn, mixtCompPredict
Other plot: heatmapClass(), heatmapTikSorted(), heatmapVar(), histMisclassif(), plotConvergence(), plotDataBoxplot(), plotDataCI(), plotDiscrimClass(), plotDiscrimVar(), plotParamConvergence(), plotProportion()
if (requireNamespace("RMixtCompIO", quietly = TRUE)) {
  dataLearn <- list(
    var1 = as.character(c(rnorm(50, -2, 0.8), rnorm(50, 2, 0.8))),
    var2 = as.character(c(rnorm(50, 2), rpois(50, 8)))
  )
  model <- list(
    var1 = list(type = "Gaussian", paramStr = ""),
    var2 = list(type = "Poisson", paramStr = "")
  )
  algo <- list(
    nClass = 2, nInd = 100, nbBurnInIter = 100, nbIter = 100,
    nbGibbsBurnInIter = 100, nbGibbsIter = 100, nInitPerClass = 3,
    nSemTry = 20, confidenceLevel = 0.95, ratioStableCriterion = 0.95,
    nStableCriterion = 10, mode = "learn"
  )
  resLearn <- RMixtCompIO::rmcMultiRun(algo, dataLearn, model, nRun = 3)

  plot(resLearn)
}
Plot the evolution of the completed loglikelihood during the SEM algorithm. The vertical line denotes the end of the burn-in phase.
plotConvergence(output, ...)
output |
object returned by mixtCompLearn function from RMixtComp or rmcMultiRun function from RMixtCompIO |
... |
graphical parameters |
This function can be used to check the convergence and choose the parameters nbBurnInIter and nbIter from mcStrategy.
Quentin Grimonprez
Other plot: heatmapClass(), heatmapTikSorted(), heatmapVar(), histMisclassif(), plot.MixtComp(), plotDataBoxplot(), plotDataCI(), plotDiscrimClass(), plotDiscrimVar(), plotParamConvergence(), plotProportion()
if (requireNamespace("RMixtCompIO", quietly = TRUE)) {
  dataLearn <- list(
    var1 = as.character(c(rnorm(50, -2, 0.8), rnorm(50, 2, 0.8))),
    var2 = as.character(c(rnorm(50, 2), rpois(50, 8)))
  )
  model <- list(
    var1 = list(type = "Gaussian", paramStr = ""),
    var2 = list(type = "Poisson", paramStr = "")
  )
  algo <- list(
    nClass = 2, nInd = 100, nbBurnInIter = 100, nbIter = 100,
    nbGibbsBurnInIter = 100, nbGibbsIter = 100, nInitPerClass = 3,
    nSemTry = 20, confidenceLevel = 0.95, ratioStableCriterion = 0.95,
    nStableCriterion = 10, mode = "learn"
  )
  resLearn <- RMixtCompIO::rmcMultiRun(algo, dataLearn, model, nRun = 3)

  # plot
  plotConvergence(resLearn)
}
Display a boxplot (5%-25%-50%-75%-95% quantiles) per class for a given variable
plotDataBoxplot(
  output,
  var,
  class = seq_len(output$algo$nClass),
  grl = TRUE,
  pkg = c("ggplot2", "plotly"),
  ...
)
output |
object returned by mixtCompLearn function from RMixtComp or rmcMultiRun function from RMixtCompIO |
var |
name of the variable |
class |
classes to plot |
grl |
if TRUE plot the general distribution of the data |
pkg |
"ggplot2" or "plotly". Package used to plot |
... |
other parameters (see Details) |
For functional data, three other parameters are available: a logical indicating whether the observations are added to the plot (default FALSE), the ylim of the plot, and the xlim of the plot.
Matthieu MARBAC
Other plot: heatmapClass(), heatmapTikSorted(), heatmapVar(), histMisclassif(), plot.MixtComp(), plotConvergence(), plotDataCI(), plotDiscrimClass(), plotDiscrimVar(), plotParamConvergence(), plotProportion()
if (requireNamespace("RMixtCompIO", quietly = TRUE)) {
  dataLearn <- list(
    var1 = as.character(c(rnorm(50, -2, 0.8), rnorm(50, 2, 0.8))),
    var2 = as.character(c(rnorm(50, 2), rpois(50, 8)))
  )
  model <- list(
    var1 = list(type = "Gaussian", paramStr = ""),
    var2 = list(type = "Poisson", paramStr = "")
  )
  algo <- list(
    nClass = 2, nInd = 100, nbBurnInIter = 100, nbIter = 100,
    nbGibbsBurnInIter = 100, nbGibbsIter = 100, nInitPerClass = 3,
    nSemTry = 20, confidenceLevel = 0.95, ratioStableCriterion = 0.95,
    nStableCriterion = 10, mode = "learn"
  )
  resLearn <- RMixtCompIO::rmcMultiRun(algo, dataLearn, model, nRun = 3)

  # plot
  plotDataBoxplot(resLearn, "var1")
}
Mean and 95%-level confidence intervals per class
plotDataCI(
  output,
  var,
  class = seq_len(output$algo$nClass),
  grl = FALSE,
  pkg = c("ggplot2", "plotly"),
  ...
)
output |
object returned by mixtCompLearn function from RMixtComp or rmcMultiRun function from RMixtCompIO |
var |
name of the variable |
class |
class to plot |
grl |
if TRUE plot the CI for the dataset and not only classes |
pkg |
"ggplot2" or "plotly". Package used to plot |
... |
other parameters (see Details) |
For functional data, other parameters are available: a logical indicating whether the observations are added to the plot (default FALSE), a logical indicating whether the confidence intervals are drawn (default TRUE), the xlim of the plot, and the ylim of the plot.
Matthieu MARBAC
Other plot: heatmapClass(), heatmapTikSorted(), heatmapVar(), histMisclassif(), plot.MixtComp(), plotConvergence(), plotDataBoxplot(), plotDiscrimClass(), plotDiscrimVar(), plotParamConvergence(), plotProportion()
if (requireNamespace("RMixtCompIO", quietly = TRUE)) {
  dataLearn <- list(
    var1 = as.character(c(rnorm(50, -2, 0.8), rnorm(50, 2, 0.8))),
    var2 = as.character(c(rnorm(50, 2), rpois(50, 8)))
  )
  model <- list(
    var1 = list(type = "Gaussian", paramStr = ""),
    var2 = list(type = "Poisson", paramStr = "")
  )
  algo <- list(
    nClass = 2, nInd = 100, nbBurnInIter = 100, nbIter = 100,
    nbGibbsBurnInIter = 100, nbGibbsIter = 100, nInitPerClass = 3,
    nSemTry = 20, confidenceLevel = 0.95, ratioStableCriterion = 0.95,
    nStableCriterion = 10, mode = "learn"
  )
  resLearn <- RMixtCompIO::rmcMultiRun(algo, dataLearn, model, nRun = 3)

  # plot
  plotDataCI(resLearn, "var1")
}
Barplot of the discriminative power of the classes
plotDiscrimClass(output, ylim = c(0, 1), pkg = c("ggplot2", "plotly"), ...)
output |
object returned by mixtCompLearn function from RMixtComp or rmcMultiRun function from RMixtCompIO |
ylim |
vector of length 2 defining the range of y-axis |
pkg |
"ggplot2" or "plotly". Package used to plot |
... |
arguments to be passed to plot_ly |
The discriminative power of class k is defined by 1 - D(k)
Matthieu MARBAC
Other plot: heatmapClass(), heatmapTikSorted(), heatmapVar(), histMisclassif(), plot.MixtComp(), plotConvergence(), plotDataBoxplot(), plotDataCI(), plotDiscrimVar(), plotParamConvergence(), plotProportion()
if (requireNamespace("RMixtCompIO", quietly = TRUE)) {
  dataLearn <- list(
    var1 = as.character(c(rnorm(50, -2, 0.8), rnorm(50, 2, 0.8))),
    var2 = as.character(c(rnorm(50, 2), rpois(50, 8)))
  )
  model <- list(
    var1 = list(type = "Gaussian", paramStr = ""),
    var2 = list(type = "Poisson", paramStr = "")
  )
  algo <- list(
    nClass = 2, nInd = 100, nbBurnInIter = 100, nbIter = 100,
    nbGibbsBurnInIter = 100, nbGibbsIter = 100, nInitPerClass = 3,
    nSemTry = 20, confidenceLevel = 0.95, ratioStableCriterion = 0.95,
    nStableCriterion = 10, mode = "learn"
  )
  resLearn <- RMixtCompIO::rmcMultiRun(algo, dataLearn, model, nRun = 3)

  plotDiscrimClass(resLearn)
}
Barplot of the discriminative power of the variables
plotDiscrimVar(
  output,
  class = NULL,
  ylim = c(0, 1),
  pkg = c("ggplot2", "plotly"),
  ...
)
output |
object returned by mixtCompLearn function from RMixtComp or rmcMultiRun function from RMixtCompIO |
class |
NULL or a class number. If NULL, return the discriminative power of the variables globally, otherwise return the discriminative power of the variables in the given class |
ylim |
vector of length 2 defining the range of y-axis |
pkg |
"ggplot2" or "plotly". Package used to plot |
... |
arguments to be passed to plot_ly |
The discriminative power of variable j is defined by 1 - C(j)
Matthieu MARBAC
Other plot: heatmapClass(), heatmapTikSorted(), heatmapVar(), histMisclassif(), plot.MixtComp(), plotConvergence(), plotDataBoxplot(), plotDataCI(), plotDiscrimClass(), plotParamConvergence(), plotProportion()
if (requireNamespace("RMixtCompIO", quietly = TRUE)) {
  dataLearn <- list(
    var1 = as.character(c(rnorm(50, -2, 0.8), rnorm(50, 2, 0.8))),
    var2 = as.character(c(rnorm(50, 2), rpois(50, 8)))
  )
  model <- list(
    var1 = list(type = "Gaussian", paramStr = ""),
    var2 = list(type = "Poisson", paramStr = "")
  )
  algo <- list(
    nClass = 2, nInd = 100, nbBurnInIter = 100, nbIter = 100,
    nbGibbsBurnInIter = 100, nbGibbsIter = 100, nInitPerClass = 3,
    nSemTry = 20, confidenceLevel = 0.95, ratioStableCriterion = 0.95,
    nStableCriterion = 10, mode = "learn"
  )
  resLearn <- RMixtCompIO::rmcMultiRun(algo, dataLearn, model, nRun = 3)

  # plot
  plotDiscrimVar(resLearn)
  plotDiscrimVar(resLearn, class = 1)
}
Plot the evolution of estimated parameters after the burn-in phase.
plotParamConvergence(output, var, ...)
output |
object returned by mixtCompLearn function from RMixtComp or rmcMultiRun function from RMixtCompIO |
var |
name of the variable |
... |
graphical parameters |
Quentin Grimonprez
Other plot: heatmapClass(), heatmapTikSorted(), heatmapVar(), histMisclassif(), plot.MixtComp(), plotConvergence(), plotDataBoxplot(), plotDataCI(), plotDiscrimClass(), plotDiscrimVar(), plotProportion()
if (requireNamespace("RMixtCompIO", quietly = TRUE)) {
  dataLearn <- list(
    var1 = as.character(c(rnorm(50, -2, 0.8), rnorm(50, 2, 0.8))),
    var2 = as.character(c(rnorm(50, 2), rpois(50, 8)))
  )
  model <- list(
    var1 = list(type = "Gaussian", paramStr = ""),
    var2 = list(type = "Poisson", paramStr = "")
  )
  algo <- list(
    nClass = 2, nInd = 100, nbBurnInIter = 100, nbIter = 100,
    nbGibbsBurnInIter = 100, nbGibbsIter = 100, nInitPerClass = 3,
    nSemTry = 20, confidenceLevel = 0.95, ratioStableCriterion = 0.95,
    nStableCriterion = 10, mode = "learn"
  )
  resLearn <- RMixtCompIO::rmcMultiRun(algo, dataLearn, model, nRun = 3)

  # plot
  plotParamConvergence(resLearn, "var1")
  plotParamConvergence(resLearn, "var2")
}
Plot the mixture's proportions
plotProportion(output, pkg = c("ggplot2", "plotly"), ...)
output |
object returned by mixtCompLearn function from RMixtComp or rmcMultiRun function from RMixtCompIO |
pkg |
"ggplot2" or "plotly". Package used to plot |
... |
arguments to be passed to plot_ly |
Quentin Grimonprez
Other plot: heatmapClass(), heatmapTikSorted(), heatmapVar(), histMisclassif(), plot.MixtComp(), plotConvergence(), plotDataBoxplot(), plotDataCI(), plotDiscrimClass(), plotDiscrimVar(), plotParamConvergence()
if (requireNamespace("RMixtCompIO", quietly = TRUE)) {
  dataLearn <- list(
    var1 = as.character(c(rnorm(50, -2, 0.8), rnorm(50, 2, 0.8))),
    var2 = as.character(c(rnorm(50, 2), rpois(50, 8)))
  )
  model <- list(
    var1 = list(type = "Gaussian", paramStr = ""),
    var2 = list(type = "Poisson", paramStr = "")
  )
  algo <- list(
    nClass = 2, nInd = 100, nbBurnInIter = 100, nbIter = 100,
    nbGibbsBurnInIter = 100, nbGibbsIter = 100, nInitPerClass = 3,
    nSemTry = 20, confidenceLevel = 0.95, ratioStableCriterion = 0.95,
    nStableCriterion = 10, mode = "learn"
  )
  resLearn <- RMixtCompIO::rmcMultiRun(algo, dataLearn, model, nRun = 3)

  # plot
  plotProportion(resLearn)
}
Print a MixtComp object
## S3 method for class 'MixtComp'
print(x, nVarMaxToPrint = 5, ...)
x |
MixtComp object |
nVarMaxToPrint |
number of variables to display (including z_class) |
... |
parameter of the print function |
Quentin Grimonprez
See also: mixtCompLearn, mixtCompPredict
if (requireNamespace("RMixtCompIO", quietly = TRUE)) {
  dataLearn <- list(
    var1 = as.character(c(rnorm(50, -2, 0.8), rnorm(50, 2, 0.8))),
    var2 = as.character(c(rnorm(50, 2), rpois(50, 8)))
  )
  model <- list(
    var1 = list(type = "Gaussian", paramStr = ""),
    var2 = list(type = "Poisson", paramStr = "")
  )
  algo <- list(
    nClass = 2, nInd = 100, nbBurnInIter = 100, nbIter = 100,
    nbGibbsBurnInIter = 100, nbGibbsIter = 100, nInitPerClass = 3,
    nSemTry = 20, confidenceLevel = 0.95, ratioStableCriterion = 0.95,
    nStableCriterion = 10, mode = "learn"
  )
  resLearn <- RMixtCompIO::rmcMultiRun(algo, dataLearn, model, nRun = 3)

  print(resLearn)
}
Rename a categorical value
refactorCategorical(
  data,
  oldCateg = unique(data),
  newCateg = seq_along(oldCateg)
)
data |
matrix/data.frame/vector containing the data |
oldCateg |
vector containing categories to change |
newCateg |
vector containing new categorical values |
Data with new categorical values
Quentin Grimonprez
dat <- c("single", "married", "married", "divorced", "single") refactorCategorical(dat, c("single", "married", "divorced"), 1:3)
dat <- c("single", "married", "married", "divorced", "single") refactorCategorical(dat, c("single", "married", "divorced"), 1:3)
Summary of a MixtComp object
## S3 method for class 'MixtComp'
summary(object, ...)
object |
MixtComp object |
... |
Not used. |
Quentin Grimonprez
See also: mixtCompLearn, print.MixtComp
if (requireNamespace("RMixtCompIO", quietly = TRUE)) {
  dataLearn <- list(
    var1 = as.character(c(rnorm(50, -2, 0.8), rnorm(50, 2, 0.8))),
    var2 = as.character(c(rnorm(50, 2), rpois(50, 8)))
  )
  model <- list(
    var1 = list(type = "Gaussian", paramStr = ""),
    var2 = list(type = "Poisson", paramStr = "")
  )
  algo <- list(
    nClass = 2, nInd = 100, nbBurnInIter = 100, nbIter = 100,
    nbGibbsBurnInIter = 100, nbGibbsIter = 100, nInitPerClass = 3,
    nSemTry = 20, confidenceLevel = 0.95, ratioStableCriterion = 0.95,
    nStableCriterion = 10, mode = "learn"
  )
  resLearn <- RMixtCompIO::rmcMultiRun(algo, dataLearn, model, nRun = 3)

  summary(resLearn)
}