All functions

An Activation Function: Softmax |
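
The softmax activation maps a vector of raw scores to a probability distribution over classes. A minimal NumPy sketch of the idea (illustrative only; the function name `softmax` here is not the package's API):

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax: shift by the max before exponentiating."""
    z = np.asarray(x, dtype=float)
    e = np.exp(z - z.max())   # subtracting the max guards against overflow
    return e / e.sum()
```

The output sums to 1, and the largest input receives the largest probability.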
|
The Comparison Data (CD) Approach
|
The Comparison Data Forest (CDF) Approach
|
Check and Install Python Libraries (numpy and onnxruntime) |
|
25 Personality Items Representing 5 Factors |
|
20-item Dependency-Oriented and Achievement-Oriented Psychological Control Scale (DAPCS) |
|
Subset Dataset for Training the Deep Neural Network (DNN) |
|
Subset Dataset for Training the Long Short-Term Memory (LSTM) Network
|
The Scaler for the pre-trained Deep Neural Network (DNN)
|
The Scaler for the pre-trained Long Short-Term Memory (LSTM) Network
|
Hierarchical Clustering for EFA |
|
Various Indices in EFA
|
K-means for EFA |
|
Scree Plot |
|
Simulate Data That Conforms to the Theory of Exploratory Factor Analysis
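
Under the orthogonal common factor model, observed scores decompose as X = FΛᵀ + E: factor scores mixed through a loading matrix, plus unique noise. A minimal NumPy sketch with a simple-structure Λ (all names and defaults here are illustrative assumptions, not the package's simulation routine):

```python
import numpy as np

def simulate_factor_data(n=500, n_factors=5, items_per_factor=5,
                         loading=0.7, seed=0):
    """Simulate data from a simple-structure orthogonal factor model:
    each item loads `loading` on exactly one factor, plus unique noise
    scaled so each item has unit variance."""
    rng = np.random.default_rng(seed)
    p = n_factors * items_per_factor
    # Simple-structure loading matrix (p x n_factors)
    L = np.zeros((p, n_factors))
    for f in range(n_factors):
        L[f * items_per_factor:(f + 1) * items_per_factor, f] = loading
    F = rng.standard_normal((n, n_factors))      # orthogonal factor scores
    uniq = np.sqrt(1.0 - loading ** 2)           # unique standard deviation
    E = rng.standard_normal((n, p)) * uniq       # unique parts
    return F @ L.T + E
```

With 5 factors and 5 items each, the correlation matrix of the simulated data should show 5 dominant eigenvalues.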
|
Voting Method for Number of Factors in EFA |
|
Empirical Kaiser Criterion |
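
The Empirical Kaiser Criterion (Braeken & van Assen, 2017) compares each sample eigenvalue to a reference value derived from random-matrix theory, with the reference shrinking as earlier eigenvalues absorb variance and floored at 1. A hedged NumPy sketch of one common formulation (the function name `ekc` is illustrative):

```python
import numpy as np

def ekc(data):
    """Empirical Kaiser Criterion: count the leading eigenvalues of the
    correlation matrix that exceed their data-driven reference values."""
    X = np.asarray(data, dtype=float)
    n, p = X.shape
    eigvals = np.sort(np.linalg.eigvalsh(np.corrcoef(X, rowvar=False)))[::-1]
    refs = np.empty(p)
    cum = 0.0   # variance absorbed by earlier eigenvalues
    for j in range(p):
        ref = (1 + np.sqrt(p / n)) ** 2 * (p - cum) / (p - j)
        refs[j] = max(ref, 1.0)   # never below the Kaiser threshold of 1
        cum += eigvals[j]
    k = 0
    while k < p and eigvals[k] > refs[k]:
        k += 1
    return k
```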
|
Extracting Features According to Goretzko & Bühner (2020)
|
Extracting Features for the pre-trained Neural Networks for Determining the Number of Factors
|
Factor Analysis by Principal Axis Factoring |
|
Factor Forest (FF) Powered by a Tuned XGBoost Model for Determining the Number of Factors
|
Simulating Data Following John Ruscio's RGenData |
|
The Hull Approach
|
Kaiser-Guttman Criterion |
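
The Kaiser-Guttman rule retains as many factors as there are eigenvalues of the correlation matrix greater than 1. In NumPy terms (an illustrative sketch, not the package's API):

```python
import numpy as np

def kaiser_guttman(data):
    """Kaiser-Guttman criterion: number of correlation-matrix
    eigenvalues greater than 1."""
    R = np.corrcoef(np.asarray(data, dtype=float), rowvar=False)
    return int(np.sum(np.linalg.eigvalsh(R) > 1.0))
```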
|
Load the pre-trained Neural Networks for Determining the Number of Factors
|
Load the Scaler for the pre-trained Neural Networks for Determining the Number of Factors |
|
Load the Tuned XGBoost Model |
|
Minimum Average Partial (MAP) Test |
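
Velicer's MAP test partials successive principal components out of the correlation matrix and tracks the average squared partial correlation; the component count at the minimum is retained. A sketch of the original (squared-partial) version, assuming a correlation matrix as input (names are illustrative):

```python
import numpy as np

def map_test(R, max_factors=None):
    """Velicer's Minimum Average Partial test: return the number of
    components minimizing the average squared partial correlation."""
    R = np.asarray(R, dtype=float)
    p = R.shape[0]
    eigvals, eigvecs = np.linalg.eigh(R)
    order = np.argsort(eigvals)[::-1]                # sort descending
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    loadings = eigvecs * np.sqrt(eigvals)            # component loadings
    if max_factors is None:
        max_factors = p - 1
    off = ~np.eye(p, dtype=bool)                     # off-diagonal mask
    avg_sq = [np.mean(R[off] ** 2)]                  # m = 0: raw correlations
    for m in range(1, max_factors):
        A = loadings[:, :m]
        C = R - A @ A.T                              # partial covariance
        d = np.sqrt(np.diag(C))
        partial = C / np.outer(d, d)                 # partial correlations
        avg_sq.append(np.mean(partial[off] ** 2))
    return int(np.argmin(avg_sq))                    # m = 0 means no factors
```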
|
The Tuned XGBoost Model for Determining the Number of Factors
|
The pre-trained Neural Networks for Determining the Number of Factors
|
Feature Normalization for the pre-trained Neural Networks for Determining the Number of Factors |
|
Parallel Analysis |
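
Horn's parallel analysis compares the observed eigenvalues with eigenvalues of random normal data of the same dimensions, retaining factors whose eigenvalues exceed a chosen quantile of the simulated distribution. A compact NumPy sketch (function name and defaults are illustrative assumptions):

```python
import numpy as np

def parallel_analysis(data, n_sims=100, quantile=0.95, seed=0):
    """Horn's parallel analysis: count leading observed eigenvalues that
    exceed the simulated-eigenvalue quantile at the same position."""
    X = np.asarray(data, dtype=float)
    n, p = X.shape
    obs = np.sort(np.linalg.eigvalsh(np.corrcoef(X, rowvar=False)))[::-1]
    rng = np.random.default_rng(seed)
    sim = np.empty((n_sims, p))
    for s in range(n_sims):
        Z = rng.standard_normal((n, p))              # random data, same shape
        sim[s] = np.sort(np.linalg.eigvalsh(np.corrcoef(Z, rowvar=False)))[::-1]
    thresh = np.quantile(sim, quantile, axis=0)
    k = 0
    while k < p and obs[k] > thresh[k]:
        k += 1
    return k
```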
|
Plot Methods

Prediction Function for the Tuned XGBoost Model with Early Stopping

Print Methods

Scree Test Optimal Coordinate (STOC)
|