Release notes for version 1.8.0 of the Engine, introducing new explainable AI features and improved batch prediction.
We are excited to announce the latest release of the AI & Analytics Engine: 1.8.0. This release improves the insights and accessibility of supervised machine learning predictive models. The PI.EXCHANGE team is aware of how important explainable AI is for users to confidently adopt AI-powered tools, so we are helping users gain more insight into how different features affect a model's predictions by introducing feature importance and prediction explanation. In addition, the model summary tab has been updated so that users can easily find the available tools for model evaluation and prediction consumption. Finally, thanks to technical improvements to models, batch prediction is now available for any model without the need for an active deployment.
Feature importance
A useful step in sense-checking a machine learning model is to examine the factors contributing to its predictions. In the Engine, users can generate feature importance scores for any successfully trained supervised machine learning model (regression or classification). For models that predict a label from a list of labels (multi-class classification), the feature importance scores differ between labels.
By default, up to 5 features are displayed. To see the scores of the remaining features, click “Show all”.
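The release notes don't specify how the Engine computes these scores; one widely used, model-agnostic technique is permutation importance, which measures how much a model's error grows when a single feature's values are shuffled. The sketch below illustrates the idea in plain Python with a toy linear model and made-up data (both are assumptions for illustration, not the Engine's internals):

```python
from itertools import permutations
from statistics import mean

# Toy model (illustrative): predicted price = 3 * size + 1 * rooms,
# so "size" should come out as the more important feature.
def predict(row):
    return 3 * row["size"] + 1 * row["rooms"]

data = [{"size": s, "rooms": r} for s, r in [(1, 2), (2, 1), (3, 3), (4, 2)]]
targets = [predict(row) for row in data]  # the model fits this data exactly

def mse(rows):
    return mean((predict(r) - t) ** 2 for r, t in zip(rows, targets))

def permutation_importance(feature):
    """Average increase in error when one feature's values are shuffled."""
    base = mse(data)
    values = [r[feature] for r in data]
    increases = []
    for perm in permutations(values):  # exhaustive for a tiny dataset
        shuffled = [dict(r, **{feature: v}) for r, v in zip(data, perm)]
        increases.append(mse(shuffled) - base)
    return mean(increases)

# "size" has the larger coefficient, so permuting it degrades error more
print(permutation_importance("size") > permutation_importance("rooms"))
```

Because shuffling "size" disrupts the stronger relationship, its importance score is higher, matching the intuition the Engine's feature importance chart conveys.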
Prediction explanation
In addition to the overall impact score of each feature, users can now investigate how different values of a feature affect a model's prediction using the prediction explanation feature. For multi-class classification models, prediction explanations differ between labels, so a separate explanation is generated for each label.
Prediction explanation examples
Users can view some prediction explanation examples from the data used to build predictive models.
Live prediction explanation in what-if analysis tool
Users can also generate prediction explanations as they experiment with different input value combinations using the what-if analysis tool.
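The notes don't say which explanation method the Engine uses; a common family of approaches (SHAP-style additive attributions) assigns each feature a contribution such that the contributions sum to the difference between the model's prediction and a baseline prediction. For a linear model that decomposition is exact, which makes for a compact sketch; the model, coefficients, and data below are illustrative assumptions:

```python
from statistics import mean

# Illustrative linear scoring model, not the Engine's actual model
COEFS = {"age": 0.04, "income": 0.00002}
INTERCEPT = -2.0

def score(row):
    return INTERCEPT + sum(c * row[f] for f, c in COEFS.items())

# Baseline: the average feature values of some background data
background = [{"age": 30, "income": 50000}, {"age": 50, "income": 90000}]
baseline = {f: mean(r[f] for r in background) for f in COEFS}

def explain(row):
    """Per-feature contribution vs. the baseline (exact for a linear model)."""
    return {f: COEFS[f] * (row[f] - baseline[f]) for f in COEFS}

person = {"age": 45, "income": 60000}
contributions = explain(person)
# The contributions add up to the prediction's distance from the baseline
assert abs(sum(contributions.values()) - (score(person) - score(baseline))) < 1e-9

# What-if analysis: tweak one input and compare the resulting scores
what_if = dict(person, age=25)
delta = score(what_if) - score(person)  # negative: lowering age lowers the score
```

For a multi-class model, the same computation would be repeated once per label's score, which is why the Engine generates one explanation per label.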
Model summary tab
Since each model offers multiple tools for examining insights and generating predictions, a model summary tab has been developed to guide users to the next steps in their journey after a model trains successfully.
Batch prediction without an active deployment
We have made technical improvements so that users can now use the batch prediction feature to upload a file and generate predictions without deploying the model. The batch prediction feature can be found in the Prediction tab on the model details page.
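Conceptually, batch prediction without a deployment means scoring an uploaded file in-process rather than sending each row to a live endpoint. A minimal sketch of that idea, assuming a hypothetical in-memory model and an in-memory CSV standing in for the uploaded file:

```python
import csv
import io

# Hypothetical in-process model; the Engine's internals are not public
def predict(row):
    return "high" if float(row["income"]) > 70000 else "low"

# The "uploaded file": an in-memory CSV used here instead of a real upload
uploaded = io.StringIO("id,income\n1,85000\n2,42000\n3,71000\n")

# Batch prediction: score every row directly, no deployed endpoint involved
reader = csv.DictReader(uploaded)
results = [{"id": row["id"], "prediction": predict(row)} for row in reader]

# Write the predictions back out as a downloadable CSV
out = io.StringIO()
writer = csv.DictWriter(out, fieldnames=["id", "prediction"])
writer.writeheader()
writer.writerows(results)
```

The trade-off is latency model: a deployment answers individual requests on demand, while batch prediction processes a whole file in one pass, which is why it no longer requires a running endpoint.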