
Predictions outputs "?" for regression error values while a regular numeric value is shown in its own visualization/dialog box #6911

Open
wvdvegte opened this issue Oct 10, 2024 · 1 comment
Labels
bug report Bug is reported by user, not yet confirmed by the core team

Comments

@wvdvegte
What's wrong?
How can we reproduce the problem?
In the attached workflow, which uses user reviews from Kaggle (included), open Select Columns and drag 3 of the 4 features to 'Ignored' so that 'compound' is the only remaining feature. For several rows, Predictions then produces a "?" on its output for 'Linear Regression (error)', while a regular numeric value is shown in its own visualization/dialog box. For instance, for row 347 the error is shown as 2 in the Predictions visualization/dialog box, but Data Table (1) shows a "?" instead.

Review Mining.zip

What's your environment?

  • Operating system: Mac OS Sequoia
  • Orange version: 3.37.0
  • How you installed Orange: from dmg; updates through add-ons menu
@wvdvegte wvdvegte added the bug report Bug is reported by user, not yet confirmed by the core team label Oct 10, 2024
@processo
I can confirm this. Whenever the Absolute difference output of Predictions would be exactly 2, it is turned into missing ("?"), regardless of the model or data.
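The symptom above (one specific numeric value consistently displayed as missing) is what you would see if, somewhere in the output path, a sentinel value were used to mark "unknown" instead of NaN. This is a hypothesis, not Orange's actual code; the sketch below only illustrates how such a sentinel collision reproduces the reported behavior, with the sentinel value, function names, and display logic all invented for the illustration:

```python
import numpy as np

# Hypothetical sentinel equal to the value observed in the report.
SENTINEL = 2.0

def absolute_error(actual, predicted):
    # Plain absolute difference between target and prediction.
    return np.abs(np.asarray(actual, dtype=float) - np.asarray(predicted, dtype=float))

def to_display(errors, sentinel=SENTINEL):
    # Buggy display step: a real error equal to the sentinel is
    # indistinguishable from missing and gets rendered as "?".
    return ["?" if np.isnan(e) or e == sentinel else f"{e:g}" for e in errors]

errors = absolute_error([5.0, 3.0, 1.5], [3.0, 2.5, 1.5])
print(to_display(errors))  # prints ['?', '0.5', '0'] - the exact-2 error shows as "?"
```

If this is the mechanism, the fix would be to represent missing values with NaN (or a mask) throughout rather than a magic number, so that legitimate error values can never collide with the missing marker.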
