
Outcomes in WEB and R don't match #94

Open
tonisoto opened this issue Sep 10, 2020 · 11 comments
Labels
bug Something isn't working question Further information is requested

Comments

@tonisoto

By chance I noticed that the outcomes of a paired (X and Y) mean difference that I ran in R (dabestr v0.3.0) and those obtained after copying and pasting the same data into https://www.estimationstats.com/#/analyze/paired are quite different.

In R: Paired mean difference of X (n = 202) minus Y (n = 202): 0.161 [95%CI -0.371; 0.713]
In WEB: The paired mean difference between X and Y is 0.161 [95.0%CI -0.0476, 0.367]

In both cases 202 paired observations were used (I attached the csv file). On the web, both variables (X, Y) were uploaded in distinct columns as expected, and in R I prepared a tidy dataset as explained here.

As you can see the limits of the 95%CI are very different and I don't understand why. Which one is the correct outcome?


By the way, I wonder to what extent the length of the 95%CI can be used to assess whether the scores of my two variables (X, Y) are 'equivalent', as is done in TOST analysis (e.g. the TOSTER R package). Suppose I define that two scores in my research field are equivalent if their mean paired difference falls inside -0.5 and +0.5. If all of the above is correct, the web outcome [95.0%CI -0.0476, 0.367] would support equivalence, but the R outcome would not. Am I correct to use the 95%CI from dabestr to assess equivalence, or do I have to run specific tests for this?
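The equivalence check described above (declare equivalence when the CI of the mean paired difference lies entirely inside the equivalence bounds) can be sketched in Python. This is purely illustrative and not part of dabestr or TOSTER: the function name and the synthetic data are invented here, and the ±0.5 bounds come from the question. Note that TOST at a 5% level conventionally uses a 90% CI, not a 95% one.

```python
import numpy as np

rng = np.random.default_rng(42)

def equivalence_from_bootstrap(x, y, low=-0.5, high=0.5, n_boot=5000, alpha=0.05):
    """TOST-style equivalence via a percentile bootstrap CI.

    Declares equivalence when the (1 - 2*alpha) CI of the mean paired
    difference lies strictly inside [low, high]. Resamples the per-pair
    differences, which is what a paired analysis should do."""
    diffs = np.asarray(y) - np.asarray(x)
    n = len(diffs)
    # Each bootstrap replicate resamples the n paired differences with replacement.
    idx = rng.integers(0, n, size=(n_boot, n))
    boot_means = diffs[idx].mean(axis=1)
    lo, hi = np.percentile(boot_means, [100 * alpha, 100 * (1 - alpha)])
    return (low < lo) and (hi < high), (lo, hi)

# Synthetic paired data with a small true difference of ~0.1.
x = rng.normal(5.0, 1.0, size=50)
y = x + rng.normal(0.1, 0.2, size=50)
equivalent, ci = equivalence_from_bootstrap(x, y)
print(equivalent, ci)
```

Whether such a check supports equivalence obviously hinges on which of the two CIs reported above is the correct one, which is the subject of this issue.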

Thank you so much for your software. As soon as this issue is solved, I'm going to use it in my next paper. 👍

Best regards from Spain!

202_X_Y_paired_data.zip

@josesho josesho added question Further information is requested bug Something isn't working labels Sep 11, 2020
@josesho
Member

josesho commented Sep 11, 2020

Hi @tonisoto, thanks for raising this issue; I can replicate it. Let me pinpoint what's going wrong and get back ASAP!

@tonisoto
Author

tonisoto commented Sep 11, 2020 via email

@francescastarita

francescastarita commented Jan 25, 2021

Hi @josesho and @tonisoto,

I wondered if this was solved as I have the same problem.

In R: Paired mean difference of CS+ (n = 30) minus CS- (n = 30): 0.207 [95%CI -2.04; 2.59]
In WEB: Paired mean difference of CS+ (n = 30) minus CS- (n = 30): 0.207 [95.0%CI 0.0687, 0.388]

This is the raw data:
CS- | CS+
8.33 | 8.57
1.30 | 1.26
6.57 | 6.57
7.11 | 7.64
7.16 | 7.46
6.69 | 6.44
4.77 | 4.72
3.21 | 3.46
13.17 | 13.17
5.33 | 5.71
3.04 | 2.66
11.74 | 11.64
3.34 | 3.58
4.39 | 3.88
2.90 | 2.84
6.98 | 6.83
7.84 | 8.59
18.94 | 19.86
19.91 | 21.43
8.42 | 8.88
7.41 | 6.82
11.85 | 12.82
4.99 | 5.31
3.51 | 3.79
10.17 | 10.29
6.48 | 6.62
5.95 | 5.85
3.67 | 3.95
2.90 | 3.13
5.50 | 6.02

Thanks,
Francesca

@josesho
Member

josesho commented Feb 7, 2021

There is a bug in the way the confidence intervals for paired differences are computed. It seems that the CIs for unpaired differences are currently being returned instead.

Specifically, it appears to be related to how stratified resampling is performed to ensure that we can resample from groups with different Ns.

This is not an issue for UNpaired differences, of course, but it looks like some refactoring is needed to handle paired differences. Alternatively, if anyone can point me to how to handle bootstrapping with the boot package for groups with non-matching Ns, that might also work!

I apologise for this inconvenience!

edit: fixed the typo pointed out below.
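To make the distinction concrete, here is a minimal Python sketch (dabestr itself is R; the function names here are illustrative, not dabestr's API) of the two resampling schemes. A paired bootstrap must resample the (x, y) pairs jointly, i.e. resample the per-subject differences; resampling each group independently discards the within-pair correlation and, for strongly correlated pairs, yields a much wider CI, which is consistent with the mismatches reported above.

```python
import numpy as np

rng = np.random.default_rng(0)

def paired_bootstrap_ci(x, y, n_boot=5000, alpha=0.05):
    """Percentile CI for the mean paired difference:
    resample the per-subject differences (pairs stay intact)."""
    diffs = y - x
    n = len(diffs)
    idx = rng.integers(0, n, size=(n_boot, n))
    boot_means = diffs[idx].mean(axis=1)
    lo, hi = np.percentile(boot_means, [100 * alpha / 2, 100 * (1 - alpha / 2)])
    return lo, hi

def unpaired_bootstrap_ci(x, y, n_boot=5000, alpha=0.05):
    """Percentile CI for the difference of means:
    resample each group independently, ignoring the pairing."""
    n, m = len(x), len(y)
    bx = x[rng.integers(0, n, size=(n_boot, n))].mean(axis=1)
    by = y[rng.integers(0, m, size=(n_boot, m))].mean(axis=1)
    boot_means = by - bx
    lo, hi = np.percentile(boot_means, [100 * alpha / 2, 100 * (1 - alpha / 2)])
    return lo, hi

# Strongly correlated paired data: between-subject spread (sd 3) dwarfs the
# within-pair noise (sd 0.3), so the paired CI is far narrower.
x = rng.normal(10, 3, size=30)
y = x + rng.normal(0.2, 0.3, size=30)
print("paired:  ", paired_bootstrap_ci(x, y))
print("unpaired:", unpaired_bootstrap_ci(x, y))
```

The large discrepancy in CI width between the two schemes on correlated data matches the pattern in the R-vs-web comparisons above.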

@francescastarita

Thank you for the clarification! Unfortunately I am not able to help with the bug.

Just a couple of other questions:

So if I use the website, I should be fine, shouldn't I? I am asking because I have used it to make the figures for 2 recently published papers, and I was planning to use it for future papers, so I wanted to make sure I got correct outcomes. I have also looked at the MATLAB version of the code, but there doesn't seem to be a way to calculate CIs for multiple paired groups.

"This is not an issue for paired, of course, ..." Do you mean it is not an issue for UNpaired?

Thanks again

@josesho
Member

josesho commented Feb 9, 2021

You should be fine if you use the web app, yes!

Also, yes, I meant unpaired, thanks for pointing it out! Have edited the typo.

@josesho
Member

josesho commented Jun 24, 2021

cf #107

@josesho
Member

josesho commented Jul 26, 2021

Update:

The dev version v0.3.9999 fixes this, and can be installed with

devtools::install_github("ACCLAB/dabestr",  ref = "v0.3.9999")

cf #99

@mlotinga

mlotinga commented Jun 7, 2024

It would be really great if this bugfix could be pushed into a release. Currently the release version should not be used for repeated-measures analysis, while the bugfixed dev version (0.3.9999) lags behind in many other respects. Thanks for your help!

@mlotinga

mlotinga commented Jun 7, 2024

Or is it believed that this has been addressed in the latest released version? My analysis shows this is not the case with v2023.9.12. Using the same dataset:

Mean diff v2023.9.12: -0.315 [-0.617 -0.0206]
Estimation tools (web): -0.315 [-0.512, -0.116]

The dev v0.3.9999 95% CIs match the web app, not v2023.9.12.

@mlotinga
Copy link

mlotinga commented Jun 7, 2024

I've also tried installing the dev branch (version _2023.9.12) with

devtools::install_github("ACCLAB/dabestr", ref = "dev")

but, while this addresses many of the updates missing from v0.3.9999, it still gives me the same (incorrect) result as the released v2023.9.12 for the paired-differences analysis:

Mean diff 'dev' v_2023.9.12: -0.315 [-0.617 -0.0206]
Estimation tools (web): -0.315 [-0.512, -0.116]
