Thank you very much to those of you who have expressed interest in the IUD study. We have decided to perform two sets of estimation analyses. The first set compares copper intrauterine device (Cu-IUD) to levonorgestrel intrauterine system (LNG-IUS) users with cohort definitions that have been designed for an EHR database. The PLE package for that analysis can be found here: http://atlas-demo.ohdsi.org/#/estimation/cca/262
Cohorts #1772370-75 and #1772392-95 are additional cohorts for sensitivity analyses.
The second analysis compares Cu-IUD to LNG-IUS users with definitions that have been designed for a claims database. The PLE package for that analysis can be found here: http://atlas-demo.ohdsi.org/#/estimation/cca/230.
The outcome cohort is the same. Cohorts #1772814-23 are additional cohorts for sensitivity analyses.
We intend to distribute an R package soon that consists of the PLE packages and automated non-standard characterizations. Thank you very much for your help and please let me know if I can provide you with any additional information.
@mattspotnitz Would you mind if I upload this package to the OHDSI Study Git repo?
This is an OHDSI study (which means a multi-institutional study), and I want OHDSI study leaders to follow our best practices (making the code reproducible and traceable).
Hi @SCYou, You are right. It does belong there. I wanted to get it ready before moving it over, and I'm adding a few other analyses to the package. Hopefully, we'll be ready in a day or two. Afterwards, we'll definitely move it over to the OHDSI study repo. In the interim, you can find the repo here (we still need to update some of the new cohort definitions).
Guys… should we really be customizing definitions to the type of RWD we're running the study against? Doesn't that fly in the face of what OHDSI network studies are supposed to do?
We are recommending that all sites interested in participating try to run both sets of T/Cs. The original study was appropriate for the Columbia data, but on testing in a large US claims database it was determined not to classify exposure correctly, so alternative T/Cs were created, which, when tested back at Columbia, did not perform consistently with the original T/C definitions. So, we have prepared two packages as a means of performing a sensitivity analysis around the exposure definition, to evaluate the robustness of the findings.
Actually, the current ATLAS-generated PLE and PLP packages generate a table with the counts of the cohorts in the study, so we can share this file first, before running the rest of the package.
Now, thinking out loud: we could set up a study package run with createCohorts = TRUE and everything else set to FALSE. I've never thought to do this, but it would be a very simple feasibility test.
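For what it's worth, a cohorts-only feasibility run might look like the sketch below. The package name (`IUDStudy`) and the connection details are placeholders, and the `execute()` argument names follow the usual ATLAS-exported study skeleton; check the actual package's `execute()` signature before running.

```r
# Cohorts-only feasibility run: build the cohorts, skip all analyses.
# NOTE: package name, connection info, and schemas below are placeholders.
library(DatabaseConnector)

connectionDetails <- createConnectionDetails(
  dbms     = "postgresql",        # your CDM platform
  server   = "localhost/ohdsi",   # placeholder connection info
  user     = "user",
  password = "secret"
)

IUDStudy::execute(
  connectionDetails    = connectionDetails,
  cdmDatabaseSchema    = "cdm",       # schema holding the OMOP CDM tables
  cohortDatabaseSchema = "results",   # writable schema for cohort tables
  cohortTable          = "iud_cohorts",
  outputFolder         = "output",
  createCohorts        = TRUE,        # the only step we actually run
  runAnalyses          = FALSE,
  packageResults       = FALSE
)
```

Afterwards the generated cohort count table could be shared as a quick go/no-go check before any site commits to the full run.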
I do agree that it would be useful to have an easier way to do feasibility. When you receive just a study ZIP (preferably in a GitHub repo), there is a bit of a scavenger hunt to find the cohorts and throw them into ATLAS for a quick-and-dirty feasibility test. It gets more manageable as you become more familiar with the study package format.
You're so right. A lot of challenges arise from small samples. Heck, I didn't even know you could somehow screw up packaging your study cohorts until yesterday. It may also have been a hazard of that investigator designing in a very active public spot that anyone can muck with.
Other challenges arise from highly correlated covariates, often the result of not going up the ancestor-descendant relationships to find the most comprehensive ancestor to put into the list. But you never really find this out until you're far into running the package. I wonder if there's a way to do a correlation test before a full study is run?
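A pre-flight check along these lines could be as simple as the base-R toy below. Here `covMatrix` stands in for a subjects-by-covariates 0/1 indicator matrix extracted from the CDM (hypothetical); in a real package you would build it from the extracted covariate data.

```r
# Toy pre-flight check: flag covariate pairs whose correlation exceeds a
# threshold before launching the full study.
flagCorrelated <- function(covMatrix, threshold = 0.9) {
  corMat <- cor(covMatrix)
  corMat[lower.tri(corMat, diag = TRUE)] <- 0   # keep each pair only once
  which(abs(corMat) > threshold, arr.ind = TRUE)
}

# Example: two duplicate indicator columns get flagged
set.seed(1)
x <- rbinom(100, 1, 0.5)
covMatrix <- cbind(a = x, b = x, c = rbinom(100, 1, 0.5))
flagCorrelated(covMatrix)   # flags the (a, b) pair
```

This obviously doesn't replace proper diagnostics, but it is cheap enough to run on the cohort covariates alone, before any outcome model is fit.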
Ok. I'll stop for now. I wholeheartedly support any way to make evaluating T/C/O feasibility more straightforward.
@mattspotnitz Well done! @zhuk and I have some experience with a hysteroscopic contraceptive device study, so let me add my 2 cents here.
In the "Cu-IUD" cohort for EHR data, you're using an "IUD Placement Procedure" concept set defined as 4275113 "Insertion of intrauterine contraceptive device" including all the descendants. The problem is that some of the descendants look messy to me:
It looks like the 1st one should be excluded from here or even added to the C-cohort.
I have some concerns about the 2nd. Isn't it used for the hormone-releasing method from time to time? Link and screenshot:
Even though these concept sets are validated on particular datasets, or you're excluding the hormone-releasing methods with inclusion criteria, I guess they should be transparent in terms of a network study. What do you think?
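One way to make this transparent up front is to audit the full descendant list before shipping the concept set. A sketch using DatabaseConnector against the standard OMOP vocabulary tables, assuming an open `connection` and the vocabulary in a schema named `cdm` (both placeholders):

```r
# List every descendant that "include descendants" on 4275113 will pull in,
# so off-target procedures (e.g. hormone-releasing methods) are visible
# before the concept set goes into a network study.
library(DatabaseConnector)

sql <- "
  SELECT c.concept_id, c.concept_name, c.domain_id
  FROM cdm.concept_ancestor ca
  JOIN cdm.concept c
    ON c.concept_id = ca.descendant_concept_id
  WHERE ca.ancestor_concept_id = 4275113;
"
descendants <- querySql(connection, sql)
print(descendants)  # review each row for concepts to exclude
```

Attaching a table like this to the study documentation would let every site see exactly which procedure codes the exposure definition captures on their vocabulary version.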
We have code ready for the network study and just created a PR to the ohdsi-studies repo, so it should be there soon. I really liked the cohort diagnostics that @schuemie and @Patrick_Ryan developed during the Barcelona study. We didn't add it to the package since it's under development, but we will leverage it for potentially debugging issues when sites run the study.
If anyone wants to try running the study package before it gets to the ohdsi-studies repo, they can go to this repo to test things out. Thanks!