
ATLAS attrition inconsistency?

I have a cohort with three rules and another with the same three and a fourth. Is it reasonable to expect the attrition tables to match for the three rules they have in common?

As a counterexample: you would not expect a table like the one below, right?
                   3-rule cohort   4-rule cohort
Qualified events   10,000          10,000
Condition A        5,000           5,000
Condition B        4,000           3,990
Measurement C      3,500           3,475
Measurement D      n/a             2,000

I’ve dug into the generated SQL for the larger cohort, trimmed it down to match the smaller one, and have not been able to reproduce what I see in the table. Dissecting the generated code and removing the last rule produces consistent results.

How similar is the generated code to what the attrition table uses?

Thanks
-Chris

The intersection report and the attrition report are built from the same core dataset: the counts of entry events stratified by which inclusion rules they matched. I.e., given this dataset:

bitString count
1000 5
1100 10
1011 20
0101 25
1111 30

Note: the last row represents people who met all 4 rules (1111).

So, the table on the intersect view goes through each bit and tells you how many people met the indicated rule: rule 1 has 5+10+20+30 = 65, rule 2 has 10+25+30 = 65, rule 3 has 20+30 = 50, etc.
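
To make that concrete, here’s a rough sketch of that per-rule tally in TypeScript. This is not the actual Atlas/WebAPI code; the RuleStratum type and the strata array are just stand-ins for the dataset above:

```typescript
// Hypothetical stand-in for the stratified dataset above; not the actual
// Atlas/WebAPI data structures.
interface RuleStratum {
  bitString: string; // one character per inclusion rule, '1' = rule met
  count: number;     // entry events with exactly this combination of rules
}

const strata: RuleStratum[] = [
  { bitString: "1000", count: 5 },
  { bitString: "1100", count: 10 },
  { bitString: "1011", count: 20 },
  { bitString: "0101", count: 25 },
  { bitString: "1111", count: 30 },
];

// Intersection report: for each rule, sum the counts of every stratum
// whose bit for that rule is '1'.
function intersectionCounts(data: RuleStratum[], ruleCount: number): number[] {
  return Array.from({ length: ruleCount }, (_, rule) =>
    data
      .filter(s => s.bitString[rule] === "1")
      .reduce((sum, s) => sum + s.count, 0)
  );
}

console.log(intersectionCounts(strata, 4)); // [65, 65, 50, 75]
```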

The attrition table works slightly differently, but off the same dataset:
Row 0 is 100% of entry events.
Row 1 is people with bit strings starting with ‘1’: 5+10+20+30 = 65.
Row 2 is people with bit strings starting with ‘11’: 10+30 = 40.
Row 3 is people with bit strings starting with ‘111’: 30.
Row 4 is people with bit strings starting with ‘1111’: 30.

That’s how those tables are calculated: it’s a client side calculation, so you won’t find it stored in the database that way.
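
The attrition rows can be sketched the same way, reusing the hypothetical RuleStratum type and strata array from the snippet above (again, illustrative only, not the actual client code):

```typescript
// Attrition report: row k counts entry events whose bit string starts with
// k leading '1's (i.e., rules 1..k were all met), using the same strata as above.
function attritionCounts(data: RuleStratum[], ruleCount: number): number[] {
  return Array.from({ length: ruleCount }, (_, k) => {
    const prefix = "1".repeat(k + 1);
    return data
      .filter(s => s.bitString.startsWith(prefix))
      .reduce((sum, s) => sum + s.count, 0);
  });
}

console.log(attritionCounts(strata, 4)); // [65, 40, 30, 30]
```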

Thanks @Chris_Knoll. I was able to modify the generated script so that it does not delete the qualified_events and inclusion_events tables, and then do as you suggested above, but in SQL. That much works out fine.

[edited to remove request for JS code]
