
EHR Dream Challenge: Patient Mortality

I thought this might be an interesting DREAM challenge to look at, so I'm sharing it here.

Challenge question: Of patients who have at least one hospital visit, can we predict who will pass away within 6 months of their last visit?

Challenge Data: Yes, the data is in OMOP :slight_smile:
The University of Washington is hosting a curated dataset from their Electronic Health Record (EHR) enterprise data warehouse for this challenge. The data spans 10 years, from 2009 to 2019, with the last death record in the available repository recorded in February 2019. The data represents 1.3 million patients who have at least one visit occurrence in their medical record. 22 million visits are spread across these patients and include 33 million procedures, 5 million drug exposure records, 48 million condition records, 10 million observations, and 221 million measurements. The data has been converted to the OMOP CDM.
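For anyone wanting to prototype before data access is sorted out, here is a minimal, purely illustrative sketch of how the challenge label (death within 6 months of the last visit) could be derived from CSV exports of the OMOP visit_occurrence and death tables. The file names and the 180-day window are my assumptions, not taken from the challenge documentation.

```python
# Hypothetical sketch: derive the challenge label (death within 6 months of the
# last visit) from CSV exports of the OMOP visit_occurrence and death tables.
# File names and the 180-day window are assumptions, not from the challenge docs.
import pandas as pd

visits = pd.read_csv("visit_occurrence.csv", parse_dates=["visit_start_date"])
deaths = pd.read_csv("death.csv", parse_dates=["death_date"])

# Last visit per patient
last_visit = visits.groupby("person_id")["visit_start_date"].max().rename("last_visit_date")

# Join with death records; patients with no death record get NaT
cohort = last_visit.to_frame().join(deaths.set_index("person_id")["death_date"], how="left")

# Label = 1 if the patient died within 180 days of their last visit
cohort["death_in_6_months"] = (
    (cohort["death_date"] - cohort["last_visit_date"]).dt.days.between(0, 180)
).astype(int)

print(cohort["death_in_6_months"].value_counts())
```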

Link to the DREAM challenge:
https://www.synapse.org/#!Synapse:syn18405991/wiki/589657

Regards

  • Rohit Vashisht
4 Likes

@rohitv, +1 on this! Let’s do it!! How do you want to mobilize? I’m happy to carve out time for a working session and get the juices flowing. I can set up a WebEx if you’ve got a time that works well for you.

I’m teaching the next two Tuesdays (aka tomorrow and Oct 1), otherwise flexible.

1 Like

Hi @krfeeney, Thank you.

Following is what I am thinking:
a) We can have a Skype call this week, any day after Tuesday. I’ll prepare a small slide deck highlighting: (i) aspects of the challenge, (ii) data types/access, (iii) timeline and feasibility, and (iv) how we can go about working on it. Most of this information is also available on the challenge website (except feasibility :)).
b) Based on our discussion, we can present those slides at next week’s OHDSI community call to seek wider participation, build the team, and move from there.

Regards
-Rohit

1 Like

Hi Rohit. I’m happy to join as well.

1 Like

I would like to join as well. Perhaps we could even have more than one team if there is enough interest. I was not able to see how much the prize is in $ (if any; this may generate a COI for some folks).

Hi, I’m Chungsoo Kim from Ajou University in South Korea (please call me Ted).
I want to join the team. I’ve already submitted a poster to the OHDSI Symposium about a death prediction model using a claims database.
https://www.ohdsi.org/2019-us-symposium-showcase-62/

Folks - time zones are going to be a challenge lining up Pacific and Korean time.

Any chance tomorrow at 11AM EDT / 8AM PDT works? I know that is midnight in Korea. Maybe we can chat and figure out a work plan that can be broken across hemispheres.

https://iqvia.webex.com/iqvia/j.php?MTID=mba3ec356d908dec9fb14d7b20d104d3f

Meeting number (access code): 963 519 140
Meeting password: OHDSI

JOIN BY PHONE
+1-844-517-1271 US Toll Free

Global call-in numbers:
https://iqvia.webex.com/iqvia/globalcallin.php?MTID=m7e103b0c5aeaa7d04f138367ce7c7017

3 Likes

Thank you.
Tomorrow 8:00am PDT works for me.

Regards
-Rohit

I don’t care about the time. Thank you :grinning:

Hello Everyone,

Can I confirm my understanding of the schedule?

Sep 9th - Oct 9th: We develop our prediction model either on the SynPUF data or on any other data (our own OMOP data) and submit it as a Docker image?

Oct 9th - Jan 9th: We further tune our models during these 3 months to be generalizable to the UW dataset? We again submit a Docker image here. Am I right?

After Jan 9th: I guess our models will be tested on the most recent data. At this phase, we don’t really have to do anything.

Thanks
Selva

@rohitv +1, count me in as well.

@krfeeney I’ve just seen your message. 8am worked for me as well, but it seems like the meeting already happened. Do you have a video?

All - thanks to those who dialed in! We just wrapped the first project discussion.

Our meeting minutes are here: https://docs.google.com/document/d/1y0lkFRKOiDrV6D9Cm5PYx-_NSVwWfIvekKTE_xprQGs/edit?usp=sharing

Please take a moment to add your information if you want to participate.

We will exchange questions and do a bit more discovery on the overall challenge requirements over the next few days. We will use the Forum to discuss and will decide on a time to chat again accordingly – likely later next week.

@lee_evans @SCYou @JamesSWiggins - this challenge has a requirement that we submit our models as Docker images. Any chance you could help us understand how to do that?

4 Likes

@krfeeney, I’m not sure whether I will participate in the DREAM challenge or join this team.
Still, I can teach @Chungsoo_Kim how to build a Docker image.

4 Likes

I’m interested in this challenge! I’m also learning how to set up Docker images.

Guys, I’m interested in joining this team. You held the first meeting in record time - 3 days from the idea to the call - and I missed the whole discussion :slight_smile:

A couple of pointers based on the meeting notes I read; hopefully these will save you some time. You do not need to build a separate Docker image for PLP studies (if I understood your intent properly). You already have two choices for executing PLP “in the box” today:

  1. ATLAS has been linked with the ARACHNE Execution Engine. If enabled, click a button and it will execute the PLP study and return the results. This is how we ran the PLP study during our Scientific Retreat with the CSS team.

  2. You can also now install the ARACHNE Data Node (1.15 release) in disconnected mode and use that to execute PLP studies. Download the generated code from ATLAS, upload it into the Data Node, and click a button.

1 Like

Greg, these are good ideas, but we cannot install new software on the University of Washington’s system, nor can we modify the submission format. Currently the data is served up as CSV files. We’re making some guesses about what they’re doing on the live environment side. The mandate is that we submit everything as Docker images. This is the workflow:

(Reference: Synapse | Sage Bionetworks)

Not sure either option presented would be feasible given this.

Here is more information about the Docker containers they expect to receive: https://www.synapse.org/#!Synapse:syn18405991/wiki/595495

They are not going to share any data (except SynPUF). We’ll have to make our code ready to work with input data in CSV format for training and for evaluating our model.
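To make the CSV-in / predictions-out shape concrete, here is a rough Python sketch of what a container entrypoint might look like. The mount points (/train, /infer, /output), file names, and the toy features are all my assumptions; the actual contract is described on the challenge wiki linked above.

```python
# Hypothetical container entrypoint: train on one directory of OMOP CSVs and
# score another, writing per-person predictions to an output CSV.
# The mount points (/train, /infer, /output) and file names are assumptions;
# the actual paths and formats are specified on the challenge wiki.
import pandas as pd
from sklearn.linear_model import LogisticRegression

def load_features(data_dir):
    """Toy features from visit_occurrence only (an assumption for illustration)."""
    visits = pd.read_csv(f"{data_dir}/visit_occurrence.csv", parse_dates=["visit_start_date"])
    feats = visits.groupby("person_id").agg(
        n_visits=("visit_occurrence_id", "count"),
        last_visit=("visit_start_date", "max"),
    )
    feats["last_visit_year"] = feats["last_visit"].dt.year
    return feats.drop(columns=["last_visit"])

# --- training ---
train_x = load_features("/train")
deaths = pd.read_csv("/train/death.csv")
# Simplified label: any death record; the real label is death within 6 months
# of the last visit, as defined by the challenge.
train_y = train_x.index.isin(deaths["person_id"]).astype(int)
model = LogisticRegression(max_iter=1000).fit(train_x, train_y)

# --- inference ---
infer_x = load_features("/infer")
out = pd.DataFrame({
    "person_id": infer_x.index,
    "score": model.predict_proba(infer_x)[:, 1],
})
out.to_csv("/output/predictions.csv", index=False)
```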

Hi Gregory. I can’t find any information about that on their website. May I ask you to share a link to where they say it?

Ah, thank you guys for this information - I still have a lot of questions, but I will “read the manual” next time :slight_smile:

Looking forward to being a part of the team and winning this challenge together!

2 Likes