
Out of memory when running Cohort Diagnostics

Hi there,

I am running the Cohort Diagnostics package on 10 cohort definitions against Optum Market Clarity, but keep running into the “Exception in thread “RingBufferThread” java.lang.OutOfMemoryError: Java heap space” error. I have tried increasing the memory available to Java up to 700 GB, but nothing changes, which makes me think there is a memory leak somewhere, but my knowledge pretty much stops there. I have enclosed screenshots here. The error always happens at the data-fetching step, after the temporal characterization.
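
For anyone landing on this thread later: here is a minimal sketch of how the Java heap is typically raised for rJava-based HADES packages. The `-Xmx` value and package load order are assumptions for illustration, not from the original post; the key point is that the option only takes effect if it is set before DatabaseConnector (and hence rJava) is loaded.

```r
# Illustrative only: the heap size here is an example, not a recommendation.
# This must run before any rJava-based package is loaded, or it has no effect.
options(java.parameters = "-Xmx8g")

library(DatabaseConnector)
library(CohortDiagnostics)
```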

Any suggestion is much appreciated.

I know it’s been a while since the original post, but I just want to document here that I had the identical error and was able to solve it by switching from Amazon Corretto 11 to Amazon Corretto 8 (which doesn’t make any sense, I know).


Perfect timing, we just had this issue come up again :relaxed:

Perfect timing indeed!

So it looks like we had OpenJDK 8 in the R instance where this failed. Based on this post and discussions with @jpegilbert, we changed it to use Amazon Corretto 8 and no longer see this error.
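
If anyone needs to confirm which JVM their R session is actually picking up before and after the switch, something like the following (a sketch using rJava directly, not part of CohortDiagnostics) will report the vendor and version:

```r
library(rJava)
.jinit()  # start the JVM that rJava found

# Query the running JVM's vendor and version via java.lang.System
.jcall("java/lang/System", "S", "getProperty", "java.vendor")   # e.g. "Amazon.com Inc."
.jcall("java/lang/System", "S", "getProperty", "java.version")
```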

@schuemie – I wonder if we ought to recommend Corretto 8 in the HADES guide?

Great idea! Just did.
