
Tomcat failing - java.lang.OutOfMemoryError: Java heap space error

Hi everyone, just wondering if I can get some help. Tomcat is failing and throwing the exceptions below. Can anyone help me, please?
Exception in thread “http-nio-8080-AsyncTimeout” Exception in thread “http-nio-8080-ClientPoller-0” Exception in thread “NioBlockingSelector.BlockPoller-0” java.lang.OutOfMemoryError: Java heap space
Exception in thread “http-nio-8080-ClientPoller-1” java.lang.OutOfMemoryError: Java heap space
Exception in thread “http-nio-8080-Acceptor-0” java.lang.OutOfMemoryError: Java heap space
Feb 23, 2023 12:54:37 PM com.microsoft.sqlserver.jdbc.TDSReader throwInvalidTDS
SEVERE: ConnectionID:32 ClientConnectionId: f2bf3997-a315-442b-b8d6-ee1430ee5fe2 got unexpected value in TDS response at offset:4934
Feb 23, 2023 12:54:43 PM com.microsoft.sqlserver.jdbc.TDSParser throwUnexpectedTokenException
SEVERE: ConnectionID:32 ClientConnectionId: f2bf3997-a315-442b-b8d6-ee1430ee5fe2: getNextResult: Encountered unexpected unknown token (0x31)
Feb 23, 2023 12:54:43 PM com.microsoft.sqlserver.jdbc.TDSReader throwInvalidTDSToken
SEVERE: ConnectionID:32 ClientConnectionId: f2bf3997-a315-442b-b8d6-ee1430ee5fe2 got unexpected value in TDS response at offset:4934

2023-02-24 11:49:57.843 ERROR taskExecutor-2 org.ohdsi.webapi.cdmresults.CDMResultsCacheTasklet - - Failed to warm cache for AURUM_CDM_50Practices. Exception: PreparedStatementCallback; bad SQL grammar [SELECT
concept_id,
record_count,
descendant_record_count
FROM RESULTS.achilles_result_concept_count]; nested exception is com.microsoft.sqlserver.jdbc.SQLServerException: Invalid object name ‘RESULTS.achilles_result_concept_count’.
2023-02-24 11:49:57.843 WARN taskExecutor-2 org.ohdsi.webapi.cdmresults.CDMResultsCacheTasklet - - Failed to warm cache for AURUM_CDM_50Practices. Trying to execute caching from scratch. Exception: PreparedStatementCallback; bad SQL grammar [SELECT
concept_id,
record_count,
descendant_record_count
FROM RESULTS.achilles_result_concept_count]; nested exception is com.microsoft.sqlserver.jdbc.SQLServerException: Invalid object name ‘RESULTS.achilles_result_concept_count’.
2023-02-24 11:49:57.843 ERROR taskExecutor-1 org.ohdsi.webapi.cdmresults.CDMResultsCacheTasklet - - Failed to warm cache for AURUM_CDM. Exception: PreparedStatementCallback; bad SQL grammar [SELECT
concept_id,
record_count,
descendant_record_count
FROM RESULTS.achilles_result_concept_count]; nested exception is com.microsoft.sqlserver.jdbc.SQLServerException: Invalid object name ‘RESULTS.achilles_result_concept_count’.
2023-02-24 11:49:57.843 WARN taskExecutor-1 org.ohdsi.webapi.cdmresults.CDMResultsCacheTasklet - - Failed to warm cache for AURUM_CDM. Trying to execute caching from scratch. Exception: PreparedStatementCallback; bad SQL grammar [SELECT
concept_id,
record_count,
descendant_record_count
FROM RESULTS.achilles_result_concept_count]; nested exception is com.microsoft.sqlserver.jdbc.SQLServerException: Invalid object name ‘RESULTS.achilles_result_concept_count’.
2023-02-24 11:49:59.965 INFO taskExecutor-4 org.springframework.batch.core.job.SimpleStepHandler - - Executing step: [warming AURUM_HES_APC_CDM cache results]
2023-02-24 11:50:00.874 ERROR taskExecutor-3 org.ohdsi.webapi.cdmresults.CDMResultsCacheTasklet - - Failed to warm cache for AURUM_CDM_MAY_2022. Exception: PreparedStatementCallback; bad SQL grammar [SELECT
concept_id,
record_count,
descendant_record_count
FROM RESULTS.achilles_result_concept_count]; nested exception is com.microsoft.sqlserver.jdbc.SQLServerException: Invalid object name ‘RESULTS.achilles_result_concept_count’.
2023-02-24 11:50:00.874 WARN taskExecutor-3 org.ohdsi.webapi.cdmresults.CDMResultsCacheTasklet - - Failed to warm cache for AURUM_CDM_MAY_2022. Trying to execute caching from scratch. Exception: PreparedStatementCallback; bad SQL grammar [SELECT
concept_id,
record_count,
descendant_record_count
FROM RESULTS.achilles_result_concept_count]; nested exception is com.microsoft.sqlserver.jdbc.SQLServerException: Invalid object name ‘RESULTS.achilles_result_concept_count’.
2023-02-24 11:50:05.779 ERROR taskExecutor-4 org.ohdsi.webapi.cdmresults.CDMResultsCacheTasklet - - Failed to warm cache for AURUM_HES_APC_CDM. Exception: PreparedStatementCallback; bad SQL grammar [SELECT
concept_id,
record_count,
descendant_record_count
FROM RESULTS_HESAPC_AURUM.achilles_result_concept_count]; nested exception is com.microsoft.sqlserver.jdbc.SQLServerException: Invalid object name ‘RESULTS_HESAPC_AURUM.achilles_result_concept_count’.
2023-02-24 11:50:05.780 WARN taskExecutor-4 org.ohdsi.webapi.cdmresults.CDMResultsCacheTasklet - - Failed to warm cache for AURUM_HES_APC_CDM. Trying to execute caching from scratch. Exception: PreparedStatementCallback; bad SQL grammar [SELECT
concept_id,
record_count,
descendant_record_count
FROM RESULTS_HESAPC_AURUM.achilles_result_concept_count]; nested exception is com.microsoft.sqlserver.jdbc.SQLServerException: Invalid object name ‘RESULTS_HESAPC_AURUM.achilles_result_concept_count’.
2023-02-24 11:53:30.514 INFO taskExecutor-2 org.springframework.batch.core.launch.support.SimpleJobLauncher - - Job: [SimpleJob: [name=warming AURUM_CDM_50Practices cache]] completed with the following parameters: [{jobName=warming AURUM_CDM_50Practices cache, time=1677239315971, jobAuthor=anonymous}] and the following status: [COMPLETED]

You could try increasing the -Xms and -Xmx sizes in the CATALINA_OPTS environment variable, e.g.,
CATALINA_OPTS="-Xms1G -Xmx10G -agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=5005"

If using Broadsea, this can be set in the docker-compose .env file. I’m not sure if this will fix all the issues here, but it should help at least with the OOM errors.

Hi, it also looks like your achilles_result_concept_count table wasn’t created. It is defined in this script. I’m not sure which version of WebAPI you are running, but this table is required in v2.13.
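
For anyone who can’t run that script directly, here is a rough sketch of the idea only, not the actual definition (which is in the script linked above and adds more columns in newer versions). It assumes SQL Server syntax, Achilles results in RESULTS.achilles_results, and the vocabulary (concept_ancestor) in a schema called VOCAB:

-- hedged sketch only; use the WebAPI/Achilles script for the real definition
-- per-concept record counts come from the "records by concept_id" analyses
WITH record_counts AS (
  SELECT CAST(ar.stratum_1 AS INT) AS concept_id,
         SUM(ar.count_value)       AS agg_count
  FROM RESULTS.achilles_results ar
  WHERE ar.analysis_id IN (201, 401, 601, 701, 801, 1801) -- visit, condition, procedure, drug, observation, measurement
  GROUP BY CAST(ar.stratum_1 AS INT)
)
SELECT ca.ancestor_concept_id AS concept_id,
       SUM(CASE WHEN ca.descendant_concept_id = ca.ancestor_concept_id
                THEN rc.agg_count ELSE 0 END) AS record_count,
       SUM(rc.agg_count)                      AS descendant_record_count
INTO RESULTS.achilles_result_concept_count
FROM record_counts rc
JOIN VOCAB.concept_ancestor ca
  ON ca.descendant_concept_id = rc.concept_id
GROUP BY ca.ancestor_concept_id;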

Thanks a lot @Dillo. I will try increasing it and see if that works.

@Chris_Knoll, many thanks for coming back. Yes, we are running a very old version of WebAPI; I think it is 2.10.1.

We will update soon and hopefully these errors will be gone. Thanks for sharing the script.

Hi @Dillo, we increased our max heap size to 6 GB and it resolved the heap space issue. Thank you very much for your help.

Hi @Chris_Knoll. I added the missing table and restarted WebAPI. All the errors are gone, but Atlas is still very slow. Is it due to some caching issue? Our database is quite massive, so maybe we need to optimise it. Any suggestions?
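
(If you created that table by hand, it may also be worth indexing it. This is a hedged example only, not taken from the official scripts, assuming SQL Server and the RESULTS schema from the logs above; WebAPI looks record counts up by concept_id, so that is the obvious key:

-- hypothetical index on the hand-built concept count table
CREATE CLUSTERED INDEX idx_achilles_result_concept_count
  ON RESULTS.achilles_result_concept_count (concept_id);

Note that the cache-warming job in the logs scans the whole table, so the first warm-up after a restart will still take a while on a large CDM.)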

We also want to update our Atlas and WebAPI to the latest version, but we hear that it has some compatibility issues with MS SQL Server. We have our CDM database on SQL Server.

Would you be able to suggest which versions of Atlas and WebAPI we should be using that are compatible with SQL Server?
Many Thanks

It won’t work on MS SQL Server because the tables needed for post-2.10 releases are not created in MS SQL Server. If you would like to contribute the necessary migration scripts, you can compare the scripts found in the postgresql migration folder with those found in the sql server migration folder.

The reason we stopped support was that we didn’t have the resources to maintain a DB schema across 3 different platforms. The decision for Postgres was that it’s open source, free, and supported by most hosting providers, including Microsoft and AWS.

Great to hear!

Just to be clear, you can run current versions such as 2.12 with an MS SQL Server CDM. It is just the WebAPI DB that is restricted to Postgres. That is what we are running.
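
For context, that split is visible in the WebAPI source configuration: the CDM source record points at SQL Server while WebAPI’s own tables sit in Postgres. A minimal sketch, with made-up keys and connection strings, assuming the standard webapi.source / webapi.source_daimon tables (adjust schema names and values to your setup):

-- illustrative values only
INSERT INTO webapi.source (source_id, source_name, source_key, source_connection, source_dialect)
VALUES (99, 'AURUM CDM', 'AURUM_CDM',
        'jdbc:sqlserver://myserver:1433;databaseName=AURUM_CDM', 'sql server');

-- daimon_type: 0 = CDM, 1 = Vocabulary, 2 = Results
INSERT INTO webapi.source_daimon (source_daimon_id, source_id, daimon_type, table_qualifier, priority)
VALUES (991, 99, 0, 'dbo',     0),
       (992, 99, 1, 'dbo',     1),
       (993, 99, 2, 'RESULTS', 1);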

@Dillo, thanks for coming back. Are you using version 2.12 of both WebAPI and Atlas? Yes, we are using Postgres for the WebAPI DB as well. Did you have any installation or compatibility issues?
Can we just download the latest version of both, with no need for any alteration?
I’m just a bit confused by Chris’s response that "It won’t work on MS Sql server because the necessary tables needed for post 2.10 releases are not created in MSSQL server. If you would like to contribute the necessary migration scripts, you can compare the scripts…"

Can you please share your experience and any tips?

It seemed to me that Chris thought you were referring to the WebAPI DB. SQL Server is fully supported as a CDM, even if it wasn’t clear enough in the docs.

If you already have 2.10 working, I’m not sure what new issues, if any, 2.12 might present. I’m on 2.12.0 and will wait for 2.13 or higher before upgrading again. Three of our Achilles SQL scripts had to be modified to work, as I recall, but that problem was not new. I never figured out how to get AD/LDAP authentication working, so we used CAS, and I wrote my own Python code (run via cron) to synchronize the authenticated IDs with the AD group intended for authorization. I also had to modify the startup routine to update Tomcat in place, along with some JS libraries, to appease our security team. This is using Broadsea.

As long as you have a test/dev environment, you can experiment freely with the new version.

Hi @Dillo, this sounds a bit promising. We also had some issues with Achilles related to tempdb, and some analyses failed, so we had to make some changes. You are right, we should test 2.12 in our test environment. For now we can just keep using 2.10. Thanks very much for your advice again.

Sorry for the confusion. I thought you were referring to running Atlas and WebAPI on MSSQL. The CDM is a different context and supports more platforms (including MSSQL).

Hi @Chris_Knoll, sorry my wording was not clear. We run Atlas and WebAPI using a Postgres database (as per the OHDSI recommendation, we have the WebAPI database in Postgres), but we have our CDM database in MS SQL. So I just wanted to be sure about any compatibility issues when we run Atlas, Achilles, and DQD. Thanks

The OHDSI analysis tools (Achilles, DQD, CIRCE, HADES, etc.) use SQL translation technology to run analysis queries across different DBMSs. Details of this package can be found here.
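
As a simplified illustration of what that translation layer does (the schema parameter and table here are just placeholders):

-- OHDSI SQL as authored (SQL Server style, with @parameters for schema names):
SELECT TOP 10 concept_id, record_count
FROM @results_schema.achilles_result_concept_count
ORDER BY record_count DESC;

-- roughly what it becomes after rendering/translation for PostgreSQL:
SELECT concept_id, record_count
FROM results.achilles_result_concept_count
ORDER BY record_count DESC
LIMIT 10;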

Atlas is the UI that leverages WebAPI, so from a database-layer perspective, WebAPI is the software component we talk about when we speak of DB compatibility. WebAPI’s DB needs go beyond what analytical queries use: foreign keys, identity columns (via sequences), and, most importantly, managing schema changes between releases of WebAPI. Aside from annoying DBMS behavior with semantics around users, tables, schemas and databases (looking at you, Oracle), it was a lot of work to manage migration scripts across multiple database platforms. So we standardized on one, but only for WebAPI.
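
To make that concrete, here is the flavor of DDL those migrations carry. This is purely illustrative, not an actual WebAPI migration; the sequence/identity and foreign-key syntax is exactly the part that differs across Postgres, SQL Server and Oracle:

-- Postgres flavour of a hypothetical migration
CREATE SEQUENCE webapi.example_item_seq;
CREATE TABLE webapi.example_item (
  example_item_id INT NOT NULL DEFAULT nextval('webapi.example_item_seq') PRIMARY KEY,
  source_id       INT NOT NULL REFERENCES webapi.source (source_id),
  name            VARCHAR(255) NOT NULL
);
-- the SQL Server copy of the same migration would need IDENTITY or NEXT VALUE FOR,
-- and Oracle its own sequence/identity syntax, which is why each supported platform
-- needed its own hand-maintained copy of every migration script.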
