
ACHILLES "Error loading report"

Hi there,

I am running into an issue with ACHILLES in ATLAS and could use some help diagnosing the root cause. The Condition Occurrence, Drug Exposure, and Procedure reports do not show up; instead I get an error message “Error loading report”. I’ve done some digging on the forums and don’t see an obvious solution right now.

Some info about my instance:

My theory right now is that this has to do with the fact that our condition_occurrence, procedure_occurrence, and drug_exposure tables contain events dated outside patients’ observation periods. Our measurement and visit_occurrence tables do not contain data outside the observation period, and those two reports work fine in ATLAS. Based on other forum posts, it does not seem like this should prevent the reports from running, but maybe I’m wrong.
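For anyone who wants to run the same check on their data, a diagnostic along these lines should work (a sketch only; replace cdm with your CDM schema name):

-- Count condition_occurrence records that fall outside every observation
-- period for the same person (repeat per domain table as needed).
SELECT COUNT(*) AS events_outside_op
FROM cdm.condition_occurrence co
WHERE NOT EXISTS (
  SELECT 1
  FROM cdm.observation_period op
  WHERE op.person_id = co.person_id
    AND co.condition_start_date BETWEEN op.observation_period_start_date
                                    AND op.observation_period_end_date
);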

Finally, the RC/DRC counts in ATLAS are not populating for us. This feature seems to be linked to ACHILLES, so I wanted to mention it as a potentially related issue.

Thanks in advance for any help!

Katy

I just wanted to bump this thread in case it got lost in the shuffle. Would really appreciate any insight if any folks have experienced a similar issue; thanks!

Hi @katy-sadowski,
I am running into the same issue.

Also, I don’t see a concept_hierarchy table in my schema. How is this generated?

I am running WebAPI v2.8 and CDM v5.3. If you find any fixes, please post here.

Thanks,

This is the source code for that table:

The documentation describes how to set up a CDM results schema for WebAPI:
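For orientation, here is a rough sketch of that table’s shape as the treemap reports use it; the authoritative DDL comes from the WebAPI results-schema setup, so treat the column types here as illustrative:

-- Illustrative sketch only; the real DDL is generated when you set up the
-- results schema through WebAPI. Columns inferred from the treemap queries.
CREATE TABLE results.concept_hierarchy (
  concept_id          INT,          -- leaf concept (e.g., a drug ingredient)
  concept_name        VARCHAR(400),
  treemap             VARCHAR(20),  -- report family: 'Drug', 'Condition', ...
  level1_concept_name VARCHAR(400), -- ancestor names used to build the path
  level2_concept_name VARCHAR(400),
  level3_concept_name VARCHAR(400),
  level4_concept_name VARCHAR(400)
);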

@Chris_Knoll could you please confirm whether or not ACHILLES requires all events to be within the observation period in order for a report to load? Thanks :slightly_smiling_face:

I don’t think so: this is the query for persons with a condition by condition_concept_id:

There’s no reference to observation period here, just a query against the condition_era table.
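To make that concrete, here is a simplified sketch of what a persons-by-concept analysis looks like (the real SQL lives in the Achilles package; the schema name is a placeholder). Note there is no join to observation_period anywhere:

-- Illustrative sketch of an Achilles-style "persons with condition by
-- condition_concept_id" count.
SELECT
  condition_concept_id      AS stratum_1,
  COUNT(DISTINCT person_id) AS count_value
FROM cdm.condition_era
GROUP BY condition_concept_id;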

The “Error Loading Report” could be related to a few different things… but I don’t think it’s a UI error (like a JavaScript problem). Do you know how to bring up the dev console in your browser? There may be a hint there as to the actual error, and you can also take a look in your WebAPI logs to see if there’s a query that failed.

Exporting JSON files is not related to Atlas: Atlas (WebAPI) reads the results tables directly for its reports, while the exported JSON files are used in a different application (AchillesWeb), where no database server is required. For AchillesWeb, the process is to run the Achilles analysis (which populates the database tables) and then export JSON files, which are consumed by AchillesWeb. If you’re only talking about Atlas, then ignore the exported JSON entirely.

Thanks @Chris_Knoll. I missed this step. I will run it now. Is it okay to run this script after the Syn 1K data is loaded? I am asking because the script has been running for the past couple of hours.

Thanks,

That’s good to know; thanks. I will try inspecting the dev console and the WebAPI logs to see if the issue is apparent there. I’ll report back :slight_smile:

Yes, it’s normal to load the data first, then create the results schema after.

I inspected the dev console, and the following error appears twice each time I try to open one of the failing reports:

It appears that no queries are being run against my CDM database when I try to open these reports (nothing in the query logs).
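One way to double-check this on the Redshift side, assuming your user can read the stl_query system table:

-- Recent queries against the cluster; if a report query had reached the CDM,
-- something from WebAPI should show up here.
SELECT query, starttime, TRIM(SUBSTRING(querytxt, 1, 120)) AS query_text
FROM stl_query
ORDER BY starttime DESC
LIMIT 20;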

For the WebAPI database, I get ~100 identical rows in the query log with this: execute <unnamed>: SELECT JOB_EXECUTION_ID, KEY_NAME, TYPE_CD, STRING_VAL, DATE_VAL, LONG_VAL, DOUBLE_VAL, IDENTIFYING from webapi.BATCH_JOB_EXECUTION_PARAMS where JOB_EXECUTION_ID = $1

Any idea what’s up here? The “bad SQL grammar” error seems like it might indicate an issue with something that checks the syntax of the query before it is run against Redshift (our CDM database).

Thanks so much!

The query against webapi.batch_job_execution_params is not related to your Achilles problem; WebAPI is just polling that table for information about batch jobs.

The bad SQL grammar error is from a query against your CDM, raised when it tries to pull up the drug summary report. You will need to go to your WebAPI logs (i.e., your Tomcat logs) and find the exception; the log lines surrounding the exception should show the query that was attempted and resulted in the error.

Aha, thanks so much! I found it! Here is the offending query:

SELECT
   concept_hierarchy.concept_id AS concept_id,
   CONCAT(
     COALESCE(concept_hierarchy.level4_concept_name,'NA'), '||',
     COALESCE(concept_hierarchy.level3_concept_name,'NA'), '||',
     COALESCE(concept_hierarchy.level2_concept_name,'NA'), '||',
     COALESCE(concept_hierarchy.level1_concept_name,'NA'), '||',
     concept_hierarchy.concept_name
   ) AS concept_path,
   ar1.count_value AS num_persons,
   ROUND(CAST(1.0 * ar1.count_value / denom.count_value AS NUMERIC),5) AS percent_persons,
   ROUND(CAST(1.0 * ar2.count_value / ar1.count_value AS NUMERIC),5)   AS records_per_person
 FROM (SELECT *
       FROM results.achilles_results WHERE analysis_id = 700) ar1
   INNER JOIN
   (SELECT *
    FROM results.achilles_results WHERE analysis_id = 701) ar2
     ON ar1.stratum_1 = ar2.stratum_1
   INNER JOIN
   results.concept_hierarchy concept_hierarchy
     ON CAST(CASE WHEN CASE WHEN (CAST(ar1.stratum_1 AS VARCHAR) ~ '^([0-9]+\.?[0-9]*|\.[0-9]+)$') THEN 1 ELSE 0 END = 1 THEN ar1.stratum_1 ELSE null END AS INT) = concept_hierarchy.concept_id
   AND concept_hierarchy.treemap='Drug'
   ,
   (SELECT count_value
    FROM results.achilles_results WHERE analysis_id = 1) denom

 ORDER BY ar1.count_value DESC

And here is the error: Invalid operation function concat(character varying, "unknown", character varying, "unknown", character varying, "unknown", character varying, "unknown", character varying) does not exist

The reason for this error is that the CONCAT function in Redshift accepts exactly two arguments (see CONCAT - Amazon Redshift).

…and guess what: this caused me to check our ATLAS source configuration page, and it looks like the Redshift source’s dialect had somehow been set to postgresql. I’m guessing it was a copy/paste error by an engineer, as we have two other Postgres sources.

I am going to get this changed to redshift and will report back :slight_smile:

Yes, specifying the right dialect should fix it. I believe there’s a rule in SqlRender that expands CONCAT(a,b,c) into CONCAT(a,CONCAT(b,c)).
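Illustratively (placeholder table and columns; the actual output is whatever SqlRender’s redshift rules emit):

-- Source SQL with a multi-argument CONCAT:
SELECT CONCAT(a, '||', b, '||', c) FROM t;

-- Redshift's CONCAT takes exactly two arguments, so a Redshift-safe
-- translation nests two-argument calls:
SELECT CONCAT(a, CONCAT('||', CONCAT(b, CONCAT('||', c)))) FROM t;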

It worked! Thanks very much for your help @Chris_Knoll - sorry that turned out to be a bit of a silly error. But now I am empowered to debug next time something comes up :slight_smile:
