
Most reports in Atlas show no data or error

Hi all,

After quite a bit of a struggle I finally have Atlas / WebAPI talking to my CDM :grinning:. However, only parts of a few reports (Dashboard, Person) show data. The rest of the reports show ‘No data’, and Observation Period shows an ‘Error loading report’ (see attached screenshots). Any idea what could be causing this?

Cheers,
Bert



I noticed in some of your screenshots, you have requests that failed (in red). If the request resulted in an HTTP 500 error, you should be able to see the error message in your WebAPI logs.

If you could confirm your version of WebAPI you are running, that would be helpful.

Hello @Chris_Knoll, I am using Broadsea v3.0.0, which I installed a few weeks ago, so I assume the WebAPI version is fairly recent. I am not sure what I am looking for in the logs (attached); there seem to be a lot of “bad SQL grammar” errors …

webapi_log.txt (711.2 KB)

Yes, I found:

Caused by: org.postgresql.util.PSQLException: ERROR: permission denied for relation achilles_results

ERROR: relation "results.achilles_result_concept_count" does not exist

Resource not found: /resources/cdmresults/sql/report/visits/treemap.sql

Resource not found: /resources/cdmresults/sql/report/observation_period/treemap.sql

Some of these are environment problems that you can fix; others are code issues we need to fix:

For the permission error, you need to make sure you granted permission to the achilles_results table.
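
If it helps, here is a minimal sketch of the kind of grant involved, assuming a results schema named results and a database role named ohdsi_app_user for the WebAPI connection (substitute your own schema and role names):

-- give the WebAPI role read access to the Achilles results table
GRANT USAGE ON SCHEMA results TO ohdsi_app_user;
GRANT SELECT ON results.achilles_results TO ohdsi_app_user;
-- or, more broadly, to every table in the results schema
GRANT SELECT ON ALL TABLES IN SCHEMA results TO ohdsi_app_user;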

For the achilles_result_concept_count error, you didn’t create that table. There are instructions elsewhere in the forums on that, I believe.

The resource not found errors are a software problem (and I’m surprised we haven’t seen this sooner). I don’t think there is a treemap for Observation Period, and for the visits one the path to the file should be ‘visit’, not ‘visits’, so there’s a bug in a call somewhere; we will fix that for the next release.

@Ajit_Londhe , I’ve scoured the codebase (multiple times) and I can’t figure out how the path to load the visits treemap is resolving to a ‘visits’ path when the report name is ‘visit’ in the UI and that gets passed along to the service. Is broadsea doing something special?

I also don’t know why observation_period is calling treemap, there is no treemap in the observation period report.

@Chris_Knoll

I fixed the permissions on achilles_results and generated the achilles_result_concept_count table. Now I get the following error message:

2023-09-15 16:56:42.413 ERROR taskExecutor-1 org.ohdsi.webapi.cdmresults.service.CDMCacheService - [] - Failed to warm cache AMYPAD. Exception: PreparedStatementCallback; bad SQL grammar [select cc.concept_id, record_count, descendant_record_count, person_count, descendant_person_count
from results.achilles_result_concept_count cc
join cdmdatabaseschema.concept c on cc.concept_id = c.concept_id
WHERE cc.concept_id >= ? and cc.concept_id <= ?]; nested exception is org.postgresql.util.PSQLException: ERROR: operator does not exist: character varying = integer
Hint: No operator matches the given name and argument type(s). You might need to add explicit type casts.
Position: 196

Shouldn’t be anything different in Broadsea. It builds off of the docker images (which are built on the source code).

Ok, so in the above query there’s some kind of mismatch between the left and right side of the =. Looking above, the only comparison is the join on achilles_result_concept_count.concept_id and concept.concept_id, so one of those columns must be a varchar. That shouldn’t happen, though, because the DDL for both of those tables defines concept_id as an integer field.

What is the column type of concept_id in each of those tables?

The type of concept_id in the achilles_result_concept_count was varchar. I changed it to integer. Now I get a complaint that the column person_count doesn’t exist in the achilles_result_concept_count table …
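
For anyone following along, a column change like that can be done with something along these lines (a PostgreSQL sketch, assuming the table sits in the results schema):

-- convert concept_id from varchar to integer so it matches concept.concept_id
ALTER TABLE results.achilles_result_concept_count
  ALTER COLUMN concept_id TYPE integer USING concept_id::integer;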

So person_count in that table is a newer, optional addition. In the endpoint “/WebAPI/ddl/achilles”, you’ll find the table create statement and the query for rebuilding the table with it included.
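
Based on the columns the cache-warming query above selects, the table ends up with roughly this shape (a sketch only; use the DDL from /WebAPI/ddl/achilles for your dialect and actual schema name):

-- rough shape of achilles_result_concept_count, per the columns queried above
CREATE TABLE results.achilles_result_concept_count (
  concept_id                integer,
  record_count              bigint,
  descendant_record_count   bigint,
  person_count              bigint,
  descendant_person_count   bigint
);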

I have now regenerated the achilles_result_concept_count table and that error message has disappeared from the logs. However, most reports in Atlas are still empty (or rather show “No data”). I was expecting at least the “Data density” report, for instance, to show me something. If I run the query https://github.com/OHDSI/WebAPI/blob/master/src/main/resources/resources/cdmresults/sql/report/datadensity/totalrecords.sql against my db, it does return results. So why are these not shown in the UI?

Likely the cache has messed with it. You can find the source_id in the source table of the webapi schema, then use that to identify which caches to clear from the achilles_cache table in the webapi schema.
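
Something along these lines, assuming the WebAPI schema is named webapi and that achilles_cache keys on source_id (adjust names to your setup):

-- look up the source_id of the CDM source
SELECT source_id, source_key FROM webapi.source;
-- then clear its cached Achilles results (using source_id = 1 as a placeholder)
DELETE FROM webapi.achilles_cache WHERE source_id = 1;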

The achilles_cache table is empty.

Clear local cookies / cache?

@Ajit_Londhe No, unfortunately clearing the cache and deleting cookies doesn’t change things.

Just FYI, these are the only reports / panels that currently show data:

Dashboard - CDM Summary, Population by Gender
Person - Year of Birth, Gender, Race
Observation Period - All panels

When I look at the Data Density report, for instance, I can see that no data is returned at all. The same happens when I query the WebAPI directly. But, as I showed above, when I run the totalrecords.sql code directly against my db, it does return data …

Cheers,
Bert

Can you please share the output of this, where @resultsDatabaseSchema is your fully qualified results schema name?

select distinct analysis_id from @resultsDatabaseSchema.achilles_results

2000101
110
2000400
2001411
2001827
2000900
2001020
2000011
2000001
2001900
2001312
814
116
2002201
2000402
1800
2000620
2001202
2000210
1804
2000112
2000814
2000207
826
2002003
2000200
2000823
1891
2001304
2000002
2000920
2002200
2001410
1805
2000204
2002191
2000416
2003
101
2000825
2000625
2002125
1825
2001301
1827
1823
2001000
2002004
112
108
2000119
1811
2001004
2000209
2000420
2000300
2000701
2001819
2001425
2000212
200
2000113
2001203
11
2001321
2001102
1826
2001201
2001002
2000804
117
119
113
2001891
2000702
2000220
2000116
202
2001100
2000826
2000404
2000601
802
2002120
2001413
2000600
2000807
2000303
226
2000109
2002104
2000801
2000415
2001302
2000605
2000505
3
212
2001826
2000221
800
2000791
2000800
204
2000604
5
2000802
1821
2000226
1820
2001818
820
807
4
225
0
2000820
2000425
2001814
2000602
201
1326
10
1819
1807
2000891
2001300
2000827
2000725
804
2001820
2001325
891
220
2000691
2001823
2000805
2000005
2000003
823
822
2000225
2001804
801
2000700
2001409
1802
2000301
221
2000414
2000111
2004
102
2001801
2000902
2000501
109
2001001
2001807
2000010
2000401
2002001
2000502
111
2000108
2
2001412
2000704
2001101
2000012
805
2002100
1822
2000822
2000630
2000117
2000110
2001825
2001822
2000325
2000705
2000202
2001800
2000525
2001408
1801
2000720
2002105
2000000
2001200
2002000
827
12
2001805
2000901
2000504
2002101
2001320
2001821
2000201
2001103
2000904
825
2001811
2000500
2000405
2002102
2001802
2000102
2001326
1
2002002
2000004

Seems a little light on analysis ids. 62 out of 194 default analyses. You may want to go back to your Achilles execution and confirm all of the necessary analyses executed.

In R, you can see the default analyses using:
Achilles::getAnalysisDetails() |> dplyr::filter(IS_DEFAULT == 1)

That query gives 194 rows back. Is this the list of analyses that HAVE BEEN executed or the analyses that CAN BE executed? If the latter, how do I confirm which analyses have been executed? Or how do I make sure all of them are executed? Apologies for all the naive questions …

In fact, looking at my log_achilles.txt, all 194 analyses seem to have been done.

excluding the ones with the 2000000 offset (those are performance benchmarks)
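
For reference, the filtered count can be checked directly in SQL (a sketch that excludes the 2000000-offset ids mentioned above):

-- count the distinct non-offset analysis ids actually present
select count(distinct analysis_id) as analyses_present
from @resultsDatabaseSchema.achilles_results
where analysis_id < 2000000;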
