
Where is the @temp_database_schema used by Atlas created?

I’m creating a Cohort definition and I have Observation criteria that use a Concept Set. When I try to generate the cohort I’m getting 0 records where I believe there should be matches.

Looking at the SQL generated for the cohort I see this:

-- Begin Observation Criteria
select C.person_id, C.observation_id as event_id, C.observation_date as start_date,
  date_add(C.observation_date, 1) as END_DATE,
  C.visit_occurrence_id, C.observation_date as sort_date
from
(
  select o.*
  FROM @cdm_database_schema.OBSERVATION o
  JOIN @temp_database_schema.tk378kp9Codesets cs on (o.observation_concept_id = cs.concept_id and cs.codeset_id = 2)
) C

I recall creating a covid_ohdsi_ach_tmp schema for Achilles, but that schema appears to be empty, so I’m thinking this is a different temp schema.

Where is @temp_database_schema created?

Is there a way I can see what Atlas thinks @temp_database_schema is supposed to be (i.e. where is the table tk378kp9Codesets)?

That should be set in the Configuration of the source; see the Temp daimon below. Databricks has no support for temp tables, so you need to set aside a schema in your cluster dedicated to creating “temp” tables (they are actual tables that get dropped after the job finishes).
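For example, a dedicated schema on the Databricks side could be created with something like the following (the schema name atlas_temp and the grantee are placeholders; adjust both to your environment):

```sql
-- Hypothetical example: a dedicated schema for Atlas "temp" tables on Databricks.
CREATE SCHEMA IF NOT EXISTS atlas_temp;

-- The account WebAPI connects with needs to be able to create and drop tables here.
GRANT USAGE, CREATE ON SCHEMA atlas_temp TO `atlas_service_user`;
```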

Thanks Ajit!

I’m not sure I understand completely. It sounds like I need to create a schema in Databricks but is there more that I need to do or is Atlas smart enough to know it is dealing with a Spark schema and will take care of the create/drop of the tables?

I’m not sure how to get to the menu from your screenshot; is it somewhere in Atlas? When I get there, I’m guessing I check the checkbox and then enter the name of the schema that I’ve created in Databricks for the temp stuff?

I’ll give the above a try and let you know how I make out.

Thanks again,
John

Atlas won’t create the schema automatically. As part of using Databricks (or any other dbms that doesn’t support temp tables), you’d need to plan for where that CDM can write “temp” (but actually real) tables.

The screenshot I showed was the Configuration page in Atlas. When creating a new source, or modifying an existing one, you can set the temp daimon. Alternatively, if you don’t have security enabled in Atlas, you can make this change directly in the webapi source_daimon table.

Thanks again Ajit! Do you happen to know the value of daimon_type for the configuration I need to add? This is what I have for cdm, vocab, and Achilles results.

Looks like 5:
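If you go the route of editing the webapi source_daimon table directly (with security disabled), the new row might look something like this; the id values, schema name, and priority are placeholders for whatever fits your setup:

```sql
-- Hypothetical example: register a Temp daimon (daimon_type = 5) for source_id = 1.
-- Adjust source_daimon_id, source_id, table_qualifier, and priority to your environment,
-- then restart WebAPI so the new daimon is picked up.
INSERT INTO webapi.source_daimon (source_daimon_id, source_id, daimon_type, table_qualifier, priority)
VALUES (10, 1, 5, 'atlas_temp', 0);
```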


Btw, not sure if you got a simple answer on this: WebAPI/Atlas doesn’t use this setting directly, but we do pass it along to where we call SqlRender’s renderSql() method. renderSql() will create ‘fake temp tables’ in the temporary schema. Some DBMS platforms don’t support temp tables, so we have a way to ‘fake it’ by creating tables with a special session_id prefix that get cleaned up after an analysis runs. You might even see some left over in your temp schema from failed runs. The nice thing about (and reason for) the temp schema is that you can safely drop and re-create it between WebAPI restarts to easily clear out orphaned tables.
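As a rough illustration of that emulation (schema and prefix names are made up, except the prefix from your generated SQL): OHDSI SQL written against a true temp table gets rewritten, for a dialect without temp-table support, into a session-prefixed permanent table in the temp schema:

```sql
-- OHDSI SQL as written (a true temp table):
CREATE TABLE #Codesets (codeset_id int, concept_id bigint);

-- Roughly what it becomes after translation for a dialect like Spark:
-- tk378kp9 is the session prefix; atlas_temp stands in for @temp_database_schema.
CREATE TABLE atlas_temp.tk378kp9Codesets (codeset_id int, concept_id bigint);
```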
