OHDSI Home | Forums | Wiki | Github

Issue creating the Redshift results schema setup

Hi Team,
I am new to OHDSI.
I have OMOP data sitting in a Redshift Spectrum table, and I was able to connect to it via RStudio. Now I need to set up the results schema, execute Achilles, and get the data into Atlas.

http://localhost:8080//WebAPI/ddl/results?dialect=redshift&schema=redshift_schema&vocabSchema=vocabulary&tempSchema=temp_redshift&initConceptHierarchy=true

I am trying to run the above request to generate the SQL script for the results schema, but I am ending up with the error below.

{"payload":{"cause":null,"stackTrace":[],"response":{"context":{"headers":{},"entity":null,"entityType":null,"entityAnnotations":[],"entityStream":{"committed":false,"closed":false},"length":-1,"language":null,"location":null,"lastModified":null,"date":null,"mediaType":null,"committed":false,"requestCookies":{},"responseCookies":{},"links":[],"lengthLong":-1,"entityClass":null,"stringHeaders":{},"allowedMethods":[],"acceptableMediaTypes":[{"type":"","subtype":"","parameters":{},"quality":1000,"wildcardSubtype":true,"wildcardType":true}],"acceptableLanguages":["*"],"entityTag":null},"status":404,"length":-1,"language":null,"location":null,"lastModified":null,"date":null,"entity":null,"mediaType":null,"cookies":{},"links":[],"metadata":{},"stringHeaders":{},"statusInfo":"NOT_FOUND","allowedMethods":[],"entityTag":null,"headers":{}},"localizedMessage":"HTTP 404 Not Found","message":"HTTP 404 Not Found","suppressed":[]},"headers":{"id":"b4d75da4-814d-e6b0-4ba9-fde1b4f0d581","timestamp":1637080957992}}

Could you please advise?

Hello @preetham_kotian,

To execute Achilles, the only preparation step you need is to create an empty schema: CREATE SCHEMA redshift_schema;. After that, set the connection details and launch Achilles from RStudio; it will automatically create the tables that it manages.
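As a minimal sketch of that workflow in R, assuming hypothetical host, database, and credential values (everything in the connection details below is a placeholder you would replace with your own):

```r
# Sketch only: all server/user/password values are hypothetical placeholders
connectionDetails <- DatabaseConnector::createConnectionDetails(
  dbms     = "redshift",
  server   = "my-cluster.abc123.us-east-1.redshift.amazonaws.com/mydb",
  port     = 5439,
  user     = "my_user",
  password = "my_password"
)

# Achilles creates and populates the tables it manages in resultsDatabaseSchema
Achilles::achilles(
  connectionDetails,
  cdmDatabaseSchema     = "cdmschema",
  resultsDatabaseSchema = "redshift_schema",
  vocabDatabaseSchema   = "vocabulary",
  sourceName            = "CDM3",
  cdmVersion            = "5"
)
```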

For the WebAPI endpoint question:

  • Have you deployed WebAPI? Just to make sure, since you asked questions in other threads.
  • If yes, try replacing the double slash in the URL (between 8080 and WebAPI) with a single slash, i.e. http://localhost:8080/WebAPI/ddl/results?...

@rookie_crewkie

Thanks Rookie. In that case there is no need to create any tables in the results schema.
If I execute the command below, it should be able to create the tables needed. Is my understanding correct?
achillesResults <- achilles(connectionDetails, cdmDatabaseSchema = "cdmschema",
                            resultsDatabaseSchema = "resultscehma", sourceName = "CDM3",
                            cdmVersion = "5", vocabDatabaseSchema = "cdmschema")

This should suffice. I have already made the entries in the source and source_daimon tables.

@preetham_kotian,

Yes, for the Achilles tables that's correct. There's also a documentation article that describes the process in great detail.

The SQL generated by WebAPI is required for ATLAS, but those tables are different: they are not managed by Achilles and are created separately.

Thanks @rookie_crewkie;
So around 4 to 5 Achilles tables will be created by the execution of the achilles command.
The rest of the tables we create from the web UI are for Atlas.

Also, thanks for pointing out the error I had in my table-creation link. It worked as soon as I removed the extra /.

Hi All,

I am getting the error below for one of the analyses.
Could you tell me if there is a fix for this?

DBMS:
redshift

Error:
java.sql.SQLException: Amazon Invalid operation:
This type of correlated subquery pattern is not supported due to internal error;

SQL:
--HINT DISTRIBUTE_ON_KEY(stratum_1)

CREATE TABLE #s_tmpach_11

Because of this error, the final achilles_results table didn't get created.
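For illustration only (this is not the actual Achilles analysis SQL, and the table and column names below are hypothetical), the kind of correlated-subquery pattern Redshift sometimes rejects, and a join rewrite it can usually handle, look like this:

```sql
-- Pattern Redshift may reject: a subquery correlated on the outer row
SELECT p.person_id,
       (SELECT MAX(o.observation_date)
        FROM observation o
        WHERE o.person_id = p.person_id) AS last_obs
FROM person p;

-- Equivalent rewrite: pre-aggregate in a derived table and join
SELECT p.person_id, m.last_obs
FROM person p
LEFT JOIN (SELECT person_id, MAX(observation_date) AS last_obs
           FROM observation
           GROUP BY person_id) m
  ON m.person_id = p.person_id;
```

In practice the failing SQL is generated by Achilles, so the usual fix is on the package side rather than hand-editing the query; the rewrite above just shows why the error occurs.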

@Chris_Knoll Hi Chris, our team is facing problems setting up Atlas on AWS. Much appreciate it if you have any suggestions. Thank you!

@preetham_kotian

@rookie_crewkie @Chris_Knoll

I have Redshift as a data source. Within the data source there are 3 CDM schemas which need to be processed so the data shows in ATLAS. I tried processing source schema 1 and it worked fine. When I process source schema 2 it gets processed, but when I look in ATLAS, both 1 and 2 have the same records. Here my results schema remains the same; only the source schema changes.

Could you tell me if this works, or should I create a separate results schema for each source schema?

What I am seeing is that once I process the source schema 2 data, the source schema 1 data in ATLAS gets replaced by the source schema 2 data, and hence both show the same.

Hello @preetham_kotian,

Please use one results schema per data source, not one results schema for everything. The results obtained from different data sources (Achilles, cohort generations, etc.) are intended to be kept separate from each other, for better maintainability.
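As a hedged sketch of what that looks like in the WebAPI configuration tables (the schema names, source key, IDs, and connection string here are all placeholders, and the daimon_type codes follow the WebAPI convention of 0 = CDM, 1 = Vocabulary, 2 = Results), registering a second source with its own results schema might look like:

```sql
-- Hypothetical example: a second CDM source pointing at its own results schema
INSERT INTO webapi.source (source_id, source_name, source_key, source_connection, source_dialect)
VALUES (2, 'CDM source 2', 'CDM2', 'jdbc:redshift://my-cluster:5439/mydb', 'redshift');

INSERT INTO webapi.source_daimon (source_daimon_id, source_id, daimon_type, table_qualifier, priority)
VALUES (4, 2, 0, 'cdm_schema_2', 1),      -- CDM schema
       (5, 2, 1, 'vocabulary', 1),        -- Vocabulary schema (can be shared)
       (6, 2, 2, 'results_schema_2', 1);  -- Results schema: separate per source
```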

@rookie_crewkie
Thanks for the suggestion, I will try to do this.
It really helps a lot.

@rookie_crewkie @Chris_Knoll
We are using the Git version that is currently present in

It has version 1.x in the pom file for log4j.
Do we need to change anything as of now for the log4j issues?
If so, do we need to reinstall from the beginning?
Could you please advise?
