
The Book of OHDSI, Second Edition

As mentioned when announcing the Book of OHDSI at the OHDSI US Symposium 2019, our goal is to have a new edition of the book roughly every year. I would like to aim to have the Second Edition ready by the OHDSI US Symposium 2020.

We need your help! Please contribute to any or all of these tasks:

  1. Find problems in the current version of the book. Any typo or other error that needs fixing.

  2. Fix problems that have been reported.

  3. Propose additional content / updates to existing content. For example, I have two topics I’d like to add:

    a. Update the Data Quality chapter. The Data Quality Dashboard has matured, and we’re about to launch the CohortDiagnostics package, which I’d like to describe here.

    b. Update the Network Research chapter with information about the new ohdsi-studies organization and app.

    I’m sure others will have other topics/updates they’d like to propose.

Post any issues and proposals in the issue tracker. We’ll also set up a section in the preface to record all major changes since the first edition.


Is there any desire to add installation details to the book?

Could you help me understand what you mean by ‘installation details’? There are some installation instructions here and here. Did you mean you would like to see those expanded?

I set out to build a separate database with Synthea data in it, and started from the beginning: creating a new database and schemas, loading the vocabulary, installing WebAPI, the results schema, etc. I found the top-level documentation that ties the different projects, wiki pages, and schemas together hard to find. For example, at first I was surprised that ETL-Synthea was running the CDM DDL. I have been reading the WebAPI, CDM-Configuration, WebAPI-Installation-Guide, and PostgreSQL-Installation-Guide wiki pages.

I’m finishing a bash script that runs through most of it, from database setup through Achilles and ATLAS. It explains a lot, though it might be more useful written up in English than as bash.

I was basically looking for how to do what Broadsea and OHDSI-in-a-Box have done. I’ve used Broadsea, but wanted to see under the hood. I’ve pretty much got it figured out. The toughest parts were finding the different DDL pieces, what runs them, how the different schemas should be named, and where those names get configured:

  • an ETL gets the CDM DDL from the CommonDataModel repository and sets that up along with its data.
  • Achilles queries a REST endpoint for the results DDL and applies it to the results schema.
  • WebAPI uses Flyway for the webapi schema.
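For anyone retracing these steps, here is a dry-run sketch of the three DDL sources above as I understand them. It only prints the commands rather than running them; the schema names, dialect, WebAPI URL, and DDL file name are my assumptions, not official defaults.

```shell
#!/usr/bin/env bash
# Dry-run sketch: prints the commands for the three DDL steps.
# All names below (schemas, URL, file name) are assumptions.
set -euo pipefail

CDM_SCHEMA=cdm
RESULTS_SCHEMA=results
WEBAPI_URL=http://localhost:8080/WebAPI   # assumed default Tomcat deployment

# 1. CDM tables: the DDL ships in the OHDSI/CommonDataModel repo;
#    an ETL such as ETL-Synthea may run it for you.
echo "psql -c 'CREATE SCHEMA ${CDM_SCHEMA}'"
echo "psql -f <postgresql DDL script from CommonDataModel>"

# 2. Results tables: WebAPI can generate dialect-specific results DDL
#    from a REST endpoint; fetch it and apply it to the results schema.
echo "curl '${WEBAPI_URL}/ddl/results?dialect=postgresql&schema=${RESULTS_SCHEMA}'"

# 3. WebAPI's own tables: created automatically by Flyway migrations
#    the first time WebAPI starts; no manual DDL needed.
echo "Flyway populates the webapi schema on WebAPI startup"
```

The useful realization for me was that each schema has a different owner: the CDM DDL comes from a repo, the results DDL from a running WebAPI, and the webapi schema from Flyway.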

At a higher level, the idea that you can have a single setup but many CDM data sources configured with source and source_daimon was a bit cloudy, though I didn’t start right from the sources you mentioned, and read some older forum posts that may have confused things a bit.
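To make the source/source_daimon part concrete, here is a sketch of the SQL that registers one CDM data source in WebAPI’s configuration tables, printed rather than executed. The IDs, the 'SYNTHEA' source key, the JDBC URL, and the schema qualifiers are all hypothetical examples, not values from any official guide.

```shell
#!/usr/bin/env bash
# Sketch only: prints SQL for registering one CDM source in WebAPI.
# All IDs, names, and schema qualifiers below are assumptions.
set -euo pipefail

SQL=$(cat <<'EOF'
INSERT INTO webapi.source
  (source_id, source_name, source_key, source_connection, source_dialect)
VALUES
  (2, 'Synthea', 'SYNTHEA',
   'jdbc:postgresql://localhost:5432/synthea', 'postgresql');

-- One daimon row per role; daimon_type: 0 = CDM, 1 = Vocabulary, 2 = Results
INSERT INTO webapi.source_daimon
  (source_daimon_id, source_id, daimon_type, table_qualifier, priority)
VALUES
  (4, 2, 0, 'cdm',     0),
  (5, 2, 1, 'vocab',   1),
  (6, 2, 2, 'results', 1);
EOF
)
echo "$SQL"
```

If this matches how WebAPI works, you would apply it with psql against the WebAPI database and then restart WebAPI (or refresh its source cache) so ATLAS picks up the new source. Adding more CDMs is then just more source rows, each pointing its daimons at different schemas.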
