
How to Query ODX local Backlog SQLite database

Note: Making modifications to the backlog database may lead to inconsistent behavior in ODX. Do not modify this file unless TimeXtender Support gives specific instructions to do so.

The ODX server uses a local backlog (a SQLite database file) to keep metadata about data sources, storage, tasks, and so on. This backlog is continuously synced with the cloud environment. Analyzing the backlog may help troubleshoot some performance issues.

Note: Only metadata is stored in the backlog and in the cloud environment: tasks, data source configuration, incremental load settings, table/field selection, and so on. Connection strings (user names, passwords, etc.) are all encrypted.

Download SQLiteStudio for Windows (note: this is a third-party, open-source tool)

Decompress the .zip file and run SQLiteStudio.exe

Navigate to the ODX backlog folder

In ODX 20.5 and earlier versions, the file is stored in:
C:\Program Files\TimeXtender\ODX <version>\BACKLOG\Backlog.sqlite

In ODX 20.10 and later versions, the file is stored in:
C:\ProgramData\TimeXtender\ODX\<version>\BACKLOG\Backlog.sqlite

Make a backup copy of the Backlog.sqlite file (give it a descriptive name, such as Copy_Backlog.sqlite)

Note: Do not make an extra connection to the Backlog.sqlite file that is in use, since it may interfere with ODX activity in progress. Always make a copy of the current backlog file and run your analysis against the copy.

From the SQLiteStudio UI, use the Database menu -> Connect to the database and select the copy (e.g. the Copy_Backlog.sqlite file)
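
Once connected to the copy, a quick way to verify the connection and see which tables are available is to list them from SQLite's built-in sqlite_master catalog. The query below is a minimal sketch; the actual table names in the backlog vary between ODX versions.

-- List all tables in the backlog copy (SQLite system catalog)
SELECT name
FROM sqlite_master
WHERE type = 'table'
ORDER BY name;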


Useful Queries

Once you have successfully connected to the ODX Backlog SQLite file, you can download and open the queries below to retrieve useful information from the metadata database.
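
If you want to adapt the queries to your own needs, it can help to first check which columns a table has. The statement below uses SQLite's PRAGMA table_info; the table name Executions is only a placeholder, so substitute a name returned by the sqlite_master query shown earlier.

-- Show column names and types for one table
-- ('Executions' is a placeholder; use a table name from your own backlog copy)
PRAGMA table_info('Executions');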

Task Last Run Status

Download TaskLastRunStatus.sql query
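
For orientation, a "last run status" query typically has roughly the shape sketched below. The table and column names (Tasks, Executions, TaskId, Name, StartTime, Status) are hypothetical placeholders and will likely differ from the real backlog schema; the downloaded TaskLastRunStatus.sql is the authoritative version.

-- Sketch only, assuming placeholder tables Tasks(TaskId, Name)
-- and Executions(TaskId, StartTime, Status).
-- SQLite returns the bare columns from the row holding MAX(StartTime).
SELECT t.Name           AS TaskName,
       MAX(e.StartTime) AS LastRun,
       e.Status         AS LastStatus
FROM Tasks t
JOIN Executions e ON e.TaskId = t.TaskId
GROUP BY t.TaskId, t.Name;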


Task Execution History

Download TaskExecutions.sql query
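
A history query usually lists individual runs with a computed duration. The sketch below uses SQLite's julianday() to turn start/end timestamps into minutes; Executions and its columns are placeholder names, so rely on the downloaded TaskExecutions.sql for the actual schema.

-- Sketch only, assuming a placeholder table
-- Executions(TaskId, StartTime, EndTime, Status) with ISO-8601 timestamps.
SELECT TaskId,
       StartTime,
       EndTime,
       ROUND((julianday(EndTime) - julianday(StartTime)) * 24 * 60, 1) AS DurationMinutes,
       Status
FROM Executions
ORDER BY StartTime DESC
LIMIT 100;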


List of Data Source Tables

Download DataSourceTables.sql query
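
Such a listing typically joins each data source to its selected tables. The names below (DataSources, DataSourceTables and their columns) are placeholders only; the downloaded DataSourceTables.sql matches the real backlog schema.

-- Sketch only, assuming placeholder tables DataSources(DataSourceId, Name)
-- and DataSourceTables(DataSourceId, SchemaName, TableName).
SELECT ds.Name AS DataSource,
       dt.SchemaName,
       dt.TableName
FROM DataSources ds
JOIN DataSourceTables dt ON dt.DataSourceId = ds.DataSourceId
ORDER BY ds.Name, dt.SchemaName, dt.TableName;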



10 Comments

  • Andri Páll Heiðberg

    Is there any way (or plans) to see transfer progress on a table level?
    Currently, it's a bit of a black box as you only see Starting-Running-Completed for the transfer of the entire source.

  • Joseph Treadwell

    Hi Andri, currently there are no plans around this, as the ODX takes a bulk approach to ingesting data. If you need this, you can split out the table in question into a single task and examine it that way.

  • Andri Páll Heiðberg

    Hi Joseph. The reason is more to be able to analyze the task itself. Similar to what you can do with the standard TX execution packages where you have the Gantt chart and you can easily spot what tables take significantly longer to extract. Even if it is a bulk approach, it must be done table by table.

  • Joseph Treadwell

    Hi Andri, this article simply covers a workaround until we have the full execution overview available in the TimeXtender Portal. 

  • Steven Koppenol

    Hi Joseph, why do you so strongly advise against reading directly from the sqlite file? After reading the docs, sqlite seems to support multiple concurrent read operations perfectly.

    Our ODX is doing more than 100 batches per day (8 sources * 12 imports) and the built-in reports are completely useless for keeping track of errors. (Typical TX UI problems - the default sort is the wrong way around, so the reports don't show the latest executions.)

    Not to mention the complete lack of active error notifications.

    How do other customers keep track of their ODX? Has anybody found a way to get immediate error notifications?

  • Joseph Treadwell

    Hi Steven, as the BACKLOG file is meant for the ODX Backend only, we want to ensure users don't interfere with this communication and potentially create issues. 

    As mentioned in my comment above, we are currently working on a broader Execution overview in the TX Portal that should eliminate the need for this in the future.

  • Michael Cano

    Hi Joseph, can you give any updates on this effort: "a broader Execution overview in the TX Portal"?
    The post I'm commenting on is about a year and a half old.
    This is the legacy support article:
    https://legacysupport.timextender.com/hc/en-us/articles/360054444552-How-to-Query-ODX-local-Backlog-SQLite-database
    Thanks!
    Mike C

  • Dror Svartzman

    Hi Joseph,

    We would like to add ODX to our monitoring system.
    To the best of my knowledge, CData does not have a SQLite driver available on TX. How would you recommend connecting ODX to the .sqlite backlog? I could not find a native OLEDB driver available.
    Please note that we will connect to a copy of the ODX repository and not the live version.

  • Joseph Treadwell
  • Dror Svartzman

    Hi Joseph,

    I did try the ODBC driver from ch-werner.de; however, I'm struggling with connecting CData to the driver. The lack of credential setup on the ODBC side might be the problem.

    Is there a KB article that explains the process? The current ODX logging is quite limited, and we would like to get an integrated view of both ODX and the DWH through a dashboard.
