Health check

The health check verifies that components such as the Database, FileSystem, and Spark Cluster are working properly.

Health checks should be the first diagnostic step, run to make sure all components:

  • Are available
  • Can be connected to
  • Can be used correctly

The following components support this command: corridor-api, corridor-app, and corridor-worker.

The available options for corridor-api can be listed with:

$ corridor-api check --help

Usage: corridor-api check [OPTIONS]

  Command to check Corridor connectivity with other services.

Options:
  --db / --no-db          Whether to check database connection
  --celery / --no-celery  Whether to check celery connection
  --fs / --no-fs          Whether to check filesystem connection
  --spark / --no-spark    Whether to check spark connection
  --help                  Show this message and exit.
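For example, to run only the database and filesystem checks, the Celery and Spark checks can be disabled explicitly using the flags above:

$ corridor-api check --db --fs --no-celery --no-spark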

To see the options available for the other components:

  • corridor-app:

    $ corridor-app check --help

  • corridor-worker:

    $ corridor-worker check --help

Example

Example PASS output:

$ corridor-api check --db --fs --celery --spark

[PASS] DB Connection
[PASS] DB Read
[PASS] DB Write
[I 2020-01-01 15:00:36,678 alembic.runtime.migration] Context impl OracleImpl.
[I 2020-01-01 15:00:36,678 alembic.runtime.migration] Will assume non-transactional DDL.
[PASS] DB latest
[PASS] Celery broker read
[PASS] Celery broker write
[PASS] Celery results read
[PASS] Celery results write
[PASS] Spark connection: AppID=application_1459542433815_0002
[PASS] Spark Write
[PASS] Spark Read
[PASS] FS Connected
[PASS] FS Write
[PASS] FS Read
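Since the health check is the first diagnostic step, it is also convenient to script, for example as a gate in a deployment or monitoring job. A minimal sketch, assuming the check command exits with a non-zero status when any check fails (an assumption, not confirmed by the output above):

#!/bin/sh
# Run the full health check before starting the service;
# abort the deployment if any component check fails.
if ! corridor-api check --db --fs --celery --spark; then
    echo "Health check failed - aborting deployment" >&2
    exit 1
fi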

Example FAIL output:

$ corridor-api check --db --fs --celery --spark

[PASS] DB Connection
[PASS] DB Read
[PASS] DB Write
[I 2021-07-16 11:10:16,951 alembic.runtime.migration] Context impl OracleImpl.
[I 2021-07-16 11:10:16,951 alembic.runtime.migration] Will assume non-transactional DDL.
[PASS] DB latest
[PASS] Celery broker read
[PASS] Celery broker write
[PASS] Celery results read
[PASS] Celery results write
[FAIL] Spark connection: Exception occured - No module named 'pyspark'
[FAIL] Spark Write: Exception occured - No module named 'pyspark'
[FAIL] Spark Read: Exception occured - No module named 'pyspark'
[PASS] FS Connected
[FAIL] FS Write: Exception occured - permission denied
[FAIL] FS Read: Exception occured - permission denied

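In this example, the Spark checks fail because the pyspark package is not importable in the environment running the check, and the filesystem checks fail because of permissions. A sketch of how these might be fixed, assuming a pip-managed Python environment, that the service runs as a corridor user, and that /data/corridor is the configured filesystem path (all three are assumptions, not shown above):

$ pip install pyspark                                # make pyspark importable for the Spark checks
$ ls -ld /data/corridor                              # inspect ownership and permissions of the data directory
$ sudo chown -R corridor:corridor /data/corridor     # give the service user read/write access

After applying the fixes, re-run the check and confirm that every line reports PASS.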